
Lensless Microscopy

In-line holography multi-height Gerchberg-Saxton phase retrieval with automatic affine transform

In-line holography multi-height Gerchberg-Saxton phase retrieval with automatic affine transform – first, each in-line hologram is reconstructed (propagated to the object plane), then the 'AutoAffineTransform.m' function is applied to extract and match features between the n-th and the last reconstruction. Based on these features, affine transforms (AT) are estimated and applied to the input holograms to correct the xy shift and magnification mismatch between holograms. After this preprocessing step, the Gerchberg-Saxton (GS) multi-height algorithm is performed to retrieve the object phase with twin-image noise minimized compared to single-frame angular spectrum backpropagation.
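
A minimal sketch of the multi-height GS loop and the angular spectrum propagation it relies on is given below (illustrative MATLAB, not the released code; variable names, the propagation kernel details and the fixed iteration count are assumptions):

% Illustrative sketch only - the released code may differ in details.
% Assumed inputs: holo - cell array of co-registered in-line holograms,
% z - vector of hologram-to-object distances [m], lambda - wavelength [m],
% dx - pixel size [m], nIter - number of GS iterations.
function U = gsMultiHeightSketch(holo, z, lambda, dx, nIter)
    U = propagateAS(sqrt(holo{1}), -z(1), lambda, dx);   % initial object-plane guess
    for it = 1:nIter
        for k = 1:numel(holo)
            Uk = propagateAS(U, z(k), lambda, dx);        % object -> k-th hologram plane
            Uk = sqrt(holo{k}) .* exp(1i*angle(Uk));      % enforce measured amplitude
            U  = propagateAS(Uk, -z(k), lambda, dx);      % back to the object plane
        end
    end
end

function Uout = propagateAS(Uin, z, lambda, dx)
    % Angular spectrum propagation over distance z
    [Ny, Nx] = size(Uin);
    [FX, FY] = meshgrid((-Nx/2:Nx/2-1)/(Nx*dx), (-Ny/2:Ny/2-1)/(Ny*dx));
    arg = 1/lambda^2 - FX.^2 - FY.^2;
    H = exp(1i*2*pi*z*sqrt(arg.*(arg > 0)));
    H(arg <= 0) = 0;                                      % drop evanescent components
    Uout = ifft2(ifftshift(fftshift(fft2(Uin)) .* H));
end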

Author: Mikołaj Rogalski

UTIRnet – supervised universal convolutional neural network for twin-image effect removal in digital in-line holographic microscopy (DIHM). UTIRnet is trained entirely on numerically generated images, which makes it easy to adapt to any DIHM system (no collecting and labelling of experimental training data is required).
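
A hedged sketch of how a single synthetic DIHM training pair could be generated (the actual UTIRnet data generator may differ; the object image, wavelength, pixel size and distance below are placeholder assumptions, and propagateAS refers to the angular spectrum sketch above):

obj    = im2double(imread('cameraman.tif'));           % placeholder synthetic object
lambda = 0.5e-6;  dx = 2e-6;  z = 2e-3;                % assumed system parameters
U0     = exp(1i*pi*obj);                               % pure-phase object
holo   = abs(propagateAS(U0, z, lambda, dx)).^2;       % simulated in-line hologram
netIn  = propagateAS(sqrt(holo), -z, lambda, dx);      % backpropagation - twin image present
target = angle(U0);                                    % clean object phase = network target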

Author: Mikołaj Rogalski

DarkFocus - autofocusing algorithm for lensless digital in-line holographic microscopy

Images: dark field and dark field gradient.

DarkFocus – autofocusing algorithm for lensless digital in-line holographic microscopy (DIHM). Based on a given range of distances, it finds the distance between the hologram and the object focus plane in a fully automatic fashion. Thanks to the darkfield propagation, DarkFocus works equally well for amplitude (absorbing) and phase (transparent) objects.
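
A minimal sketch of the distance search (the focus metric below is a generic darkfield-gradient sharpness measure and may differ from the published DarkFocus criterion; zRange, lambda and dx are assumed inputs, propagateAS as sketched above):

zRange = linspace(0.5e-3, 5e-3, 200);                  % assumed search range [m]
holoDF = holo - mean(holo(:));                         % remove DC term ("darkfield" hologram)
metric = zeros(size(zRange));
for k = 1:numel(zRange)
    A = abs(propagateAS(holoDF, -zRange(k), lambda, dx));  % darkfield reconstruction
    [Gx, Gy] = gradient(A);
    metric(k) = var(Gx(:).^2 + Gy(:).^2);              % sharpness of the darkfield gradient
end
[~, idx] = max(metric);
zFocus = zRange(idx);                                  % estimated focus distance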

Author: Maciej Trusiak

DarkTrack

DarkTrack – algorithm for tracking small objects in 3D (x, y, z) in lensless microscopy. It employs simple binarization and segmentation procedures to localize objects in x, y and then, with the use of DarkFocus, finds the focus distance (z location) of each object inside the field of view. If provided with a set of holograms, it also links the detected objects in adjacent frames to provide temporal tracking.
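
A hedged sketch of the xy-localization step (thresholds, the global binarization and the cropping below are placeholders, not the released DarkTrack code; holo, zFocus, lambda, dx and propagateAS come from the sketches above):

A  = abs(propagateAS(holo - mean(holo(:)), -zFocus, lambda, dx));  % rough refocus of the scene
BW = imbinarize(mat2gray(A));                          % simple global binarization
BW = bwareaopen(BW, 20);                               % remove tiny spurious blobs
S  = regionprops(BW, 'Centroid', 'BoundingBox');       % xy locations of candidate objects
for n = 1:numel(S)
    bb   = round(S(n).BoundingBox);                    % crop a region around each object
    crop = holo(bb(2):bb(2)+bb(4)-1, bb(1):bb(1)+bb(3)-1);
    % a DarkFocus-style search (previous sketch) run on 'crop' then gives its z location;
    % linking centroids between adjacent frames (e.g. nearest-neighbour) adds the time axis
end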

Author: Mikołaj Rogalski

Fourier ptychography

Fourier ptychographic microscopy (FPM)-app

FPM app – GUI application for Fourier ptychographic microscopy reconstruction, designed to work with datasets collected with an LED array microscope. The user-friendly interface allows simple adjustment of the LED array used in the experiment along with other system parameters. FPM app also enables simulating an FPM system and creating synthetic FPM datasets.
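
For illustration, one synthetic low-resolution FPM frame for a single LED can be simulated roughly as below (parameter values and names are assumptions and downsampling to the camera pixel size is omitted; this is not the FPM app code):

lambda = 0.52e-6;  NA = 0.1;  dxHi = 0.5e-6;            % assumed system parameters
objHi  = exp(1i*pi*im2double(imread('cameraman.tif'))); % placeholder high-res complex object
[Ny, Nx] = size(objHi);
[FX, FY] = meshgrid((-Nx/2:Nx/2-1)/(Nx*dxHi), (-Ny/2:Ny/2-1)/(Ny*dxHi));
pupil  = double(FX.^2 + FY.^2 <= (NA/lambda)^2);        % circular objective pupil
kxLED  = 0.05/lambda;  kyLED = 0;                       % illumination direction of this LED
O      = fftshift(fft2(objHi));
Oled   = circshift(O, round([kyLED*Ny*dxHi, kxLED*Nx*dxHi]));  % spectrum shifted by LED k-vector
frame  = abs(ifft2(ifftshift(Oled .* pupil))).^2;       % intensity recorded for this LED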

Author: Mikołaj Rogalski

Optical fringe pattern processing

DeepOrientation

DeepOrientation – convolutional neural network that retrieves the local orientation map of an input optical fringe pattern image (in the 0–π range). The input image needs to be preprocessed first (denoised and background removed). The network was trained on simulated data, but works on both simulated and experimental images without any further training.
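
For reference, the same quantity can be approximated with a classical structure-tensor estimate (a baseline for comparison, not the DeepOrientation network; fp is assumed to be a preprocessed fringe pattern and sigma a placeholder smoothing scale):

[Gx, Gy] = gradient(double(fp));
sigma = 5;                                             % assumed local averaging scale [px]
Jxx = imgaussfilt(Gx.^2, sigma);
Jyy = imgaussfilt(Gy.^2, sigma);
Jxy = imgaussfilt(Gx.*Gy, sigma);
theta = 0.5*atan2(2*Jxy, Jxx - Jyy);                   % dominant gradient orientation
theta = mod(theta + pi/2, pi);                         % fringe orientation in the 0-pi range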
 
Author: Maria Cywińska

fpFIF2 - 2D fringe pattern Fast Iterative Filtering

fpFIF2 – 2D implementation of the Fast Iterative Filtering algorithm, designed to work with optical fringe patterns. It decomposes a given image into several intrinsic mode functions (IMFs), where each IMF contains a different frequency component of the image. After summing up the appropriate IMFs, it is possible to obtain the noise, fringe, or background component of the input image.
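
A hedged sketch of the sifting step behind iterative filtering (a heavy simplification of fpFIF2; the filter choice, its scale and the fixed iteration count are placeholders):

img   = double(fp);                                    % input fringe pattern
sigma = 2;                                             % assumed filter scale for the 1st IMF
imf1  = img;
for it = 1:10                                          % fixed sifting count for brevity
    imf1 = imf1 - imgaussfilt(imf1, sigma);            % subtract the local (low-pass) mean
end
residual = img - imf1;                                 % goes to the next IMF with a larger scale
% summing selected IMFs then yields the noise, fringe or background estimate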
 
Author: Mikołaj Rogalski

DeepDensityNetModel

DeepDensity – convolutional neural network that retrieves the local density map of an input optical fringe pattern image. The input image needs to be preprocessed first (denoised and background removed). The network was trained on simulated data, but works on both simulated and experimental images without any further training.
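
For reference, a simple block-wise FFT estimate of the local fringe density (a classical baseline, not the DeepDensity network; the block size is a placeholder and fp a preprocessed fringe pattern):

fpD = double(fp);  B = 32;                             % assumed block size [px]
[Ny, Nx] = size(fpD);
dens = zeros(floor(Ny/B), floor(Nx/B));
[FX, FY] = meshgrid([0:B/2-1, -B/2:-1]);               % frequency indices [cycles/block]
R = sqrt(FX.^2 + FY.^2);
for r = 1:size(dens, 1)
    for c = 1:size(dens, 2)
        blk = fpD((r-1)*B+1:r*B, (c-1)*B+1:c*B);
        S   = abs(fft2(blk - mean(blk(:))));           % block spectrum without the DC term
        [~, idx]   = max(S(:));                        % dominant frequency peak
        dens(r, c) = R(idx)/B;                         % local density [fringes per pixel]
    end
end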
 
Author: Maria Cywińska

iPG-BEMD - improved period guided bidimensional empirical mode decomposition algorithm

iPG-BEMD – improved period guided bidimensional empirical mode decomposition algorithm – a fully automatic and adaptive fringe pattern pre-processing technique based on the empirical mode decomposition algorithm. It decomposes a given fringe pattern image into three components: noise, fringes, and background.
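
A hedged sketch of the period-guided grouping idea (the period thresholds and the estimator below are placeholders; the released iPG-BEMD code may classify components differently). BIMF is assumed to be an Ny x Nx x K stack of bidimensional IMFs from any BEMD decomposition:

K = size(BIMF, 3);
period = zeros(1, K);
for k = 1:K
    S = abs(fft2(BIMF(:,:,k)));  S(1,1) = 0;           % component spectrum without DC
    [~, idx] = max(S(:));
    [fy, fx] = ind2sub(size(S), idx);                  % dominant frequency bin
    f = hypot(min(fy-1, size(S,1)-fy+1)/size(S,1), ...
              min(fx-1, size(S,2)-fx+1)/size(S,2));    % folded frequency [cycles/px]
    period(k) = 1/max(f, eps);                         % dominant period of this BIMF [px]
end
noise      = sum(BIMF(:, :, period < 4), 3);           % short periods -> noise
fringes    = sum(BIMF(:, :, period >= 4 & period < 100), 3);
background = sum(BIMF(:, :, period >= 100), 3);        % long periods -> background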
 
Author: Paweł Gocłowski