Barnkob R., Cierpka C., Chen M., Sachs S., Mäder P., Rossi M. (2021). Defocus particle tracking: A comparison of methods based on model functions, cross-correlation, and neural networks. Measurement Science and Technology, 32(9), 1-14. doi: 10.1088/1361-6501/abfef6
Defocus particle tracking: A comparison of methods based on model functions, cross-correlation, and neural networks
Rossi M.
2021
Abstract
Defocus particle tracking (DPT) has gained increasing importance for its ability to determine particle trajectories in all three dimensions with a single-camera system, as is typical for a standard microscope, the workhorse of today's ongoing biomedical revolution. DPT methods derive the depth coordinates of particle images from the different defocusing patterns they show when observed in a volume much larger than the respective depth of field. It has therefore become common for state-of-the-art methods to apply image recognition techniques. Two of the most widely used DPT approaches are the application of (astigmatism) particle image model functions (MF methods) and the normalized cross-correlation between measured particle images and reference templates (CC methods). Though still new to the field, the use of neural networks (NN methods) is expected to play a significant role in future and more complex defocus tracking applications. To assess the different strengths of these defocus tracking approaches, we present in this work a general and objective assessment of their performance when applied to synthetic and experimental images with different degrees of astigmatism, noise levels, and particle image overlap. We show that MF methods work very well in low-concentration cases, while CC methods are more robust and perform better in cases of larger particle concentration and thus stronger particle image overlap. The tested NN methods generally showed the lowest performance; however, compared with the MF and CC methods, they are still at an early stage and have great potential to develop further within the field of DPT.
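To make the CC approach described in the abstract concrete, the sketch below illustrates how a depth coordinate can be assigned by normalized cross-correlation against a calibrated stack of reference templates. This is a minimal, hypothetical example, not the implementation evaluated in the paper: the astigmatic Gaussian particle model, the depth grid, and all parameter values are illustrative assumptions.

```python
import numpy as np

def normalized_cross_correlation(image, template):
    """Zero-mean normalized cross-correlation between two equally sized images."""
    a = image - image.mean()
    b = template - template.mean()
    denom = np.sqrt((a**2).sum() * (b**2).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def estimate_depth(particle_image, reference_stack, reference_depths):
    """Assign the depth of the reference template with the highest correlation."""
    scores = [normalized_cross_correlation(particle_image, t) for t in reference_stack]
    return reference_depths[int(np.argmax(scores))]

def synthetic_particle(z, size=33):
    """Toy astigmatic particle image: x-width grows and y-width shrinks with depth z.
    The linear width model and its coefficients are assumptions for this demo."""
    y, x = np.mgrid[:size, :size] - size // 2
    sx = 2.0 + 0.15 * (20.0 + z)
    sy = 2.0 + 0.15 * (20.0 - z)
    return np.exp(-(x**2 / (2 * sx**2) + y**2 / (2 * sy**2)))

# Calibrated reference stack over an assumed measurement depth range (µm).
depths = np.linspace(-20, 20, 81)
stack = [synthetic_particle(z) for z in depths]

# A noisy "measured" particle image at an unknown depth of 7.3 µm.
measured = synthetic_particle(7.3) + 0.05 * np.random.default_rng(0).standard_normal((33, 33))
print(f"estimated depth: {estimate_depth(measured, stack, depths):.1f} µm")
```

In this sketch the astigmatism is what makes the template match unambiguous: a symmetric defocus pattern would look the same above and below the focal plane, whereas the distinct x- and y-widths encode the sign of the depth.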
| File | Type | License | Size | Format |
|---|---|---|---|---|
| 2021-Barnkob-MST-comparison-pp.pdf (Open Access from 09/06/2022) | Postprint | Open Access License: Creative Commons Attribution - NonCommercial - NoDerivatives (CC BY-NC-ND) | 6.09 MB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.