BrightNet: A Deep CNN for OLED-Based Point of Care Immunofluorescent Diagnostic Systems

Rusci M. (Member of the Collaboration Group); Lazzaro D. (Member of the Collaboration Group); Benini L. (Member of the Collaboration Group); Morigi S.
2020

Abstract

An automatic tool, targeting low-cost, low-power, point-of-care embedded systems, is proposed for fluorescence diagnostic imaging. It enables a quick and accurate diagnosis even when used by nonexpert operators. To achieve this goal, an embedded system has been equipped with an end-to-end deep-learning algorithm that requires no manual parameter tuning to perform a diagnosis. The proposed deep convolutional model, named BrightNet, is based on a single-shot detector neural network, modified to estimate the brightness of the detected fluorescent spots in a low-density protein or DNA microarray and to finalize the diagnosis. Several optimization steps are presented to compress the inference model, as required for deployment on a portable, resource-constrained device. The resulting inference time is about 66 ms on an Intel i7-3770K desktop CPU and is estimated to be below 5 s on an ARM Cortex-M7, given the network's 1.1 × 10⁹ multiply-accumulate operations. BrightNet has been successfully validated for the detection and discrimination of four different serotypes of the dengue virus in a set of human samples, as well as for the diagnosis of West Nile virus in horse sera. On the considered diagnostic tasks, BrightNet achieves better average accuracy than a state-of-the-art variational approach that requires operator intervention, with the significant additional advantages of complete automation and a quicker diagnosis.
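The abstract does not spell out how the single-shot detector was modified, so the following is a minimal PyTorch sketch, under the assumption of an SSD-style prediction head, of how a per-anchor brightness regression output could sit alongside the usual class and box outputs. The name SpotDetectionHead and all layer sizes are hypothetical, not taken from the paper.

import torch
import torch.nn as nn

class SpotDetectionHead(nn.Module):
    """Hypothetical SSD-style head with an extra brightness output.

    For each anchor it predicts class scores, box offsets, and a scalar
    brightness estimate for the fluorescent spot; the paper's actual
    head layout may differ.
    """
    def __init__(self, in_channels: int, num_anchors: int, num_classes: int):
        super().__init__()
        # Standard SSD outputs: per-anchor class scores and box regression.
        self.cls = nn.Conv2d(in_channels, num_anchors * num_classes, 3, padding=1)
        self.box = nn.Conv2d(in_channels, num_anchors * 4, 3, padding=1)
        # Extra per-anchor scalar, regressed against the measured spot brightness.
        self.brightness = nn.Conv2d(in_channels, num_anchors, 3, padding=1)

    def forward(self, feat: torch.Tensor):
        return self.cls(feat), self.box(feat), self.brightness(feat)

# Example: a 38x38 feature map with 4 anchors per cell and 2 classes.
head = SpotDetectionHead(in_channels=256, num_anchors=4, num_classes=2)
cls_out, box_out, bri_out = head(torch.randn(1, 256, 38, 38))
print(cls_out.shape, box_out.shape, bri_out.shape)

At inference, the brightness channel of each retained detection would then feed the diagnostic decision rule directly, rather than being re-measured from the pixels.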
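The compression steps are likewise not detailed in the abstract. One common route to shrinking a model for a microcontroller-class target, shown here as an assumption rather than as the paper's method, is symmetric 8-bit weight quantization, which roughly quarters float32 storage:

import numpy as np

def quantize_int8(w: np.ndarray):
    # Symmetric per-tensor int8 quantization (an illustrative scheme,
    # not necessarily the one used for BrightNet).
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale  # dequantize with q * scale

w = np.random.randn(64, 32).astype(np.float32)
q, s = quantize_int8(w)
err = np.abs(w - q.astype(np.float32) * s).max()
print(f"max dequantization error: {err:.4f}; storage: {q.nbytes} vs {w.nbytes} bytes")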
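The sub-5 s Cortex-M7 figure can be sanity-checked with first-order arithmetic from the stated 1.1 × 10⁹ multiply-accumulate operations. The clock rate and per-cycle MAC throughput below are assumptions typical of optimized 8-bit kernels on a ~400 MHz Cortex-M7, not values taken from the paper:

def estimate_latency_s(macs: float, clock_hz: float, macs_per_cycle: float) -> float:
    # First-order model: latency = MACs / (clock * effective MACs per cycle).
    return macs / (clock_hz * macs_per_cycle)

MACS = 1.1e9  # multiply-accumulate count reported in the abstract
print(f"{estimate_latency_s(MACS, 400e6, 0.6):.2f} s")  # ~4.6 s, consistent with < 5 s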
Samore A., Rusci M., Lazzaro D., Melpignano P., Benini L., Morigi S. (2020). BrightNet: A Deep CNN for OLED-Based Point of Care Immunofluorescent Diagnostic Systems. IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 69(9), 6766-6775 [10.1109/TIM.2020.2973913].
Samore A.; Rusci M.; Lazzaro D.; Melpignano P.; Benini L.; Morigi S.
Files in this record:
File: final.pdf (open access)
Type: Postprint
License: Free, open-access license
Size: 1.79 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/771945
Citations
  • PubMed Central: not available
  • Scopus: 9
  • Web of Science: 8