
Deep learning for enhancing wavefield image quality in fast non-contact inspections

Keshmiri Esfandabadi Y.; Bilodeau M.; Masson P.; De Marchi L.
2020

Abstract

Ultrasonic wavefield imaging with a non-contact technology can provide detailed information about the health status of an inspected structure. However, high spatial resolution, often necessary for accurate damage quantification, typically demands a long scanning time. In this work, we investigate a novel methodology to acquire high-resolution wavefields with a reduced number of measurement points, thereby minimizing the acquisition time. This methodology combines compressive sensing and convolutional neural networks to recover high spatial frequency information from low-resolution images. A data set was built from 652 wavefield images acquired with a laser Doppler vibrometer, describing guided ultrasonic wave propagation in eight different structures, with and without various simulated defects. Of those 652 images, 326 cases without defect and 326 cases with defect were used as a training database for the convolutional neural network. In addition, 273 wavefield images were used as a testing database to validate the proposed methodology. For quantitative evaluation, two image quality metrics were calculated and compared to those achieved with different recovery methods or by training the convolutional neural network with a non-wavefield image data set. The results demonstrate the capability of the technique to enhance image resolution and quality, as well as its similarity to the wavefield acquired on the full high-resolution grid of scan points, while reducing the number of measurement points down to 10% of the number of scan points for a full grid.
Keshmiri Esfandabadi Y., Bilodeau M., Masson P., De Marchi L. (2020). Deep learning for enhancing wavefield image quality in fast non-contact inspections. STRUCTURAL HEALTH MONITORING, 19(4), 1003-1016 [10.1177/1475921719873112].
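To make the acquisition scheme described in the abstract more concrete, the following minimal Python sketch sub-samples a synthetic wavefield to roughly 10% of a full scan grid and scores a crude reconstruction against the full-grid reference. It is an illustration only, not the authors' implementation: the 128 x 128 grid, the synthetic ripple pattern, the nearest-neighbour fill (standing in for the compressive sensing plus convolutional neural network recovery of the paper) and the choice of PSNR as the quality metric are all assumptions made here.

import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)

# Synthetic "full-grid" wavefield: a decaying radial ripple on a 128 x 128
# scan grid, standing in for a dense laser Doppler vibrometer acquisition.
n = 128
x, y = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
r = np.hypot(x, y)
full_grid = np.sin(40 * r) * np.exp(-3 * r)

# Keep only ~10% of the scan points, mimicking the reduced measurement grid.
mask = rng.random((n, n)) < 0.10
measured_pts = np.argwhere(mask)          # (row, col) of measured points
measured_vals = full_grid[mask]

# Placeholder recovery: nearest measured value per pixel. In the paper this
# role is played by compressive sensing followed by a CNN-based enhancement.
all_pts = np.argwhere(np.ones((n, n), dtype=bool))
recovered = griddata(measured_pts, measured_vals, all_pts,
                     method="nearest").reshape(n, n)

def psnr(ref, est):
    """Peak signal-to-noise ratio (dB) of `est` against the reference."""
    mse = np.mean((ref - est) ** 2)
    return 10.0 * np.log10(np.ptp(ref) ** 2 / mse)

print(f"Measured points: {mask.sum()} / {n * n} ({100 * mask.mean():.1f}%)")
print(f"PSNR of placeholder recovery: {psnr(full_grid, recovered):.1f} dB")

Per the abstract, the actual evaluation compares recoveries to the full high-resolution scan using two image quality metrics; PSNR is used above only as one plausible example of such a metric.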
Files in this record:

DeepLearning_DeMarchi.pdf
Access: open access
Type: Postprint
License: Free open access license
Size: 3.64 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/712954
Citations
  • PMC: ND
  • Scopus: 32
  • Web of Science: 30