
Adversarial defect synthesis for industrial products in low data regime / P. Coscia, A. Genovese, F. Scotti, V. Piuri - In: 2023 IEEE International Conference on Image Processing (ICIP) - [s.l.]: IEEE, 2023 Sep 11. - ISBN 978-1-7281-9835-4. - pp. 1360-1364 (ICIP conference held in Kuala Lumpur, 2023) [10.1109/ICIP49359.2023.10222874].

Adversarial defect synthesis for industrial products in low data regime

P. Coscia (First); A. Genovese (Second); F. Scotti (Penultimate); V. Piuri (Last)
2023

Abstract

Synthetic defect generation is an important aid for advanced manufacturing and production processes. Industrial scenarios rely on automated image-based quality control methods to avoid time-consuming manual inspections and promptly identify products not complying with specific quality standards. However, these methods show poor performance in ill-posed low-data training regimes, and the lack of defective samples, due to operational costs or privacy policies, strongly limits their large-scale applicability. To overcome these limitations, we propose an innovative architecture based on an unpaired image-to-image (I2I) translation model that guides a transformation from a defect-free to a defective domain for common industrial products, while simultaneously localizing the synthesized defects through a segmentation mask. As a performance evaluation, we measure image similarity and variability using standard metrics employed for generative models. Finally, we demonstrate that inspection networks, trained on synthesized samples, improve their accuracy in spotting real defective products.
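The abstract describes the architecture only at a high level. As an illustrative aid, the sketch below shows one plausible reading of it: a CycleGAN-style residual generator (an assumption, not the authors' published implementation) that maps a defect-free image to both a synthesized defective image and a single-channel segmentation mask localizing the generated defect. All layer choices, names, and hyperparameters here are assumptions.

# Minimal sketch (assumption, not the authors' code): an unpaired I2I generator
# mapping a defect-free image to (defective image, defect segmentation mask).
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """Standard residual block, as commonly used in CycleGAN-style generators."""
    def __init__(self, channels: int):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.InstanceNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.InstanceNorm2d(channels),
        )

    def forward(self, x):
        return x + self.block(x)


class DefectGenerator(nn.Module):
    """Encoder -> residual bottleneck -> two decoder heads:
    one synthesizes the defective image, the other predicts a defect mask."""
    def __init__(self, in_ch: int = 3, base: int = 64, n_res: int = 6):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, base, kernel_size=7, padding=3),
            nn.InstanceNorm2d(base),
            nn.ReLU(inplace=True),
            nn.Conv2d(base, base * 2, kernel_size=3, stride=2, padding=1),
            nn.InstanceNorm2d(base * 2),
            nn.ReLU(inplace=True),
        )
        self.bottleneck = nn.Sequential(*[ResidualBlock(base * 2) for _ in range(n_res)])
        self.image_head = nn.Sequential(
            nn.ConvTranspose2d(base * 2, base, kernel_size=3, stride=2,
                               padding=1, output_padding=1),
            nn.InstanceNorm2d(base),
            nn.ReLU(inplace=True),
            nn.Conv2d(base, in_ch, kernel_size=7, padding=3),
            nn.Tanh(),       # synthesized defective image in [-1, 1]
        )
        self.mask_head = nn.Sequential(
            nn.ConvTranspose2d(base * 2, base, kernel_size=3, stride=2,
                               padding=1, output_padding=1),
            nn.InstanceNorm2d(base),
            nn.ReLU(inplace=True),
            nn.Conv2d(base, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),    # defect localization mask in [0, 1]
        )

    def forward(self, x):
        feats = self.bottleneck(self.encoder(x))
        return self.image_head(feats), self.mask_head(feats)


if __name__ == "__main__":
    g = DefectGenerator()
    defect_free = torch.randn(1, 3, 256, 256)   # stand-in for a defect-free sample
    defective, mask = g(defect_free)
    print(defective.shape, mask.shape)          # (1, 3, 256, 256), (1, 1, 256, 256)

In this reading, the adversarial and consistency losses of the unpaired I2I framework would drive the image head, while the mask head localizes where the translation altered the input; the actual losses and training protocol are those described in the paper and are not reproduced here.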
Synthetic defect generation; generative adversarial network; defective mask; residual network
Settore INF/01 - Informatica (Computer Science)
Settore ING-INF/05 - Sistemi di Elaborazione delle Informazioni (Information Processing Systems)
   Edge AI Technologies for Optimised Performance Embedded Processing (EdgeAI)
   EdgeAI
   MINISTERO DELLO SVILUPPO ECONOMICO
   101097300

   Green responsibLe privACy preservIng dAta operaTIONs
   GLACIATION
   EUROPEAN COMMISSION

   Machine Learning-based, Networking and Computing Infrastructure Resource Management of 5G and beyond Intelligent Networks (MARSAL)
   MARSAL
   EUROPEAN COMMISSION
   H2020
   101017171

   SEcurity and RIghts in the CyberSpace (SERICS)
   SERICS
   MINISTERO DELL'UNIVERSITA' E DELLA RICERCA
   Grant ID PE00000014
11 Sep 2023
Book Part (author)
Files in this product:

File: icip23.pdf
Access: open access
Type: Post-print, accepted manuscript, etc. (version accepted by the publisher)
Size: 2.08 MB
Format: Adobe PDF

File: Adversarial_Defect_Synthesis_for_Industrial_Products_in_Low_Data_Regime.pdf
Access: restricted access (request a copy)
Type: Publisher's version/PDF
Size: 6.11 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2434/979348
Citations
  • Scopus: 0