Instance contrastive learning with dynamic weighted variance for small sample steel defect recognition / Y. Liang, J. Chen, W. Zhou, X. Ying, Y. Zhai, R. Donida Labati, V. Piuri, F. Scotti. - In: ELECTRONICS LETTERS. - ISSN 0013-5194. - 58:2 (2022 Jan), pp. 50-52. [10.1049/ell2.12361]

Instance contrastive learning with dynamic weighted variance for small sample steel defect recognition

R. Donida Labati; V. Piuri (penultimate author); F. Scotti (last author)
2022

Abstract

Automatic defect recognition is essential for guaranteeing the surface quality of industrial steel, and it is typically achieved through supervised learning with ample labelled samples. However, defect recognition inevitably suffers from limited data because expert labelling is costly. To address this problem, a novel framework, Instance Contrast (InCo), is proposed, inspired by contrastive learning. The framework consists of two streams: one attributes instance labels to the unlabelled data in each batch for classification, which is called Batch Instance Discrimination (BID); the other aggregates the embeddings of different augmented samples of the same image with a new function named dynamic weighted variance loss (DWV loss). As a result, the model learns better semantic features because the embedding distance between similar steel defect images is moderated. Experimental results on the NEU-CLS database validate that the proposed method achieves 89.86% classification accuracy when fine-tuned on only the 1:32 training data, outperforming other general contrastive learning methods.
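The abstract only names the two streams, so the following PyTorch sketch is offered purely as an illustration of the general idea, not as the authors' method: the function names bid_loss and dwv_loss, the tensor shapes, the temperature value, and the specific variance-based weighting scheme are assumptions of this record, intended only to show how a batch-wise instance-discrimination term and a weighted-variance aggregation term over multiple augmented views could look.

```python
import torch
import torch.nn.functional as F


def bid_loss(anchor_emb: torch.Tensor, view_emb: torch.Tensor,
             temperature: float = 0.1) -> torch.Tensor:
    """Hypothetical Batch Instance Discrimination (BID) sketch: each image in
    the batch receives its own instance label (its batch index), and an
    augmented view is classified against all instances via cross-entropy."""
    a = F.normalize(anchor_emb, dim=-1)           # (B, D) anchor embeddings
    v = F.normalize(view_emb, dim=-1)             # (B, D) augmented-view embeddings
    logits = v @ a.t() / temperature              # (B, B) similarity logits
    labels = torch.arange(a.size(0), device=a.device)  # instance labels = batch indices
    return F.cross_entropy(logits, labels)


def dwv_loss(view_embeddings: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Hypothetical dynamic weighted variance (DWV) sketch: penalise the spread
    of the embeddings of V augmented views of the same image, weighting each
    image by its current spread so poorly aligned images contribute more.

    view_embeddings: (V, B, D) embeddings of V views for each of B images.
    """
    z = F.normalize(view_embeddings, dim=-1)          # (V, B, D)
    var = z.var(dim=0, unbiased=False).sum(dim=-1)    # per-image variance across views, (B,)
    weights = (var / (var.sum() + eps)).detach()      # dynamic weights, no gradient through them
    return (weights * var).sum()
```

In this sketch the two terms would simply be summed into a single training objective; how the paper actually balances the BID and DWV streams is not stated in this record.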
Research field INF/01 - Computer Science
Research field ING-INF/05 - Information Processing Systems
Jan 2022
3 Nov 2021
Article (author)
Files in this item:
Electronics Letters - 2021 - Liang - Instance contrastive learning with dynamic weighted variance for small sample steel.pdf
Access: open access
Type: Publisher's version/PDF
Size: 730.48 kB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/2434/897097
Citations
  • Scopus: 1
  • ISI: 1