Instance contrastive learning with dynamic weighted variance for small sample steel defect recognition / Y. Liang, J. Chen, W. Zhou, X. Ying, Y. Zhai, R. DONIDA LABATI, V. Piuri, F. Scotti. - In: ELECTRONICS LETTERS. - ISSN 0013-5194. - 58:2(2022 Jan), pp. 50-52. [10.1049/ell2.12361]
Instance contrastive learning with dynamic weighted variance for small sample steel defect recognition
R. DONIDA LABATI; V. Piuri; F. Scotti
2022
Abstract
Automatic defect recognition is essential for guaranteeing the surface quality of industrial steel and is typically achieved through supervised learning with ample labelled samples. In practice, however, defect recognition is often data-limited because expert labelling is costly. To address this problem, a novel framework, Instance Contrast (InCo), is proposed, inspired by contrastive learning. The framework consists of two streams. In the first, called Batch Instance Discrimination (BID), instance labels are attributed to the unlabelled data in each batch for classification. In the second, the embeddings of different augmented samples of the same image are aggregated by a new function named dynamic weighted variance loss (DWV loss). The model can therefore learn better semantic features because the embedding distance between similar steel defect images is moderated. Experimental results on the NEU-CLS database show that the proposed method achieves 89.86% classification accuracy when fine-tuned on only 1/32 of the training data, outperforming other general contrastive learning methods.
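As a rough illustration of the two components named in the abstract, the sketch below shows a PyTorch-style batch instance discrimination loss (an InfoNCE-style objective in which each image in the batch acts as its own instance class) together with a hypothetical variance-based aggregation term. The paper's exact BID and DWV formulations are not reproduced in this abstract, so the function names, the softmax weighting, and the encoder interface are illustrative assumptions only, not the authors' implementation.

```python
# Illustrative sketch only: (1) batch instance discrimination, where each image
# in the batch is treated as its own class and its augmented view is the positive;
# (2) a variance-based term pulling embeddings of augmented views of the same
# image together, with per-instance weights. The real DWV loss may differ.
import torch
import torch.nn.functional as F


def batch_instance_discrimination(z1, z2, temperature=0.1):
    """InfoNCE-style loss: matching views are positives, all other
    instances in the batch are negatives."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature              # (B, B) similarity matrix
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)


def dwv_loss(views):
    """Hypothetical dynamic weighted variance term: minimise the per-instance
    variance across view embeddings, weighting high-variance instances more."""
    views = F.normalize(torch.stack(views, dim=1), dim=2)  # (B, V, D)
    var = views.var(dim=1).mean(dim=1)                      # (B,) per-instance variance
    weights = torch.softmax(var.detach(), dim=0)            # dynamic weights
    return (weights * var).sum()


# Usage sketch, assuming an encoder and two augmentations aug1/aug2 of a batch x:
#   z1, z2 = encoder(aug1(x)), encoder(aug2(x))
#   loss = batch_instance_discrimination(z1, z2) + dwv_loss([z1, z2])
```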
File | Type | Size | Format | Access
---|---|---|---|---
Electronics Letters - 2021 - Liang - Instance contrastive learning with dynamic weighted variance for small sample steel.pdf | Publisher's version/PDF | 730.48 kB | Adobe PDF | Open access
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.