
FabricTouch: A Multimodal Fabric Assessment Touch Gesture Dataset to Slow Down Fast Fashion / T. Olugbade, L. Lin, A. Sansoni, N. Warawita, Y. Gan, X. Wei, B. Petreca, G. Boccignone, D. Atkinson, Y. Cho, S. Baurley, N. Bianchi-Berthouze. - In: 2023 11th International Conference on Affective Computing and Intelligent Interaction (ACII). [s.l]: IEEE, 2023. - ISBN 979-8-3503-2743-4. - pp. 1-8. (Paper presented at the 11th International Conference on Affective Computing and Intelligent Interaction, held in Cambridge in 2023 [10.1109/ACII59096.2023.10388086].)

FabricTouch: A Multimodal Fabric Assessment Touch Gesture Dataset to Slow Down Fast Fashion

G. Boccignone;
2023

Abstract

Touch exploration of fabric is used to evaluate its properties, and it could further be leveraged to understand a consumer's sensory experience and preference so as to support them in real time in making careful clothing purchase decisions. In this paper, we open up opportunities to explore the use of technology to provide such support with our FabricTouch dataset, i.e., a multimodal dataset of fabric assessment touch gestures. The dataset consists of bilateral forearm movement and muscle activity data captured while 15 people explored 114 different garments in total to evaluate them according to 5 properties (warmth, thickness, smoothness, softness, and flexibility). The dataset further includes subjective ratings of the garments with respect to each property and ratings of the pleasure experienced in exploring the garment through touch. We further report baseline work on automatic detection. Our results suggest that it is possible to recognise the type of fabric property that a consumer is exploring based on their touch behaviour. We obtained a mean F1 score of 0.61 for unseen garments, across the 5 types of fabric property. The results also highlight the possibility of additionally recognising the consumer's subjective rating of the fabric when the property being rated is known, with a mean F1 score of 0.97 for unseen subjects, across 3 rating levels.
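The "mean F1 score" reported above is the per-class F1 averaged over the classes (5 fabric properties, or 3 rating levels). A minimal pure-Python sketch of that metric, using made-up labels rather than data from the dataset:

```python
# Illustrative macro-averaged F1 (assumption: "mean F1" = unweighted mean of
# per-class F1 scores). Labels below are invented stand-ins, not paper data.

PROPERTIES = ["warmth", "thickness", "smoothness", "softness", "flexibility"]

def macro_f1(y_true, y_pred, labels):
    """Mean of per-class F1 scores; classes with no true/predicted hits score 0."""
    f1s = []
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * precision * recall / (precision + recall)
                   if precision + recall else 0.0)
    return sum(f1s) / len(labels)

y_true = ["warmth", "thickness", "smoothness", "softness", "flexibility"]
y_pred = ["warmth", "thickness", "smoothness", "warmth", "flexibility"]
print(round(macro_f1(y_true, y_pred, PROPERTIES), 3))
```

Macro averaging weights each class equally regardless of how often it occurs, which matters here since gestures for the five properties need not be balanced.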
Dataset; fabric; gesture recognition; movement; multimodal; muscle activity; touch
Academic field: ING-INF/05 - Information Processing Systems
Book Part (author)
Files in this record:
2023_FabricTouch_A_Multimodal_Fabric_Assessment_Touch_Gesture_Dataset_to_Slow_Down_Fast_Fashion.pdf (Publisher's version/PDF, open access, Adobe PDF, 3.21 MB)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2434/1030310
Citations
  • Scopus 2