Assessing the Quality of Online Reviews Using Formal Argumentation Theory / D. Ceolin, G. Primiero, J. Wielemaker, M. Soprano (Lecture Notes in Artificial Intelligence). - In: Web Engineering / edited by M. Brambilla, R. Chbeir, F. Frasincar, I. Manolescu. - [s.l.] : Springer, 2021. - ISBN 9783030742959. - pp. 71-87. Paper presented at the 21st ICWE conference, held in Biarritz in 2021 [10.1007/978-3-030-74296-6_6].

Assessing the Quality of Online Reviews Using Formal Argumentation Theory

G. Primiero
2021

Abstract

Review scores collect users’ opinions in a simple and intuitive manner. However, review scores are also easily manipulable; hence, they are often accompanied by explanations. A substantial amount of research has been devoted to ascertaining the quality of reviews, in order to identify the most useful and authentic scores through explanation analysis. In this paper, we advance the state of the art in review quality analysis. We introduce a rating system to identify review arguments and to define an appropriate weighted semantics through formal argumentation theory. We introduce an algorithm to construct a corresponding graph, based on a selection of weighted arguments, their semantic similarity, and the ratings they support. We provide an algorithm to identify the model of such an argumentation graph, maximizing the overall weight of the admitted nodes and edges. We evaluate these contributions on the Amazon review dataset by McAuley et al. [15], comparing the results of our argumentation-based assessment with the upvotes received by the reviews. We deepen the evaluation by crowdsourcing a multidimensional assessment of reviews and comparing it to the argumentation-based assessment. Lastly, we perform a user study to evaluate the explainability of our method. Our method achieves two goals: (1) it identifies, in an unsupervised manner, reviews that online users consider useful, comprehensible, and truthful, and (2) it provides an explanation of its quality assessments.
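The pipeline sketched in the abstract (weighting review arguments, linking semantically related arguments that support conflicting ratings, and selecting a model of maximal overall weight) can be illustrated with a small, self-contained toy. This is an assumption-laden sketch, not the authors' algorithm: the similarity function is a Jaccard stand-in for a real semantic-similarity model, the attack rule and the brute-force maximum-weight conflict-free selection are simplifications, and all names (build_graph, max_weight_conflict_free) are hypothetical.

```python
# Illustrative sketch only: build a weighted argumentation graph from review
# arguments and pick a conflict-free subset of maximum total weight.
from itertools import combinations

def similarity(a, b):
    """Toy Jaccard similarity, standing in for a real semantic-similarity model."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

def build_graph(arguments, sim_threshold=0.2):
    """Arguments are (text, rating, weight) triples. Two arguments attack each
    other when they are semantically related but support different ratings
    (a simplification of the paper's graph-construction step)."""
    attacks = set()
    for (i, (ti, ri, _)), (j, (tj, rj, _)) in combinations(enumerate(arguments), 2):
        if ri != rj and similarity(ti, tj) >= sim_threshold:
            attacks.add((i, j))
            attacks.add((j, i))
    return attacks

def max_weight_conflict_free(arguments, attacks):
    """Exhaustively pick the conflict-free subset with maximum total weight
    (only feasible for toy sizes; the paper describes a dedicated algorithm)."""
    best, best_w = frozenset(), 0.0
    n = len(arguments)
    for mask in range(1 << n):
        chosen = [i for i in range(n) if mask >> i & 1]
        if any((i, j) in attacks for i in chosen for j in chosen):
            continue
        w = sum(arguments[i][2] for i in chosen)
        if w > best_w:
            best, best_w = frozenset(chosen), w
    return best, best_w

if __name__ == "__main__":
    args = [
        ("battery life is great and lasts days", 5, 0.9),
        ("battery life is poor and drains fast", 2, 0.7),
        ("shipping was quick", 4, 0.4),
    ]
    attacks = build_graph(args)
    chosen, total = max_weight_conflict_free(args, attacks)
    print(sorted(chosen), round(total, 2))  # e.g. [0, 2] 1.3
```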
Formal argumentation theory; Online reviews; Information quality
Sector M-FIL/02 - Logic and Philosophy of Science
   Departments of Excellence 2018-2022 - Department of Philosophy
   Ministero dell'Istruzione e del Merito (Italian Ministry of Education and Merit)
Book Part (author)
Files in this product:

File: ICWE21__Copy_ (12).pdf
Access: restricted
Type: Post-print, accepted manuscript, etc. (version accepted by the publisher)
Size: 224.01 kB
Format: Adobe PDF

File: Ceolin2021_Chapter_AssessingTheQualityOfOnlineRev.pdf
Access: restricted
Type: Publisher's version/PDF
Size: 630.09 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2434/843864
Citations
  • PubMed Central: not available
  • Scopus: 4
  • Web of Science: 3