
Assessing the Legality of Using the Category of Race and Ethnicity in Clinical Algorithms - the EU Anti-discrimination Law Perspective / M.A. Wójcik-Suffia. In: EWAF 2023: European Workshop on Algorithmic Fairness, edited by J.M. Alvarez, A. Fabris, C. Heitz, C. Hertweck, M. Loi, M. Zehlike. CEUR-WS, 2023, pp. 1-18. 2nd European Workshop on Algorithmic Fairness, Winterthur, 2023.

Assessing the Legality of Using the Category of Race and Ethnicity in Clinical Algorithms - the EU Anti-discrimination Law Perspective

M.A. WÓJCIK-SUFFIA
2023

Abstract

The growing use of machine learning (ML) in medical prognostics, diagnostics, and treatment recommendations offers powerful new tools for addressing pressing health challenges. However, like their knowledge-based predecessors, ML algorithms are not neutral and can mirror prevailing patterns of power and disadvantage in healthcare, entrenching racial disparities. The developers of clinical algorithms often face a paradox: ignoring race can introduce racial bias through proxies, yet explicitly taking race into account can replicate harmful stereotypes embedded in the category of race itself. This paper explores the problem of race and ethnicity in clinical algorithms from the European legal perspective, offering three main contributions. Firstly, it addresses the importance of correctly operationalizing race, arguing that since race is a social construct, developers of clinical algorithms should pay particular attention to the dimension of race that the data represents. In doing so, the paper examines the challenges to race operationalization in the European context. Secondly, the paper analyses how race is used in the design of clinical algorithms, with a particular focus on fairness measures that take race into account at prediction time, including race correction. Thirdly, the paper explores the legality of such measures under the Racial Equality Directive.
Keywords: Clinical algorithms; race-correction algorithms; ML in healthcare; racial fairness; EU anti-discrimination law
Disciplinary sector: GIUR-17/A - Philosophy of Law
https://ceur-ws.org/Vol-3442/paper-51.pdf
Book Part (author)
Files in this record:
2023 MA Wojcik EWAF_23 paper.pdf - open access - Publisher's version/PDF - Creative Commons license - 472.69 kB, Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2434/1203617