Article
Prevention of Bias and Discrimination in Clinical Practice Algorithms
JAMA (2023)
  • Carmel Shachar, Harvard University
  • Sara Gerke, Penn State Dickinson Law
Abstract
The Department of Health and Human Services (DHHS) recently announced its intention to combat the use of biased algorithms in health care decision-making and telehealth services.1 Many clinical algorithms are indeed flawed, either because they incorporate bias by design or because they are trained on biased data sets, and combating discrimination by algorithm is therefore essential. Examples of discrimination by algorithm include a machine learning tool for diagnosing Alzheimer disease that misdiagnoses patients who speak with certain accents; an algorithm designed to distinguish malignant from benign moles that was trained mostly on light-skinned patients; and a scheduling tool whose algorithm encouraged double booking for lower-income patients because they are more likely to be "no-shows."

Publication Date
January 5, 2023
DOI
10.1001/jama.2022.23867
Citation Information
Carmel Shachar and Sara Gerke. "Prevention of Bias and Discrimination in Clinical Practice Algorithms" JAMA Vol. 329 Iss. 4 (2023) p. 283
Available at: http://works.bepress.com/sara-gerke/88/