The Scored Society: Due Process for Automated Predictions
Faculty Scholarship
  • Danielle Keats Citron, University of Maryland Francis King Carey School of Law
  • Frank A. Pasquale, University of Maryland Francis King Carey School of Law
Publication Date
2014
Keywords
  • Big Data
  • predictions
  • artificial intelligence
Big Data is increasingly mined to rank and rate individuals. Predictive algorithms assess whether we are good credit risks, desirable employees, reliable tenants, valuable customers—or deadbeats, shirkers, menaces, and “wastes of time.” Crucial opportunities are on the line, including the ability to obtain loans, work, housing, and insurance. Though automated scoring is pervasive and consequential, it is also opaque and lacks meaningful oversight. In one area where regulation does prevail—credit—the law addresses only the disclosure of credit history, not the derivation of scores from underlying data. Procedural regularity is essential for those stigmatized by “artificially intelligent” scoring systems. The American due process tradition should inform basic safeguards: regulators should be able to test scoring systems to ensure their fairness and accuracy, and individuals should be granted meaningful opportunities to challenge adverse decisions based on scores that miscategorize them. Without such protections in place, scoring systems could launder biased and arbitrary data into powerfully stigmatizing scores.
Publication Citation
89 Washington Law Review 1 (2014).