Predictions — Whether Algorithmic or Human — May Not Be Fair
(This op-ed was first published in the Boston Globe on November 2, 2020.)

Algorithms recommend movies and determine whose taxes are audited. And in courtrooms across the country, they assess the risk a defendant poses to public safety.
On Tuesday, while Americans nationwide elect the country’s next president, Californians will also decide where to steer their state’s criminal justice system. Voting on Proposition 25, they’ll choose whether to replace cash bail with a system that relies on algorithms to determine who is released and who is detained while their cases unfold.
The choice raises challenging questions about criminal justice reform, but it also prompts deeper philosophical questions about the role of predictive algorithms in American society more broadly.
(Continue reading the op-ed on the Boston Globe’s page here.)
Sharad Goel is an assistant professor at Stanford University and director of the Stanford Computational Policy Lab; Julian Nyarko is an assistant professor of law at Stanford University; and Roseanna Sommers is an assistant professor of law at the University of Michigan.