The PSA was developed using the largest, most diverse set of pretrial records ever assembled—750,000 cases from nearly 300 jurisdictions across the United States. The researchers analyzed the data to determine which factors were most predictive of the likelihood a person would fail to appear at future court appearances or would commit a new crime—or a new violent crime—while awaiting trial.


The vast majority of the 750,000 people in our nation's jails on any given day are awaiting trial and presumed innocent. Many of them are charged with low-level, nonviolent crimes, but are unable to afford money bail. Taxpayers spend $14 billion on pretrial incarceration each year—and that does not take into account the human costs. Even a short time in pretrial detention can cause someone to lose his or her job, housing, health care, and child custody.

A number of research studies have documented the harmful effects of pretrial detention on individuals, families, and communities. For example, people who are detained before trial are more likely to be convicted, more likely to be sentenced to incarceration, and more likely to receive longer sentences than people who were not detained.


Laura and John Arnold Foundation's Research Values

As more jurisdictions adopt the PSA, the Laura and John Arnold Foundation (LJAF) remains committed to several key values. First is evaluation. The criminal justice community needs objective evidence as to whether the PSA actually improves decision making once judges start using it. To that end, LJAF is funding several independent research teams (including Harvard, MDRC, and RTI International) to determine the PSA's impact in jurisdictions ranging from the entire state of New Jersey to Dane County, Wisconsin. The preliminary data are promising: jurisdictions are reporting decreases in jail populations without increases in crime.

The second key value is local validation and improvement. As it stands, the PSA is a national model. It is likely, however, that the PSA could be improved by tailoring it to a given jurisdiction's data. In as many PSA jurisdictions as possible, LJAF wants to collect additional years of local data and analyze whether the PSA's predictive algorithm could be further improved.
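Local validation typically means checking how well an instrument's scores rank-order people by observed outcomes in a jurisdiction's own records, often summarized as an AUC (concordance) statistic. As a minimal sketch of that idea, with entirely hypothetical scores and outcomes rather than any actual PSA data:

```python
def auc(scores, outcomes):
    """Concordance (AUC) of risk scores against binary outcomes: the
    probability that a randomly chosen person who failed (outcome = 1)
    received a higher score than one who did not, counting ties as 0.5."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    concordant = sum(1.0 if p > n else 0.5 if p == n else 0.0
                     for p in pos for n in neg)
    return concordant / (len(pos) * len(neg))

# Hypothetical local data: PSA-style scale scores (1-6) and observed
# failure-to-appear outcomes for ten released defendants.
scores   = [1, 2, 2, 3, 3, 4, 5, 5, 6, 6]
outcomes = [0, 0, 0, 0, 1, 0, 1, 1, 1, 1]
print(round(auc(scores, outcomes), 3))  # → 0.94
```

In practice a validation study would run this kind of analysis on thousands of local cases, separately for each outcome (FTA, NCA, NVCA), before concluding whether local re-weighting is needed.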

LJAF is keenly aware of the arguments against using data science and algorithms in criminal justice. First, some have argued that algorithms run the risk of perpetuating or even exacerbating existing biases within the system. Where discriminatory policing practices exist, a risk assessment might reflect that bias, because certain assessed individuals are more likely to have prior offenses as a result of over-policing. This is a serious concern, and all stakeholders must support and engage with efforts to eradicate bias from all phases of the criminal justice system. There is no one answer, but LJAF believes that relying on data—rather than a judge's instinct or interview questions that are themselves subjective and introduce bias—will move us toward a fairer system.

Second, some have argued that algorithms are necessarily imperfect, and that they cannot capture everything that is relevant about a defendant (including courtroom demeanor). LJAF agrees wholeheartedly, and believes that no one assessment should substitute for informed judgment by a human being. The point of the PSA and other algorithms is to help human judgment be better informed.

Above all, LJAF values listening, learning, and collaboration. Over time, research will show what works and what could work better. LJAF will always remain open to continuous improvement of the PSA and other thoughtful reform efforts.

Selected Research Articles

  • The Public Safety Assessment: A Re-Validation and Assessment of Predictive Utility and Differential Prediction by Race and Gender in Kentucky
    Authors: Matthew DeMichele, Peter Baumgartner, Michael Wenger, Kelle Barrick, Megan Comfort & Shilpi Misra
    • The authors found the PSA predicts well across all three outcomes—Failure to Appear (FTA), New Criminal Activity (NCA), and New Violent Criminal Activity (NVCA); its predictive abilities fall within what is considered good in the criminal justice field.
    • For NCA and NVCA, the PSA predicts equally well for black and white defendants.
    • For FTA, the PSA predicts differently for black and white defendants in that it assigns black defendants lower risk scores than white defendants who fail to appear at the same rate.
  • What do Criminal Justice Professionals Think About Risk Assessment at Pretrial?
    Authors: Matthew DeMichele, Peter Baumgartner, Kelle Barrick, Megan Comfort, Samuel Scaggs & Shilpi Misra
    • 79 percent of judges surveyed report that the PSA’s recommendation “always” or “often” informs their decision making.
    • 61 percent of judges, prosecutors, public defenders, and pretrial officers surveyed report they “often” agree with the PSA’s recommendation.
  • The Intuitive-Override Model: Nudging Judges Toward Pretrial Risk Assessment Instruments
    Authors: Matthew DeMichele, Megan Comfort, Shilpi Misra, Kelle Barrick & Peter Baumgartner
    • The authors’ findings indicate that judges perceive a tension when reconciling the actuarial aspect of the PSA and their experience-based inclination to learn about defendants’ lives as a way of assessing risk through determinations of culpability and blameworthiness.
    • Authors recommend the creation of researcher-judge feedback loops along with increased transparency of model development.
  • Assessing Risk Assessment in Action
    Author: Megan Stevenson
    • Stevenson’s paper shows that a 2011 law making risk assessment a mandatory part of the bail decision in Kentucky led to a significant change in bail setting practice, but only a small increase in pretrial release.
    • Risk assessment had no effect on racial disparities in pretrial detention once differing regional trends were accounted for.
    • Stevenson calls for additional research and experimentation to help risk assessment produce larger benefits.
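The differential-prediction finding in the Kentucky re-validation concerns calibration: at a given score level, do different groups fail to appear at the same observed rate? A minimal sketch of such a check, using made-up records (group labels, scores, and outcomes are all hypothetical):

```python
from collections import defaultdict

def fta_rate_by_score(records):
    """Observed FTA rate at each score level, per group.
    records: iterable of (group, score, fta) tuples, fta in {0, 1}.
    Returns {group: {score: rate}}."""
    counts = defaultdict(lambda: [0, 0])  # (group, score) -> [failures, total]
    for group, score, fta in records:
        counts[(group, score)][0] += fta
        counts[(group, score)][1] += 1
    rates = defaultdict(dict)
    for (group, score), (fail, total) in counts.items():
        rates[group][score] = fail / total
    return dict(rates)

# Hypothetical illustration: both groups fail at the same 50 percent rate,
# but group B sits at a lower score level, the pattern the Kentucky study
# describes for FTA.
records = [("A", 3, 1), ("A", 3, 0), ("B", 2, 1), ("B", 2, 0)]
print(fta_rate_by_score(records))  # → {'A': {3: 0.5}, 'B': {2: 0.5}}
```

A real audit would compare these rates across the full score range with confidence intervals; the point of the sketch is only that calibration by group is a concrete, testable property, not a matter of opinion.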