TIL that over 60% of jurisdictions in the US use algorithmic pretrial assessment tools to help determine who should be jailed before their court date and who should not.

Why YSK: In many places it can take years for a case to come to trial, and judges have to decide whether an accused person should be jailed in the interim. In recent years many jurisdictions have adopted algorithmic tools that essentially build a “risk profile” of the accused and assign it a numeric score. Threshold values then classify the score into categories (for example, “High Risk”). Some of these models use machine learning and some do not.
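To make the mechanics concrete, here's a minimal sketch of how a point-based tool like this might work. Every factor, weight, and cutoff below is invented for illustration and doesn't come from any real instrument:

```python
# Hypothetical point-based pretrial risk score; all factors, weights,
# and thresholds here are made up for illustration.

def risk_score(prior_arrests: int, failed_to_appear: int, age: int) -> int:
    score = 0
    score += 2 * prior_arrests     # each prior arrest adds points
    score += 3 * failed_to_appear  # missed court dates weigh heavily
    if age < 25:
        score += 2                 # youth treated as a risk factor
    return score

def classify(score: int) -> str:
    # Fixed cutoffs turn the raw score into the label a judge sees.
    if score >= 10:
        return "High Risk"
    if score >= 5:
        return "Moderate Risk"
    return "Low Risk"

print(classify(risk_score(prior_arrests=3, failed_to_appear=1, age=22)))
# -> High Risk
```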

The use of these models in pretrial assessments can have major consequences. As you might know, aggregate data cannot be used to make sound predictions about individuals. What aggregate data lets you do is model trends, correlations, and averages across groups.

We often use these profiles to help us make predictions about individuals. This is what “risk factors” in the medical sciences are usually doing. If 80% of people who are vegetarian and weigh 120 lbs develop bone density issues later in life, I might want to look at interventions to reduce that risk, even though I could fall into the 20% who have no issue.

While this seems acceptable when the intent is to protect my health, I would be quite unhappy if a similarly accurate model were used to determine whether the state considers me a risk prior to trial. If I fall into the equivalent of that 20%, I can be jailed before trial even though I pose no actual risk, and I may ultimately turn out to be innocent. Courts use similar tools to decide whether a child should be removed from a home prior to a CPS investigation.

Now add in the fact that these models are fed existing criminal justice data, which is often flawed and almost always contains racial bias.

One familiar example is that black people are arrested at higher rates than other groups due to well-documented racial bias in both policing policies and police themselves. A pretrial risk assessment tool that takes prior arrests into account will obviously perpetuate this bias, as the toy simulation below illustrates.
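Here's a toy simulation of that feedback loop. Suppose two groups offend at exactly the same underlying rate, but one is policed twice as heavily; any score built on prior arrests will then rate that group as riskier even though actual behavior is identical. All the numbers are invented purely to show the mechanism:

```python
import random

random.seed(0)

TRUE_OFFENSE_RATE = 0.10               # identical underlying behavior in both groups
POLICING_RATE = {"A": 0.2, "B": 0.4}   # group B is policed twice as heavily

def simulate_arrests(group: str, years: int = 10) -> int:
    """Count arrests: an offense only becomes an arrest if it is policed."""
    arrests = 0
    for _ in range(years):
        offended = random.random() < TRUE_OFFENSE_RATE
        caught = random.random() < POLICING_RATE[group]
        if offended and caught:
            arrests += 1
    return arrests

# Average "prior arrests" feature per group over many simulated people
for group in ("A", "B"):
    avg = sum(simulate_arrests(group) for _ in range(10_000)) / 10_000
    print(group, round(avg, 2))

# Group B ends up with roughly twice the prior-arrest count, so any
# score that weights prior arrests labels group B as higher risk
# despite identical true offense rates.
```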

Someone from the ACLU recently gave a presentation on this topic and I had never heard of it! Thought others might be interested as well.
