Are Risk Assessment Algorithms Fair, or Racist?

Above The Noise
31 May 2017 · 04:38

Summary

TL;DR: This script discusses how the US criminal justice system employs algorithms to forecast criminal behavior, aiming to reduce prison populations while ensuring public safety. Like Netflix's recommendation system, these algorithms analyze factors such as age, criminal history, and education to predict recidivism. Despite their potential to alleviate mass incarceration, the tools are controversial due to concerns about perpetuating racial biases and about inaccuracies in their predictions.

Takeaways

  • 🌐 The US has a high incarceration rate, with a disproportionate number of people of color imprisoned.
  • 💰 High prison rates are costly and contribute to racial inequalities.
  • 🔍 There's a push from both political parties to reduce prison populations while maintaining public safety.
  • 💻 Law enforcement is using computer algorithms to help with this effort.
  • 🎥 Algorithms are formulas used for various purposes, like predicting movie preferences on Netflix.
  • 🔮 Risk assessment tools in the justice system predict the likelihood of reoffending.
  • 👨‍⚖️ Parole boards consider various factors, including behavior and participation in prison programs, to decide parole.
  • 🧠 Human biases can influence parole board decisions, which risk assessment tools aim to mitigate.
  • 📊 These tools use data like age, criminal history, and education level to assess risk.
  • 📈 Well-validated risk assessment tools often outperform professional opinion alone in predicting behavior.
  • ⚖️ There are concerns that these tools could perpetuate racial biases present in the criminal justice system.
  • 🔍 There is a call for more oversight and testing to understand the impact of these tools on racial disparities.

Q & A

  • What is the current state of mass incarceration in the United States?

    -In 2013, one out of every 110 adults in America was incarcerated, and a disproportionate number of those imprisoned were people of color.

  • How does the racial disparity manifest in the U.S. prison system?

    -In 2010, black men were six times more likely to be behind bars than white men, indicating a significant racial disparity.

  • What is a parole hearing and why is it important?

    -A parole hearing is a proceeding in which a parole board, typically composed of former law enforcement professionals, decides whether an inmate is ready for release. It matters because the board must weigh the inmate's risk of reoffending against public safety.

  • What role do computer algorithms play in the criminal justice system?

    -Computer algorithms, specifically risk assessment tools, are used to predict the likelihood of an individual reoffending, aiming to assist parole boards in making more informed decisions.

  • How do risk assessment tools function?

    -These tools use data such as age, criminal history, education level, and personal circumstances to predict recidivism. They compare an individual's responses to a database of past offenders to assess risk.

  • What are some potential biases that could affect parole board decisions?

    -Parole boards are made up of humans who may have personal biases that could influence their judgment on whether an individual will reoffend.

  • How do risk assessment tools compare to professional opinion in predicting behavior?

    -Research shows that well-validated risk assessment tools often outperform professional opinion alone in predicting recidivism.

  • What are the potential benefits of using risk assessment tools in the criminal justice system?

    -They could help wind down mass incarceration by keeping low-risk offenders out of prison or releasing them earlier.

  • What are the controversies surrounding the use of risk assessment tools?

    -There are concerns that these tools might unintentionally perpetuate racial biases already present in the system, since they can be influenced by factors such as arrest rates, which disproportionately affect certain racial groups.

  • Why is it important to have oversight and testing for risk assessment tools?

    -Oversight and testing are crucial to ensure that the algorithms are fair, unbiased, and effective, and to understand their impact on racial disparities within the criminal justice system.

  • What are some of the questions that the script raises for further discussion?

    -The script raises questions about the accuracy and fairness of predictive algorithms, the potential for racial bias, and the need for transparency and regulation in their use within the criminal justice system.

Outlines

00:00

🔍 Criminal Justice and Math

This paragraph discusses the application of mathematical algorithms in the criminal justice system to predict criminal behavior. It highlights the issue of mass incarceration in America, particularly the disproportionate imprisonment of people of color. The paragraph explains how risk assessment tools, analogous to Netflix's recommendation algorithms, are used to predict the likelihood of reoffending. These tools consider various factors such as age, criminal history, education, and substance abuse to make their predictions. The goal is to reduce the prison population while ensuring public safety, but there are concerns about potential racial bias and the imperfections of these predictive tools.

Keywords

💡Criminal Justice System

The criminal justice system refers to the institutions and processes involved in the administration of the law, particularly in relation to criminal offenses. It encompasses the police, courts, and corrections. In the video, it is discussed in the context of using math and algorithms to predict criminal behavior, aiming to reduce prison populations while ensuring public safety.

💡Mass Incarceration

Mass incarceration is a term used to describe the phenomenon of imprisoning large numbers of people, often disproportionately affecting certain demographics. The video uses the term to highlight the high prison rates in America, particularly among black men, and the push to reduce these rates.

💡Computer Algorithms

Computer algorithms are sets of rules or formulas used by computers to perform tasks. In the video, algorithms are discussed as tools used in the criminal justice system to predict criminal behavior, similar to how Netflix uses them to predict movie preferences.
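
As a loose illustration of that analogy, here is a minimal Python sketch of "match your habits to similar users" recommendation. All users, titles, and the similarity measure are invented for the example; Netflix's actual system is far more sophisticated.

```python
# A minimal sketch of the "match you to similar users" idea the video
# attributes to Netflix. All users and titles here are invented.

def similarity(a: set, b: set) -> float:
    """Jaccard similarity: overlap between two watch histories."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(my_history: set, others: dict, top_n: int = 2) -> list:
    """Suggest titles watched by the users most similar to me."""
    scores = {}
    for user, history in others.items():
        sim = similarity(my_history, history)
        for title in history - my_history:   # only titles I haven't seen
            scores[title] = scores.get(title, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

me = {"The Wire", "True Detective"}
others = {
    "user_a": {"The Wire", "True Detective", "Mindhunter"},
    "user_b": {"The Wire", "Mindhunter", "Paddington"},
    "user_c": {"Paddington"},
}
print(recommend(me, others))  # Mindhunter ranks above Paddington
```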

💡Risk Assessment Tools

Risk assessment tools are algorithms used in the criminal justice system to predict the likelihood of a criminal reoffending. These tools analyze data such as age, criminal history, and education level to make predictions. The video explains how these tools are used in parole hearings to assist in deciding whether to release inmates.
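
The mechanism described here can be sketched in a few lines of Python: score a person by the tracked reoffense rate of past offenders with a matching risk-factor profile. The factors and records below are invented; real tools use many more factors and statistically validated weightings.

```python
# A toy illustration of "compare your profile to past offenders".
# Factors, groupings, and records are all invented for the example.

from dataclasses import dataclass

@dataclass
class Record:
    age_group: str        # e.g. "under_25", "25_plus"
    prior_offenses: int
    reoffended: bool      # tracked outcome for past offenders

def risk_score(person: Record, past: list) -> float:
    """Share of similar past offenders who went on to reoffend."""
    similar = [r for r in past
               if r.age_group == person.age_group
               and r.prior_offenses == person.prior_offenses]
    if not similar:
        return 0.5        # no comparable cases: undetermined
    return sum(r.reoffended for r in similar) / len(similar)

past = [
    Record("under_25", 2, True), Record("under_25", 2, True),
    Record("under_25", 2, False), Record("25_plus", 0, False),
]
applicant = Record("under_25", 2, reoffended=False)  # outcome unknown, unused
print(f"estimated risk: {risk_score(applicant, past):.2f}")  # 0.67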

💡Parole Hearing

A parole hearing is a legal proceeding where a parole board evaluates an inmate's suitability for release before the completion of their sentence. The video describes how parole boards consider various factors, including behavior in prison and participation in programs, and how risk assessment tools can aid in this decision-making process.

💡Bias

Bias refers to a predisposition for or against something that can influence judgment. In the video, the human biases of parole board members are contrasted with risk assessment tools, which aim to reduce the influence of personal bias in predicting criminal behavior.

💡Racial Disparities

Racial disparities refer to the unequal treatment or outcomes experienced by different racial groups. The video discusses concerns that risk assessment tools might inadvertently perpetuate existing racial biases in the criminal justice system, such as higher arrest rates for black men.
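
One way such concerns can be tested is to compare error rates across groups. Below is a hypothetical audit sketch in Python that computes, per group, how often people who did not reoffend were nonetheless labeled high risk; the data and labels are made up for illustration.

```python
# The video notes that most tools haven't been tested for racial bias.
# A basic audit compares false positive rates across groups: how often
# people who did NOT reoffend were still labeled high risk. Data invented.

def false_positive_rate(rows: list) -> float:
    """Among people who did not reoffend, the share labeled high risk."""
    non_reoffenders = [r for r in rows if not r["reoffended"]]
    flagged = [r for r in non_reoffenders if r["labeled_high_risk"]]
    return len(flagged) / len(non_reoffenders) if non_reoffenders else 0.0

rows = [
    {"group": "A", "labeled_high_risk": True,  "reoffended": False},
    {"group": "A", "labeled_high_risk": False, "reoffended": False},
    {"group": "B", "labeled_high_risk": False, "reoffended": False},
    {"group": "B", "labeled_high_risk": False, "reoffended": True},
]
for group in ("A", "B"):
    subset = [r for r in rows if r["group"] == group]
    print(group, f"FPR = {false_positive_rate(subset):.2f}")
# A FPR = 0.50 vs B FPR = 0.00: a gap like this would warrant scrutiny.
```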

💡Predictive Accuracy

Predictive accuracy is the measure of how well a model or tool can predict future outcomes. The video notes that while risk assessment tools are more accurate than professional opinion alone, they are not perfect and cannot predict with 100% certainty whether someone will reoffend.
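
The comparison behind that claim can be illustrated with a toy scoring sketch. The predictions and outcomes below are invented, and real validation studies use larger samples and more robust measures (such as AUC) rather than raw accuracy.

```python
# Scoring a tool against expert judgment on the same cases.
# 1 = reoffended, 0 = did not. All values below are invented.

def accuracy(predicted: list, actual: list) -> float:
    """Fraction of cases where the prediction matched the outcome."""
    hits = sum(p == a for p, a in zip(predicted, actual))
    return hits / len(actual)

actual       = [1, 0, 0, 1, 0, 1, 0, 0]
tool_preds   = [1, 0, 0, 1, 1, 1, 0, 0]   # one miss
expert_preds = [1, 1, 0, 0, 1, 1, 0, 0]   # three misses

print(f"tool:   {accuracy(tool_preds, actual):.2f}")    # 0.88
print(f"expert: {accuracy(expert_preds, actual):.2f}")  # 0.62
```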

💡Oversight

Oversight refers to the process of monitoring or supervising to ensure proper conduct or adherence to rules. In the context of the video, it discusses the need for greater oversight and testing of risk assessment tools to ensure they do not contribute to racial bias.

💡Data Sets

Data sets are collections of data, often used for analysis or to train algorithms. The video implies that the effectiveness and fairness of risk assessment tools depend on the quality and representativeness of the data sets they are based on.

💡Public Safety

Public safety refers to the protection of the health, welfare, and security of the public. The video discusses the balance between reducing prison populations and ensuring public safety, which is a key consideration in the use of risk assessment tools.

Highlights

The criminal justice system uses math to predict criminal behavior.

In 2013, one out of 110 adults in America was locked up.

A disproportionate number of prisoners are people of color.

In 2010, black men were six times more likely to be behind bars than white men.

High prison rates are referred to as mass incarceration.

Efforts from both Democrats and Republicans aim to reduce the prison population.

Computer algorithms are used to predict if a person will recommit a crime.

Risk assessment tools are algorithms used in the justice system.

Parole boards decide if a convict is ready to be released.

Parole boards are typically made up of former law enforcement professionals.

Risk assessment tools use data to predict the likelihood of reoffending.

These tools consider factors like age, criminal record, education level, and substance abuse.

Controversy exists around risk assessment tools and their potential to contribute to racial biases.

Research shows risk assessment tools predict behavior better than professional opinion alone.

Risk assessment tools could help reduce mass incarceration by keeping low-risk offenders out of prison.

There's a push for greater oversight and more testing of risk assessment tools.

Predictive algorithms are only as good as the people writing them and the data they're based on.

Transcripts

[00:03] - What's worse than math? Prison. Today we're talking about how the criminal justice system uses math to predict criminal behavior.

[00:13] (upbeat music)

[00:17] First off, America has a ton of people in prison. In 2013, one out of 110 adults was locked up, and a disproportionate amount of those are people of color. In 2010, black men were six times more likely to be behind bars than white men. If you've heard the term mass incarceration, it's referring to these high prison rates. And keeping so many people in prison costs a lot of money and contributes to racial inequalities in America, so there's a big push from both Democrats and Republicans to find ways to reduce the prison population while making sure the public stays safe. And some areas of law enforcement are turning to computer algorithms to help them do that.

[00:53] Computer algorithms are all around us. They're basically formulas used to do stuff, like predict what movies you like, and that's exactly what Netflix does. Netflix uses algorithms to track what you watch. They match your viewing habits with a database of users and make predictions on what movies and TV shows you'll like, based on what people who've watched similar things to you have liked. One of these is not like the other. How did Paddington get in there?

[01:16] But instead of predicting what you wanna watch, the justice system uses algorithms to predict if you'll recommit a crime. These algorithms are called risk assessment tools, and here's how they're used.

[01:26] Okay, let's say you've been convicted of a crime. You're serving your time in prison and now you're up for parole. You would have what we call a parole hearing, where a group of people, a parole board, decides if you'll get out. And parole boards are typically made up of former law enforcement professionals like police, prosecutors, and prison guards. At the hearing, they ask you questions to try to figure out if you understand why you committed the crime and if you have any strategies to prevent yourself from doing it again. They also consider things like your behavior in prison and if you've gone through any prison programs like AA or anger management, and they use all this information to decide if you're ready to be released. Basically, they wanna make sure that if you get out, you're not gonna go and commit another crime.

[02:03] But parole boards are made up of humans, and we all come with bias. And a person's own biases might influence if he or she thinks you'll go on to commit another crime. And this is where risk assessment tools come in. Essentially, the tools use data to predict how likely you are to recommit a crime. And they make those predictions a lot like Netflix does. But instead of tracking what you're watching, they're looking for risk factors, and they're filled out a lot like questionnaires. They often take into account things like your age, your past criminal record, your education level, where you live, and if you've ever had any drug or alcohol problem.

[02:32] Some of these tools go deep. Here are some actual questions from one of them. How much do you agree or disagree with these statements: a hungry person has the right to steal; some people don't deserve any respect and should be treated like animals; if someone insults my family, friends, or group, they are asking for trouble. I don't even know how I'd answer some of these questions.

[02:51] The algorithm then compares your answers to a database of past offenders. Those past offenders have been tracked over time to see if they went on to commit another crime. So if a lot of people who share your risk factors have committed another crime, you'd be considered high risk. And parole boards can use this info to help decide if they'll let you out. For example, they'd be more likely to release a low-risk offender.

[03:10] And research shows that in a variety of settings, well-validated risk assessment tools do a better job of predicting behavior than professional opinion alone. This is encouraging because, depending on how they're used, they could help wind down mass incarceration. For example, we could keep low-risk offenders out of prison or let them out of jail earlier.

[03:29] But risk assessment tools aren't without controversy. Just like how you're not gonna like 100% of your Netflix recommendations, these tools can't predict with 100% certainty if you'll commit another crime. And while race is not explicitly included as a risk factor, some people fear that the tools could contribute to racial biases already seen in the criminal justice system. For example, police tend to arrest black men for marijuana possession at a much higher rate than white men, even though the two groups use it at the same rate. So if arrest rates are considered as a risk factor, it's possible the tools could disproportionately affect black people. And for the most part, a lot of these tools haven't been studied to see if they show racial bias.

[04:03] At the end of the day, no predictive algorithm is perfect. They're only as good as the people writing them and the data sets that they're based on. And it's not clear yet what impact these tools have on racial disparities in our criminal justice system. This is why so many people want greater oversight and more testing.

[04:18] And now we wanna hear from you. What questions does this raise for you? Throw some out there and we'll see if we can answer them.
