Are Risk Assessment Algorithms Fair, or Racist?
Summary
TL;DR: This script discusses how the US criminal justice system employs algorithms to forecast criminal behavior, aiming to reduce prison populations while ensuring public safety. These algorithms, akin to Netflix's recommendation system, analyze factors like age, criminal history, and education to predict recidivism. Despite their potential to alleviate mass incarceration, the tools are controversial due to concerns about perpetuating racial biases and inaccurate predictions.
Takeaways
- The US has a high incarceration rate, with a disproportionate number of people of color imprisoned.
- High prison rates are costly and contribute to racial inequalities.
- There's a push from both political parties to reduce prison populations while maintaining public safety.
- Law enforcement is using computer algorithms to help with this effort.
- Algorithms are formulas used for various purposes, like predicting movie preferences on Netflix.
- Risk assessment tools in the justice system predict the likelihood of reoffending.
- Parole boards consider various factors, including behavior and participation in prison programs, to decide parole.
- Human biases can influence parole board decisions, which risk assessment tools aim to mitigate.
- These tools use data like age, criminal history, and education level to assess risk.
- Well-validated risk assessment tools often outperform professional opinion alone in predicting behavior.
- There are concerns that these tools could perpetuate racial biases present in the criminal justice system.
- There is a call for more oversight and testing to understand the impact of these tools on racial disparities.
Q & A
What is the current state of mass incarceration in the United States?
-In 2013, one out of every 110 adults in America was incarcerated, with a disproportionate number of people of color being imprisoned.
How does the racial disparity manifest in the U.S. prison system?
-In 2010, black men were six times more likely to be behind bars than white men, indicating a significant racial disparity.
What is a parole hearing and why is it important?
-A parole hearing is a process where a parole board, typically composed of former law enforcement professionals, decides if an inmate is ready for release. It's important for determining recidivism risk and public safety.
What role do computer algorithms play in the criminal justice system?
-Computer algorithms, specifically risk assessment tools, are used to predict the likelihood of an individual reoffending, aiming to assist parole boards in making more informed decisions.
How do risk assessment tools function?
-These tools use data such as age, criminal history, education level, and personal circumstances to predict recidivism. They compare an individual's responses to a database of past offenders to assess risk.
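The matching process described in that answer can be sketched as a toy nearest-neighbor lookup. Everything below is illustrative: the risk factors, the tiny history table, and the three-neighbor cutoff are invented for this example and don't reflect any real tool's questionnaire or data.

```python
# Toy sketch (not any real tool's method): score risk as the observed
# reoffense rate among the past offenders most similar to this person.

def risk_score(person, past_offenders):
    """person: dict of risk factors.
    past_offenders: list of (risk_factors_dict, reoffended_bool) records."""
    # Crude similarity: count how many risk factors match exactly.
    def similarity(a, b):
        return sum(a[k] == b.get(k) for k in a)

    # Rank historical records by similarity to this person.
    ranked = sorted(past_offenders,
                    key=lambda rec: similarity(person, rec[0]),
                    reverse=True)
    neighbors = ranked[:3]  # small k, just for this toy example

    # Risk = fraction of the most similar past offenders who reoffended.
    return sum(reoffended for _, reoffended in neighbors) / len(neighbors)

# Entirely made-up records for illustration.
history = [
    ({"age_under_25": True,  "prior_felony": True,  "no_diploma": True},  True),
    ({"age_under_25": True,  "prior_felony": False, "no_diploma": True},  True),
    ({"age_under_25": False, "prior_felony": False, "no_diploma": False}, False),
    ({"age_under_25": False, "prior_felony": True,  "no_diploma": False}, False),
]

print(risk_score({"age_under_25": True, "prior_felony": True,
                  "no_diploma": True}, history))
```

A person sharing factors with mostly reoffending records gets a high score, one resembling the non-reoffending records gets a low score; a parole board would then see that score, not the underlying comparison.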
What are some potential biases that could affect parole board decisions?
-Parole boards are made up of humans who may have personal biases that could influence their judgment on whether an individual will reoffend.
How do risk assessment tools compare to professional opinion in predicting behavior?
-Research shows that well-validated risk assessment tools often outperform professional opinion alone in predicting recidivism.
What are the potential benefits of using risk assessment tools in the criminal justice system?
-They could help wind down mass incarceration by keeping low-risk offenders out of prison or releasing them earlier.
What are the controversies surrounding the use of risk assessment tools?
-There are concerns that these tools might unintentionally perpetuate racial biases already present in the system, as they could be influenced by factors like arrest rates which disproportionately affect certain racial groups.
Why is it important to have oversight and testing for risk assessment tools?
-Oversight and testing are crucial to ensure that the algorithms are fair, unbiased, and effective, and to understand their impact on racial disparities within the criminal justice system.
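One concrete form such testing could take is an error-rate audit: checking how often each group is flagged "high risk" despite not reoffending (a false positive). A minimal sketch, with entirely made-up group names and data:

```python
# Hypothetical bias audit (illustrative data only): compare false positive
# rates across groups. A false positive here is a person the tool labeled
# high risk who did not actually reoffend.

def false_positive_rate(records):
    """records: list of (predicted_high_risk, actually_reoffended) booleans."""
    # Restrict to people who did NOT reoffend...
    preds_for_non_reoffenders = [pred for pred, actual in records if not actual]
    if not preds_for_non_reoffenders:
        return 0.0
    # ...and measure how many of them were still flagged high risk.
    return sum(preds_for_non_reoffenders) / len(preds_for_non_reoffenders)

# Fabricated audit data: (predicted_high_risk, actually_reoffended) per person.
audit = {
    "group_a": [(True, False), (True, True), (False, False), (False, False)],
    "group_b": [(True, False), (True, False), (False, False), (True, True)],
}

for group, records in audit.items():
    print(group, false_positive_rate(records))
```

If the rates diverge sharply between groups, the tool is making its mistakes unevenly, which is exactly the kind of disparity oversight is meant to catch.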
What are some of the questions that the script raises for further discussion?
-The script raises questions about the accuracy and fairness of predictive algorithms, the potential for racial bias, and the need for transparency and regulation in their use within the criminal justice system.
Outlines
Criminal Justice and Math
This paragraph discusses the application of mathematical algorithms in the criminal justice system to predict criminal behavior. It highlights the issue of mass incarceration in America, particularly the disproportionate imprisonment of people of color. The paragraph explains how risk assessment tools, analogous to Netflix's recommendation algorithms, are used to predict the likelihood of reoffending. These tools consider various factors such as age, criminal history, education, and substance abuse to make their predictions. The goal is to reduce the prison population while ensuring public safety, but there are concerns about potential racial bias and the imperfections of these predictive tools.
Keywords
Criminal Justice System
Mass Incarceration
Computer Algorithms
Risk Assessment Tools
Parole Hearing
Bias
Racial Disparities
Predictive Accuracy
Oversight
Data Sets
Public Safety
Highlights
The criminal justice system uses math to predict criminal behavior.
In 2013, one out of 110 adults was locked up in America.
A disproportionate number of prisoners are people of color.
In 2010, black men were six times more likely to be behind bars than white men.
High prison rates are referred to as mass incarceration.
Efforts from both Democrats and Republicans aim to reduce the prison population.
Computer algorithms are used to predict if a person will recommit a crime.
Risk assessment tools are algorithms used in the justice system.
Parole boards decide if a convict is ready to be released.
Parole boards are made up of former law enforcement professionals.
Risk assessment tools use data to predict the likelihood of reoffending.
These tools consider factors like age, criminal record, education level, and substance abuse.
Controversy exists around risk assessment tools and their potential to contribute to racial biases.
Research shows risk assessment tools predict behavior better than professional opinion alone.
Risk assessment tools could help reduce mass incarceration by keeping low-risk offenders out of prison.
There's a push for greater oversight and more testing of risk assessment tools.
Predictive algorithms are only as good as the people writing them and the data they're based on.
Transcripts
- What's worse than math?
Prison.
Today we're talking about how the criminal justice system
uses math to predict criminal behavior.
(upbeat music)
First off, America has a ton of people in prison.
In 2013, one out of 110 adults were locked up
and a disproportionate amount of those are people of color.
In 2010, black men were six times more likely
to be behind bars than white men.
If you've heard the term mass incarceration,
it's referring to these high prison rates.
And keeping so many people in prison costs a lot of money
and contributes to racial inequalities in America
so there's a big push from both Democrats
and Republicans to find ways to reduce the prison population
while making sure the public stays safe.
And some areas of law enforcement are turning
to computer algorithms to help them do that.
Computer algorithms are all around us.
They're basically formulas used to do stuff,
like predict what movies you like,
and that's exactly what Netflix does.
Netflix uses algorithms to track what you watch.
They match your viewing habits with a database of users
and make predictions on what movies and TV shows
you'll like based on what people who've watched
similar things to you have liked.
One of these is not like the other.
How did Paddington get in there?
But instead of predicting what you wanna watch,
the justice system uses algorithms to predict
if you'll recommit a crime.
These algorithms are called risk assessment tools
and here's how they're used.
Okay, let's say you've been convicted of a crime.
You're serving your time in prison
and now you're up for parole.
You would have what we call a parole hearing
where a group of people, a parole board,
decides if you'll get out, and parole boards are typically
made up of former law enforcement professionals
like police, prosecutors, and prison guards.
At the hearing, they ask you questions to try figure out
if you understand why you committed the crime
and if you have any strategies
to prevent yourself from doing it again.
They also consider things like your behavior in prison
and if you've gone through any prison programs
like AA or anger management and they use all this
information to decide if you're ready to be released.
Basically, they wanna make sure that if you get out,
you're not gonna go and commit another crime.
But parole boards are made up of humans
and we all come with bias.
And a person's own biases might influence
if he or she thinks you'll go on to commit another crime.
And this is where risk assessment tools come in.
Essentially, the tools use data to predict
how likely you are to recommit a crime.
And they make those predictions a lot like Netflix does.
But instead of tracking what you're watching,
they're looking for risk factors and are filled out
a lot like questionnaires.
They often take into account things like your age,
your past criminal record, your education level,
where you live, and if you've ever had
any drug or alcohol problem.
Some of these tools go deep.
Here are some actual questions from one of them.
How much do you agree or disagree with these statements:
a hungry person has the right to steal;
some people don't deserve any respect
and should be treated like animals;
if someone insults my family, friends, or group,
they are asking for trouble.
I don't even know how I'd answer some of these questions.
The algorithm then compares your answers
to a database of past offenders.
Those past offenders have been tracked over time
to see if they went on to commit another crime.
So if a lot of people who share your risk factors
have committed another crime, you'd be considered high risk.
And parole boards can use this info
to help decide if they'll let you out.
For example, they'd be more likely
to release a low risk offender.
And compared to professional opinion alone,
research shows that in a variety of settings,
well validated risk assessment tools do a better job
of predicting behavior than professional opinion alone.
This is encouraging because depending on how they're used,
they could help wind down mass incarceration.
For example, we could keep low risk offenders out of prison
or let them out of jail earlier.
But risk assessment tools aren't without controversy.
Just like how you're not gonna like 100% of your Netflix
recommendations, these tools can't predict
with 100% certainty if you'll commit another crime.
And while race is not explicitly included as a risk factor,
some people fear that tools could contribute to racial
biases already seen in the criminal justice system.
For example, police tend to arrest black men for marijuana
possession at a much higher rate than white men.
Even though the two groups use it the same amount.
So if arrest rates are considered as a risk factor,
it's possible the tools could disproportionately
affect black people.
And for the most part, a lot of these tools
haven't been studied to see if they show racial bias.
At the end of the day, no predictive algorithm is perfect.
They're only as good as the people writing them
and the data sets that they're based on.
And it's not clear yet what impact these tools have
on racial disparities in our criminal justice system.
This is why so many people
want greater oversight and more testing.
And now we wanna hear from you.
What questions does this raise for you?
Throw some out there and we'll see if we can answer them.