Yeshimabeit Milner warns that the true existential threat is big data in the hands of a few (4/8)

OxfordUnion
27 Nov 2023 · 13:27

Summary

TLDR: Yeshi Milner, founder of Data for Black Lives, argues against the motion that AI is an existential threat, emphasizing that the real danger lies in the concentration of data power in a few hands. She discusses the historical context of eugenics and its influence on data science, highlighting the disproportionate impact of algorithms like FICO on marginalized communities.

Takeaways

  • 🌟 The speaker, Yeshi Milner, is the founder and executive director of Data for Black Lives, a movement of scientists and activists aiming to use data for social change.
  • 🔍 Yeshi argues against the idea that AI is an existential threat, instead focusing on the concentration of power in the form of data in the hands of a few as the real threat.
  • 📚 The concept of 'existential threat' is defined in the context of AI, drawing on philosophical traditions and the potential for AI to cause human extinction or drastically curtail humanity's potential.
  • 📈 The speaker criticizes the use of AI and data science in ways that perpetuate harmful stereotypes and biases, particularly against black communities.
  • 💼 Yeshi highlights the role of credit scores, such as FICO, in disproportionately affecting black communities, noting that black people are three times more likely to have a low score despite similar educational and financial backgrounds.
  • 🏦 The speaker calls for the abolition of FICO and similar systems, which she sees as perpetuating racial disparities and reinforcing systemic racism through algorithmic decision-making.
  • 🌐 The impact of AI and data-driven systems is not evenly distributed, with black and minority communities bearing the brunt of negative consequences, such as reduced access to housing, education, and employment.
  • 🌱 Yeshi advocates for reclaiming data as a tool for protest, accountability, and collective action, emphasizing the need to change who controls and benefits from data technologies.
  • 👥 The speaker invites the audience to join the movement to change not just the debate around AI and data, but also to ensure that diverse voices are included in the decision-making process.
  • 🏛️ The speaker warns against the dangers of necropolitics, where social and political power dictates who lives and who dies, and how AI and data can be used to enforce oppressive systems.

Q & A

  • Who is Yeshi Milner and what is her role?

    -Yeshi Milner is the founder and executive director of Data for Black Lives, a movement of scientists and activists working to make data a tool for social change.

  • What is the main argument Yeshi Milner is making against the motion?

    -Yeshi Milner argues against the motion that AI is an existential threat, stating that the true existential threat is the concentration of power in the form of data in the hands of a few.

  • How does Yeshi Milner define 'existential threat' in the context of AI?

    -She adopts Nick Bostrom's definition of an existential threat: an event that could cause human extinction or permanently and drastically curtail humanity's potential. She then situates that definition within a philosophical tradition tied to eugenics and transhumanist concepts.

  • What is the paperclip experiment and why is it relevant to the discussion?

    -The paperclip experiment is a thought experiment authored by Nick Bostrom, one of the founders of the existential risk movement. It is relevant because it illustrates the potential dangers of AI if its optimization processes are misaligned with human values.

  • What is the connection between eugenics and the development of AI?

    -Yeshi Milner suggests that the development of AI and data science has been influenced by eugenic ideologies, which have historically been used to justify discrimination and oppression based on race.

  • What incident involving Nick Bostrom is mentioned in the script?

    -The script mentions an incident where Nick Bostrom admitted to writing a racist email, expressing beliefs about racial differences in intelligence, which Yeshi Milner criticizes as a reflection of harmful biases in AI development.

  • How does Yeshi Milner describe her personal experience with the concept of 'risk'?

    -She first encountered the concept of 'risk' at nine years old, in her elementary school's computer lab, when a classmate enrolled in an after-school program for 'at-risk' youth told her, 'I'm at risk.' She notes that this label became a self-fulfilling prophecy for many of her peers.

  • What is the FICO score and why is it problematic according to Yeshi Milner?

    -The FICO score is a credit score algorithm used in the United States to determine an individual's creditworthiness. Yeshi Milner argues that it is problematic because it reflects and reinforces racial biases, with black people being three times more likely to have a low score.

  • What is the term 'necropolitics' and how does it relate to the use of AI?

    -Necropolitics is a term used to describe the use of social and political power to dictate how some people may live and how some people must die. Yeshi Milner relates it to the use of AI by suggesting that AI can be used to enforce and justify harmful societal structures and beliefs.

  • What is the main call to action in Yeshi Milner's speech?

    -The main call to action in Yeshi Milner's speech is to abolish big data and dismantle the structures that put the power of data into the hands of a few, rather than focusing on AI as an existential threat.

  • How does Yeshi Milner suggest we should be using AI and data?

    -She suggests that we should be using AI and data as tools to promote equity, fairness, and justice, and to put these powerful tools into the hands of people who need them the most.

Outlines

00:00

🌐 The Misuse of AI and Data in Society

Yeshi Milner, founder and executive director of Data for Black Lives, begins by expressing gratitude for the opportunity to speak and highlighting the importance of young women in politics. She argues against the notion that AI is an existential threat, instead identifying the concentration of data power in the hands of a few as the real threat. She defines 'existential threat' in the context of AI and criticizes the philosophical tradition that underpins it, including the resurgence of eugenics and its influence on data science and scientific management practices. Milner also addresses racism in the field of existential risk, specifically mentioning Nick Bostrom's controversial views and their implications for how data is used and perceived in society.

05:02

💼 The Impact of Algorithms on Social Inequality

Milner delves into the personal impact of societal stereotypes and how they are ingrained in the public imagination and encoded in AI technologies. She uses the example of credit scores, specifically the FICO algorithm, to illustrate how these systems can perpetuate racial disparities. She points out that black people are three times more likely to have a low credit score, which affects their ability to rent housing, access education, and find employment. Milner also discusses the broader economic disparities faced by black and minority ethnic communities in the UK and the US, emphasizing the need to address the power structures that allow such algorithms to disproportionately harm certain groups.

10:03

🚀 Reclaiming Data for Social Change

In the final section, Milner shifts the focus from the fear of AI itself to the misuse of AI and data. She argues that the real concern is how AI is used to reinforce existing biases and power structures. She introduces the concept of necropolitics, the use of power to determine who lives and who dies, and shows how it is reflected in the deployment of AI technologies. Milner calls for a pause not in the development of AI, but in the structures that allow the misuse of data. She encourages the audience to join the movement to reclaim data as a tool for protest, accountability, and collective action, and to change the narrative around who controls and benefits from these technologies.

Keywords

💡Existential threat

In the context of the video, 'existential threat' refers to a risk that could lead to human extinction or significantly curtail humanity's potential. The speaker argues against the notion that AI itself is an existential threat, instead focusing on the misuse of data and power concentration as the real dangers. The term is used to highlight the severity of the issues discussed, drawing from philosophical traditions and contrasting with global catastrophic risks that are recoverable.

💡Data for Black Lives

Data for Black Lives is a movement mentioned in the script, founded by the speaker, Yeshi Milner. It consists of scientists and activists aiming to use data as a tool for social change rather than a weapon of oppression. The movement is relevant to the video's theme as it exemplifies the potential for data to be used in ways that promote equity and justice, countering the misuse of data that the speaker criticizes.

💡Concentration of power

The concentration of power is a central theme in the video, specifically in the context of data control. The speaker argues that the real existential threat is not AI but the few entities that control vast amounts of data. This power imbalance can lead to misuse and exploitation, reinforcing existing inequalities and injustices, as opposed to the more evenly distributed benefits that could be achieved with more equitable data management.

💡Eugenics

Eugenics is a historical movement that influenced the speaker's perspective on the misuse of data. It involved promoting certain genetic traits and discouraging others, often based on racial or class prejudices. The speaker connects eugenics to modern data science practices, suggesting that historical biases can be encoded into today's algorithms, leading to discriminatory outcomes.

💡FICO score

The FICO score is an example used in the video to illustrate the impact of data algorithms on individuals' lives. It is a credit scoring system that determines access to financial services like loans and housing. The speaker criticizes the FICO system for perpetuating racial disparities, with black individuals being more likely to receive lower scores, thus facing greater barriers to financial stability.

💡Algorithm

An algorithm, in the context of the video, is a set of instructions or a process used to solve a problem. The speaker discusses how algorithms, particularly those used in AI, can be biased if not carefully designed. Algorithms are not inherently good or bad but can reinforce existing power structures and inequalities if they are based on flawed or biased data.
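The talk's recipe analogy can be made concrete with a minimal illustrative sketch (not from the talk; the recipes, scores, and `pick` helper are all hypothetical): the same inputs produce different results depending on the objective function the algorithm optimizes.

```python
# Illustrative sketch of the talk's point: an algorithm's output depends on
# the objective function it optimizes, not just on its inputs.
# All names and scores here are hypothetical.

recipes = [
    {"name": "salad", "health": 9, "taste": 5},
    {"name": "cake", "health": 2, "taste": 9},
]

def pick(candidates, objective):
    """Return the candidate that scores highest under the given objective."""
    return max(candidates, key=objective)

# Same inputs, different objectives, different outcomes.
healthiest = pick(recipes, lambda r: r["health"])  # selects "salad"
tastiest = pick(recipes, lambda r: r["taste"])     # selects "cake"
print(healthiest["name"], tastiest["name"])
```

The point the speaker makes about credit scoring follows the same shape: what an algorithmic system returns is determined by what its designers chose to optimize for.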

💡Necropolitics

Necropolitics is a concept introduced by the speaker to describe the use of social and political power to control life and death. It is relevant to the video's theme as it connects the misuse of data and AI to the creation of conditions that devalue certain lives, effectively treating them as less than human.

💡Risk

The term 'risk' is used by the speaker to discuss how certain groups, particularly black individuals, are disproportionately labeled as 'at risk.' This labeling can become a self-fulfilling prophecy, leading to negative outcomes like early pregnancy, prison, or death. The speaker connects this concept to the broader misuse of data and its impact on societal perceptions and policies.

💡Credit scores

Credit scores are a specific type of algorithm discussed in the video that can have significant impacts on individuals' access to financial services. The speaker argues that these scores, which are often determined by factors like debt and payment history, can be racially biased, leading to systemic financial disadvantages for black communities.

💡Techno-feudalism

Techno-feudalism is a term used to describe a potential future where power is concentrated in the hands of a few entities, particularly those controlling AI and data. The speaker warns against this scenario, suggesting that it could lead to a society where the benefits of technology are not equitably distributed, and the majority suffer under the control of a technological elite.

💡Data as a tool

The speaker emphasizes the potential for data to be used as a tool for social change, rather than a weapon of oppression. This concept is central to the video's message, advocating for the equitable use of data to promote fairness and justice. The speaker's work with Data for Black Lives is an example of this approach, aiming to reclaim data for the benefit of marginalized communities.

Highlights

Thanking Madame President, Chief of Staff Davis, and Aaliyah for the invite to speak at a critical juncture in world history.

Introduction of Yeshi Milner, founder and executive director of Data for Black Lives, a movement to use data for social change.

Arguing against the motion that AI is an existential threat, focusing instead on the concentration of power in data.

Defining 'existential threat' in the context of AI, grounded in the resurgence of a distinct philosophical tradition.

Discussion of the paperclip experiment and its relation to the existential risk concept.

Explaining that the term 'existential threat' is used to make the threat appear more dire.

Linking the concept of existential risk to transhumanist concepts like dysgenic pressures and eugenics.

Historical context of eugenics and its influence on thinking in Europe and North America.

Addressing the impact of eugenics on data science and scientific management practices.

Personal story of understanding 'risk' from a young age and its implications for at-risk youth.

Experience of navigating a world where being black is often seen as a problem or threat.

Example of credit scores and how they are used as a powerful algorithm to control access to resources.

Statistic on black people being three times more likely to have a low FICO score, despite controlling for education, debt, and income.

Discussion of the impact of credit scores on housing, education, and employment opportunities.

Highlighting the racial disparities in wealth and the role of algorithms in perpetuating these disparities.

Calling for a focus on how AI and data are used, rather than fearing AI becoming human.

Invitation to join the movement to reclaim data as protest, accountability, and collective action.

Emphasizing the need to change who is at the table when discussing the creation and use of AI technologies.

Transcripts

play00:00

Thank you. I want to thank Madame President, Chief of Staff Davis, and Aaliyah for the invite. It is wonderful to see young women at the helm of one of the most important political institutions at a critical juncture in our world history. Thank you. My name is Yeshi Milner, and I am the founder and executive director of Data for Black Lives. We are a movement of scientists and activists working to make data a tool for social change instead of a weapon of political oppression. Today it is an honor, and actually a great pleasure of mine, to argue against the motion of the house that AI is an existential threat.

play00:41

In a time of genocidal warfare, extreme income inequality driven by inflation and other economic practices, and, at the same time, immense political division, I'm here to argue that the true existential threat is the concentration of power, in the form of data, in the hands of a few. You see the opposition nodding, so there you go.

play01:06

But before I get into my argument, I want to define "existential threat" as stated in the motion, but also in the context of AI. It is not a blanket, vanilla term but one grounded in the resurgence of a distinct philosophical tradition: eugenics.

play01:31

It's interesting that the opposition opened with the paperclip experiment, because that's a concept authored by Mr Nick Bostrom, one of the founders of what's called the existential risk movement. He defines existential risk as an event that could cause human extinction or permanently or drastically curtail humanity's potential. This is different from a global catastrophic risk, which, as he defines it, we can recover from. And the use of the term existential "threat" instead of "risk" is actually a linguistic trope meant to make the threat appear more dire.

play02:12

He builds on these ideas of existential risk using transhumanist concepts like dysgenic pressures. Dysgenic is the opposite of eugenic, and it's based on the theory that people who were deemed inferior, categorically and racially, would reproduce faster. They're the threat. Dysgenic pressures are only solved by eugenics practices.

play02:41

Eugenics was an international movement articulating genetic hypotheses, research, and immense policy prescriptions during the first three decades of the 20th century. Eugenics, along with racism and imperialism, had a profound influence on thinking in Europe, where it originated in the UK, but also in Germany, France, Scandinavia, and North America, where I live in the US, and there it became more than a theory: a political and ideological regime.

play03:23

While Bostrom, and anyone in the singularity or existential risk movement, didn't invent eugenics, which lasted well before their time, if we were to be honest, it shapes the very data science and scientific management practices of today.

play03:39

So there's actually another context to this definition of existential risk that I want to bring in, because it's not just what's being said and defined; it's also what's been redacted. Bostrom admitted to writing a racist email where he stated: "Take, for example, the following sentence: blacks are more stupid than whites. I like that sentence and I think it's true. But recently I've begun to believe that I won't have much success with most people if I speak like that. They would think that I were racist, that I disliked black people and thought it was fair if blacks were treated badly. I don't. It's just that, based on what I have read, I think it's probable that black people have a lower average IQ than mankind in general, and I think that IQ is highly correlated with what we normally mean by smart and stupid." I can go on with this email, but he ends it by using the N-word, which he does not redact, by the way.

play04:35

I don't just come to you as someone who leads a movement, an organization that I started in 2017 to counteract the ways in which data is weaponized against black people. I come to you as somebody who understands, quite literally, this idea of risk. The very first time I even heard the word "risk" or "threat" was at nine years old, in the computer lab at my elementary school. Another young student said, "I'm at risk." Shocked, I said, "What are you talking about, you're at risk? Like, are you okay?" Because she was enrolled in an after-school program for at-risk youth, she believed what they had told her: that she would one day be at risk of early pregnancy, prison, early death, joining a gang. And to be honest, for so many of the young people that I went to elementary school and high school with, that became a self-fulfilling prophecy. While I survived what we know in the US as the school-to-prison pipeline, many people did not. At my high school graduation, the first row was of empty seats to commemorate people who died too soon.

play05:43

As a young black person, I've learned how to navigate this world where to be black is to be a problem, to be a threat, to be a risk. And these ideas that I have to counter and fight as stereotypes, of just the mere fact of being dark or pigmented, have been ingrained and automated into the public imagination, in the very same way that they've been encoded into the powerful artificial intelligence technologies of today and weaponized by those who currently have control and power.

play06:24

One example that I'll use is credit scores. I've brought this demand to the White House just this month, and to Congress: to abolish FICO, which is the predominant, most powerful algorithm at work in our country. FICO is the Fair Isaac Corporation. It was started by a mathematician and an engineer 25 years ago to use artificial intelligence as a way to eliminate human bias when providing people access to credit. The inputs to the FICO algorithm, as we're told, are the amount of debt we have and the percentage of missed payments. All of this information is provided through a collusion of data brokers, Experian, Equifax, and TransUnion (Experian, which is actually a UK-based company), and then fed into the FICO algorithm. According to FICO's shareholder reports, their scores are their number one product. Without a credit score you can't rent a house, you can't qualify for student loans, and, increasingly, you can't get a job in America. And black people are three times more likely to be scored below 620 than white people, even when controlling for education, debt, and income.

play07:51

FICO scores reflect the ways in which algorithms hold tremendous power over our lives. And as I mentioned, the impact is not on everyone. Yes, there is a threat, but it's not evenly distributed. Right here in the UK, according to the Color of Money report, black and Bangladeshi families have ten times less wealth than white families. Black Caribbean families have 20p for every $1 made by white families. And to go back to the US example: what's the impact of credit scores? Black people are 13% of the population but represent 55% of the unhoused.

play08:44

And I speak about this with passion because I grew up with a single mother who put herself through college and graduate school, sent her kids to Ivy League colleges on a full ride, and was a leader in her community. But because of these black-box algorithms that are racially encoded and quite powerful, with no recourse, we often were homeless, all because of a three-digit number. While it is in violation of federal law to deny people housing, employment, and education based on race, you can't sue an algorithm. Algorithms have now become not godlike, not superintelligent, but the way for people to be racist without being a racist.

play09:30

By definition, what is an algorithm? This is what makes up artificial intelligence. It's a step-by-step process to solve a problem. A recipe is an algorithm: a list of instructions to make the dish, the ingredients that make up the dish, and a result based on what we define as success, whether we want to focus on making something healthy or something that just tastes good regardless of health. These decisions are determined by a question: what are we optimizing? Algorithms are not just input and output; they're based on the objective function. What are we optimizing for? Computational algorithms, what we are discussing today, are much more complex, but the function remains the same. Today I ask all of you: are we optimizing for a future where data and AI are being used as tools to promote equity, fairness, and justice for everybody, including communities of color, or for a techno-feudalist society where a machine learning model doles out mass suffering?

play10:44

To quote a political theorist that we love, Achille Mbembe, who has done extensive work on actual threats, especially on the idea of necropolitics: necropolitics is the use of social and political power to dictate how some people may live and how some people must die. The deployment of necropolitics creates what he calls death worlds, new and unique forms of existence where vast populations are subjected to living conditions that confer upon them the status of the dead. I come to you from the land of the dead: communities where people have been stripped of opportunities, where the fire hoses and police dogs of the past have been replaced with the FICO algorithms, scoring models, and risk assessments of the present.

play11:48

We are not afraid of AI becoming human. What we are afraid of is how it is going to be used, and how it has been used: to enforce, to reinforce, and to justify long-held beliefs about who is human and who is not, about who deserves the rights and privileges afforded by not just citizenship but life. We need to pause. We don't need to pause AI, sorry, but we need to abolish big data. We need to dismantle the structures that put the power of data into the hands of a very few.

play12:30

Yes... no, I'm going to get to what I'm going to talk about. Let me finish. Because, let me say it: if we focus on AI as an existential threat, we miss the opportunity to put these powerful tools into the hands of the people who need them the most. At Data for Black Lives, we are reclaiming data as protest, as accountability, and as collective action. Through my work, which you can read about online, we've been able to shape the role that data plays in public life, and we want to invite all of you to not just join us in voting for the opposition but to join our movement to finally change the cast of characters: not just who's at these debates, but who's at the table about how these technologies are made. Thank you.

play13:21

[Applause]


Related Tags
AI Ethics · Data Power · Social Change · Racial Bias · Economic Inequality · Algorithmic Control · Credit Scores · FICO Algorithm · Necropolitics · Data for Black Lives