Yeshimabeit Milner warns that the true existential threat is big data in the hands of a few (4/8)
Summary
TL;DR: Yeshi Milner, founder of Data for Black Lives, argues against framing AI as an existential threat, emphasizing that the real danger lies in the concentration of data power in a few hands. She discusses the historical context of eugenics and its influence on data science, highlighting the disproportionate impact of algorithms like the FICO credit score on marginalized communities.
Takeaways
- 🌟 The speaker, Yeshi Milner, is the founder and executive director of Data for Black Lives, a movement of scientists and activists aiming to use data for social change.
- 🔍 Yeshi argues against the idea that AI is an existential threat, instead focusing on the concentration of power in the form of data in the hands of a few as the real threat.
- 📚 The concept of 'existential threat' is defined in the context of AI, drawing on philosophical traditions and the potential for AI to cause human extinction or drastically curtail humanity's potential.
- 📈 The speaker criticizes the use of AI and data science in ways that perpetuate harmful stereotypes and biases, particularly against black communities.
- 💼 Yeshi highlights the role of credit scores, such as FICO, in disproportionately affecting black communities, noting that black people are three times more likely to have a low score even when controlling for education, debt, and income.
- 🏦 The speaker calls for the abolition of FICO and similar systems, which she sees as perpetuating racial disparities and reinforcing systemic racism through algorithmic decision-making.
- 🌐 The impact of AI and data-driven systems is not evenly distributed, with black and minority communities bearing the brunt of negative consequences, such as reduced access to housing, education, and employment.
- 🌱 Yeshi advocates for reclaiming data as a tool for protest, accountability, and collective action, emphasizing the need to change who controls and benefits from data technologies.
- 👥 The speaker invites the audience to join the movement to change not just the debate around AI and data, but also to ensure that diverse voices are included in the decision-making process.
- 🏛️ The speaker warns against the dangers of necropolitics, where social and political power dictates who lives and who dies, and how AI and data can be used to enforce oppressive systems.
Q & A
Who is Yeshi Milner and what is her role?
-Yeshi Milner is the founder and executive director of Data for Black Lives, a movement of scientists and activists working to make data a tool for social change.
What is the main argument Yeshi Milner is making against the motion?
-Yeshi Milner argues against the motion that AI is an existential threat, stating that the true existential threat is the concentration of power in the form of data in the hands of a few.
How does Yeshi Milner define 'existential threat' in the context of AI?
-She defines 'existential threat' as an event that could cause human extinction or permanently and drastically curtail humanity's potential, and argues that this definition is grounded in transhumanist concepts and the philosophical tradition of eugenics.
What is the paperclip experiment and why is it relevant to the discussion?
-The paperclip experiment is a thought experiment authored by Nick Bostrom, one of the founders of the existential risk movement. It is relevant because it illustrates the potential dangers of AI if its optimization processes are misaligned with human values.
What is the connection between eugenics and the development of AI?
-Yeshi Milner suggests that the development of AI and data science has been influenced by eugenic ideologies, which have historically been used to justify discrimination and oppression based on race.
What incident involving Nick Bostrom is mentioned in the script?
-The script mentions an incident where Nick Bostrom admitted to writing a racist email, expressing beliefs about racial differences in intelligence, which Yeshi Milner criticizes as a reflection of harmful biases in AI development.
How does Yeshi Milner describe her personal experience with the concept of 'risk'?
-She describes her personal experience with the concept of 'risk' as being introduced to it at a young age in an elementary school computer lab, where she was told she was 'at risk' due to being enrolled in an after-school program for at-risk youth.
What is the FICO score and why is it problematic according to Yeshi Milner?
-The FICO score is a credit score algorithm used in the United States to determine an individual's creditworthiness. Yeshi Milner argues that it is problematic because it reflects and reinforces racial biases, with black people being three times more likely to have a low score.
What is the term 'necropolitics' and how does it relate to the use of AI?
-Necropolitics, a term coined by political theorist Achille Mbembe, describes the use of social and political power to dictate how some people may live and how some people must die. Yeshi Milner relates it to AI by suggesting that AI can be used to enforce and justify harmful societal structures and beliefs.
What is the main call to action in Yeshi Milner's speech?
-The main call to action in Yeshi Milner's speech is to abolish big data and dismantle the structures that put the power of data into the hands of a few, rather than focusing on AI as an existential threat.
How does Yeshi Milner suggest we should be using AI and data?
-She suggests that we should be using AI and data as tools to promote equity, fairness, and justice, and to put these powerful tools into the hands of people who need them the most.
Outlines
🌐 The Misuse of AI and Data in Society
Yeshi Milner, founder and executive director of Data for Black Lives, begins by expressing gratitude for the opportunity to speak and highlights the importance of young women in politics. She argues against the notion that AI is an existential threat, instead identifying the concentration of data power in the hands of a few as the real threat. She defines 'existential threat' in the context of AI and criticizes the philosophical tradition that underpins it, including the resurgence of eugenics and its influence on data science and scientific management practices. Milner also addresses the issue of racism in the field of existential risk, specifically mentioning Nick Bostrom's controversial views and their implications for how data is used and perceived in society.
💼 The Impact of Algorithms on Social Inequality
Milner delves into the personal impact of societal stereotypes and how they are ingrained in public imagination and encoded in AI technologies. She uses the example of credit scores, specifically the FICO algorithm, to illustrate how these algorithms can perpetuate racial disparities. She points out that black people are three times more likely to have a low credit score, which affects their ability to rent, access education, and find employment. Milner also discusses the broader economic disparities faced by black and minority ethnic communities in the UK and the US, emphasizing the need to address the power structures that enable such algorithms to disproportionately affect certain groups.
🚀 Reclaiming Data for Social Change
In the final paragraph, Milner shifts the focus from the fear of AI to the potential misuse of AI and data. She argues that the real concern should be how AI is used to reinforce existing biases and power structures. She introduces the concept of necropolitics, which describes the use of power to determine who lives and who dies, and how this is reflected in the deployment of AI technologies. Milner calls for a pause not in the development of AI, but in the structures that allow the misuse of data. She encourages the audience to join the movement to reclaim data as a tool for protest, accountability, and collective action, and to change the narrative around who controls and benefits from these technologies.
Keywords
💡Existential threat
💡Data for Black Lives
💡Concentration of power
💡Eugenics
💡FICO score
💡Algorithm
💡Necropolitics
💡Risk
💡Credit scores
💡Techno-feudalism
💡Data as a tool
Highlights
Thanking Madame President, Chief of Staff Davis, and Aaliyah for the invite to speak at a critical juncture in world history.
Introduction of Yeshi Milner, founder and executive director of Data for Black Lives, a movement to use data for social change.
Arguing against the motion that AI is an existential threat, focusing instead on the concentration of power in data.
Defining 'existential threat' in the context of AI, grounded in the resurgence of a distinct philosophical tradition.
Discussion of the paperclip experiment and its relation to the existential risk concept.
Explaining that the term 'existential threat' is used to make the threat appear more dire.
- Linking the concept of existential risk to transhumanist concepts like dysgenic pressures and eugenics.
Historical context of eugenics and its influence on thinking in Europe and North America.
Addressing the impact of eugenics on data science and scientific management practices.
Personal story of understanding 'risk' from a young age and its implications for at-risk youth.
Experience of navigating a world where being black is often seen as a problem or threat.
Example of credit scores and how they are used as a powerful algorithm to control access to resources.
Statistic on black people being three times more likely to have a low FICO score, despite controlling for education, debt, and income.
Discussion of the impact of credit scores on housing, education, and employment opportunities.
Highlighting the racial disparities in wealth and the role of algorithms in perpetuating these disparities.
Calling for a focus on how AI and data are used, rather than fearing AI becoming human.
Invitation to join the movement to reclaim data as protest, accountability, and collective action.
Emphasizing the need to change who is at the table when discussing the creation and use of AI technologies.
Transcripts
thank
you I want to thank Madame President uh
Chief of Staff Davis and Aaliyah for the
invite it is wonderful to see young
women at the helm of one of the most
important political institutions at a
critical juncture in our world history
thank you my name is yeshi Milner and I
am the founder and executive director of
data for black lives we are a movement
of scientists and activists working to
make data a tool for social change
instead of a weapon of political
oppression today it's with honor and
actually great
pleasure of mine to argue against the
motion of the house that AI is an
existential
threat in a time of genocidal
Warfare extreme income inequality driven
by inflation and other economic
practices and at the same time immense
political
division I'm here to argue that the true
existential threat is the concentration
of power in the form of data in the
hands of a few you see the opposition
nodding so there you
go but before I get into my argument I
want to Define existential
threat as stated in the motion but also
in the context of AI it is not a blanket
vanilla term but one grounded in the
Resurgence of a distinct philosophical
tradition: eugenics it's interesting that the
opposition opened with the paperclip
experiment because that's a concept
authored by Mr Nick
Bostrom one of the founders
of what's called the x-risk
movement he defines existential risk as
an event that could cause human
extinction or permanently or drastically
curtail Humanity's
potential this is different from a
global catastrophic risk as he defines
it which we can recover from and the use
of the term existential threat instead
of risk is actually a linguistic
Trope in order to make the threat appear
more
dire he builds on these ideas of
existential risk using transhumanist
concepts like dysgenic
pressures dysgenic is the opposite of
eugenic and it's based on the theory
that people who were deemed inferior
categorically
racially would reproduce faster they're
the
threat dysgenic pressures are only solved
by Eugenics
practices Eugenics was an international
movement articulating genetic
hypotheses research and immense policy
prescriptions during the first three
decades of the 20th
century Eugenics along with racism and
imperialism had profound influence on thinking
in
Europe it originated in the
UK but also in Germany France Scandinavia and
North America where I live in the US and
there it became more than a theory but a
political and ideological
regime while Bostrom or anyone in the
singularity or x-risk movement didn't
invent eugenics it existed well before
their time and if we were to be honest
it shapes the very data science and
scientific management practices of today
so there's actually another context to
this definition of existential risk that
I want to bring in because it's not
what's being said and defined it's also
what's been
redacted Bostrom admitted to writing a
racist email where he stated take for
example the following sentence blacks
are more stupid than whites I like that
sentence and I think it's true but
recently I've begun to believe that I
won't have much success with that with
most people if I speak like that that
they would think that I were racist that
I disliked black people and thought it
was fair if blacks were treated badly I
don't it's just that based on what I
have read I think it's probable that
black people have a lower average IQ
than Mankind in general and I think that
IQ is highly correlated with what we
normally mean by smart and stupid can go
on with this email but he ends it by
using the the nword which he does not
redact by the
way I don't just come to you as someone
who leads a movement an organization
that I started in 2017 to counteract the
ways in which data is weaponized against
black people I come to you as somebody
who understands quite literally this
idea of risk the very first time I even
heard the word risk or threat was at 9
years old in the computer lab at my
elementary school another young student
said I'm at risk shocked I said
what are you talking about you're at
risk like are you okay because she was
enrolled in an after-school
program for at-risk youth she believed
what they had told her was that she
would one day be at risk of early
pregnancy
prison early
death joining a
gang and to be honest for so many of the
young people that I went to elementary
school and high school with that became
a self-fulfilling prophecy while I
survived what we know in the US as a
school to prison pipeline many people
did not at my high school graduation the
first row was of empty seats to
commemorate people who died
too soon as a young black person I've
learned how to navigate this world where
to be black is to be a problem to be a
threat to be a
risk and these
ideas that I have to counter and fight as
stereotypes of just the
mere fact of being dark or
pigmented have been ingrained automated
into the public
imagination in the very same way that
they've been encoded into powerful
artificial intelligence Technologies of
today and weaponized by those who
currently have control and power one
example that I'll use is of credit
scores I've brought this demand to the
White House just this month to Congress
to abolish FICO which is the
predominant most powerful algorithm at
work in our country FICO is the Fair
Isaac Corporation it was started by a
mathematician and an engineer 25 years
ago to use artificial intelligence as a
way to eliminate human bias when
providing people access to
credit the inputs to the FICO algorithm
as we're told are the amount of debt we
have the percentage of missed payments
all of this information is provided
through a collusion of data brokers
Experian Equifax and TransUnion Experian
which is actually a UK-based company
then they're fed into the FICO algorithm
according to the shareholder reports at
FICO their scores are their number one
product without a credit score you can't
rent a house you can't qualify for
student loans and
increasingly you can't get a job in
America and black people are three times
more likely to be scored at below 620
than white people even when controlling
for Education debt and
income FICO scores reflect the ways in
which algorithms hold tremendous
power over our
lives and as I
mentioned the impact is not the same for everyone
yes there is a threat but it's not
evenly
distributed right here in the
UK according to the Color of Money
Report black and Bangladeshi families
have 10 times less wealth than white
families black Caribbean families have
20p for every £1 made by white families
and to even go back to the US
example what's the impact of credit
scores black people are 13% of the
population but represent 55% of unhoused
and I speak about this with passion
because I grew up with a single mother
who put herself through college graduate
school sent her kids to Ivy League colleges on
a full ride was a leader in her
community but because of these blackbox
algorithms that are racially encoded and
quite powerful with no recourse we often
were homeless all because of a
three-digit number while it is in
violation of federal law to deny people
housing employment and education based
on race you can't sue an
algorithm algorithms have now become not
Godlike not super intelligent but the
way for people to be racist without
being a
racist
by definition what is an algorithm this
is what makes up artificial
intelligence it's a step-by-step process
to solve a problem a recipe is an
algorithm a list of instructions to make
the dish the ingredients that make up
the dish and a result based on what we
Define as
success whether we want to focus on
making something healthy or something
that just tastes good regardless of
health these decisions are
determined by a question what are we
optimizing algorithms are not just input
and output they're based on the
objective function what are we
optimizing for computational algorithms
what we are discussing is much more
complex but the function Remains the
Same today I ask all of you are we
optimizing for a
future where data AI is being used as a
tool to promote Equity fairness Justice
and for everybody including communities
of color or a techno-feudalist society
where a machine learning model doles out
Mass
suffering to quote a political theorist
that we love Achille Mbembe who has
done extensive work on actual
threats especially on the idea of
necropolitics
necropolitics is the use of social and
political power to dictate how some
people may live and how some people must
die the deployment of
necropolitics create what he calls death
worlds or new and unique forms of
existence in which vast populations are
subjected to living conditions that
confer upon them the status of the
Dead I come to you from the Land of the
Dead communities where people have been
stripped of opportunities where the fire
hoses and police dogs of the past have
been replaced with the FICO algorithms
scoring models and risk assessments at
the
present we are not afraid of AI becoming
human what we are afraid of is how it is
going to be used and it has been used to
enforce reinforce and
justify long-held
beliefs of who was human and who is
not who deserves the rights and
privileges afforded by not just citizenship
but
life we need to pause we don't need to
pause AI sorry but we need to abolish
big data we need to dismantle the
structures that put the power of data
into the hands of a very few
yes a
Reas no I'm going to
get to what I'm going to talk about let
me
finish because let me say it if we focus
on AI as an existential threat we miss
the opportunity to put these powerful
tools into the hands of people who need
it the most at data for black lives we
are reclaiming data as protest as
accountability and as collective action
through my work which you can read about
online and look at we've been able to
shape the role that data plays in public
life and we want to invite all of you to
not just join us in voting for the
opposition but join our movement to
finally change the cast of characters of
not just who's at these debates but
who's at the table about how these
Technologies are made thank
[Applause]
you
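Milner's closing recipe analogy (an algorithm is inputs plus step-by-step instructions plus an objective function, and what we choose to optimize for determines the outcome) can be sketched in a few lines of code. This is a minimal illustrative sketch: the recipes, scores, and function names are invented for the example, not taken from the talk.

```python
# Illustrative sketch (hypothetical data) of the speech's point:
# the same inputs and the same procedure yield different "winners"
# depending on the objective function we choose to optimize.

recipes = [
    {"name": "fried dish", "taste": 9, "health": 2},
    {"name": "salad",      "taste": 5, "health": 9},
    {"name": "stew",       "taste": 7, "health": 6},
]

def best_recipe(recipes, objective):
    """Step-by-step process: score every recipe with the objective
    function and return the one that maximizes it."""
    return max(recipes, key=objective)

# Optimizing only for taste picks one dish...
tasty = best_recipe(recipes, lambda r: r["taste"])
# ...while optimizing for health picks another, from identical inputs.
healthy = best_recipe(recipes, lambda r: r["health"])

print(tasty["name"])    # fried dish
print(healthy["name"])  # salad
```

Nothing about the inputs changed between the two calls; only the definition of "success" did, which is the speaker's point about asking what an algorithm is optimizing for.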