What do tech companies know about your children? | Veronica Barassi | TEDxMileHigh
Summary
TLDR: The speaker, an anthropologist and mother, recounts her experience with a hospital's consent form that raised concerns about data privacy. She delves into the pervasive data collection on children from conception, highlighting how their intimate information is shared for profit. The talk exposes the profiling of children by AI and predictive analytics, which can influence their life chances, and warns of the biases in these technologies. She calls for political solutions to ensure data rights as human rights to protect future generations from algorithmic discrimination.
Takeaways
- 🤰 The speaker shared a personal experience of being rushed to the hospital while pregnant, where she was pressured to agree to donate her umbilical cord without fully understanding the terms.
- 📄 She highlighted the issue of agreeing to terms and conditions without understanding them, which can lead to the unknowing sharing of personal and genetic data.
- 👶 The speaker, an anthropologist and mother, discussed the vast amount of data being collected about children from before birth, raising concerns about privacy and consent.
- 🔎 She launched a research project called 'Child Data Citizen' to explore the implications of this data collection on children's rights and futures.
- 📱 Mobile health apps and other technologies are transforming intimate behavioral and health data into profit by sharing it with third parties, often beyond the health sector.
- 🏠 Children are being tracked by various technologies in their everyday life, including home technologies, educational platforms, and online records, without comprehensive understanding or control.
- 🤖 Artificial intelligence and predictive analytics are profiling individuals based on their data traces, which can impact rights and opportunities significantly.
- 🏦 Profiling by AI is used in various sectors like banking, insurance, recruitment, and law enforcement, often without transparency and with questionable accuracy.
- 🔒 The speaker argues that we cannot trust these technologies with profiling our children due to their inherent biases and inaccuracies.
- 🧐 Algorithms are not objective; they are designed within specific cultural contexts and are shaped by cultural values, leading to potential biases in AI decisions.
- 🏛️ Political solutions are needed to recognize data rights as human rights and to ensure a more just future for our data and our children's data.
- 👧 The speaker expressed fear for her daughters' future, where current data collection could lead to algorithmic discrimination and limit their opportunities to become their own persons.
Q & A
What significant event occurred in 2017 that prompted the speaker to question the terms and conditions of data sharing?
-The speaker fell on the bathroom floor while eight months pregnant, which induced labor. At the hospital, she was presented with forms to donate the umbilical cord and noticed a clause about using the cord cells for any future research without specifying the purpose, which made her uncomfortable.
What is the speaker's profession and how does it relate to her interest in data privacy?
-The speaker is an anthropologist and a mother of two. Her profession involves studying human societies and cultures, which led her to become interested in the vast amounts of data being collected about children and the implications of such data collection.
What is the name of the research project the speaker launched to investigate the issue of data collection on children?
-The research project is called 'Child Data Citizen' and aims to explore and understand the impact of data collection on children.
What was the main concern the speaker had when she noticed the clause about future research in the hospital forms?
-The speaker was concerned about the vagueness of the clause, which allowed for the use of her baby's genetic data for any future research without specifying the purpose or obtaining more informed consent.
How does the speaker describe the current state of data sharing in apps and online platforms?
-The speaker describes it as a system where data is often shared with third parties without users' full awareness or consent. This is exemplified by the British Medical Journal's research showing that many mobile health apps share information with third parties, including non-health sector companies.
What is the potential impact of data profiling on individuals, according to the speaker?
-Data profiling can impact individuals' rights and opportunities significantly, as it is used by banks, insurers, employers, and even the police and courts to make decisions about loans, premiums, job suitability, and criminal potential.
Why does the speaker believe that relying on AI and predictive analytics for profiling humans is problematic?
-The speaker believes it is problematic because these technologies are not objective and are based on biased algorithms and databases. They cannot account for the unpredictability and complexity of human experience and are inherently flawed.
What example does the speaker provide to illustrate the intrusive nature of data profiling on children?
-The speaker mentions an example where educational data brokers profiled children as young as two years old based on various categories and sold these profiles, including personal details, to companies that could use the information for marketing purposes.
What is the speaker's main argument against the current use of technology in profiling children?
-The speaker argues that the current use of technology in profiling children is invasive and potentially harmful, as it can lead to algorithmic discrimination and error, and may prevent children from becoming their own persons due to future judgments based on collected data.
What solution does the speaker propose to address the issue of data rights and profiling?
-The speaker proposes political solutions, urging governments to recognize data rights as human rights and to work towards greater data justice for individuals and children.
What is the speaker's ultimate concern regarding the data collected on her daughters?
-The speaker is concerned that the data collected on her daughters may be used to judge them in the future, potentially preventing them from achieving their hopes and dreams based on algorithmic decisions and biases.
Outlines
🤰 The Awakening Concern Over Child Data Privacy
The speaker recounts a personal experience in 2017 when she was pregnant and faced with a consent form for donating her unborn child's umbilical cord. The form's vague terms about future research made her uncomfortable, prompting her to refuse. This incident sparked her interest in the broader issue of children's data privacy. As an anthropologist and mother, she began researching the vast amounts of data collected about children, often without informed consent. She emphasizes that tracking starts even before birth, with parents using apps and sharing information online, and continues through various technologies that collect intimate data, which is then often shared with third parties for profit.
👶 The Ubiquitous Surveillance of Children's Data
The speaker discusses how children are tracked by a multitude of technologies in their daily lives, including home devices, educational platforms, online records, and connected toys. She highlights the lack of control parents have over this data and the potential for profiling based on this information. The use of AI and predictive analytics to make decisions about individuals, including children, is critiqued for its reliance on potentially biased data. The speaker provides examples of how data brokers profile children as young as two years old based on various categories and sell this information to companies, which can significantly impact the children's opportunities in life.
🚔 The Dangers of Algorithmic Bias in Profiling
The speaker addresses the issue of algorithmic bias, noting that AI technologies used for predictive policing have been trained on biased data, leading to perpetuation of police bias. She argues that while companies might attempt to fix these issues, bias in algorithms is inherent and cannot be completely eradicated. The speaker calls for political solutions and recognition of data rights as human rights. She expresses concern for her daughters' futures, fearing that the data collected about them could lead to algorithmic discrimination and limit their life opportunities. She concludes with a call to action for greater data justice and protection of children's data.
Keywords
💡Umbilical Cord Donation
💡Terms and Conditions
💡Data Privacy
💡Anthropologist
💡Child Data Citizen
💡Data Brokers
💡Algorithmic Bias
💡Predictive Analytics
💡Data Justice
💡Human Rights
💡Algorithmic Discrimination
Highlights
In 2017, the speaker fell on the bathroom floor, inducing labor while eight months pregnant, leading to a hospital visit.
A hospital assistant presented forms for donating umbilical cord blood, which included a clause for future unspecified research.
The speaker felt uncomfortable with the vague terms of the research consent and sought to opt out.
The incident sparked the speaker's interest in how data is collected and used, particularly concerning children.
The speaker, an anthropologist and mother, launched the 'Child Data Citizen' research project to explore data traces of children.
The speaker argues that the problem of data collection is bigger than just social media sharing.
Children's data is tracked from conception, with parents using various apps and online resources during pregnancy.
Mobile health apps were found to share user data with third parties, including non-health sector companies.
Children are tracked by a variety of technologies in their daily lives, including smart home devices and educational platforms.
Artificial intelligence and predictive analytics are used to profile individuals based on their data traces.
Profiles created from data can impact rights and opportunities, such as in banking, insurance, employment, and the legal system.
The speaker highlights the intrusive nature of data profiling, including the profiling of young children based on various categories.
Algorithmic bias in AI technologies used for predictive policing has been identified as a serious issue.
The speaker emphasizes that algorithms are not objective and are influenced by the cultural context and values of their designers.
The speaker calls for political solutions and recognition of data rights as human rights.
There is a call to action for individuals, organizations, and institutions to demand greater data justice.
The speaker expresses concern for her daughters' future, fearing the impact of algorithmic discrimination and error on their lives.
The speaker concludes by urging collective action to ensure a more just future for children's data rights.
Transcripts
So in 2017, I fell on the bathroom floor in my home in Los Angeles. I was eight months pregnant, and the fall induced my labor, so I was rushed to hospital. As I waited for my doctor to arrive, a young assistant walked into the room with a big smile on her face and a folder in her hands. She had been told that I had agreed to donate the umbilical cord to the hospital, and so she gave me the forms to sign. I started reading the form, the terms and conditions, and suddenly I noticed this sentence. It read something along the lines of: "I agree for the cord cells to be used for any future research." Any future research. Does that sound a bit vague to you? I immediately thought about science fiction stories on human cloning, and, well, I felt uncomfortable. So I asked her for more information, or whether I could opt out from that specific clause, but she couldn't really answer my questions. And then she said, "Huh, it's just a form. You just sign it."

It was not just a form. That was my baby's genetic data, and I refused. But that day I had the perfect example of how natural and how accepted it has become to agree to terms and conditions without giving it a second thought. Every day, every week, we agree to terms and conditions, and when we do this, we provide companies with a lawful right to do whatever they want with our data and with the data of our children. Which makes us wonder: how much data are we giving away of children, and what are its implications?

I'm an anthropologist, and I'm also the mother of two little girls. And I started to become interested in these questions in 2015, when I suddenly realized that there were vast, almost unimaginable amounts of data traces being produced and collected about children. So I launched a research project, which is called Child Data Citizen, and I aimed at filling in the blank.

Now, you may think that I'm here to blame you for posting photos of your children on social media. Well, the truth is that at the end of this talk you may feel differently about that, but that's not really the point. The problem is way bigger than so-called "sharenting." This is about systems, not individuals. You and your habits are not to blame.

For the very first time in history, we are tracking the individual data of children from long before they're born, sometimes from the moment of conception, and then throughout their lives. You see, when parents decide to conceive, they go online to look for ways to get pregnant, or they download ovulation-tracking apps. When they do get pregnant, they post ultrasounds of their babies on social media, they download pregnancy apps, or they consult Dr. Google for all sorts of things, like, you know, miscarriage risk when flying, or abdominal cramps in early pregnancy. I know because I've done it, and many times. And then, when the baby is born, they track every nap, every feed, every life event on different technologies. And all these technologies transform the baby's most intimate behavioral and health data into profit by sharing it with others.

So to give you an idea of how this works, in 2019 the British Medical Journal published research that showed that out of 24 mobile health apps, 19 shared information with third parties, and these third parties shared the information with 216 other organizations. Of these 216 fourth parties, only three belonged to the health sector. The other companies that had access to that data were big tech companies like Google, Facebook or Oracle; they were digital advertising companies; and there was also a consumer credit reporting agency. So you get it right: ad companies and credit agencies may already have data points on little babies.

But mobile apps, web searches and social media are really just the tip of the iceberg, because children are being tracked by multiple technologies in their everyday life. They're tracked by home technologies and virtual assistants in their homes. They're tracked by educational platforms and educational technologies in their schools. They're tracked by online records and online portals at their doctor's office. They're tracked by internet-connected toys, online games and many, many, many other technologies.

So during my research, a lot of parents came up to me and they were like, "So what? Why does it matter if my children are being tracked? We've got nothing to hide, right?" Well, it matters. It matters because today individuals are not only being tracked; they are also being profiled on the basis of their data traces. Artificial intelligence and predictive analytics are being used to harness as much data as possible of an individual life from different sources: family history, purchasing habits, social media comments. And then they bring this data together to make data-driven decisions about the individual. And these technologies are used everywhere. Banks use them to decide loans. Insurers use them to decide premiums. Recruiters and employers use them to decide whether one is a good fit for a job or not. Also the police and courts use them to determine whether one is a potential criminal, or is likely to commit a crime.

We have no knowledge or control over the ways in which those who buy, sell and process our data are profiling us and our children. But these profiles can come to impact our rights in significant ways. To give you an example, in 2018 The New York Times published the news that data that had been gathered through online college-planning services, which are actually completed by millions of high school kids across the US who are looking for a college program or a scholarship, had been sold to educational data brokers. Now, researchers at Fordham who studied educational data brokers revealed that these companies profiled kids as young as two on the basis of different categories: ethnicity, religion, affluence, social awkwardness and many other random categories. And then they sell these profiles, together with the name of the kid, the home address and the contact details, to different companies, including trade and career institutions, student loan and student credit card companies. To push the boundaries, the researchers at Fordham asked an educational data broker to provide them with a list of 14-to-15-year-old girls who were interested in family planning services. The data broker agreed to provide them the list. So imagine how intimate and how intrusive that is for our kids.

But educational data brokers are really just an example. The truth is that our children are being profiled in ways that we cannot control, but that can significantly impact their chances in life. So we need to ask ourselves: can we trust these technologies when it comes to profiling our children? Can we? My answer is no.

As an anthropologist, I believe that artificial intelligence and predictive analytics can be great to predict the course of a disease or to fight climate change. But we need to abandon the belief that these technologies can objectively profile humans and that we can rely on them to make data-driven decisions about individual lives, because they can't profile humans. Data traces are not the mirror of who we are. Humans think one thing and say the opposite; feel one way and act differently. Algorithmic predictions of our digital practices cannot account for the unpredictability and complexity of human experience.

But on top of that, these technologies are always, always, in one way or another, biased. You see, algorithms are by definition sets of rules or steps that have been designed to achieve a specific result, OK? But these sets of rules or steps cannot be objective, because they've been designed by human beings within a specific cultural context, and are shaped by specific cultural values. So when machines learn, they learn from biased algorithms, and they often learn from biased databases as well.

At the moment, we're seeing the first examples of algorithmic bias, and some of these examples are frankly terrifying. This year, the AI Now Institute in New York published a report that revealed that the AI technologies being used for predictive policing have been trained on "dirty data." This is basically data that had been gathered during historical periods of known racial bias and nontransparent police practices. Because these technologies are being trained with dirty data, they're not objective, and their outcomes are only amplifying and perpetuating police bias and error.

So I think we are faced with a fundamental problem in our society. We are starting to trust technologies when it comes to profiling human beings. We know that in profiling humans, these technologies are always going to be biased and are never really going to be accurate. Companies cannot fix this problem by coming up with a technical solution or by appointing a new AI ethics board, because algorithmic error and bias can be reduced, but it cannot be eradicated, and we cannot protect ourselves from it. So what we need now are actually political solutions. We need governments to recognize that our data rights are our human rights. Until this happens, we cannot hope for a more just future.

And when I think about my daughters' future, and this is also the reason why I'm here, I am scared. I worry that my daughters are going to be exposed to all sorts of algorithmic discrimination and error. I also worry that the data being collected from them today is going to prevent them from becoming their own persons. You see, the difference between me and my daughters is that there is no public record out there of my childhood. There is no evidence of whether my mom smoked during pregnancy, or whether a family member has a criminal history. There is certainly no database of all the stupid things that I have done and thought when I was a teenager. But for my daughters, this may be different. The data being collected from them today may be used to judge them in the future, and can come to prevent their hopes and dreams. And I don't want that. I think it's time, it's time that we all step up, it's time that we start working together, as individuals, as organizations and as institutions, and that we demand greater data justice for us and for our children, before it's too late.

Thank you.