What do tech companies know about your children? | Veronica Barassi | TEDxMileHigh

TEDx Talks
30 Jan 2020 · 13:37

Summary

TL;DR: The speaker, an anthropologist and mother, recounts her experience with a hospital consent form that raised concerns about data privacy. She delves into the pervasive collection of children's data from conception onward, highlighting how their intimate information is shared for profit. The talk exposes the profiling of children by AI and predictive analytics, which can influence their life chances, and warns of the biases in these technologies. She calls for political solutions that recognize data rights as human rights, to protect future generations from algorithmic discrimination.

Takeaways

  • 🤰 The speaker shared a personal experience of being rushed to the hospital while pregnant, where she was pressured to agree to donate her umbilical cord without fully understanding the terms.
  • 📄 She highlighted the issue of agreeing to terms and conditions without understanding them, which can lead to the unknowing sharing of personal and genetic data.
  • 👶 The speaker, an anthropologist and mother, discussed the vast amount of data being collected about children from before birth, raising concerns about privacy and consent.
  • 🔎 She launched a research project called 'Child Data Citizen' to explore the implications of this data collection on children's rights and futures.
  • 📱 Mobile health apps and other technologies are transforming intimate behavioral and health data into profit by sharing it with third parties, often beyond the health sector.
  • 🏠 Children are tracked by many technologies in their everyday lives, including home devices, educational platforms, and online medical records, and parents have little understanding of or control over the resulting data.
  • 🤖 Artificial intelligence and predictive analytics are profiling individuals based on their data traces, which can impact rights and opportunities significantly.
  • 🏦 Profiling by AI is used in sectors like banking, insurance, recruitment, and law enforcement, often without transparency and with questionable accuracy.
  • 🔒 The speaker argues that we cannot trust these technologies with profiling our children due to their inherent biases and inaccuracies.
  • 🧐 Algorithms are not objective; they are designed within specific cultural contexts and are shaped by cultural values, leading to potential biases in AI decisions.
  • 🏛️ Political solutions are needed to recognize data rights as human rights and to ensure a more just future for our data and our children's data.
  • 👧 The speaker expressed fear for her daughters' future, where current data collection could lead to algorithmic discrimination and limit their opportunities to become their own persons.

Q & A

  • What significant event occurred in 2017 that prompted the speaker to question the terms and conditions of data sharing?

    -The speaker fell on the bathroom floor while eight months pregnant, which induced labor. At the hospital, she was presented with forms to donate the umbilical cord and noticed a clause about using the cord cells for any future research without specifying the purpose, which made her uncomfortable.

  • What is the speaker's profession and how does it relate to her interest in data privacy?

    -The speaker is an anthropologist and a mother of two. Her profession involves studying human societies and cultures, which led her to become interested in the vast amounts of data being collected about children and the implications of such data collection.

  • What is the name of the research project the speaker launched to investigate the issue of data collection on children?

    -The research project is called 'Child Data Citizen' and aims to explore and understand the impact of data collection on children.

  • What was the main concern the speaker had when she noticed the clause about future research in the hospital forms?

    -The speaker was concerned about the vagueness of the clause, which allowed for the use of her baby's genetic data for any future research without specifying the purpose or obtaining more informed consent.

  • How does the speaker describe the current state of data sharing in apps and online platforms?

    -The speaker describes it as a system where data is often shared with third parties without users' full awareness or consent. This is exemplified by research published in the British Medical Journal showing that 19 of 24 mobile health apps studied shared information with third parties, including companies outside the health sector.

  • What is the potential impact of data profiling on individuals, according to the speaker?

    -Data profiling can impact individuals' rights and opportunities significantly, as it is used by banks, insurers, employers, and even the police and courts to make decisions about loans, premiums, job suitability, and criminal potential.

  • Why does the speaker believe that relying on AI and predictive analytics for profiling humans is problematic?

    -The speaker believes it is problematic because these technologies are not objective and are based on biased algorithms and databases. They cannot account for the unpredictability and complexity of human experience and are inherently flawed.

  • What example does the speaker provide to illustrate the intrusive nature of data profiling on children?

    -The speaker mentions an example where educational data brokers profiled children as young as two years old based on various categories and sold these profiles, including personal details, to companies that could use the information for marketing purposes.

  • What is the speaker's main argument against the current use of technology in profiling children?

    -The speaker argues that the current use of technology in profiling children is invasive and potentially harmful, as it can lead to algorithmic discrimination and error, and may prevent children from becoming their own persons due to future judgments based on collected data.

  • What solution does the speaker propose to address the issue of data rights and profiling?

    -The speaker proposes political solutions, urging governments to recognize data rights as human rights and to work towards greater data justice for individuals and children.

  • What is the speaker's ultimate concern regarding the data collected on her daughters?

    -The speaker is concerned that the data collected on her daughters may be used to judge them in the future, potentially preventing them from achieving their hopes and dreams based on algorithmic decisions and biases.

Outlines

00:00

🤰 The Awakening Concern Over Child Data Privacy

The speaker recounts a personal experience in 2017 when she was pregnant and faced with a consent form for donating her unborn child's umbilical cord. The form's vague terms about future research made her uncomfortable, prompting her to refuse. This incident sparked her interest in the broader issue of children's data privacy. As an anthropologist and mother, she began researching the vast amounts of data collected about children, often without informed consent. She emphasizes that tracking starts even before birth, with parents using apps and sharing information online, and continues through various technologies that collect intimate data, which is then often shared with third parties for profit.

05:01

👶 The Ubiquitous Surveillance of Children's Data

The speaker discusses how children are tracked by a multitude of technologies in their daily lives, including home devices, educational platforms, online records, and connected toys. She highlights the lack of control parents have over this data and the potential for profiling based on this information. The use of AI and predictive analytics to make decisions about individuals, including children, is critiqued for its reliance on potentially biased data. The speaker provides examples of how data brokers profile children as young as two years old based on various categories and sell this information to companies, which can significantly impact the children's opportunities in life.

10:02

🚔 The Dangers of Algorithmic Bias in Profiling

The speaker addresses the issue of algorithmic bias, noting that AI technologies used for predictive policing have been trained on biased data, leading to the perpetuation of police bias. She argues that while companies might attempt to fix these issues with technical solutions or ethics boards, bias in algorithms is inherent and cannot be completely eradicated. The speaker calls for political solutions and recognition of data rights as human rights. She expresses concern for her daughters' futures, fearing that the data collected about them could lead to algorithmic discrimination and limit their life opportunities. She concludes with a call to action for greater data justice and protection of children's data.
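
To make the "dirty data" feedback loop concrete, here is a minimal, hypothetical Python sketch. It is not the AI Now Institute's model, and every group, rate, and number is invented for illustration: two groups offend at the same true rate, but one was historically patrolled more heavily, so a model trained on the recorded data scores it as riskier.

    import random

    random.seed(0)

    # Two groups with the SAME true offense rate, but group "A" was
    # historically patrolled more heavily, so its offenses were recorded
    # more often. All numbers here are invented for illustration.
    TRUE_OFFENSE_RATE = 0.10
    RECORDING_RATE = {"A": 0.8, "B": 0.4}  # chance an offense enters the records

    def make_historical_records(n_per_group=10_000):
        records = []
        for group, detection in RECORDING_RATE.items():
            for _ in range(n_per_group):
                offended = random.random() < TRUE_OFFENSE_RATE
                recorded = offended and random.random() < detection
                records.append((group, recorded))
        return records

    def train_risk_model(records):
        # A predictive model reduced to its essence: per-group frequency
        # of recorded offenses in the historical data.
        totals, hits = {}, {}
        for group, recorded in records:
            totals[group] = totals.get(group, 0) + 1
            hits[group] = hits.get(group, 0) + recorded
        return {g: hits[g] / totals[g] for g in totals}

    print(train_risk_model(make_historical_records()))
    # -> roughly {'A': 0.08, 'B': 0.04}: the model rates group A twice as
    #    "risky", reproducing the patrol bias, not the identical true rates.

No technical fix inside the model can recover the true rates here, because the bias lives in how the data was collected, which is the speaker's point about why reduction is possible but eradication is not.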

Keywords

💡Umbilical Cord Donation

Umbilical cord donation refers to the act of giving permission for the medical use of the umbilical cord, which contains stem cells, after childbirth. In the video, the speaker recounts an experience where she was asked to donate her umbilical cord without fully understanding the terms, highlighting the theme of consent and awareness regarding personal data and biological materials.

💡Terms and Conditions

Terms and conditions are the contractual provisions that users agree to in order to use a service or product. The video emphasizes the common practice of accepting these terms without fully reading or understanding them, which can lead to unknowing consent to data sharing, as illustrated by the umbilical cord donation scenario.

💡Data Privacy

Data privacy concerns the appropriate handling and protection of personal information. The video discusses the collection of vast amounts of data on children, raising questions about how much data is being given away and the implications for privacy and consent.

💡Anthropologist

An anthropologist is a social scientist who studies human societies and cultures and their development. The speaker, an anthropologist, brings a unique perspective to the issue of data collection on children, framing it within a broader social and cultural context.

💡Child Data Citizen

Child Data Citizen is the name of the research project launched by the speaker to investigate the data traces being produced and collected about children. It represents the central focus of the video, examining the extent and impact of data collection on young lives.

💡Data Brokers

Data brokers are companies that collect and sell personal data to third parties. The video mentions educational data brokers that profile children based on various categories and sell this information to companies, raising concerns about the ethics and implications of such practices.

💡Algorithmic Bias

Algorithmic bias refers to the systemic errors in algorithms that can lead to unfair or discriminatory outcomes. The video discusses how AI technologies used for predictive policing have been trained on biased data, so their outcomes perpetuate those biases.

💡Predictive Analytics

Predictive analytics uses data, statistical algorithms, and machine learning techniques to identify the likelihood of future outcomes based on historical data. The video critiques the reliance on these technologies for making decisions about individuals' lives, emphasizing their limitations and potential for error.

💡Data Justice

Data justice is the concept of fair and ethical treatment in the collection, use, and distribution of data. The speaker calls for greater data justice to protect individuals and children from algorithmic discrimination and to ensure their rights are respected.

💡Human Rights

Human rights are the basic rights and freedoms to which all individuals are entitled. The video argues that data rights should be recognized as human rights, emphasizing the need for political solutions to protect against the misuse of personal data.

💡Algorithmic Discrimination

Algorithmic discrimination occurs when algorithms used in decision-making processes result in unfair treatment of certain groups. The video expresses concern that the data collected on children could be used to judge them in the future, potentially leading to discrimination.
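
As a companion to the "Predictive Analytics" and "Algorithmic Discrimination" entries above, here is a deliberately simplified, hypothetical sketch of the kind of profiling pipeline the talk critiques. Every field, weight, and threshold is invented for illustration; real systems are far larger but structurally similar: heterogeneous data traces are flattened into a single score, and a hard threshold turns that score into a life-affecting decision.

    from dataclasses import dataclass

    @dataclass
    class DataTraces:
        family_history_flags: int     # e.g. relatives with defaults on record
        purchasing_volatility: float  # 0..1, derived from shopping data
        negative_social_posts: int    # scraped from social media comments

    # Hand-picked weights stand in for a trained model; they encode the
    # designers' assumptions about what "risk" looks like.
    WEIGHTS = {"family": -0.5, "spending": -1.2, "posts": -0.3}
    APPROVAL_THRESHOLD = 0.0

    def score(t: DataTraces) -> float:
        return (1.0  # baseline
                + WEIGHTS["family"] * t.family_history_flags
                + WEIGHTS["spending"] * t.purchasing_volatility
                + WEIGHTS["posts"] * t.negative_social_posts)

    def approve_loan(t: DataTraces) -> bool:
        return score(t) > APPROVAL_THRESHOLD

    applicant = DataTraces(family_history_flags=1,
                           purchasing_volatility=0.4,
                           negative_social_posts=3)
    print(approve_loan(applicant))  # False: denied on proxies drawn from the
                                    # applicant's data traces, not the person

The applicant never sees the weights and cannot contest the proxies; the decision reflects values baked into WEIGHTS by its designers, which is the sense in which the talk says algorithms are "shaped by cultural values."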

Highlights

In 2017, the speaker fell on the bathroom floor, inducing labor while eight months pregnant, leading to a hospital visit.

A hospital assistant presented forms for donating the umbilical cord, which included a clause consenting to any future, unspecified research.

The speaker felt uncomfortable with the vague terms of the research consent and sought to opt out.

The incident sparked the speaker's interest in how data is collected and used, particularly concerning children.

The speaker, an anthropologist and mother, launched the 'Child Data Citizen' research project to explore data traces of children.

The speaker argues that the problem of data collection is bigger than just social media sharing.

Children's data is tracked from conception, with parents using various apps and online resources during pregnancy.

Mobile health apps were found to share user data with third parties, including non-health sector companies.

Children are tracked by a variety of technologies in their daily lives, including smart home devices and educational platforms.

Artificial intelligence and predictive analytics are used to profile individuals based on their data traces.

Profiles created from data can impact rights and opportunities, such as in banking, insurance, employment, and the legal system.

The speaker highlights the intrusive nature of data profiling, including the profiling of young children based on various categories.

Algorithmic bias in AI technologies used for predictive policing has been identified as a serious issue.

The speaker emphasizes that algorithms are not objective and are influenced by the cultural context and values of their designers.

The speaker calls for political solutions and recognition of data rights as human rights.

There is a call to action for individuals, organizations, and institutions to demand greater data justice.

The speaker expresses concern for her daughters' future, fearing the impact of algorithmic discrimination and error on their lives.

The speaker concludes by urging collective action to ensure a more just future for children's data rights.

Transcript

00:10
So in 2017, I fell on the bathroom floor in my home in Los Angeles. I was eight months pregnant, and the fall induced my labor, so I was rushed to hospital. As I waited for my doctor to arrive, a young assistant walked into the room with a big smile on her face and a folder in her hands. She had been told that I had agreed to donate the umbilical cord to the hospital, and so she gave me the forms to sign. I started reading the form, the terms and conditions, and suddenly I noticed this sentence. It read something along the lines of: "I agree for the cord cells to be used for any future research." Any future research. Doesn't that sound worryingly vague to you? I immediately thought about science fiction stories on human cloning, and, well, I felt uncomfortable. So I asked her for more information, or whether I could opt out of that specific clause, but she couldn't really answer my questions. And then she said, "Huh, it's just the form. You just sign it." Not just the form: that was my baby's genetic data. And I refused. But that day, I had the perfect example of how natural and how accepted it has become to agree to terms and conditions without giving it a second thought. Every day, every week, we agree to terms and conditions, and when we do this, we provide companies with a lawful right to do whatever they want with our data and with the data of our children. Which makes us wonder: how much data are we giving away of children, and what are its implications?

02:05
I'm an anthropologist, and I'm also the mother of two little girls. And I started to become interested in these questions in 2015, when I suddenly realized that there were vast, almost unimaginable amounts of data traces being produced and collected about children. So I launched a research project, which is called Child Data Citizen, and I aimed at filling in the blank. Now, you may think that I'm here to blame you for posting photos of your children on social media. Well, the truth is that at the end of this talk you may feel differently about that, but that's not really the point. The problem is way bigger than so-called "sharenting." This is about systems, not individuals. You and your habits are not to blame.

02:58
For the very first time in history, we are tracking the individual data of children from long before they're born, sometimes from the moment of conception, and then throughout their lives. You see, when parents decide to conceive, they go online to look for ways to get pregnant, or they download ovulation-tracking apps. When they do get pregnant, they post ultrasounds of their babies on social media, they download pregnancy apps, or they consult Dr. Google for all sorts of things, like, you know, "miscarriage risk when flying" or "abdominal cramps in early pregnancy." I know because I've done it, and many times. And then, when the baby is born, they track every nap, every feed, every life event on different technologies. And all these technologies transform the baby's most intimate behavioral and health data into profit by sharing it with others.

04:02
So to give you an idea of how this works: in 2019, the British Medical Journal published research showing that out of 24 mobile health apps, 19 shared information with third parties, and these third parties shared the information with 216 other organizations. Of these 216 fourth parties, only three belonged to the health sector. The other companies that had access to that data were big tech companies like Google, Facebook or Oracle, they were digital advertising companies, and there was also a consumer credit reporting agency. So you get it right: ad companies and credit agencies may already have data points on little babies.

04:55
But mobile apps, web searches and social media are really just the tip of the iceberg, because children are being tracked by multiple technologies in their everyday lives. They're tracked by home technologies and virtual assistants in their homes. They're tracked by educational platforms and educational technologies in their schools. They're tracked by online records and online portals at their doctor's office. They're tracked by their internet-connected toys, their online games and many, many other technologies. So during my research, a lot of parents came up to me and they were like, "So what? Why does it matter if my children are being tracked? We've got nothing to hide." Well, it matters. It matters because today individuals are not only being tracked; they are also being profiled on the basis of their data traces. Artificial intelligence and predictive analytics are being used to harness as much data as possible of an individual life from different sources: family history, purchasing habits, social media comments. And then they bring this data together to make data-driven decisions about the individual. And these technologies are used everywhere. Banks use them to decide loans. Insurers use them to decide premiums. Recruiters and employers use them to decide whether one is a good fit for a job or not. Also, the police and courts use them to determine whether one is a potential criminal or is likely to recommit a crime.

06:39
We have no knowledge or control over the ways in which those who buy, sell and process our data are profiling us and our children. But these profiles can come to impact our rights in significant ways. To give you an example, in 2018 the New York Times published the news that the data gathered through online college-planning services, which are actually completed by millions of high school kids across the US who are looking for a college program or a scholarship, had been sold to educational data brokers. Now, researchers at Fordham who studied educational data brokers revealed that these companies profiled kids as young as two on the basis of different categories: ethnicity, religion, affluence, social awkwardness and many other random categories. And then they sell these profiles, together with the name of the kid, the home address and the contact details, to different companies, including trade and career institutions, student loan companies and student credit card companies. To push the boundaries, the researchers at Fordham asked an educational data broker to provide them with a list of 14-to-15-year-old girls who were interested in family planning services. The data broker agreed to provide them the list. So imagine how intimate and how intrusive that is for our kids.

08:28
But educational data brokers are really just an example. The truth is that our children are being profiled in ways that we cannot control, but that can significantly impact their chances in life. So we need to ask ourselves: can we trust these technologies when it comes to profiling our children? Can we? My answer is no. As an anthropologist, I believe that artificial intelligence and predictive analytics can be great to predict the course of a disease or to fight climate change. But we need to abandon the belief that these technologies can objectively profile humans and that we can rely on them to make data-driven decisions about individual lives, because they can't profile humans. Data traces are not the mirror of who we are. Humans think one thing and say the opposite, feel one way and act differently. Algorithmic predictions of our digital practices cannot account for the unpredictability and complexity of human experience.

09:33
But on top of that, these technologies are always, always, in one way or another, biased. You see, algorithms are by definition sets of rules or steps that have been designed to achieve a specific result, OK? But these sets of rules or steps cannot be objective, because they've been designed by human beings within a specific cultural context and are shaped by specific cultural values. So when machines learn, they learn from biased algorithms, and they often learn from biased databases as well. At the moment, we're seeing the first examples of algorithmic bias, and some of these examples are frankly terrifying. This year, the AI Now Institute in New York published a report that revealed that the AI technologies being used for predictive policing have been trained on "dirty data." This is basically data that had been gathered during historical periods of known racial bias and non-transparent police practices. Because these technologies are being trained with dirty data, they're not objective, and their outcomes are only amplifying and perpetuating police bias and error.

10:54
So I think we are faced with a fundamental problem in our society. We are starting to trust technologies when it comes to profiling human beings. We know that in profiling humans, these technologies are always going to be biased and are never really going to be accurate. Companies cannot fix this problem by coming up with a technical solution or by appointing a new AI ethics board, because algorithmic error and bias can be reduced but cannot be eradicated, and we cannot protect ourselves from this. So what we need now are actually political solutions. We need governments to recognize that our data rights are our human rights. Until this happens, we cannot hope for a more just future.

11:56
And when I think about my daughters' future, and this is also the reason why I'm here, I am scared. I worry that my daughters are going to be exposed to all sorts of algorithmic discrimination and error. I also worry that the data being collected from them today is going to prevent them from becoming their own persons. You see, the difference between me and my daughters is that there is no public record out there of my childhood. There is no evidence of whether my mom smoked during pregnancy or whether a family member has a criminal history. There is certainly no database of all the stupid things that I did and thought when I was a teenager. But for my daughters, this may be different. The data being collected from them today may be used to judge them in the future and can come to prevent their hopes and dreams. And I don't want that. I think that it's time. It's time that we all step up. It's time that we start working together, as individuals, as organizations and as institutions, and that we demand greater data justice, for us and for our children, before it's too late. Thank you.
