What do tech companies know about your children? | Veronica Barassi | TEDxMileHigh
Summary
TLDR: The speaker, an anthropologist and mother, recounts her experience with a hospital consent form that raised concerns about data privacy. She delves into the pervasive data collection on children from conception, highlighting how their intimate information is shared for profit. The talk exposes the profiling of children by AI and predictive analytics, which can influence their life chances, and warns of the biases in these technologies. She calls for political solutions to ensure data rights as human rights to protect future generations from algorithmic discrimination.
Takeaways
- 🤰 The speaker shared a personal experience of being rushed to the hospital while pregnant, where she was pressured to agree to donate her umbilical cord without fully understanding the terms.
- 📄 She highlighted the issue of agreeing to terms and conditions without understanding them, which can lead to the unknowing sharing of personal and genetic data.
- 👶 The speaker, an anthropologist and mother, discussed the vast amount of data being collected about children from before birth, raising concerns about privacy and consent.
- 🔎 She launched a research project called 'Child Data Citizen' to explore the implications of this data collection on children's rights and futures.
- 📱 Mobile health apps and other technologies are transforming intimate behavioral and health data into profit by sharing it with third parties, often beyond the health sector.
- 🏠 Children are being tracked by various technologies in their everyday life, including home technologies, educational platforms, and online records, without comprehensive understanding or control.
- 🤖 Artificial intelligence and predictive analytics are profiling individuals based on their data traces, which can impact rights and opportunities significantly.
- 🏦 Profiling by AI is used in various sectors like banking, insurance, recruitment, and law enforcement, often without transparency or accuracy.
- 🔒 The speaker argues that we cannot trust these technologies with profiling our children due to their inherent biases and inaccuracies.
- 🧐 Algorithms are not objective; they are designed within specific cultural contexts and are shaped by cultural values, leading to potential biases in AI decisions.
- 🏛️ Political solutions are needed to recognize data rights as human rights and to ensure a more just future for our data and our children's data.
- 👧 The speaker expressed fear for her daughters' future, where current data collection could lead to algorithmic discrimination and limit their opportunities to become their own persons.
Q & A
What significant event occurred in 2017 that prompted the speaker to question the terms and conditions of data sharing?
- The speaker fell on the bathroom floor while eight months pregnant, which induced labor. At the hospital, she was presented with forms to donate the umbilical cord and noticed a clause about using the cord cells for any future research without specifying the purpose, which made her uncomfortable.
What is the speaker's profession and how does it relate to her interest in data privacy?
- The speaker is an anthropologist and a mother of two. Her profession involves studying human societies and cultures, which led her to become interested in the vast amounts of data being collected about children and the implications of such data collection.
What is the name of the research project the speaker launched to investigate the issue of data collection on children?
- The research project is called 'Child Data Citizen' and aims to explore and understand the impact of data collection on children.
What was the main concern the speaker had when she noticed the clause about future research in the hospital forms?
- The speaker was concerned about the vagueness of the clause, which allowed for the use of her baby's genetic data for any future research without specifying the purpose or obtaining more informed consent.
How does the speaker describe the current state of data sharing in apps and online platforms?
- The speaker describes it as a system where data is often shared with third parties without users' full awareness or consent. This is exemplified by the British Medical Journal's research showing that many mobile health apps share information with third parties, including non-health sector companies.
What is the potential impact of data profiling on individuals, according to the speaker?
- Data profiling can significantly affect individuals' rights and opportunities, as it is used by banks, insurers, employers, and even the police and courts to make decisions about loans, premiums, job suitability, and criminal potential.
Why does the speaker believe that relying on AI and predictive analytics for profiling humans is problematic?
- The speaker believes it is problematic because these technologies are not objective and are built on biased algorithms and databases. They cannot account for the unpredictability and complexity of human experience and are inherently flawed.
What example does the speaker provide to illustrate the intrusive nature of data profiling on children?
- The speaker mentions an example where educational data brokers profiled children as young as two years old based on various categories and sold these profiles, including personal details, to companies that could use the information for marketing purposes.
What is the speaker's main argument against the current use of technology in profiling children?
- The speaker argues that the current use of technology in profiling children is invasive and potentially harmful, as it can lead to algorithmic discrimination and error, and may prevent children from becoming their own persons due to future judgments based on collected data.
What solution does the speaker propose to address the issue of data rights and profiling?
- The speaker proposes political solutions, urging governments to recognize data rights as human rights and to work towards greater data justice for individuals and children.
What is the speaker's ultimate concern regarding the data collected on her daughters?
- The speaker is concerned that the data collected on her daughters may be used to judge them in the future, potentially preventing them from achieving their hopes and dreams based on algorithmic decisions and biases.