The moral bias behind your search results | Andreas Ekström
Summary
TL;DR: In this thought-provoking talk, the speaker explores the myth of unbiased search results in Google, revealing how algorithms are influenced by human biases. Using examples like the manipulation of Michelle Obama's image in search results and a counter-campaign against Anders Behring Breivik, the speaker shows how search engines are not neutral. The talk highlights the intersection of technology and human values, urging a closer connection between the humanities and tech to understand the implications of these biases and their impact on knowledge and society.
Takeaways
- 😀 Google is the preferred search engine for many because it 'works,' is widely recognized, or simply because people don't know of alternatives.
- 😀 People often trust that Google provides the 'best, unbiased' search results, which is a philosophical ideal rather than a reality.
- 😀 Simple search queries, such as factual questions (e.g., 'What is the capital of France?'), are straightforward and easy for Google to answer.
- 😀 More complex questions, like 'Why is there an Israeli-Palestinian conflict?', require subjective interpretation and can't be answered purely by facts.
- 😀 Knowledge is shaped by individual perspectives, making search results inherently biased depending on who is doing the searching.
- 😀 Filters like social circles, media, and personal biases influence how people interpret and value facts, creating knowledge that is context-dependent.
- 😀 Search engines like Google are not equipped to handle the nuances of knowledge, as they are optimized for isolated facts, not subjective interpretations.
- 😀 Google’s algorithms can be manipulated, as demonstrated by the racist image campaign against Michelle Obama in 2009, which distorted search results.
- 😀 Google intervened to clean up biased search results when Michelle Obama’s image was manipulated, but didn’t take similar action with Anders Behring Breivik.
- 😀 The inconsistency in Google's response to manipulation highlights the inherent power Google holds in shaping narratives and public perception.
- 😀 The underlying message is that search engines are not impartial; they reflect the biases of their creators and the inputs of the internet community, making 'unbiased' search results an illusion.
Q & A
Why does the speaker ask students why they use Google?
-The speaker asks students why they use Google to highlight the common perceptions of search engines and to provoke thought about the biases behind Google's search results.
What are the three common answers the speaker receives from students regarding why they use Google?
-The three common answers are: 1) 'Because it works', 2) 'I really don't know of any alternatives', and 3) 'With Google, I'm certain to always get the best, unbiased search result.'
What does the speaker say about the idea of getting unbiased search results from Google?
-The speaker argues that the idea of unbiased search results is a myth. He explains that biases inevitably influence search results due to human intervention, whether in the algorithms or content curation.
What is the difference between looking for 'isolated facts' versus 'knowledge' according to the speaker?
-Isolated facts are straightforward answers that don't require interpretation (e.g., the capital of France). In contrast, knowledge involves complex understanding, where facts are filtered and valued differently based on personal, cultural, or societal factors.
How does the speaker illustrate the complexity of finding knowledge through search engines?
-The speaker uses the example of the Israeli-Palestinian conflict, showing that to understand complex issues, one must weigh many facts, interpret them subjectively, and engage with others' perspectives, which is not easily achieved through a search engine.
How did the racist campaign against Michelle Obama affect search results?
-In 2009, a racist campaign targeted Michelle Obama by uploading distorted images of her with captions and filenames designed to manipulate search results, causing offensive images to appear when searching for her on Google.
What did Google do to address the racist manipulation of search results for Michelle Obama?
-Google intervened by manually cleaning the search results and removing the offensive image, recognizing it as a racist distortion that needed correction.
What was the role of Nikke Lindqvist in the campaign against Anders Behring Breivik's search results?
-Nikke Lindqvist, a Swedish web developer and SEO expert, initiated a campaign to manipulate search results for Anders Behring Breivik by encouraging people to upload images of dog poop with captions and filenames containing Breivik's name, to protest his actions.
Why didn't Google intervene in the case of Anders Behring Breivik's manipulated search results?
-Google chose not to intervene in Breivik's case, leaving the manipulated dog-poop results in place rather than performing the kind of manual cleanup it did for Michelle Obama. The speaker suggests this inconsistency reflects a moral judgment about the person being targeted, not a neutral application of policy.
What underlying issue does the speaker raise about Google's role in determining search results?
-The speaker highlights the issue of power and bias in search engines. He argues that Google's algorithms are influenced by human values, and the company itself decides what is 'true' or 'false' based on its own moral judgments.