TED Talks - What FACEBOOK And GOOGLE Are Hiding From The World - The Filter Bubble

Scott McLeod
25 Nov 2013 · 11:44

Summary

TL;DR: The transcript discusses the concept of 'filter bubbles' on the internet, where algorithms on platforms like Facebook and Google personalize content based on user behavior, potentially isolating them in echo chambers. The speaker calls for algorithms to be designed with public life and civic responsibility in mind, advocating for transparency and user control over the information they receive. The narrative also includes a personal story of innovation, where an individual improves a prosthetic arm using historical patents, highlighting the potential for technology to enhance lives.

Takeaways

  • 🌐 The internet was initially seen as a democratizing force that could connect the world, but there's a shift in how information flows online that could undermine this potential.
  • 🔍 Facebook and Google use algorithms to personalize content, which can lead to users being in a 'filter bubble' where they only see what the algorithm thinks they want to see.
  • 📊 Personalization algorithms can create a skewed perception of reality, as they are based on past behavior and immediate interests, potentially excluding broader or more challenging perspectives.
  • 🔑 Algorithms lack the ethical considerations that human editors might have, which could lead to a narrow and biased view of the world for users.
  • 🌟 The concept of a 'filter bubble' suggests that the internet is showing users a tailored version of reality rather than a comprehensive one.
  • 🎥 Netflix queues illustrate the struggle between aspirational and impulsive choices, which can be mirrored in the information we consume online.
  • 📰 Traditional media gatekeepers have been replaced by algorithmic ones, which may not have the same ethical considerations for public discourse.
  • 🔄 The internet's promise of connecting people and introducing them to new ideas is at risk if it only reinforces existing beliefs and preferences.
  • 🛠️ The story of Michael Essec shows how the internet can be a tool for personal growth and innovation, but it also highlights the importance of access to diverse information.
  • 🌱 There's a call for the new gatekeepers of the internet to encode a sense of civic responsibility into their algorithms to ensure a balanced and diverse flow of information.

Q & A

  • What is the main concern about the newsfeed algorithm raised by the journalist?

    -The main concern is that the newsfeed algorithm prioritizes personal relevance over global importance, potentially leading to a narrow and self-centered view of the world.

  • What does the term 'filter bubble' refer to in the context of the transcript?

    -A 'filter bubble' refers to a personalized unique universe of information that online algorithms create for each individual based on their online behavior, without their conscious control over what is included or excluded.

  • How does the Facebook algorithm affect the diversity of content in a user's feed?

    -The Facebook algorithm edits out content from users' feeds based on their past interactions, such as which links they click on, potentially leading to a lack of diversity and an echo chamber effect.

  • What is the difference between search results for the same query on Google according to the transcript?

    -Google personalizes search results based on various signals like the type of device, browser, and location of the user, leading to different search outcomes even for the same query at the same time.

  • Why is the lack of standardization in Google search results a concern?

    -The lack of standardization in Google search results can lead to a personalized but potentially skewed view of information, which might not reflect a comprehensive or balanced reality.

  • What is the potential impact of personalized filters on the balance of information consumption?

    -Personalized filters may disrupt the balance of information consumption by favoring content that is immediately appealing over more substantial or important content, leading to an 'information junk food' diet.

  • How does the Netflix example illustrate the struggle between aspirational and impulsive information consumption?

    -The Netflix example shows the struggle between aspirational choices (watching educational content like 'Waiting for Superman') and impulsive choices (re-watching entertaining content like 'Ace Ventura'), highlighting how algorithms can influence this balance.

  • What historical parallel does the speaker draw between current algorithms and early 20th-century newspapers?

    -The speaker draws a parallel between the current algorithms and early 20th-century newspapers by emphasizing the need for algorithms to develop a sense of civic responsibility, similar to how journalistic ethics evolved for newspapers.

  • What responsibility does the speaker believe new gatekeepers of the internet should have?

    -The speaker believes that new gatekeepers (algorithms) should encode a sense of public life and civic responsibility into their code, ensuring transparency and control over the information that users are exposed to.

  • How does the story of Michael Essec relate to the broader theme of the transcript?

    -Michael Essec's story of restoring a motorcycle and improving a prosthetic arm illustrates the theme of taking control and improving upon existing systems, which aligns with the broader message of taking control over algorithmic filters to ensure a more balanced and connected internet experience.
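The click-feedback loop described in the Q&A above can be sketched in a few lines. This is a minimal toy model, not Facebook's actual algorithm: the per-source weights, the `boost`, `decay`, and `threshold` values, and the update rule are all invented for illustration of how click-only feedback silently narrows a feed.

```python
# Toy sketch of click-driven feed filtering (NOT any platform's real algorithm):
# the feed keeps a per-source weight, boosts sources the user clicks on,
# and hides sources whose weight falls below a visibility threshold.

def update_weights(weights, clicked_source, boost=0.2, decay=0.05):
    """Boost the clicked source and slightly decay all the others."""
    new = {}
    for source, w in weights.items():
        if source == clicked_source:
            new[source] = min(1.0, w + boost)
        else:
            new[source] = max(0.0, w - decay)
    return new

def visible_feed(weights, threshold=0.3):
    """Only sources above the threshold survive the filter."""
    return sorted(s for s, w in weights.items() if w >= threshold)

# Start with liberal and conservative friends equally weighted.
weights = {"liberal_friend": 0.5, "conservative_friend": 0.5}

# The user clicks liberal links ten times in a row...
for _ in range(10):
    weights = update_weights(weights, "liberal_friend")

# ...and the conservative friend silently disappears from the feed.
print(visible_feed(weights))  # ['liberal_friend']
```

Note that the user never asked to hide anyone: the asymmetry emerges purely from which links were clicked, which is the "without consulting me" effect the speaker describes.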

Outlines

00:00

🌐 The Filter Bubble: Personalized Web Experiences

The paragraph discusses the concept of a 'filter bubble', where online platforms like Facebook and Google use algorithms to personalize content based on user behavior, potentially leading to a skewed and limited worldview. The speaker recounts their experience of conservative voices disappearing from their Facebook feed due to algorithmic curation. The paragraph also highlights how search engines like Google tailor search results based on various signals, leading to different users seeing different results for the same query. The speaker argues that while personalization can be convenient, it may also create echo chambers, shielding users from diverse perspectives and important information.

05:02

🎥 The Struggle Between Aspirational and Impulsive Information Consumption

This paragraph uses the analogy of Netflix queues to illustrate the struggle between our aspirational and impulsive information consumption habits. The speaker points out that algorithmic filters, which are designed to predict and cater to our immediate preferences, can lead to an overabundance of 'information junk food' and a lack of balanced, nutritious content. The paragraph argues for the need to program algorithms with a sense of civic responsibility and ethical considerations, similar to the role of editors in traditional media, to ensure a well-rounded and diverse information diet for users.

10:04

⚙️ Innovation Through Understanding and Improvisation

The final paragraph shifts focus to a personal story of innovation and problem-solving. It tells the story of Michael, an individual who, after losing part of his arm, sought to improve his prosthetic by researching and reverse-engineering an antique arm. Michael's journey highlights the importance of understanding and taking apart existing solutions to create better, more accessible alternatives. His work on a mechanical arm that could be a viable and affordable option for amputees underscores the potential for technology to empower individuals and improve their quality of life.

Mindmap

Keywords

💡Relevance

Relevance in the context of the video refers to the concept that online platforms like Facebook and Google tailor content to individual users based on their perceived interests. This is exemplified when Zuckerberg mentions a squirrel dying in one's yard being more relevant than people dying in Africa. The video discusses how this focus on relevance can lead to a narrow, personalized web experience, potentially isolating users from diverse perspectives.

💡Filter Bubble

A filter bubble is a personalized algorithmic selection of information that a user sees online, which is created based on their browsing history and preferences. The video argues that these bubbles can lead to an echo chamber effect, where users are only exposed to information that aligns with their existing views, thus limiting their exposure to a diverse range of perspectives. The term is used to illustrate the potential negative impact of personalized content curation on the internet.

💡Algorithmic Editing

Algorithmic editing refers to the process by which online platforms use algorithms to curate and display content to users. The video discusses how Facebook and Google use algorithms to personalize search results and news feeds, which can lead to the invisibility of certain types of content. This concept is central to the video's argument that the internet is becoming less about connecting diverse viewpoints and more about reinforcing existing beliefs.

💡Personalization

Personalization is the customization of web content based on individual user data, such as search history and click behavior. The video highlights how major websites like Yahoo News, Huffington Post, and the New York Times are adopting personalization, which can lead to a filter bubble. The script uses personalization as an example of how the internet is shifting away from a shared experience to a more individualized one.

💡Civic Responsibility

Civic responsibility in the video is discussed in the context of the ethical obligations that internet platforms have towards society. The speaker calls for algorithms to be programmed with a sense of civic responsibility, ensuring that they not only show relevant content but also expose users to challenging and important information. This concept is tied to the broader theme of the internet's role in democracy and public life.

💡Information Junk Food

Information junk food is a metaphor used in the video to describe content that is easily consumable and appealing but lacks nutritional value. The video suggests that algorithmic filters can lead to an overexposure to such content, similar to how junk food can be detrimental to health. This concept is used to critique the potential imbalance in information diets created by personalized algorithms.

💡Gatekeepers

Gatekeepers in the video refer to the entities that control the flow of information. Traditionally, this role was filled by editors in newspapers and other media. However, the video argues that with the rise of the internet, human gatekeepers have been replaced by algorithmic ones. The speaker calls for the new gatekeepers to encode a sense of responsibility into their algorithms, ensuring a diverse and balanced information flow.

💡Prosthetic Innovation

Prosthetic innovation is showcased in the video through the story of Michael, who, after losing part of his arm, sought to improve upon traditional prosthetic designs. He researched historical patents and aimed to create a more reliable and user-friendly prosthetic. This part of the video serves as a metaphor for the broader theme of innovation and improvement, suggesting that understanding and deconstructing existing systems can lead to better solutions.

💡Public Life

Public life in the video is discussed in relation to the role of the internet in fostering community and diverse perspectives. The speaker argues that for the internet to fulfill its potential as a connector of people and ideas, it must facilitate exposure to a wide range of content, not just what is personally relevant. Public life is tied to the video's central message about the importance of a diverse and accessible internet for societal well-being.

💡Transparency

Transparency in the video is used to describe the need for algorithms to be open about how they determine what content users see. The speaker argues that users should be able to understand the rules that govern their online experiences and have control over their filters. Transparency is presented as a key requirement for ensuring that personalization does not lead to the exclusion of important or challenging information.
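The transparency-and-control idea above can also be sketched as code. This is an assumption-laden toy, not any real platform's API: the item schema, the `diversity` knob, and the ranking rule are invented to show what a user-inspectable, user-adjustable filter might look like.

```python
# Toy sketch of a transparent, user-controlled filter: the ranking rule is
# plain inspectable code, and a user-chosen "diversity" knob forces some
# out-of-bubble items through in proportion to its value.

import random

def rank_feed(items, interests, diversity=0.0, seed=0):
    """Rank items by topical relevance, then mix in out-of-bubble items
    in proportion to the user-chosen diversity setting."""
    rng = random.Random(seed)
    relevant = [i for i in items if i["topic"] in interests]
    other = [i for i in items if i["topic"] not in interests]
    n_other = round(diversity * len(relevant))
    feed = relevant[:]
    for item in rng.sample(other, min(n_other, len(other))):
        feed.append(item)
    return feed

items = [
    {"title": "Squirrel video", "topic": "pets"},
    {"title": "Celebrity news", "topic": "gossip"},
    {"title": "Protests in Egypt", "topic": "world"},
]
interests = {"pets", "gossip"}

# diversity=0: pure relevance -> the Egypt story is filtered out.
bubble = rank_feed(items, interests, diversity=0.0)
# diversity=0.5: the filter must let some world news through.
opened = rank_feed(items, interests, diversity=0.5)
print([i["title"] for i in opened])
```

The point of the sketch is that both requirements in the talk are structural: the rule is readable (transparency) and the knob is set by the user, not inferred from clicks (control).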

Highlights

Zuckerberg's analogy of a squirrel dying in your front yard being more relevant than people dying in Africa to illustrate the concept of personalized newsfeeds.

The author's personal experience of growing up in rural Maine and the internet's role in connecting the world.

Concerns about the invisible shift in how information flows online and its potential negative impact on society.

The discovery that Facebook's algorithm was editing out conservative content from the author's feed based on their clicking behavior.

Google's personalized search results and the lack of a standard Google experience due to algorithmic tailoring.

The demonstration of how different search results can be for the same query, using the example of searching for 'Egypt'.

The concept of a 'filter bubble' where the internet shows users what it thinks they want to see, not necessarily what they need to see.

The challenge of maintaining a balanced information diet with algorithmic filters that prioritize immediate clicks.

The historical comparison of human gatekeepers in media to the current algorithmic gatekeepers and the need for ethical considerations in algorithms.

The call for transparency in algorithms and user control over what information gets through their filters.

The importance of the internet in connecting people and introducing them to new ideas, which is at risk with personalized filters.

The story of Michael Essec, who, after losing part of his arm, used his mechanical skills to improve his prosthetic.

Michael's innovative approach to reverse-engineering an antique arm to create a more reliable and repairable prosthetic.

The potential impact of Michael's work on providing affordable and functional prosthetic limbs to amputees.

The message that with understanding and the ability to take things apart, one can make them better, applied to both technology and life.

Transcripts

00:14

Mark Zuckerberg — a journalist was asking him a question about the newsfeed, and the journalist was asking him, "Why is this so important?" And Zuckerberg said, "A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa." And I want to talk about what a web based on that idea of relevance might look like. When I was growing up in a really rural area in Maine, the internet meant something very different to me. It meant a connection to the world. It meant something that would connect us all together. And I was sure that it was going to be great for democracy and for our society. But there's this kind of shift in how information is flowing online, and it's invisible, and if we don't pay attention to it, it could be a real problem.

01:11

I first noticed this in a place I spend a lot of time: my Facebook page. I'm progressive, politically — big surprise — but I've always gone out of my way to meet conservatives. I like hearing what they're thinking about; I like seeing what they link to; I like learning a thing or two. And so I was kind of surprised when I noticed one day that the conservatives had disappeared from my Facebook feed. What it turned out was going on was that Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends' links than on my conservative friends' links. And without consulting me about it, it had edited them out. They disappeared.

01:54

So Facebook isn't the only place that's doing this kind of invisible, algorithmic editing of the web. Google's doing it too. If I search for something and you search for something, even right now at the very same time, we may get very different search results. Even if you're logged out, one engineer told me, there are 57 signals that Google looks at — everything from what kind of computer you're on, to what kind of browser you're using, to where you're located — that it uses to personally tailor your query results. Think about it for a second: there is no standard Google anymore. And the funny thing about this is that it's hard to see. You can't see how different your search results are from anyone else's. But a couple of weeks ago, I asked a bunch of friends to Google "Egypt" and to send me screenshots of what they got. Here's my friend Scott's screenshot, and here's my friend Daniel's screenshot. When you put them side by side, you don't even have to read the links to see how different these two pages are. But when you do read the links, it's really quite remarkable. Daniel didn't get anything about the protests in Egypt at all in his first page of Google results; Scott's results were full of them. And this was the big story of the day at that time. That's how different these results are becoming.

03:20

So it's not just Google and Facebook, either. This is something that's sweeping the web. There are a whole host of companies that are doing this kind of personalization. Yahoo News, the biggest news site on the Internet, is now personalized — different people get different things. Huffington Post, the Washington Post, the New York Times — all flirting with personalization in various ways. And this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see. As Eric Schmidt said, "It will be very hard for people to watch or consume something that has not in some sense been tailored for them."

04:06

So I do think this is a problem. And I think, if you take all of these filters together, if you take all of these algorithms, you get what I call a filter bubble. Your filter bubble is kind of your own personal, unique universe of information that you live in online. What's in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don't decide what gets in — and, more importantly, you don't actually see what gets edited out.

04:39

One of the problems with the filter bubble was discovered by some researchers at Netflix. They were looking at the Netflix queues, and they noticed something kind of funny that a lot of us have probably noticed, which is that there are some movies that just sort of zip right up and out to our houses. They enter the queue, and they just zip right out. So "Iron Man" zips right out, and "Waiting for Superman" can wait for a really long time. What they discovered was that in our Netflix queues there's kind of this epic struggle going on between our future aspirational selves and our more impulsive present selves. We all want to be someone who has watched "Rashomon," but right now we want to watch "Ace Ventura" for the fourth time. So the best editing gives us a bit of both. It gives us a little bit of Justin Bieber and a little bit of Afghanistan. It gives us some information vegetables; it gives us some information dessert. And the challenge with these kinds of algorithmic filters, these personalized filters, is that, because they're mainly looking at what you click on first, they can throw off that balance. And instead of a balanced information diet, you can end up surrounded by information junk food.

06:01

What this suggests is actually that we may have the story about the internet wrong. In a broadcast society — this is how the founding mythology goes — there were these gatekeepers, the editors, and they controlled the flows of information. And along came the Internet, and it swept them out of the way, and it allowed all of us to connect together, and it was awesome. But that's not actually what's happening right now. What we're seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don't yet have the kind of embedded ethics that the editors did. So if algorithms are going to curate the world for us, if they're going to decide what we get to see and what we don't get to see, then we need to make sure that they're not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important — this is what TED does — other points of view.

07:05

And the thing is, we've actually kind of been here before as a society. In 1915, it's not like newspapers were sweating a lot about their civic responsibilities. Then people noticed that they were doing something really important — that, in fact, you couldn't have a functioning democracy if citizens didn't get a good flow of information; that the newspapers were critical because they were acting as the filter. And then journalistic ethics developed. It wasn't perfect, but it got us through the last century. And so now, we're kind of back in 1915 on the web, and we need the new gatekeepers to encode that kind of responsibility into the code that they're writing. I know there are a lot of people here from Facebook and from Google — Larry and Sergey — people who have helped build the web as it is, and I'm grateful for that. But we really need you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility. We need you to make sure that they're transparent enough that we can see what the rules are that determine what gets through our filters. And we need you to give us some control, so that we can decide what gets through and what doesn't. Because I think we really need the Internet to be that thing that we all dreamed of it being. We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it's not going to do that if it leaves us all isolated in a web of one.

09:07

When I was 16, I found this motorcycle. It was a dilapidated piece of junk that wasn't going, but I realized how special it was. The restoration of this motorcycle has taught me to improvise with mechanical things. My name is Michael Essec. I live in Hobart, Tasmania, and I'm the director of a dynamic welding and engineering shop. It's a pretty cool place — we do machine-shop work, we do aluminium welding. Seven years ago I had an accident and lost the best part of my arm. I got a prosthetic on my own insurance. It didn't work: it was very unreliable, it wasn't fixed to my body, and it flopped about. It simply wasn't up to the job. I thought, if you can fix a motorcycle, why can't you fix a prosthetic?

10:01

So I did a search to find out as much information as possible, and I came across the Carnes artificial arm. This is an antique arm from the early 1900s, and in its time it was so mechanically advanced. With a patent search on Google I was able to find every single Carnes patent online, so I set about trying to reverse-engineer it. The arm itself has many good features: it's body-powered, which means it won't break down; you can use it for heavy-duty work; and you can also repair it yourself. And hold it here — exactly. Some of the new things I hope to add to this arm are an extra finger joint, and it would be gear-driven. It will also be lighter, with space-age materials such as magnesium and carbon fibre. "That is so cool." Well, if you think how many people have lost limbs but can't afford the new electronic limbs — it is rewarding knowing that one day people will benefit from something that Mike and I are working on. My hope for this artificial arm is to wear it and share it, so amputees can live a normal life again.

11:26

It doesn't matter if it's a bike or an arm or your life: if you can take it apart, if you can understand it, you can make it better.


Related Tags

Algorithmic Bias, Digital Ethics, Filter Bubble, Information Personalization, Media Consumption, Social Media Impact, Democracy Concerns, Internet Freedom, Civic Responsibility, Innovation Story