Beware online "filter bubbles" | Eli Pariser

TED
2 May 2011 | 09:05

Summary

TL;DR: The speaker discusses the concept of 'filter bubbles' on the internet, where algorithms curate personalized content based on user behavior, potentially isolating individuals from diverse perspectives. They recount personal experiences with Facebook and Google search, highlighting how these platforms tailor content without user consent, leading to a narrow, biased view of the world. The talk calls for a balance between relevance and exposure to challenging or important content, urging tech giants like Facebook and Google to embed civic responsibility into their algorithms and provide users with transparency and control over their digital experiences.

Takeaways

  • 🔍 Mark Zuckerberg's quote about news feed relevance highlights the prioritization of personal interest over global events.
  • 🌐 The speaker grew up in a rural area where the Internet symbolized a connection to the world and a potential boon for democracy and society.
  • 🔄 There's an unnoticed shift in online information flow that could become problematic if overlooked.
  • 📱 The speaker noticed a lack of conservative voices on their Facebook feed due to algorithmic editing based on their clicking habits.
  • 🔍 Google personalizes search results based on numerous signals, leading to vastly different outcomes for different users.
  • 👥 The lack of standardization in Google search results means there's no 'standard Google' anymore.
  • 📊 Personalization is widespread across the web, with major news sites tailoring content to individual users.
  • 🕶 The concept of a 'filter bubble' is introduced, describing a personalized online universe of information that isn't necessarily balanced.
  • 🎬 Netflix queues illustrate the struggle between aspirational and impulsive viewing choices, affected by algorithmic filters.
  • 📚 The speaker suggests that algorithmic filters can disrupt the balance of information, leading to an 'information junk food' environment.
  • 👥 The transition from human to algorithmic gatekeepers is likened to a torch passing, with a call for algorithms to incorporate ethical considerations.
  • 📖 Historical parallels are drawn to the development of journalistic ethics in response to the realization of newspapers' importance to democracy.
  • 🛠️ There's a call to action for tech companies like Facebook and Google to encode public life and civic responsibility into their algorithms.
  • 👀 Transparency and user control over what information passes through filters are emphasized as necessary for a healthy and connected Internet.
  • 🌏 The final message is the need for the Internet to fulfill its dream of connecting people, introducing new ideas, and avoiding isolation in a 'Web of one'.

Q & A

  • What was the main point Zuckerberg was making when he compared a squirrel dying in someone's front yard to people dying in Africa?

    -Zuckerberg was emphasizing the idea of relevance in the news feed. He suggested that what is personally more relevant to an individual, even something as trivial as a squirrel dying in their yard, might be more important to them at that moment than a significant event happening far away, like people dying in Africa.

  • How did the speaker's experience with Facebook illustrate the concept of a filter bubble?

    -The speaker noticed that conservatives had disappeared from their Facebook feed because the algorithm was observing their clicking behavior and noticed they clicked more on liberal friends' links. Facebook, without the speaker's consent, edited out the conservative content, demonstrating how a filter bubble can form based on our online behavior.

  • What is a filter bubble according to the speaker?

    -A filter bubble is a personalized, unique universe of information that one lives in online. It is determined by who you are and what you do online, but you don't decide what gets in or what gets edited out, which can lead to an unbalanced and potentially skewed view of the world.

  • Why might personalized search results on Google be problematic?

    -Personalized search results can be problematic because they create a personalized version of the internet for each user, which means that there is no standard Google anymore. This can lead to a lack of exposure to diverse perspectives and information, potentially isolating users in their own informational echo chambers.

  • How does the speaker describe the shift in information flow online?

    -The speaker describes the shift as invisible and something that could be a real problem if not paid attention to. It involves algorithms editing the web without human consultation, tailoring content to individual users' perceived interests rather than providing a balanced view of the world.

  • What is the concern with algorithmic filters and how they might affect our information diet?

    -The concern is that algorithmic filters, by focusing mainly on immediate clicks and preferences, can throw off the balance of our information diet. Instead of a diverse range of information, users might end up surrounded by information that is more like junk food: appealing, but not necessarily nutritious or well-rounded.

  • What historical comparison does the speaker make regarding the role of newspapers and the current state of the internet?

    -The speaker compares the current state of the internet to the early 20th-century newspapers, where there was a realization that a functioning democracy required a good flow of information. Just as journalistic ethics developed in response, the speaker calls for a similar development of ethical considerations in the algorithms that curate our online experiences.

  • What responsibility does the speaker believe new gatekeepers of the internet should take on?

    -The speaker believes that new gatekeepers, such as the creators of algorithms, should encode a sense of public life and civic responsibility into their code. They should ensure transparency in how their algorithms work and give users control over what information gets through their filters.

  • How does the speaker suggest we might ensure that algorithms show us not just what is relevant, but also what is important or challenging?

    -The speaker suggests that we need to make sure algorithms are not just keyed to relevance but are also programmed to show us uncomfortable, challenging, or important information. This would help maintain a balance and expose us to diverse viewpoints and issues.

  • What does the speaker fear might happen if the internet continues to isolate us in personalized filter bubbles?

    -The speaker fears that if the internet continues to isolate us in personalized filter bubbles, it will not fulfill its potential to connect us all together and introduce us to new ideas, people, and perspectives. Instead, it might leave us in a 'Web of one,' which could be detrimental to society and democracy.

  • What is the speaker's final call to action for those involved in building and shaping the internet?

    -The speaker's final call to action is for those involved in building and shaping the internet, including people from Facebook and Google, to ensure that the algorithms they create have a sense of public life and civic responsibility. They should make the algorithms transparent and give users control over their information filters.

Outlines

00:00

🌐 The Impact of Personalized Algorithms on Web Content

The speaker discusses the influence of personalized algorithms on the content we see online, using Mark Zuckerberg's analogy of a squirrel's death being more relevant than distant human tragedies. The speaker's personal experience with Facebook's algorithm editing out conservative viewpoints from their feed is highlighted. This is followed by an explanation of how Google personalizes search results based on numerous signals, leading to vastly different user experiences. The speaker emphasizes the invisible nature of these algorithmic changes and their potential threat to democracy and societal cohesion, coining the term 'filter bubble' to describe the personalized information universes we inhabit online.

05:00

📚 The Challenge of Maintaining Balanced Information Diets

This paragraph delves into the concept of 'filter bubbles' and their implications for our information consumption. The speaker uses Netflix as an example to illustrate the struggle between aspirational and impulsive choices in media consumption, suggesting that the best curation strikes a balance. The paragraph highlights the shift from human gatekeepers to algorithmic ones and the lack of ethical considerations in these algorithms. The speaker calls for a new era of gatekeepers who encode a sense of public life and civic responsibility into their algorithms. They urge for transparency in how these algorithms work and for user control over the information they receive, to ensure the internet fulfills its dream of connecting people and broadening perspectives.

Keywords

💡Relevance

Relevance refers to the quality of being closely connected or appropriate to the matter at hand. In the video's context, it is used to describe how online platforms like Facebook and Google prioritize content that they believe is more pertinent to a user's interests. Zuckerberg's example of a squirrel dying in one's front yard being more relevant than people dying in Africa illustrates the concept of personalized relevance, which is central to the video's theme of how online algorithms shape our information consumption.

💡Filter Bubble

A filter bubble is the speaker's term for the personalized, unique universe of information that algorithms assemble around each user based on their past online behavior, without the user's direct input. The video discusses how these bubbles can isolate users from diverse perspectives and information, which is crucial for a well-functioning democracy and a connected society.

💡Algorithmic Editing

Algorithmic editing refers to the process by which online platforms use algorithms to curate and filter the content that users see. The video script mentions Facebook and Google as examples of platforms that perform this kind of editing, tailoring search results and news feeds to align with each user's preferences and behaviors. This concept is key to understanding how users might be unknowingly shielded from diverse viewpoints.
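The click-driven editing the speaker describes can be reduced to a toy sketch. This is a hypothetical illustration of the mechanism, not Facebook's or Google's actual code: posts from sources the user rarely clicks are silently dropped from the feed.

```python
# Toy sketch of click-based feed filtering (hypothetical illustration only).
from collections import Counter

def filter_feed(posts, click_history, min_clicks=2):
    """Keep only posts whose source the user has clicked at least min_clicks times."""
    clicks = Counter(click_history)  # source -> number of past clicks (0 if never clicked)
    return [p for p in posts if clicks[p["source"]] >= min_clicks]

posts = [
    {"source": "liberal_friend", "title": "Op-ed A"},
    {"source": "conservative_friend", "title": "Op-ed B"},
]
# The user clicked liberal links three times, conservative links once --
# so the conservative post is edited out, exactly as in the speaker's anecdote.
history = ["liberal_friend", "liberal_friend", "liberal_friend", "conservative_friend"]

print([p["title"] for p in filter_feed(posts, history)])  # → ['Op-ed A']
```

Note that nothing in the output signals that anything was removed, which is the "invisible editing" the talk is concerned with.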

💡Personalization

Personalization is the customization of a product or service for an individual user. In the context of the video, personalization is applied to digital content, where websites like Yahoo News, Huffington Post, and the New York Times tailor the news and information presented to users based on their past behavior and preferences. The speaker argues that while personalization can seem convenient, it may also lead to a narrow and potentially skewed view of the world.

💡Information Diet

Information diet is a metaphorical term that compares the consumption of information to eating a meal. It suggests that just as a balanced diet is important for physical health, a balanced intake of information is crucial for mental and societal health. The video warns that algorithmic filters can disrupt this balance by favoring 'information junk food' over more substantial content, leading to an unbalanced and potentially harmful consumption of news and perspectives.
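One simple way to operationalize a balanced information diet, sketched here as my own illustration (the talk proposes no specific mechanism), is to fill most feed slots by relevance but reserve a few for challenging or important items the ranker would otherwise drop:

```python
# Minimal sketch of a ranker keyed to more than relevance (hypothetical).
def balanced_feed(ranked_by_relevance, important_items, slots=10, reserved=3):
    """Fill most slots by relevance, reserving a few for important/challenging items."""
    feed = ranked_by_relevance[: slots - reserved]  # "information dessert"
    for item in important_items:                    # "information vegetables"
        if len(feed) >= slots:
            break
        if item not in feed:
            feed.append(item)
    return feed

relevant = [f"relevant_{i}" for i in range(10)]
important = ["foreign_news", "opposing_view", "civic_issue"]
# Relevance fills the first 7 slots; the last 3 are reserved for important items.
print(balanced_feed(relevant, important))
```

The reserved-slot parameter makes the trade-off the speaker asks for explicit and auditable, rather than buried in an opaque relevance score.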

💡Gatekeepers

Gatekeepers in the context of the video refer to entities that control the flow of information. Traditionally, this role was filled by human editors in newspapers and other media outlets. However, the video discusses how this role has shifted from humans to algorithms on the internet. The concern is that algorithms, lacking the ethical considerations of human editors, may not provide a diverse or balanced view of information.

💡Civic Responsibility

Civic responsibility is the idea that individuals and organizations have a duty to contribute to the well-being of their community or society. In the video, the speaker calls for the new gatekeepers of information, the algorithm creators, to encode a sense of civic responsibility into their work. This means ensuring that their algorithms promote public life and contribute positively to society, rather than isolating individuals in personalized information bubbles.

💡Public Life

Public life refers to the collective activities and interactions that occur in society outside of one's private life. The video emphasizes the importance of public life in fostering a connected and informed society. It argues that algorithms should not only reflect personal preferences but also promote engagement with public issues and diverse viewpoints, which are essential for a healthy public life.

💡Transparency

Transparency in the video refers to the openness and clarity with which algorithms operate. The speaker argues for the need for algorithms to be transparent enough that users can understand the rules that determine what information passes through their filters. Transparency is crucial for users to trust the systems and to have control over their own information consumption.

💡Control

Control, in this context, means the ability of users to influence and manage what information they receive online. The video suggests that while algorithms can offer personalized experiences, they should also provide users with some level of control over their information intake. This control is important to prevent users from being passively subjected to a limited range of perspectives and to ensure a more democratic and diverse information landscape.

💡Connected Society

A connected society is one where individuals are linked through communication and shared experiences. The video script speaks to the original promise of the internet as a tool for creating such a society, where people can access a wide range of information and perspectives. The concern raised is that the current trend of personalized algorithms might undermine this connectedness by creating isolated 'webs of one'.

Highlights

Mark Zuckerberg's statement on the importance of the news feed and relevance of local events versus global tragedies.

The idea of a web based on relevance and the speaker's personal experience growing up in rural Maine with the Internet as a connection to the world.

Shift in online information flow that is invisible and could pose a problem if not addressed.

Personal observation of conservatives disappearing from the speaker's Facebook feed due to algorithmic editing.

Facebook's algorithmic editing without user consultation, leading to a personalized but potentially skewed information bubble.

Google's personalized search results based on 57 different signals, leading to a lack of a standard Google experience.

Experiment with friends Googling 'Egypt', showing stark differences in search results and the absence of certain topics for some users.

Personalization sweeping the web with major news sites like Yahoo News, Huffington Post, Washington Post, and New York Times.

The concept of a filter bubble, a personal universe of information online that is not under the user's control.

Netflix queues as an example of the struggle between aspirational and impulsive selves in the context of filter bubbles.

The challenge of maintaining a balanced information diet with algorithmic filters that may favor 'information junk food'.

The myth of the Internet as a liberator from gatekeepers and the reality of passing the torch to algorithmic gatekeepers.

The need for algorithms to have embedded ethics and to show users not just relevant but also uncomfortable, challenging, or important information.

Historical comparison to 1915 and the development of journalistic ethics, suggesting a need for similar ethical development in web algorithms.

A call to action for tech giants like Facebook and Google to encode public life and civic responsibility into their algorithms.

The importance of transparency in algorithms and giving users control over what information gets through their filters.

The speaker's vision for the Internet as a connector of people, ideas, and perspectives, rather than isolating individuals in a 'web of one'.

Transcripts

00:15

Mark Zuckerberg, a journalist was asking him a question about the news feed. And the journalist was asking him, "Why is this so important?" And Zuckerberg said, "A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa." And I want to talk about what a Web based on that idea of relevance might look like.

00:40

So when I was growing up in a really rural area in Maine, the Internet meant something very different to me. It meant a connection to the world. It meant something that would connect us all together. And I was sure that it was going to be great for democracy and for our society. But there's this shift in how information is flowing online, and it's invisible. And if we don't pay attention to it, it could be a real problem.

01:10

So I first noticed this in a place I spend a lot of time -- my Facebook page. I'm progressive, politically -- big surprise -- but I've always gone out of my way to meet conservatives. I like hearing what they're thinking about; I like seeing what they link to; I like learning a thing or two. And so I was surprised when I noticed one day that the conservatives had disappeared from my Facebook feed. And what it turned out was going on was that Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends' links than on my conservative friends' links. And without consulting me about it, it had edited them out. They disappeared.

01:54

So Facebook isn't the only place that's doing this kind of invisible, algorithmic editing of the Web. Google's doing it too. If I search for something, and you search for something, even right now at the very same time, we may get very different search results. Even if you're logged out, one engineer told me, there are 57 signals that Google looks at -- everything from what kind of computer you're on to what kind of browser you're using to where you're located -- that it uses to personally tailor your query results. Think about it for a second: there is no standard Google anymore.

02:35

And you know, the funny thing about this is that it's hard to see. You can't see how different your search results are from anyone else's. But a couple of weeks ago, I asked a bunch of friends to Google "Egypt" and to send me screen shots of what they got. So here's my friend Scott's screen shot. And here's my friend Daniel's screen shot. When you put them side-by-side, you don't even have to read the links to see how different these two pages are. But when you do read the links, it's really quite remarkable. Daniel didn't get anything about the protests in Egypt at all in his first page of Google results. Scott's results were full of them. And this was the big story of the day at that time. That's how different these results are becoming.

03:21

So it's not just Google and Facebook either. This is something that's sweeping the Web. There are a whole host of companies that are doing this kind of personalization. Yahoo News, the biggest news site on the Internet, is now personalized -- different people get different things. Huffington Post, the Washington Post, the New York Times -- all flirting with personalization in various ways. And this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see. As Eric Schmidt said, "It will be very hard for people to watch or consume something that has not in some sense been tailored for them."

04:05

So I do think this is a problem. And I think, if you take all of these filters together, you take all these algorithms, you get what I call a filter bubble. And your filter bubble is your own personal, unique universe of information that you live in online. And what's in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don't decide what gets in. And more importantly, you don't actually see what gets edited out.

04:38

So one of the problems with the filter bubble was discovered by some researchers at Netflix. And they were looking at the Netflix queues, and they noticed something kind of funny that a lot of us probably have noticed, which is there are some movies that just sort of zip right up and out to our houses. They enter the queue, they just zip right out. So "Iron Man" zips right out, and "Waiting for Superman" can wait for a really long time. What they discovered was that in our Netflix queues there's this epic struggle going on between our future aspirational selves and our more impulsive present selves. You know we all want to be someone who has watched "Rashomon," but right now we want to watch "Ace Ventura" for the fourth time. (Laughter)

05:27

So the best editing gives us a bit of both. It gives us a little bit of Justin Bieber and a little bit of Afghanistan. It gives us some information vegetables; it gives us some information dessert. And the challenge with these kinds of algorithmic filters, these personalized filters, is that, because they're mainly looking at what you click on first, it can throw off that balance. And instead of a balanced information diet, you can end up surrounded by information junk food.

05:59

What this suggests is actually that we may have the story about the Internet wrong. In a broadcast society -- this is how the founding mythology goes -- in a broadcast society, there were these gatekeepers, the editors, and they controlled the flows of information. And along came the Internet and it swept them out of the way, and it allowed all of us to connect together, and it was awesome. But that's not actually what's happening right now. What we're seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don't yet have the kind of embedded ethics that the editors did.

06:43

So if algorithms are going to curate the world for us, if they're going to decide what we get to see and what we don't get to see, then we need to make sure that they're not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important -- this is what TED does -- other points of view.

07:03

And the thing is, we've actually been here before as a society. In 1915, it's not like newspapers were sweating a lot about their civic responsibilities. Then people noticed that they were doing something really important. That, in fact, you couldn't have a functioning democracy if citizens didn't get a good flow of information, that the newspapers were critical because they were acting as the filter, and then journalistic ethics developed. It wasn't perfect, but it got us through the last century. And so now, we're kind of back in 1915 on the Web. And we need the new gatekeepers to encode that kind of responsibility into the code that they're writing.

07:51

I know that there are a lot of people here from Facebook and from Google -- Larry and Sergey -- people who have helped build the Web as it is, and I'm grateful for that. But we really need you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility. We need you to make sure that they're transparent enough that we can see what the rules are that determine what gets through our filters. And we need you to give us some control so that we can decide what gets through and what doesn't.

08:24

Because I think we really need the Internet to be that thing that we all dreamed of it being. We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it's not going to do that if it leaves us all isolated in a Web of one. Thank you. (Applause)


Related Tags
Algorithmic Editing, Filter Bubble, Online Personalization, Digital Ethics, Information Diet, Social Media Impact, Web Relevance, Internet Democracy, Google Search, Facebook Feed