TED Talks - What FACEBOOK And GOOGLE Are Hiding From The World - The Filter Bubble
Summary
TL;DR: The transcript discusses the concept of 'filter bubbles' on the internet, where algorithms on platforms like Facebook and Google personalize content based on user behavior, potentially isolating users in echo chambers. The speaker calls for algorithms to be designed with public life and civic responsibility in mind, advocating for transparency and user control over the information they receive. The narrative also includes a personal story of innovation, in which an individual improves a prosthetic arm using historical patents, highlighting the potential for technology to enhance lives.
Takeaways
- 🌐 The internet was initially seen as a democratizing force that could connect the world, but there's a shift in how information flows online that could undermine this potential.
- 🔍 Facebook and Google use algorithms to personalize content, which can lead to users being in a 'filter bubble' where they only see what the algorithm thinks they want to see.
- 📊 Personalization algorithms can create a skewed perception of reality, as they are based on past behavior and immediate interests, potentially excluding broader or more challenging perspectives.
- 🔑 Algorithms lack the ethical considerations that human editors might have, which could lead to a narrow and biased view of the world for users.
- 🌟 The concept of a 'filter bubble' suggests that the internet is showing users a tailored version of reality rather than a comprehensive one.
- 🎥 Netflix queues illustrate the struggle between aspirational and impulsive choices, which can be mirrored in the information we consume online.
- 📰 Traditional media gatekeepers have been replaced by algorithmic ones, which may not have the same ethical considerations for public discourse.
- 🔄 The internet's promise of connecting people and introducing them to new ideas is at risk if it only reinforces existing beliefs and preferences.
- 🛠️ The story of Michael Essec shows how the internet can be a tool for personal growth and innovation, but it also highlights the importance of access to diverse information.
- 🌱 There's a call for the new gatekeepers of the internet to encode a sense of civic responsibility into their algorithms to ensure a balanced and diverse flow of information.
Q & A
What is the main concern about the newsfeed algorithm raised by the journalist?
-The main concern is that the newsfeed algorithm prioritizes personal relevance over global importance, potentially leading to a narrow and self-centered view of the world.
What does the term 'filter bubble' refer to in the context of the transcript?
-A 'filter bubble' refers to a personalized unique universe of information that online algorithms create for each individual based on their online behavior, without their conscious control over what is included or excluded.
How does the Facebook algorithm affect the diversity of content in a user's feed?
-The Facebook algorithm edits out content from users' feeds based on their past interactions, such as which links they click on, potentially leading to a lack of diversity and an echo chamber effect.
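The click-driven filtering described in this answer can be illustrated with a toy sketch. This is purely hypothetical code for illustration; real newsfeed ranking uses far more signals than click counts, and the source names and threshold below are invented:

```python
# Toy illustration of click-based feed filtering (hypothetical; real
# ranking systems are far more complex than a click-count threshold).

def filter_feed(posts, click_counts, threshold=2):
    """Keep only posts whose source the user has clicked on often.

    posts: list of (source, headline) pairs
    click_counts: dict mapping source -> past clicks by this user
    """
    return [
        (source, headline)
        for source, headline in posts
        if click_counts.get(source, 0) >= threshold
    ]

feed = [
    ("liberal_friend", "Op-ed on climate policy"),
    ("conservative_friend", "Column on fiscal reform"),
    ("liberal_friend", "Protest coverage"),
]
clicks = {"liberal_friend": 9, "conservative_friend": 1}

# The conservative friend's posts vanish without the user deciding it.
print(filter_feed(feed, clicks))
```

Note how the filtering happens silently: nothing in the output signals that an entire source was dropped, which is exactly the invisibility the speaker objects to.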
What is the difference between search results for the same query on Google according to the transcript?
-Google personalizes search results based on various signals like the type of device, browser, and location of the user, leading to different search outcomes even for the same query at the same time.
Why is the lack of standardization in Google search results a concern?
-The lack of standardization in Google search results can lead to a personalized but potentially skewed view of information, which might not reflect a comprehensive or balanced reality.
What is the potential impact of personalized filters on the balance of information consumption?
-Personalized filters may disrupt the balance of information consumption by favoring content that is immediately appealing over more substantial or important content, leading to an 'information junk food' diet.
How does the Netflix example illustrate the struggle between aspirational and impulsive information consumption?
-The Netflix example shows the struggle between aspirational choices (watching educational content like 'Waiting for Superman') and impulsive choices (re-watching entertaining content like 'Ace Ventura'), highlighting how algorithms can influence this balance.
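The "bit of both" editing the speaker asks for can also be sketched. The code below is a hypothetical illustration, not anything Netflix or Facebook actually does: instead of ranking purely by predicted clicks, it interleaves high-engagement items with important but less-clicked ones (the headlines and numbers are invented):

```python
# Hypothetical sketch of a "balanced information diet": after every
# `ratio` high-engagement picks, insert one important civic item,
# rather than sorting the whole feed by predicted clicks alone.

def balanced_feed(items, ratio=2):
    """items: list of (headline, predicted_clicks, is_important) tuples.

    Returns headlines with 'dessert' and 'vegetables' interleaved.
    """
    # Split into entertainment ("dessert") and civic ("vegetables"),
    # each sorted by predicted engagement, highest first.
    candy = sorted((i for i in items if not i[2]), key=lambda i: -i[1])
    vegetables = sorted((i for i in items if i[2]), key=lambda i: -i[1])
    feed = []
    while candy or vegetables:
        feed.extend(candy[:ratio])   # a few crowd-pleasers...
        del candy[:ratio]
        if vegetables:
            feed.append(vegetables.pop(0))  # ...then one important item
    return [headline for headline, _, _ in feed]

items = [
    ("Celebrity gossip", 90, False),
    ("Election explainer", 20, True),
    ("Cat video", 80, False),
    ("Afghanistan report", 15, True),
]
print(balanced_feed(items))
# -> ['Celebrity gossip', 'Cat video', 'Election explainer', 'Afghanistan report']
```

A pure click-ranked feed would have buried both civic items at the bottom; the interleaving guarantees they surface, which is the design change the talk argues for.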
What historical parallel does the speaker draw between current algorithms and early 20th-century newspapers?
-The speaker draws a parallel between the current algorithms and early 20th-century newspapers by emphasizing the need for algorithms to develop a sense of civic responsibility, similar to how journalistic ethics evolved for newspapers.
What responsibility does the speaker believe new gatekeepers of the internet should have?
-The speaker believes that new gatekeepers (algorithms) should encode a sense of public life and civic responsibility into their code, ensuring transparency and control over the information that users are exposed to.
How does the story of Michael Essec relate to the broader theme of the transcript?
-Michael Essec's story of restoring a motorcycle and improving a prosthetic arm illustrates the theme of taking control and improving upon existing systems, which aligns with the broader message of taking control over algorithmic filters to ensure a more balanced and connected internet experience.
Outlines
🌐 The Filter Bubble: Personalized Web Experiences
The paragraph discusses the concept of a 'filter bubble', where online platforms like Facebook and Google use algorithms to personalize content based on user behavior, potentially leading to a skewed and limited worldview. The speaker recounts their experience of conservative voices disappearing from their Facebook feed due to algorithmic curation. The paragraph also highlights how search engines like Google tailor search results based on various signals, leading to different users seeing different results for the same query. The speaker argues that while personalization can be convenient, it may also create echo chambers, shielding users from diverse perspectives and important information.
🎥 The Struggle Between Aspirational and Impulsive Information Consumption
This paragraph uses the analogy of Netflix queues to illustrate the struggle between our aspirational and impulsive information consumption habits. The speaker points out that algorithmic filters, which are designed to predict and cater to our immediate preferences, can lead to an overabundance of 'information junk food' and a lack of balanced, nutritious content. The paragraph argues for the need to program algorithms with a sense of civic responsibility and ethical considerations, similar to the role of editors in traditional media, to ensure a well-rounded and diverse information diet for users.
⚙️ Innovation Through Understanding and Improvisation
The final paragraph shifts focus to a personal story of innovation and problem-solving. It tells the story of Michael, an individual who, after losing part of his arm, sought to improve his prosthetic by researching and reverse-engineering an antique arm. Michael's journey highlights the importance of understanding and taking apart existing solutions to create better, more accessible alternatives. His work on a mechanical arm that could be a viable and affordable option for amputees underscores the potential for technology to empower individuals and improve their quality of life.
Keywords
💡Relevance
💡Filter Bubble
💡Algorithmic Editing
💡Personalization
💡Civic Responsibility
💡Information Junk Food
💡Gatekeepers
💡Prosthetic Innovation
💡Public Life
💡Transparency
Highlights
Zuckerberg's remark that a squirrel dying in your front yard may be more relevant to you than people dying in Africa, used to illustrate personalized newsfeeds.
The author's personal experience of growing up in rural Maine and the internet's role in connecting the world.
Concerns about the invisible shift in how information flows online and its potential negative impact on society.
The discovery that Facebook's algorithm was editing out conservative content from the author's feed based on their clicking behavior.
Google's personalized search results and the lack of a standard Google experience due to algorithmic tailoring.
The demonstration of how different search results can be for the same query, using the example of searching for 'Egypt'.
The concept of a 'filter bubble' where the internet shows users what it thinks they want to see, not necessarily what they need to see.
The challenge of maintaining a balanced information diet with algorithmic filters that prioritize immediate clicks.
The historical comparison of human gatekeepers in media to the current algorithmic gatekeepers and the need for ethical considerations in algorithms.
The call for transparency in algorithms and user control over what information gets through their filters.
The importance of the internet in connecting people and introducing them to new ideas, which is at risk with personalized filters.
The story of Michael Essec, who, after losing part of his arm, used his mechanical skills to improve his prosthetic.
Michael's innovative approach to reverse-engineering an antique arm to create a more reliable and repairable prosthetic.
The potential impact of Michael's work on providing affordable and functional prosthetic limbs to amputees.
The message that with understanding and the ability to take things apart, one can make them better, applied to both technology and life.
Transcripts
Mark Zuckerberg a journalist was asking
him a question about the newsfeed and
the journalist was asking him you know
why is this so important and Zuckerberg
said a squirrel dying in your front yard
may be more relevant to your interest
right now than people dying in Africa
and I want to talk about what a web
based on that idea of relevance might
look like so when I was growing up in a
in a really rural area in Maine you know
the internet meant something very
different to me it meant a connection to
the world it meant something that would
connect us all together and I was sure
that it was going to be great for
democracy and for our society but
there's this kind of shift in how
information is flowing online and it's
invisible and if we don't pay attention
to it it could be a real problem so I
first noticed this in a place I spend a
lot of time my Facebook page I'm
progressive politically big surprise but
I've always you know gone out of my way
to meet conservatives I like hearing
what they're thinking about
I like seeing what they link to I like
learning a thing or two and so I was
kind of surprised when I noticed one day
that the Conservatives had disappeared
from my Facebook feed and what it turned
out was going on was that Facebook was
looking at which links I clicked on and
it was noticing that actually I was
clicking more on my liberal friends
links than on my conservative friends
links and without consulting me about it
it had edited them out they disappeared
so Facebook isn't the only place that's
doing this kind of invisible algorithmic
editing of the web Google is doing it
too if I search for something and you
search for something even right now at
the very same time we may get very
different search results even if you're
logged out one engineer told me there
are 57 signals that Google looks at
everything from what kind of computer
you're on to what kind of browser you're
using
to where you're located that it uses to
personally tailor your query results
think about it for a second
there is no standard Google anymore and
you know the funny thing about this is
that it's hard to see you can't see how
different your search results are from
anyone else's but a couple of weeks ago
I asked a bunch of friends to Google
Egypt and to send me screenshots of what
they got so here's my friend Scott's
screenshot and here's my friend Daniel's
screenshot when you put them
side-by-side you don't even have to read
the links to see how different these two
pages are but when you do read the links
it's really quite remarkable Daniel
didn't get anything about the protests
in Egypt at all in his first page of
Google results
Scott's results were full of them and
this was the big story of the day at
that time that's how different these
results are becoming so it's not just
Google and Facebook either you know this
is something that's sweeping the web
there are a whole host of companies that
are doing this kind of personalization
Yahoo News the biggest news site on the
Internet is now personalized different
people get different things Huffington
Post Washington Post New York Times all
flirting with personalization in various
ways and where this this moves us very
quickly toward a world in which the
Internet is showing us what it thinks we
want to see but not necessarily what we
need to see as Eric Schmidt said it'll
be very hard for people to watch or
consume something that has not in some
sense been tailored for them so I do
think this is a problem and I think if
you take all of these filters together
if you take all of these algorithms you
get what I call a filter bubble and your
filter bubble is kind of your own
personal unique universe of information
that you live in online and what's in
your filter bubble depends on who you
are and it depends on what you do but
the thing is that you don't decide what
gets in and more importantly you don't
actually see
what gets edited out so one of the
problems with the filter bubble was
discovered by some researchers at
Netflix and they were looking at the
Netflix queues and they notice something
kind of funny that a lot of us probably
have noticed which is there are some
movies that just sort of zip right up
and out to our houses they enter the
queue they just zip right out so Iron Man
zips right out right and Waiting for
Superman can wait for a really long time
what they discovered was that in our
Netflix queues there's kind of this epic
struggle going on between our future
aspirational selves and our more
impulsive present selves you know we all
want to be someone who has watched
Rashomon but right now we want to watch
Ace Ventura for the 4th time so the best
editing gives us a bit of both it gives
us a little bit of Justin Bieber and a
little bit of Afghanistan it gives us
some information vegetables it gives us
some information dessert and the
challenge with these kind of algorithmic
filters these personalized filters is
that because they're mainly looking at
what you click on first
it can throw off that balance and
instead of a balanced information
diet you can end up surrounded by
information junk food so what this
suggests is actually that we may have
the story about the internet wrong in a
broadcast society you know this is how
the founding mythology goes right in a
broadcast society there were these
gatekeepers the editors and they
controlled the flows of information and
Along Came the internet and it swept
them out of the way and it allowed us
all of us to connect together and it was
awesome but that's not actually what's
happening right now what we're seeing is
more of a passing of the torch from
human gatekeepers to algorithmic ones
and the thing is that the algorithms
don't yet have the kind of embedded
ethics that the editors did
so if algorithms are going to curate the
world for us if they're going to decide
what we get to see and what we don't get
to see then we need to make sure
that they're not just keyed to relevance
we need to make sure that they also show
us things that are uncomfortable or
challenging or important this is what
TED does right other points of view and
the thing is we've actually kind of been
here before as a society in 1915 it's
not like newspapers were sweating a lot
about their civic responsibilities then
people kind of noticed that they were
doing something really important that in
fact you couldn't have a functioning
democracy if citizens didn't get a good
flow of information that the newspapers
were critical because they were acting
as the filter and that journalistic
ethics developed it wasn't perfect but
it got us through the last century and
so now we're kind of back in 1915 on the
web and we need the new gatekeepers to
encode that kind of responsibility into
the code that they're writing you know I
know there are a lot of people here from
Facebook and from Google Larry and
Sergey who you know people who have
helped build the web as it is and I'm
grateful for that but we really need
you to make sure that these algorithms
have encoded in them a sense of the
public life a sense of civic
responsibility we need you to make sure
that they're transparent enough that we
can see what the rules are that
determine what gets through our filters
and we need you to give us some control
so that we can decide what gets through
and what doesn't because I think we
really need the Internet to be that
thing that we all dreamed of it being we
need it to connect us all together we
need it to introduce us to new ideas and
new people and different perspectives
and it's not going to do that if it
leaves us all isolated in a web of one
when I was 16 I found this motorcycle it
was a dilapidated piece of junk
that wasn't going but I realized how
special it was
restoration of this motorcycle has
taught me to improvise with mechanical
things my name is Michael Essec I live
in Hobart Tasmania and director of
Dynamic Welding and Engineering it's
pretty cool place we do machine shop
work we do aluminum welding seven years
ago I had an accident and lost the best
part of my arm
I got a prosthetic on my own insurance
it didn't work it was very unreliable it
wasn't fixed to my body and flopped
about it simply wasn't up to the job I
thought if you can fix a motorcycle
why can't you fix a prosthetic
so I did a search to find out as much
information as possible and I came
across the Carnes artificial arm this is an
antique arm from the early 1900s and in
its time it was so mechanically
advanced with patent search on Google I
was able to find every single Carnes
patent online so I set about trying to
reverse engineer it
the arm itself has many good features
it's body-powered which means it won't
break down you can use it for heavy-duty
work and you can also repair it yourself
and hold it here and hold a BF exactly
some of the new things I hope to add to
this arm are an extra finger joint and it
would be gear-driven it will also be
lighter with space-age materials such as
magnesium and carbon fiber that is so
cool well if you think how many people
have lost limbs but can't afford the new
electronic limbs I mean it is rewarding
knowing that one day people will benefit
from something that Mike and I are
working on my hope for this artificial
arm is to wear it and share it
so amputees can live a normal life again
it doesn't matter if it's a bike or an
arm or your life if you can take it
apart if you can understand it you can
make it better