Is the internet expanding or narrowing our minds?

Above The Noise
6 Apr 2022 · 08:59

Summary

TL;DR: In this thought-provoking video, Myles Bess explores the impact of social media algorithms on our worldviews. He delves into how these algorithms, driven by our online behaviors, create personalized content bubbles that can both enhance and limit our perspectives. The video raises concerns about the potential for echo chambers and the amplification of misinformation, while also offering strategies to break free from these constraints and foster a more diverse and critical engagement with online content.

Takeaways

  • 📱 Social media platforms like TikTok and YouTube use recommendation algorithms to curate content based on user behavior and preferences.
  • 🔍 Recommendation algorithms work by collecting data on user interactions such as likes, comments, shares, and watching habits to predict and serve content that appeals to the user.
  • 🌟 The sheer volume of content uploaded daily makes it impossible for users to sift through everything themselves, hence the need for algorithms to filter and recommend content.
  • 💸 Social media companies rely on keeping users engaged for longer periods to increase ad revenue, which is why algorithms aim to show content that users are likely to engage with.
  • 🔒 The exact mechanisms of these algorithms are closely guarded secrets, known as the 'secret sauce' of these platforms.
  • 🌐 Algorithms also consider trends and popular content that many users are engaging with, which can sometimes lead to the amplification of misinformation.
  • 🤔 The internet can both expand and narrow the mind, as algorithms may create 'filter bubbles' or 'echo chambers' where users are only exposed to content that aligns with their existing beliefs.
  • 👀 Users have some control over their algorithmic feeds by the accounts they follow and the content they engage with, which can shape their online experience.
  • 💡 To break out of echo chambers, users can take proactive steps like creating separate accounts to explore different interests or being mindful of the content they engage with.
  • 🌟 Social media algorithms are not perfect and can lead users down dangerous rabbit holes or affect their mental well-being, which is a concern that needs to be addressed.
  • 📢 Users are encouraged to be intentional with their social media use, to engage with a diverse range of content, and to use their voice to create and share edifying content.
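The engagement-driven ranking described in these takeaways can be illustrated with a toy sketch. The weights, field names, and scoring rule below are entirely made up for illustration; as the video notes, the real platform algorithms are closely guarded secrets.

```python
# Toy engagement-based recommender (hypothetical weights and fields).
def score(post, user_interests):
    # Made-up weighting: shares count more than likes, comments least.
    engagement = 1.0 * post["likes"] + 2.0 * post["shares"] + 0.5 * post["comments"]
    # Posts matching the user's interests get full weight; others are damped.
    relevance = 1.0 if post["topic"] in user_interests else 0.1
    return engagement * relevance

def recommend(posts, user_interests, k=2):
    # Serve the k highest-scoring posts as the user's "feed".
    return sorted(posts, key=lambda p: score(p, user_interests), reverse=True)[:k]

posts = [
    {"topic": "cooking",  "likes": 10,  "shares": 2,   "comments": 5},
    {"topic": "sports",   "likes": 500, "shares": 100, "comments": 50},
    {"topic": "knitting", "likes": 30,  "shares": 1,   "comments": 2},
]
feed = recommend(posts, {"cooking", "knitting"})
```

Even in this toy version, the heavily engaged sports post outranks the user's stated interests, mirroring the takeaway that highly popular content gets amplified regardless of relevance or accuracy.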

Q & A

  • What is the main topic discussed in the video script?

    -The main topic discussed in the video script is the impact of social media recommendation algorithms on users' perspectives and whether the internet is expanding or narrowing their minds.

  • What is a recommendation algorithm?

    -A recommendation algorithm is a set of computer instructions that social media apps use to decide what content to show users based on their interests and engagement.

  • How does the amount of content uploaded to platforms like YouTube affect users?

    -The vast amount of content uploaded to platforms like YouTube means users would be overwhelmed without recommendation algorithms to filter and suggest content that aligns with their interests.

  • What role do recommendation algorithms play in the content users see on social media?

    -Recommendation algorithms play a crucial role by identifying and suggesting content that is likely to appeal to users based on their past interactions, such as likes, comments, and watch history.

  • How do social media platforms benefit from keeping users engaged with their apps?

    -Social media platforms benefit from keeping users engaged by showing more ads, which increases ad revenue, and collecting more data for targeted advertising.

  • What is a 'filter bubble' or 'echo chamber' in the context of social media?

    -A 'filter bubble' or 'echo chamber' refers to a situation where users are only exposed to content that reinforces their existing beliefs and interests, limiting their exposure to diverse perspectives.

  • How can users break out of filter bubbles created by social media algorithms?

    -Users can break out of filter bubbles by actively seeking out diverse content, following a variety of accounts, and being mindful of the types of posts they engage with.

  • What are some strategies mentioned in the script to make social media algorithms work better for users?

    -Some strategies mentioned include being mindful of how algorithms work, thinking before liking and sharing, breaking out of echo chambers, and using one's voice to create and share diverse content.

  • How can users be more intentional with their social media usage according to the script?

    -Users can be more intentional by understanding how algorithms influence their feeds, being selective with their interactions, and actively exploring content outside their usual preferences.

  • What is the importance of engaging with a variety of content on social media platforms?

    -Engaging with a variety of content helps expose users to new ideas and perspectives, preventing them from getting trapped in echo chambers and promoting a more well-rounded understanding of different topics.

  • What is the potential downside of social media algorithms as discussed in the script?

    -The potential downside of social media algorithms is that they can lead users into echo chambers, reinforce existing beliefs without challenge, and potentially expose users to extreme or harmful content.

Outlines

00:00

🌐 The Impact of Social Media Algorithms

The paragraph introduces the concept of recommendation algorithms on social media platforms, particularly focusing on TikTok and YouTube. It discusses how these algorithms work to curate content based on user engagement and preferences, leading to a personalized feed. The speaker, Myles Bess, humorously admits to being so engrossed in vegan wing recipes on TikTok that he was unaware of the Super Bowl, highlighting the potential for these algorithms to create 'filter bubbles.' The paragraph also touches on the sheer volume of content uploaded daily and the role of algorithms in managing this overflow, ensuring users are presented with content that aligns with their interests.
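The upload figures Myles quotes in this segment (500 hours per minute, 720,000 hours per day, 82.2 years) hold up to a quick arithmetic check:

```python
# Verify the video's YouTube upload math.
hours_per_minute = 500                        # hours of video uploaded per minute
hours_per_day = hours_per_minute * 60 * 24    # 60 minutes/hour * 24 hours/day
years_per_day = hours_per_day / 24 / 365      # expressed as years of viewing time
```

`hours_per_day` works out to 720,000, and `years_per_day` to about 82.2, matching the figures in the video.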

05:03

🔍 Navigating the Echo Chamber

This paragraph delves into the consequences of social media algorithms, such as the creation of echo chambers where users are only exposed to content that reinforces their existing beliefs. It mentions the research indicating that users can become trapped in filter bubbles, which can limit their exposure to diverse perspectives. The paragraph also discusses how some users have found ways to manipulate algorithms to their advantage, such as creating separate accounts to explore different viewpoints. The speaker emphasizes the importance of being aware of the algorithms' influence and suggests strategies for breaking out of echo chambers, such as being mindful of the content we engage with and seeking out diverse perspectives.

Keywords

💡TikTok

TikTok is a social media platform known for its short-form videos, often featuring music, dancing, and other creative content. In the video, TikTok is used as an example of a platform where recommendation algorithms play a significant role in shaping the user experience. The host mentions spending hours on TikTok, which humorously leads to him missing out on other events like the Super Bowl, illustrating the platform's engaging nature.

💡Recommendation Algorithms

Recommendation algorithms are sets of instructions used by social media platforms to determine the content that users see on their feeds. These algorithms are crucial in managing the overwhelming amount of content uploaded daily. The video discusses how these algorithms learn from user interactions to curate personalized content, potentially leading to a narrow or broad range of interests depending on how they are engaged with.

💡Filter Bubbles

Filter bubbles refer to the phenomenon where online platforms, through algorithms, only show users content that aligns with their existing views or interests. This can lead to a lack of exposure to diverse perspectives. The video highlights the potential for social media algorithms to create echo chambers, where users are only exposed to content that reinforces their beliefs, limiting their understanding of different viewpoints.
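The filter-bubble effect can be sketched as a simple rich-get-richer loop. The topics, weights, and update rule here are hypothetical; actual feed logic is far more complex, but the feedback dynamic is the same: engagement boosts a topic, so it shows up more, so it gets engaged with more.

```python
import random

random.seed(0)  # fixed seed for a reproducible run
# Every topic starts with equal weight; the feed samples in proportion
# to weight, and each engagement boosts the shown topic's weight.
weights = {"cooking": 1.0, "sports": 1.0, "news": 1.0}
for _ in range(200):
    topics, w = zip(*weights.items())
    shown = random.choices(topics, weights=w)[0]  # feed picks by weight
    weights[shown] += 0.5                         # engagement feeds back in
dominant = max(weights, key=weights.get)
share = weights[dominant] / sum(weights.values())
```

Because early random clicks compound, the feed drifts toward whichever topic happens to get engaged with first, which is exactly the narrowing this keyword describes.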

💡Echo Chambers

Echo chambers are environments where people are exposed primarily to information that confirms their preexisting beliefs, often leading to a distorted view of reality. The video uses the term to describe how social media algorithms can inadvertently isolate users within their own ideological or interest-based communities, potentially reducing their exposure to diverse ideas.

💡Algorithmic Bias

Algorithmic bias refers to the tendency of algorithms to favor certain types of content over others, which can lead to the amplification of certain perspectives or the suppression of others. The video touches on this by discussing how algorithms might prioritize content with high engagement, even if it's misinformation, which can skew the information landscape.

💡Data Collection

Data collection is the process of gathering information about users' online behavior, such as what they watch, like, comment on, and share. The video explains that social media platforms use this data to refine their algorithms and serve users content they are more likely to engage with, which in turn can influence user behavior and preferences.

💡Machine Learning

Machine learning is a subset of artificial intelligence that enables computers to learn from and make predictions or decisions based on data. In the context of the video, machine learning is used to describe how algorithms analyze user data to predict and serve content that aligns with user preferences, shaping the user's experience on platforms like TikTok.

💡Engagement Metrics

Engagement metrics are the various ways users interact with content on social media, such as likes, shares, comments, and views. The video explains that these metrics are crucial for algorithms to understand user preferences and to curate content accordingly. High engagement can lead to more visibility of certain posts, regardless of their accuracy or quality.

💡Ad Revenue

Ad revenue refers to the income generated from advertising on a platform. The video mentions that social media companies aim to keep users engaged for longer periods to increase ad impressions and revenue. This business model can influence the type of content that algorithms promote, as more engaging content can lead to longer user sessions.

💡Wellbeing

Wellbeing in the context of the video refers to the psychological and emotional health of users as they interact with social media platforms. The video discusses how the content served by algorithms can impact users' self-esteem and mental health, particularly in the case of sensitive topics like body image or exposure to harmful ideologies.

💡Intentional Consumption

Intentional consumption is the act of being mindful and deliberate about the content one engages with online. The video encourages viewers to be intentional with their social media use, suggesting that being aware of the algorithms' influence and making conscious choices about what to like, share, and follow can help users break out of echo chambers and expand their perspectives.

Highlights

TikTok's recommendation algorithms are based on user engagement and preferences, potentially leading to a narrow focus on specific content.

Social media algorithms are designed to keep users engaged by showing content that aligns with their interests.

The sheer volume of content on platforms like YouTube and TikTok necessitates algorithms to curate personalized feeds.

Algorithms collect data on user behavior to predict and serve content that users are likely to engage with.

Social media platforms use sophisticated machine learning to analyze user data and tailor content.

User engagement, such as likes and shares, influences the content that algorithms promote.

The business model of social media platforms relies on user retention and ad revenue, incentivizing the creation of addictive algorithms.

The inner workings of recommendation algorithms are kept secret by social media companies, protecting their competitive edge.

Algorithms can create echo chambers by reinforcing users' existing beliefs and limiting exposure to diverse perspectives.

Users can get trapped in filter bubbles, where they only see content that aligns with their views.

Some users have found creative ways to manipulate algorithms, such as creating separate accounts to access different types of content.

Social media algorithms have been linked to negative impacts on mental health, particularly among teenagers.

Being mindful of how algorithms work can help users take control over the content they are served.

Strategic liking and sharing can influence the content that algorithms promote to a user and their network.

Breaking out of echo chambers by seeking out diverse content can lead to a more balanced and informed perspective.

Users are encouraged to use their voice on social media to create and share content that contributes to a healthier online environment.

Transcripts

00:00  - You know who really gets me?
00:02  TikTok.
00:03  What up, world?
00:03  Myles Bess, journalist, host of "Above the Noise",
00:06  and world class chef.
00:08  Y'all, I've spent hours scrolling through TikTok
00:12  perfecting my vegan wing recipe.
00:14  I've learned so much about cooking, and pots, and pans.
00:17  It's just, it's changed my world.
00:19  But while I was deep in vegan wing TikTok,
00:21  I totally missed that, apparently,
00:23  there's this thing called a Super Bowl,
00:26  and it has nothing to do with cooking.
00:27  It's a sporting event with football?
00:30  What's next? You're gonna tell me that
00:31  Obama isn't president anymore?
00:33  What else am I missing out on?
00:34  So today, we're getting philosophical in asking,
00:37  is the internet expanding your mind
00:39  or is it making you narrow-minded?
00:45  Now, you might be thinking,
00:46  "Dang, Myles, that's a deep question."
00:49  And to that I say, yes, yes it is,
00:52  and you're welcome.
00:52  But to answer that question,
00:54  we gotta talk about why you're seeing what you're seeing
00:57  on your social media feeds.
00:58  And the answer to that, my friends,
01:00  has a lot to do with recommendation algorithms.
01:03  Now, basically, an algorithm is a formula
01:05  or a set of instructions to solve a problem.
01:08  And when it comes to social media,
01:09  recommendation algorithms are the computer instructions
01:12  for how a given social media app decides what to show you.
01:15  I mean, 500 hours of video are uploaded to YouTube
01:19  every minute worldwide.
01:21  That works out to 720,000 hours of new content per day.
01:25  That's 82.2 years.
01:27  Yes, years. That's literally a lifetime.
01:30  - Did you see this one?
01:31  This one? What about this one?
01:32  - And on TikTok,
01:33  more than a billion videos get viewed a day.
01:35  That's a lot of videos to sift through,
01:37  and you probably have no interest in a bunch of them.
01:39  I mean, those pimple popping videos, ugh.
01:42  No thank you.
01:43  - And that's the point of the recommendation algorithm,
01:46  to identify content that will appeal to you.
01:49  - That's Alexander Nwala.
01:50  He's a computer scientist who studies
01:52  how information spreads on social media.
01:54  We chatted with him to help us better understand
01:56  how social media recommendation algorithms work.
01:59  - In order for them to learn about you,
02:00  they need to collect information.
02:02  So as they collect some of the pages you browse,
02:06  some of the content you engage with,
02:09  it's taken into account in those algorithms.
02:12  And then, they do a lot of sophisticated machine learning,
02:16  a lot of mathematics behind the scenes and say,
02:18  "Okay, we think you are going to like this."
02:21  And if you engage with that content,
02:24  they give you more of that.
02:25  So it's like you're telling the algorithm,
02:29  you're giving it some feedback that says,
02:31  "Give me more of this, give me more of this,
02:32  give me more of this."
02:33  - So these companies are pretty much tracking
02:35  our every move.
02:36  They're collecting data on what you're watching,
02:38  clicking on, liking, commenting on, sharing,
02:41  buying, how long we watch something, where we live,
02:43  how old we are, et cetera.
02:45  Then they use all of that data to make predictions
02:48  to serve us the content they think we'll like.
02:50  You know, stuff like this 27 year old male
02:52  who lives in California,
02:53  likes sneakers and cooking,
02:54  and hates gross medical videos.
02:56  So show him more cooking content,
02:58  and don't show him people cutting into a cyst.
03:00  Ugh. Oh my God. It's gross.
03:02  And with this data,
03:03  they can sell super targeted ads to us too.
03:06  - And if you stay longer there, they make more money.
03:10  Maybe through ad revenues, maybe make more money
03:13  through the information they collect about you.
03:16  But the whole idea is to find something
03:18  that appeals to you that keeps you.
03:19  - See? At the end of the day,
03:21  these social media companies are businesses
03:23  trying to make money.
03:24  And the longer you're on the app, the more ads you'll see,
03:27  and the more money they'll make.
03:29  Y'all, TikTok made $4 billion from ads in 2021,
03:34  and they want to triple that to $12 billion this year.
03:38  In 2020, Instagram made $17.4 billion through ad sales.
03:42  Y'all, that's like a lot of money.
03:44  But here's the catch.
03:46  These companies keep the inner workings of these algorithms
03:49  under tight lock and key.
03:50  I mean, it is their secret sauce after all.
03:53  For example, TikTok publicly says that it takes into account
03:56  shares, likes, follows, what you watch,
03:59  and even how long you watch something,
04:01  but we don't know for sure exactly how all these things
04:04  are weighted or ranked
04:05  to give you the content that's on your feed.
04:06  And it's not just what you are engaging with.
04:09  These social media algorithms also take into account
04:12  what everyone else is liking and sharing too.
04:14  They tend to amplify the posts
04:16  that have the most likes, comments, and shares,
04:18  even if it's misinformation or lies.
04:21  Okay, so back to my original question,
04:23  is the internet expanding your mind
04:25  or making you narrow-minded?
04:27  On the one hand, by following accounts and liking videos,
04:30  you do have a certain amount of agency over what you see.
04:33  I mean, the type of media we consume
04:35  shapes how we see the world, and our place in it.
04:37  It can help you find your people.
04:38  Say, for instance, you really like knitting,
04:41  but none of your real life friends do,
04:42  so you start following some stuff on the Gram.
04:44  And then bam,
04:45  you've been connected to a whole new community.
04:48  You're learning about knitting things
04:49  you wouldn't have known otherwise.
04:51  But on the flip side,
04:52  as these algorithms learn your preferences,
04:55  you'll never know if you're interested in stuff
04:57  they don't show you.
04:58  They're not, like, all of a sudden gonna be like,
04:59  "Oh, you love knitting. Have you tried MMA?"
05:02  And ultimately, that can limit your exposure
05:05  to new ideas or interests.
05:06  Research has shown that users can get trapped
05:09  in filter bubbles or echo chambers,
05:11  where you just, you know,
05:12  you're getting served content
05:13  that reinforces what you already believe.
05:16  This is particularly strong when it comes
05:17  to political beliefs or news you consume.
05:20  How can you really think critically about something
05:22  if you're not being challenged on it?
05:24  - The more that you interact with content
05:26  that is something that interests you,
05:28  the more you're not going to see content
05:31  that doesn't interest you.
05:32  - That's Iretiolu Akinrinade.
05:34  She studies adolescent wellbeing in digital spaces.
05:36  - And you never really get to see what's going on
05:41  in other holes of the platform.
05:43  - She explains that some users have found creative ways
05:46  to make the algorithm work better for them.
05:48  Like this one TikTok user created a second account
05:51  to follow conservative news.
05:52  And there, he discovered there was this whole trend
05:55  of COVID positive users going out into public spaces
05:58  intentionally infecting people.
05:59  Um, side note, that's awful.
06:01  So anyway, he shared that on his main account,
06:04  introducing his followers to this content
06:06  that they otherwise wouldn't see.
06:07  - And I really liked what this user did
06:09  in terms of strategically creating a separate account
06:14  and really playing with their algorithm,
06:16  and trying to see what would happen
06:19  if they had different identities,
06:21  or if they were understood by an algorithm
06:24  to be a completely different person.
06:26  But what I really liked about their decision making
06:29  is that they came back to their original account
06:31  and shared that information.
06:33  - Talk about breaking down that echo chamber,
06:35  chamber, chamber, chamber.
06:37  So, social media algorithms are far from perfect.
06:41  Similar to echo chambers,
06:42  there have been reports of people being sucked
06:44  into dangerous rabbit holes where these algorithms
06:47  can lead users to more extreme content,
06:49  even leading to radicalization.
06:50  Like how on TikTok, engaging in transphobic content
06:53  led users to white supremacist
06:55  and other far-right content.
06:57  And there are also reports of how what you see
06:59  can affect your wellbeing too.
07:01  Like, internal documents from Meta,
07:03  the company formerly known as Facebook,
07:04  show how Instagram's algorithms can make teen girls
07:07  feel worse about their body image.
07:09  Well, dang, that's a big downer.
07:11  We can't end the video like that.
07:13  No, no, no, no. That's not gonna fly.
07:15  We want to leave you with something inspiring,
07:17  some wisdom on how to make these types
07:19  of algorithms work better for you.
07:21  Number one, for starters,
07:22  be mindful of how social media algorithms work.
07:25  Ask yourself,
07:26  what content am I seeing and why am I seeing it?
07:29  Number two, think before you like and share.
07:32  Those actions help amplify posts.
07:33  So doing those things doesn't just affect what you see,
07:36  but what everyone else sees too.
07:38  Because after all...
07:39  - As informed citizens, we have a part to play
07:42  when it comes to the content we share.
07:45  - Well said.
07:46  Number three, try to break out of that echo chamber.
07:49  Maybe you don't need to create a separate spam account,
07:51  or maybe you do.
07:52  But, like, basically, just keep an eye out
07:55  for stuff from the "other side".
07:57  Number four, use your voice.
07:58  It's not just about consuming content.
08:01  It's also about what content you're creating,
08:03  and the conversations you're starting with your audience.
08:05  - We should all experiment with the different ways
08:07  that we're seen online,
08:08  so we can try and receive quality, edifying content
08:12  not just in the very moment, but over time.
08:14  So be intentional as you use the internet,
08:17  especially social media,
08:19  and talk with your friends and family
08:20  about what you see online.
08:22  You might be surprised by the differences and similarities.
08:25  It's just a really interesting point of conversation.
08:28  - And that's it from me.
08:29  But I'm curious about what you all think.
08:31  How do you see social media algorithms influencing you?
08:34  Let us know in the comments below.
08:36  Oh, and before we go,
08:37  I want to give a big shout out to Common Sense Education,
08:39  who we collabed with on this video.
08:41  Speaking of algorithms, we need your help.
08:43  If you liked this video, be sure to like, subscribe,
08:45  and hit that bell notification
08:47  so you can see more of our amazing stuff.
08:48  It helps us with YouTube's algorithm too.
08:51  Until next time. I'm Myles Bess. Peace out.


Related Tags

Social Media, Algorithms, Echo Chambers, Content Curation, Digital Wellbeing, User Engagement, Mental Health, Online Communities, Media Influence, Tech Ethics