Is the internet expanding or narrowing our minds?
Summary
TL;DR: In this thought-provoking video, Myles Bess explores the impact of social media algorithms on our worldviews. He delves into how these algorithms, driven by our online behaviors, create personalized content bubbles that can both enhance and limit our perspectives. The video raises concerns about the potential for echo chambers and the amplification of misinformation, while also offering strategies to break free from these constraints and foster a more diverse and critical engagement with online content.
Takeaways
- 📱 Social media platforms like TikTok and YouTube use recommendation algorithms to curate content based on user behavior and preferences.
- 🔍 Recommendation algorithms work by collecting data on user interactions such as likes, comments, shares, and watching habits to predict and serve content that appeals to the user.
- 🌟 The sheer volume of content uploaded daily makes it impossible for users to discover everything, hence the necessity for algorithms to filter and recommend.
- 💸 Social media companies rely on keeping users engaged for longer periods to increase ad revenue, which is why algorithms aim to show content that users are likely to engage with.
- 🔒 The exact mechanisms of these algorithms are closely guarded secrets, known as the 'secret sauce' of these platforms.
- 🌐 Algorithms also consider trends and popular content that many users are engaging with, which can sometimes lead to the amplification of misinformation.
- 🤔 The internet can both expand and narrow the mind, as algorithms may create 'filter bubbles' or 'echo chambers' where users are only exposed to content that aligns with their existing beliefs.
- 👀 Users have some control over their algorithmic feeds by the accounts they follow and the content they engage with, which can shape their online experience.
- 💡 To break out of echo chambers, users can take proactive steps like creating separate accounts to explore different interests or being mindful of the content they engage with.
- 🌟 Social media algorithms are not perfect and can lead users down dangerous rabbit holes or affect their mental well-being, which is a concern that needs to be addressed.
- 📢 Users are encouraged to be intentional with their social media use, to engage with a diverse range of content, and to use their voice to create and share edifying content.
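The takeaways above describe a feedback loop: the platform records engagement signals (watches, likes, comments, shares) and boosts content similar to what you interacted with. A minimal toy sketch of that loop might look like the following; all signal names, weights, and the catalog are hypothetical illustrations, not any platform's actual (and secret) system.

```python
from collections import Counter

# Hypothetical engagement weights -- real platforms keep theirs secret.
SIGNAL_WEIGHTS = {"watch": 1.0, "like": 2.0, "comment": 3.0, "share": 4.0}

def update_profile(profile: Counter, topic: str, signal: str) -> None:
    """Record one engagement event against the user's interest profile."""
    profile[topic] += SIGNAL_WEIGHTS[signal]

def recommend(profile: Counter, catalog: dict, k: int = 2) -> list:
    """Serve videos from the user's highest-scoring topics first."""
    feed = []
    for topic, _score in profile.most_common():
        feed.extend(catalog.get(topic, []))
    return feed[:k]

profile = Counter()
catalog = {
    "cooking": ["vegan wings 101", "cast iron care"],
    "sports": ["super bowl recap"],
}
update_profile(profile, "cooking", "like")    # cooking: 2.0
update_profile(profile, "cooking", "share")   # cooking: 6.0
update_profile(profile, "sports", "watch")    # sports: 1.0
print(recommend(profile, catalog))  # → ['vegan wings 101', 'cast iron care']
```

Note how the single low-weight sports "watch" never makes the two-item feed: this is the mechanism by which a user deep in cooking content can miss the Super Bowl entirely.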
Q & A
What is the main topic discussed in the video script?
-The main topic discussed in the video script is the impact of social media recommendation algorithms on users' perspectives and whether the internet is expanding or narrowing their minds.
What is a recommendation algorithm?
-A recommendation algorithm is a set of computer instructions that social media apps use to decide what content to show users based on their interests and engagement.
How does the amount of content uploaded to platforms like YouTube affect users?
-The vast amount of content uploaded to platforms like YouTube means users would be overwhelmed without recommendation algorithms to filter and suggest content that aligns with their interests.
What role do recommendation algorithms play in the content users see on social media?
-Recommendation algorithms play a crucial role by identifying and suggesting content that is likely to appeal to users based on their past interactions, such as likes, comments, and watch history.
How do social media platforms benefit from keeping users engaged with their apps?
-Social media platforms benefit from keeping users engaged by showing more ads, which increases ad revenue, and collecting more data for targeted advertising.
What is a 'filter bubble' or 'echo chamber' in the context of social media?
-A 'filter bubble' or 'echo chamber' refers to a situation where users are only exposed to content that reinforces their existing beliefs and interests, limiting their exposure to diverse perspectives.
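The filter-bubble dynamic described in this answer can be illustrated with a toy "rich get richer" simulation: content from a topic you engage with becomes more likely to be served, which makes further engagement with it more likely. This is a simplified thought experiment, not a model of any real platform's algorithm.

```python
import random

random.seed(42)  # fixed seed so the simulation is repeatable

# Hypothetical topics; every topic starts with equal weight (no preference).
TOPICS = ["cooking", "sports", "news", "music"]
weights = {t: 1.0 for t in TOPICS}

def serve(weights: dict) -> str:
    """Pick a topic with probability proportional to its accumulated weight."""
    topics = list(weights)
    return random.choices(topics, weights=[weights[t] for t in topics])[0]

for _ in range(200):
    topic = serve(weights)
    weights[topic] += 1.0  # engaging reinforces the topic that was served

total = sum(weights.values())
share = {t: round(w / total, 2) for t, w in weights.items()}
print(share)  # one topic typically ends up dominating the feed
```

Even though the user starts with no preference at all, early random engagement compounds until one topic crowds out the rest, which is the echo-chamber effect in miniature.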
How can users break out of filter bubbles created by social media algorithms?
-Users can break out of filter bubbles by actively seeking out diverse content, following a variety of accounts, and being mindful of the types of posts they engage with.
What are some strategies mentioned in the script to make social media algorithms work better for users?
-Some strategies mentioned include being mindful of how algorithms work, thinking before liking and sharing, breaking out of echo chambers, and using one's voice to create and share diverse content.
How can users be more intentional with their social media usage according to the script?
-Users can be more intentional by understanding how algorithms influence their feeds, being selective with their interactions, and actively exploring content outside their usual preferences.
What is the importance of engaging with a variety of content on social media platforms?
-Engaging with a variety of content helps expose users to new ideas and perspectives, preventing them from getting trapped in echo chambers and promoting a more well-rounded understanding of different topics.
What is the potential downside of social media algorithms as discussed in the script?
-The potential downside of social media algorithms is that they can lead users into echo chambers, reinforce existing beliefs without challenge, and potentially expose users to extreme or harmful content.
Outlines
🌐 The Impact of Social Media Algorithms
The paragraph introduces the concept of recommendation algorithms on social media platforms, particularly focusing on TikTok and YouTube. It discusses how these algorithms work to curate content based on user engagement and preferences, leading to a personalized feed. The speaker, Myles Bess, humorously admits to being so engrossed in vegan wing recipes on TikTok that he was unaware of the Super Bowl, highlighting the potential for these algorithms to create 'filter bubbles.' The paragraph also touches on the sheer volume of content uploaded daily and the role of algorithms in managing this overflow, ensuring users are presented with content that aligns with their interests.
🔍 Navigating the Echo Chamber
This paragraph delves into the consequences of social media algorithms, such as the creation of echo chambers where users are only exposed to content that reinforces their existing beliefs. It mentions the research indicating that users can become trapped in filter bubbles, which can limit their exposure to diverse perspectives. The paragraph also discusses how some users have found ways to manipulate algorithms to their advantage, such as creating separate accounts to explore different viewpoints. The speaker emphasizes the importance of being aware of the algorithms' influence and suggests strategies for breaking out of echo chambers, such as being mindful of the content we engage with and seeking out diverse perspectives.
Mindmap
Keywords
💡TikTok
💡Recommendation Algorithms
💡Filter Bubbles
💡Echo Chambers
💡Algorithmic Bias
💡Data Collection
💡Machine Learning
💡Engagement Metrics
💡Ad Revenue
💡Wellbeing
💡Intentional Consumption
Highlights
TikTok's recommendation algorithms are based on user engagement and preferences, potentially leading to a narrow focus on specific content.
Social media algorithms are designed to keep users engaged by showing content that aligns with their interests.
The sheer volume of content on platforms like YouTube and TikTok necessitates algorithms to curate personalized feeds.
Algorithms collect data on user behavior to predict and serve content that users are likely to engage with.
Social media platforms use sophisticated machine learning to analyze user data and tailor content.
User engagement, such as likes and shares, influences the content that algorithms promote.
The business model of social media platforms relies on user retention and ad revenue, incentivizing the creation of addictive algorithms.
The inner workings of recommendation algorithms are kept secret by social media companies, protecting their competitive edge.
Algorithms can create echo chambers by reinforcing users' existing beliefs and limiting exposure to diverse perspectives.
Users can get trapped in filter bubbles, where they only see content that aligns with their views.
Some users have found creative ways to manipulate algorithms, such as creating separate accounts to access different types of content.
Social media algorithms have been linked to negative impacts on mental health, particularly among teenagers.
Being mindful of how algorithms work can help users take control over the content they are served.
Strategic liking and sharing can influence the content that algorithms promote to a user and their network.
Breaking out of echo chambers by seeking out diverse content can lead to a more balanced and informed perspective.
Users are encouraged to use their voice on social media to create and share content that contributes to a healthier online environment.
Transcripts
- You know who really gets me?
TikTok.
What up, world?
Myles Bess, journalist, host of "Above the Noise",
and world class chef.
Y'all, I've spent hours scrolling through TikTok
perfecting my vegan wing recipe.
I've learned so much about cooking, and pots, and pans.
It's just, it's changed my world.
But while I was deep in vegan wing TikTok,
I totally missed that, apparently,
there's this thing called a Super Bowl,
and it has nothing to do with cooking.
It's a sporting event with football?
What's next? You're gonna tell me that
Obama isn't president anymore?
What else am I missing out on?
So today, we're getting philosophical in asking,
is the internet expanding your mind
or is it making you narrow-minded?
Now, you might be thinking,
"Dang, Myles, that's a deep question."
And to that I say, yes, yes it is,
and you're welcome.
But to answer that question,
we gotta talk about why you're seeing what you're seeing
on your social media feeds.
And the answer to that, my friends,
has a lot to do with recommendation algorithms.
Now, basically, an algorithm is a formula
or a set of instructions to solve a problem.
And when it comes to social media,
recommendation algorithms are the computer instructions
for how a given social media app decides what to show you.
I mean, 500 hours of video are uploaded to YouTube
every minute worldwide.
That works out to 720,000 hours of new content per day.
That's 82.2 years.
Yes, years. That's literally a lifetime.
- Did you see this one?
This one? What about this one?
- And on TikTok,
more than a billion videos get viewed a day.
That's a lot of videos to sift through,
and you probably have no interest in a bunch of them.
I mean, those pimple popping videos, ugh.
No thank you.
- And that's the point of the recommendation algorithm,
to identify content that will appeal to you.
- That's Alexander Nwala.
He's a computer scientist who studies
how information spreads on social media.
We chatted with him to help us better understand
how social media recommendation algorithms work.
- In order for them to learn about you,
they need to collect information.
So as they collect some of the pages you browse,
some of the content you engage with,
it's taken into account in those algorithms.
And then, they do a lot of sophisticated machine learning,
a lot of mathematics behind these scenes and say,
"Okay, we think you are going to like this."
And if you engage with that content,
they give you more of that.
So it's like you're telling the algorithm,
you're giving it some feedback that says,
"Give me more of this, give me more of this,
give me more of this."
- So these companies are pretty much tracking
our every move.
They're collecting data on what you're watching,
clicking on, liking, commenting on, sharing,
buying, how long we watch something, where we live,
how old we are, et cetera.
Then they use all of that data to make predictions
to serve us the content they think we'll like.
You know, stuff like this 27 year old male
who lives in California,
likes sneakers and cooking,
and hates gross medical videos.
So show him more cooking content,
and don't show him people cutting into a cyst.
Ugh. Oh my God. It's gross.
And with this data,
they can sell super targeted ads to us too.
- And if you stay longer there, they make more money.
Maybe through ad revenues, maybe make more money
through the information they collect about you.
But the whole idea is to find something
that appeals to you that keeps you.
- See? At the end of the day,
these social media companies are businesses
trying to make money.
And the longer you're on the app, the more ads you'll see,
and the more money they'll make.
Y'all, TikTok made $4 billion from ads in 2021,
and they want to triple that to $12 billion this year.
In 2020, Instagram made $17.4 billion through ad sales.
Y'all, that's like a lot of money.
But here's the catch.
These companies keep the inner workings of these algorithms
under tight lock and key.
I mean, it is their secret sauce after all.
For example, TikTok publicly says that it takes into account
shares, likes, follows, what you watch,
and even how long you watch something,
but we don't know for sure exactly how all these things
are weighted or ranked
to give you the content that's on your feed.
And it's not just what you are engaging with.
These social media algorithms also take into account
what everyone else is liking and sharing too.
They tend to amplify the posts
that have the most likes, comments, and shares,
even if it's misinformation or lies.
Okay, so back to my original question,
is the internet expanding your mind
or making you narrow-minded?
On the one hand, by following accounts and liking videos,
you do have a certain amount of agency on what you see.
I mean, the type of media we consume
shapes how we see the world, and our place in it.
It can help you find your people.
Say, for instance, you really like knitting,
but none of your real life friends do,
so you start following some stuff on the Gram.
And then bam,
you've been connected to a whole new community.
You're learning about knitting things
you wouldn't have known otherwise.
But on the flip side,
as these algorithms learn your preferences,
you'll never know if you're interested in stuff
they don't show you.
They're not, like, all of a sudden gonna be like,
"Oh, you love knitting. Have you tried MMA?"
And ultimately, that can limit your exposure
to new ideas or interests.
Research has shown that users can get trapped
in filter bubbles or echo chambers,
where you just, you know,
you're getting served content
that reinforces what you already believe.
This is particularly strong when it comes
to political beliefs or news you consume.
How can you really think critically about something
if you're not being challenged on it?
- The more that you interact with content
that is something that interests you,
the more you're not going to see content
that doesn't interest you.
- That's Iretiolu Akinrinade.
She studies adolescent wellbeing in digital spaces.
- And you never really get to see what's going on
in other holes of the platform.
- She explains that some users have found creative ways
to make the algorithm work better for them.
Like this one TikTok user created a second account
to follow conservative news.
And there, he discovered there was this whole trend
of COVID positive users going out into public spaces
intentionally infecting people.
Um, side note, that's awful.
So anyway, he shared that on his main account,
introducing his followers to this content
that they otherwise wouldn't see.
- And I really liked what this user did
in terms of strategically creating a separate account
and really playing with their algorithm,
and trying to see what would happen
if they had different identities,
or if they were understood by an algorithm
to be a completely different person.
But what I really liked about their decision making,
is that they came back to their original account
and shared that information.
- Talk about breaking down that echo chamber,
chamber, chamber, chamber.
So, social media algorithms are far from perfect.
Similar to echo chambers,
there have been reports of people being sucked
into dangerous rabbit holes where these algorithms
can lead users to more extreme content,
even leading to radicalization.
Like how on TikTok, engaging in transphobic content
led users to white supremacist
and other far-right content.
And there are also reports of how what you see
can affect your wellbeing too.
Like, internal documents from Meta,
the company formerly known as Facebook,
show how Instagram's algorithms can make teen girls
feel worse about their body images.
Well, dang, that's a big downer.
We can't end the video like that.
No, no, no, no. That's not gonna fly.
We want to leave you with something inspiring,
some wisdom on how to make these types
of algorithms work better for you.
Number one, for starters,
be mindful of how social media algorithms work.
Ask yourself,
what content am I seeing and why am I seeing it?
Number two, think before you like and share.
Those actions help amplify posts.
So doing those things doesn't just affect what you see,
but what everyone else sees too.
Because after all.
- As an informed citizen, we have a part to play
when it comes to the content we share.
- Well said.
Number three, try to break out of that echo chamber.
Maybe you don't need to create a separate spam account,
or maybe you do.
But, like, basically, just keep an eye out
for stuff from the "other side".
Number four, use your voice.
It's not just about consuming content.
It's also about what content you're creating,
and the conversations you're starting with your audience.
- We should all experiment with the different ways
that we're seen online,
so we can try and receive quality, edifying content
not just in the very moment, but over time.
So be intentional as you use the internet,
especially social media,
and talk with your friends and family
about what you see online.
You might be surprised by the differences and similarities.
It's just a really interesting point of conversation.
- And that's it from me.
But I'm curious about what you all think.
How do you see social media algorithms influencing you?
Let us know in the comments below.
Oh, and before we go,
I want to give a big shout out to Common Sense Education,
who we collabed with on this video.
Speaking of algorithms, we need your help.
If you liked this video, be sure to like, subscribe,
and hit that bell notification
so you can see more of our amazing stuff.
It helps us with YouTube's algorithm too.
Until next time. I'm Myles Bess. Peace out.