Social Media: Crash Course Navigating Digital Information #10
Summary
TL;DR: In this Crash Course episode, John Green discusses the profound impact of social media on our lives, including changes in vocabulary, privacy, and offline behavior. He highlights the influence of targeted advertising, the creation of filter bubbles, and the spread of misinformation. Green emphasizes the importance of lateral reading, critical thinking, and engaging with diverse perspectives to navigate the digital information landscape responsibly.
Takeaways
- 😄 Social media has profoundly influenced how we communicate, interact, and perceive information.
- 📱 We often unconsciously follow social media trends and patterns, much like moths are drawn to light.
- 🌐 Social media impacts offline behavior, including political rallies organized by fake accounts.
- 🤔 Algorithms influence our feeds by prioritizing content that is engaging, often over truthful or unbiased information.
- 🧠 Confirmation bias can lead us to believe false information if we see it repeatedly in our feeds.
- 🚨 Targeted advertising uses personal data to show tailored ads, which raises concerns about privacy and data security.
- 🗣️ Social media can amplify voices that traditional gatekeepers may have silenced, allowing for more diverse public discourse.
- ⚠️ There is a risk of filter bubbles, where users are only exposed to information that aligns with their beliefs, limiting diverse perspectives.
- 🔍 Lateral reading is crucial in verifying information on social media to avoid spreading misinformation.
- 🏁 It's essential to challenge our assumptions and seek information from varied and credible sources to navigate digital spaces effectively.
Q & A
What is the main theme of this Crash Course episode?
-The main theme is how social media impacts our perceptions, behaviors, and decisions, and how to navigate digital information responsibly.
What analogy does John Green use to explain human behavior on social media?
-He uses the analogy of a moth flying toward a light, comparing it to how humans often follow social media trends without thinking critically.
How does social media affect our offline behaviors?
-Social media can influence how we vote, how we engage with our communities, and even how we interact with family members, often creating real-world consequences from online interactions.
What role do algorithms play in shaping our social media feeds?
-Algorithms curate our feeds by prioritizing content that aligns with our interests, often creating filter bubbles where we only see perspectives we agree with.
What is a 'filter bubble,' and why is it concerning?
-A 'filter bubble' occurs when algorithms show users content they are likely to agree with, which can limit exposure to diverse viewpoints and create an echo chamber of similar ideas.
How do social media platforms make money?
-Social media platforms make money primarily through targeted advertising, which uses data from users' behavior, location, and preferences to show them personalized ads.
What is the difference between misinformation and disinformation?
-Misinformation is the accidental sharing of false information, while disinformation is the deliberate spread of false information to deceive people.
What was the impact of Russian disinformation campaigns during the 2016 U.S. elections?
-Russian agents created fake grassroots movements on Facebook, which organized real-life political rallies in the U.S., influencing political discourse and creating division.
How can users reduce the impact of social media algorithms on their newsfeed?
-Users can follow accounts with differing viewpoints, turn off top post features to get a more neutral feed, and disable data tracking to reduce personalized targeting.
What are 'radical rabbit holes,' and how do they form on platforms like YouTube?
-Radical rabbit holes form when recommendation algorithms consistently suggest more extreme content, leading users deeper into far-right or far-left ideologies.
Outlines
🦋 The Moth's Problem and Our Natural Responses
John Green begins by telling a joke about a moth visiting a podiatrist, humorously illustrating how humans, like moths, often follow the light without thinking. He draws a comparison between how moths follow light and how humans tend to go where the 'light is on,' such as automatically using social media because it’s the natural thing to do. This leads to a discussion on how social media has deeply impacted our lives, including language, perceptions of privacy, and even offline experiences like political rallies organized through fake Facebook pages.
📱 The Good and Bad of Social Media
John highlights the positive and negative impacts of social media. On the positive side, platforms like Facebook and Twitter allow voices to be heard that were previously silenced. Social media facilitates the formation of communities around shared interests. However, the downsides are significant: from cyberbullying to scams and disinformation. The key issue is how social media influences both what we share and what we consume, often leading us into filter bubbles driven by engagement rather than truth.
⚠️ Extreme Recommendation Engines and Radicalization
John dives into the dangers of extreme recommendation algorithms. Platforms like YouTube have been found to push increasingly radical content to users based on what they’ve previously watched, creating 'radical rabbit holes.' While some steps have been taken to counter this, such as prioritizing 'authoritativeness,' these algorithms can still drive people toward more extreme viewpoints. The solution lies in conscious user behavior: seeking diverse viewpoints, limiting data tracking, and staying vigilant against falling into algorithmic traps.
🔍 The Power of Lateral Reading and Digital Literacy
John emphasizes the importance of lateral reading—researching sources, claims, and context when encountering information online. He gives an example of misinformation about a park sale, showing how checking multiple sources can uncover the truth. He encourages viewers to question not only things they don’t believe but also those they do, and to resist information that confirms biases. The ultimate goal is to become better navigators of the digital landscape, avoiding the allure of simple truths and being skeptical of ideologies.
Keywords
💡Social Media Feed
💡Algorithms
💡Filter Bubbles
💡Disinformation
💡Lateral Reading
💡Engagement
💡Targeted Advertising
💡Confirmation Bias
💡Misinformation
💡Cyberbullying
💡Fake News
Highlights
John Green introduces the topic of navigating social media feeds and jokes about human behavior being similar to moths following a light.
Humans often follow the crowd on social media without thinking critically, much like moths are drawn to light.
Social media has fundamentally changed how we interact and communicate, shaping perceptions of privacy and influencing offline experiences.
In 2016, Russian agents used fake Facebook pages to organize real political rallies across the U.S., showcasing the power of online manipulation.
Silicon Valley has influenced modern language, turning words like 'friend,' 'Google,' and 'DM' into verbs.
Social media can affect how we engage in politics and family discussions, with online arguments influencing real-life interactions.
Algorithms on social media platforms push content that engages users, often reinforcing filter bubbles and preventing exposure to differing viewpoints.
68% of U.S. adults get their news from social media, with Facebook being a key platform, raising concerns about misinformation.
Targeted advertising on social media platforms is highly personalized, based on user habits, browsing history, and even geolocation.
Misinformation and false news are frequently shared on social media, making lateral reading a vital skill to verify sources.
Newsfeed algorithms often prioritize engagement over truth, leading to the spread of emotional and sensational content.
YouTube’s recommendation algorithm has been found to push users toward more extreme content, especially in political topics.
Despite algorithmic flaws, users can protect themselves by diversifying their social media feeds and verifying information independently.
John Green encourages readers to seek diverse viewpoints and remain skeptical of information that aligns too perfectly with their beliefs.
Lateral reading and questioning sources are key to navigating digital information responsibly, preventing misinformation spread.
Transcripts
Hi, I’m John Green, and this is Crash Course: Navigating Digital Information.
So we’re going to talk about your social media feed today, but first: At the beginning
of this series, I told you one of the two jokes I know, and now that we’ve reached
the last episode, I’d like to tell you the other one.
So a moth walks into a podiatrist’s office, and the podiatrist says, “What seems to
be the problem, moth?”
And the moth answers, “Awww, doc.
If only there were only one problem.
I can’t hold down a job because I’m not good at anything.
My wife can hardly stand to look at me;
we don’t even love each other anymore,
worse than that, I can’t even remember if we ever loved each other.
When I look into the eyes of my children,
All I see is the same emptiness and despair that I feel in my own heart, doc.”
And then the podiatrist says, “Whoa, moth.
Okay.
Those are very serious problems, but it seems like you need to see a psychologist.
I’m a podiatrist.
What brought you here today?”
And the moth says, “Oh.
The light was on.”
We humans like to think of ourselves as extremely sophisticated animals.
Like moths may fly toward the light, but humans are endowed with free will.
We make choices.
Except a lot of the time, we just go where the light is on.
We do whatever feels like the natural thing.
We get on facebook because other people are on facebook.
We scroll through posts because the architecture of the site tells us to scroll.
We become passive.
In the past decade especially, social media has fundamentally changed us.
Like take your vocabulary, for example.
Silicon Valley rivals Shakespeare in its prolific additions to the English language.
Friend, Google, and ‘gram are all verbs now.
Snap and handle have new definitions.
Sliding into someone’s DMs is a thing.
But it’s not just how we speak -- these apps have not-so-subtly become embedded in
our daily lives very quickly.
Sometimes we don’t even realize how much they impact us.
They’ve changed our perceptions and expectations of privacy and they’ve also helped to shape
our offline experience.
In 2016 for instance, Russian agents organized political rallies all over the U.S. by creating
fake Facebook pages for made-up grassroots communities that then had real offline rallies.
Just by posing as organizers against Donald Trump or against Hillary Clinton, they actually
got real people to show up in Florida, New York, North Carolina, Washington, and Texas.
And those rally-goers didn’t know that it was a ruse.
I find that scary.
So today, for our big finale, we’re talking about the great white whale of navigating
online information: your social media feed.
INTRO
So quick note here at the start.
I’m not currently using a bunch of social media platforms.
Which may mean that I’m no longer an expert in them, but it’s only been six weeks and
I don’t think anything has changed that much.
Also, it turns out that whether or not you participate in Twitter is irrelevant to whether
Twitter affects your life, because what’s shared online has offline consequences.
Like online shouting matches about politics can influence how we vote and also how we
talk to our extended family at the Thanksgiving dinner table.
Unless you don’t live in the US or Canada in which case I guess you don’t have Thanksgiving
and presumably you never fight with your aunts and uncles about politics.
The way we interact in social media is shaping all of our offline behaviors, from how we
engage with IRL communities to how we consume goods and services.
That’s why there are so many people you don’t know, and companies and organizations
using social media to try to influence your thoughts and actions.
Sometimes those who want to influence you use false identities like those with the Russian
rallies.
Sometimes, and more overtly, they buy your attention with advertising.
Some just create really engaging videos about a kitten saved during a hurricane to steal
your attention.
Some of these actors have relatively benign goals and act fairly, like a company sending
ads into your feed for a Harry Potter mug that it turns out you actually want because
you are a Hufflepuff and you are proud!
But others have terrible motives and spread disinformation, like hoax news sites which
are all run by Slytherins.
Still others aren’t quite in either camp.
They might unwittingly spread inaccurate information, or misinformation.
Like your aunt who always posts Onion articles like they’re actual news.
Or me, on the several occasions when I have failed to pause and laterally read before
retweeting news that turned out to be false.
The big problem with all of that is that 68% of U.S. adults get news through some form
of social media and nearly half of U.S. adults get news through Facebook.
And across the globe, people between 18 and 29 years old are more likely to get their
news from social media than older adults.
When we’re this reliant on a media ecosystem full of pollution, we have to take responsibility
for what we read, post and share and to do that we should fully understand how social
media networks really function including the good stuff, and also the terrible stuff.
First, the good side.
For one thing, platforms like Facebook, Twitter and Instagram allow us to share information
and thoughts without the help of traditional gatekeepers.
Prior to social media it was really difficult to have your voice heard in a large public
forum.
And because all the posts in our feeds look more or less equal, social media has allowed
people to have voices in public discourse who previously would have been silenced by
power structures.
That’s great!
All tweets were created equal and everybody’s faces look weird with that one square-jawed
snapchat filter and we’re all in this together!
Also, social media is great for making friends and finding communities.
We can organize ourselves into these little affinity groups around special interests or
organization, which makes communication much easier than it was before.
Like for example, what if a group of people want to get together and figure out how to
decrease the overall worldwide level of suck?
Or, when I need to know what is eating my tomatoes, I can go to a gardening facebook
group.
That example by the way is for old people alienated by my previous mention of snapchat
filters.
That said there are plenty of problems with social media from cyberbullying to catfishing
to scams to massive disinformation campaigns to people live tweeting shows you wanted to
watch later.
And if you’re going to live partly inside these feeds I think it’s really important
to understand both the kinds of information that are likely to be shared with you and
the kinds of information you’re incentivised to share.
Let’s start with targeted advertising.
So you’re probably seeing an ad in this corner... possibly this one.
I don’t have a great sense of direction when I’m inside the feed.
Or maybe you watched an ad before this video played.
Regardless, you may have noticed that something you searched for recently has been advertised
to you.
Like for instance I’m trying to improve my collection of vintage cameras for the background
and suddenly all I see are advertisements for vintage cameras.
Social media companies make money by selling advertisements.
That’s why you get to use those platforms for free.
But these ads are very different from billboards or ads in a local newspaper, because these
ads were crafted just for you, or people like you, based on what social media companies
know about you.
And they know a lot.
They can learn your interests and habits based on how you use their app, but they also track
you elsewhere -- via other apps associated with that company, or by using geolocation
features to figure out where you physically are.
Social media companies take all that information and present it to advertisers in one form
or another so that those advertisers can target their ads based on your interests and browsing
history and location and age and gender and much more.
Can you protect your privacy and your feeds from targeted advertising?
Kind of.
Sometimes.
You can check your favorite apps and disable data and location tracking where you can -- these
features may fall under Ad Preferences or Security or Privacy settings.
Another potential downside to social media: how algorithms organize our feeds.
So algorithms are sets of rules or operations a computer follows to complete a task.
To put it very simply: social media sites use what they know about your habits, they
combine that with their knowledge of other people and the things you’ve self-selected
to follow, and funnel all that information through an algorithm.
And then the algorithm decides what to show you in your newsfeed.
Generally speaking, a newsfeed algorithm looks for what you’re most likely to engage with,
by liking or sharing it.
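The engagement-first ranking described here can be sketched in a few lines of code. This is a hypothetical toy, not any platform's actual algorithm: the `Post` fields, the like/share weights, and the interest boost are all invented for illustration. Note that nothing in the scoring function checks whether a post is true.

```python
# A toy sketch of an engagement-ranked feed (hypothetical, for illustration only).
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    shares: int
    matches_user_interests: bool  # hypothetical signal: user engaged with similar posts before

def engagement_score(post: Post) -> float:
    # Score by engagement, not accuracy -- truth never enters the calculation.
    score = post.likes + 2 * post.shares
    if post.matches_user_interests:
        # Boosting familiar content is what narrows the feed into a filter bubble.
        score *= 1.5
    return score

def rank_feed(posts: list[Post]) -> list[Post]:
    # Most engaging posts first, so users stay on the app longer.
    return sorted(posts, key=engagement_score, reverse=True)
```

Even in this crude sketch, a sensational post with many shares that matches the user's existing interests will reliably outrank a calm, accurate one, which is the dynamic the episode warns about.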
Social media companies want you to stay engaged with their app or site for as long as possible.
So they show you stuff that you like so you won’t leave so that they can sell more of
your attention.
And because the algorithms mostly show us things we are likely to like and agree with
we often find ourselves in so-called filter bubbles, surrounded by voices we already know
we agree with, and often unable to hear from those we don’t.
This also means that most newsfeed algorithms are skewed toward engagement rather than truth.
This is so often the case in fact that entire businesses have been successfully run on posting
engaging, but false, news stories.
Many newsfeed algorithms favor outrageous and emotional content, so companies looking
to make money from clicks and advertisements can use that to their advantage.
Hundreds of websites were built on false viral stories leading up to the 2016 U.S. election,
and BuzzFeed later found that many were run by teenagers in Macedonia.
Valuing engagement over quality makes it harder for users to distinguish between truth and
fiction.
Like humans tend to interpret information in a way that matches our pre-existing beliefs.
That’s called confirmation bias.
But even if you did somehow manage to be completely emotionally and ideologically neutral
on a topic, research has shown that if there’s information you know is bogus, encountering
it again and again means you might start to believe it.
Warding off the negative effects of algorithmic newsfeeds and filter bubbles is really hard.
But I do think you can limit these effects by A) following people and pages that have
different viewpoints and perspectives than you do, to add some variety to your feed.
And B)
looking for ways to turn off the “best” or “top” posts features in your favorite
social apps so that they display information to you in a more neutral way.
All of these negative features of social media combine to create the feature that I personally
worry about the most: extreme recommendation engines.
Social media algorithms show you more of what you’ve already indicated you like.
The way we use those apps tends to keep us surrounded by information we’re primed to
believe and agree with.
And because engagement is the most important thing, and we tend to engage with what most
outrages, angers, and shocks us, the longer we hang out on some social media apps and
engage with outrageous content, the more likely those apps are to push outrageous content to us.
Researchers have found that YouTube’s recommendation algorithms, for instance, consistently showed
users more and more extreme, far-right channels once they began watching political videos.
They called it a radical rabbit hole.
YouTube was lumping together outlets like Fox News and the channels of Republican politicians
with those of known far-right conspiracy theorists and white nationalists.
They also found that far-left channels have smaller followings and were not nearly as
visible via those same pathways.
Now beginning in 2017, YouTube started to update its algorithm to prioritize what they
call “authoritativeness,” in part to try to stop this from happening.
But as previously noted, no algorithm is perfect or objective.
Ultimately, it’s on us as users not to fall down these rabbit holes, not to go merely
where the light is on.
That’s why I think it’s so important to follow accounts with differing viewpoints
and to turn off data tracking if you can, and in general to try to unwind the algorithmic
web around your social media life.
And while you’re in the feed it’s important to remember to read laterally about sources
you don’t recognize.
And also take a break once in a while.
Talk to actual people.
Get some fresh air.
I really think that’s valuable.
But even though I personally had to leave lots of the social Internet I do believe that
social media can be an effective way to learn about news and other information--if you’re
able to protect yourself.
Let’s try this in the Filter Bubble.
Oh yeah, that looks about right.
Yes, surrounded by everything I love and believe in.
Okay, that’s enough, let’s go to the Thought Bubble.
Okay, so your cousin DMed you a link headlined: Singing Creek Park Sold, Will Be Home to Monster
Truck Rally.
Wow.
That is your favorite park, so that is a huge bummer.
Your first instinct, of course, is to repost it with an angry comment like “UGH we need
nature WTH this is so unfair.”
But wait, no.
Take a deep breath and think.
Your cousin is kind of a big deal -- he’s Blue-check verified and everything.
But blue checkmarks and verified profiles do not denote truth.
They just mean an account itself is who they claim to be.
So you click the link.
It’s from a site called localnews.co, which you’ve never heard of.
And this is where your lateral reading kicks in.
Use a search engine to look up the name of that site.
Its Wikipedia entry reveals it’s a recently founded independent news site for your area,
but it’s a very short Wikipedia article - not many reputable sources have written
about the site to give us a better idea of its perspective or authority.
So you search for their claim instead: singing creek park sale.
The first result is that sketchy Local News site.
Let’s peruse the entire page.
Ah, there you go -- the seventh result is from a website you do know and trust, your
local TV station and they say the park was sold, but it’s actually going to be turned
into a nonprofit wildflower preserve.
Which you know what sounds pretty lovely.
You could leave it at that.
But as a good citizen of the internet, you should correct this misinformation.
Tell your cousin what’s up, they won’t at all be defensive,
ask them not to share it, and then post the trustworthy article yourself.
With the headline, “Condolences to monster truck enthusiasts.”
Mission accomplished.
Thanks, Thought Bubble.
So during this series we’ve talked a lot about using lateral reading to check the source,
look for authority and perspective, and then check the claim and its evidence.
With social media, a more flexible approach is probably best.
Like sometimes it makes sense to find out who’s behind the account you’re seeing.
Sometimes you should investigate the source of what they’re sharing.
Other times it’s best to evaluate the claim being made.
As you practice you’ll develop a better idea of how to spend your time online.
No matter where you begin, lateral reading will help you get the information you’re
looking for.
When in doubt about anything you encounter online you can challenge your source and your
own assumptions and see what other people have to say.
And there’s one last thing I’d add: Be suspicious of information that confirms your
pre-existing worldview, especially stuff that confirms that people you believe to be evil
or stupid are evil or stupid.
Read laterally not only when it comes to stuff you don’t want to be true, but also when
it comes to stuff you do want to be true.
I know our current information environment can be frustrating.
Believe me, I am frustrated by it.
It is really difficult to know where to look for truth and accuracy, and I wish I could
tell you there is one right way, one source you can always rely upon, but the truth is,
anyone who tells you that is selling you an ideology or a product or both.
But by making a habit of following up and following through, we can be expert navigators
of digital information, and maybe even go to places where the lights are not on.
Thanks so much for joining us for Crash Course: Navigating Digital Information.
And thanks to the Poynter Institute and the Stanford History Education Group for making
this series possible.
MediaWise is supported by Google.
If you’re interested in learning more about MediaWise and fact-checking, a good place
to start is @mediawise on Instagram.
Thanks again for watching.
Good luck out there in the wild west.
And as they say in my hometown, “don’t forget to be awesome."