Social Media: Crash Course Navigating Digital Information #10

CrashCourse
12 Mar 2019 · 16:51

Summary

TL;DR: In this Crash Course episode, John Green discusses the profound impact of social media on our lives, including changes in vocabulary, privacy, and offline behavior. He highlights the influence of targeted advertising, the creation of filter bubbles, and the spread of misinformation. Green emphasizes the importance of lateral reading, critical thinking, and engaging with diverse perspectives to navigate the digital information landscape responsibly.

Takeaways

  • 😄 Social media has profoundly influenced how we communicate, interact, and perceive information.
  • 📱 We often unconsciously follow social media trends and patterns, much like moths are drawn to light.
  • 🌐 Social media impacts offline behavior, including political rallies organized by fake accounts.
  • 🤔 Algorithms influence our feeds by prioritizing content that is engaging, often over truthful or unbiased information.
  • 🧠 Confirmation bias can lead us to believe false information if we see it repeatedly in our feeds.
  • 🚨 Targeted advertising uses personal data to show tailored ads, which raises concerns about privacy and data security.
  • 🗣️ Social media can amplify voices that traditional gatekeepers may have silenced, allowing for more diverse public discourse.
  • ⚠️ There is a risk of filter bubbles, where users are only exposed to information that aligns with their beliefs, limiting diverse perspectives.
  • 🔍 Lateral reading is crucial in verifying information on social media to avoid spreading misinformation.
  • 🏁 It's essential to challenge our assumptions and seek information from varied and credible sources to navigate digital spaces effectively.

Q & A

  • What is the main theme of this Crash Course episode?

    -The main theme is how social media impacts our perceptions, behaviors, and decisions, and how to navigate digital information responsibly.

  • What analogy does John Green use to explain human behavior on social media?

    -He uses the analogy of a moth flying toward a light, comparing it to how humans often follow social media trends without thinking critically.

  • How does social media affect our offline behaviors?

    -Social media can influence how we vote, how we engage with our communities, and even how we interact with family members, often creating real-world consequences from online interactions.

  • What role do algorithms play in shaping our social media feeds?

    -Algorithms curate our feeds by prioritizing content that aligns with our interests, often creating filter bubbles where we only see perspectives we agree with.

  • What is a 'filter bubble,' and why is it concerning?

    -A 'filter bubble' occurs when algorithms show users content they are likely to agree with, which can limit exposure to diverse viewpoints and create an echo chamber of similar ideas.

  • How do social media platforms make money?

    -Social media platforms make money primarily through targeted advertising, which uses data from users' behavior, location, and preferences to show them personalized ads.

  • What is the difference between misinformation and disinformation?

    -Misinformation is the accidental sharing of false information, while disinformation is the deliberate spread of false information to deceive people.

  • What was the impact of Russian disinformation campaigns during the 2016 U.S. elections?

    -Russian agents created fake grassroots movements on Facebook, which organized real-life political rallies in the U.S., influencing political discourse and creating division.

  • How can users reduce the impact of social media algorithms on their newsfeed?

    -Users can follow accounts with differing viewpoints, turn off top post features to get a more neutral feed, and disable data tracking to reduce personalized targeting.

  • What are 'radical rabbit holes,' and how do they form on platforms like YouTube?

    -Radical rabbit holes form when recommendation algorithms repeatedly suggest more extreme content, drawing users deeper into extreme ideologies; research on YouTube found this effect most pronounced with far-right content.

Outlines

00:00

🦋 The Moth's Problem and Our Natural Responses

John Green begins by telling a joke about a moth visiting a podiatrist, humorously illustrating how humans, like moths, often follow the light without thinking. He draws a comparison between how moths follow light and how humans tend to go where the 'light is on,' such as automatically using social media because it’s the natural thing to do. This leads to a discussion on how social media has deeply impacted our lives, including language, perceptions of privacy, and even offline experiences like political rallies organized through fake Facebook pages.

05:01

📱 The Good and Bad of Social Media

John highlights the positive and negative impacts of social media. On the positive side, platforms like Facebook and Twitter allow voices to be heard that were previously silenced. Social media facilitates the formation of communities around shared interests. However, the downsides are significant: from cyberbullying to scams and disinformation. The key issue is how social media influences both what we share and what we consume, often leading us into filter bubbles driven by engagement rather than truth.

10:02

⚠️ Extreme Recommendation Engines and Radicalization

John dives into the dangers of extreme recommendation algorithms. Platforms like YouTube have been found to push increasingly radical content to users based on what they’ve previously watched, creating 'radical rabbit holes.' While some steps have been taken to counter this, such as prioritizing 'authoritativeness,' these algorithms can still drive people toward more extreme viewpoints. The solution lies in conscious user behavior: seeking diverse viewpoints, limiting data tracking, and staying vigilant against falling into algorithmic traps.

15:05

🔍 The Power of Lateral Reading and Digital Literacy

John emphasizes the importance of lateral reading—researching sources, claims, and context when encountering information online. He gives an example of misinformation about a park sale, showing how checking multiple sources can uncover the truth. He encourages viewers to question not only things they don’t believe but also those they do, and to resist information that confirms biases. The ultimate goal is to become better navigators of the digital landscape, avoiding the allure of simple truths and being skeptical of ideologies.


Keywords

💡Social Media Feed

A social media feed refers to the stream of posts, updates, and content shown to a user on a social media platform. In the context of the video, the feed is described as a fundamental part of how users interact with social media, influencing their perceptions and behaviors. The video discusses how the feed is curated by algorithms to maximize engagement, potentially leading to 'filter bubbles' where users are only exposed to content that aligns with their existing views.

💡Algorithms

Algorithms, in the context of the video, are the sets of rules that social media platforms use to determine the order and content of posts in a user's feed. They are designed to predict what users will engage with most, often prioritizing content that sparks strong reactions. The video explains how these algorithms can create 'filter bubbles' and contribute to the spread of misinformation by showing users more extreme or sensational content.
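The ranking logic described above can be sketched in a few lines of code. This is a toy illustration only: the post fields, scoring weights, and interest boost are invented assumptions for demonstration, not any platform's actual formula.

```python
# Toy sketch of an engagement-driven feed-ranking algorithm, as the video
# describes it conceptually. All weights and fields are illustrative
# assumptions, not a real platform's formula.

def rank_feed(posts, user_interests):
    """Order posts by a predicted-engagement score, highest first."""
    def score(post):
        s = post["likes"] + 2 * post["shares"]   # strong reactions count more
        if post["topic"] in user_interests:      # boost topics the user already likes...
            s *= 2                               # ...which is how filter bubbles form
        return s
    return sorted(posts, key=score, reverse=True)

posts = [
    {"topic": "gardening", "likes": 10, "shares": 1},
    {"topic": "politics",  "likes": 50, "shares": 30},
    {"topic": "science",   "likes": 40, "shares": 5},
]
feed = rank_feed(posts, user_interests={"politics"})
print([p["topic"] for p in feed])  # → ['politics', 'science', 'gardening']
```

Note that nothing in the score measures truthfulness: the sketch optimizes engagement only, which is exactly the skew the video warns about.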

💡Filter Bubbles

Filter bubbles are personalized, algorithmically curated information environments that reflect a user's interests and beliefs, often leading to a lack of exposure to diverse perspectives. The video discusses how social media algorithms can create these bubbles, which can reinforce existing biases and limit the range of information users encounter, as they are primarily shown content they are likely to agree with or engage with.

💡Disinformation

Disinformation refers to false or misleading information that is spread intentionally to deceive or mislead. The video addresses the role of social media in the spread of disinformation, noting how it can be used to manipulate public opinion and influence behaviors. It also distinguishes disinformation from misinformation, which is incorrect information spread without the intent to deceive.

💡Lateral Reading

Lateral reading is a critical thinking strategy where one checks the credibility of information by looking beyond the initial source, verifying facts, and considering multiple perspectives. The video emphasizes the importance of lateral reading in the context of social media, where users are encouraged to investigate the sources of information and the credibility of claims before accepting or sharing them.

💡Engagement

Engagement, in the context of social media, refers to user interactions with content, such as likes, shares, comments, and clicks. The video explains how social media platforms prioritize content that generates high engagement, as it keeps users on the platform longer and allows for more targeted advertising. This focus on engagement can lead to the amplification of sensational or controversial content.

💡Targeted Advertising

Targeted advertising is a form of online advertising where ads are tailored to individual users based on their online behavior, interests, and demographic information. The video discusses how social media platforms collect data on users to enable advertisers to target them with personalized ads, which can lead to privacy concerns and the manipulation of user behavior.
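The matching process described above can be sketched as a small selector: build a profile from tracked attributes, then pick the ad whose targeting criteria overlap with it most. The profile keys, ad names, and matching rule are invented for illustration; real ad platforms use far more data and far more complex models.

```python
# Illustrative sketch of targeted-ad selection. Field names and the
# matching rule are assumptions made for this example, not a real ad API.

def pick_ad(ads, profile):
    """Return the ad whose targeting criteria best match the user profile."""
    def matches(ad):
        # count how many targeting key/value pairs the profile satisfies
        return sum(1 for k, v in ad["targets"].items() if profile.get(k) == v)
    return max(ads, key=matches)

# Profile assembled from in-app behavior, browsing history, geolocation, etc.
profile = {"age_group": "18-29", "location": "US", "interest": "cameras"}
ads = [
    {"name": "vintage-camera-shop", "targets": {"interest": "cameras", "location": "US"}},
    {"name": "retirement-plan",     "targets": {"age_group": "65+"}},
]
print(pick_ad(ads, profile)["name"])  # → vintage-camera-shop
```

The privacy concern the video raises lives in the `profile` dict: every key in it is a piece of tracked personal data, which is why disabling data and location tracking shrinks what this kind of matching can use.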

💡Confirmation Bias

Confirmation bias is the tendency to favor information that confirms one's existing beliefs or values. The video touches on how social media algorithms can exploit confirmation bias by showing users content that aligns with their views, reinforcing their beliefs and potentially isolating them from opposing perspectives.

💡Misinformation

Misinformation is false information that is spread, but unlike disinformation, it is not spread with the intent to deceive. The video discusses the role of social media in the spread of misinformation and how it can inadvertently contribute to the confusion and misunderstanding of facts, particularly when users do not critically evaluate the sources of information they encounter.

💡Cyberbullying

Cyberbullying refers to the use of electronic communication to bully others, typically by sending messages of an intimidating or threatening nature. The video mentions cyberbullying as one of the negative aspects of social media, where the anonymity and reach of online platforms can exacerbate the harm caused by bullying behavior.

💡Fake News

Fake news is a term used to describe news stories that are entirely fabricated and presented as if they are genuine news. The video discusses the proliferation of fake news on social media and its potential to mislead and manipulate public opinion, especially when such content is designed to look credible and is shared widely.

Highlights

John Green introduces the topic of navigating social media feeds and jokes about human behavior being similar to moths following a light.

Humans often follow the crowd on social media without thinking critically, much like moths are drawn to light.

Social media has fundamentally changed how we interact and communicate, shaping perceptions of privacy and influencing offline experiences.

In 2016, Russian agents used fake Facebook pages to organize real political rallies across the U.S., showcasing the power of online manipulation.

Silicon Valley has influenced modern language, turning words like 'friend,' 'Google,' and 'DM' into verbs.

Social media can affect how we engage in politics and family discussions, with online arguments influencing real-life interactions.

Algorithms on social media platforms push content that engages users, often reinforcing filter bubbles and preventing exposure to differing viewpoints.

68% of U.S. adults get their news from social media, with Facebook being a key platform, raising concerns about misinformation.

Targeted advertising on social media platforms is highly personalized, based on user habits, browsing history, and even geolocation.

Misinformation and false news are frequently shared on social media, making lateral reading a vital skill to verify sources.

Newsfeed algorithms often prioritize engagement over truth, leading to the spread of emotional and sensational content.

YouTube’s recommendation algorithm has been found to push users toward more extreme content, especially in political topics.

Despite algorithmic flaws, users can protect themselves by diversifying their social media feeds and verifying information independently.

John Green encourages viewers to seek diverse viewpoints and remain skeptical of information that aligns too perfectly with their beliefs.

Lateral reading and questioning sources are key to navigating digital information responsibly, preventing misinformation spread.

Transcripts

play00:00

Hi, I’m John Green, and this is Crash Course: Navigating Digital Information.

play00:03

So we’re going to talk about your social media feed today, but first: At the beginning

play00:08

of this series, I told you one of the two jokes I know, and now that we’ve reached

play00:12

the last episode, I’d like to tell you the other one.

play00:15

So a moth walks into a podiatrist’s office, and the podiatrist says, “What seems to

play00:18

be the problem, moth?”

play00:20

And the moth answers, “Awww, doc.

play00:21

If only there were only one problem.

play00:24

I can’t hold down a job because I’m not good at anything.

play00:27

My wife can hardly stand to look at me;

play00:29

we don’t even love each other anymore,

play00:31

worse than that, I can’t even remember if we ever loved each other.

play00:35

When I look into the eyes of my children,

play00:37

All I see is the same emptiness and despair that I feel in my own heart, doc.”

play00:43

And then the podiatrist says, “Whoa, moth.

play00:45

Okay.

play00:46

Those are very serious problems, but it seems like you need to see a psychologist.

play00:49

I’m a podiatrist.

play00:51

What brought you here today?”

play00:53

And the moth says, “Oh.

play00:56

The light was on.”

play00:57

We humans like to think of ourselves as extremely sophisticated animals.

play01:01

Like moths may fly toward the light, but humans are endowed with free will.

play01:05

We make choices.

play01:07

Except a lot of the time, we just go where the light is on.

play01:11

We do whatever feels like the natural thing.

play01:14

We get on facebook because other people are on facebook.

play01:17

We scroll through posts because the architecture of the site tells us to scroll.

play01:22

We become passive.

play01:23

In the past decade especially, social media has fundamentally changed us.

play01:27

Like take your vocabulary, for example.

play01:30

Silicon Valley rivals Shakespeare in its prolific additions to the English language.

play01:35

Friend, Google, and ‘gram are all verbs now.

play01:39

Snap and handle have new definitions.

play01:41

Sliding into someone’s DMs is a thing.

play01:44

But it’s not just how we speak -- these apps have not-so-subtly become embedded in

play01:48

our daily lives very quickly.

play01:51

Sometimes we don’t even realize how much they impact us.

play01:54

They’ve changed our perceptions and expectations of privacy and they’ve also helped to shape

play01:59

our offline experience.

play02:00

In 2016 for instance, Russian agents organized political rallies all over the U.S. by creating

play02:05

fake Facebook pages for made-up grassroots communities that then had real offline rallies.

play02:12

Just by posing as organizers against Donald Trump or against Hillary Clinton, they actually

play02:16

got real people to show up in Florida, New York, North Carolina, Washington, and Texas.

play02:22

And those rally-goers didn’t know that it was a ruse.

play02:25

I find that scary.

play02:26

So today, for our big finale, we’re talking about the great white whale of navigating

play02:31

online information: your social media feed.

play02:33

INTRO

play02:43

So quick note here at the start.

play02:44

I’m not currently using a bunch of social media platforms.

play02:47

Which may mean that I’m no longer an expert in them, but it’s only been six weeks and

play02:50

I don’t think anything has changed that much.

play02:52

Also, it turns out that whether or not you participate in Twitter is irrelevant to whether

play02:57

Twitter affects your life because what’s shared online has offline consequences.

play03:02

Like online shouting matches about politics can influence how we vote and also how we

play03:06

talk to our extended family at the Thanksgiving dinner table.

play03:09

Unless you don’t live in the US or Canada in which case I guess you don’t have Thanksgiving

play03:13

and presumably you never fight with your aunts and uncles about politics.

play03:16

The way we interact in social media is shaping all of our offline behaviors, from how we

play03:21

engage with IRL communities to how we consume goods and services.

play03:25

That’s why there are so many people you don’t know, and companies and organizations

play03:29

using social media to try to influence your thoughts and actions.

play03:33

Sometimes those who want to influence you use false identities like those with the Russian

play03:37

rallies.

play03:37

Sometimes, and more overtly, they buy your attention with advertising.

play03:41

Some just create really engaging videos about a kitten saved during a hurricane to steal

play03:46

your attention.

play03:47

Some of these actors have relatively benign goals and act fairly, like a company sending

play03:51

ads into your feed for a Harry Potter mug that it turns out you actually want because

play03:55

you are a Hufflepuff and you are proud!

play03:57

But others have terrible motives and spread disinformation, like hoax news sites which

play04:01

are all run by Slytherins.

play04:02

Still others aren’t quite in either camp.

play04:04

They might unwittingly spread inaccurate information, or misinformation.

play04:08

Like your aunt who always posts about Onion articles like they’re actual news.

play04:12

Or me, on the several occasions when I have failed to pause and laterally read before

play04:17

retweeting news that turned out to be false.

play04:19

The big problem with all of that is that 68% of U.S. adults get news through some form

play04:25

of social media and nearly half of U.S. adults get news through Facebook.

play04:30

And across the globe, people between 18 and 29 years old are more likely to get their

play04:34

news from social media than older adults.

play04:36

When we’re this reliant on a media ecosystem full of pollution, we have to take responsibility

play04:42

for what we read, post and share and to do that we should fully understand how social

play04:47

media networks really function including the good stuff, and also the terrible stuff.

play04:52

First, the good side.

play04:53

For one thing, platforms like Facebook, Twitter and Instagram allow us to share information

play04:57

and thoughts without the help of traditional gatekeepers.

play05:00

Prior to social media it was really difficult to have your voice heard in a large public

play05:06

forum.

play05:06

And because all the posts in our feeds look more or less equal social media has allowed

play05:10

people to have voices in public discourse who previously would have been silenced by

play05:14

power structures.

play05:15

That’s great!

play05:16

All tweets were created equal and everybody’s faces look weird with that one square-jawed

play05:21

snapchat filter and we’re all in this together!

play05:24

Also, social media is great for making friends and finding communities.

play05:28

We can organize ourselves into these little affinity groups around special interests or

play05:33

organization, which makes communication much easier than it was before.

play05:37

Like for example, what if a group of people who want to get together and figure out how

play05:41

to decrease the overall worldwide level of suck.

play05:44

Or, when I need to know what is eating my tomatoes, I can go to a gardening facebook

play05:48

group.

play05:48

That example by the way is for old people alienated by my previous mention of snapchat

play05:52

filters.

play05:52

That said there are plenty of problems with social media from cyberbullying to catfishing

play05:58

to scams to massive disinformation campaigns to people live tweeting shows you wanted to

play06:03

watch later.

play06:04

And if you’re going to live partly inside these feeds I think it’s really important

play06:08

to understand both the kinds of information that are likely to be shared with you and

play06:12

the kinds of information you’re incentivised to share.

play06:16

Let’s start with targeted advertising.

play06:18

So you’re probably seeing an ad in this corner.. possibly this one.

play06:22

I don’t have a great sense of direction when I’m inside the feed.

play06:25

Or maybe you watched an ad before this video played.

play06:28

Regardless, you may have noticed that something you searched for recently has been advertised

play06:32

to you.

play06:33

Like for instance I’m trying to improve my collection of vintage cameras for the background

play06:37

and suddenly all I see are advertisements for vintage cameras.

play06:41

Social media companies make money by selling advertisements.

play06:44

That’s why you get to use those platforms for free.

play06:47

But these ads are very different from billboards or ads in a local newspaper, because these

play06:52

ads were crafted just for you, or people like you, based on what social media companies

play06:58

know about you.

play07:00

And they know a lot.

play07:01

They can learn your interests and habits based on how you use their app, but they also track

play07:05

you elsewhere -- via other apps associated with that company, or by using geolocation

play07:10

features to figure out where you physically are.

play07:13

Social media companies take all that information and present it to advertisers in one form

play07:18

or another so that those advertisers can target their ads based on your interests and browsing

play07:23

history and location and age and gender and much more.

play07:27

Can you protect your privacy and your feeds from targeted advertising?

play07:31

Kind of.

play07:32

Sometimes.

play07:33

You can check your favorite apps and disable data and location tracking where you can -- these

play07:37

features may fall under Ad Preferences or Security or Privacy settings.

play07:41

Another potential downside to social media: how algorithms organize our feeds.

play07:46

So algorithms are sets of rules or operations a computer follows to complete a task.

play07:51

To put it very simply: social media sites use what they know about your habits, they

play07:56

combine that with their knowledge of other people and the things you’ve self-selected

play08:01

to follow, and funnel all that information through an algorithm.

play08:05

And then the algorithm decides what to show you in your newsfeed.

play08:09

Generally speaking, a newsfeed algorithm looks for what you’re most likely to engage with,

play08:14

by liking or sharing it.

play08:16

Social media companies want you to stay engaged with their app or site for as long as possible.

play08:21

So they show you stuff that you like so you won’t leave so that they can sell more of

play08:27

your attention.

play08:28

And because the algorithms mostly show us things we are likely to like and agree with

play08:33

we often find ourselves in so-called filter bubbles, surrounded by voices we already know

play08:38

we agree with, and often unable to hear from those we don’t.

play08:42

This also means that most newsfeed algorithms are skewed toward engagement rather than truth.

play08:48

This is so often the case in fact that entire businesses have been successfully run on posting

play08:52

engaging, but false, news stories.

play08:55

Many newsfeed algorithms favor outrageous and emotional content, so companies looking

play09:00

to make money from clicks and advertisements can use that to their advantage.

play09:04

Hundreds of websites were built on false viral stories leading up to the 2016 U.S. election,

play09:10

and Buzzfeed later found out many were run by teenagers in Macedonia.

play09:14

Valuing engagement over quality makes it harder for users to distinguish between truth and

play09:20

fiction.

play09:21

Like humans tend to interpret information in a way that matches our pre-existing beliefs.

play09:26

That’s called confirmation bias.

play09:28

But even if you did somehow manage to be completely emotionally and ideologically neutral on a

play09:34

topic.

play09:34

Research has shown that if there’s information you know is bogus, encountering it again and

play09:40

again means you might start to believe it.

play09:43

Warding off the negative effects of algorithmic newsfeeds and filter bubbles is really hard.

play09:48

But I do think you can limit these effects by A) following people and pages that have

play09:52

different viewpoints and perspectives than you do, to add some variety to your feed.

play09:57

And B)

play09:58

looking for ways to turn off the “best” or “top” posts features in your favorite

play10:02

social apps so that they display information to you in a more neutral way.

play10:06

All of these negative features of social media combine to create the feature that I personally

play10:11

worry about the most: extreme recommendation engines.

play10:15

Social media algorithms show you more of what you’ve already indicated you like.

play10:19

The way we use those apps tends to keep us surrounded by information we’re primed to

play10:24

believe and agree with.

play10:25

And because engagement is the most important thing, and we tend to engage with what most

play10:29

outrages, angers, and shocks us.

play10:32

The longer we hang out on some social media apps and engage with outrageous content the

play10:38

more likely those apps are to push outrageous content to us.

play10:43

Researchers have found that YouTube’s recommendation algorithms, for instance, consistently showed

play10:47

users more and more extreme, far-right channels once they began watching political videos.

play10:53

They called it a radical rabbit hole.

play10:55

YouTube was lumping together outlets like Fox News and the channels of Republican politicians

play11:00

with those of known far-right conspiracy theorists and white nationalists.

play11:04

They also found that far-left channels have smaller followings and were not nearly as

play11:08

visible via those same pathways.

play11:10

Now beginning in 2017, YouTube started to update its algorithm to prioritize what they

play11:15

call “authoritativeness.”

play11:17

In part to try to stop this from happening.

play11:19

But as previously noted, no algorithm is perfect or objective.

play11:23

Ultimately, it’s on us as users not to fall down these rabbit holes, not to go merely

play11:29

where the light is on.

play11:31

That’s why I think it’s so important to follow accounts with differing viewpoints

play11:34

and to turn off data tracking if you can, and in general to try to unwind the algorithmic

play11:39

web around your social media life.

play11:42

And while you’re in the feed it’s important to remember to read laterally about sources

play11:46

you don’t recognize.

play11:47

And also take a break once in a while.

play11:50

Talk to actual people.

play11:52

Get some fresh air.

play11:53

I really think that’s valuable.

play11:55

But even though I personally had to leave lots of the social Internet I do believe that

play11:59

social media can be an effective way to learn about news and other information--if you’re

play12:04

able to protect yourself.

play12:05

Let’s try this in the Filter Bubble.

play12:07

Oh yeah, that looks about right.

play12:12

Yes, surrounded by everything I love and believe in.

play12:16

Okay, that’s enough, let’s go to the Thought Bubble.

play12:18

Okay, so your cousin DMed you a link headlined: Singing Creek Park Sold, Will Be Home to Monster

play12:23

Truck Rally.

play12:24

Wow.

play12:25

That is your favorite park, so that is a huge bummer.

play12:28

Your first instinct, of course, is to repost it with an angry comment like “UGH we need

play12:32

nature WTH this is so unfair.”

play12:35

But wait, no.

play12:36

Take a deep breath and think.

play12:37

Your cousin is kind of a big deal -- he’s Blue-check verified and everything.

play12:42

But blue checkmarks and verified profiles do not denote truth.

play12:46

They just mean an account itself is who they claim to be.

play12:49

So you click the link.

play12:50

It’s from a site called localnews.co, which you’ve never heard of.

play12:55

And this is where your lateral reading kicks in.

play12:57

Use a search engine to look up the name of that site.

play12:59

Its Wikipedia entry reveals it’s a recently founded independent news site for your area,

play13:04

but it’s a very short Wikipedia article - not many reputable sources have written

play13:08

about the site to give us a better idea of its perspective or authority.

play13:11

So you search for their claim instead: singing creek park sale.

play13:16

The first result is that sketchy Local News site.

play13:18

Let’s peruse the entire page.

play13:20

Ah, there you go -- the seventh result is from a website you do know and trust, your

play13:25

local TV station and they say the park was sold, but it’s actually going to be turned

play13:29

into a nonprofit wildflower preserve.

play13:32

Which you know what sounds pretty lovely.

play13:34

You could leave it at that.

play13:35

But as a good citizen of the internet, you should correct this misinformation.

play13:39

Tell your cousin what’s up, they won’t at all be defensive,

play13:41

ask them not to share it, and then post the trustworthy article yourself.

play13:45

With the headline, “Condolences to monster truck enthusiasts.”

play13:49

Mission accomplished.

play13:50

Thanks, Thought Bubble.

play13:51

So during this series we’ve talked a lot about using lateral reading to check the source,

play13:56

look for authority and perspective, and then check the claim and its evidence.

play14:00

With social media, a more flexible approach is probably best.

play14:04

Like sometimes it makes sense to find out who’s behind the account you’re seeing.

play14:08

Sometimes you should investigate the source of what they’re sharing.

play14:11

Other times it’s best to evaluate the claim being made.

play14:15

As you practice you’ll develop a better idea of how to spend your time online.

play14:19

No matter where you begin, lateral reading will help you get the information you’re

play14:24

looking for.

play14:25

When in doubt about anything you encounter online you can challenge your source and your

play14:28

own assumptions and see what other people have to say.

play14:32

And there’s one last thing I’d add: Be suspicious of information that confirms your

play14:37

pre-existing worldview, especially stuff that confirms that people you believe to be evil

play14:43

or stupid are evil or stupid.

play14:46

Read laterally not only when it comes to stuff you don’t want to be true, but also when

play14:51

it comes to stuff you do want to be true.

play14:54

I know our current information environment can be frustrating.

play14:58

Believe me, I am frustrated by it.

play14:59

It is really difficult to know where to look for truth and accuracy, and I wish I could

play15:04

tell you there is one right way, one source you can always rely upon, but the truth is,

play15:11

anyone who tells you that is selling you an ideology or a product or both.

play15:16

But by making a habit of following up and following through, we can be expert navigators

play15:21

of digital information, and maybe even go to places where the lights are not on.

play15:27

Thanks so much for joining us for Crash Course: Navigating Digital Information.

play15:30

And thanks to the Poynter Institute and the Stanford History Education Group for making

play15:34

this series possible.

play15:36

MediaWise is supported by Google.

play15:38

If you’re interested in learning more about MediaWise and fact-checking, a good place

play15:42

to start is @mediawise on Instagram.

play15:45

Thanks again for watching.

play15:46

Good luck out there in the wild west.

play15:47

And as they say in my hometown, “don’t forget to be awesome.”


Related Tags
social media, digital literacy, information bias, confirmation bias, media influence, fact-checking, algorithms, privacy, newsfeed, online behavior