Social Media: Crash Course Navigating Digital Information #10

CrashCourse
12 Mar 201916:51

Summary

TLDR: This video script discusses how social media feeds influence our language, our perceptions of privacy, and even our offline experiences. It covers both the good and bad sides of social media, along with strategies for navigating online information accurately. In particular, it focuses on how social media algorithms shape the information we receive, and on the problems of filter bubbles and extreme recommendation engines. Finally, it emphasizes the importance of lateral reading for verifying information on social media and explains how to find trustworthy sources.

Takeaways

  • 😀 The script opens with a joke before turning to social media feeds, using it to show how humans naturally follow whatever feels like the obvious thing to do.
  • 🔍 People's behavior is shaped by what feels natural and by what others are doing, and social media has fundamentally changed that influence.
  • 📈 Internet companies have coined new English verbs, reshaping our everyday language and our perceptions of privacy.
  • 🗣️ Social media empowers people to join public discourse by letting them be heard without traditional media gatekeepers.
  • 👥 Social media helps people who share particular interests or causes form communities, making communication easier.
  • 🚫 Social media also has many problems, including cyberbullying, scams, and the spread of disinformation.
  • 🎯 Targeted advertising is how social media companies earn revenue; ads are customized based on your interests and habits.
  • 🤖 Algorithms use your habits and data about other people to show you the content you are most likely to engage with in your feed.
  • 🌐 Filter bubbles arise when algorithms show only opinions you agree with and voices you already know, limiting contact with people who hold different views.
  • 👀 Extreme recommendation engines show you ever more of what you already like, which can lead down "rabbit holes" of increasingly radical content.
  • 🛡️ Users can limit algorithmic influence by following accounts with differing viewpoints and turning off data tracking.
  • 📚 It is important to practice lateral reading to verify trustworthy sources and to check hoaxes and misinformation.
  • 🌟 Used the right way, social media can be an effective means of learning about news and other information.

Q & A

  • What is the series on navigating digital information that John Green hosts?

    -John Green hosts "Crash Course: Navigating Digital Information," in which he explains how to analyze social media feeds and how to understand digital information accurately.

  • Why does John Green tell the joke about a moth visiting a podiatrist?

    -The joke about the moth going to the podiatrist illustrates the parallel between the moth drawn to the light and humans doing whatever feels natural, including on social media.

  • How does social media influence our language and daily lives?

    -Social media has strongly influenced our language by adding new English verbs and meanings. It has also affected our perceptions of privacy and our offline experiences.

  • How did Russian agents use social media in 2016?

    -In 2016, Russian agents created fake Facebook pages for made-up grassroots communities and used them to organize real offline political rallies across the U.S.

  • How can social media influence our offline behavior?

    -Online political arguments can influence how we vote and how we talk with our families at the Thanksgiving dinner table.

  • How can information shared on social media influence our behavior?

    -Information shared on social media can spread misinformation and disinformation, which can shape how we think and act.

  • How are social media ads tailored to us?

    -Social media companies learn our interests and habits through how we use their apps, through other associated apps, and through geolocation features, and they present that data to advertisers so ads can be targeted at us.

  • How do social media algorithms organize news feeds?

    -Based on our interests, our habits, and the people and pages we follow, the algorithms organize our news feeds to display the content we are predicted to engage with most.

  • What is a filter bubble, and how can it affect how we gather information?

    -A filter bubble is the situation algorithms create in which we are surrounded by information and opinions we already agree with. It can make it harder to encounter differing views and diverse information.

  • What problems do extreme recommendation engines on social media cause?

    -By showing us more and more of the content we already like, extreme recommendation engines increase the likelihood that we are pushed toward increasingly extreme views.

  • What approach should we take when evaluating information on social media?

    -When evaluating information on social media, we can verify sources through lateral reading, evaluate claims and their evidence, and follow accounts with differing viewpoints to reduce the effects of filter bubbles.

  • Why can information on social media reinforce confirmation bias toward what we already believe?

    -Confirmation bias is the psychological tendency to unconsciously favor information that matches our existing beliefs and to accept it as true. On social media, algorithms can amplify this tendency.

  • What should we keep in mind to use social media effectively?

    -To use social media effectively, it is important to follow accounts with differing viewpoints, turn off data tracking, limit the influence of algorithms, and build the habit of checking trustworthy sources.

  • In the series finale, what approach does John Green recommend for finding information on social media?

    -He recommends building the habit of lateral reading, checking trustworthy sources, questioning your own assumptions and sources, and applying the same standards to information you want to be true.

Outlines

00:00

😀 Social media's influence, opening with humor

In this final episode of the Crash Course series hosted by John Green, the focus is on social media feeds. It opens with a joke illustrating how humans tend to follow whatever feels natural, then explains how social media influences people's behavior, language, and perceptions of privacy. It also cites the 2016 Russian scheme to organize political rallies across the U.S. as an example of social media's effect on offline behavior.

05:01

🗣 The benefits of social media and the influence of advertising

Emphasizes how social media amplifies people's voices and gives them opportunities to participate in public discussion, and notes that it is useful for making friends and finding communities. It also touches on problems such as targeted advertising, algorithmically composed news feeds, filter bubbles, and the spread of misinformation.

10:02

🔍 Algorithms and extreme content recommendation engines

Explains how social media algorithms recommend content according to users' interests, and how that process can produce extreme content and filter bubbles. Also discusses the "rabbit hole" phenomenon in which YouTube's recommendation algorithm surfaces increasingly extreme political channels. As advice for avoiding algorithmic skew, it suggests following accounts with differing viewpoints and turning off data tracking.

15:05

🌐 Navigating digital information and the importance of fact-checking

Closing out the series, emphasizes the importance of lateral reading for verifying information on social media. Identifying trustworthy sources and checking claims and evidence are the keys to ensuring the accuracy of digital information. It also calls for questioning your own assumptions and the information you want to believe. Finally, it introduces MediaWise and fact-checking resources and encourages viewers to navigate the internet accurately.

Keywords

💡Social Media

Social media refers to digital platforms where people can share information and ideas. The video notes that social media influences our communication styles, our perceptions of privacy, and even our offline experiences. For example, it cites Russian agents creating fake Facebook pages to organize real offline rallies.

💡Digital Information

Digital information is information shared on the internet or through digital devices. The video emphasizes that the digital information we receive through social media feeds strongly influences our judgments and behavior.

💡Filter Bubble

A filter bubble is the phenomenon that arises when algorithms selectively display information according to a user's interests. The video explains that this can supply users only with information they agree with, creating an information environment without diversity and making it harder to distinguish truth from fiction.
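
The feedback loop behind a filter bubble can be made concrete with a toy simulation. This is only an illustration with invented topic names and weights, not any real platform's behavior: the feed samples topics in proportion to a weight, and every engagement nudges that topic's weight up, so over time the feed converges on what the user already likes.

```python
import random

# Toy filter-bubble simulation (invented weights, not any real platform's code).
# The feed samples a topic in proportion to its weight; engaging with a topic
# raises its weight, so future feeds show more of the same.
random.seed(42)
weights = {"left_news": 1.0, "right_news": 1.0, "gardening": 1.0}

for _ in range(200):
    topics = list(weights)
    shown = random.choices(topics, weights=[weights[t] for t in topics])[0]
    if shown == "gardening":        # this user only ever engages with gardening
        weights[shown] *= 1.1       # engagement feeds back into the ranking

share = weights["gardening"] / sum(weights.values())
print(f"gardening's share of the feed weight: {share:.0%}")
```

After a couple hundred rounds, the one topic the user engages with dominates the weights almost entirely. That is the bubble: the other voices are still there, but the algorithm almost never surfaces them.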

💡Targeted Advertising

Targeted advertising is the practice of social media companies displaying ads based on users' interests and habits. The video discusses how these ads are crafted specifically for users, which can compromise privacy.

💡Algorithm

An algorithm is a set of rules or operations a computer follows to complete a task. The video explains how social media newsfeed algorithms maximize user engagement, and how that leads to filter bubbles and skewed digital information.
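
As a concrete illustration of "rules a computer follows," here is a minimal sketch of an engagement-ranked feed. The scoring signals and weights are invented for this example; real newsfeed algorithms use far more signals, but the shape is the same: score each post by predicted engagement, then show the highest scorers first.

```python
# Minimal sketch of an engagement-ranked newsfeed (signals and weights invented).
def engagement_score(post, user_interests):
    """Predicted engagement: prior reactions, boosted for topics the user follows."""
    score = post["likes"] + 2 * post["shares"]      # shares signal stronger engagement
    if post["topic"] in user_interests:             # self-selected follows get a boost
        score *= 2
    return score

def rank_feed(posts, user_interests):
    # Highest predicted engagement first -- nothing here checks accuracy.
    return sorted(posts, key=lambda p: engagement_score(p, user_interests), reverse=True)

posts = [
    {"id": 1, "topic": "gardening", "likes": 10, "shares": 1},
    {"id": 2, "topic": "politics",  "likes": 50, "shares": 30},
    {"id": 3, "topic": "cameras",   "likes": 5,  "shares": 0},
]
feed = rank_feed(posts, user_interests={"gardening"})
print([p["id"] for p in feed])  # → [2, 1, 3]: the high-outrage post still tops the feed
```

Note that nothing in the ranking asks whether a post is true; it optimizes only for engagement, which is exactly the skew toward outrage over accuracy that the video describes.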

💡Engagement

Engagement refers to users reacting to content on social media. The video points out that social media companies prioritize engagement, which can drive the spread of false information and extreme content.

💡Digital Media Literacy

Digital media literacy is the skill of accurately evaluating and verifying information and news online. The video emphasizes the importance of using lateral reading to accurately judge the information we encounter on social media.

💡Privacy

Privacy is an individual's right to protect their information and life from others. The video discusses social media's effects on privacy and the possibility that targeted advertising compromises it.

💡Misinformation

Misinformation is inaccurate information spread unwittingly, as distinct from disinformation, which is spread deliberately. The video describes the risk of false information spreading on social media and the importance of users detecting it and sharing accurate information.

💡Lateral Reading

Lateral reading is a technique for verifying sources of information online. The video presents it as an essential skill for accurately judging information found on social media and reducing misunderstanding and misinformation.

Highlights

John Green introduces the topic of social media's influence on human behavior and its comparison to moths attracted to light.

Discussion on how social media platforms have fundamentally changed human vocabulary and daily life.

The revelation that social media impacts users' offline behaviors and experiences, including political rallies organized by Russian agents.

Explanation of how social media algorithms create 'filter bubbles' that reinforce users' existing beliefs.

The potential dangers of misinformation and disinformation spread through social media, especially during the 2016 U.S. election.

John Green's personal experience with social media and its offline consequences, such as influencing voting behavior and family discussions.

The role of social media in shaping users' perceptions and expectations of privacy.

The benefits of social media in allowing individuals to share information without traditional gatekeepers.

How social media enables the formation of affinity groups and facilitates communication around special interests.

The issue of targeted advertising on social media and its basis on users' data and habits.

John Green's suggestion to check and disable data and location tracking in app settings to protect privacy.

The concept of 'confirmation bias' and its amplification through social media algorithms.

The phenomenon of 'extreme recommendation engines' pushing users towards more radical content on platforms like YouTube.

YouTube's efforts to update its algorithm to prioritize 'authoritativeness' and counteract the radicalization effect.

Advice on following diverse accounts and turning off certain features to mitigate the effects of algorithmic feeds.

The importance of lateral reading and verifying information on social media, including checking the source and claim.

John Green's encouragement to be critical of information that confirms one's pre-existing worldview and to read laterally.

The series conclusion emphasizing the need for users to be expert navigators of digital information.

Acknowledgment of the support from the Poynter Institute, Stanford History Education Group, and Google for the series.

Transcripts

play00:00

Hi, I’m John Green, and this is Crash Course: Navigating Digital Information.

play00:03

So we’re going to talk about your social media feed today, but first: At the beginning

play00:08

of this series, I told you one of the two jokes I know, and now that we’ve reached

play00:12

the last episode, I’d like to tell you the other one.

play00:15

So a moth walks into a podiatrist’s office, and the podiatrist says, “What seems to

play00:18

be the problem, moth?”

play00:20

And the moth answers, “Awww, doc.

play00:21

If only there were only one problem.

play00:24

I can’t hold down a job because I’m not good at anything.

play00:27

My wife can hardly stand to look at me;

play00:29

we don’t even love each other anymore,

play00:31

worse than that, I can’t even remember if we ever loved each other.

play00:35

When I look into the eyes of my children,

play00:37

All I see is the same emptiness and despair that I feel in my own heart, doc.”

play00:43

And then the podiatrist says, “Whoa, moth.

play00:45

Okay.

play00:46

Those are very serious problems, but it seems like you need to see a psychologist.

play00:49

I’m a podiatrist.

play00:51

What brought you here today?”

play00:53

And the moth says, “Oh.

play00:56

The light was on.”

play00:57

We humans like to think of ourselves as extremely sophisticated animals.

play01:01

Like moths may fly toward the light, but humans are endowed with free will.

play01:05

We make choices.

play01:07

Except a lot of the time, we just go where the light is on.

play01:11

We do whatever feels like the natural thing.

play01:14

We get on facebook because other people are on facebook.

play01:17

We scroll through posts because the architecture of the site tells us to scroll.

play01:22

We become passive.

play01:23

In the past decade especially, social media has fundamentally changed us.

play01:27

Like take your vocabulary, for example.

play01:30

Silicon Valley rivals Shakespeare in its prolific additions to the English language.

play01:35

Friend, Google, and ‘gram are all verbs now.

play01:39

Snap and handle have new definitions.

play01:41

Sliding into someone’s DMs is a thing.

play01:44

But it’s not just how we speak -- these apps have not-so-subtly become embedded in

play01:48

our daily lives very quickly.

play01:51

Sometimes we don’t even realize how much they impact us.

play01:54

They’ve changed our perceptions and expectations of privacy and they’ve also helped to shape

play01:59

our offline experience.

play02:00

In 2016 for instance, Russian agents organized political rallies all over the U.S. by creating

play02:05

fake Facebook pages for made-up grassroots communities that then had real offline rallies.

play02:12

Just by posing as organizers against Donald Trump or against Hillary Clinton, they actually

play02:16

got real people to show up in Florida, New York, North Carolina, Washington, and Texas.

play02:22

And those rally-goers didn’t know that it was a ruse.

play02:25

I find that scary.

play02:26

So today, for our big finale, we’re talking about the great white whale of navigating

play02:31

online information: your social media feed.

play02:33

INTRO

play02:43

So quick note here at the start.

play02:44

I’m not currently using a bunch of social media platforms.

play02:47

Which may mean that I’m no longer an expert in them, but it’s only been six weeks and

play02:50

I don’t think anything has changed that much.

play02:52

Also, it turns out that whether or not you participate in Twitter is irrelevant to whether

play02:57

Twitter affects your life, because what’s shared online has offline consequences.

play03:02

Like online shouting matches about politics can influence how we vote and also how we

play03:06

talk to our extended family at the Thanksgiving dinner table.

play03:09

Unless you don’t live in the US or Canada in which case I guess you don’t have Thanksgiving

play03:13

and presumably you never fight with your aunts and uncles about politics.

play03:16

The way we interact in social media is shaping all of our offline behaviors, from how we

play03:21

engage with IRL communities to how we consume goods and services.

play03:25

That’s why there are so many people you don’t know, and companies and organizations

play03:29

using social media to try to influence your thoughts and actions.

play03:33

Sometimes those who want to influence you use false identities like those with the Russian

play03:37

rallies.

play03:37

Sometimes, and more overtly, they buy your attention with advertising.

play03:41

Some just create really engaging videos about a kitten saved during a hurricane to steal

play03:46

your attention.

play03:47

Some of these actors have relatively benign goals and act fairly, like a company sending

play03:51

ads into your feed for a Harry Potter mug that it turns out you actually want because

play03:55

you are a Hufflepuff and you are proud!

play03:57

But others have terrible motives and spread disinformation, like hoax news sites which

play04:01

are all run by Slytherins.

play04:02

Still others aren’t quite in either camp.

play04:04

They might unwittingly spread inaccurate information, or misinformation.

play04:08

Like your aunt who always posts about Onion articles like they’re actual news.

play04:12

Or me, on the several occasions when I have failed to pause and laterally read before

play04:17

retweeting news that turned out to be false.

play04:19

The big problem with all of that is that 68% of U.S. adults get news through some form

play04:25

of social media and nearly half of U.S. adults get news through Facebook.

play04:30

And across the globe, people between 18 and 29 years old are more likely to get their

play04:34

news from social media than older adults.

play04:36

When we’re this reliant on a media ecosystem full of pollution, we have to take responsibility

play04:42

for what we read, post and share and to do that we should fully understand how social

play04:47

media networks really function including the good stuff, and also the terrible stuff.

play04:52

First, the good side.

play04:53

For one thing, platforms like Facebook, Twitter and Instagram allow us to share information

play04:57

and thoughts without the help of traditional gatekeepers.

play05:00

Prior to social media it was really difficult to have your voice heard in a large public

play05:06

forum.

play05:06

And because all the posts in our feeds look more or less equal social media has allowed

play05:10

people to have voices in public discourse who previously would have been silenced by

play05:14

power structures.

play05:15

That’s great!

play05:16

All tweets were created equal and everybody’s faces look weird with that one square-jawed

play05:21

snapchat filter and we’re all in this together!

play05:24

Also, social media is great for making friends and finding communities.

play05:28

We can organize ourselves into these little affinity groups around special interests or

play05:33

organization, which makes communication much easier than it was before.

play05:37

Like for example, what if a group of people want to get together and figure out how

play05:41

to decrease the overall worldwide level of suck.

play05:44

Or, when I need to know what is eating my tomatoes, I can go to a gardening facebook

play05:48

group.

play05:48

That example by the way is for old people alienated by my previous mention of snapchat

play05:52

filters.

play05:52

That said there are plenty of problems with social media from cyberbullying to catfishing

play05:58

to scams to massive disinformation campaigns to people live tweeting shows you wanted to

play06:03

watch later.

play06:04

And if you’re going to live partly inside these feeds I think it’s really important

play06:08

to understand both the kinds of information that are likely to be shared with you and

play06:12

the kinds of information you’re incentivised to share.

play06:16

Let’s start with targeted advertising.

play06:18

So you’re probably seeing an ad in this corner... possibly this one.

play06:22

I don’t have a great sense of direction when I’m inside the feed.

play06:25

Or maybe you watched an ad before this video played.

play06:28

Regardless, you may have noticed that something you searched for recently has been advertised

play06:32

to you.

play06:33

Like for instance I’m trying to improve my collection of vintage cameras for the background

play06:37

and suddenly all I see are advertisements for vintage cameras.

play06:41

Social media companies make money by selling advertisements.

play06:44

That’s why you get to use those platforms for free.

play06:47

But these ads are very different from billboards or ads in a local newspaper, because these

play06:52

ads were crafted just for you, or people like you, based on what social media companies

play06:58

know about you.

play07:00

And they know a lot.

play07:01

They can learn your interests and habits based on how you use their app, but they also track

play07:05

you elsewhere -- via other apps associated with that company, or by using geolocation

play07:10

features to figure out where you physically are.

play07:13

Social media companies take all that information and present it to advertisers in one form

play07:18

or another so that those advertisers can target their ads based on your interests and browsing

play07:23

history and location and age and gender and much more.

play07:27

Can you protect your privacy and your feeds from targeted advertising?

play07:31

Kind of.

play07:32

Sometimes.

play07:33

You can check your favorite apps and disable data and location tracking where you can -- these

play07:37

features may fall under Ad Preferences or Security or Privacy settings.

play07:41

Another potential downside to social media: how algorithms organize our feeds.

play07:46

So algorithms are sets of rules or operations a computer follows to complete a task.

play07:51

To put it very simply: social media sites use what they know about your habits, they

play07:56

combine that with their knowledge of other people and the things you’ve self-selected

play08:01

to follow, and funnel all that information through an algorithm.

play08:05

And then the algorithm decides what to show you in your newsfeed.

play08:09

Generally speaking, a newsfeed algorithm looks for what you’re most likely to engage with,

play08:14

by liking or sharing it.

play08:16

Social media companies want you to stay engaged with their app or site for as long as possible.

play08:21

So they show you stuff that you like so you won’t leave so that they can sell more of

play08:27

your attention.

play08:28

And because the algorithms mostly show us things we are likely to like and agree with

play08:33

we often find ourselves in so-called filter bubbles, surrounded by voices we already know

play08:38

we agree with, and often unable to hear from those we don’t.

play08:42

This also means that most newsfeed algorithms are skewed toward engagement rather than truth.

play08:48

This is so often the case in fact that entire businesses have been successfully run on posting

play08:52

engaging, but false, news stories.

play08:55

Many newsfeed algorithms favor outrageous and emotional content, so companies looking

play09:00

to make money from clicks and advertisements can use that to their advantage.

play09:04

Hundreds of websites were built on false viral stories leading up to the 2016 U.S. election,

play09:10

and Buzzfeed later found out many were run by teenagers in Macedonia.

play09:14

Valuing engagement over quality makes it harder for users to distinguish between truth and

play09:20

fiction.

play09:21

Like humans tend to interpret information in a way that matches our pre-existing beliefs.

play09:26

That’s called confirmation bias.

play09:28

But even if you did somehow manage to be completely emotionally and ideologically neutral on a

play09:34

topic.

play09:34

Research has shown that if there’s information you know is bogus, encountering it again and

play09:40

again means you might start to believe it.

play09:43

Warding off the negative effects of algorithmic newsfeeds and filter bubbles is really hard.

play09:48

But I do think you can limit these effects by A) following people and pages that have

play09:52

different viewpoints and perspectives than you do, to add some variety to your feed.

play09:57

And B)

play09:58

looking for ways to turn off the “best” or “top” posts features in your favorite

play10:02

social apps so that they display information to you in a more neutral way.

play10:06

All of these negative features of social media combine to create the feature that I personally

play10:11

worry about the most: extreme recommendation engines.

play10:15

Social media algorithms show you more of what you’ve already indicated you like.

play10:19

The way we use those apps tends to keep us surrounded by information we’re primed to

play10:24

believe and agree with.

play10:25

And because engagement is the most important thing, and we tend to engage with what most

play10:29

outrages, angers, and shocks us.

play10:32

The longer we hang out on some social media apps and engage with outrageous content the

play10:38

more likely those apps are to push outrageous content to us.

play10:43

Researchers have found that YouTube’s recommendation algorithms, for instance, consistently showed

play10:47

users more and more extreme, far-right channels once they began watching political videos.

play10:53

They called it a radical rabbit hole.

play10:55

YouTube was lumping together outlets like Fox News and the channels of Republican politicians

play11:00

with those of known far-right conspiracy theorists and white nationalists.

play11:04

They also found that far-left channels have smaller followings and were not nearly as

play11:08

visible via those same pathways.

play11:10

Now beginning in 2017, YouTube started to update its algorithm to prioritize what they

play11:15

call “authoritativeness."

play11:17

In part to try to stop this from happening.

play11:19

But as previously noted, no algorithm is perfect or objective.

play11:23

Ultimately, it’s on us as users not to fall down these rabbit holes, not to go merely

play11:29

where the light is on.

play11:31

That’s why I think it’s so important to follow accounts with differing viewpoints

play11:34

and to turn off data tracking if you can, and in general to try to unwind the algorithmic

play11:39

web around your social media life.

play11:42

And while you’re in the feed it’s important to remember to read laterally about sources

play11:46

you don’t recognize.

play11:47

And also take a break once in a while.

play11:50

Talk to actual people.

play11:52

Get some fresh air.

play11:53

I really think that’s valuable.

play11:55

But even though I personally had to leave lots of the social Internet I do believe that

play11:59

social media can be an effective way to learn about news and other information--if you’re

play12:04

able to protect yourself.

play12:05

Let’s try this in the Filter Bubble.

play12:07

Oh yeah, that looks about right.

play12:12

Yes, surrounded by everything I love and believe in.

play12:16

Okay, that’s enough, let’s go to the Thought Bubble.

play12:18

Okay, so your cousin DMed you a link headlined: Singing Creek Park Sold, Will Be Home to Monster

play12:23

Truck Rally.

play12:24

Wow.

play12:25

That is your favorite park, so that is a huge bummer.

play12:28

Your first instinct, of course, is to repost it with an angry comment like “UGH we need

play12:32

nature WTH this is so unfair.”

play12:35

But wait, no.

play12:36

Take a deep breath and think.

play12:37

Your cousin is kind of a big deal -- he’s Blue-check verified and everything.

play12:42

But blue checkmarks and verified profiles do not denote truth.

play12:46

They just mean an account itself is who they claim to be.

play12:49

So you click the link.

play12:50

It’s from a site called localnews.co, which you’ve never heard of.

play12:55

And this is where your lateral reading kicks in.

play12:57

Use a search engine to look up the name of that site.

play12:59

Its Wikipedia entry reveals it’s a recently founded independent news site for your area,

play13:04

but it’s a very short Wikipedia article - not many reputable sources have written

play13:08

about the site to give us a better idea of its perspective or authority.

play13:11

So you search for their claim instead: singing creek park sale.

play13:16

The first result is that sketchy Local News site.

play13:18

Let’s peruse the entire page.

play13:20

Ah, there you go -- the seventh result is from a website you do know and trust, your

play13:25

local TV station and they say the park was sold, but it’s actually going to be turned

play13:29

into a nonprofit wildflower preserve.

play13:32

Which you know what sounds pretty lovely.

play13:34

You could leave it at that.

play13:35

But as a good citizen of the internet, you should correct this misinformation.

play13:39

Tell your cousin what’s up, they won’t at all be defensive,

play13:41

ask them not to share it, and then post the trustworthy article yourself.

play13:45

With the headline, “Condolences to monster truck enthusiasts.”

play13:49

Mission accomplished.

play13:50

Thanks, Thought Bubble.

play13:51

So during this series we’ve talked a lot about using lateral reading to check the source,

play13:56

look for authority and perspective, and then check the claim and its evidence.

play14:00

With social media, a more flexible approach is probably best.

play14:04

Like sometimes it makes sense to find out who’s behind the account you’re seeing.

play14:08

Sometimes you should investigate the source of what they’re sharing.

play14:11

Other times it’s best to evaluate the claim being made.

play14:15

As you practice you’ll develop a better idea of how to spend your time online.

play14:19

No matter where you begin, lateral reading will help you get the information you’re

play14:24

looking for.

play14:25

When in doubt about anything you encounter online you can challenge your source and your

play14:28

own assumptions and see what other people have to say.

play14:32

And there’s one last thing I’d add: Be suspicious of information that confirms your

play14:37

pre-existing worldview, especially stuff that confirms that people you believe to be evil

play14:43

or stupid are evil or stupid.

play14:46

Read laterally not only when it comes to stuff you don’t want to be true, but also when

play14:51

it comes to stuff you do want to be true.

play14:54

I know our current information environment can be frustrating.

play14:58

Believe me, I am frustrated by it.

play14:59

It is really difficult to know where to look for truth and accuracy, and I wish I could

play15:04

tell you there is one right way, one source you can always rely upon, but the truth is,

play15:11

anyone who tells you that is selling you an ideology or a product or both.

play15:16

But by making a habit of following up and following through, we can be expert navigators

play15:21

of digital information, and maybe even go to places where the lights are not on.

play15:27

Thanks so much for joining us for Crash Course: Navigating Digital Information.

play15:30

And thanks to the Poynter Institute and the Stanford History Education Group for making

play15:34

this series possible.

play15:36

MediaWise is supported by Google.

play15:38

If you’re interested in learning more about MediaWise and fact-checking, a good place

play15:42

to start is @mediawise on Instagram.

play15:45

Thanks again for watching.

play15:46

Good luck out there in the wild west.

play15:47

And as they say in my hometown, “don’t forget to be awesome."


Related Tags
Social Media, Information Analysis, Digital Media, Privacy, Advertising, Algorithms, Filter Bubble, Misinformation, Fact-Checking, Media Education