How We Can Protect Truth in the Age of Misinformation

Sinan Aral
28 May 2020 · 20:21

Summary

TL;DR: The video discusses the alarming spread of fake news and its impact on society, economies, and democracies. It recounts the 2013 AP hack that falsely reported an explosion at the White House, causing a stock market crash. It also details Russia's Internet Research Agency's misinformation campaigns during the 2016 US election. The script explores why false news spreads faster than the truth, driven by novelty and human psychology. It warns of the impending threat of synthetic media, powered by generative adversarial networks, which can create convincing fake videos and audio. The speaker calls for vigilance and reliance on trusted news sources to navigate the era of misinformation.

Takeaways

  • 🚨 A false tweet sent from the Associated Press's hacked account in 2013, claiming explosions at the White House, led to a stock market crash, demonstrating the impact of misinformation on financial markets.
  • 🌐 The Internet Research Agency, linked to the Kremlin, was involved in spreading misinformation on social media platforms to influence the 2016 US presidential election.
  • 📊 A study by Oxford University revealed that a third of social media content about the Swedish elections was fake or misinformation.
  • ⏰ Misinformation on social media can have severe consequences, such as delaying first responders in emergency situations and leading to loss of lives.
  • 📉 The spread of fake news is more rapid and extensive than true news, with false political news being particularly viral.
  • 🤖 Bots play a role in spreading misinformation, but they are not solely responsible for the differential diffusion of truth and falsity online.
  • 🧠 Humans are more likely to share novel information, which often includes false news, as it makes them appear knowledgeable and increases their perceived status.
  • 🎭 Synthetic media, powered by technologies like generative adversarial networks, is becoming increasingly convincing and poses a threat to the authenticity of visual and audio content.
  • 🛡️ Addressing the spread of misinformation requires a multifaceted approach, including labeling, incentives, regulation, transparency, and the development of algorithms to detect fake news.
  • 🌟 The speaker emphasizes the importance of defending the truth against misinformation through technology, policy, and individual responsibility.

Q & A

  • What was the impact of the false tweet by Syrian hackers on the Associated Press in 2013?

    -The false tweet about explosions at the White House and injuries to President Barack Obama led to a stock market crash, wiping out $140 billion in equity value in a single day.

  • Who was indicted by Robert Mueller for meddling in the 2016 U.S. presidential election?

    -Three Russian companies and 13 Russian individuals were indicted for a conspiracy to defraud the United States through social media manipulation.

  • What was the role of the Internet Research Agency during the 2016 U.S. presidential election?

    -The Internet Research Agency, a shadowy arm of the Kremlin, created and propagated fake news and misinformation on social media platforms to sow discord during the election.

  • How did the spread of fake news affect the Swedish elections according to an Oxford University study?

    -A third of all information spreading on social media about the Swedish elections was found to be fake or misinformation.

  • What are the potential consequences of misinformation during emergency situations like terrorist attacks or natural disasters?

    -Misinformation can lead to lost minutes and lives during emergency responses, as it can misguide first responders about the location of terrorists or trapped individuals.

  • How did the study published in Science in March 2018 analyze the spread of true and false news on Twitter?

    -The study analyzed verified true and false news stories on Twitter from 2006 to 2017, comparing their diffusion, speed, depth, and breadth of spread.

  • What was the 'novelty hypothesis' proposed to explain why false news spreads more widely on social media?

    -The 'novelty hypothesis' suggests that humans are drawn to and share novel information because it makes them seem knowledgeable and increases their perceived status.

  • What role do bots play in the spread of misinformation according to the speaker's research?

    -Bots accelerate the spread of both false and true news online, but they are not responsible for the differential diffusion of truth and falsity, as humans are the primary agents in spreading misinformation.

  • What are generative adversarial networks and how do they contribute to the rise of synthetic media?

    -Generative adversarial networks are machine learning models that consist of a generator creating synthetic media and a discriminator that tries to determine the authenticity of the media. This technology is contributing to the creation of very convincing fake videos and audio.

  • What are the potential dangers of synthetic media as described in the script?

    -Synthetic media can be used to create convincing fake videos and audio, potentially making it look like anyone is saying anything, which can be dangerous for trust in information and the truth.

  • What are the five potential paths to address the problem of misinformation as outlined in the script?

    -The five potential paths are labeling, incentives, regulation, transparency, and algorithms and machine learning technology to detect and dampen the spread of fake news.

  • Why is it important for humans to be involved in the technology designed to combat fake news?

    -Humans must be involved because any technological solution is underpinned by ethical and philosophical questions about truth and falsity, and who has the power to define them.

Outlines

00:00

🚨 The Impact of False News on Society and Economy

The paragraph discusses the significant impact of false news, exemplified by a 2013 tweet from the Associated Press' hacked Twitter account about explosions at the White House injuring President Obama. This false news quickly went viral, causing a panic reaction in financial markets that led to a loss of $140 billion in equity value. The narrative then shifts to the role of the Internet Research Agency in spreading misinformation during the 2016 U.S. presidential election, reaching millions through social media platforms. The paragraph concludes with examples of how misinformation can have severe consequences, including influencing elections and even leading to loss of lives in emergency situations.

05:02

📊 The Virality of False News vs. True News

This section delves into the study of the spread of false versus true news on Twitter, as published in the journal Science. It highlights how false news stories spread more rapidly and extensively than true ones, sometimes by an order of magnitude. The study controlled for various factors such as the number of followers, activity levels, and credibility of the users spreading the news. A key finding was that false news was 70% more likely to be retweeted than true news, despite being shared by users who were less influential and active on the platform. The paragraph introduces the 'novelty hypothesis,' suggesting that the novelty of information plays a significant role in its spread, as people are drawn to and share new and surprising information.
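As a companion to the information-theoretic novelty measures, the study also scored the sentiment of replies to true and false tweets against an emotion lexicon, finding more surprise and disgust in replies to false news. A toy sketch of lexicon-based emotion scoring; the mini-lexicon and example replies below are invented for illustration (real analyses use standard lexicons with thousands of entries):

```python
from collections import Counter

# Hypothetical mini emotion lexicon; real studies map thousands of
# words to basic emotions such as surprise, disgust, joy, and trust.
LEXICON = {
    "wow": "surprise", "unbelievable": "surprise", "what": "surprise",
    "gross": "disgust", "disgusting": "disgust",
    "finally": "anticipation", "hope": "anticipation",
    "great": "joy", "glad": "joy", "trust": "trust",
}

def emotion_profile(replies):
    """Fraction of lexicon hits per emotion across a set of replies."""
    hits = Counter()
    for reply in replies:
        for word in reply.lower().split():
            if word in LEXICON:
                hits[LEXICON[word]] += 1
    total = sum(hits.values()) or 1
    return {emotion: n / total for emotion, n in hits.items()}

false_replies = ["wow unbelievable", "what gross", "disgusting wow"]
true_replies = ["finally great news", "glad i trust this", "hope so"]
print(emotion_profile(false_replies))  # dominated by surprise, disgust
print(emotion_profile(true_replies))   # anticipation, joy, trust
```

Comparing the two profiles mimics the study's finding in miniature: the false-news replies skew toward surprise and disgust, the true-news replies toward anticipation, joy, and trust.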

10:04

🤖 The Role of Bots in Spreading Misinformation

The paragraph explores the role of bots in the dissemination of false news. It clarifies that while bots do accelerate the spread of false news, they also do the same for true news, implying that bots are not the primary reason for the differential spread of truth and falsehood. The speaker emphasizes that the responsibility for the spread of misinformation lies with human users. The paragraph also foreshadows the increasing challenges posed by emerging technologies that can create highly convincing synthetic media, which can further exacerbate the problem of misinformation.
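The bot analysis described above amounts to a with/without comparison: flag likely bot accounts, recompute the diffusion statistics with their retweets removed, and see whether the true/false gap changes. A minimal sketch with invented data and a hard-coded bot set standing in for a real bot-detection algorithm:

```python
def mean_cascade_size(cascades, exclude=frozenset()):
    """Mean number of retweeters per cascade, optionally excluding
    a set of accounts (e.g. suspected bots)."""
    sizes = [len([u for u in users if u not in exclude]) for users in cascades]
    return sum(sizes) / len(sizes)

# Each cascade is the list of accounts that retweeted the story
# (hypothetical data; "bot1"/"bot2" are the flagged accounts).
false_news = [["a", "bot1", "b", "bot2"], ["c", "bot1"]]
true_news = [["d", "bot2"], ["e", "f", "bot1"]]
bots = frozenset({"bot1", "bot2"})

for label, cascades in [("false", false_news), ("true", true_news)]:
    with_bots = mean_cascade_size(cascades)
    without = mean_cascade_size(cascades, exclude=bots)
    print(label, with_bots, without)
```

In the study itself, removing bots shrank the spread of both true and false news by similar amounts, leaving the differential diffusion of truth and falsity intact.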

15:06

🎭 The Threat of Synthetic Media and Deepfakes

This section warns of the impending threat of synthetic media, including deepfakes—highly realistic fake videos and audio generated using machine learning models like generative adversarial networks (GANs). The speaker explains how these technologies can be used to create convincing fake content, making it difficult to distinguish between real and fake information. Examples are given of how such technology could be misused to create damaging fake statements by public figures. The paragraph underscores the need for vigilance and reliance on trusted news sources in the face of such challenges.

20:07

🛡 Combating Misinformation: Potential Solutions

The final paragraph discusses potential strategies to combat the spread of misinformation. It suggests labeling information with credibility indicators, adjusting economic incentives to discourage the spread of false news, implementing regulations that balance transparency with privacy, and demanding transparency from social media platforms about their algorithms. The speaker also mentions the need for algorithms and machine learning to help identify and mitigate the spread of fake news. However, it emphasizes that technology is not a panacea and that ethical and philosophical considerations are crucial in defining truth and managing the flow of information.

Keywords

💡Fake News

Fake news refers to false information or propaganda that is presented as genuine news. In the video, it is highlighted as a significant issue that can disrupt societies, influence elections, and even cause economic turmoil, as exemplified by the false tweet about explosions at the White House causing a stock market crash.

💡Misinformation

Misinformation is the communication of false or inaccurate information, often unintentionally. The video discusses how misinformation can have severe consequences, such as causing panic during emergencies or contributing to violence, as seen in the examples of the Rohingya in Burma and mob killings in India.

💡Viral

Viral in the context of the video refers to the rapid and extensive spread of information, particularly through social media platforms. The term is used to describe how quickly false news can propagate, as illustrated by the tweet about Barack Obama being injured that was retweeted thousands of times in minutes.

💡Internet Research Agency

The Internet Research Agency is a Russian organization mentioned in the video, known for creating and spreading fake news and misinformation on social media platforms to influence public opinion and political outcomes, such as during the 2016 U.S. presidential election.

💡Novelty Hypothesis

The Novelty Hypothesis in the video suggests that people are more likely to share information that is new or surprising. This concept is used to explain why fake news, which is often more novel, spreads more quickly than true news. The video provides evidence that false news tweets were found to be more novel than true news tweets.
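One way to operationalize novelty, in the spirit of the study's information-theoretic measures, is to compare an incoming tweet's word distribution against the corpus a user saw in the prior 60 days. The sketch below uses a smoothed KL divergence over unigrams; the actual paper used distances over topic-model distributions, so treat this as an illustrative stand-in, with invented example tweets:

```python
import math
from collections import Counter

def unigram_dist(words, vocab, alpha=1.0):
    """Laplace-smoothed unigram probability distribution over vocab."""
    counts = Counter(words)
    total = len(words) + alpha * len(vocab)
    return {w: (counts[w] + alpha) / total for w in vocab}

def kl_novelty(tweet, corpus):
    """KL(tweet || corpus): higher means the incoming tweet is more
    novel relative to what the user has already seen."""
    vocab = set(tweet) | set(corpus)
    p = unigram_dist(tweet, vocab)
    q = unigram_dist(corpus, vocab)
    return sum(p[w] * math.log(p[w] / q[w]) for w in vocab)

seen = "markets calm today white house briefing routine".split()
repeat = "white house briefing routine".split()
novel = "explosions reported obama injured".split()
print(kl_novelty(novel, seen) > kl_novelty(repeat, seen))  # True
```

A tweet full of words the user has never seen lands far from the prior corpus distribution, so it scores as more novel than a tweet that rehashes familiar vocabulary.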

💡Information Cascade

An information cascade in the video refers to the pattern of information spread on social media, where a piece of news, true or false, is shared rapidly among users. The video describes how false news cascades can grow exponentially, leading to widespread dissemination of misinformation.
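The cascade properties the study measured (size, depth, and breadth) can be computed directly from a retweet tree. A minimal sketch, assuming each retweet records which earlier tweet it was retweeted from (the data layout here is hypothetical):

```python
from collections import defaultdict, deque

def cascade_metrics(edges, root):
    """Compute size, depth, and max breadth of a retweet cascade.

    edges: list of (parent_tweet, child_retweet) pairs; root is the
    original tweet. Depth is the longest retweet chain; breadth is
    the largest number of retweets at any single depth level.
    """
    children = defaultdict(list)
    for parent, child in edges:
        children[parent].append(child)

    size, depth = 1, 0
    breadth_at = defaultdict(int)
    breadth_at[0] = 1
    queue = deque([(root, 0)])
    while queue:
        node, d = queue.popleft()
        for c in children[node]:
            size += 1
            depth = max(depth, d + 1)
            breadth_at[d + 1] += 1
            queue.append((c, d + 1))
    return size, depth, max(breadth_at.values())

# A starburst with one long tendril: root retweeted by A, B, C,
# then C -> D -> E forms the jellyfish tendril.
edges = [("root", "A"), ("root", "B"), ("root", "C"), ("C", "D"), ("D", "E")]
print(cascade_metrics(edges, "root"))  # (6, 3, 3)
```

The starburst shows up as breadth (three retweets at level one), while the tendril shows up as depth (a chain three retweets long).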

💡Synthetic Media

Synthetic media, as discussed in the video, involves the creation of fake audio and video content that appears real. The video warns about the rise of technologies like generative adversarial networks that can produce highly convincing synthetic media, which poses a threat to the authenticity of information.

💡Generative Adversarial Networks (GANs)

GANs are a type of machine learning model mentioned in the video that consists of two parts: a generator that creates fake media and a discriminator that tries to distinguish between real and fake media. The video highlights how GANs can be used to create increasingly convincing synthetic media, contributing to the spread of fake news.
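The generator/discriminator loop can be shown end to end if both models are shrunk to single affine functions on 1-D data. This is a deliberately tiny sketch of the adversarial training idea, not a real GAN: the "media" is just a number drawn from a Gaussian, each "network" has two parameters, and the learning rate and step count are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda s: 1.0 / (1.0 + np.exp(-s))

# Discriminator D(x) = sigmoid(w*x + c): guesses "real or fake?"
# Generator  G(z) = a*z + b: turns noise into synthetic samples.
w, c = 0.1, 0.0
a, b = 1.0, 0.0
lr = 0.02

for step in range(2000):
    real = rng.normal(4.0, 1.0, 32)   # "authentic media" samples
    z = rng.normal(0.0, 1.0, 32)
    fake = a * z + b                  # synthetic samples

    # Discriminator step: push D(real) toward 1, D(fake) toward 0.
    dr, df = sigmoid(w * real + c), sigmoid(w * fake + c)
    grad_real, grad_fake = dr - 1.0, df     # d(loss)/d(score)
    w -= lr * np.mean(grad_real * real + grad_fake * fake)
    c -= lr * np.mean(grad_real + grad_fake)

    # Generator step: change a, b so D mistakes fake samples for real.
    df = sigmoid(w * fake + c)
    grad_s = (df - 1.0) * w                 # chain rule through D
    a -= lr * np.mean(grad_s * z)
    b -= lr * np.mean(grad_s)

fake = a * rng.normal(0.0, 1.0, 1000) + b
print(round(float(np.mean(fake)), 1))  # generator output drifts toward 4
```

Each generator step climbs the gradient that makes the discriminator call fake samples real, which is exactly the "hyper-loop" the talk describes: the generator's parameters drift from noise around 0 toward the real-data region around 4.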

💡Economic Incentive

The video discusses the economic incentive as a driving force behind the creation and spread of fake news. It explains how the viral nature of false information can lead to more advertising revenue, as it attracts more viewers and engagement, thus motivating the production of fake news.

💡Transparency

Transparency in the video refers to the need for social media platforms to be open about their algorithms and data practices to understand and potentially mitigate the spread of misinformation. The video points out the challenges of achieving transparency without compromising data security and user privacy.

💡Algorithms

Algorithms in the context of the video are the processes used by social media platforms to curate and display content to users. The video suggests that understanding and potentially modifying these algorithms could be a way to control the spread of fake news, but it also acknowledges the complexity of balancing this with ethical considerations.
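As a concrete, deliberately simplistic instance of the algorithmic-detection idea, here is a tiny Naive Bayes text classifier. The training examples are invented, and real systems combine content signals with network and source features, so this is an illustration of the mechanism only:

```python
import math
from collections import Counter

def train(docs):
    """docs: list of (label, text) pairs. Returns, per label, the
    log-prior, Laplace-smoothed word log-probabilities, and an
    unseen-word penalty."""
    word_counts, label_counts = {}, Counter()
    for label, text in docs:
        label_counts[label] += 1
        word_counts.setdefault(label, Counter()).update(text.split())
    vocab = {w for c in word_counts.values() for w in c}
    model = {}
    for label, counts in word_counts.items():
        total = sum(counts.values()) + len(vocab)
        model[label] = (
            math.log(label_counts[label] / len(docs)),
            {w: math.log((counts[w] + 1) / total) for w in vocab},
            math.log(1 / total),
        )
    return model

def classify(model, text):
    def score(label):
        prior, word_logp, unseen = model[label]
        return prior + sum(word_logp.get(w, unseen) for w in text.split())
    return max(model, key=score)

docs = [
    ("false", "shocking secret cure doctors hate"),
    ("false", "you will not believe this shocking hoax"),
    ("true", "officials confirm report after independent review"),
    ("true", "study published after peer review confirms report"),
]
model = train(docs)
print(classify(model, "shocking secret hoax"))         # false
print(classify(model, "independent review confirms"))  # true
```

Even this toy model shows why detection alone cannot settle the problem: someone still has to supply the labeled "true" and "false" training examples, which is the ethical question the video raises about who gets to define truth.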

Highlights

On April 23rd, 2013, a false tweet by Syrian hackers caused a stock market crash, demonstrating the impact of misinformation.

The Internet Research Agency, a Kremlin-linked organization, was involved in spreading misinformation during the 2016 U.S. presidential election.

Misinformation campaigns can have severe real-world consequences, including influencing elections and even inciting violence.

A study by Oxford University revealed that a third of social media information during the Swedish elections was fake.

False news spreads more rapidly and widely than true news, with political news being particularly viral.

Contrary to expectations, those who spread false news on Twitter tend to have fewer followers and are less active, not more.

The novelty hypothesis suggests that people are more likely to share surprising and new information, which often includes false news.

False news often elicits more surprise and disgust in reactions, indicating its novelty and impact on audiences.

Bots accelerate the spread of both true and false news, but they do not account for the differential diffusion of truth and falsity.

Generative adversarial networks and AI democratization are enabling the creation of convincing fake videos and audio.

The rise of synthetic media poses a threat to the ability to discern truth from falsehood, potentially leading to a 'post-truth' era.

Labeling information with credibility and source transparency could be a way to combat the spread of misinformation.

Economic incentives play a role in the spread of false news, and addressing this could reduce its prevalence.

Regulation of social media platforms could help, but it also carries risks of suppressing minority opinions in authoritarian regimes.

Transparency in algorithms is needed to understand their impact on society, but it conflicts with privacy and security concerns.

Algorithms and machine learning can help identify and limit the spread of fake news, but ethical considerations are crucial.

Defending the truth against misinformation requires vigilance, technological solutions, policy changes, and individual responsibility.

Transcripts

00:11

On April 23rd of 2013, the Associated Press put out the following tweet. It said: "Breaking news: two explosions at the White House and Barack Obama has been injured." This tweet was retweeted more than four thousand times in less than five minutes, and it went viral immediately thereafter. But this tweet was not real news. It was false news, propagated by Syrian hackers who had infiltrated the AP Twitter handle. Their purpose was to disrupt society, but they disrupted much more, because automated trading algorithms immediately seized on the sentiment of this tweet and began trading on the potential that the President of the United States had been injured or killed in this explosion. As they started trading, they sent the stock market crashing, wiping out 140 billion dollars in equity value in a single day.

01:16

Earlier this year, Robert Mueller, special counsel prosecutor in the United States, issued indictments against three Russian companies and thirteen Russian individuals for a conspiracy to defraud the United States by meddling in the 2016 presidential election. What this indictment tells is the story of the Internet Research Agency, the shadowy arm of the Kremlin on social media, housed in a nondescript building in St. Petersburg, with four stories dedicated to the creation and propagation of fake news and misinformation on Twitter, Facebook, and all other social media platforms. During the presidential election alone, the Internet Research Agency's efforts reached 126 million people on Facebook in the United States, issued three million individual tweets, and produced forty-three hours' worth of YouTube content, all of it fake misinformation designed to sow discord in the U.S. presidential election.

02:28

A recent study by Oxford University showed that in the recent Swedish elections, a full one-third of all of the information spreading on social media about the election was fake or misinformation. And these types of misinformation don't just affect economies and democracies. When first responders are responding to a terrorist attack or a natural disaster, misinformation spreading about where the terrorists are, or which building people are trapped in, can mean minutes lost, and therefore lives lost. In addition, these types of social media misinformation campaigns can spread what has been called genocidal propaganda, for instance against the Rohingya in Burma, or, recently on WhatsApp, triggering mob killings in India. As the quote at the bottom of the page says: "Fake news is blamed for influencing elections in the West, but in India it's killing people."

03:33

We studied fake news, and began studying it before it was a popular term, and we recently published the largest-ever longitudinal study of the spread of fake news online, on the cover of Science in March of this year. We studied all of the verified true and false news stories that ever spread on Twitter, from its inception in 2006 to the present day (2017, in fact). The stories were verified by six independent fact-checking organizations, so we knew which stories were true and which were false. This is an example of the type of information we have: in red, you see the rise and fall of false news stories over time; in green, true news stories over time; and in yellow, an insidious category that we called mixed news, which contained information that was partially true and partially false, some of the most difficult to root out and some of the most difficult for people to discern.

04:42

This is a graph of the false political news that was spreading during this period. You see it rise and fall, you see spikes of false news during the U.S. presidential elections on Twitter, and you see one massive spike of mixed information during the annexation of Crimea. It begins and ends during the one-and-a-half-month period in which Crimea was annexed. I tell the story of the Crimean annexation and the role of misinformation and fake news in my upcoming book, The Hype Machine; I will save that story for you in the book.

05:20

This is what these cascades look like. These are cascades of true and false information on Twitter. The larger red cascade is a false news tweet; the green one, a true news tweet. Each begins with a starburst pattern of retweets, and then you see these tendrils, like jellyfish, emanating from the starburst: one person retweeting another, and another, and another. These types of structures have mathematical properties: we can measure their diffusion, the speed of their diffusion, the depth and breadth of their diffusion, how many people become entangled in the information cascade, and so on. What we did in this paper was compare the spread of true news to the spread of false news, and here's what we found: false news diffused further, faster, deeper, and more broadly than the truth in every category of information that we studied, sometimes by an order of magnitude. In fact, false political news was the most viral; it diffused further, faster, deeper, and more broadly than any other type of false news.

06:31

When we saw these results, we were at once worried but also curious. Why? Why does false news travel so much further, faster, deeper, and more broadly than the truth? The first hypothesis we came up with was: well, maybe people who spread false news have more followers, or follow more people, or tweet more often; or maybe they're more often verified users of Twitter with more credibility; or maybe they've been on Twitter longer. So we checked each of these in turn, and what we found was exactly the opposite. False news spreaders had fewer followers, followed fewer people, were less active, were less often verified, and had been on Twitter for a shorter period of time. And yet false news was 70% more likely to be retweeted than the truth, controlling for all of these and many other factors.

07:23

So we had to come up with other explanations, and we devised what we called the novelty hypothesis. If you read the literature, it is well known that human attention is drawn to novelty, to things that are new in the environment. And if you read the sociology literature, you know that we like to share novel information, because it makes us seem like we're in the know, like we have access to inside information, and we gain status by spreading this kind of information. So what we did was measure the novelty of an incoming true or false tweet compared to the corpus of what that individual had seen in the 60 days prior on Twitter. We used information-theoretic measures of the information content in these true or false tweets, compared to all of the information the individual had seen in the 60 days prior to the incoming tweet. We measured information novelty across three distinct measures, and across all of these measures, false news was much more novel than the truth.

08:33

But that wasn't enough, because we thought to ourselves: well, maybe false news is more novel in an information-theoretic sense, but maybe people don't perceive it as more novel. So to understand people's perceptions of false news, we looked at the information and the sentiment contained in the replies to true and false tweets. What we found was that, across a number of different measures of sentiment (surprise, disgust, fear, sadness, anticipation, joy, and trust), false news exhibited significantly more surprise and disgust in the replies to false tweets, and true news exhibited significantly more anticipation, joy, and trust in the replies to true tweets. The surprise corroborates our novelty hypothesis: this is new and surprising, and so we're more likely to share it.

09:32

At the same time, there was congressional testimony in front of both houses of Congress in the United States looking at the role of bots in the spread of misinformation. So we looked at this too. We used multiple sophisticated bot-detection algorithms to find the bots in our data and to pull them out. We pulled them out, we put them back in, and we compared what happens to our measurements when we remove the bots and when we put them back in. What we found was that, yes indeed, bots were accelerating the spread of false news online, but they were accelerating the spread of true news at approximately the same rate. Which means bots are not responsible for the differential diffusion of truth and falsity online. We can't abdicate that responsibility, because we humans are responsible for that spread.

10:26

Now, everything that I have told you so far, unfortunately for all of us, is the good news. The reason is that it's about to get a whole lot worse, and two specific technologies are going to make it worse. We are going to see the rise of a tremendous wave of synthetic media: fake video and fake audio that is very convincing to the human eye. This will be powered by two technologies. The first of these is known as generative adversarial networks. This is a machine learning model with two networks: a discriminator, whose job it is to determine whether something is true or false, and a generator, whose job it is to generate synthetic media. So the generator generates synthetic video or audio, and the discriminator tries to tell: is this real or is this fake? The generator then sees what the discriminator does and optimizes a function to generate more and more convincing video and audio. In fact, it is the job of the generator to maximize the likelihood that it will fool the discriminator into thinking the synthetic video and audio it is creating is actually real. Imagine a machine in a hyper-loop, trying to get better and better at fooling us.

11:55

This, combined with a second technology, which is essentially the democratization of artificial intelligence: the ability for anyone, without any background in artificial intelligence or machine learning, to deploy these kinds of algorithms to generate synthetic media, makes it ultimately so much easier to create videos like this one: "We're entering an era in which our enemies can make it look like anyone is saying anything at any point in time, even if they would never say those things. So, for instance, they could have me say things like, I don't know, 'Killmonger was right,' or 'Ben Carson is in the sunken place,' or how about this: 'President Trump is a total and complete...' Now, you see, I would never say these things, at least not in a public address, but someone else would. Someone like Jordan Peele. This is a dangerous time. Moving forward, we need to be more vigilant with what we trust from the internet. That's a time when we need to rely on trusted news sources. It may sound basic, but how we move forward in the age of information is going to be the difference between whether we survive or whether we become some kind of fucked-up dystopia. Thank you, and stay woke, bitches."

13:26

In fact, just recently, the White House issued a falsely doctored video of a journalist interacting with an intern who was trying to take his microphone. They removed frames from the video in order to make his actions seem more punchy, and when videographers and stunt men and women were interviewed about this type of technique, they said: yes, we use this in the movies all the time to make our punches and kicks look more choppy and more aggressive. They then put out this video and partly used it as justification to revoke the press pass of Jim Acosta, the reporter, from the White House, and CNN had to sue to have that press pass reinstated.

14:22

There are about five different paths that I can think of that we can follow to try to address some of these very difficult problems today. Each one of them has promise, but each one has its own challenges. The first is labeling. Think about it this way: when you go to the grocery store to buy food, it's extensively labeled. You know how many calories it has, how much fat it contains, how many trans fats it has, whether it was produced in a facility that processes wheat or peanuts, if you have an allergy. And yet when we consume information, we have no labels whatsoever. What is contained in this information? Is it true or false? Does this source typically put out true information or false information? Is the source credible? Where was this information gathered from? How many reporters worked on this story? What is the policy of this journal in terms of running with a fact: do they need two independent sources, or three? We have none of that information when we are consuming information. That is a potential avenue, but it comes with its challenges. For instance, who gets to decide, in society, what's true and what's false? Is it governments? Is it Facebook? Is it an independent consortium of fact-checkers? And who's checking the fact-checkers?

15:50

Another potential avenue is incentives. We know that during the U.S. presidential election there was a wave of misinformation that came from Macedonia that didn't have any political motive, but instead had an economic motive. This economic motive existed because false news travels so much farther, faster, and more deeply than the truth, and you can earn advertising dollars as you garner eyeballs and attention with this type of information. But if we can depress the spread of this information, perhaps it would reduce the economic incentive to produce it in the first place.

16:29

Third, we can think about regulation, and certainly we should think about this option. In the United States, we are currently exploring what might happen if Facebook and others are regulated. Recently in Europe, the GDPR went into effect at the end of May, instituting strict policies on privacy and the uses of data, and algorithmic transparency requirements. But this type of regulation, while we should consider things like regulating political speech, labeling the fact that it is political speech, and making sure foreign actors can't fund political speech, also has its own dangers. For instance, Malaysia just instituted a six-year prison sentence for anyone found spreading misinformation, and in authoritarian regimes these kinds of policies can be used to suppress minority opinions and ...

play17:26

to continue to extend repression the

play17:31

fourth possible option is transparency

play17:35

we want to know how do Facebook's

play17:38

algorithms work how does the data

play17:40

combined with the algorithms to produce

play17:42

the outcomes that we see we want them to

play17:46

open the kimono

play17:47

and show us exactly the inner workings

play17:50

of how Facebook is working and if we

play17:52

want to know social medias effect on

play17:53

society we need scientists researchers

play17:56

and others to have access to this kind

play17:58

of information but at the same time we

play18:01

are asking Facebook to lock everything

play18:04

down to keep all of the data secure to

play18:07

not give data to third parties like

play18:09

Cambridge University who then gave the

play18:12

data to Cambridge analytical that

play18:14

created that scandal so Facebook and the

play18:18

other social media platforms are facing

play18:20

what I call a transparency paradox we

play18:23

are asking them at the same time to be

play18:26

open and transparent and simultaneously

play18:29

secure this is a very difficult needle

play18:32

to thread but they will need to thread

play18:34

this needle if we are to achieve the

play18:36

promise of social technologies while

play18:39

avoiding their peril the final thing

play18:42

that we could think about is algorithms

play18:44

and machine learning technology devised

play18:47

to root out and understand fake news how

play18:50

it spreads and to try and dampen its

play18:52

flow humans have to be in the loop of

play18:55

this technology because we can never

play18:58

escape that underlying any technological

play19:01

solution or approach is a fundamental

play19:04

ethical and philosophical question about

play19:07

how do we define truth and falsity to

play19:10

whom do we give the power to define

play19:12

truth and falsity and which opinions are

play19:15

legitimate which type of speech should

play19:18

be allowed and so on technology is not a

play19:21

solution for that ethics and philosophy

play19:23

is a solution for that

play19:26

nearly every theory of human

play19:29

decision-making human cooperation and

play19:32

human coordination has some sense of the

play19:35

truth at its core but with the rise of

play19:38

fake news the rise of fake video the

play19:41

rise of fake audio we are teetering on

play19:44

the brink of the end of reality where we

play19:47

cannot tell what is real from what is

play19:50

fake and that's potentially incredibly

play19:53

dangerous we have to be vigilant in

play19:57

defending the truth against

play19:59

misinformation

play20:01

with our technologies with our policies

play20:04

and perhaps most importantly with our

play20:07

own individual responsibilities

play20:09

decisions behaviors and actions thank

play20:14

you very much

play20:15

[Applause]
