Artificial Intelligence and Social Media

Malcolm Stirling
26 Oct 2020 · 17:34

Summary

TL;DR: This transcript explores the influence of social media platforms, emphasizing how companies utilize notifications and artificial intelligence to keep users engaged and drive profit. It delves into the role of machine learning in shaping content recommendations, sentiment analysis, and micro-targeting, raising concerns about privacy, bias, and misinformation. The script also touches on the emotional impact of social media, especially on teens, and discusses potential risks, including cyberbullying and manipulated perceptions. It concludes with a call for better regulation and management of social media's rapidly evolving landscape to safeguard users.

Takeaways

  • 🐣 Social media companies constantly grab users' attention through notifications and emails that encourage content creation or site visits.
  • 🤖 AI, particularly machine learning and sentiment analysis, is used by social media platforms to interpret and categorize user-generated data based on emotions and opinions.
  • 😊 Emoticons add emotional context to messages and in effect help train AI engines, especially for humor and sarcasm, which machines find hard to interpret.
  • 📈 Recommendation engines on social media platforms use user data, like content interactions, to suggest targeted ads or posts based on personal preferences.
  • 📷 Facial recognition technology identifies and tags people in photos, contributing more data to social media algorithms about user behaviors and social connections.
  • 💼 LinkedIn's AI tools match users with job opportunities, while YouTube uses algorithms to recommend videos, leading users down 'rabbit holes' to increase engagement.
  • 😟 Social media's unregulated environment contributes to cyberbullying, teen anxiety, and the spread of misinformation, which can have serious real-world consequences.
  • 🧠 Social media triggers reward centers in teen brains, encouraging addictive behaviors, while also intensifying feelings of anxiety and exclusion.
  • 💻 Social media shapes our reality by filtering the information we see, reinforcing confirmation biases and sometimes manipulating perceptions for political or commercial gain.
  • 📊 Predictive analytics on social media use vast amounts of data to forecast user behaviors, decisions, and even election outcomes, potentially influencing opinions and beliefs.

Q & A

  • What are the two main types of notifications used by social media companies?

    -The two main types of notifications are: 1) Notifications encouraging users to create more content, such as reminders about sharing updates when engagement is low, and 2) Notifications that drive users to revisit the platform, such as alerts about likes, mentions, or events.

  • How do social media companies use machine learning to analyze user content?

    -Social media companies use machine learning, particularly sentiment analysis, which combines natural language processing (NLP) and machine learning to categorize user content as positive, negative, or neutral, helping AI tools understand user emotions and opinions.

  • What role do emoticons play in sentiment analysis on social media?

    -Emoticons help AI sentiment analysis tools understand the tone of messages, especially in cases of humor and sarcasm, which are difficult for machines to interpret accurately.

  • How do recommendation engines on social media platforms work?

    -Recommendation engines collect data on user interactions, such as posts liked, content reviewed, and user activity, to display targeted ads or relevant posts based on the user’s preferences and patterns.

  • What are the ethical concerns around facial recognition technology used by social media companies?

    -Facial recognition algorithms often perform worse for underrepresented groups, such as people with darker complexions. These biases stem from unequal representation in the data used during development, raising concerns about fairness and accuracy.

  • How does social media contribute to the spread of fake news?

    -Social media facilitates the rapid spread of false information because there is no verification system in place for self-published content. Studies show that fake news travels faster on platforms like Twitter, often due to its sensational nature.

  • What are some of the psychological effects of social media use on teenagers?

    -Social media can intensify feelings of depression and anxiety in teens. Factors include pressure to post appealing content, fear of missing out (FOMO) when seeing others' posts, and the stress of getting likes or positive feedback on their own posts.

  • How do social media platforms create echo chambers that influence user perception?

    -Social media algorithms often show users content that aligns with their views, reinforcing their opinions. This creates confirmation bias, where users primarily see posts that support their beliefs, contributing to polarized perspectives on controversial topics.

  • What is micro-targeting and how is it used in political campaigns?

    -Micro-targeting is a strategy that uses detailed data about users’ preferences, demographics, and behaviors to deliver personalized political ads. It is often criticized for its lack of transparency and potential for misuse in manipulating voter perceptions.

  • What concerns are there regarding AI's role in shaping the future of social media content?

    -There is concern that AI could be used to manipulate information, influencing user beliefs by curating the content they see. This could lead to AI controlling narratives, shaping public opinion in ways that are not transparent or democratic.

Outlines

00:00

📲 Social Media's Attention Economy

Social media platforms use notifications to grab our attention, either encouraging content creation or prompting visits to the platform. These notifications are designed to be vague, making users curious enough to click through, which often leads to increased exposure to advertisements. Social media companies leverage AI, particularly machine learning and sentiment analysis, to interpret user-generated content and emotions. Emoticons are a key factor in helping these systems understand human expressions like humor and sarcasm.

05:02

📊 AI and Social Media Recommendation Engines

Social media platforms employ machine learning to power recommendation engines, tracking user behaviors such as posts, clicks, and likes to serve relevant content or ads. Facial recognition technology also aids in identifying individuals in photos. LinkedIn, for example, uses AI to match job seekers with employers based on user data. YouTube's algorithms are adept at keeping users engaged, pushing content to lead users down 'rabbit holes.' These strategies generate profit for platforms like LinkedIn and YouTube while increasing user engagement.

10:02

🚨 Social Media's Influence on Mental Health and Behavior

Social media platforms have a significant impact on teen mental health, with studies showing links to increased depression and anxiety. The pressure to receive likes and positive feedback on posts can contribute to these issues. Social media also creates an addiction-like effect by constantly adapting to user preferences; as the video notes, only two groups call their customers 'users': technology companies and drug dealers, a comparison that underscores the deep psychological hold these platforms can have.

15:03

🔍 Social Media Misdirection and the AI Matrix

Just like magicians use misdirection to divert attention, social media distracts users with posts and content while subtly influencing their perceptions. AI filters the information we receive, shaping our worldview. With many relying on social media for news, the content tends to reinforce existing beliefs, creating a feedback loop of confirmation bias. Social media has unprecedented influence, as demonstrated by how fake news spreads faster than facts and how platforms like Facebook have been used for political manipulation and social discord.

💥 The Real-World Consequences of Social Media Misinformation

False information spread on social media has led to dangerous real-world consequences. One notable example is 'Pizzagate,' where a man acted on conspiracy theories by attacking a pizzeria. Similarly, in Myanmar, a false report on Facebook incited mob violence. These incidents highlight how social media algorithms can amplify harmful narratives. While platforms may not intentionally cause harm, their design allows misinformation to spread rapidly, with dire effects.

🛑 AI Bias and Misrepresentation in Social Media

AI tools used by social media companies can carry hidden biases, especially in areas like facial recognition, where dark-skinned individuals are often misrepresented due to inadequate training data. Public perception can also be skewed, as people consistently overestimate or underestimate certain demographic statistics. These biases affect broader societal behaviors and decisions, including voting patterns and consumer preferences.

🎯 Social Media and Election Manipulation

Social media's influence extends to political campaigns through micro-targeting, where user data is analyzed to deliver specific ads and messages. This covert form of advertising can sway public opinion without transparency. Platforms like Twitter have attempted to mitigate these issues by banning political ads and addressing the dangers of deepfakes, which are becoming more sophisticated and deceptive with advancements in AI.

📉 Predictive Analysis and the Future of Social Media Manipulation

At its deepest level, social media utilizes predictive analysis to forecast user behavior and preferences. This enables platforms to shape public opinion and even alter election outcomes by tailoring content. The concern is that these predictive tools might evolve to the point where they manipulate beliefs en masse. The rise of social media during COVID-19 has made it an even more powerful communication tool, but like early automobiles, which were eventually made safer through regulation, social media must also be better managed.
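
In practice, predictive analysis of this kind means fitting a model to historical behavioral data and then scoring new cases. The sketch below uses made-up engagement features and a logistic regression purely to illustrate the mechanism described above; it is not any platform's real predictive model.

```python
# A minimal predictive-analytics sketch: fit a model to historical engagement data,
# then score new users. All numbers are made up; this illustrates the mechanism only,
# not any platform's real predictive models.
from sklearn.linear_model import LogisticRegression

# Historical data: [hours on the platform per day, political posts liked per week]
X_history = [[0.5, 0], [1.0, 1], [2.0, 3], [3.5, 8], [4.0, 10], [0.2, 0]]
y_history = [0, 0, 1, 1, 1, 0]   # 1 = engaged with a past campaign message

model = LogisticRegression().fit(X_history, y_history)

# Score new users: the predicted probability that they respond to a similar message.
new_users = [[0.7, 1], [3.0, 6]]
for features, prob in zip(new_users, model.predict_proba(new_users)[:, 1]):
    print(f"user {features}: predicted engagement probability {prob:.2f}")
```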

Keywords

💡Social Media Notifications

Social media notifications are alerts sent to users, encouraging them to engage with the platform. These can include prompts to create content (e.g., 'You had no new likes this week') or to revisit the site ('Your post has been liked'). The video explains how these notifications are carefully designed to provoke user interest and draw them back into the platform, where they are exposed to advertisements.

💡Sentiment Analysis

Sentiment analysis is a branch of machine learning that processes and interprets user-generated content to categorize it as positive, negative, or neutral. The video highlights how social media platforms use this tool to understand emotions and opinions within posts, often aided by emoticons, which help algorithms grasp nuances like humor or sarcasm. This forms the foundation of how AI engines learn about users' feelings.
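
To make the idea concrete, here is a minimal sketch of sentiment classification, assuming a tiny hand-labelled training set and scikit-learn; it illustrates the positive/negative/neutral labelling described above, not any platform's production system. Note how the tokenizer keeps emoticons such as ":)" as features, echoing the point that they add emotional context.

```python
# A minimal sentiment-analysis sketch: classify short posts as positive, negative
# or neutral. The tiny hand-labelled training set and the scikit-learn pipeline are
# illustrative assumptions, not how any social media platform actually does this.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_posts = [
    "love this, great day :)", "best concert ever", "so happy with my new phone :)",
    "this is terrible :(", "worst service ever", "really disappointed today :(",
    "the meeting is at 3pm", "posted a new photo", "shared an article",
]
train_labels = ["positive"] * 3 + ["negative"] * 3 + ["neutral"] * 3

# The custom token_pattern keeps emoticons like ":)" and ":(" as features, since
# they carry emotional context that plain words (and sarcasm) often hide.
model = make_pipeline(
    TfidfVectorizer(token_pattern=r"[:;][()]|\b\w+\b"),
    LogisticRegression(max_iter=1000),
)
model.fit(train_posts, train_labels)

print(model.predict(["had an amazing weekend :)", "my flight got cancelled :("]))
```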

💡Recommendation Engines

Recommendation engines are AI-powered tools that collect and analyze users' behavior—such as the content they interact with—and predict what they might enjoy next. Social media platforms use these engines to suggest posts, advertisements, or videos that are relevant to users. The video points out that these systems are central to platforms like YouTube and Pinterest, creating personalized experiences that keep users engaged.
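
As an illustration of the underlying idea, the sketch below scores unseen posts for a user by looking at what similar users liked. The interaction matrix and the cosine-similarity approach are assumptions made for the example, not the actual algorithm of YouTube, Pinterest, or any other platform.

```python
# A minimal recommendation-engine sketch: suggest posts that similar users liked.
# The interaction matrix and cosine-similarity approach are illustrative only.
import numpy as np

# Rows = users, columns = posts; 1.0 means the user liked/clicked that post.
interactions = np.array([
    [1, 1, 0, 0, 1],   # user 0
    [1, 0, 0, 1, 1],   # user 1
    [0, 0, 1, 1, 0],   # user 2
], dtype=float)

def recommend(user: int, k: int = 2) -> list[int]:
    """Rank posts the user has not engaged with by the likes of similar users."""
    norms = np.linalg.norm(interactions, axis=1, keepdims=True)
    unit = interactions / np.where(norms == 0, 1, norms)
    similarity = unit @ unit[user]              # cosine similarity to every user
    similarity[user] = 0                        # ignore the user's own row
    scores = similarity @ interactions          # aggregate similar users' likes
    scores[interactions[user] > 0] = -np.inf    # never re-recommend seen posts
    return np.argsort(scores)[::-1][:k].tolist()

print(recommend(user=0))  # posts liked by users who behave like user 0
```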

💡Facial Recognition Technology

Facial recognition technology is used by platforms like Facebook to automatically identify and tag people in photos. The video explains how this technology works by feeding AI with more data about users' locations and social connections, which can then be used to refine advertising or content recommendations. It's an example of how personal data is collected and utilized without direct user involvement.

💡Artificial Intelligence (AI)

Artificial Intelligence (AI) refers to the computer systems that perform tasks traditionally requiring human intelligence, such as understanding language or recognizing images. In the context of the video, AI is the engine behind features like sentiment analysis, recommendation systems, and facial recognition, driving many of the user experiences on social media platforms and shaping the way content is delivered.

💡Cyberbullying

Cyberbullying is the use of digital platforms to harass or intimidate individuals, often with serious emotional consequences. The video discusses how social media platforms have become a breeding ground for this behavior, linking it to rising rates of teenage self-harm and suicide. It highlights the responsibility that social media companies bear in moderating harmful content, a task they often neglect due to profit motives.

💡Fake News

Fake news refers to the spread of misinformation or deliberately falsified stories via social media platforms. The video provides examples like Pizzagate, where false information spread rapidly and led to real-world consequences. It emphasizes that the algorithms driving content distribution often amplify sensationalized or false content because it engages users more effectively than the truth.

💡Machine Learning Bias

Machine learning bias occurs when AI systems make flawed decisions based on biased or incomplete training data. The video uses the example of facial recognition algorithms struggling with dark complexions due to underrepresentation in training data. It warns that AI systems can perpetuate and exacerbate existing societal biases if not carefully managed.
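
Disparities of this kind are typically surfaced by comparing error rates across groups. The sketch below uses invented evaluation results purely to show the shape of such an audit; it is not a measurement of any real facial recognition system.

```python
# A minimal bias-audit sketch: compare recognition accuracy across demographic groups.
# The group labels and outcomes below are invented; a real audit would evaluate an
# actual model on a benchmark with balanced representation of each group.
from collections import defaultdict

# (group, correctly_recognized) pairs that an evaluation run might produce
results = [
    ("lighter-skinned", True), ("lighter-skinned", True),
    ("lighter-skinned", True), ("lighter-skinned", False),
    ("darker-skinned", True), ("darker-skinned", False),
    ("darker-skinned", False), ("darker-skinned", False),
]

totals, correct = defaultdict(int), defaultdict(int)
for group, ok in results:
    totals[group] += 1
    correct[group] += ok

for group in totals:
    print(f"{group}: {correct[group] / totals[group]:.0%} accuracy on {totals[group]} faces")
# A persistent gap between groups is the kind of disparity that points back to
# underrepresentation in the training data.
```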

💡Microtargeting

Microtargeting is a marketing strategy that segments users into small, specific groups based on their data—such as interests, demographics, or behavior—and then delivers personalized content or ads. The video explains that social media platforms use microtargeting extensively, especially in political campaigns, to influence users' opinions and actions by tailoring messages to resonate with their specific preferences.
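
Mechanically, microtargeting starts by segmenting users from their data. The sketch below clusters hypothetical user feature vectors with k-means and pairs each segment with a different ad variant; the features, cluster count, and use of scikit-learn are illustrative assumptions, not any real campaign's method.

```python
# A minimal micro-targeting sketch: segment users into small groups from their data,
# then pair each segment with a tailored message. The features, cluster count and
# the use of k-means are illustrative assumptions, not any real campaign's method.
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical per-user features: [age, interest in topic A, interest in topic B]
users = np.array([
    [19, 0.9, 0.1],
    [22, 0.8, 0.2],
    [45, 0.2, 0.9],
    [51, 0.1, 0.8],
    [33, 0.5, 0.5],
])

segments = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(users)

for segment_id in np.unique(segments):
    members = np.where(segments == segment_id)[0].tolist()
    print(f"segment {segment_id}: users {members} -> tailored ad variant {segment_id}")
```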

💡Confirmation Bias

Confirmation bias is the tendency for people to favor information that confirms their pre-existing beliefs. The video explains that social media platforms often reinforce confirmation bias by showing users content that aligns with their views, especially in controversial areas like politics. This creates echo chambers where users are exposed only to perspectives they already agree with, reinforcing divisive or extreme viewpoints.

Highlights

Social media companies use notifications and emails to drive users to create more content or visit the site.

Artificial Intelligence and machine learning tools are used to review user data and understand sentiments through sentiment analysis.

Emoticons help AI better understand the nuances of human emotions, humor, and sarcasm, essentially training the AI engines.

Social media recommendation engines gather data on user interactions and preferences to suggest relevant posts and advertisements.

Facial recognition technology is commonly used on social media platforms, such as Facebook, to tag people in group photos.

LinkedIn uses AI to match users to potential employers by analyzing hiring patterns, skills, and job descriptions.

YouTube’s recommendation engine uses user behavior to push more content, often leading users down content 'rabbit holes' for increased engagement.

Studies link social media to increasing rates of cyberbullying, teen depression, anxiety, and even suicides.

Social media can create anxiety by fostering comparison culture, where users feel pressure to post and get likes or comments.

Machine learning algorithms, despite their complexity, may reinforce biases, as demonstrated by facial recognition technology's lower accuracy on dark-skinned individuals.

Misinformation spreads rapidly on social media; for example, fake news on Twitter spreads six times faster than the truth.

Pizzagate, a conspiracy theory, is an example where false information led to real-world violent actions.

Social media platforms are often accused of being complicit in amplifying false narratives due to algorithmic biases that promote engagement.

Micro-targeting in political campaigns leverages detailed personal data to craft covert and highly targeted advertisements.

Deep fakes, powered by AI, have raised concerns about the manipulation of audio and visual content, potentially deceiving large audiences.

Transcripts

00:08

Like chicks in a nest, social media companies clamor for our attention with their emails, notifications and beeps. These come in two varieties: notifications that encourage you to create more content ("you had no new likes this week", "share updates about yourself and friends by creating a new post or video") and notifications that encourage you to go and visit the site ("your post has been liked", "you were tagged in a photo", "you have been mentioned", birthday reminders, "there's an event happening near you", "someone viewed your profile", "here are tweets that might be of interest to you"). The notifications and emails never tell you the full story. They're designed to provoke your interest so that you follow the link to the social media site, and there you are subjected to more advertisements. And the tills ring: ka-ching.

01:08

At level two we're looking at artificial intelligence, and more specifically machine learning. These tools review the large amounts of data that we users post on social media sites, and they have the ability to decipher these data and gain an understanding of what is being said. Here we use a branch of machine learning called sentiment analysis, a process that uses natural language processing (NLP) and machine learning to pair social media data with predefined labels such as positive, negative or neutral. In this way machines can develop agents that learn to understand the opinions and emotions underlying the messages. We are asked to help in this process by adding the emoticons. These help the sentiment analysis tools better understand the nature of our messages, particularly in the areas of humor and sarcasm, which are concepts AI machines find difficult to understand. So by using emoticons we are in effect training the AI engines.

02:11

Having analyzed all the data that we the users have posted, what do the social media companies do with it? Well, at level two they do the following: they populate social media recommendation engines. The recommendation engines collect data on the content you engage with, from pinning a picture on Pinterest to commenting on an Instagram post, and then they display material that they anticipate you will enjoy. This material can be either targeted advertising or other people's posts that are perceived to be relevant. The recommendation engines collect information relating to the content you post, the content you review and click on, and the number of likes you put on certain posts, and in this way they can deduce your personality.

02:59

Facial recognition technology is used to identify people in photographs. You may have seen on Facebook that when you post a group photo, many of the faces, if not all of them, will be identified and automatically tagged. You can then enhance the knowledge base by clicking on the check-in icon, and in this way you can tell Facebook that all these people were in this location at this time. This then feeds the machine learning programs with more information on the preferences of the individuals.

03:31

LinkedIn has two facets. It is a social media platform, a Facebook for corporate workers, which uses its recommendation engine to suggest people that you might like to connect with, but it makes its money from recruiters seeking talent. As a user you have kindly updated your career history and skills, and LinkedIn uses an AI tool from the recently acquired Bright to perform intelligent matches for both employers and job seekers. It takes into account the user's hiring patterns, work experience and similarities in job descriptions. Ka-ching: money for LinkedIn.

04:06

YouTube is particularly adept at analyzing your usage patterns and making recommendations for what you should look at next. These are designed to tempt you down rabbit holes so that you continue watching videos, and in so doing you're subjected to more advertisements. Ka-ching again.

04:24

National laws run way behind the environments that technology companies are creating. Social media companies are commercial organizations and are driven by the pursuit of profit; therefore they are not incentivized to regulate themselves. This leads to, at best, a degree of negligence in reviewing the content that they publish. Unregulated self-published content is causing increasing concern. Imperial College London published a report in The Lancet outlining the harm caused to teenagers by cyberbullying. Teen suicides and self-harm have dramatically increased over recent years, and there is evidence that supports a link with social media. YouTube, Instagram and Snapchat are the most common social media platforms for teens. In a study by researchers at the UCLA Brain Mapping Center, they found that certain regions of teen brains became activated by likes on social media, sometimes causing them to want to use social media more. Researchers believe social media can be associated with the intensification of the symptoms of depression. The Harvard Graduate School of Education has posted a research story entitled "Social Media and Teen Anxiety", with anxiety being triggered by seeing people posting about events to which they haven't been invited, feeling pressure to post positive and attractive content about yourself, feeling pressure to get comments and likes on your posts, and having someone post things about you that you can't change or control. Just take a moment and think: do you recognize any of these behaviors in yourself? Can you control your addictions? One of the most compelling features of social media is that it responds to your choices and so adapts to suit your preferences. There are two groups of people that call their customers "users": one is the technology companies and the other is drug dealers.

06:32

Now let's move to level three, where we look at the less obvious behavior patterns facilitated by social media companies. Magicians are able to use misdirection because our brain automatically categorizes people's motions by interpreting their intentions. We see somebody push their spectacles up the bridge of their nose and assume that their glasses have slipped, but the magician uses this motion to hide something in their mouth. Similarly, whilst we're being distracted by the posts, photos and friendly surveys we partake in, we are being misdirected away from what is happening beneath the surface. The threat from artificial intelligence is not taking the shape of Terminators being managed by Skynet; it is much closer to The Matrix, where we are living in an artificially constructed reality. By this I mean that the view we have of the world is through our computers and mobile devices, and the information we receive is being filtered by AI engines. These are regulating the news articles that we see, which of our friends' posts we see, and the advertisements in our news feeds. A recent Pew survey found almost 60 percent of us regularly use social media for our news. But we are complicit in constructing our artificial world by creating a confirmation bias. By this I mean that when it comes to controversial topics, including politics, if you're like most people the majority of your friends and followers on social media probably share your outlook. This means that the vast majority of tweets, Facebook posts, pins or other content you read on these sites tend to express the same point of view as your own.

08:16

Here are some July 2020 figures: world population, 7.8 billion; unique mobile phone users, 5.2 billion; internet users, 4.6 billion; active social media users, 4 billion. Social media companies have unprecedented audiences. Facebook, YouTube and WhatsApp dominate the league table; Twitter clearly punches above its weight with a comparatively small 326 million. This creates a lot of influence. Any advertiser will tell you the most powerful form of advertising is word of mouth: we believe what our friends tell us. Social media provides a vehicle for forwarding and liking messages so that they become word-of-mouth endorsements. Because social media sites provide the vehicle for self-publication, there's no verification of the content. This leads to fake news, and we have a natural bias towards false and potentially more exciting information. In 2018 an MIT study showed that fake news on Twitter travels six times faster than the truth. Each person with marginal views can see that they're not alone, and when these people are introduced to each other on social media, they collaborate and share information.

09:39

Here are a couple of examples where social media postings have been interpreted as genuine news. Pizzagate: conspiracy theorists falsely claimed, in posts made on social media, that a number of emails linked high-ranking Democratic Party officials with an alleged human trafficking and child sex ring managed out of the Comet Ping Pong pizzeria in Washington DC. A man from North Carolina believed these stories to be true and went to the Comet Ping Pong armed with a rifle, which he fired inside the restaurant in the belief he was there to rescue children. The restaurant owner and staff received death threats from other conspiracy theorists. In Myanmar, previously known as Burma, on an evening in July 2014 a mob of hundreds of angry residents gathered around the Sun tea shop in the commercial hub of Mandalay, Myanmar's second largest city. The tea shop's Muslim owner had been falsely accused of misdeeds against a female Buddhist employee. The accusations against him, originally reported on a blog, exploded when they made their way to Facebook. Is Facebook complicit in these events? Well, certainly they allowed false accusations to be published, but to be fair, that's simply what they do and the way their algorithms work: their AI tools identified the nature of the posts and directed them to people showing similar interests, and so Facebook became the wind behind the fire.

11:12

You may think handing decisions over to a machine would remove any human bias. Overtly, tech companies are politically correct and do not display any segregation or discrimination. However, their machine learning algorithms do have a covert bias; it is what they're designed to do. They categorize people and make decisions based on these categorizations. An example is that social media facial recognition algorithms do not work well on people with dark complexions, and the reason is that this group were underrepresented in the machine learning stages of the algorithm's development.

11:53

Our perception doesn't always match reality. Here are some examples from surveys. People in Saudi Arabia were asked what proportion of the population they think is obese: respondents replied 25 percent; it's actually 75 percent. People in the UK were asked what percentage of the population they think is Muslim: they answered 24 percent; it's actually 5 percent. People in Japan were asked what percentage of the population live in rural areas: they answered 56 percent; it's actually 7 percent. Our perceptions affect our behavior, perhaps even our voting preferences. In 2010 Facebook conducted an experiment by randomly deploying a non-partisan "I voted" button into 61 million feeds during the US midterm elections. That simple action led to an additional 340,000 votes, a number that could swing an election. Facebook had shown itself to be an influencer.

13:06

The Chinese government is cited as being particularly prolific in the creation of false accounts and paying people to broadcast pro-government messages with the intention of changing perceptions. In September 2019 Twitter removed over 900 accounts it believed were established by the Chinese government, which were deliberately and specifically attempting to sow political discord. In May 2020 the BBC reported that hundreds of fake or hijacked social media accounts had been pushing pro-Chinese-government messages about the coronavirus pandemic onto Facebook, Twitter and YouTube. These were attributed to the Chinese regime's "50 Cent Army": hired professional trolls whose nickname suggests how much they get paid per post.

13:56

Micro-targeting is a marketing strategy that uses people's data (what they like, what they're connected to, what their demographics are) so that they can be segmented into small groups for content targeting. Leading up to elections, detailed micro-targeting has been used to target voters. The Institute of Practitioners in Advertising (the IPA) has called for a suspension of micro-targeted ads in political campaigns, due to the covert, secret nature of these ads, which do not have to be listed for public display. Like other political advertising, it creates a culture that lacks openness and can be vulnerable to abuse. With the US 2020 election looming, Twitter announced: "We will not permit our service to be abused around civic processes, most importantly elections. Any attempt to do so, both foreign and domestic, will be met with strict enforcement of our rules." Twitter also banned political ads and deep fakes. While the act of faking content is not new, deep fakes leverage powerful techniques from machine learning and artificial intelligence to manipulate or generate visual and audio content with a high potential to deceive. An entertaining example that springs to mind is the fabricated footage of Forrest Gump talking to John F. Kennedy.

15:21

Level four is evolving, but it's here, and at this deep level it is where the vast amounts of data collected from around our planet are used to perform predictive analysis. The software mines and analyzes historical data patterns to predict future outcomes, extracting information from data sets to determine likely results. Many of our decisions are not based on logic; rather, emotions, trust, intuition, satisfaction and culture all play a crucial role in persuading us to buy a certain product or make a particular decision. This provides the opportunity to, say, commission a social media company to predict the voting patterns from an election manifesto. The manifesto could then be reworked so that it derives the most positive results, and the analysis could also identify the key messages to be conveyed to particular social groups. I do hope we don't reach the next level, which is where social media companies are commissioned to change the beliefs of populations by manipulating the information we receive, and so change our beliefs to order.

16:29

So, in conclusion, I would say that during the lockdown periods for COVID-19 there has been a significant increase in online activity. Social interaction is part of our DNA, and so social networking is here for the foreseeable future. When cars were first introduced to our highways, they featured body-damaging vertical shapes with sharp metal emblems. They were a new concept, and it took some years of public pressure to change the laws and make the cars safer. Changes included having gently sloping fronts with collapsible bumpers, outlawing those fixed protruding bonnet emblems, and fitting airbags. Social media is a powerful and rapidly evolving communications vehicle. We cannot put the genie back in the bottle, but we must learn to manage it better.

Related Tags
Social Media, AI, Notifications, Machine Learning, Sentiment Analysis, Digital Addiction, Recommendation Engines, Fake News, Cyber Bullying, Data Privacy