This is why emotional artificial intelligence matters | Maja Pantic | TEDxCERN

TEDx Talks
14 Dec 2018 · 14:40

Summary

TL;DR: The talk delves into emotional artificial intelligence, a field with potential for immense societal good but also misuse. It discusses technology's ability to sense and analyze human emotions through facial expressions, with applications in therapy for autistic children and in detecting signs of depression. However, it warns of the risks of data misuse, such as personalized pricing based on online behavior, urging consumers to be vigilant about their digital footprint.

Takeaways

  • 🧠 Emotional artificial intelligence (EAI) is a growing field focused on the automatic sensing and analysis of human behavior, particularly facial expressions.
  • 🤖 EAI has the potential to greatly benefit people, such as in therapeutic applications with autistic children, where robots can provide consistent facial expressions for learning.
  • 📈 The technology is advanced enough to detect and track faces and facial expressions in various scenes, but challenges remain with occlusions, head position changes, and small faces.
  • 👶 For autistic children, EAI can help them understand and express emotions in a way that is recognizable to others, facilitating communication.
  • 🤝 Robots can be programmed to display the same facial expressions consistently, which is beneficial in therapy sessions where human expressions might vary.
  • 😄 Autistic children often respond positively to robots, as seen in therapy sessions where they show increased attentiveness and engagement.
  • 😟 EAI can also be used to detect behavioral cues of depression, which is a significant and increasing issue, especially among teenagers and the elderly.
  • 🔍 Depressed individuals may exhibit subtle and short-lived smiles and a negative bias in interpreting expressions, which EAI can help identify.
  • 💡 The technology can flag these cues to healthcare providers or family members, potentially leading to earlier intervention and support.
  • 🛒 However, EAI can be misused, as demonstrated by a Facebook patent for a camera that could customize prices based on consumers' search history and in-store behavior, with not even medical remedies excluded.
  • ⚠️ There is a call to be aware of the data we share, especially behavioral data, as it can be exploited by companies for their benefit, urging consumers to be informed and cautious.

Q & A

  • What is the main topic of the talk?

    -The main topic of the talk is Emotional Artificial Intelligence (EAI), also known as affective computing or human-centric AI, which involves the automatic sensing and analysis of human behavior, particularly facial expressions.

  • Why is the human face important in EAI?

    -The human face is important in EAI because it is the only observable window of our inner selves, including our emotions, intentions, attitudes, and moods. It is used for recognizing other people and judging various attributes such as age, gender, and sometimes even personality.

  • What are some of the challenges in EAI technology?

    -Some challenges in EAI technology include dealing with occluded faces, large and especially fast changes in head position, very small faces, and atypical facial expressions, such as the extreme expressions of someone enjoying a skydive.

  • How can EAI be applied in therapy with autistic children?

    -EAI can be applied in therapy with autistic children by using robots to teach them to recognize and express emotions in a way that typically-developing people can understand. Robots can provide consistent facial expressions and positive feedback, which can help in the iterative loop of behavioral therapy.

  • Why are autistic children often not able to understand or express facial expressions correctly?

    -Autistic children often cannot understand or express facial expressions correctly because they may perceive the thousands of daily facial expressions as thousands of different categories, making it difficult for them to generalize and learn the appropriate expressions to communicate their emotions.

  • What is one example of a positive outcome from using EAI in therapy with autistic children?

    -One positive outcome is that a child with attention deficit disorder became more attentive after a session with a robot, as reported by his daily supervisor. This shows the potential of EAI to improve focus and engagement in therapy.

  • How can EAI technology be used to detect signs of depression?

    -EAI technology can be used to detect signs of depression by automatically sensing and analyzing behavioral cues such as the quality and intensity of smiles, as well as the presence of negative bias in interpreting situations.

  • What is the potential misuse of EAI technology mentioned in the talk?

    -The potential misuse of EAI technology mentioned in the talk is the patenting of a camera by Facebook that could recognize people and customize prices based on their online search patterns and behavioral signals in shops, leading to higher prices for those who show more interest in a product.

  • How can consumers protect themselves from the misuse of their data by companies?

    -Consumers can protect themselves by being aware, getting informed, not clicking on all cookies, and being cautious about the kind of information they give away. They should keep their own data for their own use.

  • What is the speaker's final message to the audience regarding EAI technology?

    -The speaker's final message is to be aware and informed about how EAI technology can be used and misused, and to take care of the personal data they provide to companies to prevent potential misuse.

Outlines

00:00

😀 Emotional AI: Potential and Ethical Considerations

This paragraph introduces the concept of emotional artificial intelligence (AI), a burgeoning field within AI research with significant potential to benefit society. It discusses the automatic sensing and analysis of human behavior, particularly facial expressions, which serve as a window to our emotions and intentions. The speaker highlights the advanced state of the technology, which can detect and analyze facial expressions in various scenarios. However, they also acknowledge the challenges, such as dealing with occluded faces and rapid head movements. The paragraph emphasizes the importance of consumer awareness in determining the ethical use of this technology, setting the stage for a deeper exploration of its applications and misuse.
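
The sensing pipeline the speaker describes (detect and track a face, measure the intensity of facial gestures such as frowns and smiles, then infer a higher-level emotion or attitude) can be sketched in a few lines. The sketch below is illustrative only: the OpenCV Haar-cascade face detector is real, while `estimate_gesture_intensities` and `coarse_emotion` are hypothetical stand-ins for the trained models a real system would use.

```python
# Minimal sketch of the pipeline described in the talk: detect faces,
# estimate facial-gesture intensities, map them to a coarse emotion label.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def estimate_gesture_intensities(face_pixels):
    """Hypothetical stand-in for a trained action-unit regressor.

    A real system would return calibrated intensities (0..1) for gestures
    such as brow-lowering (frown) and lip-corner-pulling (smile).
    """
    return {"frown": 0.1, "smile": 0.7}  # dummy values for illustration

def coarse_emotion(intensities):
    # Very rough mapping from gesture intensities to a high-level label.
    if intensities["smile"] > 0.5 and intensities["frown"] < 0.3:
        return "positive / interested"
    if intensities["frown"] > 0.5:
        return "negative / upset"
    return "neutral"

def analyse_frame(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    results = []
    for (x, y, w, h) in faces:
        crop = gray[y:y + h, x:x + w]
        intensities = estimate_gesture_intensities(crop)
        results.append({"box": (x, y, w, h),
                        "intensities": intensities,
                        "emotion": coarse_emotion(intensities)})
    return results
```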

05:00

🤖 Emotional AI in Therapy and Autism

The second paragraph delves into the application of emotional AI in therapeutic settings, specifically for children with autism. It explains the difficulty autistic children face in interpreting facial expressions due to their atypical generalization abilities. The speaker suggests that robots can be more effective than human therapists in teaching these children to recognize and express emotions consistently. The use of robots provides a controlled environment for learning, as they can offer the same facial expression repeatedly, which is crucial for autistic children's understanding. The paragraph provides examples of successful interactions between autistic children and robots, illustrating the positive impact of emotional AI on communication and social skills.
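
The iterative therapy loop described above (the robot displays a target expression, the child imitates it, the robot recognizes the imitation and gives positive feedback) can be captured in a short sketch. Everything below is a hypothetical Python sketch: `show_expression`, `recognize_child_expression`, and `give_feedback` stand in for robot and vision hooks that the talk does not specify.

```python
# Toy sketch of the robot-assisted behavioral therapy loop.
import random  # only used to simulate the child's responses in this sketch

TARGET_EXPRESSIONS = ["happy", "sad", "upset"]

def show_expression(expression):
    print(f"[robot] displays a consistent '{expression}' face")

def recognize_child_expression():
    # Stand-in for the expression-recognition step; a real system would
    # classify the child's face from the camera feed.
    return random.choice(TARGET_EXPRESSIONS)

def give_feedback(success):
    print("[robot] 'Well done!'" if success else "[robot] 'Let's try again.'")

def therapy_round(target, max_attempts=3):
    for attempt in range(1, max_attempts + 1):
        show_expression(target)                  # same expression every time
        observed = recognize_child_expression()  # child imitates
        success = (observed == target)
        give_feedback(success)                   # positive reinforcement
        if success:
            return attempt
    return None

if __name__ == "__main__":
    for emotion in TARGET_EXPRESSIONS:
        therapy_round(emotion)
```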

10:04

😟 Detecting Depression with Emotional AI

This paragraph explores the use of emotional AI in identifying behavioral cues of depression. The speaker discusses the alarming rates of depression among various demographics, emphasizing the need for tools that can detect early signs of this mental health issue. The technology can analyze the quality of smiles and the presence of negative bias in facial expressions, which are indicative of depression. The speaker contrasts the public misconception of depression with the actual behavioral signals that AI can detect. The paragraph also raises concerns about the potential misuse of emotional AI, such as Facebook's patent for a camera that could potentially adjust prices based on consumers' online behavior and preferences, including sensitive information about health conditions.
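
The depression-related cues the talk mentions, short-lived and dampened smiles rather than fewer smiles, suggest aggregating a smile-intensity signal over time rather than counting events. The sketch below is a rough illustration under assumed thresholds (one-second duration, 0.5 peak intensity, 25 fps); these numbers are not from the talk and are not clinical criteria.

```python
# Rough sketch: assess smile *quality* (duration, intensity) from a
# per-frame smile-intensity signal, then flag short-lived / dampened smiles.

def smile_episodes(intensity_per_frame, onset=0.2):
    """Group consecutive frames with intensity above `onset` into episodes."""
    episodes, current = [], []
    for value in intensity_per_frame:
        if value >= onset:
            current.append(value)
        elif current:
            episodes.append(current)
            current = []
    if current:
        episodes.append(current)
    return episodes

def smile_quality_flags(intensity_per_frame, fps=25,
                        min_duration_s=1.0, min_peak=0.5):
    episodes = smile_episodes(intensity_per_frame)
    if not episodes:
        return {"episodes": 0, "short_lived": True, "dampened": True}
    mean_duration = sum(len(e) for e in episodes) / len(episodes) / fps
    mean_peak = sum(max(e) for e in episodes) / len(episodes)
    return {
        "episodes": len(episodes),
        "short_lived": mean_duration < min_duration_s,  # brief smiles
        "dampened": mean_peak < min_peak,               # low-intensity smiles
    }

# Example: only brief, low-intensity smiles are flagged on both counts and
# could be surfaced to a GP or family member for follow-up.
signal = [0.0] * 50 + [0.25, 0.3, 0.28] + [0.0] * 50 + [0.22, 0.26] + [0.0] * 30
print(smile_quality_flags(signal))
```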

Keywords

💡Emotional Artificial Intelligence

Emotional Artificial Intelligence (EAI) refers to the field of AI that focuses on the automatic sensing and analysis of human emotions and behavior, particularly facial expressions. It is central to the video's theme as it discusses the potential of EAI to improve lives, such as in therapy for autistic children and in detecting signs of depression. The script mentions EAI's various names like 'affective computing' and 'human-centric AI,' highlighting its significance in understanding and responding to human emotions.

💡Affective Computing

Affective Computing is one of the terms used interchangeably with Emotional Artificial Intelligence. It is the study and development of systems and devices that can recognize, interpret, and simulate human emotions. In the video, affective computing is presented as a key area of research within EAI, emphasizing its role in creating technology that can detect and respond to human emotions, which is crucial for applications like therapy and mental health monitoring.

💡Human-Centric Artificial Intelligence

Human-Centric Artificial Intelligence is an approach to AI development that prioritizes human needs, emotions, and behaviors. The concept is mentioned in the script to describe the underlying research behind EAI, indicating that the technology is designed to be more attuned to human emotional states and interactions, which is essential for creating empathetic and responsive AI systems.

💡Facial Expressions

Facial Expressions are the movements of the face that convey a person's emotions or reactions. The script discusses the importance of facial expressions as a window into a person's inner feelings and states of mind. It also highlights the role of EAI in analyzing these expressions for applications like therapy for autistic children, where understanding and mimicking facial expressions can be beneficial.

💡Autism

Autism, or Autism Spectrum Disorder (ASD), is a developmental disorder characterized by challenges with social skills, repetitive behaviors, and speech and nonverbal communication. The video script uses the context of autism to illustrate the potential of EAI in therapy, where robots can be programmed to display consistent facial expressions to help autistic children learn to recognize and express emotions.

💡Behavioral Therapy

Behavioral Therapy is a type of psychological treatment that focuses on changing undesirable behaviors or habits. In the script, it is mentioned as a method used in conjunction with EAI to help autistic children learn to express emotions and understand the emotions of others, using robots as consistent and reliable models for facial expressions.

💡Depression

Depression is a common mental health disorder characterized by persistent feelings of sadness, hopelessness, and a lack of interest or pleasure in activities. The video script discusses the application of EAI in detecting behavioral cues of depression, such as changes in facial expressions and emotional responses, which can be flagged for medical attention or support.

💡Misuse of Technology

Misuse of Technology refers to the improper or unethical use of technological advancements for negative purposes. The script warns about the potential misuse of EAI, such as Facebook's patent for a camera that could customize prices based on behavioral patterns, with not even medical remedies excluded, which raises ethical concerns about privacy and fairness.

💡Data Privacy

Data Privacy is the protection of personal information from unauthorized access or disclosure. The video emphasizes the importance of being aware of data privacy issues, especially in the context of how companies collect and use behavioral data, which can lead to misuse if not properly managed or consented to by the individuals involved.

💡Cookies

Cookies are small pieces of data stored on a user's device when visiting a website, often used to track browsing behavior and preferences. The script mentions cookies as an example of how users' online activities are monitored and data is collected, urging viewers to be cautious about accepting cookies and to protect their data privacy.
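
As a small aside, most tracking cookies hold only an identifier; the behavioural record itself (searches, visits, purchases) is typically accumulated against that identifier on the server side. The toy sketch below, with entirely made-up names and fields, illustrates the kind of profile the talk warns about.

```python
# Toy illustration of a behavioural profile keyed by a tracking identifier.
from collections import defaultdict

class BehaviouralProfile:
    def __init__(self):
        self.page_visits = defaultdict(int)  # url -> visit count
        self.searches = []
        self.purchases = []

    def record_visit(self, url):
        self.page_visits[url] += 1

    def record_search(self, query):
        self.searches.append(query)

    def record_purchase(self, item):
        self.purchases.append(item)

# One browsing session is enough to sketch a purchasing pattern.
profile = BehaviouralProfile()
profile.record_search("migraine remedy")
profile.record_visit("pharmacy.example/migraine")
profile.record_visit("pharmacy.example/migraine")
profile.record_purchase("migraine tablets")
print(dict(profile.page_visits), profile.searches, profile.purchases)
```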

💡Ethical Considerations

Ethical Considerations refer to the moral principles and values that guide decision-making, especially regarding what is right and wrong. The video script touches on the ethical implications of EAI, discussing both its potential for good, such as in therapy and mental health, and the risks of misuse, emphasizing the need for consumers to be aware and make informed decisions about their technology use.

Highlights

Emotional artificial intelligence (EAI) is a new field in AI with the potential for immense good or misuse.

EAI, also known as affective computing or human-centric AI, involves automatic sensing and analysis of human behavior and facial expressions.

The human face is a crucial window into our emotions, intentions, attitudes, and moods.

Technological advancements have enabled the detection and analysis of facial expressions in various naturalistic scenes.

Current technology can measure the intensity of facial gestures like frowns and smiles, and analyze higher-level behaviors and emotions.

Challenges remain in dealing with occluded faces, large head position changes, very small faces, and atypical facial expressions.

EAI is mature enough for applications such as therapy with autistic children, where it can help teach emotional expression and recognition.

Autistic children often struggle with the complexity of facial expressions, making communication challenging.

Robots can be programmed to display consistent facial expressions, aiding in therapy for autistic children.

Autistic children's affinity for mechanical toys makes robots particularly effective in therapy.

EAI can also be used to detect and flag behavioral cues of depression, a significant issue in the Western world.

Depression's impact is particularly severe among teenagers and the elderly, with many cases going undetected and untreated.

EAI can analyze the quality of smiles and negative bias in interpretation, which are indicators of depression.

Facebook patented a camera technology that could potentially misuse EAI by customizing prices based on consumer behavior and preferences.

The potential misuse of EAI raises ethical concerns, especially when it comes to privacy and the exploitation of personal data.

Consumers must be aware and informed about the data they share, including behavioral and facial expression data.

The talk emphasizes the importance of being cautious with the personal data shared online and the potential for both positive and negative uses of EAI.

Transcripts

00:01

[Music] [Applause]

00:14

This talk is about emotional artificial intelligence. It is a relatively new field of research in artificial intelligence, but it has great potential to do immense good to people, for people. However, the technology can be misused, and in this talk I want to make the point that it is actually up to us, the consumers of this technology, to decide whether it will be used for good or for evil. But let me first explain what emotional artificial intelligence is really about.

00:54

Emotional artificial intelligence goes by various names. Affective computing is one of those; another is human-centric artificial intelligence. But behind all of those names is exactly the same research topic: automatic sensing and analysis of human behavior, and in particular human facial behavior. The human face is really fascinating. We use the face to recognize other members of our species, to recognize other people. We also use the face to judge various things such as age, gender, beauty, and sometimes even personality (don't do the last one). Most importantly, the human face is the only observable window of our inner selves: of our emotions, intentions, attitudes and moods. It is therefore not surprising that in recent years we have had a surge of interest in this technology. The reason is that if we could detect faces in various naturalistic scenes, and then analyze the facial expressions of those faces in terms of specific emotions or specific attitudes, we could use this kind of technology for a very wide range of applications.

02:26

The state of the art in the field is relatively advanced. We have developed a number of techniques that can detect and track faces in various naturalistic scenes. We also have tools that can detect, track and measure the intensity of various facial gestures such as frowns and smiles, and we can automatically analyze higher-level behaviors such as certain emotions and certain attitudes, like interest. However, the methodology and the technology are not fully mature. We still have great problems dealing with occluded faces, with faces where we observe large changes in head position, especially fast changes; very small faces are a problem for us, and so are atypical facial expressions, such as the expressions of the lady enjoying her skydive. Nevertheless, the technology is mature enough to be used in a large number of really great applications, so let me give a couple of examples.

03:43

The first example I want to talk about is the use of emotional artificial intelligence in therapies with autistic children. The human face displays 10,000 different facial expressions, 7,000 of which we display on a daily basis. Typically developing people classify those expressions into maybe 15 or 20 categories. However, autistic children miss this generalization ability, so the 7,000 facial expressions that we express on a daily basis they see as 7,000 different categories. This is the reason why autistic children usually do not look at our faces: they are too confusing for them. But it is also the reason why they cannot understand what our facial expressions mean when we try to express that we are sad, or upset, or happy, and why they cannot learn which expressions they should display when they want to say that they are happy or upset, so that typically developing people can understand them. Hence the breakage in communication.

04:55

One of the major goals of behavioral therapy with autistic children is therefore to give them the tools, the skills, to express emotions in a way that typically developing people can understand, and also to understand the emotions of others around them. One of the problems with this is that when you have a human teach a child these typical expressions, humans usually emphasize things. So when they teach the child, "you know, when you are happy, you smile," they usually raise their eyebrows, and most of you probably didn't notice that at all, right? But for an autistic child, a smile with raised eyebrows is a very different expression from just a smile. This is where we came to the idea of using robots instead of human therapists, because you can program the robot, and the expression that the robot shows will always be the same expression. So when the therapist says, "can you display a happy facial expression?" or "how would you feel if you got a train?", the robot complies and shows the smiley expression, the child sees this expression and repeats it, and then the robot, because it recognizes that the child displayed the correct facial expression, gives positive feedback. This is the way you can use robots in this iterative loop of behavioral therapy with autistic children.

06:36

Autistic children love robots; they actually love all mechanical toys which are built of parts, and exactly this love is something that can have a really amazing effect on the children. The boy that you have previously seen suffers from a rather severe attention deficit; however, after this session with the robot, his daily supervisor, the man in the background, told us that he had never seen the boy more attentive in a whole year. Similarly, when we brought the robot to Serbia to have a session with Serbian autistic children, we had a boy who is a nonverbal child interacting with the robot. When we say a nonverbal autistic child, this doesn't mean that the child cannot speak; it means that the child chooses not to speak. But after this session, you see, he was really happy, he was really enjoying his time with the robot, and after the session he went home and said to his mom, "and tomorrow, in school, the robot." He said it to his mom, so the mother of course immediately wrote to us and said, "what did you do? This is amazing." For her it was an unbelievable awakening, and this kind of awakening is what makes the robot worth putting into therapies with autistic children, and what makes our research on this topic fully worthwhile.

08:12

Another case, and a very good application of emotional artificial intelligence technology, is automatic sensing and flagging of behavioral cues of depression. Depression is a huge problem. Currently, in the Western world, we are seeing an increase of depressed people at all age levels; however, with teenagers it is the worst. In the USA and the UK we currently have 25 percent of teenage girls suffering from this illness; overall, 17 percent of young adults aged 18 to 22 suffer from depression. It is a very similar case with the elderly population: in the UK, 22 percent of people above 65 years of age suffer from depression, and the worst part is that only one out of ten of these elderly people is actually helped by the UK National Health Service. So it would be really fantastic if we had tools that could detect these behavioral cues of depression and flag them to the GP or to family members.

09:37

When you talk to people about depression, they may often say to you, "you know, depressed people do not smile that often," but that is not at all the case. It is not the frequency, it is really the quality of the smiles: depressed people usually feel embarrassed if they smile, so their smiles are short-lived and often dampened. Another problem is the negative bias: whatever you say to a depressed person, they will interpret it more negatively than was intended. Let me ask you something: what do you think this lady is currently watching? A very typical answer is that she's watching some kind of horror movie, but actually we asked her to watch funny videos; she's watching videos of kids and cats running around. Due to her negative bias she has these very strong expressions of fear and even disgust, and her smiles are so subtle that their intensity is barely detectable. It is exactly this, the cumulative effect of all those behavioral signals, that we plot on the line plot at the bottom of the video.

11:10

So emotional artificial intelligence can be used for great purposes, really to help people, to help humanity, but it can also be misused. Let me give you an example. In 2017, Facebook patented a camera. The camera would be placed in shops and shopping malls and would recognize people by their Facebook profiles; currently Facebook has 1.4 billion profiles. The camera would also know the search patterns that people exhibited on Facebook and on Google, so they would know what we like or what we are interested in. Furthermore, the camera would watch our behavioral signals in the shop: what we like, which departments we go to most often. Based on all of this, the price would be customized, so the more you search for something, the more you like something, the higher the price will be. The worst part is that neither medicines, nor medical care, nor medical services are excluded from the patent. So if somebody is ill, searches for a remedy online and has a Facebook account, that person will unfortunately pay much higher prices for this medical remedy.

12:48

Again, emotional artificial intelligence can be used for really great purposes and great good, but it can be badly misused, and that is especially the case if we carelessly and blindly give our data, our behavioral data, to certain companies. This is exactly what's happening nowadays. You go to any website and they ask you to click "accept these cookies." What are the cookies? The cookies are actually the search patterns of what you do: what you searched for, which website you went to, how many times you visited that website, what you purchased, how many times you purchased it. So they have the full behavioral pattern of your purchasing behavior. Then we go to Facebook, we open a profile, we put up our pictures, we put up our children's pictures, we put up our friends, then we tag everybody, we put up videos. So we give them everything: they have the faces, they have the behavioral patterns, they have how we smile, which is a dynamic thing, they have how we walk, what makes us happy, where we go most often. So then you go back to that patent, and you realize how harmful that can be.

14:15

So the message for today really is: be aware, get informed. You don't have to click on all of those cookies all the time; you can actually scroll without accepting the cookies. Take care what kind of information you give away, and keep your own data for your own use. Thank you.

14:32

[Applause] [Music] [Applause]


Related Tags
Emotional AI, Therapy, Autism, Depression, Facial Analysis, Behavioral Cues, Ethics, Technology Misuse, Data Privacy, Human-Centric AI