The ethical dilemma we face on AI and autonomous tech | Christine Fox | TEDxMidAtlantic

TEDx Talks
11 May 2017 · 18:47

Summary

TLDR: The video discusses the evolving relationship between technology and policy, emphasizing how rapid advances in technology are outpacing government regulation. It explores military technology, autonomous systems, the internet, and the Internet of Things, highlighting both the benefits and the unforeseen risks. The speaker raises concerns about vulnerabilities, especially in cybersecurity and autonomous machines, and argues that responsibility must shift toward companies and consumers. She advocates proactive measures such as corporate social responsibility and 'red teaming' to surface risks before technologies reach the public, stressing that we all share responsibility for shaping our technological future.

Takeaways

  • 🤖 Advanced technology, particularly in robotics, can elicit mixed reactions, from excitement to concern about its potential risks.
  • ⚔️ The military must comply with the laws of armed conflict, targeting combatants while protecting civilians, and improving technology has steadily strengthened its ability to do so.
  • 🛠️ Precision weapons are now mounted on remotely piloted drones, which are not autonomous, contrary to public perception; DoD policy requires a human in the loop for any employment of lethal capability.
  • 🚫 Current policy prohibits fully autonomous lethal weapons, as they cannot yet reliably differentiate between combatants and civilians.
  • 📉 The private sector is leading in advanced technology development, surpassing government agencies in research and development spending, creating a regulatory gap.
  • 🌐 The internet, while transformative and beneficial, has also introduced vulnerabilities like cyber warfare, hacking, and threats to personal data and national security.
  • 📱 The Internet of Things (IoT) brings convenience but also risks, as devices like household robots may not be secure against cyber threats, with little regulation on safety standards.
  • 🚗 Autonomous vehicles are already in operation, but they too raise ethical and safety concerns, such as decision-making in dangerous situations and potential hackability.
  • 🏢 Companies must embrace a form of corporate social responsibility, ensuring their technologies are secure by applying tactics like red teaming to anticipate risks.
  • 🧠 Consumers need to become more informed and proactive, demanding safer, more secure technology as part of their buying decisions, shifting some responsibility onto the public.

Q & A

  • What are the three phases of reactions people typically have when watching the video mentioned at the beginning?

    -The three phases are: 1) Initial amazement and admiration for the technology, 2) Discomfort when the robots are kicked, and 3) Growing unease, feeling that the robots are 'creepy' and questioning whether such technology should be developed.

  • Who currently decides on the use of military technology, particularly in relation to the laws of armed conflict?

    -The government and defense department leadership make decisions regarding military technology, guided by policies ensuring compliance with the laws of armed conflict, which require the military to target combatants and protect civilians.

  • Why is the use of lethal autonomous systems not currently allowed on the battlefield?

    -Lethal autonomous systems are not allowed because current technology cannot reliably discriminate between combatants and civilians, which would violate the laws of armed conflict.

  • What is one reason the speaker believes the policy on autonomous systems is appropriate for now?

    -The speaker argues that if today's technology cannot even grasp the intent of a few lines of text (e.g., autocorrect changing a message's meaning), it cannot be trusted to distinguish between combatants and civilians in complex warfare environments.

  • What concerns does the speaker raise about the rapid development of technology in the commercial world?

    -The speaker is concerned that commercial technology is advancing so quickly that policymakers have fallen behind and may not be able to catch up, creating risks for society due to the lack of regulation and oversight.

  • How has the role of the internet evolved from its original conception?

    -The internet, initially developed by DARPA for communication and information access, has evolved into a space that includes cyber warfare and vulnerabilities like hacking, which were unanticipated at its inception.

  • What example does the speaker use to illustrate the unforeseen vulnerabilities introduced by the internet?

    -The speaker references several examples, including the 2014 Sony hack linked to North Korea, the theft of personal data from the Office of Personnel Management, and the hacking of the U.S. political process.

  • What is the 'Internet of Things' and what are its potential risks?

    -The 'Internet of Things' refers to interconnected devices like smart homes and autonomous systems, which aim to improve daily life but also present risks such as inadequate cybersecurity and vulnerability to hacking.

  • Why does the speaker believe that corporate social responsibility is necessary in the context of emerging technology?

    -The speaker argues that since policymakers may not keep up with rapid technological advancements, companies should take responsibility by adopting measures like red-teaming to identify and mitigate potential risks before releasing new products.

  • What role do consumers play in shaping the future of technology, according to the speaker?

    -Consumers need to be educated and ask hard questions about the security and safety of technology before purchasing, as their choices influence companies to prioritize protection and ethical practices.

Outlines

00:00

🤖 The Complexity of Technology and Human Reactions

The speaker introduces the theme of technology’s complexity using a video as an example. Reactions to the video occur in three phases: amazement, discomfort, and concern. The speaker questions who should decide on the development and control of such technologies, specifically referencing military applications. Military policy is clear on protecting civilians and following laws of armed conflict, with precision weapons improving over time. However, concerns persist about the use of autonomous systems and whether they can be trusted to make life-and-death decisions.

05:02

📱 The Spellchecker Dilemma and Technology-Policy Evolution

The speaker shifts to a lighter example, discussing how technology like spellcheck can misinterpret intent, drawing a parallel to more serious military contexts where autonomous systems might fail to distinguish between combatants and civilians. The argument is made that technology and policy must evolve together. The speaker points out that commercial tech, not government, now drives innovation. With commercial R&D vastly outpacing defense contractors' R&D spending, the gap between technology's rapid development and policy regulation is growing, which poses significant challenges.

10:02

💻 The Internet’s Promise and Unintended Vulnerabilities

The speaker highlights the dual nature of the internet—its great promise of global connectivity and its unforeseen vulnerabilities, such as cyberattacks. The Sony hack is discussed as a key example of how nations and individuals can exploit these weaknesses, resulting in personal and institutional harm. Additionally, the speaker mentions incidents like the breach of personal information from government databases, emphasizing the risks involved in our increasing dependence on technology.

15:03

🏠 The Internet of Things: Friend or Foe?

This section focuses on the Internet of Things (IoT) and how it’s designed to make life easier through interconnected devices. The speaker introduces the concept of a household robot and raises concerns about who decides the level of cybersecurity embedded in such devices. They argue that there is no clear policy regulating the security of these commercial technologies, placing the responsibility on consumers and companies. The speaker urges for corporate responsibility in safeguarding against the potential misuse of these advancements.

🚗 Autonomous Vehicles and Ethical Dilemmas

The speaker discusses the ethical challenges of autonomous vehicles, particularly life-or-death situations in which a car may have to choose between the safety of its passengers and that of pedestrians, a dilemma that mirrors the combat problem of distinguishing combatants from civilians. The issue of vehicle hackability is also raised, with real-world examples like the Jeep hack illustrating the risks of new technologies. The speaker calls for urgent attention to these problems, expressing doubt that policymakers alone can keep up with the pace of innovation.

🏭 Corporate Responsibility in the Age of Technology

The speaker argues for a new form of corporate social responsibility, where companies take proactive steps to secure their technologies against potential threats. This might include adopting practices like 'red teaming,' where experts simulate how new technologies could be exploited by adversaries. By thinking ahead and addressing vulnerabilities, companies can create safer products. The speaker emphasizes that consumers also have a role in this process by demanding secure technologies and making informed decisions.

🌍 A Call for Collective Action in Shaping the Future of Technology

In the concluding remarks, the speaker expresses their love for technology but stresses the importance of guiding its development responsibly. They encourage collective action among corporations, consumers, and policymakers to ensure that technological advancements lead to positive outcomes for society. The speaker remains optimistic that by asking the right questions and addressing challenges head-on, the future of technology can be shaped in a way that benefits everyone, both today and for future generations.

Keywords

💡Technology

Technology is the central theme of the video, referring to the development and application of advanced tools, systems, and machinery. In the script, technology is portrayed as a double-edged sword—capable of both enhancing human lives (e.g., precision weapons and drones) and creating new ethical, security, and policy challenges (e.g., autonomous systems and the Internet of Things).

💡Autonomous Systems

Autonomous systems are machines that can perform tasks without human intervention. In the video, the speaker explores the ethical concerns surrounding autonomous systems, particularly in military contexts. The issue is whether these systems can make moral judgments, such as distinguishing between combatants and civilians, which is why lethal autonomous weapons are not allowed in current warfare.

💡Laws of Armed Conflict

The Laws of Armed Conflict are international rules meant to regulate the conduct of war, particularly the protection of civilians and the targeted engagement of combatants. The speaker emphasizes how military technology, like precision weapons, must comply with these laws, and expresses concern over autonomous systems' ability to uphold these standards in conflict.

💡Cybersecurity

Cybersecurity refers to the protection of systems, networks, and data from digital attacks. The video highlights cybersecurity as a growing concern with the advent of new technologies like autonomous cars and connected devices. The hack of Sony, attributed to North Korea, illustrates how cyber vulnerabilities can be exploited, leading to serious consequences.

💡Corporate Social Responsibility

Corporate social responsibility (CSR) in this context refers to the ethical obligation of companies to consider the societal impact of their technologies. The speaker suggests that corporations should 'red team' their technologies—actively seeking to identify potential misuse or vulnerabilities before releasing products to the public. This responsibility is seen as crucial in a rapidly advancing technological landscape where regulations lag behind.
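The talk describes red teaming only in general terms. As a loose illustration of how a company might run such an exercise before a product launch, here is a minimal Python sketch; the scenarios, mitigation labels, and the ProductDesign structure are invented for this example and are not from the talk or any real methodology.

```python
# Hypothetical sketch of a pre-release red-team exercise for a consumer product,
# loosely inspired by the talk's description of DoD red teaming.
# All scenarios and mitigation names are illustrative, not a real checklist.

from dataclasses import dataclass, field


@dataclass
class ProductDesign:
    name: str
    mitigations: set[str] = field(default_factory=set)  # threats the design already answers


# Each adversarial "what if" is paired with the mitigation that would answer it.
RED_TEAM_SCENARIOS = {
    "Attacker intercepts traffic between the device and the cloud": "encrypted_transport",
    "Attacker pushes a malicious firmware update": "signed_firmware",
    "Attacker logs in with a factory-default password": "unique_credentials",
    "A compromised device is used to spy on the household": "local_privacy_controls",
    "The vendor folds and security patches stop": "end_of_life_plan",
}


def red_team(design: ProductDesign) -> list[str]:
    """Return the adversarial scenarios the design has no mitigation for."""
    return [scenario for scenario, needed in RED_TEAM_SCENARIOS.items()
            if needed not in design.mitigations]


if __name__ == "__main__":
    robot = ProductDesign("household_robot",
                          mitigations={"encrypted_transport", "unique_credentials"})
    findings = red_team(robot)
    print(f"{robot.name}: {len(findings)} unmitigated scenario(s)")
    for finding in findings:
        print(" -", finding)
```

The specific checklist matters less than the habit the speaker describes: before the product ships, someone's job is to imagine how it could be turned against its users, and the open findings give the company a chance to fix the design before it reaches the market.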

💡Remotely Piloted Vehicles (Drones)

Drones are unmanned aircraft controlled remotely by a human pilot. While they are often controversial, the speaker clarifies that current military drones are not autonomous but are controlled by humans for precision targeting. This underscores a key distinction between remotely piloted systems and fully autonomous technologies, which are not yet deployed in combat due to ethical and legal constraints.

💡Internet of Things (IoT)

The Internet of Things refers to the growing network of physical devices connected to the internet, such as smart homes or autonomous cars. The speaker expresses concern over the potential vulnerabilities of these interconnected devices, questioning who decides the level of cybersecurity built into them and stressing the lack of government policy regulating such technologies.
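The talk notes that no policy currently defines the 'appropriate cyber protection' for such a device. As one hedged example of what a baseline could look like, the sketch below has a device act only on commands that carry a valid authentication tag computed with a key shared with the owner's app; the key handling, command names, and overall protocol are simplified assumptions, not a description of any real product.

```python
# Minimal sketch of command authentication for a connected home device.
# Assumes a pre-shared key between the device and the owner's app; a real
# product would also need key provisioning, replay protection, and updates.

import hashlib
import hmac

SHARED_KEY = b"example-provisioned-key"  # placeholder; provisioning is out of scope


def sign_command(command: str, key: bytes = SHARED_KEY) -> str:
    """What the owner's app attaches to each command it sends."""
    return hmac.new(key, command.encode(), hashlib.sha256).hexdigest()


def handle_command(command: str, tag: str, key: bytes = SHARED_KEY) -> str:
    """Device side: act only if the authentication tag checks out."""
    expected = hmac.new(key, command.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, tag):
        return "rejected: unauthenticated command"
    return f"executing: {command}"


if __name__ == "__main__":
    print(handle_command("lights_on", sign_command("lights_on")))  # accepted
    print(handle_command("unlock_front_door", "forged-tag"))       # rejected
```

Whether a device ships with even this much protection is exactly the 'who decides' question the speaker raises: today it is up to the manufacturer and, indirectly, the buyer.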

💡Hackability

Hackability refers to the ease with which a device or system can be exploited by hackers. The speaker uses examples like the 'Jeep hack' and the Sony hack to demonstrate how even everyday technologies, such as vehicles and corporate networks, can be vulnerable to cyberattacks, posing serious risks to security and privacy in an increasingly connected world.

💡Human in the Loop

The phrase 'human in the loop' describes the requirement that a human must be involved in decision-making processes when lethal force is used in military technology. The speaker emphasizes that current Department of Defense policy mandates this control to ensure ethical decision-making in combat, contrasting it with the dangers posed by fully autonomous lethal systems.
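To make the 'human in the loop' requirement concrete, here is a purely illustrative sketch, not DoD code or policy: automation may propose actions, but anything flagged as lethal is withheld unless a human operator explicitly authorizes it. The class and function names are hypothetical.

```python
# Illustrative human-in-the-loop gate: automation proposes, a human disposes.
# Structure and names are invented for illustration only.

from dataclasses import dataclass
from typing import Callable


@dataclass
class ProposedAction:
    description: str
    is_lethal: bool


def execute(action: ProposedAction) -> None:
    print(f"EXECUTING: {action.description}")


def human_in_the_loop(action: ProposedAction,
                      ask_operator: Callable[[ProposedAction], bool]) -> None:
    """Carry out non-lethal actions directly; require explicit human approval otherwise."""
    if action.is_lethal and not ask_operator(action):
        print(f"WITHHELD: {action.description} (no human authorization)")
        return
    execute(action)


def deny_all(action: ProposedAction) -> bool:
    """Stand-in for a real operator console; approves nothing by default."""
    return False


if __name__ == "__main__":
    human_in_the_loop(ProposedAction("adjust sensor sweep", is_lethal=False), deny_all)
    human_in_the_loop(ProposedAction("engage target", is_lethal=True), deny_all)
```

The point of the policy, as the speaker presents it, is that the branch requiring a human cannot be optimized away by the system itself.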

💡Unanticipated Consequences

Unanticipated consequences refer to the unforeseen negative outcomes that arise from new technologies. The video discusses how innovations like the internet brought great benefits but also introduced risks like cyber warfare, data theft, and threats to democratic institutions. This concept highlights the need for foresight and caution as technology evolves faster than policy frameworks.

Highlights

The speaker introduces a video that showcases the complexity surrounding technology today, noting how people's reactions follow three distinct phases.

In military operations, the government, policy makers, and defense department leadership decide on the use of technology, with a focus on protecting civilians while targeting combatants.

Drones, though controversial, are still piloted by humans, with Department of Defense (DoD) policy mandating a human be in the loop for any lethal actions.

Lethal autonomous systems are currently not allowed on the battlefield due to concerns about the technology's ability to distinguish between combatants and civilians.

Precision weapons have significantly improved military capabilities in targeting combatants while protecting civilians.

Commercial entities, not governments, are leading the development of advanced technologies, and policymakers are struggling to keep up with the pace of technological advancement.

The combined research and development budgets of the top five defense contractors amount to less than half of Microsoft's annual R&D spending, before even counting tech giants like Apple and Google.

The speaker emphasizes that the policy surrounding military technology and lethal systems is currently appropriate, but acknowledges that it will likely evolve as technology advances.

An app that connects people trained in CPR with individuals experiencing cardiac arrest is an example of how technology is saving lives in simple but impactful ways.

The internet, initially developed by DARPA, has fulfilled its promise of instant global communication but has also introduced unforeseen vulnerabilities, like cyber warfare.

The Sony hack serves as an example of the unanticipated risks associated with technology, demonstrating how vulnerabilities can impact large organizations and governments alike.

Hackers breaching the Office of Personnel Management database and stealing the personal data of millions of Americans showcases the unnerving reality of cyber threats.

As autonomous technologies like self-driving cars and home robots become more prevalent, the speaker raises concerns about their safety, including their susceptibility to hacking.

Companies need to adopt a new form of corporate social responsibility, weighing the vulnerabilities their technologies introduce and building in security measures before release.

The speaker advocates red teaming, a defense department practice of having imaginative experts work out how adversaries might exploit new plans or technologies, as a way to surface vulnerabilities before products reach the street.

Transcripts

[00:13] [Music]

[00:24] [Applause]

[00:59] See, so I wanted to start today with that video, because I think it really captures the complexity surrounding technology today. Now, I've looked at that video with a lot of colleagues and friends, and I find that people's reactions to it follow along sort of three phases, so let me describe them and you can ask if your reactions followed along these phases. Phase one: wow, is that cool? Right? I mean, is that cool? Those guys are amazing. Then they start kicking it. That's kind of, you know, that's kind of mean, right? I don't know that they should be doing that. I don't think they should be allowed to kick them like that. And then there's phase three. They're kind of walking up the hill and looking around, and that big guy starts going up the hill, and it's kind of, huh, you know, they're a little creepy. I don't know if I'd feel happy running into them walking in the woods. You know, what would they do? I'm not sure we should be building those things. I don't think they should be allowed to build them. So the question that I want to explore with you today is: who decides "should"?

[02:16] Now, I've spent a lifetime studying military operations. As you heard, I was even in the defense department for a while, and I can tell you the answer to this question for military operations and technology is very clear. Who decides "should" for military operations? It's the government, it's policy, it's defense department leadership. Now, one of the things we expect our military to be able to do is comply with something called the laws of armed conflict. The laws of armed conflict say many important things, but one of the things they say is that the military should do everything in its power to target combatants and protect civilians in conflict. Now, I can tell you the military tries very, very hard to do this. I've studied operations for decades. Are they perfect? Absolutely not. Do they make mistakes? Yes. But they try very hard, and they've gotten better and better and better at it as technology has improved.

[03:16] One of the things that has improved is precision weapons. Precision weapons have made a big difference here in the ability to focus on combatants and protect civilians. Now, more recently, we've put precision weapons on something called remotely piloted vehicles: drones. Now, I know drones are controversial, and I confess I'm not entirely sure I understand why they're so controversial, and I wonder if people have the perception, or misperception, that they're actually autonomous. They are remotely piloted. The pilot is not in the aircraft, but there is a pilot on the ground in a distant location, and they have positive control of that aircraft from takeoff to landing, and certainly for any employment of a weapon. In fact, today it is DoD policy that any employment of lethal capability have a human in the loop, and when it comes to autonomous systems, there's a special directive governing autonomous systems that specifically says lethal autonomous capability is not allowed on the battlefield today. One of the reasons for that is a concern that lethal autonomous systems, or technology, are not able to discriminate between combatants and civilians, and therefore would not be in compliance with the laws of armed conflict.

[04:41] Now, let me tell you, there are a lot of people who think that this policy is overly constraining. I've heard a lot about it, right? It's holding us back, it's putting men and women in uniform in harm's way needlessly, our adversaries are going to get ahead of us. So there's a lot of pushback on this directive. I've thought a lot about this directive. I personally think it's just right, and I want to ask you to think along with me using the example of our friend the spell checker.

[05:11] Okay, I don't know about you, but I often hit send on a text just after I notice that my helpful smartphone changed the spelling, and the meaning, of what I wanted to send. I hope that happens to you and I'm not the only one, right? Okay, so let's ask ourselves: if technology today can't understand the intent of a few simple lines of text, how could we count on it to discriminate between a combatant and a civilian in environments as complex as warfare? So I personally think the policy has it right for now. But the technology is evolving, and I also think it will probably evolve to a point where it can discriminate, and I am confident that the policy will evolve along with the technology. So I think this is a nice example of the relationship between technology and policy: one evolves, the other evolves, they stay together. And that happens because the technology I'm talking about here is under the purview of the government.

[06:20] Now, there was a time when advanced technology was developed under the purview of the government in most cases, but that is not true today. Today, advanced technology is developed in the commercial world far more than in government. Consider for a moment the fact that if you add up the research and development budgets of the top five defense contractors, and I'm talking the big ones, Lockheed Martin and Boeing, etc., the top five, add them all up, that comes to less than half of Microsoft's research and development budget in a year. Less than half. And that doesn't even begin to consider the R&D budgets of Apple and Google and so many others, right? So advanced technology is no longer the purview of the government; it's the purview of the commercial world, and policymakers are not governing advanced technology in the commercial world. I think that technology is amazing today, and it is going so fast that policymakers have fallen behind, and I personally am very dubious that they can catch up.

[07:23] So where does that leave us? I think it leaves us with a bit of a toxic brew, where we have advanced, powerful technologies that are available to anyone who wants to buy them, with few if any constraints over their development and accessibility. Now, technology is making our world better. Okay, we do not want to regulate technology; we want technology to continue going forward. Take, for example, this very simple but lovely application of technology: today a nonprofit foundation has developed an app that pairs people with CPR skills with people in need, having acute cardiac arrest. This simple application of technology is saving lives today. Awesome, right? That is awesome. We don't want to do anything to stop that. But there are other applications of technology that we didn't anticipate, that aren't making our world a better place, and that we aren't able to govern.

[08:28] So I want to continue to explore this a little bit, and I want to work through an example with you: the example of the internet. I remember when the internet was first conceived of and introduced. Right? It was developed by DARPA, the defense department, but rapidly went out into the public, and the promise was this amazing notion. I still remember how I felt about it at the time: that we would be able to instantly communicate with people all over the world, and any information you wanted would be at your fingertips, coming right up on your computer. Really? Really? Wow, that's amazing. Look what's happened: that promise has been realized. It's amazing, right? I think the promise of the internet has really changed our lives forever. There's no going back. I don't know about you, but I'm really cranky when my Wi-Fi is down for just a little bit, right? We expect it now; it's part of life. But when the internet was first introduced, I don't think we really envisioned this future world of cyber, or cyber warfare, or the fact that the defense department would one day need a cyber command whose whole mission is to operate in cyberspace and defend it, and defend our networks. So that's an unanticipated consequence of the internet. The internet has introduced a whole set of vulnerabilities, and those vulnerabilities are also impacting our lives.

[09:56] Take, for example, this group of people that worked at Sony. So you're at Sony, you come to work, you turn on your computer, you expect your calendar and your email to come up, and instead this is what you see. That happened to every employee at Sony in November, two years ago. That was happening right before Sony was due to release a parody, a movie about the North Korean leader Kim Jong-un. It was a terrible movie, okay, terrible. Their biggest concern at Sony before this happened was how big a flop it was going to be. But North Korea didn't think it was so funny. They didn't understand that it might be a flop; they didn't care. And this is what happened to Sony: they were hacked, information was stolen, embarrassing information, and they got this threat, and multiple threats, over the next several weeks, and it had a very devastating effect on Sony and the people who work there. Or perhaps consider that you're one of the 21.9 million people, I bet some of you here are, I certainly am, who had their personal identifying information stolen out of the Office of Personnel Management's database. Stolen by a hacker, they think, in China. Well, what's that hacker going to do with my data? I don't know. You know, it's unnerving. It's really unnerving. And then take a much more recent example, where we have a nation hacking into our political process, the very foundation of our government. A very big deal indeed. So that's where we've been. That's the story of the internet: great promise, all achieved, even more so, but it also simultaneously introduced some vulnerabilities that we hadn't envisioned, hadn't counted on, and hadn't prepared for.

[11:58] All right, so let's stop and ask ourselves, going forward: what's next? Where are we now, and what do we see going forward, where maybe we can anticipate a little better, do a little bit better job? So I want to talk a little bit about the Internet of Things. Now, the Internet of Things is already here in some ways, right? You can already turn the lights on in your home before you get there so it's nice and friendly when you arrive, turn the heat up so it's warm and toasty, get your coffee done in the morning, all from just a click on your smartphone. Wonderful things, all aimed at making our lives easier, making our lives better. Internet of Things manufacturers are now preparing to include something like this little guy in our homes. This little guy is going to be your friendly household servant. He's going to help wake you up in the morning, he's going to get the kids ready for school, he's going to get breakfast ready, he's going to have your favorite news program on your radio. Who wouldn't want him? I want him, right? My life is hard. I really would love to have some help. I think we all feel the press of time today, and this is a very appealing concept. It would make our lives better. Or would it? Who decides the cyber protection in our friend the robot? Who decides? If you're sitting out there thinking that there's some government policy somewhere that's going to make this guy safe, I'm afraid that you're wrong. There is no policy governing it. There is no sense of what's the appropriate cyber protection to have in this robot before it goes to market. There's nobody deciding when it's ready to go to market. This robot is going to come to market, and we're going to decide to buy it. And so who decides? We do, and the company that builds it.

[13:51] Okay, so I am concerned about this, and I wish I had an answer, because I really don't think that policymakers are going to be able to keep up with this explosion of technology. And you know, this has been the study of academics for a long time: the morality, ethics, and legality of new technologies and the introduction of technology. And I want to suggest to you today that the time for study is over. These things are here today. This technology is exploding today. Now, you may think that that robot is way far off and we have a lot of time to get it right, but the manufacturers are predicting that they'll be ready for market in less than two years. Now, you may be skeptical. I can tell you I'm skeptical, but I'm a terrible predictor of these things. Two years ago I said no way are we going to have autonomous cars on the road any time soon, right? It's going to take a long time. Well, if you're driving on the Beltway today, there's a very good chance that there's an autonomous vehicle driving somewhere in your midst right now, today.

[14:57] So, you know, autonomous cars are another really interesting thing, right? Because autonomous cars are also going to have to make some difficult choices. What if an autonomous car gets into a situation where it has to choose whether to protect the occupants of the car or the people outside the car? Does the car drive into the pedestrians or the brick wall? It seems to me that's a little bit like trying to choose between a combatant and a civilian. Yet the cars are on the road now. And what about their hackability? If you don't think you can hack these cars, I encourage you to Google "Jeep hack." Many of you have, I can hear that. And that's not even an autonomous car, right? He's just driving down the road, his radio goes crazy, his air conditioning goes crazy, but then his transmission is cut and he coasts to a stop in the midst of traffic. A very dangerous situation.

[16:00] So these things are with us now, and I think there's a sense of urgency. We have to start really grappling with this problem, and I don't think policymakers are going to solve it. So I think that maybe it calls for some new form of corporate social responsibility. I think companies need to take this on. It's a little different version of it, but you could see how it might apply, where companies think about the vulnerabilities of their technology as well as the promise and the coolness of it. And perhaps we could get companies to adopt something we do in the defense department called red teaming. When the defense department comes up with new plans or new technologies that we want to introduce, we bring people in who are very imaginative and expert on what adversaries might do, and we say, okay, have at it, give us your all, use your imaginations, take our plans apart, tell us how this technology can be turned against us. And they do that, and that gives us a chance to fix our plans and to change the way we use the technology, so we have a much better chance of success when we need to use these things. Perhaps we could red team new technologies before they go on the street. We could get somebody to use their imagination to ask how that robot might end up being used against us instead of helping us make our lives better.

[17:20] Now, in order for a new form of corporate social responsibility to work, there has to be a reward system, right? They have to invest in more protection; it's going to take them more time, cost a little more money. So the reward is that you preferentially buy the products that come with security. That means consumers need to be educated, and they need to ask the hard questions before they just buy the device. And so that means the answer to "who decides should" is us. I really think all of us: we have to decide should, corporations need to decide should, and consumers need to decide should.

[17:59] Now listen, I don't want to be Debbie Downer. I love technology, okay? I work at the Applied Physics Lab. I am surrounded by cool technology every day, and I love to buy it. But I want a world where the technology is taking us to that great place that makes our lives better, and I want to avoid a world where we're surprised by unintended events and vulnerabilities that we just didn't take a moment to anticipate. But I'm confident that if we all pull together and we start asking the hard questions, we can drive our world in that good direction, not just for us today but for generations to come. Thank you.


Related tags
technology ethics, autonomous systems, AI policy, cybersecurity, military tech, consumer responsibility, corporate responsibility, innovation risks, future tech, internet of things