Lester Holt interviews OpenAI's Sam Altman and Airbnb's Brian Chesky

NBC News
26 Jun 2024 · 38:15

Summary

TLDR: In this interview, Sam Altman and Brian Chesky discuss the current state and future of artificial intelligence (AI). They explore how AI is becoming part of everyday life and affecting every industry, and touch on the moral responsibilities of AI development, questions around data use, and the social and economic changes AI may bring. They also discuss AI's role in politics, policymaking, and elections, and how to ensure the technology remains safe and benefits society.

Takeaways

  • 🧑‍🤝‍🧑 The two guests are friends and have collaborated on some important projects.
  • 🤖 Most people today interact with AI in some way, though they may not always realize it.
  • 🚀 AI is advancing rapidly, has already crossed a key threshold, and will become even more deeply woven into daily life.
  • 🔍 AI is being applied in healthcare — for example, Color Health uses it for cancer screening and treatment planning.
  • 💡 The future of AI may not be a single "singularity" but a series of gradual capability gains.
  • 🔑 Tech companies, including Airbnb, are entering the AI space by acquiring AI startups or forming partnerships.
  • 🌐 Like the internet, AI will become part of the infrastructure of nearly every company.
  • 🏆 Sam Altman believes AI may be the biggest thing of his career, because it will affect people's lives more profoundly than any previous technology.
  • 🤖 AI development raises questions of moral responsibility; the technology must be built safely and responsibly.
  • 🌍 As AI advances, geopolitical competition and cooperation may emerge, requiring a global framework to keep the technology healthy and safe.
  • 💡 Realizing AI's benefits while avoiding its potential harms will take joint effort from society, governments, and companies.

Q & A

  • What examples of AI applications did Sam and Brian mention?

    -They mentioned using AI for cancer screening and treatment planning, and future versions that might help discover cures for cancer. They also mentioned Airbnb using AI to understand users better and deliver personalized travel and living experiences.

  • Has AI development reached a critical threshold?

    -Sam believes AI has crossed a threshold, but notes there will be many more as systems gain new capabilities and better performance.

  • How do Sam and Brian view relationships with other tech companies, such as the Apple partnership and Elon Musk's reaction?

    -Sam did not anticipate Elon Musk's negative reaction, while Brian considers it simply a personal reaction from Elon, not representative of how other tech companies view OpenAI.

  • Why does Brian think every tech company needs AI?

    -Brian believes AI will be embedded in everything we do, just like the internet, so every company will need it, whether through partnerships or its own development plans.

  • What social changes from AI did Sam and Brian discuss?

    -They discussed the enormous social changes AI could bring, including productivity gains, changes to how people live, and far-reaching effects on education, healthcare, the arts, and more.

  • What impact did Sam's firing from the company he founded have on the AI industry?

    -The firing drew widespread attention and discussion across the industry, reflecting the challenges of moral responsibility, oversight, and governance that arise as AI develops.

  • How is safety and ethics ensured as AI develops?

    -Sam said his company is very careful when releasing systems, ensuring their robustness and safety, and he stressed the importance of communication and cooperation with governments, policymakers, and other stakeholders.

  • How do Sam and Brian view AI's role in the upcoming presidential election?

    -They believe AI will be a major technological element of this election; accurate voting information must be ensured, abuses such as deepfakes prevented, and vigilance maintained against new forms of misuse.

  • Will AI development run into legal and ethical issues around data use?

    -Sam raised questions about the fairness and legality of data use, and said they are considering new economic models so that everyone who creates data and knowledge can participate.

  • How do Sam and Brian see the future of artificial general intelligence (AGI)?

    -Sam believes AGI will not arrive in a single defining moment but through a series of gradual capability gains. They emphasized the importance of transparent communication with society about each step of the technology's development.

  • How do Sam and Brian think AI will affect creative and artistic fields?

    -They see AI as a powerful tool for artists and creative workers rather than a replacement — one that can assist creation, improve efficiency, and open new forms of artistic expression.

Outlines

00:00

🤖 How widespread AI is, and where it's headed

This part of the conversation discusses how widespread AI has become and how it affects daily life. Tools like ChatGPT come up; AI has already seeped into many services even when people don't realize it. Sam believes AI has crossed a key threshold and that more breakthroughs are coming. Healthcare applications are mentioned, such as cancer screening and treatment planning, with the prediction that future AI may help discover cures for cancer. The segment also covers how AI is being integrated into all kinds of services and raising their quality.

05:01

🛠️ Responsibility and ethics in technology

This segment focuses on the responsibility and ethics of technology, especially AI. It covers public distrust of AI and the moral responsibility behind its development. Sam shares his conversations with Brian, stressing that technology should be seen as a tool, not something that controls us. It also touches on Sam's firing from his own company and his views on AI development, including the need for regulation and public concerns about AI's future.

10:01

📰 AI controversies in society

This part discusses controversies around AI, including the problems raised by AI voices that resemble celebrities and the abuse of deepfakes. It covers the harm AI can do to individuals and society, and the stance the industry needs to take to prevent such abuse. AI's role in elections and preventing the spread of disinformation also come up.

15:01

🏛️ AI and policymaking

This segment discusses the relationship between AI and policymaking. Sam and Brian share their views on AI development and how policymakers shape it. They discuss AI's possible impact on elections and how different businesses may be affected by the outcome. They also stress the importance of a global framework for cooperatively managing AI and of keeping the technology from outpacing society's control.

20:03

🧠 Training data and knowledge creation

This part discusses AI's need for training data and how to use that data fairly. Training AI models requires vast amounts of data, and the sourcing and use of that data get a close look. Sam explains the company's legal position on data use and how they think about the rights of data creators. The segment also touches on how AI may change the way people use the internet and its economic models.

25:03

🚀 AI development and teaching values

This segment explores AI development, especially how to teach AI systems human values. Sam explains how they try to embed particular values into AI models, where those values come from, and how society can participate in the process. It also covers the sweeping changes AI may bring and how to ensure they are positive.

30:04

🌐 AI and global cooperation

This part discusses AI's global impact and how countries can cooperate to manage the technology. It covers the idea of a transnational organization or body to build global consensus on AI and prevent abuse. It also discusses AI's potential economic impact and how innovation could drive global GDP growth.

35:05

🛑 Self-regulation and planning ahead

This segment discusses self-regulation in AI development and balancing innovation against social impact. Sam and Brian share their outlook on AI's future, including possible breakthroughs and their effects on society. They also mention the self-restraint that may be needed along the way, and how to keep development within society's control.

🌟 Broad applications and hopes for AI

The final part of the conversation covers AI's broad range of applications and expectations for the next five years. Sam shares the positive feedback he receives about how people use AI tools and his optimism that AI will keep helping people achieve more. Potential applications in education, the arts, and scientific research come up, along with how these technologies could help solve social problems.

Keywords

💡Artificial intelligence (AI)

Artificial intelligence (AI) refers to intelligence exhibited by man-made systems — the ability to perform tasks that normally require human intelligence, such as language understanding, learning, reasoning, and perception. In the video, AI is the core of the whole discussion: how it fits into our daily lives and services, and the profound effects it may have on future society. For example, people may have interacted with AI without realizing it, such as through the chatbot ChatGPT.

💡Artificial general intelligence (AGI)

Artificial general intelligence (AGI) refers to an AI system that can perform any intellectual task as well as a human. The video notes that AGI was once treated as a vague, distant goal, but AI progress is now seen as a continuous process of crossing many thresholds rather than a single revolutionary moment.

💡Deep learning

Deep learning is a machine learning technique that uses neural network structures loosely modeled on the human brain to learn complex patterns. In the video, deep learning is one of the key technologies behind AI's capability gains — for example, training large models like GPT to understand and generate natural language.

💡Moral responsibility

Moral responsibility refers to the ethical questions that developers and users must weigh when building and applying technology. The video discusses the moral responsibilities AI developers face, including keeping the technology safe, preventing abuse, and working with governments and regulators to craft appropriate policies and rules.

💡Technological thresholds

A technological threshold is a key point or stage in a technology's development where its capabilities improve markedly. The video notes that AI has already crossed some key thresholds — such as understanding and responding to natural language — with many more still to come.

💡Data privacy

Data privacy concerns protecting personal data and ensuring it is not accessed or misused without authorization. In the discussion, handling training data, protecting privacy, and ensuring fair data use become major issues as AI develops.

💡Deepfake

A deepfake is a fake image or video generated with AI that looks highly realistic. The video mentions the problems deepfakes can cause — such as fake pornographic content or election interference — which industry and government need to act to curb.

💡Regulation

Regulation refers to the oversight and management that governments or regulatory bodies impose on an industry or technology. The video discusses the need for appropriate AI regulation to ensure safety, prevent abuse, and promote fair competition and innovation.

💡Technological innovation

Technological innovation means developing new products, services, or processes, or improving existing ones. The video notes that AI is a key area of innovation, driving change across industries such as healthcare, education, and the arts.

💡Values

Values are the moral standards and principles individuals or societies consider important. In the context of AI, this means encoding human moral and ethical standards into AI systems so that their behavior matches social expectations. The video discusses the importance of values in AI development and how to teach AI positive values.

💡Technological development

Technological development is the ongoing process of technical progress and innovation. The video repeatedly emphasizes that AI development is not a race but a gradual, continuous evolution that society, governments, and companies must shape together.

Highlights

Sam and Brian discuss how widespread artificial intelligence (AI) has become in daily life, mentioning tools like ChatGPT.

AI is advancing rapidly and may already have crossed a key threshold, with more breakthroughs ahead.

Sam mentions AI applications in healthcare, such as Color Health using AI for cancer screening and treatment planning.

They discuss the ethical and responsibility questions AI development raises, and the public's concerns about them.

Brian stresses that AI will affect people's lives more profoundly than any previous technology.

Sam and Brian discuss the partnership with Apple and Elon Musk's reaction to it.

Airbnb's acquisition of an AI startup signals that every tech company will need to own or partner on AI.

They discuss AI applications across industries, such as Airbnb using AI to better match users' needs.

Sam shares his experience at OpenAI, including being fired and returning to the company.

They discuss AI's role in politics and society, and its possible effect on elections.

Sam and Brian discuss AI's appetite for data, and questions of fairness and legality in data use.

They discuss the problems AI may bring, such as the potential harm of deepfakes to individuals and society.

Sam mentions AI's potential to raise productivity and create enormous economic value.

They discuss the question of values in AI development and how to teach AI positive values.

Sam and Brian share their views on AI's future and its positive impact on society.

They discuss AI's potential in education, the arts, and scientific research.

Sam shares his hopes for the next five years of AI — that it will keep bringing people help and joy.

Transcripts

play00:00

[Music]

play00:00

[Applause]

play00:00

[Music]

play00:03

[Applause]

play00:07

well you guys get all the Applause I've

play00:09

invented nothing Zippo great to see you

play00:12

guys welcome thanks everybody for being

play00:14

here very excited about this

play00:16

conversation uh we'll set this up by

play00:18

letting folks know you guys are friends

play00:20

you have your your work is kind of you

play00:23

know worked in together on some

play00:25

important projects and some important

play00:27

things so we're going to get into some

play00:28

of that as well but you're wondering why

play00:29

the two them here that's why thank you

play00:31

so much for your time let me get um

play00:33

let's start off with kind of a

play00:34

perspective Sam what percentage of this

play00:37

audience do you think has in some way

play00:39

interacted with AI

play00:41

today I I would bet most

play00:45

uh I'm not going to hold you to it by

play00:47

the way in the

play00:50

90s it's a and most of us don't know

play00:52

where it's affecting our lives yeah you

play00:54

know there there are people who use chat

play00:56

GPT and you kind of know when you're

play00:58

using that or not but the number of

play00:59

people are integrating AI into all of

play01:01

their other services and taking our GPT-4

play01:04

and other models that we have and you

play01:06

know it's sort of

play01:08

like lifting a lot of services up has AI

play01:11

crossed a a critical threshold in the

play01:14

past

play01:16

year I think

play01:18

that yes but I think there will be many

play01:21

thresholds that AI uh crosses you know

play01:24

we used to Brian actually gave me great

play01:26

advice about this we used to talk about

play01:28

we're going to get to this like moment

play01:29

of AGI and you know it was this very ill-

play01:32

defined term and I think it never made

play01:34

sense to think about it that way in the

play01:35

first place but we used to and now we

play01:38

think about it is it'll just be this

play01:39

series of thresholds uh where the

play01:41

systems will get new and new cap better

play01:43

and better capabilities so you know you

play01:45

can use chat GPT today for some things

play01:48

and you'll be able to use it for much

play01:50

more helpful tasks in the future um you

play01:53

know maybe today there are things like

play01:55

okay uh like for example one of our um

play01:58

one of our partners Color Health is now

play02:00

using uh GPT-4 for cancer screening and

play02:04

treatment plans and that's great and

play02:05

then maybe a future version will help uh

play02:08

discover cures for cancer so I think of

play02:11

it as a succession of thresholds but

play02:13

definitely the fact that we can talk to

play02:15

computers in natural language and have

play02:17

them understand us and help us that's

play02:19

certainly been a threshold I want to

play02:21

talk about some things that we've seen

play02:22

in the news lately and get your reaction

play02:24

to it um at times you have both made

play02:26

friends and enemies fairly quickly you

play02:28

struck a big deal with Apple recently um

play02:32

Elon Musk was upset and said he wouldn't

play02:34

allow Apple products at his companies

play02:36

did you see that reaction coming uh well

play02:40

I saw it happen but no I didn't I I I

play02:44

didn't I sort of doubt it will actually

play02:46

happen um but I didn't predict that are

play02:49

are you does it represent something

play02:52

that's happening on the outskirts of

play02:53

open AI in terms of reaction from other

play02:56

tech

play02:57

companies uh no I think that's just like

play03:01

an Elon

play03:04

reaction and Brian let me turn to you

play03:07

Airbnb recently picked up an AI startup

play03:10

are we at a point now that every tech

play03:12

company is going to have to have a piece

play03:13

of this action a partnership or uh its

play03:16

own development plans yeah I mean I

play03:18

think that just like now every company

play03:21

almost in the world is on the internet

play03:23

AI is just going to be completely

play03:25

embedded in everything that we do and I

play03:27

think that one of the things that's

play03:29

incredible Sam is like Sam used to say

play03:31

you have to be if you want to be a great

play03:33

entrepreneur you have to be right about

play03:35

one big thing in your career and I think

play03:38

that Sam was right about one of the

play03:40

biggest things in the history of tech

play03:42

because this is going to be something

play03:43

that's going to affect people's lives

play03:45

more than any technology that we've ever

play03:47

seen in the past but I think a lot of

play03:49

the conversation you know we're talking

play03:50

about AI as this like existential

play03:52

enigmatic thing and I think one of the

play03:55

things we're missing is just talking

play03:56

about the practical ways that people can

play03:58

benefit their lives I can give you an

play04:00

example Airbnb but Sam has a lot of

play04:01

examples so today Airbnb is a way you

play04:04

like type in a city and you find a home

play04:06

and you book a home and that's Airbnb

play04:08

and it's pretty much the way that the

play04:09

internet's worked for the last 20 years

play04:12

but imagine in the future um systems

play04:14

that understand you better that's the

play04:16

real promise a computer that can

play04:18

understand you and can ask you like well

play04:20

who are you Lester like what are your

play04:22

hopes what are your dreams like where do

play04:24

you want to travel what do you one day

play04:25

want to do with your life and then it

play04:27

could actually understand you and be

play04:28

more of a Matchmaker really understand

play04:30

you and match you to people communities

play04:34

Services experiences anything you want

play04:36

to be able to travel and live anywhere

play04:38

in the world and that's kind of how I

play04:39

think Airbnb can use it but I think almost

play04:42

every industry can get remade with AI

play04:45

and I think they can participate but the

play04:46

stakes are higher here than I mean what

play04:48

what you talk about is largely

play04:50

aspirational but with AI you're looking

play04:52

at some real fears that I think we all

play04:55

here understand so what does that mean

play04:57

in terms of the people who are running

play04:59

this most most of us are just passengers

play05:00

on this bus we're watching you guys you

play05:03

know do these incredible things you know

play05:05

talk about it being compared to the

play05:06

Manhattan Project and wondering where is

play05:08

this going and wondering who are the

play05:10

people behind it can we trust these

play05:12

people so talk if you can about the

play05:15

moral

play05:16

responsibility um and and for all of us

play05:18

to know these people know people like

play05:20

you who are making these

play05:22

changes yeah I mean I I can share um I I

play05:27

me I I met Sam in 2008 and when I came

play05:30

to Silicon Valley the word technology

play05:32

might as well have been like a

play05:33

dictionary definition for the word good

play05:35

I mean Facebook is a way to share photos

play05:37

of your friends YouTube was like cat

play05:38

videos Twitter was like talking about

play05:40

like what you're doing today and I

play05:42

think there was this General innocence

play05:44

and I think over time what we realize is

play05:46

when you take a tool and I think

play05:49

technology is a tool you know Steve Jobs

play05:51

one of the things he said is he put a

play05:53

handle in the back of every computer cuz

play05:54

he said never trust a computer you can't

play05:56

throw out the window he said these are

play05:58

tools and we're meant to dominate them

play06:00

they don't dominate us and I think one

play06:02

of the things that happen though is when

play06:04

you put a tool in the hands of hundreds

play06:06

of millions of people you know they're

play06:08

going to use it for ways you didn't

play06:10

intend and I think we are much more

play06:12

sober and realistic in this new

play06:15

generation because I think we learned a

play06:17

lot of the lessons of the last

play06:18

generation we learned about how

play06:19

technology can be used mostly for good

play06:21

but there's always unintended

play06:22

consequences and so I think this time

play06:25

one of the things I've seen Sam do is

play06:26

he's been very cautious not Pollyannaish at

play06:29

all about where this technology is going

play06:31

and and really telling governments there

play06:33

actually is a need for regulation Sam I

play06:36

want to get your your take and give you

play06:38

a chance to talk about your firing you

play06:41

were you were fired from your own

play06:42

company

play06:44

why let me first touch on something that

play06:48

Brian said in with your earlier question

play06:49

and then I will very happy to talk about

play06:51

that

play06:52

um

play06:55

I this is going to be a huge change in

play06:59

society uh I think unlike other

play07:02

technological Trends um we're sort of

play07:04

we're aware even if today we're like

play07:06

okay ChatGPT is this like very helpful

play07:08

tool and it's you know once I use it I'm

play07:10

not scared of it um there is a sense

play07:15

of super understandable anxiety about

play07:19

where this is going to go what does it

play07:21

mean if these tools keep getting more

play07:23

capable at the rate they've been getting

play07:25

capable at and there's tons of wonderful

play07:27

things and we could talk about those all

play07:29

day but there is this what is the future

play07:32

going to look like even if we solve

play07:34

every safety problem even if we solve

play07:37

every um you know misuse problem even if

play07:40

we figure out the perfect regulatory

play07:42

regime like what are what are our lives

play07:44

going to be like when it's not just like

play07:46

the computer understands us and gets to

play07:48

know us and helps us and do these things

play07:50

but we can say like hey computer like

play07:53

discover all of physics and it can go

play07:55

off and do that um what does it mean

play07:57

when we can say like hey start and run a

play07:59

great company you can go off and do that

play08:02

so that's a big change uh that's a lot

play08:06

of trust that we have to earn to be some

play08:09

of the stewards there will be many other

play08:10

people working on this of this

play08:12

technology and we're we're proud of our

play08:15

track record uh I think if you look at

play08:17

the systems that we've put out and the

play08:19

time and Care we've taken we've been

play08:20

able to get them to a level of generally

play08:23

accepted robustness and safety that is

play08:25

well beyond what what people thought we

play08:27

were going to be able to do when we got

play08:28

to these initial systems a few years ago

play08:31

like when you looked at gpt2 or gpt3 and

play08:33

said are we going to be able to make

play08:34

this safe enough to use a lot of people

play08:37

thought no but but there's this thing in

play08:40

there's this the future is like looming

play08:42

large and we've got to continue to earn

play08:45

the trust with what we do the systems we

play08:47

put in the world um and how we how we

play08:51

have legitimate decision-making over

play08:52

these systems how we broadly Empower

play08:54

people with them how we continue to

play08:56

promote stability in the world in the

play08:57

face of all this change um and it makes

play09:00

people very anxious uh and the whole

play09:03

like the whole board firing me and

play09:04

coming back thing I mean Brian was an

play09:06

enormous help during that uh it was

play09:09

obviously a super painful experience but

play09:12

I do understand why anxiety levels have

play09:16

been so are so high uh I and I think the

play09:20

previous board members like they're

play09:24

nervous about the continued development

play09:26

of AI uh had whatever feelings they had

play09:29

about

play09:29

me and how we were doing things and

play09:32

although I super strongly disagree with

play09:35

what what they think things they've said

play09:37

since how how they acted uh I think

play09:39

there are like fundamentally good people

play09:42

who are nervous about the future and

play09:44

trying to figure out how we get to a

play09:47

good outcome um I'm super excited with

play09:50

the new board they're extremely uh

play09:53

constructive and helpful and experienced

play09:55

and strong and it's been a very

play09:56

productive thing since then but that was

play09:58

a horrible experience to go through not

play10:00

not during the moment where it was just

play10:02

like this is a crazy thing let's figure

play10:05

out how to undo it and Brian was like

play10:07

unbelievably helpful but then the period

play10:08

after that uh where I just had to like

play10:12

kind of pick up the pieces in this like

play10:13

state of emotional shock that was that

play10:15

was really bad you were trying to pick

play10:17

up the pieces you were picking up the

play10:18

phone Brian yes explain that well I

play10:21

remember

play10:23

um so maybe just to go back in time um

play10:27

when chat GPT launched and launched in

play10:29

late November

play10:32

2022 it was a phenomenon unlike anything

play10:35

we'd seen probably since the launch of

play10:36

the iPhone I have no recollection

play10:38

anything and I we knew overnight

play10:40

everything was going to change and I

play10:42

remember meeting with Sam and I said you

play10:44

know I've been through a little bit of

play10:45

this rocket ship before and I'm not

play10:47

going to advise you on the core research

play10:49

of AI but when it comes to like

play10:51

marketing and like stakeholder

play10:53

management and PR and like design and

play10:55

product and everything that's not that

play10:56

you're going to go on a rocket ship and

play10:58

I'm only where I am today because people

play11:01

believed in me and people helped me and

play11:03

one of the great things about Silicon

play11:05

Valley is it's a high trust place where

play11:06

people will help so I just wanted to be

play11:08

helpful to him so this goes on for about

play11:10

a year it's one year later and I get a

play11:13

text message and it's actually from

play11:17

somebody else saying Sam was fired from

play11:19

OpenAI I was like fired and I

play11:22

immediately texted him and I think his

play11:25

text back to me like was 5 minutes later

play11:27

he had just found out he was fired

play11:30

and he said so brutal and I go what

play11:33

happened so we get on the phone and he

play11:34

doesn't know what happened it wasn't

play11:36

fully explained to him and by the way

play11:38

his co-founder who was also on the board

play11:40

was removed from the board and that

play11:42

seemed to me very suspicious so I got on

play11:45

the phone with him and Greg and I felt

play11:47

really comfortable with the

play11:49

circumstances that this was not a fair

play11:51

process and I think this should always

play11:53

be a fair process but especially if

play11:54

they're Founders because they're very

play11:56

very difficult to replace and what I

play11:58

noticed in those first 24 hours was not

play12:01

a lot of people sticking up for Sam and

play12:03

I in my darkest times in my crisis have

play12:06

had people stick up for me and that's

play12:08

what I wanted to do for Sam and I

play12:10

basically we talk through things and I

play12:13

said I think the most important thing

play12:15

for you to do is just be completely

play12:17

transparent internally and externally

play12:19

with what you know and what's happening

play12:22

but the most remarkable thing and the

play12:23

thing that made me really want to defend

play12:25

him was you know you you learn a lot

play12:28

about people in a crisis if you really

play12:31

want to know what someone's like see

play12:33

them in a crisis and at no point in the

play12:36

5

play12:37

days this went down did Sam ever even

play12:41

for a second focus on self-preservation

play12:44

he was completely I I was like why

play12:47

aren't you sticking up for yourself like why

play12:48

don't you care more about yourself that's

play12:49

what I was saying to him like somebody's

play12:51

got to stick up for you you're not even

play12:52

sticking up for yourself and he just was

play12:54

so focused on the team and what was best

play12:56

for the team and I think that's what

play12:58

really made me

play12:59

you know so fiercely focused on

play13:02

helping I want to turn Sam if I can turn

play13:04

to the some of the bad publicity you've

play13:06

received lately including the dust up

play13:07

over the voice of Sky one thing that

play13:10

could help clear up the concern over the

play13:12

similarity of Sky's voice to Scarlett

play13:14

Johansson would be to hear from the

play13:16

actor who you say was hired to be the

play13:18

voice of sky is that something that you

play13:21

will

play13:22

do certainly if she wants to I mean I

play13:24

know she's made statements through her

play13:27

agent uh but I'm not I don't I don't

play13:30

know where I mean anything she wants to

play13:31

do would certainly be fine with us the

play13:33

whole thing opens up certainly a larger

play13:35

question of what do we own in an AI

play13:38

world uh do we have control over our

play13:40

likenesses we're seeing uh you know deep

play13:43

fake porn right now people's you know

play13:45

heads being swapped um these are harmful

play13:48

on an individual level how and I know

play13:51

it's not unique to open AI but how is

play13:53

the industry going to respond to

play13:56

this I mean we think the industry needs

play13:58

to take a super strong stance on that it

play14:00

is we obviously do uh and there are

play14:04

other issues related to how this

play14:06

technology is being used uh to harm

play14:09

people that we think the industry needs

play14:10

to take a very strong stance on um we

play14:12

try to be not only very loud in our

play14:14

calls for regulation to prevent some of

play14:17

these misuse cases these misuses which I

play14:19

think is happening but also to set a

play14:21

really good example in the products and

play14:23

services we offer and hold ourselves to

play14:25

a very high were these things inevitable

play14:27

I mean you you clearly saw the risk

play14:30

coming as this uh technology was

play14:33

maturing like deep fakes and stuff deep

play14:35

fakes yeah head face swapping yeah um it

play14:39

was inevitable that the technology was

play14:41

going to be capable of that and so you

play14:43

know of course there are going to be

play14:45

systems out there that allow that uh but

play14:48

that's where I think we society and

play14:51

governments have a role to say you know

play14:54

will allow

play14:56

some use cases of Technology we not not

play14:59

comfortable with but in some places we

play15:01

are going to draw a line and face

play15:03

swapping deep fake revenge porn is a

play15:05

great place to draw a line we're nearing

play15:08

a presidential election as you know

play15:10

we're seeing some of the deep fakes

play15:11

already happening there's been talk

play15:13

about this for years that this would be

play15:15

a very difficult election what are your

play15:17

thoughts as you begin to see this stuff

play15:19

kind to emerge and in terms of your

play15:21

responsibility your industry's

play15:23

responsibility to make sure that we're

play15:25

not being overwhelmed by disinformation

play15:28

yes so you know this will be I think the

play15:30

first election where there's not just

play15:32

the US many other elections this year

play15:34

where AI is like a major technological

play15:37

element provenance is really important

play15:40

accurate polling information and

play15:41

avoiding some of the issues we've seen

play15:44

with uh previous technological platforms

play15:46

and other election Cycles um and you

play15:50

know preventing things like deep fakes I

play15:52

I think those are three top of- mind

play15:54

issues for us uh in this election cycle

play15:57

I'll also add that there may be other

play16:00

things ways people try to misuse misuse

play16:02

this that we're not aware of yet um so

play16:05

we're we have like a whole monitoring

play16:07

efforts set up and uh I think we'll need

play16:09

a very tight feedback loop as we get

play16:11

closer to the election uh to see if

play16:14

there's additional areas where people

play16:16

are trying to abuse the technology while

play16:17

we're on the topic of the election Brian

play16:19

I'll let you start what what do you

play16:20

think will be the impact on your

play16:22

individual businesses in terms of the

play16:23

outcome of this

play16:26

election hard to say I mean Airbnb is

play16:30

kind of more of a cityby city

play16:31

state-by-state thing so the changes in

play16:34

um Federal administrations don't have

play16:36

not historically um had a huge impact on

play16:39

us and we're of course in 220

play16:42

countries so we're a pretty resilient

play16:44

business I mean one of the things we saw

play16:45

during the pandemic is when one part of

play16:47

our business changes it adapts to some

play16:49

other part so I don't anticipate a

play16:50

really big change based on who's who's

play16:52

who's elected how about you

play16:57

Sam I do

play16:59

expect some big impact based off who's

play17:02

elected but I don't know how to I I

play17:04

don't know what it'll be it it does seem

play17:06

to me like AI is going to be an

play17:08

increasingly important geopolitical

play17:11

priority in the world um but I'm you

play17:14

know I I hard for me to say exactly how

play17:16

it's going to go one of the

play17:18

things that I've really valued about

play17:20

Brian so Brian kind of like under sold

play17:23

what he mentioned earlier in that first

play17:24

year kind of like what he's done to help

play17:26

but when ChatGPT started taking off and

play17:28

everything just went crazy for me a lot

play17:30

of people reach out and say oh I'd love

play17:31

to help you I can do this I can do that

play17:33

and you know everyone's I think they

play17:35

mean it when they say it but everybody's

play17:36

just busy um Brian was like the person

play17:39

who would just sit down with me for like

play17:41

3 hours every other week and like give

play17:42

me a list and say Here's the five things

play17:44

you got to do now here's where you're

play17:45

behind here's what you're screwing up

play17:47

here's what you got to proactively do

play17:48

here's what you got to think about um

play17:51

and it's basically like almost always

play17:53

right and uh I learn to just like always

play17:56

shut up and follow the advice um

play17:59

one of the things that Brian started

play18:01

saying

play18:03

uh more recently uh is that you're

play18:07

probably not thinking enough about

play18:10

politics and policy and what that's

play18:12

going to mean for how the world thinks

play18:14

about Ai and here's the people you need

play18:16

to hire here's the here's what it means

play18:18

to like you know map this out and think

play18:21

about a strategy here here's what you

play18:23

should do and definitely not do and uh

play18:27

that's been like super helpful and do

play18:29

think for our business it's going to be

play18:30

really important and I think one of the

play18:31

things Lester is that you know I

play18:33

remember coming to silen Valley we

play18:35

didn't think these platforms would have

play18:37

the impact on society that they we now

play18:39

know they have and so I think the

play18:41

mindset that Sam has and even the

play18:43

questions you're asking him probably

play18:44

weren't asked of tech leaders 15 years

play18:46

ago I think the whole industry is

play18:48

changed the whole conversation is is

play18:50

like like Sam has built out much more of

play18:53

a team much earlier than the big tech

play18:54

companies would have around policy and

play18:56

stakeholder management I want to ask if

play18:58

about one of the things we've learned in

play19:00

your research and developing chat GPT

play19:02

and others is the requirement of data to

play19:05

train up these models it's an

play19:07

insatiable appetite as it as it appears

play19:10

has it changed how you view what is fair

play19:12

use and whose copyrighted material

play19:15

you can

play19:16

use first of all I don't think we know

play19:19

yet what the future of how these models

play19:22

get smart is going to look like you know

play19:24

is it that we just need more and more

play19:25

data

play19:26

forever doesn't feel to me like likely

play19:29

to be right you know if you think about

play19:30

what a human can learn from Reading one

play19:33

textbook it's very different than what

play19:35

it takes these AI models for now so I I

play19:39

expect and also there comes a point

play19:41

where to like invent new science you

play19:43

need to just sit there and think and run

play19:45

some experiments but it's not in any

play19:46

textbook because it's new so I I expect

play19:50

that the future of how we think about

play19:53

training data um and what it takes to

play19:56

make these models really capable is it

play19:58

going to be a roadblock though in the

play19:59

development of these products that's

play20:01

what I was trying to say I I you know

play20:03

this is like science we don't know for

play20:04

sure I think it won't be um now that

play20:08

said uh the issue of like fair use and

play20:11

how to think about how people who create

play20:14

data create knowledge create you know

play20:17

Wonderful

play20:18

books I think although like from a legal

play20:22

perspective we're confident in our fair

play20:23

use position now that we see where this

play20:26

may evolve um we need to figure figure

play20:28

out New Economic models where the whole

play20:31

world gets to participate and I think

play20:32

this goes beyond just people who have

play20:35

data that we train on but also uh and

play20:38

we've you know found many different ways

play20:39

to license it and do different things

play20:41

but also the people that provide the

play20:42

feedback to the models the people who

play20:44

like go off and create great realtime

play20:46

news that maybe the model doesn't train

play20:48

on but you want to display it um at the

play20:51

time and that there's a lot of work that

play20:52

goes into that uh and you know I I think

play20:57

maybe AI is going to not super

play21:00

significantly but somewhat significantly

play21:03

change the way people use the internet

play21:05

and if so you can see some of the

play21:06

economic models of the past needing to

play21:08

evolve uh and I think that's a broader

play21:10

conversation than just training data but

play21:12

it's sort of like content in general

play21:15

surfaced via AI I want to ask you about

play21:17

artificial general intelligence that's

play21:19

taking it up taking up the game

play21:21

considerably if I understand it

play21:22

correctly that's when you get to the

play21:24

point that the computers can do whatever

play21:26

we can do is that a fair summation you

play21:28

know that I I I think I was wrong to

play21:31

initially think about it as this one

play21:34

moment as we talked about but uh it does

play21:38

seem to me and now I think people use

play21:40

AGI to means all all sorts of things it

play21:44

it does seem to me that trying to sort

play21:47

of road map out for the world where we

play21:49

think the significant increases in

play21:51

capability will be um can do what you

play21:55

know people can do can create new

play21:57

science can what whole companies can do

play22:00

uh that feels like it'd be very useful

play22:02

for the industry to sort of agree on so

play22:04

that we could have these conversations

play22:06

in a little bit more of a disciplined

play22:07

way and that's one of the things we

play22:08

talked about is like just operating

play22:11

transparently letting people know that

play22:13

it's probably not this one promethian

play22:15

moment where it goes from AI to AGI that

play22:17

there's many many steps just like the

play22:19

story of technology and that it's really

play22:21

important that we bring Society along

play22:23

and that we're not operating in this

play22:24

black box and people think there's only

play22:25

a few people controlling the future that

play22:28

were transparent with other developers

play22:31

and computer scientists and researchers

play22:33

and policy makers about these are steps

play22:35

we're going this is what we're seeing

play22:37

and this is what we think the next four

play22:38

steps look like but isn't but isn't this

play22:40

a race on a different level the stakes

play22:42

are so high I mean are are you do

play22:44

consider yourself in a race and do you

play22:46

think it's one you'll win to get to the

play22:47

point of artificial general intelligence

play22:50

I don't think of it as a race. I understand why that's a very compelling, dramatic way to talk about it. I think there may be a race between nation-states at some point, but among the companies that are developing this now, I think everyone feels the stakes, the need to get this right. I also think, to Brian's point, that there's not this milestone we're all racing towards. It is this continual evolution of technology, where we melted sand and figured out how to turn it into transistors, and then figured out how to build an operating system and do a certain kind of programming, and we made it bigger and bigger, and then we figured out how to train these systems that are sort of smart in some ways. But they're not off running as these autonomous things; they're tools that we're using to do more than we could before, in the way that we used computers to do more than we could before without AI, and in the way we used machines in the Industrial Revolution to do more than we could before, and the way we used agriculture to be able to have time and space to do more things than we could before.

And I don't think it's this race to a milestone; it's this ongoing next step, and the next one, and the next one, and the tools are going to get better and better. Now, for sure, technology is not neutral, and tools are not inherently neutral things, but the impact we can have by building the tools is important, and we want to get that right. People are going to go use these tools to invent the future that we all collectively live in. What one person can already do now, compared to before ChatGPT existed, is an impressive leap, and by the time we get to GPT-6 or 7, what one person can do will be incredibly increased, and I'm very excited for that. I think that is the story of the world getting better: we make technology, people use it to build new things and express their creative ideas, and society improves.

Yeah. When you talk about these programs, though, and when you give them the ability to do what we do: we also have sets of values, different sets of values; we view common decency in a not-so-common way sometimes. How do you teach that to a computer in a way that won't be harmful? How do you teach values that are positive?

One of the things that has surprised me, and I don't want to say this solves the whole AI alignment problem, but at our current levels of systems, our ability to teach a model a certain set of values and to behave in a certain way is way better than I thought it was going to be at this point. Now there's a harder question, which is who gets to decide what those values are, who gets to decide what the defaults are, how much an individual user can customize them within those broad bounds. As an early step there, we put out this thing maybe a month or two ago called the spec, where we tried to say: here is our desired model behavior, here are the values we want our model to follow. That way people can at least tell if it's a bug or intentional when it does something that they don't like, and over time society can debate what those values are and we can adapt to it. So I'm very heartened by our technical progress on this topic, but man, writing that set of values, or getting society to debate and agree on what that set of values should be, that's a much harder challenge.

And Brian, as you've talked about, you've given Sam advice from time to time. I read somewhere, I don't have the exact quote in front of me, at least I can't find it right now, but something to the notion of go for it and figure it out later.

I don't... what is the quote?

It's the idea that you believe that you need to go for it when it comes to this kind of research. Are there brakes that should be put on?

Well, yeah. I mean, I think if you imagine you're in a car, the faster the car goes, the more you need to look ahead in front of you and anticipate the corners. And I think that we acknowledge that this technology is so, so powerful, and I think this is why we're being so thoughtful. People really are agonizing over how to treat these systems, and I do not remember us doing this in 2007, 2008, so I do think it's a very, very different time. One of the things that Sam and I talked about was bringing other stakeholders in early, and one of the things we did last year was he went on a tour around the world meeting with people. It was mostly, I think, a way to educate people and really get feedback. So I think the key point, Lester, is that we never go so fast that we leave society behind, that we only go as fast as to bring everyone along. And I think that if everyone could feel like they could participate and have their input into it, then I don't really think there's a huge thing to fear. The thing to fear is something we don't understand, that we're left out of, something that runs away from us that we can't control, and that's the future we don't want to live in. Also, it's quite interesting: if you say the word AI, it can be scary; if you say ChatGPT, it doesn't sound as scary, because it's a very tangible tool. So I think we need to also just focus on that which is in front of us and how we can help people. There are a lot of problems right now, and OpenAI can lead to a lot of scientific research and discovery. ChatGPT can be an incredible tool for artists. At Airbnb, we think it can really bring people together; we're living in this huge epidemic of loneliness, and we can use this to help bring people together. At the end of the day, it's not the technology, it's the people with the technology. It always comes down to the people, their values, and whether they're good people.

The way I sort of think about this is: we need to learn how to make safe technology, we need to figure out how to build safe products, and that includes an ongoing dialogue with society, both "hey, this has this impact I didn't expect or don't want" and "you're not letting me do this thing that's really important, for a reason you didn't understand." So there's the way that we talk to the broader world and the people that use and are impacted by our products, letting them reflect what they want. And then also a safe operating plan, which is that we get better and better at predicting capabilities. Research is of course an open question, you don't always know where it's going to go, but before we start training a new model we'd like to be able to say, here are the dangerous capabilities that we think could happen, and we have a preparedness framework to test them. Sometimes this takes a very long time. With GPT-4, for example, we had about eight months between when we finished training and when we released it, including lots of external consultation and red-teaming. Future models may take even longer. But it is very important to get the feedback from society. One thing that I don't think is good is to let a huge capability overhang build up while we haven't had that feedback loop with society, so we do need to figure out how to balance that. But yeah, taking the time to get it right is very important.

Are you ever inclined, or do you think you'd ever be inclined, to back up, to see the future and find it is maybe as scary as some people have suggested? Are you prepared to hit that moment where you have to take a step back, even as your competitors may want to move forward?

For sure. There are things that we have built and chosen not to release, or held back for long periods of time. There are plenty of other companies that would release things that we won't. We're not going to get every decision right, of course, and we also may at some point deploy something and need to take it back. But there will also be things that we just don't deploy.

We talk about these scary images. Did it help when you compared where you are with AI to the Manhattan Project, the race to build an atomic weapon? Was that helpful for you as you try to make your case?

I mean, we try to give a number of historical analogies, because we think it is important, we may be wrong, we may be right, but it's important for us to tell society what we believe the level of importance of this technology is. There's no perfect historical analog for any new technology, so we can say there were some things about the Manhattan Project that are like what we're doing now, some things about the Apollo program, some things about the iPhone, some things about that iMac with the handle, which I also really loved, some things about the internet, some things about the Industrial Revolution.

But what I think is important is to say, here are the parts where we can look to a historical analogy, and here are the parts where we can't. The shape of this technology, and the decisions and the impact, is fundamentally a little bit different from anything. I think it's different from the Manhattan Project: it's not a race, it's not going to be done in secret, and I think nations can collaborate together. There could be a transnational group or body that could really align to make sure we're all on the same page, which would be best for society and, frankly, probably best for entrepreneurs, so they don't have to comply with 200 different laws. We think that's super important, to get some sort of global framework and cooperation. I think we're really going to need that.

You mentioned nation-states. Is there a risk of nation-states taking this technology and using it in a dangerous way?

Absolutely. And I think you always have to be really, really careful about whose hands you're putting this technology in. It goes back to some of the things Sam's thinking about. One of the things I know they developed early on and chose not to release is voice cloning: there's technology already where you can basically capture someone's voice, but obviously that would be very, very dangerous, because you can imagine how it could compromise elections, and it's a major security risk. So I think one of the things is just thinking about whose hands these tools could end up in, and therefore, if you let the genie out of the bottle, could it get too dangerous, and so being very thoughtful about it.

Yeah. And Sam, according to one report, you speculated artificial general intelligence could accrue as much wealth as $100 trillion, wealth that you said you would then redistribute. Was that an accurate quote, and do you want to expand on it?

I think the sort of point I was trying to make was that I thought it could double the world's GDP, which feels reasonable to me and certainly would be in line with other technological revolutions. Yeah, we do think this is just going to be a massive driver of productivity, and already at this early stage we're seeing what people are doing with it to vastly improve products and services.

Do you understand how that would sound to a lot of people, though?

For sure, of course. But I think this is where historical analogies are helpful, and this is where it is helpful to look at the chart of world GDP over time. If world GDP can grow at, you know, 7% a year, which sounds hugely fast, but with a technological shift like this maybe is not that far away, I'm always bad at doing this in my head, but I think that's only about 10 years to double.
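Sam's mental arithmetic here checks out: compound growth at rate r doubles after ln(2)/ln(1+r) years, so 7% a year doubles GDP in roughly a decade. A minimal sketch of that calculation (mine, not anything shown in the interview):

```python
import math

def doubling_time(growth_rate: float) -> float:
    """Years for a quantity compounding at `growth_rate` per year to double."""
    return math.log(2) / math.log(1 + growth_rate)

# At 7% annual growth, world GDP would double in about 10 years.
print(round(doubling_time(0.07), 2))  # 10.24
```

This is the exact form of the "rule of 70" shortcut (70 / growth-rate-in-percent ≈ doubling time), which gives 70 / 7 = 10 years.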


So I think it is worth taking the potential of this technology to do enormous good very seriously, and I think we can now see more of what that looks like as people are adopting the tools.

Preview, if you will, for us: ChatGPT-5. What will the leaps in technology be, and does it put you on a straighter path to where you want to be?

Does it put us on a what path? I'm sorry, a what path?

Does it put you on a straighter path, in terms of your goals?

So, we don't know yet. We're optimistic, but we still have a lot of work to do on it. But I expect it to be a significant leap forward. A lot of the things that GPT-4 gets wrong: you know, it can't do much in the way of reasoning, sometimes it just sort of totally goes off the rails and makes a dumb mistake that even a six-year-old would never make. I expect it to be much, much better in those ways, and to be able to be used for a much wider variety of more helpful tasks.

And when it does go off the rails sometimes, is that a result, to go back to where we were, of the lack of data, or the shortage of data?

I think it's many things together. We're still just so early in developing such a complex system. There are data issues, there are algorithmic issues, the models are still quite small relative to what they will be someday, and we know they get predictably better. So I think it's more that there are many things we need to go improve, all of them, and we're still just so early in the technology. You know, the first iPhone was still pretty buggy, but it was good enough to be useful for people.

Yeah, I don't think things are going to change as much in the world in the next couple of years as people think. It's not linear: things are going to change slowly, and then probably all of a sudden. And I think everyone's still trying to figure out how to use this technology. If you take your phone, look at your home screen, and ask yourself, a year and a half after ChatGPT launched, how many apps are fundamentally different because of AI, very few of them are fundamentally different. So I think we're still in this world where we're developing a lot of the computation, with Nvidia; Sam and team are developing the models; and a lot of the change to society is going to happen when people build on top of those models, the applications. And there are so many uses for it. One of the big use cases we're talking about is scientific discovery: what this can do for drug research, for some of the biggest kinds of ills in society. There's a lot this can do with education; we think this can essentially give access to tutors to everyone around the world. Creative people: I know there's a lot of fear that artists can be replaced, but I think if artists participate, and I went to design school, this is a technology that they can use. So I think we can go down the list, and I think there are going to be a lot of really exciting opportunities in the next three to five years.

Where do you want to be in five years, Sam?

Further along the same path, you know, we'd like to... One of the most fun parts of the job is getting tons of email every day from people who are using the tools in these amazing ways. "I was able to diagnose this health problem that I'd had for years and couldn't figure out, and it was making my life miserable, and I just typed my symptoms into ChatGPT, got this idea, went to see a doctor, and I'm totally cured." Or, "I've been trying my whole life to learn these things and couldn't do it, and I got ChatGPT to be like a tutor for me." Or, "I'm three times as productive as a developer and I'm doing these amazing things," or "I'm a scientist using it." I love getting those things. I love how much people love ChatGPT, I really do. And five years from now, I just hope it's a lot more of that. I hope we have put this tool into the world that continues to delight people and lets them do more and be their best at whatever they're doing.

Well, we'll certainly be having more conversations like this down the road as you go down your path, but I want to thank both of you, Sam Altman, Brian Chesky, for taking the time and being with us here. A great conversation.

[Music]

Nice job. Thanks for watching. Stay updated about breaking news and top stories on the NBC News app, or follow us on social media.
