Elon Musk talks Twitter, Tesla and how his brain works — live at TED2022

TED
14 Apr 2022 · 54:45

Summary

TLDR: In this engaging video, Elon Musk discusses his vision for a sustainable future, focusing on Tesla and his views on artificial intelligence. He notes that Tesla is not just about building electric cars; what matters more is creating a system that can produce electric vehicles at scale. Musk also explores the challenges of autonomous driving and reveals the Optimus robot under development, predicting it will become Tesla's most important product. He also discusses his offer to acquire Twitter, emphasizing how much he values it as a platform for free speech, and shares his personal background and outlook on the future.

Takeaways

  • 🚗 Tesla's real achievement is not just building electric cars but creating a system that produces them efficiently at scale, which is the key to a sustainable future.
  • 🔮 Musk's predictions about AI progress show deep insight into where technology is heading, even though his timelines are not always accurate.
  • 🤖 The challenge of full self-driving is that it requires solving real-world AI, including sophisticated vision and understanding of complex environments.
  • 🌐 Musk argues that achieving full self-driving means building a system that mimics the human brain and eyes, which amounts to implementing real-world AI on computers.
  • 👥 Musk believes that solving self-driving will push real-world AI into other domains, such as Tesla's Optimus robot project.
  • 🏭 Tesla's manufacturing innovation is not only about producing cars; it also modernizes manufacturing itself through highly automated factories and production processes.
  • 💡 Musk's thinking about future technology and AI reflects a deep understanding of how technology will shape society and the economy.
  • 📈 Musk predicts that robotics will play a key role not only in manufacturing but also in daily life, providing personal assistance and services.
  • 🚀 For Musk, the ultimate purpose of developing AI is to advance human civilization, including better transport, higher productivity, and improved quality of life.
  • 🌍 His promotion of sustainable energy solutions such as electric vehicles and solar power reflects his commitment to solving global energy problems and reducing environmental impact.

Q & A

  • How does Elon Musk describe Tesla's concept of "the machine that builds the machine"?

    -Musk describes it as the key to a sustainable future: the point is not just making an electric car, but building a system that can produce electric cars at scale with a margin that funds further growth.

  • Why has full self-driving been so hard to achieve?

    -Musk points out that there have been many false dawns: you think you have a handle on the problem and then you hit a ceiling. Truly solving full self-driving requires solving real-world AI and sophisticated vision.

  • How does Tesla envision future home applications for its robots?

    -A home robot would understand the 3D architecture of the house, know where every object is, and be able to recognize all of those objects. Such robots could tidy up, make dinner, mow the lawn, or care for family members.

Outlines

00:00

🚀 Opening and a tour of Tesla's Gigafactory

The host introduces the upcoming live conversation with Elon Musk. Before it, he shares his visit to Tesla's giant Gigafactory in Austin, Texas: the evening before its official opening he was allowed to walk around it alone and saw what Musk calls "the machine that builds the machine", which Musk views as the key to a sustainable future. The host also mentions an epic interview he recorded with Musk and plays an excerpt focused on artificial intelligence, predictions of Tesla vehicle sales, the challenges of self-driving, and the outlook for the future.

05:03

🤖 The insight behind Tesla's Optimus robot

Musk discusses a key realization from Tesla's self-driving work: to make autonomy work, the company had to solve real-world AI. That realization laid the foundation for the Optimus robot, a machine meant to navigate the real world and do useful things without explicit instruction. Musk believes this will have a profound impact on manufacturing and home life, from dangerous, boring, repetitive jobs to household care and companionship. He also stresses the importance of AI regulation and of built-in safety features.

10:03

🌍 Musk's motivation for the Twitter offer

Musk explains why he made an offer to buy Twitter, emphasizing the importance of free speech and Twitter's role as a platform for public discourse. He proposes open-sourcing Twitter's algorithm to increase transparency and public trust, shares his views on handling controversial speech, and argues for maximizing free expression within the bounds of the law. He stresses that the acquisition is not about economics but about keeping Twitter a platform that supports democracy and free expression.

15:04

💬 Musk's proposed improvements to Twitter

Musk outlines the changes he would make to Twitter, including adding an edit button, eliminating spam and scam content, and reducing the influence of automated bots. He discusses building user trust by open-sourcing the algorithm and making any manual adjustments transparent, and explains how he weighs content moderation against free expression, preferring to let speech exist when in doubt.

20:04

🧠 Musk's personal challenges and Tesla's background

Musk shares the challenges he has faced in his personal and professional life, along with Tesla's origins and mission. He describes how Tesla overcame the difficulties of its early years to become a leading electric-vehicle manufacturer, and emphasizes his commitment to sustainable energy and his view of Tesla's role and influence in the coming energy transition.

Keywords

💡Twitter

Twitter is a social media and microblogging company. In the video it is described as "the de facto town square", a platform for open discussion and the exchange of ideas. Musk decided to make an offer for Twitter in order to make it a more inclusive and trusted platform.

💡Free speech

Free speech is the right of individuals to express their views and ideas. Musk stresses that Twitter should be a forum where people can speak freely within the bounds of the law, which he considers important to the functioning of democracy.

💡Algorithm

The algorithm is the software Twitter uses to decide how content is displayed and recommended. Musk suggests Twitter should open-source its algorithm so users can see how tweets are processed, promoted, or demoted.

💡Safety

Musk believes Twitter should work to become a safer and more trusted platform, with less abuse, fraud, and misinformation. This protects users and raises public trust in the platform.

💡Bans

A ban is when a platform stops an account from posting or using the service. Musk suggests Twitter should be very cautious with permanent bans, preferring timeouts or other milder penalties.

💡Bots

Bots are automated accounts, often used for spam or scams on social media platforms. Musk says eliminating these bot accounts should be one of Twitter's top priorities.

💡Tesla

Tesla is the electric-vehicle company Musk built. In the video he recounts the difficulties of Tesla's early production ramp and the effort it took to reach volume manufacturing, which gave him deep manufacturing experience.

💡Sustainability

Sustainability means meeting present needs without compromising the ability of future generations to meet theirs. Musk's mission is to accelerate a sustainable energy future, which requires a large-scale shift to renewables such as wind and solar, together with electric transport.

💡Futurism

Futurism here refers to the belief that technology and human reason can create a better world. Musk is convinced his work can help bring about an exciting future.

💡Consciousness

Musk believes that expanding the scope and scale of both biological and digital consciousness will let us better understand the nature of the universe, and this is his main driving philosophy.

Highlights

Elon Musk discusses the impact and future of Tesla's Gigafactory in Austin, Texas.

Musk shares insights on the challenges and advancements in Tesla's self-driving technology.

Musk emphasizes the importance of sustainable energy and Tesla's role in promoting electric cars.

Elon Musk's interview unveils his vision for artificial intelligence in Tesla's operations.

Musk reveals Tesla's shift towards AI and its significance in the future of autonomous vehicles.

Musk highlights the potential of Tesla's Optimus robot in revolutionizing manufacturing and domestic tasks.

Musk addresses the challenges in predicting advancements in technology and Tesla's strategies to overcome them.

Discussion on the importance of a transparent and inclusive platform for free speech in relation to Musk's Twitter acquisition.

Musk's proposal for an open-source algorithm for Twitter to enhance public trust and reduce manipulation.

Insights into Musk's personal philosophy on free speech and its implications for Twitter's content moderation.

Musk shares his motivation behind the Twitter offer, focusing on the platform's role in democracy and public discourse.

Elon Musk's candid reflection on Tesla's near-bankruptcy phase and the intense effort to stabilize the company.

Musk's personal journey with Asperger's syndrome and how it shaped his communication and perspective.

Musk's vision for humanity's future, emphasizing sustainability, technological advancement, and optimism.

The interview concludes with Musk's commitment to fighting for a positive future for humanity.

Transcripts

play00:02

hello

play00:04

so

play00:06

in just a few minutes um elon musk will

play00:08

be joining us here live on stage

play00:12

for a conversation uh

play00:14

rumor has it there are a few things to

play00:16

talk about with him

play00:18

um

play00:20

we we we will see

play00:22

but um before that

play00:24

i just want to show you something

play00:26

special

play00:29

i want you to come with me to

play00:31

tesla's huge gigafactory in austin texas

play00:35

so the day before it opened last week

play00:37

the evening before i was allowed to walk

play00:39

around it

play00:40

no one else there

play00:42

and what i saw there was honestly pretty

play00:45

mind-blowing

play00:46

this is elon musk's famous machine that

play00:49

builds the machine and his view the

play00:51

secret to a sustainable future is not

play00:54

just making an electric car

play00:55

it's making a system that churns out

play00:58

huge numbers of electric cars with a

play01:00

margin so that they can fund further

play01:03

growth

play01:04

when i was there um none of us knew

play01:07

whether elon would actually be able to

play01:08

make it here today so i took the chance

play01:10

to sit down with him and record an epic

play01:13

interview

play01:14

and i just want to show you

play01:16

a nine

play01:18

an eight minute excerpt of that

play01:20

interview so here from austin texas elon

play01:23

musk

play01:24

i want us to switch now to think a bit

play01:25

about artificial intelligence i i'm

play01:28

curious about your timelines and how you

play01:30

predict and how come some things are so

play01:32

amazingly on the money and some aren't

play01:34

so when it comes to predicting

play01:36

sales of tesla vehicles for example

play01:39

i mean you've kind of been amazing i

play01:40

think in 2014

play01:41

when tesla had sold that year 60 000

play01:45

cars you said 2020 i think we will do

play01:48

half a million a year yeah we did almost

play01:50

exactly half a million five years ago

play01:52

last time you came today we um i asked

play01:55

you about for self-driving and um you

play01:57

said yep this very year where i am

play02:00

confident that we will have a car going

play02:03

from la to new york uh without any

play02:06

intervention yeah i i don't want to blow

play02:08

your mind but i'm not always right

play02:11

um

play02:12

so talk what's the difference between

play02:13

those two why why

play02:15

why has full self-driving in particular

play02:17

been so hard to predict i mean the thing

play02:20

that really got me and i think it's

play02:21

going to get a lot of other people is

play02:23

that

play02:24

there are just so many false dawns with

play02:26

with self-driving um where you think you

play02:30

think you've got the problem

play02:32

have a handle on the problem and then it

play02:33

nope uh it turns out uh you just hit a

play02:37

ceiling um and and uh

play02:39

uh

play02:40

because what happened if you if you were

play02:42

to plot the progress

play02:44

the progress looks like a log curve so

play02:45

it's like yeah

play02:47

a series of log curves so

play02:49

uh most people don't know what a log curve is i

play02:50

suppose but the shape

play02:53

it goes it goes up sort of a you know

play02:55

sort of a fairly straight way and then

play02:57

it starts tailing off right and and and

play02:59

you start sort of

play03:00

getting diminishing returns you know in

play03:02

retrospect they seem obvious but uh in

play03:04

in order to solve uh full self-driving

play03:06

uh properly you actually just you have

play03:08

to solve real-world ai

play03:10

um you you you know because you said

play03:13

what are the road networks designed to

play03:15

to work with they're designed to work

play03:17

with a biological neural net our brains

play03:20

and with uh vision our eyes

play03:24

and so in order to make it work

play03:28

with computers you basically need

play03:31

to solve real world ai

play03:33

and vision

play03:35

because because we we need

play03:38

we need cameras

play03:40

and silicon neural nets

play03:42

uh in order to have to have

play03:45

self-driving work for a system that was

play03:46

designed for eyes and biological neural

play03:49

nets

play03:50

it you know when you i guess when you

play03:52

put it that way it's like quite obvious

play03:54

that the only way to solve full

play03:56

self-driving is to solve real-world

play03:58

ai and sophisticated vision what do you

play04:01

feel about the current architecture do

play04:02

you think you have an architecture now

play04:04

where where there is

play04:06

a chance for the logarithmic curve not

play04:08

to tail off any anytime

play04:10

soon

play04:11

well i mean

play04:13

admittedly these these uh may be an

play04:15

infamous uh last words but i i actually

play04:18

am confident that we will solve it this

play04:19

year

play04:20

uh that we will exceed uh

play04:22

you're like what the probability

play04:25

of an accident at what point should you

play04:27

exceed that of the average person right

play04:29

um i think we will exceed that this year

play04:31

we could be here

play04:32

talking again in a year it's like well

play04:34

yeah another year went by and it didn't

play04:36

happen but i think this i think this is

play04:37

the year is there an element that you

play04:39

actually deliberately

play04:40

make aggressive prediction timelines to

play04:44

drive people to be ambitious and without

play04:46

that nothing gets done

play04:48

so it's it feels like at some point in

play04:50

the last year

play04:52

seeing the progress on

play04:55

understanding you that you're that the

play04:57

ai the tesla ai understanding the world

play04:59

around it led to a kind of an aha moment

play05:02

in tesla because you really surprised

play05:03

people recently when you said

play05:06

probably the most important product

play05:08

development going on at tesla this year

play05:10

is this robot optimus yes

play05:13

is it something that happened in the

play05:15

development of full self-driving that

play05:16

gave you the confidence to say you know

play05:18

what we could do something special here

play05:21

yeah exactly so you know it took me a

play05:23

while to sort of realize that that in

play05:26

order to solve self-driving you really

play05:28

needed to solve real-world ai um

play05:31

at the point of which you solve

play05:32

real-world ai for a car which is really

play05:34

a robot on four wheels uh you can then

play05:37

generalize that to a robot on legs as

play05:40

well

play05:41

the thing that the things that are

play05:43

currently missing are uh

play05:45

enough intelligence enough to tell

play05:47

intelligence for the robot to navigate

play05:48

the real world and do useful things

play05:50

without being explicitly instructed it

play05:53

is so so the missing things are

play05:54

basically real world uh intelligence and

play05:57

uh scaling up manufacturing um those are

play06:00

two things that tesla is very good at

play06:02

and

play06:03

uh so then we basically just need to

play06:05

design the the uh specialized actuators

play06:08

and sensors that are needed for a

play06:09

humanoid robot

play06:11

people have no idea this is going to be

play06:13

bigger than the car

play06:16

um but so talk about i mean i think the

play06:18

first applications you you've mentioned

play06:20

are probably going to be manufacturing

play06:21

but eventually the vision is to to have

play06:23

these available for people at home

play06:25

correct if you had a robot that really

play06:28

understood the 3d architecture of your

play06:31

house and

play06:33

knew where every object in that house

play06:36

was or was supposed to be and could

play06:39

recognize all those objects i mean that

play06:42

that's kind of amazing isn't that like

play06:44

like that the kind of thing that you

play06:45

could ask a robot to do

play06:47

would be what like tidy up yeah um

play06:50

absolutely

play06:51

or make make dinner i guess mow the lawn

play06:54

take take a cup of tea to grandma and

play06:57

show her family pictures and

play06:59

exactly take care of my grandmother and

play07:02

make sure yeah exactly and it could

play07:04

recognize obviously recognize everyone

play07:06

in the home yeah could play catch with

play07:08

your kids yes i mean obviously we need

play07:10

to be careful this doesn't uh become a

play07:12

dystopian situation

play07:14

um like i think one of the things that's

play07:16

going to be important is to have a

play07:18

localized rom chip on the

play07:21

robot that cannot be updated over the

play07:23

air uh where if you for example were to

play07:26

say stop stop stop that would if anyone

play07:28

said that then the robot would stop you

play07:30

know type of thing

play07:31

and that's not updatable remotely um i

play07:34

think it's going to be important to have

play07:35

safety features like that

play07:37

yeah that that sounds wise and i do

play07:39

think there should be a regulatory

play07:40

agency for ai i've said this for many

play07:42

years i don't love being regulated but i

play07:44

you know i think this is an important

play07:45

thing for public safety do you think

play07:47

there will be basically like in say 2050

play07:50

or whatever that like a

play07:52

a robot in most homes is what they will

play07:54

be and people will

play07:56

probably count

play07:57

you'll have your own butler basically

play07:59

yeah you'll have your sort of buddy

play08:01

robot

play08:02

probably yeah i mean how much of a buddy

play08:05

do like do you do

play08:06

how many applications you thought is

play08:07

there you know can you have a romantic

play08:09

partner

play08:10

a sex partner

play08:11

inevitable

play08:13

i mean i did promise the internet that i

play08:14

would make cat girls we'll have we could

play08:16

make a robot catgirl

play08:19

how are you because yeah

play08:21

you know

play08:24

so

play08:25

yeah i i guess uh it'll be what whatever

play08:27

people want really you know so what sort

play08:30

of timeline should we be

play08:32

thinking about of the first the first

play08:34

models that are actually made and

play08:37

sold

play08:39

you know the the first units that that

play08:41

we intend to make are

play08:43

um

play08:44

for jobs that are dangerous boring

play08:46

repetitive and things that people don't

play08:48

want to do and you know i think we'll

play08:49

have like an interesting prototype uh

play08:52

sometime this year we might have

play08:54

something useful next year but i think

play08:56

quite likely within at least two years

play08:59

and then we'll see rapid growth year

play09:01

over year of the usefulness of the

play09:02

humanoid robots um and decrease in cost

play09:05

and scaling out production

play09:07

help me on the economics of this so what

play09:09

what do you picture the cost of one of

play09:10

these being well i think the cost is

play09:12

actually not going to be uh crazy high

play09:15

um

play09:16

like less than a car yeah but but think

play09:18

about the economics of this if you can

play09:21

replace a

play09:23

thirty thousand dollar forty thousand

play09:24

dollar a year worker

play09:26

which you have to pay every year

play09:28

with a one-time payment of twenty five

play09:30

thousand dollars for a robot that can

play09:32

work longer hours

play09:35

doesn't go on vacation i mean that could

play09:37

it could be a pretty rapid replacement

play09:40

of certain types of jobs how worried

play09:43

should the world be about that

play09:44

i wouldn't worry about the the sort of

play09:46

putting people out of a job thing um i

play09:48

think

play09:49

we're actually going to have and already

play09:51

do have a massive shortage of labor so i

play09:53

i i think we'll we will have um

play09:57

uh

play09:59

not not people out of work but actually

play10:01

still a shortage of labor even in the

play10:03

future uh but

play10:06

this really will be a world of abundance

play10:09

any goods and services

play10:11

uh will be available to anyone who wants

play10:13

them that it'll be so cheap to have

play10:15

goods and services it'll be ridiculous

play10:22

so

play10:25

that is part of

play10:27

an epic 80 minute interview

play10:29

which we are releasing to people

play10:33

members of ted 2022 right after this

play10:35

conference um you should be able to

play10:38

look at it on the ted live

play10:41

website

play10:42

um there's public interest in it we're

play10:44

putting that out to the world on sunday

play10:47

afternoon i think sunday evening but uh

play10:49

but if you're into this kind of stuff um

play10:51

definitely a good thing to do over the

play10:52

weekend um

play10:54

now then

play10:55

hearing from elon live there's there's

play10:57

huge public interest in that we have

play11:00

opened up this segment to live stream

play11:03

and so

play11:04

we're joined right now by i think quite

play11:07

a few people around the world um welcome

play11:09

to vancouver welcome to ted 22 you're

play11:11

joining us on the last day of our

play11:12

conference here in a packed

play11:15

theater

play11:16

and

play11:17

we've been hearing all week from

play11:19

people with dreams about what the next

play11:22

era

play11:23

of humanity is going to be

play11:26

and now arguably

play11:28

the biggest visionary of them all elon

play11:30

musk

play11:34

[Music]

play11:40

hey elon welcome

play11:46

so elon um a few hours ago

play11:50

you

play11:50

made

play11:51

an offer to buy twitter

play11:56

why

play12:00

[Laughter]

play12:03

how'd you know

play12:05

little bird

play12:06

tweeted in my ear or something i don't

play12:08

know

play12:09

by the way have you seen the movie ted

play12:10

about the bear

play12:11

i i i have i have a movie

play12:19

so um

play12:20

yeah yeah so

play12:23

was there a question

play12:25

why why make that offer oh so

play12:28

um

play12:28

well i think it's very important for

play12:31

uh there to be

play12:33

an inclusive arena for

play12:35

free speech

play12:37

where

play12:38

all yeah so uh yeah

play12:40

um

play12:43

twitter has become kind of the de facto

play12:45

town square um so

play12:48

uh

play12:49

it's just really important that people

play12:51

have the both the uh the reality and the

play12:54

perception

play12:56

uh that they're able to speak freely

play12:59

within the bounds of the law um

play13:01

and

play13:03

you know so one of the things that i

play13:04

believe twitter should do is open source

play13:07

the algorithm

play13:08

um and make any changes

play13:11

uh to people's tweets you know if

play13:12

they're emphasized or de-emphasized uh

play13:15

that action should be

play13:17

made apparent so you anyone can see that

play13:19

action has been taken

play13:21

so there's there's no sort of behind the

play13:22

scenes

play13:24

manipulation either algorithmically or

play13:27

manually um

play13:30

but

play13:31

last week when we spoke elon um i asked

play13:34

you whether you were thinking of taking

play13:36

over you said no way said i i do not

play13:38

want to own twitter it is a recipe for

play13:40

misery everyone will blame me for

play13:43

everything what on earth changed no i

play13:45

think i think everyone will still blame

play13:46

me for everything

play13:48

yeah if something if if i acquire

play13:50

twitter and something goes wrong it's my

play13:52

fault 100

play13:54

i i think there will be quite a few

play13:55

arrows uh yes um it will it will be

play13:58

miserable but you still want to do it

play13:59

why i mean i hope it's not too miserable

play14:02

uh but

play14:03

um

play14:05

i i just think it's important to the fun

play14:08

like

play14:09

uh

play14:10

it's important to the function of

play14:13

democracy

play14:14

it's important to the function of

play14:17

uh the united states uh as a free

play14:19

country and many other countries and to

play14:21

help

play14:22

actually to help

play14:23

freedom in the world

play14:24

more broadly than the u.s

play14:26

and so

play14:28

i think it's uh

play14:30

it's a

play14:32

you know i think this there's the risk

play14:34

civilizational risk

play14:36

uh is decreased if twitter

play14:39

the more we can increase the trust of

play14:41

twitter as a public platform and so

play14:45

i do think this will be somewhat painful

play14:47

and i'm not sure that i will actually be

play14:48

able to to acquire it

play14:50

and i should also say

play14:52

the intent is is to

play14:55

retain as many shareholders as is

play14:57

allowed by the law in a private company

play14:59

which i think is around

play15:01

2000 or so so we'll it's not like it

play15:03

it's definitely not not from the

play15:05

standpoint of letting me figure out how

play15:06

to monopolize or maximize my ownership

play15:08

of twitter

play15:09

but we'll try to bring along as many

play15:11

shareholders as we're allowed

play15:13

to you don't necessarily want to pay out

play15:15

40 or whatever it is billion dollars in

play15:17

cash you you'd like them to come come

play15:18

with you in in

play15:20

i mean

play15:21

i mean i could technically afford it um

play15:28

what i'm saying is this this is

play15:30

this is uh this is not a

play15:32

way to sort of make money

play15:34

you know i think this is it's just that

play15:37

i think this is um this could

play15:40

my

play15:40

strong intuitive sense is that uh having

play15:44

a public platform that is maximally

play15:46

trusted

play15:48

um and and and

play15:50

and broadly inclusive

play15:52

um is extremely important to the future

play15:55

of civilization but you've described

play15:58

yourself i don't care about the

play15:59

economics at all

play16:00

okay that's that's core to hear you this

play16:03

is not about the economics it's for the

play16:05

the moral good that you think will

play16:07

achieve you you've described yourself

play16:08

elon as a free speech absolutist

play16:11

but

play16:12

does that mean that there's literally

play16:13

nothing that people can't say and it's

play16:16

okay

play16:17

well i i i think uh

play16:20

obviously uh twitter or any forum is

play16:23

bound by the laws of the country that it

play16:25

operates in um

play16:27

so

play16:28

obviously there are some limitations on

play16:31

free speech uh in in the us and and of

play16:34

course uh

play16:36

twitter would have to abide by those uh

play16:38

right rules so so so you can't incite

play16:40

people to violence like that that the

play16:43

like a direct incitement to violence you

play16:46

know you can't do the equivalent of

play16:47

crying fire in a in a movie theater for

play16:50

example no that would be a crime yeah

play16:52

right

play16:53

it should be a crime but here's here's

play16:55

the challenge is is that it's it's such

play16:57

a nuanced difference between different

play16:59

things so

play17:01

there's

play17:02

there's incitement to violence yeah

play17:04

that's a no if it's illegal um there's

play17:06

hate speech which some forms of hate

play17:08

speech are fine you know i hate spinach

play17:12

um

play17:13

i mean if it's a sauteed in a

play17:16

you know cream sauce that would be quite

play17:18

nice

play17:19

but so so

play17:20

but the problem is so so so let's say

play17:22

someone says okay here's one tweet i

play17:24

hate politician x yeah next tweet is i

play17:27

wish polite politician x wasn't alive

play17:30

as we some of us have said about putin

play17:32

right now for example so that's

play17:34

legitimate speech

play17:36

another tweet is i wish politician x

play17:38

wasn't alive with a picture of their

play17:40

head with a gun sight over it

play17:43

or that plus their address i mean at

play17:46

some point

play17:48

someone has to make a decision as to

play17:50

which of those is not okay can an

play17:52

algorithm

play17:54

do that well surely you need human

play17:55

judgment at some point

play17:57

no i think

play17:58

the like i said

play18:01

in my view

play18:02

uh twitter should um

play18:05

match the laws of the of the country of

play18:07

and and and really you know

play18:09

that there's an obligation to to do that

play18:12

um

play18:14

but going beyond going beyond that um

play18:17

and having it be

play18:18

unclear who's making what changes to who

play18:21

to where

play18:22

uh having tweets sort of mysteriously be

play18:25

promoted and demoted

play18:27

with no insight into what's going on uh

play18:29

having a black box algorithm uh promote

play18:32

some things and other not not other

play18:33

things i think this can be quite

play18:35

dangerous

play18:36

so so so the idea of opening the

play18:38

algorithm is a huge deal and i think

play18:40

many people would would welcome that of

play18:42

understanding exactly how it's making

play18:44

the decision and critique it and

play18:46

critique like i want to improve what

play18:48

wondering is like like i think like the

play18:50

code should be on github you know so

play18:52

then uh and so people can look through

play18:54

it and say like i see a problem here i

play18:57

don't i don't agree with this

play18:58

um

play18:59

they can highlight issues right um

play19:02

suggest changes in the same way that you

play19:04

sort of update linux or or signal or

play19:06

something like that you know but as i

play19:08

understand it like at some point right

play19:11

now

play19:12

what the algorithm would do is it would

play19:14

look at for example how many people have

play19:15

flagged a tweet as obnoxious

play19:19

and then

play19:21

at some point a human has to look at it

play19:22

and make make a decision as to does this

play19:25

cross the line or not that the algorithm

play19:27

itself can't i don't think yet um tell

play19:30

the difference between

play19:32

legal and okay and and definitely

play19:34

obnoxious and so the question is which

play19:36

humans you know make make that

play19:39

core i mean do you have do you have a

play19:40

picture of that right now twitter

play19:43

and facebook and others you know they've

play19:45

hired thousands of people to try to help

play19:48

make wise decisions and the trouble is

play19:50

that no one can agree on on what is wise

play19:52

how do you solve that

play19:55

well i i i think we would want to err on

play19:58

this if if in doubt

play20:00

uh

play20:01

let let the speech that let it exist uh

play20:04

it would have you know if it's a

play20:07

you know a

play20:09

a gray area i would say let let the

play20:11

tweet exist

play20:12

um

play20:13

but

play20:14

obviously you know in a case where

play20:16

there's perhaps a lot of controversy uh

play20:19

that you would not want to necessarily

play20:21

promote that tweet if uh you know so the

play20:25

i'm not i'm not saying this is that i

play20:27

have all the answers here um

play20:29

but

play20:30

i i do think that we want to be just

play20:33

very reluctant to delete things

play20:36

and and have um

play20:37

just just be very cautious with with

play20:39

with permanent bans uh you know

play20:42

timeouts i think are better or uh than

play20:45

sort of permanent bans

play20:47

and um

play20:49

but just just in general like i said

play20:53

uh how how it won't be perfect but i

play20:56

think we wanted to really uh have

play20:59

like so the perception and reality that

play21:01

speech is as

play21:02

free as reasonably possible

play21:04

and a good sign as to whether there's

play21:07

free speech is

play21:09

is

play21:10

is someone you

play21:11

don't like allowed to say something you

play21:13

don't like

play21:15

and if that is the case then we have

play21:16

free speech and it's it's damn annoying

play21:19

when someone you don't like says

play21:21

something you don't like

play21:22

that is a sign of a healthy functioning

play21:25

uh

play21:26

free speech situation

play21:30

so

play21:31

i think many people would agree with

play21:33

that and look at the reaction online

play21:34

many people are excited by

play21:36

you coming in and the changes you're

play21:37

proposing some others are absolutely

play21:39

horrified here's how they would see it

play21:41

they would say wait a sec we agree that

play21:43

that twitter is an incredibly important

play21:45

town square it is a it is you know where

play21:47

the world exchanges opinion about life

play21:49

and death matters

play21:50

how on earth could it be owned by the

play21:52

world's richest person that can't be

play21:54

right

play21:55

so how how do you i mean what's the

play21:57

response there is there any way that you

play21:59

can

play22:00

distance yourself from the actual

play22:02

decision-making that matters on content

play22:05

at

play22:06

in some very clear way that is

play22:08

convincing to people

play22:10

well like i said i think the

play22:13

it's it's very important that like the

play22:15

the algorithm be open sourced and that

play22:17

any manual uh adjustments be

play22:20

uh identified like so if this tweet if

play22:23

somebody did something to a tweet it's

play22:25

there's information attached to it that

play22:27

this that action was taken and i i i i

play22:30

won't personally be uh you know in their

play22:33

editing tweets

play22:34

um

play22:36

but you'll know if something was done to

play22:39

promote demote or otherwise affect uh a

play22:42

tweet um

play22:44

you know

play22:45

as for

play22:46

media sort of ownership i mean you've

play22:48

got you know um mark zuckerberg owning

play22:50

facebook and

play22:52

instagram and whatsapp um

play22:54

and with a share ownership structure

play22:56

that will

play22:57

have

play22:58

mark zuckerberg the 14th still

play23:01

controlling those

play23:02

uh entities

play23:04

so

play23:08

literally um

play23:11

what's that need we won't have that on

play23:12

twitter

play23:13

if if you commit to opening up the

play23:15

algorithm that that definitely gives

play23:17

some level of confidence um talk about

play23:20

talk about some of the other changes

play23:21

that you've proposed so you

play23:24

at the edit button that's that's

play23:26

definitely coming if you if you have

play23:28

your way yeah yeah

play23:29

and how do you i mean i i think

play23:32

i mean

play23:33

one

play23:34

frankly

play23:35

um

play23:37

the

play23:38

top priority i have i would have is is

play23:41

eliminating the the spam and scam

play23:44

bots

play23:45

and the bot armies that are on twitter

play23:48

um

play23:51

you know i think i think these these fun

play23:53

influence

play23:55

that

play23:56

they're not they're they're they they

play23:57

make the product much worse

play24:00

um if i see if you know

play24:02

if i had a dogecoin for every crypto

play24:04

scam i saw

play24:05

[Laughter]

play24:10

more you know 100 billion dollars

play24:13

do you regret sparking the sort of storm

play24:16

of excitement overdose and you know

play24:18

where it's gone or

play24:20

i mean i think doge is fun and you

play24:23

know i've always said don't bet the farm

play24:24

on dogecoin uh fyi

play24:27

yeah

play24:29

but i i think i think it's it's i like

play24:32

dogs and i like memes and uh it's got

play24:34

both of those

play24:36

and

play24:37

but just on the on the edit button how

play24:39

how do you get around the problem of so

play24:40

someone tweets elon rocks and it's

play24:42

tweeted by two million people um and um

play24:46

and then then after that they edit it so

play24:48

i'm elon sucks and um and then all those

play24:51

retweets

play24:52

they're all embarrassed and how how do

play24:54

you avoid that type of

play24:56

changing of meaning so that retweeters

play24:59

are exploited

play25:02

well i think uh you know you'd only have

play25:05

the edit capability for a short period

play25:07

of time and probably the thing to do at

play25:10

upon the edit would be to zero out

play25:12

all retweets and favorites

play25:14

okay

play25:16

i'm open to ideas though you know

play25:19

so in one way the um algorithm works

play25:21

kind of well for you right now i just i

play25:23

wanted to show you this this is so

play25:25

this is a typical tweet of of mine kind

play25:28

of lame and wordy and whatever and look

play25:30

at and the amazing response it gets is

play25:32

this oh my god

play25:34

97 likes

play25:36

um and then i tried another one um

play25:41

and uh

play25:45

29 000 likes so the algorithm at least

play25:48

seems to be at the moment you know if

play25:50

elon musk expanded the world immediately

play25:54

um not bad right

play25:58

yeah i guess so i mean that was

play26:00

cool

play26:01

i mean you but but you've

play26:03

so help us understand how it is you've

play26:05

built this incredible

play26:07

um following on twitter yourself when

play26:10

i mean some of the people who love you

play26:12

the most look at some of what you tweet

play26:14

and they they

play26:15

they think it's somewhere between um

play26:18

embarrassing and crazy some of it's

play26:20

amazing i mean

play26:21

[Laughter]

play26:24

is that actually why it's worked or why

play26:27

why has it worked

play26:29

i mean i don't know i mean i i'm

play26:31

you know tweeting more or less stream of

play26:33

consciousness you know it's not like let

play26:35

me think about some grand plan about my

play26:36

twitter or whatever you know i'm like

play26:38

literally on the toilet or something i'm

play26:40

like oh this is funny and then tweet

play26:42

that out you know

play26:44

that's

play26:45

that's like most of them

play26:46

[Laughter]

play26:48

you know over sharing

play26:51

but um but you are obsessed with getting

play26:53

the most out of every minute of your day

play26:55

and so why not you know

play26:57

um

play26:59

so

play27:00

i don't know i just like try to tweet

play27:02

out like things that are interesting or

play27:03

funny or

play27:04

you know and then people seem to

play27:07

like it

play27:08

so if if you are unsuccessful actually

play27:11

before i ask that let me ask this if i

play27:13

don't

play27:14

yeah so how can i say

play27:16

is uh funding secured

play27:19

[Music]

play27:23

i i have sufficient uh assets to

play27:26

complete the

play27:28

uh

play27:29

it's not a forward-looking statement

play27:30

blah blah but

play27:33

i have to i mean i can do it if possible

play27:35

right um

play27:36

so um

play27:38

and um

play27:41

i mean i should say actually even in the

play27:42

in

play27:43

originally

play27:44

the

play27:45

uh with with tesla back in the day

play27:48

funding was actually secured

play27:50

i want to be clear about that

play27:51

um in fact this may be a good

play27:53

opportunity to to to clarify that um

play27:56

if funding was indeed secured um and uh

play28:00

i should say like why why do i do not

play28:02

have respect for the sec in that

play28:04

situation and i don't mean to

play28:06

blame everyone at the sec but certainly

play28:08

the san francisco office

play28:10

um it's because the sec

play28:12

knew that funding was secured

play28:15

but they pursued the

play28:17

an active public investigation

play28:18

nonetheless at the time tesla was in a

play28:21

precarious financial situation

play28:23

and i was told by the banks that if i

play28:25

did not agree to settle with the sec

play28:27

that they would the banks would cease

play28:28

providing working capital and tesla

play28:30

would go bankrupt immediately

play28:32

so that's like having

play28:34

a gun to your child's head

play28:36

so i was forced to concede to the sec

play28:39

unlawfully

play28:41

those bastards

play28:44

and and and now that they they say

play28:47

it makes it look like i lied when i did

play28:49

not in fact lie i was i was forced to

play28:51

admit that i lied for to save tesla's

play28:53

life and that's the only reason

play28:55

given what's actually happened

play28:59

given what's actually happened to tesla

play29:01

since then though aren't you glad that

play29:03

you didn't take it private

play29:07

yeah i mean

play29:10

it's difficult to put yourself in the

play29:11

position at the time tesla was under the

play29:13

most relentless short seller attack in

play29:16

the history of the stock market

play29:18

uh there's something called short and

play29:20

distort

play29:21

um where the barrage of negativity that

play29:24

tesla was experiencing from short sellers on

play29:26

wall street was beyond all belief tesla

play29:29

was the most shorted stock in the

play29:31

history of stock markets

play29:33

this is saying something

play29:35

so

play29:36

you know this was affecting our ability

play29:38

to hire people it was affecting our

play29:39

ability to sell cars

play29:41

it was

play29:42

uh

play29:43

they were

play29:44

yeah it was terrible um

play29:47

yeah they wanted tesla to die so bad

play29:49

they could taste it

play29:51

well most of them have paid the price

play29:54

yes

play29:55

where are they now

play29:58

um

play30:01

so

play30:02

so that was a really strong statement i

play30:04

mean obviously a lot of people

play30:05

who who support you i thought would say

play30:08

you have so much to

play30:10

offer the world on the upside on the

play30:12

vision side don't don't waste your time

play30:14

getting getting distracted by these

play30:16

these battles that bring out negativity

play30:18

and and and make people feel that you're

play30:20

being defensive or like people don't

play30:22

like fights especially with with

play30:23

powerful government authorities they'd

play30:25

rather they'd rather buy into your to a

play30:27

dream do do you like aren't you

play30:29

encouraged by people just just to edit

play30:32

that

play30:33

in that

play30:34

you know temptation out and uh

play30:37

go with the bigger story

play30:41

um well i mean i i would say like you

play30:43

know i'm sort of a mixed bag you know i

play30:48

mean well you're a fighter and you you

play30:51

don't you don't you don't you don't

play30:54

you don't like to lose and and you you

play30:56

you are determined that you don't

play30:58

basically i i mean you are sure i don't

play31:00

like to lose i'm not sure many people do

play31:01

um

play31:02

but the truth matters to me a lot really

play31:05

like

play31:07

sort of pathologically it matters to me

play31:09

okay so so you don't like to lose if in

play31:12

this case you are not successful in you

play31:15

know the board

play31:16

does not accept your offer you've said

play31:18

you won't go higher is there a plan b

play31:24

there is

play31:29

i i think we i think we would like to

play31:31

hear a little bit about plan b

play31:37

for it for another time i think

play31:39

another time yeah all right

play31:40

[Applause]

play31:44

i that that's a nice tease all right so

play31:48

um

play31:50

i i would love

play31:51

to

play31:53

try to understand this brain of yours

play31:55

more elon i i if with your permission

play31:58

i'd like to just play this this is the

play32:00

oh actually before we do that

play32:02

um here was one of the of the thousands

play32:04

of questions that people asked i thought

play32:06

this was actually quite a good one um if

play32:08

you could go back in time and change one

play32:10

decision you made along the way

play32:11

do your own edit button

play32:13

which one would it be and why

play32:16

do you mean like a career decision or

play32:17

something

play32:18

just any decision over the last

play32:21

few years like your decision to invest

play32:23

in twitter in the first place or your

play32:26

anything um i mean the

play32:30

the worst business decision i ever made

play32:32

was

play32:33

um

play32:34

not starting tesla with just jb straubel

play32:38

by far the worst decision i've ever made

play32:40

is not just starting tesla with jb

play32:43

that that that's the number one by far

play32:46

all right so jb straubel was was the

play32:47

visionary co-founder who who who was

play32:49

obsessed with and knew so much about

play32:51

batteries and your your decision to go

play32:54

with tesla the company as it was meant

play32:56

that you got locked into what you

play32:58

concluded it was a weird architecture

play33:00

now this this

play33:01

there's a lot of confusion tesla

play33:04

tesla did not exist in any

play33:07

tesla was a shell company with no

play33:08

employees uh no intellectual property

play33:10

when i invested but the

play33:13

a false narrative has been created by um

play33:16

one of the other co-founders uh martin

play33:17

eberhard and i don't want to get into

play33:19

the nastiness here but uh

play33:22

i didn't invest in an existing company

play33:24

we created a company yeah and

play33:27

ultimately the creation that company uh

play33:30

was was done by

play33:32

uh jv and me um and

play33:34

unfortunately there's a someone else and

play33:37

another co-founder who has made it his

play33:39

life's mission

play33:40

uh to make it sound like he he created

play33:42

the company which is false wasn't there

play33:44

another issue

play33:45

right at the heart of the development of

play33:48

the tesla model 3 where tesla almost

play33:50

went bankrupt and i i think you have

play33:52

said that part of the reason for that

play33:54

was that you overestimated the extent to

play33:57

which it was possible at that time to

play33:59

automate a a factory a huge amount was

play34:02

spent

play34:03

kind of over automating and it didn't

play34:05

work

play34:06

and it nearly took the company down is

play34:08

that fair

play34:11

uh i mean

play34:13

first of all it's important to

play34:15

understand like what what has tesla

play34:18

actually accomplished that is that is

play34:20

most noteworthy um it is not the

play34:23

creation of

play34:25

an electric vehicle or creating

play34:27

electrical vehicle prototype or

play34:29

low volume production

play34:31

of a

play34:32

of a car that they've been

play34:35

uh hundreds of cars startups over the

play34:37

years hundreds and uh in fact at one

play34:40

point um bloomberg counted up the number

play34:42

of electric vehicle startups and they i

play34:44

think they got to almost 500. yeah so

play34:47

the hard part is not creating a

play34:48

prototype or going into limited

play34:50

production

play34:51

the the the absolutely difficult thing

play34:54

which has not been accomplished by an

play34:55

american car company in 100 years is

play34:58

reaching volume production without going

play35:00

bankrupt

play35:02

is the actual hard thing

play35:04

um the last company american company to

play35:07

reach volume production without going

play35:08

bankrupt was chrysler in the 20s right

play35:12

and and and it nearly happened to tesla

play35:16

yes it but it's not like oh geez i guess

play35:18

if we just done more manual stuff things

play35:20

would have been fine

play35:21

of course not uh that is definitely not

play35:24

the case uh

play35:25

so

play35:27

we basically messed up

play35:29

almost every aspect of the model 3

play35:32

production line

play35:34

from

play35:36

from cells to packs to

play35:39

drive inverters

play35:40

motors

play35:42

body line the paint shop

play35:45

uh

play35:46

final assembly

play35:47

um

play35:49

everything everything was messed up

play35:51

um and i lived in that factory i lived in the

play35:54

fremont and and nevada factories

play35:57

for three years

play35:59

fixing the that production line running

play36:01

around like a maniac

play36:03

through every part of that factory

play36:06

living with the team

play36:10

i slept on the floor

play36:12

so that the

play36:13

the team who was going through

play36:15

a hard time

play36:17

could see me on the floor

play36:19

uh

play36:21

that they knew that i was not in some

play36:23

ivory tower

play36:25

whatever pain they experienced i was i

play36:27

had it more and some people who knew you

play36:30

well

play36:31

actually thought you were making a

play36:32

terrible mistake that you were driving

play36:33

yourself you were

play36:35

you were driving yourself to the edge of

play36:37

sanity almost and yeah and

play36:39

and that you were in danger of making

play36:42

bad

play36:43

choices and in fact i heard you say last

play36:45

week elon that that you because of

play36:47

tesla's huge value now and and you know

play36:50

the the significance of every minute

play36:52

that you spend that you are in danger of

play36:54

sort of

play36:56

obsessing over spending all this time to

play36:57

the point of to the edge of

play37:00

sanity

play37:01

um

play37:03

that doesn't that doesn't sound super

play37:04

wise isn't that like your your your time

play37:09

your your completely sane centered

play37:11

rested time and decision making

play37:14

is more powerful and compelling than

play37:17

that sort of i can barely

play37:19

hold my eyes open so surely it should be

play37:22

an absolute strategic priority to look

play37:24

after yourself

play37:28

i mean there wasn't any other way to

play37:30

make it work

play37:32

there were three years of hell

play37:35

2017 2018 and 2019

play37:39

with three years

play37:40

this longest period of excruciating pain

play37:42

in my life

play37:44

uh

play37:45

there wasn't any other way and we barely

play37:47

made it and we're on the ragged edge of

play37:49

bankruptcy the entire time

play37:52

so

play37:52

so when you felt like i want

play37:54

pain

play37:55

i don't like it

play37:56

um

play37:58

those were three or three so so much

play38:00

pain

play38:01

but it had to be done or tesla would be

play38:03

dead when you looked around the

play38:05

gigafactory that we saw images of

play38:07

earlier

play38:08

um last week and just see where the

play38:11

companies come i mean do you feel that

play38:13

that this this challenge of

play38:16

figuring out the the new way of

play38:18

manufacturing um that you that

play38:21

you actually have an edge now that it's

play38:23

different that you've figured out how to

play38:24

do this and

play38:26

and um from

play38:27

those three years

play38:29

what won't be repeated you've actually

play38:31

figured out a new way of manufacturing

play38:36

at this point i think i know

play38:39

more about manufacturing than anyone

play38:41

currently alive on earth

play38:45

between that

play38:48

yeah

play38:50

i'll tell you i can tell you how every

play38:51

damn part part in that car is made

play38:53

which basically if you just live on the

play38:55

factory live in the factory for three

play38:56

years and

play38:58

that was nice that was a poignant note

play39:01

or something

play39:03

someone wants to compose a symphony to

play39:05

that uh expression of confidence uh

play39:07

something like that i have no idea what

play39:09

that is

play39:10

anyway yeah

play39:12

every aspect of that car six ways to

play39:13

sunday i know

play39:15

i mean you you you

play39:16

talk about scale right now you're in the

play39:18

middle of writing your new master plan

play39:21

and you've said that scale is at the

play39:23

heart of it

play39:25

why does scale matter why are you

play39:26

obsessed with that what are you thinking

play39:28

yeah well see

play39:30

in order in order to accelerate the

play39:32

advent of sustainable energy

play39:34

uh there must be scale

play39:36

because we've got a transition um a vast

play39:39

economy that is currently uh overly

play39:41

dependent on fossil fuels to a

play39:43

sustainable energy economy one where the

play39:46

energy is uh

play39:48

yeah i mean we got to do it

play39:54

so so the energy's got to be sustainably

play39:56

generated with wind solar uh hydro

play40:00

geothermal i i'm a believer in nuclear

play40:02

as uh as well i think ever talk about

play40:05

and

play40:05

uh and then you you

play40:07

since solar and wind is intermittent you

play40:08

have to have stationary storage

play40:10

batteries and and then we're going to

play40:12

transition all transport um

play40:15

to to electric uh

play40:17

if we do those things we have a

play40:18

sustainable energy future the faster we

play40:20

do those things the less risk we

play40:24

the less risk we

play40:25

put to the environment

play40:27

uh so sooner is better uh and and so

play40:31

scale is very important um

play40:33

you know it's not about it's not about

play40:35

press releases it's about tonnage what

play40:38

was the tonnage of

play40:39

of batteries produced

play40:42

and obviously done in a sustainable way

play40:44

and and our estimate is that

play40:47

approximately 300 terawatt hours of

play40:50

battery storage is needed to transition

play40:52

uh transport uh

play40:55

electricity and and heating and cooling

play40:58

uh to a fully electric situation others

play41:00

may

play41:01

there's there may be some

play41:03

different estimates out out there but uh

play41:06

our estimate is 300 terawatt hours yeah

play41:09

so we dug into this a lot in the

play41:10

interview that we recorded last week and

play41:12

so people can go in and hear that more

play41:13

but i mean the context is that is i

play41:15

think about a thousand times the current

play41:17

install battery capacity i mean the

play41:19

scale up needed is

play41:21

breathtaking basically yeah and and and

play41:24

um

play41:25

yeah so so your vision is to commit

play41:27

tesla to try to deliver on a meaningful

play41:30

percentage of what is needed yeah and

play41:32

what and call on others to do the rest

play41:34

that this is what this is a task for

play41:36

humanity to massively scale up our

play41:38

response to change change the energy

play41:40

grid

play41:41

yes it it's

play41:43

it's like basically how fast can we can

play41:45

we scale um and encourage others to

play41:49

scale

play41:50

to get to that 300 terawatt hour

play41:53

installed uh base of batteries right

play41:57

and then of course uh there'll be a

play41:59

tremendous need to recycle those

play42:00

batteries which is i and it makes sense

play42:02

to recycle them because the raw

play42:03

materials are like high grade ore um so

play42:07

people shouldn't think well they'd be

play42:08

this big pile of batteries now they're

play42:09

going to get recycled because the

play42:11

even a dead battery pack is worth about

play42:12

a thousand dollars so

play42:15

um but but this is what's needed for a

play42:17

sustainable energy future so we're going

play42:19

to try to take the set of actions that

play42:21

accelerate the day of and bring the day

play42:23

of a sustainable energy future sooner

play42:27

okay

play42:30

there's going to be a huge interest in

play42:32

your master plan when you when you

play42:34

publish that um meanwhile i just i would

play42:36

love to

play42:37

understand more

play42:39

what goes on in this brain of yours

play42:41

because it is it is a pretty unique one

play42:42

i want to play with your permission this

play42:44

very funny opening from snl saturday

play42:47

night live can we have the volume there

play42:48

actually please sorry

play42:50

it's an honor to be hosting saturday

play42:52

night live i mean that

play42:54

sometimes after i say something i have

play42:57

to say i mean that

play42:58

[Music]

play42:59

so people really know that i mean

play43:02

that's because i don't always have a lot

play43:04

of

play43:05

intonational variation in how i speak

play43:09

which i'm told makes for great comedy

play43:13

i'm actually making history tonight as

play43:14

the first person

play43:16

with asperger's to host snl

play43:18

[Applause]

play43:21

and i think you followed that up with at

play43:23

least the first person to admit it the

play43:24

first person to admit it

play43:27

but i mean

play43:32

so this was a great thing to say

play43:34

but i i would love to

play43:36

understand

play43:38

whether you know how you think of of

play43:40

asperger's like whether you can give us

play43:42

any sense of even you as a boy how what

play43:44

what the experience

play43:47

was or as you now

play43:49

understand with the benefit of hindsight

play43:51

can you talk about that a bit

play43:53

well i think i think everyone's

play43:55

experience is going to be somewhat

play43:56

different

play43:58

but i guess for me the

play44:00

social cues were not uh intuitive so

play44:05

i was just very bookish and i didn't

play44:08

understand

play44:09

this i guess

play44:11

others could

play44:13

sort of intuitively understand uh

play44:17

what what was meant by something

play44:20

i would just tend to take things very

play44:21

literally as just like the words

play44:24

as spoken word exactly what they meant

play44:26

but but then that

play44:28

turned out to be wrong

play44:31

you can't they do not they're not simply

play44:32

saying exactly what they mean

play44:34

there's all sorts of other things that

play44:35

are meant it took me a while to figure

play44:37

that out um

play44:38

so

play44:40

i was you know bullied quite a lot

play44:42

um

play44:44

so

play44:46

i didn't i did not have a sort of happy

play44:49

childhood to be frank was quite quite

play44:50

rough um

play44:52

and um

play44:54

but i read a lot of books i read lots

play44:56

and lots of books

play44:57

and so that you know

play44:59

sort of

play45:01

gradually i sort of understood more from

play45:03

the books that i was reading and watched

play45:05

a lot of movies

play45:07

and

play45:09

you know just

play45:12

but it took it took me it took me a

play45:14

while to understand things that most

play45:17

people

play45:18

intuitively understand

play45:20

So I've wondered whether it's possible that that was, in a strange way, an incredible gift to you — and, indirectly, to many other people — inasmuch as brains, you know, are plastic and they go where the action is. If, for some reason, the external world and the social cues that so many people spend so much time and mental energy obsessing over are partly cut off, isn't it possible that that is partly what gave you the ability to understand the world inwardly, at a much deeper level than most people do?
I suppose that's certainly possible. I think there may be some value also from a technology standpoint, because I found it rewarding to spend all night programming computers, just by myself. I think most people don't enjoy typing strange symbols into a computer by themselves all night — they think that's not fun — but I thought it was; I really liked it. So I just programmed all night by myself, and I found that to be quite enjoyable. But I think that is not, uh, normal.

[Music]
So, I mean, you know, I've thought a lot about it — it's a riddle to a lot of people how you've done this, how you've repeatedly innovated in these different industries. Every entrepreneur sees possibility in the future and then acts to make that real. It feels to me like you see possibility more broadly than almost anyone, and can connect it: you see scientific possibility, based on a deep understanding of physics and knowing what the fundamental equations are; you see technological possibility — what the technologies are that are based on that science and where they could go; and then, really unusually, you combine that with economic possibility — what it would actually cost, whether there is a system you can imagine where you could affordably make that thing. And sometimes you then get conviction that there is an opportunity here: put those pieces together and you could do something amazing.
Yeah. I think one aspect of whatever condition I had was that I was just absolutely obsessed with truth — just obsessed with truth. And the obsession with truth is why I studied physics, because physics attempts to understand the truth of the universe. Physics is just: what are the provable truths of the universe — truths that have predictive power? So for me, physics was a very natural thing to study. Nobody made me study it; it was intrinsically interesting, to understand the nature of the universe. And then computer science, or information theory, also — to understand logic. And, you know, there's also an argument that information theory is actually operating at a more fundamental level than even physics. So, yeah, physics and information theory were really interesting to me.

So when you say truth — I mean, it's not like some people... What you're talking about is the truth of the universe, the fundamental truths that drive the universe. It's a deep curiosity about what this universe is, why we're here — simulation, why not, you know, we don't have time to go into that — but I mean, you're just deeply curious about what this is, what this whole thing is for.

Yes — I mean, I think the why of things is very important.
I actually — when I was, I don't know, in my young teens — got quite depressed about the meaning of life, and I was trying to understand the meaning of life, reading religious texts and reading books on philosophy. And I got into the German philosophers, which is definitely not wise if you're a young teenager, I have to say — they can be gripping, but dark. [Music] Much better read as an adult. And then I actually ended up reading The Hitchhiker's Guide to the Galaxy, which is actually a book on philosophy, just sort of disguised as a silly humor book — but it's actually a philosophy book. And Adams makes the point that it's actually the question that is harder than the answer. You know, he sort of makes a joke that the answer was 42. That number does pop up a lot — and 420 is just ten times more significant than 42. And, you know, you can make a triangle with 42 degrees and two 69s... So there's no such thing as a perfect triangle. Or is there?
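(For the record, a quick check of the arithmetic behind the joke, assuming he means the interior angles of an ordinary Euclidean triangle:

$$42^\circ + 69^\circ + 69^\circ = 180^\circ$$

so those three angles do indeed close a triangle.)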

But even more important than the answer is the question — that was the whole theme of that book. I mean, is that basically how you see meaning, then — the pursuit of questions?

Yeah. So I have a sort of, you know, proposal for a worldview, or a motivating philosophy, which is to understand what questions to ask about the answer that is the universe — and, to the degree that we expand the scope and scale of consciousness, biological and digital, we will be better able to ask these questions, to frame these questions, and to understand why we're here, how we got here, what the heck is going on. And so that is my driving philosophy: to expand the scope and scale of consciousness, that we may better understand the nature of the universe.
Elon, one of the things that was most touching last week was seeing you hang out with your kids. Here's — if I may — it looks vaguely like a ventriloquist dummy there.

[Laughter]

I mean, how do you know that's real? So that's X, and it was just a delight seeing you hang out with him. What's his future going to be? I don't mean him personally, but the world he's going to grow up in — what future do you believe he will grow up in?
Well, I mean, a very digital future — a very different world than I grew up in, that's for sure. But I think we obviously want to do our absolute best to ensure that the future is good for everyone's children, and that the future is something that you can look forward to and not feel sad about. You know, you want to get up in the morning and be excited about the future, and we should fight for the things that make us excited about the future. The future cannot just be one miserable thing after another, solving one sad problem after another — there have got to be things that get you excited, that make you want to live. These things are very important; we should have more of them.
And it's not as if it's a done deal — it's all to play for. The future may still be horrible; there are scenarios where it is horrible. But you see a pathway to an exciting future, both on Earth and on Mars, and in our minds, through artificial intelligence and so forth. In your heart of hearts, do you really believe that you are helping deliver that exciting future — for X, and for others?
I'm trying my hardest to do so. You know, I love humanity, and I think that we should fight for a good future for humanity. And I think we should be optimistic about the future, and fight to make that optimistic future happen.

[Music]
I think that's the perfect place to close this. Thank you so much for spending time coming here, and for the work that you're doing — and good luck with finding a wise course through on Twitter and everything else.

All right, thank you.

Hey guys.

[Music]
