Elon Musk STUNNING Reveal of Grok | NOT aligned, MUCH Bigger, Open Source. There is no doubt left...

AI Unleashed - The Coming Artificial Intelligence Revolution and Race to AGI
17 Mar 2024 · 10:48

Summary

TLDR: Elon Musk and xAI have open-sourced Grok, as promised. Released on March 11th, it is a 314-billion-parameter mixture-of-experts model. The model is not fine-tuned, so it can produce shocking statements. It is published under the Apache 2.0 license, which permits commercial use. For the open-source AI field, Elon Musk releasing this model to the world is a major step, putting the power of AI in far more hands.

Takeaways

  • 🚀 Elon Musk and xAI, the company behind Grok, open-sourced it on March 11th.
  • 📅 The open-sourcing took longer than many expected, but it finally happened.
  • 🔗 Grok's profile page has a torrent link for downloading the model.
  • 🤖 Grok is a mixture-of-experts model with 314 billion parameters; it is a base model that has not gone through RLHF (reinforcement learning from human feedback).
  • 📜 The model is published under the Apache 2.0 license, and xAI's code is on GitHub.
  • 🌐 The model can be used for commercial, personal, and many other purposes.
  • 🔄 Grok's team Code of Conduct is remarkably brief: a single line asking people to be excellent to each other.
  • 🔢 Grok runs with roughly 25% of its weights active on a given token, thanks to the mixture-of-experts design.
  • 🔎 Open-source AI could raise legal issues, and such regulation is being debated in the US.
  • 🌍 Releasing the model lets people all over the world download and use it, spreading access to AI.
  • 🌐 Open-source AI counterbalances the dominance of big tech companies and helps the technology's power be shared more widely.

Q & A

  • When did Elon Musk open-source Grok?

    - Elon Musk open-sourced Grok on March 11th, 2024.

  • What was the speculation around Grok being open-sourced?

    - There was a lot of speculation about whether the open-sourcing would actually happen.

  • What software do you need to download Grok?

    - You need torrent software to download Grok.

  • How many parameters does Grok have?

    - Grok has 314 billion parameters.

  • What kind of model is Grok?

    - Grok is a mixture-of-experts model made up of eight experts.

  • Under which license is Grok published?

    - Grok is published under the Apache 2.0 license.

  • What impact could Grok's open-source release have?

    - It could have a large impact, because it gives the open-source AI community access to the kind of resources and talent that Elon Musk commands.

  • Could Grok's open-source release be legally problematic?

    - It could be: in the US there has been discussion of regulation under which people who publish model weights like this might even face jail time.

  • What is the significance of Grok's open-source release?

    - Rather than concentrating the power of AI in a few large tech companies, it gives far more people access and helps drive technical progress.

  • How should Grok's open-source release be judged?

    - It can be seen as Elon Musk following through on a promise, and as an important step toward more open progress in AI.

Outlines

00:00

🚀 Grok Goes Open Source and Why It Matters

Elon Musk and xAI, the company behind Grok, have released Grok as open source. Grok is a 314-billion-parameter mixture-of-experts model; because it is not fine-tuned, it can produce shocking statements. It is published under the Apache 2.0 license, so commercial use is allowed. The move is a big step for the open-source AI community: it keeps control of AI from being concentrated in a handful of large companies and promotes progress and sharing of the technology.

05:00

🌐 Grok's Specs and the State of Open-Source AI

Grok is a mixture of eight experts with 314 billion parameters, and it has not been trained for any particular task. That puts it alongside other large models such as GPT-4 and PaLM 2. Roughly 25% of Grok's weights are active on a given token. Among open-source models, Grok is about four times larger than the second-largest one. Elon Musk's open approach has a big influence on how AI evolves and spreads, putting the technology in the hands of people around the world.
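As a rough sanity check of those figures, here is a minimal back-of-the-envelope sketch in Python. It only applies the percentages quoted in the video to the published parameter count; the exact per-token figure depends on how much of the network is shared versus expert-specific, which the video does not break down.

```python
# Rough arithmetic using only the figures quoted in the video.
total_params = 314e9          # Grok-1: 314 billion parameters across all experts
active_fraction = 0.25        # ~25% of the weights said to be active per token
experts_total, experts_used = 8, 2

active_params = total_params * active_fraction
print(f"~{active_params / 1e9:.1f}B parameters active per token")   # ~78.5B

# Using 2 of 8 experts is also roughly a quarter of the expert weights,
# which is consistent with the quoted 25% (shared layers shift this slightly).
print(f"expert fraction used: {experts_used / experts_total:.0%}")  # 25%
```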

10:01

📢 What the Grok Release Means and What Comes Next

Open-sourcing Grok is a notable event: Elon Musk followed through on a promise and widened the circle of open-source AI. The release puts Grok alongside efforts by influential American players such as Facebook/Meta that are investing in open-source AI and narrowing the technology gap. As Grok's development continues, further open-source releases are expected. Seen together with everything else happening, this makes the current year one of the most interesting in human history, except for all the future years.

Keywords

💡Elon Musk

Elon Musk is an innovator and businessman who is central to this video. He is behind the company xAI and promised to release the AI model Grok as open source. That move is seen as having a major impact on the development of open-source AI.

💡XAI

xAI is the company that developed Grok and released it as open source. In this video, xAI plays the main role in advancing the AI technology and making it public.

💡Grock

Grok is the AI model developed by xAI and the central topic of this video. Grok has been released as open source; it is a mixture-of-experts model with 314 billion parameters.

💡Open Source

Open source means publicly releasing the design or source code of software, hardware, documentation, or other assets so that anyone can freely use, modify, and redistribute them. In this video, the open-sourcing of Grok is the main focus, and it could promote the sharing and progress of AI technology.

💡Mixture of Experts

"Mixture of experts" is a machine-learning technique that combines multiple experts (sub-models) and draws on each one's specialization to handle a given problem. The Grok discussed in this video is made up of eight experts, and the appropriate experts are selected to respond to each request. A toy example of this routing appears below.
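To make that routing concrete, here is a minimal, self-contained Python sketch of top-2-of-8 expert selection. The eight experts and two-per-token selection match what the video describes for Grok, but everything else (the tiny dimensions, random weights, and function names) is made up for illustration and is not xAI's actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8    # Grok reportedly uses eight experts
TOP_K = 2          # two experts are selected per token
D_MODEL = 16       # toy hidden size, just for the example

# Toy parameters: a router matrix plus one tiny linear "expert" each.
router_w = rng.normal(size=(D_MODEL, NUM_EXPERTS))
expert_w = rng.normal(size=(NUM_EXPERTS, D_MODEL, D_MODEL))

def moe_layer(token_vec):
    """Route one token through its top-2 experts and mix their outputs."""
    logits = token_vec @ router_w                            # score every expert
    top = np.argsort(logits)[-TOP_K:]                        # keep the two highest-scoring experts
    gate = np.exp(logits[top]) / np.exp(logits[top]).sum()   # softmax weights over the chosen experts
    outputs = [expert_w[i] @ token_vec for i in top]         # only 2 of the 8 experts actually run
    return sum(g * o for g, o in zip(gate, outputs)), top

token = rng.normal(size=D_MODEL)
out, chosen = moe_layer(token)
print("experts used for this token:", chosen)  # varies from token to token
```

Because only the selected experts run, most of the model's weights sit idle on any given token, which is why a 314-billion-parameter mixture can be cheaper to run than a dense model of the same size.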

💡Apache 2.0 License

The Apache 2.0 license is an open-source license that lets the author retain copyright while allowing others to use, modify, and redistribute the software. In this video, it matters that Grok is published under this license.

💡Neural Networks

A neural network is a computational model used in artificial intelligence that works in a way loosely analogous to the neurons in a human brain. In this video, the neural network is described as Grok's "digital brain" and is the essential component that learns from data during training.

💡Weights

Weights are the parameters in a neural network that determine the output produced for a given input. In this video, Grok's weights are what get adjusted during training so the network learns to respond to the data; a toy illustration follows.
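As a toy illustration of what "weights" means here (a generic sketch, not Grok's code): a single neuron's output is a weighted sum of its inputs, and training nudges those weights until the output matches a target. The released Grok files are, at heart, a very large collection of numbers like `w` below.

```python
import numpy as np

x = np.array([0.5, -1.0, 0.25, 2.0])   # a tiny fixed input
w = np.zeros(4)                         # the "weights" -- the numbers a model release publishes
target, lr = 1.0, 0.05                  # desired output and learning rate

for _ in range(100):
    y = w @ x                           # output is a weighted sum of the inputs
    grad = 2 * (y - target) * x         # gradient of the squared error with respect to w
    w -= lr * grad                      # training = nudging the weights to shrink the error

print(round(float(w @ x), 4))           # ~1.0 after training; w is the artifact you would publish
```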

💡GitHub

GitHub is a web-based platform where software developers host, share, and collaborate on code. In this video, it matters that xAI has published Grok's code on GitHub.

💡Ethics and Morals

Ethics and morals refer to standards for moral behavior and judgment. The video notes that open-sourcing Grok has ethical implications, but there is little concrete discussion of specific ethical or moral questions.

💡Regulation of AI

Regulation of AI refers to establishing laws and rules that govern how artificial intelligence is used and developed. In this video, Elon Musk's open-sourcing of Grok is described as fueling the debate over how AI should be regulated.

Highlights

Elon Musk and xAI open-sourced Grok on March 11th, as promised.

There was speculation on whether the open-sourcing would happen, but it was confirmed with a torrent link posted on Grok's profile page.

Grok has 314 billion parameters and is a mixture of experts with eight experts in total.

The model is not fine-tuned and may produce shocking statements; it is not for the faint of heart.

The open-sourced model is published under the Apache 2.0 license.

XAI has code on GitHub for those interested in learning more or downloading the model.

The model's weights represent the neural network's training on large data sets, which aligns its "digital brain" in a particular way.

The open-sourcing includes not only the weights but also the architecture that connects and operates the model.

The Apache 2.0 license allows for commercial, personal, and any other use rights.

The release was recent, so there may be inaccuracies that will be corrected in a follow-up video.

Grok's team Code of Conduct emphasizes being excellent to each other, with a simple one-line guideline.

Igor Babuschkin, hired by Elon Musk for the AI team, may have influenced the code of conduct.

The base model is trained on large text data and is not fine-tuned for specific tasks, making it potentially adaptable for various uses.

Grok has 25% of the weights active on a given token, making it an efficient mixture-of-experts model.

GPT-4 is estimated to be 1.76 trillion parameters, making it one of the largest models in comparison.

The open-sourcing of Grok is significant for the AI community, promoting progress in building open-source models.

There are discussions about regulating AI, with some advocating for laws that could punish the distribution of AI models like Grok with jail time.

Open-source AI models counterbalance the power of large tech corporations and promote accessibility and innovation.

Elon Musk's actions align with his previous promises and statements about the importance of open-source AI.

Sam Altman's tweet suggests that the current year could be one of the most interesting in human history, with significant developments in AI.

Transcripts

play00:00

Well, he did it. Elon Musk and xAI, the company behind Grok, as promised, open-sourced Grok on March 11th. Elon Musk had said "this week we will open source Grok," and then it didn't happen for a long, long time; they really waited till, well, today, Sunday. There was a lot of speculation on whether or not it was going to happen, and today Grok posts this: "weights in bio." So if you go to the bio, the profile page, there's a torrent link you can use to download it. You need some sort of torrent software, and then just this, and you're going to be able to download Grok. So what does that look like? It's 314 billion parameters, a mixture of experts, eight experts, and it's not RLHF'd. So, 314 billion parameters, and it's the base model, it's not fine-tuned, so it will say shocking things if you ask for them; not for the faint of heart. It's eight experts, a mixture of experts, published under the Apache 2.0 license, and xAI has code on GitHub.

play01:00

I'll put these links down in the show notes, and shout out to Andrew here, I'll post his profile as well, because it looks like he kind of did a deep dive and put together some stuff that's useful if you wanted to know more about it, or if you wanted to download it and spin it up yourself. So basically, with these LLM models you have the weights, the weights of the neural network: after you train them on large amounts of data, their brains, their digital brains, their neural networks align themselves a certain way, similar to how we have neural connections in our brains, and that's determined by weights, among other things. But that's not the entirety of it; you also have the architecture, all the stuff that kind of connects it and makes it work. So as far as I can tell, this is the whole shebang, the whole thing open sourced, completely laid bare. The Apache license means commercial use, personal use, whatever use, if I understand correctly; by the way, correct me if I'm wrong in the comments.

play01:59

So just a quick disclaimer: this got released like 30 minutes ago, so it is very possible that some of the things we read here or go over might be incorrect. We'll definitely do a follow-up video later making sure that everything's correct, but just keep this in mind: when I'm recording this, this thing is like 30 minutes old and it's still developing. And so they're saying "so unbelievably based" of Grok's team Code of Conduct; you know, there's no spiel about ethics, morals, whatever, just be kind. This, I believe, is Igor Babuschkin; he was the guy that Elon hired however many months ago to be on the AI team. Is this the code of conduct, "be excellent to each other"? Is that just one line? So we'll dive deep into this, but before we do, let me just cover some higher-level things and then we'll come back. By the way, this person, got to give him some credit for doing this: Andrew Kean Gao, and I'll link him in the show notes. Computer science at Stanford, Dropbox, Z Fellows.

play03:01

Z Fellows is, or was, kind of a community for the most determined technical founders; you know, they've got Naval Ravikant on there, they have the founders of Netflix, Eventbrite, Figma, DoNotPay, Tinder, DoorDash, and many, many more. Here's the kind of press-release blog post from x.ai, and they're saying it's a base model trained on large amounts of text data, not fine-tuned for any particular task. So again, the Grok you've interacted with in X, in Twitter, is probably a little bit different, in the sense that it was fine-tuned and there's probably other code and architecture hooked up to it that kind of combs through all the tweets that people make to pull out relevant information; again, I'm kind of assuming there. But this is the base model that you can now fine-tune and apply to your own particular task, potentially. So: a 314-billion-parameter mixture-of-experts model with 25% of the weights active on a given token.

play04:00

Mixture of experts means that depending on what you're asking for, it might get routed to a different part of the model, which just makes it faster and cheaper to run, and seemingly still very, very effective. We believe OpenAI's GPT-4 runs on that; Google Gemini, when they updated to Gemini 1.5, I think they used mixture of experts and that really bumped up how well it was running; Mixtral is a mixture of experts as well. So just to give you an idea, this is roughly the size of these models: GPT-4 we're estimating to be 1.76 trillion parameters, mixture of experts, so it's a big boy; PaLM 2, 340 billion; we don't know some of the other ones, a lot of them are not publishing it. This lists Grok at 33 billion, but this was from 2023, so it's maybe outdated or whatever, and there's a Llama 2 7-billion model. I believe this was posted November 2023, so it's a little bit out of date, but I think it gives you a good idea of where Grok is.

play05:00

It's not one of the bigger models; it's in line with a lot of the other models that are kind of behind GPT-4. All right, so let's dive in a little bit: 314 billion parameters, a mixture of eight experts. With mixture of experts, the parameter count is what you get when you add all of the experts together, so it's not really an apples-to-apples comparison to just one model that is just kind of one thing. Mixtral, for example, is represented there as 8 x 7 billion, so eight experts times around 7 billion each; the total is closer to 50 billion, but that's not really the same thing as a 50-billion-parameter model that's just one whole model. The tokenizer vocab size is similar to GPT-4, and they list the embedding size and 64 transformer layers. The people that want to really dig deep into this certainly should follow this tweet; we're not going to cover everything line by line here, but it looks like two experts out of eight are selected per token.

play06:02

So depending on what the use case is, different experts are pulled in to answer that question, and there's something about 8-bit quantization for the weights. Here's kind of a chart that represents the size in parameters; parameters are kind of like the synapses in our brain, I think it's fair to say it's analogous to that. So when we're comparing it to other open-source LLMs, we have the Mistral 7B, Mixtral, which is, you know, 46 billion or whatever, we have the Llama 65 billion, and here's Grok: it's four times bigger than the second biggest one. And as some people pointed out, you know, don't click on the link; this is the entire thing, you have to use the torrent to get it. So that's Igor Babuschkin; he was hired by Elon Musk to work on xAI, and once Grok posted the weights, the "Chad BT" app account said it "stole my whole joke," this whole idea of "weights in bio." Elon Musk continues with "tell us more about the 'open' part of OpenAI."

play07:00

Somewhere in there he was also saying that Sam Altman runs this account, so I'm not sure if he changed that part or not. So, shots fired. This, of course, is huge for the open-source community, for progress with building open-source models. Somebody with the resources and connections and access to talent that Elon Musk has being able to create something like this and then just making it free and accessible for everyone in the world is a big, big deal, and there's no putting that cat back in the bag, because at this point it's likely that it has been downloaded across all parts of the world, making it harder and harder to prevent people from using these. Now, across the world a lot of governments right now are deciding how to best approach regulating AI. Some want to completely outlaw open AI, as in what Elon Musk is doing here: "weights in bio" and then just dropping a torrent link. In fact, in the US there has been discussion of potentially giving jail sentences to people that do what Elon Musk just did.

play08:01

Now, that's not a law or anything, but it is being discussed. So this is from time.com: there was a report that was commissioned by the State Department in November 2022 as part of a federal contract worth a quarter of a million dollars, according to public records. It was written by Gladstone AI, a four-person company that runs technical briefings on AI for government employees, and I believe this is that same report, the headline that references the same report, or there might be a few different ones, but it's saying here: "US must move decisively to avert extinction-level threat from AI, government-commissioned report says." And so the report recommends setting a certain threshold so that companies can't train anything more powerful than GPT-4 and Google Gemini, and for those companies to require permission to train and deploy new models, and authorities should also "urgently consider" outlawing the publication of the weights, or inner workings, of powerful AI models under open-source licenses, with violations possibly punishable by jail time.

play09:01

So it's kind of an important thing to understand in all this that there are some factions that want this to be illegal, punishable by jail time, and they're saying it's because of safety, right, some sort of extinction-level event. But certainly this would really benefit the corporations that have political pull, that have invested a lot of money into building these models, and this would basically outlaw any competition, anybody using the free open-source models. If that were the case, we would all be reliant on these large tech corporations to have access to AI, and of course we would have to pay them; they could design the AI to reflect whatever worldview they wanted it to reflect, and their wealth and power and status would grow, their ability to build the world as they see fit would grow. Things like this, open-source AI, are sort of the counterbalance to that; that's what Elon Musk is talking about when he's saying that we need more open-source stuff, so this power isn't just concentrated in the hands of the few.

play10:00

So that's it for me. We'll do a follow-up when we have more information, as people download this thing; it's massive. Let me know what you think about this: is Elon Musk living up to the promises that he's made? Do you think he's doing the right thing by open sourcing Grok? It sounds like, as he keeps developing it, he will continue open sourcing it. Now that's two wealthy, sort of American, influences, Facebook/Meta and, you know, Elon Musk and the various companies that he has, kind of throwing their hats in the open-source AI ring. Yeah, let me know what you think. And we have Sam Altman posting this just a few hours before Grok dropped; he's saying this is the most interesting year in human history, except for all the future years, and whatever the case, I think he's spot-on about that. My name is Wes Roth, and thank you for watching.


Related Tags
Elon Musk, Grok, open source, AI, mixture of experts, parameters, neural networks, tech innovation, AI regulation, community