Elon Musk STUNNING Reveal of Grok | NOT aligned, MUCH Bigger, Open Source. There is no doubt left...
Summary
TLDR: Elon Musk and xAI have open-sourced Grok as promised. Released on March 11th, it is a 314-billion-parameter mixture-of-experts model. The model is not fine-tuned and may produce shocking statements. It is published under the Apache 2.0 license, so commercial use is allowed. For the open-source AI field, Elon Musk releasing a model of this scale to the world is very significant, putting the power of AI in far more hands.
Takeaways
- 🚀 Elon Musk and xAI, the company behind the model, open-sourced Grok on March 11th.
- 📅 The open-sourcing took longer than many expected, but it finally happened.
- 🔗 Grok's profile page has a torrent link for downloading the model.
- 🤖 Grok is a 314-billion-parameter mixture-of-experts base model that has not gone through RLHF (Reinforcement Learning from Human Feedback).
- 📜 The model is released under the Apache 2.0 license, and xAI's code is on GitHub.
- 🌐 The model can be used commercially, personally, and for a variety of other purposes.
- 🔄 Grok's team code of conduct is extremely concise, summed up in a single line about being excellent to each other.
- 🔢 Grok uses a mixture of experts, with about 25% of the weights active on a given token.
- 🔎 Open-source AI could raise legal issues, and regulation is being debated in the US.
- 🌍 The open release lets people around the world download and use the model, accelerating the spread of AI.
- 🌐 Open-source AI counterbalances the dominance of big tech companies and promotes broader sharing of the technology's power.
Q & A
When did Elon Musk open-source Grok?
-Elon Musk open-sourced Grok on March 11, 2024.
What speculation was there about Grok going open source?
-There was a lot of speculation about whether the open-sourcing would actually happen.
What software do you need to download Grok?
-You need torrent software to download Grok.
How many parameters does Grok have?
-Grok has 314 billion parameters.
What kind of model is Grok?
-Grok is a mixture-of-experts model with eight experts.
Under which license is Grok released?
-Grok is released under the Apache 2.0 license.
What impact could Grok's open-source release have?
-It could have a major impact, since the open-source AI community gains access to the kind of resources and talent Elon Musk commands.
Could Grok's open-source release be legally problematic?
-Possibly; in the US there have been discussions of regulation under which violators could even face jail time.
What is the significance of Grok's open-source release?
-Rather than concentrating the power of AI in a few big tech companies, it gives more people access and promotes technical progress.
How should Grok's open-source release be evaluated?
-It can be seen as Elon Musk following through on a promise and as an important step toward open progress in AI.
Outlines
🚀 Grok open-sourced and its impact
Elon Musk and his company xAI have released Grok as open source. Grok is a 314-billion-parameter mixture-of-experts model that is not fine-tuned, so it can produce shocking statements. It is published under the Apache 2.0 license and can be used commercially. The move is a major step for the open-source AI community: it helps keep control of AI from being concentrated in a few large companies and promotes progress and sharing of the technology.
🌐 Grok's specifications and the state of open-source AI
Grok is a mixture of eight experts with 314 billion parameters and is not trained for any particular task, which invites comparison with other large models such as GPT-4 and PaLM 2. About 25% of the weights are active on a given token. Among open-source models, Grok is roughly four times larger than the second-largest. Elon Musk's open approach has a significant effect on the evolution and spread of AI, making the technology available to people around the world.
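To make the mixture-of-experts idea above concrete, here is a minimal, hypothetical sketch of top-2 expert routing in PyTorch. It illustrates the general technique only, not xAI's implementation; every dimension and name below is made up for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Top2MoE(nn.Module):
    """Toy mixture-of-experts layer: a gating network picks 2 of 8 experts
    per token, so only a fraction of the layer's weights are used for any
    single token (the '25% of weights active' idea, in miniature)."""

    def __init__(self, d_model=64, d_ff=256, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)  # gating network
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x):                              # x: (num_tokens, d_model)
        gate_logits = self.router(x)                   # (num_tokens, n_experts)
        weights, chosen = gate_logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)           # mixing weights for the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for idx, expert in enumerate(self.experts):
                mask = chosen[:, slot] == idx          # tokens routed to this expert
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

tokens = torch.randn(5, 64)     # five tokens of a made-up 64-dim model
print(Top2MoE()(tokens).shape)  # torch.Size([5, 64])
```

The point of the structure is that the parameter count sums over all experts, but the compute per token only touches the experts the router selects.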
📢 Significance of the Grok release and outlook
Open-sourcing Grok is a notable event: Elon Musk followed through on a promise and widened the circle of open-source AI. The release draws attention as part of a broader push by influential American players, Facebook/Meta and Elon Musk among them, toward open-source AI. As development of Grok continues, further open-source releases are expected. It all fits the sentiment that this is one of the most interesting years in human history, except for all the future years.
Mindmap
Keywords
💡Elon Musk
💡XAI
💡Grok
💡Open Source
💡Mixture of Experts
💡Apache 2.0 License
💡Neural Networks
💡Weights
💡GitHub
💡Ethics and Morals
💡Regulation of AI
Highlights
Elon Musk and xAI have open-sourced Grok on March 11th.
There was speculation on whether the open-sourcing would happen, but it was confirmed with a torrent link posted on Grok's profile page.
Grok has 314 billion parameters and is a mixture of experts with eight experts in total.
The model is not fine-tuned and may produce shocking statements; it is not for the faint of heart.
The open-sourced model is published under the Apache 2.0 license.
xAI has code on GitHub for those interested in learning more or downloading the model.
The model's weights are what the neural network learns from training on large data sets; they align its digital brain in a particular way.
The open-sourcing includes not only the weights but also the architecture that connects and operates the model.
The Apache 2.0 license allows for commercial, personal, and any other use rights.
The release was recent, so there may be inaccuracies that will be corrected in a follow-up video.
Grok's Team Code of Conduct emphasizes being excellent to each other, with a simple one-line guideline.
Igor Babuschkin, hired by Elon Musk for the AI team, may have influenced the code of conduct.
The base model is trained on large text data and is not fine-tuned for specific tasks, making it potentially adaptable for various uses.
Grok has 25% of the weights active on a given token, making it a mixture of experts model for efficiency.
GPT-4 is estimated to be 1.76 trillion parameters, making it one of the largest models in the comparison.
The open-sourcing of Grok is significant for the AI community, promoting progress in building open-source models.
There are discussions about regulating AI, with some advocating for laws that could punish the distribution of AI models like Grok with jail time.
Open-source AI models counterbalance the power of large tech corporations and promote accessibility and innovation.
Elon Musk's actions align with his previous promises and statements about the importance of open-source AI.
Sam Altman's tweet suggests that the current year could be one of the most interesting in human history, with significant developments in AI.
Transcripts
well he did it so Elon Musk and xAI the
company behind Grok as promised open
sources Grok on March 11th Elon Musk
said this week we will open source Grok
it didn't happen for a long long time
they really waited till well today
Sunday a lot of speculation on whether
or not it was going to happen and today
Grok posts this weights in bio so if you go
to kind of the bio the profile page
there's a torrent link you can
use to download it so you need some sort
of torrent software and then just this
and you're going to be able to download
Grok so what does that look like so it's
314 billion parameters a mixture of
experts eight experts and it's not RLHF
so 314 billion parameters so it's the
base model it's not fine-tuned so it
will say shocking things if you ask for
them not for the faint of heart it's
eight experts mixture of experts
published under the Apache 2.0 license
xAI has code on GitHub I'll put these
links down in the show notes and shout
out to Andrew here I'll post his profile
as well 'cause he looks like kind of did a
deep dive and put together some stuff
that's uh useful if you wanted to know
more about it or if you wanted to
download it and uh spin it up yourself
so basically with these LM models you
have the weights so the weights of the
neural network so kind of after you
train them on large amounts of data
their brains their digital brains their
neural networks you know align
themselves in a certain way similar to
how we have neural Connections in our
brains that's determined by weights
among other things but that's not the
entirety of it you also have the
architecture so that's all the stuff
that kind of connects it and makes it
work so as far as I can tell I mean this
is the whole shebang this is the whole
thing open sourced completely laid bare
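As a rough illustration of the weights-versus-architecture distinction just described (a toy sketch, not Grok's actual code, which xAI publishes separately on GitHub): the architecture is the code that wires the layers together, and the weights are the trained numbers shipped as a checkpoint you load into that code. All names and sizes below are hypothetical.

```python
import torch
import torch.nn as nn

# "Architecture": code that defines how the layers connect (toy example).
class TinyLM(nn.Module):
    def __init__(self, vocab_size=1000, d_model=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.block = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, token_ids):                 # token_ids: (batch, seq_len)
        return self.head(self.block(self.embed(token_ids)))

model = TinyLM()

# "Weights": the numbers learned during training, shipped as a checkpoint file.
# (Hypothetical filename; Grok-1's real checkpoint is the large torrent download.)
torch.save(model.state_dict(), "tiny_lm.pt")
model.load_state_dict(torch.load("tiny_lm.pt"))
print(sum(p.numel() for p in model.parameters()), "parameters")
```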
Apache license means commercial use
personal use whatever use right if I
understand correctly by the way correct
me if I'm wrong in the comments and I'll
so just a quick disclaimer I mean this
got released like 30 minutes ago so it
is very possible that some of the things
that we read here or go over they they
might be incorrect so we'll definitely
do a follow-up video later kind of
making sure that everything's correct
but just keep this in mind so when I'm
recording this this this thing is like
30 minutes old and it's it's still uh
developing and so they're saying so
unbelievably based of Grok's Team Code
of Conduct you know there's no spiel
about ethics morals whatever right just
be kind this I believe is Igor
Babuschkin so he was the guy that Elon
hired however many months ago to be on
the AI team is this the code of conduct
be excellent to each other is that just
one line so we'll dive deep into this
but before we do let me just cover some
higher end things and then we'll come
back by the way this person got to give
him some credit for doing this Andrew
Kean Gao and I'll link them in the show
notes so computer science at Stanford L
ji Dropbox Zillow Z Fellows was kind
of a community for the most determined
technical Founders you know they got
Naval Ravikant on there they have founder
of Netflix Eventbrite Figma DoNotPay
Tinder founder DoorDash and many many
more here's the kind of press release
blog post from x.ai and so they're saying
base model trained on large amounts of
text Data not fine-tuned for any
particular task so again it's likely
that the Grok that if you've interacted
with Grok in X in Twitter right so that's
probably a little bit different in the
sense that it was fine-tuned and there's
probably other code architecture that's
hooked up to it right kind of combs
through all the tweets that people make
to pull out relevant information again
I'm kind of assuming there but this is
the like the base model that you can now
fine-tune and apply to your own
particular task potentially right so 314
billion parameter mixture of experts
model with 25% of the weights active on
a given token so mixture of experts
meaning that depending on what you're
asking for it might get routed to a
different sort of part of that model
which just makes it you know faster and
cheaper to run and uh seemingly very
very effective still so we we believe
OpenAI GPT-4 runs on that Google Gemini
when they updated to Gemini 1.5 I think
they they used mixture of experts and
that really bumped up how well that was
running Mixtral is uh mixture of
experts as well so just to give you an
idea this is roughly kind of the size of
these models so GPT-4 we're
estimating it to be
1.76 trillion mixture of experts right
so it's it's a big boy and PaLM 2 340
billion we don't know some of the other
ones I mean a lot of them are not
publishing it this lists Grok at 33
billion but this was uh from 2023 so
this is you know maybe outdated or
whatever but like there's a Llama 2 7
billion model so I believe this was
posted November 2023 so it's a little
bit out of date but I think it gives you
a good idea of where Grok is so it's
not one of the bigger models it's in
line with a lot of the other models that
are kind of behind GPT-4 all right so
let's dive in a little bit to 314 billion
parameters mixture of eight experts so
with mixture of experts the
parameter count is when we add all of the
experts together so it's not really an
Apples to Apples comparison to just one
model that has that is just kind of like
one thing so like Mixtral represents there
as 8 * 7 billion so 8 experts times you
know around 7 billion each so the total
it's closer to 50 billion but but that's
not really the same thing as a 50
billion parameter model that's just one
whole model so tokenizer vocab size is
uh similar to GPT 4 and they list the
embedding size 64 Transformer layers the
people that want to really dig deep into
this certainly should follow this tweet
we're not going to cover everything line
by line here but it looks like two
experts out of eight selected per token
so depending on what the use case is
different experts are pulled in to
answer that question and something about
8-bit quantization for the weights
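Pulling together the figures just mentioned, here is a small arithmetic sketch in Python. The numbers are as reported in the video and the linked thread, not independently verified.

```python
# Reported Grok-1 figures from the video / linked thread (approximate, unverified).
grok1 = {
    "total_params": 314e9,          # 314B parameters summed across all experts
    "num_experts": 8,
    "experts_per_token": 2,         # two of eight experts selected per token
    "transformer_layers": 64,
    "active_weight_fraction": 0.25, # ~25% of weights active on a given token
}

active = grok1["total_params"] * grok1["active_weight_fraction"]
print(f"~{active / 1e9:.0f}B parameters active per token")  # roughly 78B

# The transcript's Mixtral comparison: naively 8 experts x ~7B each looks like ~56B,
# but the experts share some weights, so the real total is lower (around 47B;
# the video rounds it to "closer to 50 billion").
print(f"naive Mixtral total: {8 * 7e9 / 1e9:.0f}B")
```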
here's kind of a chart that represents
the size in parameters so parameters are
kind of the synapses in our brain I
think that's fair to say it's analogous
to that so right so when we're comparing
it to other open source llms right so we
have the Mistral 7B Mixtral which you
know the 46 billion or whatever right we
have the Llama 65 billion and here's
Grok so it's four times bigger than the
second biggest one and as some people
pointed out you know don't click on the
link this is the entire thing you have
to to use the torrent to get that so
that's Igor Babuschkin so he is he was
hired by Elon Musk to work on xAI and
once Grok posted the weights uh ChatGPT
the app says stole my whole joke right
this whole idea of weights in bio Elon
Musk continues with tell us more about
the open part of OpenAI somewhere in
there he was also saying that Sam Altman
runs this account so I'm not sure if he
changed that part or not so shots fired
so this of course is huge for the open
source Community for progress with
Building open source models somebody
with the resources and connections and
access to talent that Elon Musk has
being able to create something like this
and then just making it free and
accessible for everyone in the world is
a big big deal and there's no putting
that cat back in the bag because at this
point it's likely that that has been
downloaded across all parts of the world
making it harder and harder to prevent
people from using these now across the
world a lot of governments right now are
deciding how to best approach regulating
AI some want to completely Outlaw open
AI as in what Elon Musk is doing here
right weights in bio and then just
dropping a torrent Link in fact in the
US there has been discussion of
potentially giving jail sentences to
people that do what Elon Musk just
did now that's not a law or anything but
it is being discussed so this is from
time.com so there was a report that was
commissioned by the State Department in
November 2022 as part of a federal
contract worth a quarter of a million
according to public records it was
written by Gladstone AI a four-person
company that runs technical briefings on
AI for government employees and I I
believe this is that same report the
headline that that references the same
report or there might be a few different
ones but it's saying here US must move
decisively to avert extinction level
threat from AI government commission
report says and so the report recommends
setting a certain threshold so the
companies can't train anything more
powerful than GPT-4 and Google Gemini
and for those companies to require
permission to train and deploy new
models and authorities should also
urgently consider outlawing the
publication of the weights or inner
workings of powerful AI models under
open-source licenses with violations
possibly punishable by jail time so it's
kind of an important thing to understand
in all this that there's some factions
that want this to be illegal punishable by
jail time and they're saying it's
because of safety right some sort of an
extinction level event but certainly
this would really benefit the
corporations that have political pull
that have invested a lot of money into
building these models and this would
basically Outlaw any competition anybody
to use the free open source models if
that was the case we would all be reliant
on these large Tech corporations to have
access to AI and of course we would have
to pay them they could you know design
the AI to reflect whatever world view
they wanted to reflect and their wealth
and power and Status would grow their
ability to build the world as they see
fit would grow and things like this open
source AI is sort of the counterbalance
to that that's what Elon Musk is talking
about when he's saying that we need more
open-source stuff so this power isn't
just concentrated in the hands of the
few so that's it for me we'll do a
followup when we have more information
as people download this thing it's
massive let me know what you think about
this is Elon Musk living up to the
promises that he's made do you think
he's doing the right thing by open
sourcing Grok it sounds like as he keeps
developing it he will continue open
sourcing it now that's two wealthy sort
of American influences Facebook/Meta
and you know Elon Musk and the various
companies that he has kind of throwing
their hat in the open-source AI
ring yeah let me know what you think and
we have Sam Altman posted this just a few
hours before Grok dropped he's saying
this is the most interesting year in
human history except for all the future
years and whatever the case I think he's
spot on about that my name is Wes Roth and
thank you for watching