I'm Buying These 3 Stocks to Get Rich (Without Getting Lucky)
Summary
TL;DR: The video explores the growing energy demands caused by AI, particularly with tools like ChatGPT, which consumes significantly more power than a traditional Google search. It highlights companies like Vertiv, Broadcom, and Marvell that are creating solutions to manage the energy consumption and cooling challenges in data centers. The AI boom is projected to increase power demand from data centers by 160% by 2030, making energy-efficient infrastructure crucial. The video also touches on custom AI chips and how investors can capitalize on these trends through innovations in cooling, chip design, and data center optimization.
Takeaways
- **AI Power Consumption**: A single ChatGPT query consumes up to 10 times the electricity of a Google search.
- **Growing Power Demand**: Power demand from data centers has doubled in the last 5 years and is predicted to grow by another 160% by 2030.
- **Global AI Market Growth**: The global AI market is expected to grow almost 12-fold over the next 8 years, a CAGR of 36.8%.
- **Cooling Solutions**: Vertiv Holdings provides power and thermal management solutions, including liquid cooling systems for data centers.
- **Custom AI Chips**: Broadcom designs custom power-efficient AI chips for tech giants like Google and Meta Platforms.
- **Efficient AI Chips and Switches**: Marvell makes efficient AI chips and switches for data centers, aiming to reduce power consumption.
- **AI Training Energy**: Training large AI models, especially trillion-parameter models, is very energy-intensive.
- **Data Center Infrastructure**: Many data centers need to make infrastructure changes to support more efficient cooling and AI workloads.
- **Investment Opportunities**: The Fundrise Innovation Fund offers a way to invest in top private pre-IPO companies in the AI and data infrastructure space.
- **Broadcom's Dominance**: Broadcom leads the ASIC market with a 55-60% share and is a key player in AI and data center infrastructure.
Q & A
How much electricity does a single ChatGPT query consume compared to a Google search?
-A single ChatGPT query consumes up to 10 times the electricity of a Google search. Specifically, a ChatGPT prompt costs 2.9 watt-hours, while a Google search uses around 0.3 watt-hours.
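The per-query figures above can be turned into a rough back-of-envelope estimate with a few lines of Python. The household consumption figure (roughly 10,700 kWh per year for an average US home) is an assumption of mine, not a number from the video:

```python
# Per-query energy figures cited in the video
CHATGPT_WH = 2.9   # watt-hours per ChatGPT prompt
GOOGLE_WH = 0.3    # watt-hours per Google search

# The ratio behind the "up to 10 times" claim
ratio = CHATGPT_WH / GOOGLE_WH  # about 9.7x

# Extra energy if all ~9 billion daily searches used generative AI instead
searches_per_day = 9e9
extra_wh_per_year = searches_per_day * (CHATGPT_WH - GOOGLE_WH) * 365
extra_twh = extra_wh_per_year / 1e12  # "on the order of 10 terawatt-hours"

# Assumed average US household usage: ~10,700 kWh/year (not from the video)
HOME_WH_PER_YEAR = 10_700e3
homes_powered = extra_wh_per_year / HOME_WH_PER_YEAR  # "almost a million homes"

print(round(ratio, 1), round(extra_twh, 1), round(homes_powered))
```

Running this gives roughly 8.5 TWh per year and about 800,000 homes, consistent with the video's "on the order of 10 terawatt-hours" and "almost a million homes" framing.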
What is Goldman Sachs' prediction for power demand from data centers by 2030?
-Goldman Sachs predicts that power demand from data centers will grow by another 160% by 2030.
What is the significance of Vertiv's liquid cooling systems in the context of AI power consumption?
-Vertiv's liquid cooling systems are significant because they provide power and thermal management solutions for data centers, which are crucial for handling the high power consumption of AI applications. These systems are designed for high-density deployments and can integrate with existing data center infrastructure.
What is Broadcom's role in the AI chip market?
-Broadcom is a leader in the ASIC market with a dominant 55 to 60% share and a major focus on AI and data center infrastructure. They co-designed the last six generations of Google TPUs and are involved in designing custom power-efficient AI chips for tech giants like Google and Meta Platforms.
How does Marvell's involvement in the ASIC market compare to Broadcom's?
-Marvell is the second-largest company in the ASIC market with around a 14% share. It has partnerships with hyperscale customers and is expected to double its share of the overall data center accelerator market in the coming years.
What is the projected growth rate of the global artificial intelligence market over the next 8 years?
-The global artificial intelligence market is expected to grow almost 12-fold over the next 8 years, a compound annual growth rate of 36.8%.
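As a sanity check, the relationship between a total growth multiple and its CAGR is multiple^(1/years) − 1. A minimal sketch (the helper name is mine, not from the video):

```python
def cagr(multiple: float, years: float) -> float:
    """Compound annual growth rate implied by a total growth multiple over a period."""
    return multiple ** (1 / years) - 1

# A 12x market over 8 years implies roughly 36.4% per year; the video's
# 36.8% corresponds to a multiple slightly above 12x, which is consistent
# with the "almost 12x" wording
growth = cagr(12, 8)
print(f"{growth:.1%}")  # prints "36.4%"
```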
How does the power consumption of AI models during training compare to inference?
-Training and retraining large AI models is very energy-intensive, especially when we're talking about trillion parameter models. For example, GPT-4 took over 50 gigawatt-hours to train, which is about 0.2% of the electricity generated by the entire State of California over a year.
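To put 50 gigawatt-hours in perspective, here is a rough conversion into household-years and equivalent ChatGPT prompts. The household figure (~10.7 MWh per year for an average US home) is an assumption of mine, not from the video:

```python
TRAIN_WH = 50e9            # 50 GWh for GPT-4 training, per the video
QUERY_WH = 2.9             # watt-hours per ChatGPT prompt, per the video
HOME_WH_PER_YEAR = 10.7e6  # assumed average US household annual usage

household_years = TRAIN_WH / HOME_WH_PER_YEAR  # ~4,700 homes powered for a year
equivalent_queries = TRAIN_WH / QUERY_WH       # ~17 billion prompts

print(round(household_years), f"{equivalent_queries:.2e}")
```

In other words, one training run consumes about as much electricity as several thousand homes use in a year, or on the order of tens of billions of individual prompts.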
What is the projected growth rate for the liquid cooling market for data centers by 2030?
-The liquid cooling market for data centers is expected to more than quadruple by 2030, which would be a compound annual growth rate of 27.6% for the next 6 years.
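Running the 27.6% figure forward confirms the "more than quadruple" wording: compounding at that rate for 6 years gives a bit over a 4x multiple. A quick check (variable names are mine):

```python
rate = 0.276  # CAGR cited in the video
years = 6

# Compounding forward: (1 + rate)^years gives the total growth multiple
multiple = (1 + rate) ** years  # about 4.3x, i.e. "more than quadruple"
print(f"{multiple:.2f}x")
```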
How does the age of the US power grid impact the power consumption of AI?
-The average age of the US power grid is around 40 years old, with over a quarter of the grid being 50 years old or older. This aging infrastructure can struggle to meet the increased power demands of AI, which can exacerbate power consumption issues.
What is the significance of the Fundrise Innovation Fund in relation to AI investments?
-The Fundrise Innovation Fund provides regular investors access to some of the top private pre-IPO companies in the AI and data infrastructure sectors, allowing them to invest in the next generation of AI applications before they go public.
Outlines
The Power Struggle of AI Growth
This paragraph discusses the significant energy consumption of artificial intelligence, particularly highlighting that a single AI query can consume up to 10 times the electricity of a Google search. It outlines the rapid growth in power demand from data centers, which has doubled in the last five years and is predicted to increase by another 160% by 2030. The speaker plans to discuss the energy consumption of AI, the impact on data centers, and companies like Vertiv, Broadcom, and Marvell that are addressing this issue by providing power and thermal management solutions, custom AI chips, and efficient AI chips and switches. The energy cost of training AI models, such as GPT-4, is also emphasized, along with the importance of factors like data center design, power grid age, and the increasing power efficiency of GPUs.
Investing in the AI Boom: Opportunities and Challenges
The second paragraph focuses on the exponential growth of the AI market, which is expected to increase almost 12-fold over the next 8 years. It discusses the trend of companies delaying their IPOs, potentially causing investors to miss out on significant returns. The paragraph introduces the Fundrise Innovation Fund as a way for regular investors to access top private pre-IPO companies. It also covers the shift toward direct-to-chip liquid cooling in data centers, which is predicted to quadruple by 2030, and Vertiv's expansion plans to meet this demand. Additionally, it touches on the custom chips used by tech giants like Amazon, Microsoft, and Google, and the competition in the ASIC market, particularly between Broadcom and Marvell.
Broadcom and Marvell: Powering AI with ASICs
This paragraph delves into the roles of Broadcom and Marvell in the ASIC market, with Broadcom leading with a 55-60% share and a significant focus on AI and data center infrastructure. It mentions Broadcom's involvement in designing Google's TPUs and its potential to expand into custom chips for other companies. The paragraph also discusses Marvell's position as the second-largest ASIC company, its partnerships with hyperscale customers, and its focus on data center products. The growth of Broadcom's networking revenue and the potential for the global AI ASIC market to grow significantly by 2033 are also highlighted.
The Future of Data Center Power Demand and Efficiency
The final paragraph addresses the increasing power demand of data centers due to AI, which is expected to rise by 15-20% annually through 2030, potentially reaching 16% of total US power consumption. It emphasizes the need for more than just efficient cooling and ASICs to meet this demand, suggesting that the companies mentioned earlier will play crucial roles. The speaker encourages understanding the science behind these stocks for investment purposes and signs off with a reminder that the best investment is in oneself.
Keywords
Artificial Intelligence (AI)
Data Centers
Power Consumption
Thermal Management
ASIC (Application-Specific Integrated Circuit)
GPU (Graphics Processing Unit)
Efficiency
Vertiv Holdings
Broadcom
Marvell
Investment
Highlights
AI's growing power consumption is a major concern, with ChatGPT queries using 10 times the electricity of a Google search.
Goldman Sachs predicts power demand from data centers will grow by 160% by 2030, driven by AI workloads.
Vertiv Holdings (VRT) provides power and thermal management solutions, including liquid cooling systems, to help data centers handle high-density AI applications.
AI model training is energy-intensive; for instance, GPT-4 used over 50 gigawatt-hours of electricity, equivalent to 0.2% of California's annual power.
Nvidia's latest Hopper GPUs consume 700 watts each, nearly double the power of the previous generation, contributing to rising power demands.
Direct-to-chip liquid cooling, which is 3,000 times more effective than air cooling, is expected to grow rapidly, quadrupling by 2030.
Broadcom holds a 55-60% market share in the custom chip (ASIC) market, co-designing Google's TPUs and Meta's training and inference accelerators.
Broadcom's custom AI chip program is expected to generate $8 billion in revenue in 2024 and $10 billion in 2025, mostly from Google's TPUs.
Vertiv's production capacity for liquid cooling systems is projected to expand by 45 times in 2024, with a $7 billion backlog in orders.
Broadcom's networking products, including Tomahawk Ethernet switches, saw 44% year-over-year revenue growth in a single quarter.
Nvidia dominates the data center GPU market, with estimates of 92-98% market share, but faces rising competition from custom AI chips.
Companies like Amazon, Google, and Microsoft are developing their own AI chips (ASICs) to reduce reliance on Nvidia GPUs.
AI's rapid growth is expected to drive data center power consumption to 16% of total U.S. electricity by 2030.
The liquid cooling market is expected to grow at a compound annual rate of 27.6% until 2030, driven by high-density AI computing needs.
Marvell Technology, the second-largest player in the ASIC market, is growing its data center business with custom AI chip programs, partnering with major tech giants.
Transcripts
artificial intelligence has a huge
problem a single chat GPT query can take
up to 10 times the electricity of a
Google Search and while power demand
from data centers has already doubled
Over The Last 5 Years Goldman Sachs
predicts that it'll grow by another 160%
by 2030 so in this episode I'll
highlight a few companies that are
tackling this exact problem positioning
them to win big no matter which AI
companies come out on top making them a
great way to get rich without getting
lucky your time is valuable so let's get
right into it first things first I'm not
here to waste your time so here's
exactly what I'm going to cover I'll go
over the big Power problem that AI is
causing right now I'll talk about Vertiv
which is a company that provides power
and thermal Management Solutions for
data centers broadcom which designs
custom power efficient AI chips for Tech
giants like Google and meta platforms
and Marvell a company that also makes
efficient AI chips and switches for data
centers it's important to understand
just how much power AI is projected to
consume over the next few years so let
me break that down first a single chat
GPT prompt costs 2.9 watt-hours that's
like keeping a 5w LED bulb on for a
little over half an hour while using AI
to generate a single image can cost as
much electricity as charging your phone
compare that to a Google search which
also takes in a query and returns text
and images but only uses around 0.3
watt-hours in the process a couple of
watt-hours may not seem like much but there
are roughly 9 billion Google searches
every day and if we move them all to
generative AI it would take on the order
of 10 terawatt-hours more to serve all of
those requests that's enough electricity
to power almost a million homes for an
entire year but this is actually a bad
comparison because people don't use
generative AI tools the same way they
use Google for example chat GPT tends to
be more of a dialogue between the user
and an AI model that can really rack up
the number of queries compared to a
Google Search and that's just the
inference side of the story training and
retraining large AI models is very
energy intensive too especially when
we're talking about trillion parameter
models for example GPT-4 took over 50
gigawatt-hours to train or about 0.2% of the
electricity generated by the entire
State of California over a year as an
investor this worries me for three
reasons first the amount of compute
needed to train AI models has been
doubling roughly every 6 months talk
about exponential growth second that
gets multiplied by the number of
foundation models being trained which is
also growing exponentially and third
even though you can use chat GPT almost
anywhere in the world it consumes energy
only at the server's location energy
accounts for up to 70% of a data
Center's total cost of operations
so the hardware and the racks how the
data center facility is designed and
laid out and even the age of its
local power grid all really matter and
by the way the average age of the US
power grid is around 40 years old with
over a quarter of the grid being 50
years old or older and AI only makes
this problem worse for example nvidia's
previous generation a100 gpus use about
400 watts but the current generation of
Hopper gpus run at 700 Watts that's
almost twice the power and four or five
times the power of CPU based servers
it's worth noting that the h100 gpus are
up to nine times faster at AI training
and 30 times faster for inference over
the A100s so the power efficiency of
nvidia's gpus is going way up with every
generation but Power demand is going up
faster so it takes more than just high
performance gpus to solve this problem
let's start with cooling since that
accounts for up to 40% of a data
Center's energy use which means 28% of
the total cost of operations Vertiv
Holdings ticker symbol VRT provides
power and thermal Management Solutions
for data centers like their Liebert
liquid cooling systems these systems are
built specifically for highdensity
deployments like the ones that power
intense AI applications providing cold
plates and direct-to-chip cooling in a
way that integrates with existing data
center infrastructures which is a big
deal because around 90% of all server
racks are air cooled today
in fact most data centers even run their
h100 chips at low enough power so they
can be air cooled so a lot of them will
need to make massive infrastructure
changes to support direct to chip liquid
cooling for nvidia's upcoming Blackwell
systems if they want to run those chips
at Peak Performance that includes
hyperscalers like Amazon Google and
Microsoft all of which need to support
power hungry AI workloads for thousands
of other businesses speaking of which
according to Market us the Global
artificial intelligence Market is
expected to almost 12x over the next 8
years which is a compound annual growth
rate of
36.8% but many of the companies building
the next generation of AI applications
are not publicly traded think about the
9s and early 2000s companies like Amazon
and Google went public very early in
their growth cycle but today companies
are waiting an average of 10 years or
longer to go public that means investors
like us can miss out on most of the
returns from the next Amazon the next
Google the next Nvidia so I spent a lot
of time digging into this and The
fundrise Innovation fund is a great way
to invest in some of the best tech
companies before they go public venture
capital is usually only for the ultra
wealthy but Fundrise's Innovation fund
gives regular investors access to some
of the top private pre-ipo companies on
Earth without breaking the bank The
fundrise Innovation fund also has an
impressive track record already
investing over $100 million into some of
the largest most in-demand AI and data
infrastructure companies so if you want
access to some of the best late-stage
AI companies before they IPO check out
the fundrise Innovation fund with my
link below today all right so 90% of all
server racks are air cooled today but
industry estimates suggest that up to
80% of data center cooling will become
direct to chip liquid cooling over time
direct to chip liquid cooling is where a
heat conductive copper plate sits on top
of a chip just like a normal heat
sink but instead of being air cooled by
a fan the plate is connected to two
pipes one pipe brings in Cool Water to
absorb the heat from the plate and the
other pipe moves hot water away direct
to chip liquid cooling is up to 3,000
times more effective than air Cooling
and better cooling means that servers
can be stacked closer together without
overheating every data center has a
fixed amount of space so they need to
optimize their cooling if they want to
squeeze the most compute out of their
entire facility as a result the liquid
cooling market for data centers is
expected to more than quadruple by 2030
which would be a compound annual growth
rate of
27.6% for the next 6 years and Vertiv
definitely knows that according to their
quarter 2 earnings call they're on track
to expand their liquid cooling
production capacity by a whopping 45x
over the course of 2024 Vertiv also
doubled their production capacity for
power management products over the last
3 years and they plan to double It Again
by the end of 2025 all of these
expansions should lead directly to more
Revenue since Vertiv is currently limited
by Supply not demand Vertiv had a $7
billion backlog of orders at the end of
quarter 2 which was up 11% quarter over
quarter and 47% year over-year Vertiv
stock is already up around 120% year to
date and I definitely think they have
plenty of room to run as the AI boom
continues compute and connectivity are
also energy intensive so let's talk
about them next today Nvidia has a
massive share of the data center GPU
Market with estimates ranging from 92%
all the way to 98% market share but gpus
are not the only way to process intense
AI workloads over the last 3 years I've
spent a lot of time covering the custom
chips used by Amazon web services
Microsoft Azure and Google Cloud these
custom chips are called ASICs application
specific integrated circuits and they do
exactly what their name implies their
design is tailored to a specific
application which simplifies the Chip's
architecture the result is a chip that
can run a narrow set of workloads
extremely efficiently at the cost of
supporting fewer kinds of workloads than
more General processors like gpus and
CPUs so as Amazon Microsoft Google and
their Cloud clients need more support
for a specific kind of workload like
running large language models
synthesizing speech from text or
generating images they could make a chip
for that workload and free up their more
expensive Nvidia infrastructure for
other tasks all three hyperscalers are
making big investments into their own
semiconductor Supply chains to reduce
their overall Reliance on Nvidia over
time Amazon has their Inferentia and
Trainium chips for AI inference and
training respectively Microsoft has
their Azure Maya accelerators and of
course Google has their tensor
processing units or tpus the demand for
ASICs is so high that even Nvidia is
building a new business unit focused on
making custom chips for other companies
which could help extend their Cuda
ecosystem to new kinds of chips but
Nvidia will have some serious
competition in this space from rival
companies like Marvell Technology and
broadcom so let's talk about them next
broadcom is the leader of the Asic
Market with a dominant 55 to 60% share
and a major focus on AI and data center
infrastructure broadcom co-designed the
last six generations of Google tpus and
that partnership got extended to the
next generation of tpus as well which
shows just how sticky these chip design
relationships can be once they're up and
running Google claims that their sixth
generation Trillium tpus are 67% more
energy efficient than their current
generation with 4.7 times more Peak
compute performance and JP Morgan
analysts estimate that broadcom's TPU
program will generate $8 billion in
Revenue in 2024 and another 10 billion
in 2025 and that's just from Google's
tpus broadcom is also behind every
generation of the MTI chips which are
meta's training and inference
accelerators and broadcom's Ambitions
for custom AI chips don't stop with
Google or meta in July broadcom was
rumored to be in talks with open AI to
design ASICs for them as well but who
knows what'll happen now that open ai's
chief technology officer Mira Murati is
leaving and open AI is becoming a
for-profit company let me know if you
want a separate Deep dive on all of that
and the resulting drama but broadcom
makes more than just custom chips for
Tech Giants according to broadcom CEO
Hock Tan more than
99.5% of all internet traffic Touches at
least one broadcom chip Jim I got to
tell you in
99.5% of every bit of data that flows in
the internet will cross at least one or
more broadcom chip Broadcom has several
lines of network switches that also use
ASICs designed to optimize Network traffic
by choosing the best path for data
packets based on the Network's layout
and its current conditions broadcom's
Tomahawk series is tailored specifically
for ethernet switches and their Jericho
line is for more complex networks that
need core and Edge Computing this past
quarter broadcom's networking Revenue
grew by 44% year-over-year after they
doubled the number of switches that they
sold and broadcom stock is up
nearly 60% year-to date broadcom may not
be another Nvidia but honestly that's a
good thing too because it's exposed to
all the growth that AI has to offer in a
different way than Nvidia ethernet
versus infiniband and ASICs versus gpus
that's why holding broadcom and Nvidia
is a great way to have your cake and eat
it too by the way the global market for
AI ASICs is expected to roughly 10x in
size by 2033 which is a compound annual
growth rate of around 30% for the next
10 years ASICs currently represent 16% of
the total data center accelerator market
and I expect that to meaningfully
increase over time as AI applications
get more diverse and power consumption
becomes an even bigger issue than it is
today Marvell is the second largest
company in the Asic Market only behind
broadcom with around a 14% share so
holding both broadcom and Marvell stock
means owning roughly 75% of the entire
Asic Market according to a recent call
with Wall Street analysts Marvell
currently has around a 10% share of the
overall data center accelerator market
and expects to double that over the next
few years back in May Marvell confirmed
that they partnered with at least three
hyperscale customers which analysts
currently believe are Amazon web
services Trainium and Inferentia chips
Microsoft azure's Maya accelerators and
Google's arm-based Axion data center
CPUs and going back to our original
Power problem Google claims their Axion
CPU is 60% more power efficient than
comparable x86 chips that are produced
by AMD and Intel but compute is just one
side of the story moving data over a
massive network is also extremely energy
intensive picking the right Network
switches can reduce power consumption by
around 30% Marvell dominates the space
with a range of products like Optical
and copper transceivers high performance
Optical interconnects for data centers
and a broad range of ethernet switches
in March Marvell extended their
longstanding partnership with tsmc to
develop their next generation of AI
infrastructure products using tsmc's 2
nanometer process technology it's worth
noting that Marvell stock hasn't done
nearly as well as broadcom since
Marvell's total revenues have been in
Decline but under the hood Marvell's
Revenue mix is Shifting substantially
with data centers now accounting for
almost 70% of their total revenue last
quarter up from around 33% at the start
of 2023 kind of like how Nvidia saw a
sharp decline in their revenue when they
pivoted from gaming gpus to Data Center
chips a few years ago case in point
Marvell's Data Center business grew 7%
quarter over quarter and a whopping 87%
year over-year and for this quarter
Marvell forecast data center revenues to
accelerate even faster thanks to their
custom AI chip programs which are
beginning to ramp right now the AI boom
is expected to increase data center
Demand by 15 to 20% every year through
2030 at which point they could reach a
whopping 16% of total us power
consumption that's the same demand as
about 2/3 of all the homes in the United
States and while every generation of
nvidia's gpus is more efficient than the
last AI has caused the overall demand
for data center power to far exceed
Supply it will take much more than
efficient cooling solutions from
companies like Vertiv or the ASICs for AI
Computing designed by broadcom and
Marvell but in my opinion all three of
these companies will end up being
important pieces of the puzzle and
that's why it's so important to
understand the science behind the stocks
and if you want to see what else I'm
investing in to get rich without getting
lucky check out this video next either
way thanks for watching and until next
time this is Ticker Symbol: You my name
is Alex reminding you that the best
investment you can make is in you
Browse More Related Video
How Much Energy Does the Internet Use? | Hot Mess
Google Data Center Efficiency Best Practices -- Full Video
How The Massive Power Draw Of Generative AI Is Overtaxing Our Grid
Setting Up Data Centers In India: Assessing The Systemic Tailwinds & Possible Beneficiaries
Sean James, Microsoft, on the opportunity for data centers to lead the energy transition
AI and the energy required to power it fuel new climate concerns