I'm Buying These 3 Stocks to Get Rich (Without Getting Lucky)

Ticker Symbol: YOU
29 Sept 2024 · 15:55

Summary

TL;DR: The video explores the growing energy demands of AI, particularly tools like ChatGPT, which consume significantly more power than traditional Google searches. It highlights companies like Vertiv, Broadcom, and Marvell that are creating solutions to manage the energy consumption and cooling challenges in data centers. The AI boom is projected to increase data center power demand by 160% by 2030, making energy-efficient infrastructure crucial. The video also touches on custom AI chips and how investors can capitalize on these trends through innovations in cooling, chip design, and data center optimization.

Takeaways

  • 🔋 **AI Power Consumption**: A single ChatGPT query consumes up to 10 times the electricity of a Google search.
  • 📈 **Growing Power Demand**: Power demand from data centers has doubled in the last 5 years and is predicted to grow by another 160% by 2030.
  • 🌐 **Global AI Market Growth**: The global AI market is expected to grow almost 12 times over the next 8 years with a CAGR of 36.8%.
  • 💧 **Cooling Solutions**: Vertiv Holdings provides power and thermal management solutions, including liquid cooling systems for data centers.
  • 💻 **Custom AI Chips**: Broadcom designs custom power-efficient AI chips for tech giants like Google and Meta Platforms.
  • 🔌 **Efficient AI Chips and Switches**: Marvell makes efficient AI chips and switches for data centers, aiming to reduce power consumption.
  • 📊 **AI Training Energy**: Training large AI models, especially trillion-parameter models, is very energy-intensive.
  • 🏭 **Data Center Infrastructure**: Many data centers need to make infrastructure changes to support more efficient cooling and AI workloads.
  • 💹 **Investment Opportunities**: The Fundrise Innovation Fund offers a way to invest in top private pre-IPO companies in the AI and data infrastructure space.
  • 🚀 **Broadcom's Dominance**: Broadcom leads the ASIC market with a 55-60% share and is a key player in AI and data center infrastructure.

Q & A

  • How much electricity does a single ChatGPT query consume compared to a Google search?

    -A single ChatGPT query consumes up to 10 times the electricity of a Google search. Specifically, a ChatGPT prompt uses about 2.9 watt-hours, while a Google search uses around 0.3 watt-hours.

  • What is Goldman Sachs' prediction for power demand from data centers by 2030?

    -Goldman Sachs predicts that power demand from data centers will grow by another 160% by 2030.

  • What is the significance of Vertiv's liquid cooling systems in the context of AI power consumption?

    -Vertiv's liquid cooling systems are significant because they provide power and thermal management solutions for data centers, which are crucial for handling the high power consumption of AI applications. These systems are designed for high-density deployments and can integrate with existing data center infrastructures.

  • What is Broadcom's role in the AI chip market?

    -Broadcom is a leader in the ASIC market with a dominant 55 to 60% share and a major focus on AI and data center infrastructure. They co-designed the last six generations of Google TPUs and are involved in designing custom power-efficient AI chips for tech giants like Google and Meta Platforms.

  • How does Marvell's involvement in the ASIC market compare to Broadcom's?

    -Marvell is the second-largest company in the ASIC market, with around a 14% share. They have partnerships with hyperscale customers and are expected to double their share of the overall data center accelerator market in the coming years.

  • What is the projected growth rate of the global artificial intelligence market over the next 8 years?

    -The global artificial intelligence market is expected to almost 12x over the next 8 years, which is a compound annual growth rate of 36.8%.

  • How does the power consumption of AI models during training compare to inference?

    -Training and retraining large AI models is very energy-intensive, especially when we're talking about trillion parameter models. For example, GPT-4 took over 50 gigawatt-hours to train, which is about 0.2% of the electricity generated by the entire State of California over a year.

  • What is the projected growth rate for the liquid cooling market for data centers by 2030?

    -The liquid cooling market for data centers is expected to more than quadruple by 2030, which would be a compound annual growth rate of 27.6% for the next 6 years.

  • How does the age of the US power grid impact the power consumption of AI?

    -The average age of the US power grid is around 40 years, with over a quarter of the grid being 50 years old or older. This aging infrastructure can struggle to meet the increased power demands of AI, exacerbating power consumption issues.

  • What is the significance of the Fundrise Innovation Fund in relation to AI investments?

    -The Fundrise Innovation Fund provides regular investors access to some of the top private pre-IPO companies in the AI and data infrastructure sectors, allowing them to invest in the next generation of AI applications before they go public.

Outlines

00:00

🔌 The Power Struggle of AI Growth

This paragraph discusses the significant energy consumption of artificial intelligence, particularly highlighting that a single AI query can consume up to 10 times the electricity of a Google search. It outlines the rapid growth in power demand from data centers, which has doubled in the last five years and is predicted to increase by another 160% by 2030. The speaker plans to discuss the energy consumption of AI, the impact on data centers, and companies like Vertiv, Broadcom, and Marvell that are addressing this issue by providing power and thermal management solutions, custom AI chips, and efficient AI chips and switches. The energy cost of training AI models such as GPT-4 is also emphasized, along with the importance of factors like data center design, power grid age, and the increasing power efficiency of GPUs.

05:00

🌐 Investing in the AI Boom: Opportunities and Challenges

The second paragraph focuses on the exponential growth of the AI market, which is expected to increase almost 12 times over the next 8 years. It discusses the trend of companies delaying their IPOs, potentially causing investors to miss out on significant returns. The paragraph introduces the Fundrise Innovation Fund as a way for regular investors to access top private pre-IPO companies. It also covers the shift towards direct-to-chip liquid cooling in data centers, a market predicted to quadruple by 2030, and Vertiv's expansion plans to meet this demand. Additionally, it touches on the custom chips used by tech giants like Amazon, Microsoft, and Google, and the competition in the ASIC market, particularly between Broadcom and Marvell.
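
As a sanity check on the growth math quoted above, CAGR follows directly from the total growth multiple. Here is a minimal sketch; the 12x input is the video's rounded figure, so the result lands slightly under the quoted 36.8%:

```python
# CAGR implied by a total growth multiple over a number of years:
# multiple = (1 + cagr) ** years  =>  cagr = multiple ** (1/years) - 1
def cagr(multiple: float, years: float) -> float:
    return multiple ** (1 / years) - 1

# The video pairs "almost 12x over 8 years" with a 36.8% CAGR;
# the rounded 12x figure gives a value in the same range.
print(f"{cagr(12, 8):.1%}")  # → 36.4%
```

The small gap suggests the underlying market forecast is somewhat more than exactly 12x.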

10:01

💡 Broadcom and Marvell: Powering AI with ASICs

This paragraph delves into the roles of Broadcom and Marvell in the ASIC market, with Broadcom leading with a 55-60% share and a significant focus on AI and data center infrastructure. It mentions Broadcom's involvement in designing Google's TPUs and its potential to expand into custom chips for other companies. The paragraph also discusses Marvell's position as the second-largest ASIC company, its partnerships with hyperscale customers, and its focus on data center products. The growth of Broadcom's networking revenue and the potential for the global AI ASIC market to grow significantly by 2033 are also highlighted.

15:03

🚀 The Future of Data Center Power Demand and Efficiency

The final paragraph addresses the increasing power demand of data centers due to AI, which is expected to rise by 15-20% annually through 2030, potentially reaching 16% of total US power consumption. It emphasizes the need for more than just efficient cooling and ASICs to meet this demand, suggesting that the companies mentioned earlier will play crucial roles. The speaker encourages understanding the science behind these stocks for investment purposes and signs off with a reminder that the best investment is in oneself.
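
A quick sketch of what that 15-20% annual growth compounds to. This assumes a 2024 baseline, i.e. six compounding years, which the summary does not state explicitly:

```python
# Compound the video's 15-20% annual data center demand growth
# over six years (the 2024 baseline is an assumption).
for rate in (0.15, 0.20):
    multiple = (1 + rate) ** 6
    print(f"{rate:.0%}/yr -> {multiple:.1f}x demand by 2030")
```

So the quoted range implies roughly 2.3x to 3x today's data center power demand by the end of the decade.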

Keywords

💡Artificial Intelligence (AI)

Artificial Intelligence refers to the simulation of human intelligence in machines that are programmed to think like humans and mimic their actions. In the video, AI is central to the discussion about the increasing power consumption of data centers and the need for efficient power and thermal management solutions. The script mentions AI applications like ChatGPT and generative AI, which consume significant amounts of electricity.

💡Data Centers

Data Centers are facilities used to house servers, storage systems, and networking equipment that organizations use to store, process, and manage data. The video highlights the growing power demand from data centers due to the rise of AI, which is a concern because these centers already consume a large amount of electricity, and this demand is expected to increase further.

💡Power Consumption

Power Consumption refers to the amount of electrical energy used by electronic devices over a period. The video emphasizes the high power consumption of AI applications, comparing it to a Google search and highlighting how much more electricity is used, especially when training large AI models like GPT-4.

💡Thermal Management

Thermal Management is the process of controlling and removing heat generated by electronic devices to maintain optimal operating temperatures. The video discusses Vertiv Holdings, a company that provides thermal management solutions for data centers, which is crucial as AI applications generate a lot of heat, increasing the need for efficient cooling systems.

💡ASIC (Application-Specific Integrated Circuit)

ASIC refers to custom-designed integrated circuits created for a specific use rather than general-purpose use. The video mentions Broadcom and Marvell as leaders in the ASIC market, designing chips for AI applications that are more energy-efficient than general processors like GPUs and CPUs.

💡GPU (Graphics Processing Unit)

A GPU is a specialized electronic circuit designed to rapidly manipulate and alter memory to accelerate the creation of images in a frame buffer intended for output to a display device. The video discusses how GPUs, especially Nvidia's Hopper GPUs, consume more power but are also more efficient in AI training and inference tasks.

💡Efficiency

Efficiency in the context of the video refers to the ability of a system or device to perform its intended function with minimal waste of energy or resources. The video explores companies like Vertiv, Broadcom, and Marvell that are developing solutions to improve the efficiency of AI applications in data centers.

💡Vertiv Holdings

Vertiv Holdings is a company mentioned in the video that provides power and thermal management solutions for data centers. It is positioned as a company that can benefit from the growing demand for energy-efficient solutions in data centers due to the power-hungry nature of AI applications.

💡Broadcom

Broadcom is a technology company discussed in the video that designs custom power-efficient AI chips and ASICs for tech giants. It has a significant share in the ASIC market and is highlighted as a company that can profit from the growing need for efficient AI hardware.

💡Marvell

Marvell is another company mentioned in the video that makes efficient AI chips and switches for data centers. It is highlighted as a competitor to Broadcom in the ASIC market, with a focus on creating energy-efficient solutions for AI applications.

💡Investment

Investment in the context of the video refers to the act of allocating resources, such as money, with the expectation of earning a return. The video suggests that investing in companies like Vertiv, Broadcom, and Marvell, which are tackling the power consumption problem of AI, can be a lucrative opportunity.

Highlights

AI's growing power consumption is a major concern, with ChatGPT queries using 10 times the electricity of a Google search.
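
Those per-query figures can be turned into the video's aggregate claims with simple arithmetic. A rough sketch: the query counts and per-query energy are the video's numbers, while the ~10,500 kWh/year average US household consumption is an outside assumption:

```python
# Back-of-the-envelope: extra energy if ~9 billion daily Google
# searches all became ChatGPT-style queries (figures from the video,
# except the assumed average US household consumption).
CHATGPT_WH = 2.9            # Wh per ChatGPT query
GOOGLE_WH = 0.3             # Wh per Google search
SEARCHES_PER_DAY = 9e9

extra_wh_per_year = (CHATGPT_WH - GOOGLE_WH) * SEARCHES_PER_DAY * 365
extra_twh = extra_wh_per_year / 1e12
homes = extra_wh_per_year / 10_500_000  # assumed Wh per home per year

print(f"~{extra_twh:.1f} TWh/year extra, roughly {homes:,.0f} homes")
```

That lands in the same ballpark as the transcript's "on the order of 10 terawatt-hours" and "almost a million homes."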

Goldman Sachs predicts power demand from data centers will grow by 160% by 2030, driven by AI workloads.

Vertiv Holdings (VRT) provides power and thermal management solutions, including liquid cooling systems, to help data centers handle high-density AI applications.

AI model training is energy-intensive; for instance, GPT-4 used over 50 gigawatt-hours of electricity, equivalent to 0.2% of California's annual power.

Nvidia’s latest Hopper GPUs consume 700 watts each, nearly double the power of the previous generation, contributing to rising power demands.

Direct-to-chip liquid cooling, which is 3,000 times more effective than air cooling, is expected to grow rapidly, quadrupling by 2030.
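
A rough physics sanity check on that effectiveness figure, using textbook room-temperature properties of water and air (these values are assumptions, not from the video):

```python
# Volumetric heat capacity [J/(m^3*K)] = density * specific heat.
# Water carries far more heat per unit volume than air, which is
# what makes direct-to-chip liquid cooling so effective.
water = 1000.0 * 4186.0   # kg/m^3 * J/(kg*K)
air = 1.2 * 1005.0
print(f"water absorbs ~{water / air:,.0f}x more heat per unit volume")
```

The ratio comes out around 3,500x, the same order of magnitude as the quoted "3,000 times"; real-world gains also depend on flow rates and plumbing.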

Broadcom holds a 55-60% market share in the custom chip (ASIC) market, co-designing Google's TPUs and Meta's training and inference accelerators.

Broadcom’s custom AI chip program is expected to generate $8 billion in revenue in 2024 and $10 billion in 2025, mostly from Google’s TPUs.

Vertiv’s production capacity for liquid cooling systems is projected to expand by 45 times in 2024, with a $7 billion backlog in orders.

Broadcom’s networking products, including Tomahawk Ethernet switches, saw a 44% year-over-year revenue growth in a single quarter.

Nvidia dominates the data center GPU market, with estimates of 92-98% market share, but faces rising competition from custom AI chips.

Companies like Amazon, Google, and Microsoft are developing their own AI chips (ASICs) to reduce reliance on Nvidia GPUs.

AI’s rapid growth is expected to drive data center power consumption to 16% of total U.S. electricity by 2030.

The liquid cooling market is expected to grow at a compound annual rate of 27.6% until 2030, driven by high-density AI computing needs.
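
The two numbers in that projection are consistent with each other; a minimal check, compounding 27.6% for six years:

```python
# 27.6% CAGR compounded over 6 years => total growth multiple
multiple = 1.276 ** 6
print(f"{multiple:.2f}x")  # a bit more than 4x, i.e. "more than quadruple"
```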

Marvell Technology, the second-largest player in the ASIC market, is growing its data center business with custom AI chip programs, partnering with major tech giants.

Transcripts

[00:00] Artificial intelligence has a huge problem. A single ChatGPT query can take up to 10 times the electricity of a Google search, and while power demand from data centers has already doubled over the last 5 years, Goldman Sachs predicts that it'll grow by another 160% by 2030. So in this episode, I'll highlight a few companies that are tackling this exact problem, positioning them to win big no matter which AI companies come out on top, making them a great way to get rich without getting lucky. Your time is valuable, so let's get right into it.

[00:35] First things first, I'm not here to waste your time, so here's exactly what I'm going to cover. I'll go over the big power problem that AI is causing right now. I'll talk about Vertiv, a company that provides power and thermal management solutions for data centers; Broadcom, which designs custom power-efficient AI chips for tech giants like Google and Meta Platforms; and Marvell, a company that also makes efficient AI chips and switches for data centers.

[01:02] It's important to understand just how much power AI is projected to consume over the next few years, so let me break that down first. A single ChatGPT prompt costs 2.9 watt-hours; that's like keeping a 5-watt LED bulb on for a little over half an hour, while using AI to generate a single image can cost as much electricity as charging your phone. Compare that to a Google search, which also takes in a query and returns text and images, but only uses around 0.3 watt-hours in the process. A couple of watt-hours may not seem like much, but there are roughly 9 billion Google searches every day, and if we moved them all to generative AI, it would take on the order of 10 terawatt-hours more to serve all of those requests. That's enough electricity to power almost a million homes for an entire year. But this is actually a bad comparison, because people don't use generative AI tools the same way they use Google. For example, ChatGPT tends to be more of a dialogue between the user and an AI model, and that can really rack up the number of queries compared to a Google search.

[02:08] And that's just the inference side of the story. Training and retraining large AI models is very energy-intensive too, especially when we're talking about trillion-parameter models. For example, GPT-4 took over 50 gigawatt-hours to train, or about 0.2% of the electricity generated by the entire state of California over a year. As an investor, this worries me for three reasons. First, the amount of compute needed to train AI models has been doubling roughly every 6 months; talk about exponential growth. Second, that gets multiplied by the number of foundation models being trained, which is also growing exponentially. And third, even though you can use ChatGPT almost anywhere in the world, it consumes energy only at the server's location. Energy accounts for up to 70% of a data center's total cost of operations, so the hardware and the racks, how the data center facility is designed and laid out, and even the age of its local power grid all really matter. And by the way, the average age of the US power grid is around 40 years, with over a quarter of the grid being 50 years old or older.

[03:18] And AI only makes this problem worse. For example, Nvidia's previous-generation A100 GPUs use about 400 watts, but the current generation of Hopper GPUs run at 700 watts. That's almost twice the power, and four or five times the power of CPU-based servers. It's worth noting that the H100 GPUs are up to nine times faster at AI training and 30 times faster for inference than the A100s, so the power efficiency of Nvidia's GPUs is going way up with every generation, but power demand is going up faster. So it takes more than just high-performance GPUs to solve this problem.

[03:55] Let's start with cooling, since that accounts for up to 40% of a data center's energy use, which means 28% of the total cost of operations. Vertiv Holdings, ticker symbol VRT, provides power and thermal management solutions for data centers, like their Liebert liquid cooling systems. These systems are built specifically for high-density deployments, like the ones that power intense AI applications, providing cold plates and direct-to-chip cooling in a way that integrates with existing data center infrastructures. That's a big deal, because around 90% of all server racks are air-cooled today. In fact, most data centers even run their H100 chips at low enough power that they can be air-cooled, so a lot of them will need to make massive infrastructure changes to support direct-to-chip liquid cooling for Nvidia's upcoming Blackwell systems if they want to run those chips at peak performance. That includes hyperscalers like Amazon, Google, and Microsoft, all of which need to support power-hungry AI workloads for thousands of other businesses.

[04:57] Speaking of which, according to Market.us, the global artificial intelligence market is expected to almost 12x over the next 8 years, which is a compound annual growth rate of 36.8%. But many of the companies building the next generation of AI applications are not publicly traded. Think about the 90s and early 2000s: companies like Amazon and Google went public very early in their growth cycle, but today companies are waiting an average of 10 years or longer to go public. That means investors like us can miss out on most of the returns from the next Amazon, the next Google, the next Nvidia. So I spent a lot of time digging into this, and the Fundrise Innovation Fund is a great way to invest in some of the best tech companies before they go public. Venture capital is usually only for the ultra-wealthy, but Fundrise's Innovation Fund gives regular investors access to some of the top private pre-IPO companies on Earth without breaking the bank. The Fundrise Innovation Fund also has an impressive track record, already investing over $100 million into some of the largest, most in-demand AI and data infrastructure companies. So if you want access to some of the best late-stage AI companies before they IPO, check out the Fundrise Innovation Fund with my link below today.

[06:13] All right, so 90% of all server racks are air-cooled today, but industry estimates suggest that up to 80% of data center cooling will become direct-to-chip liquid cooling over time. Direct-to-chip liquid cooling is where a heat-conductive copper plate sits on top of a chip, just like a normal heat sink, but instead of being air-cooled by a fan, the plate is connected to two pipes: one pipe brings in cool water to absorb the heat from the plate, and the other pipe moves hot water away. Direct-to-chip liquid cooling is up to 3,000 times more effective than air cooling, and better cooling means that servers can be stacked closer together without overheating. Every data center has a fixed amount of space, so they need to optimize their cooling if they want to squeeze the most compute out of their entire facility. As a result, the liquid cooling market for data centers is expected to more than quadruple by 2030, which would be a compound annual growth rate of 27.6% for the next 6 years.

[07:13] And Vertiv definitely knows that. According to their Q2 earnings call, they're on track to expand their liquid cooling production capacity by a whopping 45x over the course of 2024. Vertiv also doubled their production capacity for power management products over the last 3 years, and they plan to double it again by the end of 2025. All of these expansions should lead directly to more revenue, since Vertiv is currently limited by supply, not demand. Vertiv had a $7 billion backlog of orders at the end of Q2, which was up 11% quarter-over-quarter and 47% year-over-year. Vertiv stock is already up around 120% year-to-date, and I definitely think they have plenty of room to run as the AI boom continues.

[07:57] Compute and connectivity are also energy-intensive, so let's talk about them next. Today, Nvidia has a massive share of the data center GPU market, with estimates ranging from 92% all the way to 98% market share. But GPUs are not the only way to process intense AI workloads. Over the last 3 years, I've spent a lot of time covering the custom chips used by Amazon Web Services, Microsoft Azure, and Google Cloud. These custom chips are called ASICs, application-specific integrated circuits, and they do exactly what their name implies: their design is tailored to a specific application, which simplifies the chip's architecture. The result is a chip that can run a narrow set of workloads extremely efficiently, at the cost of supporting fewer kinds of workloads than more general processors like GPUs and CPUs. So as Amazon, Microsoft, Google, and their cloud clients need more support for a specific kind of workload, like running large language models, synthesizing speech from text, or generating images, they could make a chip for that workload and free up their more expensive Nvidia infrastructure for other tasks. All three hyperscalers are making big investments into their own semiconductor supply chains to reduce their overall reliance on Nvidia over time. Amazon has their Inferentia and Trainium chips for AI inference and training respectively, Microsoft has their Azure Maia accelerators, and of course Google has their tensor processing units, or TPUs. The demand for ASICs is so high that even Nvidia is building a new business unit focused on making custom chips for other companies, which could help extend their CUDA ecosystem to new kinds of chips. But Nvidia will have some serious competition in this space from rival companies like Marvell Technology and Broadcom, so let's talk about them next.

[09:46] Broadcom is the leader of the ASIC market, with a dominant 55 to 60% share and a major focus on AI and data center infrastructure. Broadcom co-designed the last six generations of Google TPUs, and that partnership got extended to the next generation of TPUs as well, which shows just how sticky these chip design relationships can be once they're up and running. Google claims that their sixth-generation Trillium TPUs are 67% more energy-efficient than their current generation, with 4.7 times more peak compute performance. And JP Morgan analysts estimate that Broadcom's TPU program will generate $8 billion in revenue in 2024 and another $10 billion in 2025, and that's just from Google's TPUs. Broadcom is also behind every generation of the MTIA chips, which are Meta's training and inference accelerators. And Broadcom's ambitions for custom AI chips don't stop with Google or Meta: in July, Broadcom was rumored to be in talks with OpenAI to design ASICs for them as well. But who knows what'll happen now that OpenAI's chief technology officer Mira Murati is leaving and OpenAI is becoming a for-profit company; let me know if you want a separate deep dive on all of that and the resulting drama.

[11:03] But Broadcom makes more than just custom chips for tech giants. According to Broadcom CEO Hock Tan, more than 99.5% of all internet traffic touches at least one Broadcom chip: "Jim, I got to tell you, 99.5% of every bit of data that flows in the internet will cross at least one or more Broadcom chips." Broadcom has several lines of network switches that also use ASICs, designed to optimize network traffic by choosing the best path for data packets based on the network's layout and its current conditions. Broadcom's Tomahawk series is tailored specifically for Ethernet switches, and their Jericho line is for more complex networks that need core and edge computing. This past quarter, Broadcom's networking revenue grew by 44% year-over-year after they doubled the number of switches that they sold, and Broadcom stock is up nearly 60% year-to-date. Broadcom may not be another Nvidia, but honestly, that's a good thing too, because it's exposed to all the growth that AI has to offer in a different way than Nvidia: Ethernet versus InfiniBand, and ASICs versus GPUs. That's why holding Broadcom and Nvidia is a great way to have your cake and eat it too. By the way, the global market for AI ASICs is expected to roughly 10x in size by 2033, which is a compound annual growth rate of around 30% for the next 10 years. ASICs currently represent 16% of the total data center accelerator market, and I expect that to meaningfully increase over time as AI applications get more diverse and power consumption becomes an even bigger issue than it is today.

[12:44] Marvell is the second-largest company in the ASIC market, only behind Broadcom, with around a 14% share. So holding both Broadcom and Marvell stock means owning roughly 75% of the entire ASIC market. According to a recent call with Wall Street analysts, Marvell currently has around a 10% share of the overall data center accelerator market and expects to double that over the next few years. Back in May, Marvell confirmed that they partnered with at least three hyperscale customers, which analysts currently believe are Amazon Web Services' Trainium and Inferentia chips, Microsoft Azure's Maia accelerators, and Google's Arm-based Axion data center CPUs. And going back to our original power problem, Google claims their Axion CPU is 60% more power-efficient than comparable x86 chips produced by AMD and Intel. But compute is just one side of the story: moving data over a massive network is also extremely energy-intensive, and picking the right network switches can reduce power consumption by around 30%. Marvell dominates the space with a range of products like optical and copper transceivers, high-performance optical interconnects for data centers, and a broad range of Ethernet switches. In March, Marvell extended their longstanding partnership with TSMC to develop their next generation of AI infrastructure products using TSMC's 2-nanometer process technology.

[14:08] It's worth noting that Marvell stock hasn't done nearly as well as Broadcom, since Marvell's total revenues have been in decline. But under the hood, Marvell's revenue mix is shifting substantially, with data centers now accounting for almost 70% of their total revenue last quarter, up from around 33% at the start of 2023; kind of like how Nvidia saw a sharp decline in their revenue when they pivoted from gaming GPUs to data center chips a few years ago. Case in point: Marvell's data center business grew 7% quarter-over-quarter and a whopping 87% year-over-year, and for this quarter, Marvell forecasts data center revenues to accelerate even faster thanks to their custom AI chip programs, which are beginning to ramp right now.

[14:52] The AI boom is expected to increase data center power demand by 15 to 20% every year through 2030, at which point it could reach a whopping 16% of total US power consumption. That's the same demand as about two-thirds of all the homes in the United States. And while every generation of Nvidia's GPUs is more efficient than the last, AI has caused the overall demand for data center power to far exceed supply. It will take much more than efficient cooling solutions from companies like Vertiv, or the ASICs for AI computing designed by Broadcom and Marvell, to meet that demand; but in my opinion, all three of these companies will end up being important pieces of the puzzle, and that's why it's so important to understand the science behind the stocks. And if you want to see what else I'm investing in to get rich without getting lucky, check out this video next. Either way, thanks for watching, and until next time: this is Ticker Symbol: YOU, my name is Alex, reminding you that the best investment you can make is in you.

Related Tags
AI Energy, Data Centers, Efficiency Solutions, Investment Insights, Green Tech, AI Chips, Power Management, Liquid Cooling, ASIC Market, Tech Growth