Snowflake Summit 2023 Keynote: Generative AI’s Impact on Data Innovation in the Enterprise

Snowflake Inc.
8 Aug 2023 · 44:42

Summary

TLDR: At Snowflake Summit, Sarah Guo introduced Frank Slootman and Jensen Huang, the CEOs of Snowflake and Nvidia respectively, to discuss their new partnership. They explored the transformative impact of generative AI, emphasizing the central role of data. The pair highlighted their collaboration to accelerate the creation of custom generative AI applications built on proprietary enterprise data, and the potential for AI to reshape business operations and decision-making by democratizing access to complex data insights.

Takeaways

  • 🚀 Generative AI is revolutionizing the computing landscape, with its ability to write software itself and transform how computers interact with data.
  • 🤖 The partnership between Snowflake and Nvidia aims to accelerate the development of custom generative AI applications using proprietary data, marking a significant shift in data utilization.
  • 🧠 Nvidia's and Snowflake's collaboration brings together valuable data, advanced AI algorithms, and a powerful compute engine, enabling businesses to harness AI in unprecedented ways.
  • 💡 The combination of large language models and enterprise data can turn databases into AI applications, allowing for complex queries and insights through natural language processing.
  • 🌐 Data gravity is a real challenge, making it more efficient to bring compute capabilities to data rather than moving vast amounts of data to compute resources.
  • 📈 AI's impact on businesses extends beyond simple query enhancement; it has the potential to redefine business models and address complex problems like customer churn and supply chain management.
  • 🛠️ Enterprises are encouraged to consider their most valuable databases and how AI could be leveraged to extract deeper insights and solve critical business issues.
  • 💬 The future of AI applications lies in the democratization of data interaction, where users can ask questions in natural language and receive intelligent, data-driven responses.
  • 🏭 Nvidia's experience in using AI internally for complex tasks like chip design and software optimization highlights the technology's capability to handle problems beyond human comprehension.
  • 🌟 The Snowflake Summit signifies a milestone in the AI and data industry, showcasing the potential for businesses to transform their data into actionable intelligence.

Q & A

  • What is the significance of generative AI and data according to the speakers?

    -Generative AI and data are significant because they represent a major computing revolution, enabling computers to write software for other computers by themselves. This technology is transforming the way we interact with and extract insights from data, leading to the creation of custom generative AI applications using proprietary data.

  • How does Jensen Huang describe the impact of generative AI on computing?

    -Jensen Huang describes the impact of generative AI as profound, stating that it's the biggest computing revolution we have ever seen. He emphasizes the ease of use and the ability of AI to exceed expectations by understanding user inputs and connecting to almost anything, like a universal translator.

  • What is the new partnership between Nvidia and Snowflake aimed at?

    -The new partnership between Nvidia and Snowflake aims to provide businesses with an accelerated path to creating custom generative AI applications using their own proprietary data. This collaboration brings together valuable data, AI algorithms, and a powerful compute engine to help customers leverage their data more effectively.

  • How does Frank Slootman view the evolution of the relationship with data in enterprises?

    -Frank Slootman views the evolution as a significant step function from the past, where relationships with data were not natural or productive. He highlights the current era as an infatuation with the experiences and capabilities that AI and data provide, enabling more intuitive and insightful interactions with data.

  • What are the benefits of bringing the compute engine to Snowflake, as mentioned by Jensen Huang?

    -Bringing the compute engine to Snowflake combines valuable data and AI algorithms with a powerful compute engine, enabling customers to use their proprietary data to build AI applications. The breakthrough is that, for the first time, a large language model can be placed in front of the data and queried like a person, augmenting the data with insights and predictive relationships.
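
    A minimal sketch of the pattern described in this answer, putting a language model in front of a knowledge base and asking it plain-language questions. It assumes the openai Python package and an OpenAI-compatible chat endpoint, with a small invented stand-in for proprietary data; it is not a description of either company's actual product API.

      # Sketch: "large language model + knowledge base = AI application".
      # Assumes the openai package and an OpenAI-compatible chat endpoint;
      # the ticket data is invented for illustration.
      from openai import OpenAI

      client = OpenAI()  # reads OPENAI_API_KEY from the environment

      support_tickets = [
          "Ticket 101: customer could not reset password, resolved in 2 days",
          "Ticket 102: delivery arrived late, customer requested a refund",
          "Ticket 103: app crashes on checkout for Android users",
      ]

      def ask_the_data(question: str) -> str:
          """Put the records in front of the model and ask in plain language."""
          context = "\n".join(support_tickets)
          response = client.chat.completions.create(
              model="gpt-4o-mini",  # any chat-capable model works here
              messages=[
                  {"role": "system",
                   "content": "Answer strictly from the records provided."},
                  {"role": "user",
                   "content": f"Records:\n{context}\n\nQuestion: {question}"},
              ],
          )
          return response.choices[0].message.content

      print(ask_the_data("What are customers complaining about most often?"))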

  • What is the role of foundation models in the context of enterprise data and custom models?

    -Foundation models provide a broad base of general knowledge that can be further specialized and fine-tuned for specific enterprise data and use cases. While they are capable on their own, they require customization to work effectively with complex and specific enterprise datasets, leading to more targeted and valuable AI applications.
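
    As a rough illustration of what specializing a pre-trained foundation model on enterprise data can look like, here is a hedged sketch using the open-source Hugging Face stack; the keynote does not prescribe this tooling, and the base model name and data file below are placeholders.

      # Sketch: supervised fine-tuning of a small pre-trained causal LM on
      # in-house text. The base model and the data file are placeholders,
      # not anything named in the keynote.
      from datasets import load_dataset
      from transformers import (AutoModelForCausalLM, AutoTokenizer,
                                DataCollatorForLanguageModeling, Trainer,
                                TrainingArguments)

      base_model = "EleutherAI/pythia-410m"   # placeholder open model
      tokenizer = AutoTokenizer.from_pretrained(base_model)
      tokenizer.pad_token = tokenizer.eos_token
      model = AutoModelForCausalLM.from_pretrained(base_model)

      # One record of proprietary enterprise text per line.
      raw = load_dataset("text", data_files={"train": "enterprise_corpus.txt"})

      def tokenize(batch):
          return tokenizer(batch["text"], truncation=True, max_length=512)

      train_ds = raw["train"].map(tokenize, batched=True, remove_columns=["text"])

      trainer = Trainer(
          model=model,
          args=TrainingArguments(
              output_dir="finetuned-enterprise-model",
              per_device_train_batch_size=4,
              num_train_epochs=1,
          ),
          train_dataset=train_ds,
          data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
      )
      trainer.train()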

  • How does the partnership between Nvidia and Snowflake change the traditional data gravity concept?

    -Traditionally, data was moved to the compute because datasets were small enough to move easily. With the partnership between Nvidia and Snowflake, the compute engine is instead brought to the data, which is now vast, valuable, regulated, and heavily governed. This allows for more efficient and secure processing, since large datasets no longer need to be moved across different platforms or locations.

  • What are the potential applications of the AI applications developed through the Nvidia and Snowflake partnership?

    -The AI applications developed through this partnership can be used to enhance query capabilities, transform customer service data into interactive applications, optimize supply chain management, and even assist in complex engineering tasks such as GPU design and software optimization. The applications can help businesses redefine their cost structures and business models, leading to significant economic and operational improvements.

  • How does the use of AI in enterprise data processing differ from traditional data processing methods?

    -Traditional data processing methods often involved deterministic code written by engineers and manual querying. AI introduces a more intelligent approach where large language models can be queried in natural human language, providing insights and answers that may have required extensive manual analysis before. This allows for a more dynamic and interactive relationship with data, leading to faster and more intuitive decision-making.
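
    A hedged sketch of the contrast drawn here: instead of an analyst hand-writing SQL, a model translates a plain-language question into a query that is then executed over the data. The schema, the question, and the text_to_sql helper are illustrative assumptions; the model call is stubbed so the example runs offline.

      # Sketch: plain-language question -> SQL -> answer, over a toy SQLite table.
      # The model call is stubbed so the example runs offline; in practice an
      # LLM prompted with the schema and the question would produce the SQL.
      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE orders (customer TEXT, region TEXT, amount REAL)")
      conn.executemany(
          "INSERT INTO orders VALUES (?, ?, ?)",
          [("acme", "EMEA", 120.0), ("globex", "AMER", 340.5), ("acme", "EMEA", 88.2)],
      )

      SCHEMA = "orders(customer TEXT, region TEXT, amount REAL)"

      def text_to_sql(question: str, schema: str) -> str:
          """Hypothetical model call: given a schema and a question, return SQL.
          Hard-coded here; a chat model would generate this string."""
          return "SELECT region, SUM(amount) FROM orders GROUP BY region"

      question = "How much revenue did each region generate?"
      sql = text_to_sql(question, SCHEMA)
      print(question)
      for row in conn.execute(sql):
          print(row)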

  • What is the expected outcome of the Nvidia and Snowflake partnership for customers?

    -Customers can expect to develop AI applications quickly, leveraging their proprietary data in a matter of days rather than months. The partnership aims to make AI more accessible and cost-effective by providing pre-trained models and a platform that integrates data processing, training, inference, and deployment. This will enable customers to turn their data into intelligence and run AI factories, producing valuable insights continuously.
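
    To make the idea of continuously running AI factories concrete, here is a minimal sketch of an always-on job that re-asks one fixed business question on a schedule. The ask_the_data helper is a hypothetical stand-in (as in the earlier sketch) for any LLM-backed query over warehouse data, and the hourly interval is arbitrary.

      # Sketch: an always-on "insight" job that re-asks one business question
      # on a schedule and appends the answers to a log. ask_the_data is a
      # hypothetical stand-in for an LLM-backed query over warehouse data.
      import time
      from datetime import datetime, timezone

      def ask_the_data(question: str) -> str:
          # Placeholder: a real deployment would call a model over the
          # current contents of the data platform.
          return f"(model answer to: {question})"

      QUESTION = "Which product lines saw unusual support volume in the last hour?"

      while True:
          answer = ask_the_data(QUESTION)
          stamp = datetime.now(timezone.utc).isoformat()
          with open("insights.log", "a") as log:
              log.write(f"{stamp}\t{answer}\n")
          time.sleep(3600)  # hourly; the interval is arbitrary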

Keywords

💡Generative AI

Generative AI refers to artificial intelligence systems capable of creating new content, such as text, images, or music. In the context of the video, it's highlighted as a revolutionary technology that has taken the world by storm, with potential to transform various industries by generating new and innovative outputs based on existing data.

💡Data

Data is the foundation upon which generative AI operates. It refers to the raw, unprocessed facts and statistics that are collected and utilized to train AI models. In the video, the importance of data is emphasized as a critical component to enable generative AI, with the discussion focusing on how data can be leveraged to create custom AI applications.

💡Snowflake

Snowflake is a cloud-based data warehousing platform that enables businesses to store, manage, and analyze large volumes of data. In the video, Snowflake is presented as a key player in the AI ecosystem through its partnership with Nvidia to offer AI capabilities directly within its platform.

💡Nvidia

Nvidia is a technology company known for its graphics processing units (GPUs) and AI computing platforms. In the video, Nvidia is highlighted for its role in providing the computing fabric necessary for AI applications, particularly through its collaboration with Snowflake.

💡AI Entrepreneurs

AI entrepreneurs are individuals or founders of startups that focus on developing and implementing artificial intelligence technologies in their business models. In the video, the term is used to describe the target audience of the podcast 'No Priors' and the people who would benefit from the Snowflake and Nvidia partnership.

💡Conviction

Conviction, in the context of the video, refers to a venture investing firm founded by Sarah Guo, purpose-built for partnering with AI entrepreneurs from idea to IPO. The name signifies a strong belief in the potential of AI technologies to transform industries.

💡IPO

IPO stands for Initial Public Offering, which is the process by which a private company goes public by offering its shares to the general public for the first time. In the video, it signifies the end goal for AI entrepreneurs and the growth trajectory of their ventures.

💡Podcast

A podcast is a digital audio program available for streaming or downloading over the Internet. In the video, 'No Priors' is a podcast hosted by Sarah Guo, focusing on the AI opportunity and discussing the future of AI with industry leaders.

💡Data Gravity

Data gravity is the concept that data attracts applications and processes, much like a gravitational force. In the video, it is used to describe the challenge and reality of moving large, complex datasets, which often results in a preference for bringing compute capabilities to where the data resides.

💡AI Applications

AI applications refer to software programs or tools that utilize artificial intelligence to perform tasks, often mimicking human intelligence. In the video, AI applications are discussed as the end product of combining data, AI algorithms, and compute engines, with the potential to revolutionize how businesses interact with and gain insights from their data.

💡Large Language Models

Large language models are AI models trained on vast amounts of text data, enabling them to understand and generate human-like text. These models are significant in the field of natural language processing and are used to improve AI's ability to interact with users in a conversational manner.

Highlights

Generative AI has taken the world by storm over the past year.

Data is essential to enable generative AI.

Frank Slootman and Jensen Huang, CEOs of Snowflake and Nvidia, discuss their new partnership.

The partnership aims to provide businesses with an accelerated path to creating custom generative AI applications using their own proprietary data.

Jensen Huang emphasizes the profound implications of computers writing software for other computers by themselves.

Frank Slootman highlights the historical challenges and the current ease of use in computing applications.

The Nvidia and Snowflake partnership is described as having full alignment and no conflict, offering a powerful combination of data, AI algorithms, and compute engine.

For the first time, developers can put a large language model in front of their data and interact with it as they would with a person, augmenting the data with AI.

The partnership aims to help customers use their proprietary data to write AI applications.

Sarah Guo frames the transition from software 1.0 to software 3.0, in which foundation models and AI play the central role, a framing Jensen Huang picks up later in the conversation.

Frank Slootman envisions a future where business data interfaces can change dramatically, with AI enabling new insights and predictive relationships.

The discussion highlights the potential for AI to redefine the economics of businesses, such as large call centers and sales organizations.

Jensen Huang explains the importance of fine-tuning pre-trained AI models for specific enterprise needs.

Enterprises are expected to become intelligence manufacturers, running AI factories that leverage data and AI engines.

The Nvidia and Snowflake partnership is seen as a significant step towards making AI applications more accessible and cost-effective for businesses.

The conversation underscores the revolutionary shift in computing, with AI and accelerated computing reshaping the landscape.

Jensen Huang and Frank Slootman express optimism about the transformative impact of AI on various industries and business operations.

Transcripts

play00:00

please welcome former Greylock General

play00:03

partner and founder and CEO of

play00:05

conviction Sarah Guo

play00:14

hello Las Vegas

play00:17

I have the pleasure to officially kick

play00:19

off snowflake Summit I'm Sarah Guo

play00:22

former Greylock General partner and

play00:24

founder of conviction a venture

play00:26

investing firm purpose-built for

play00:28

partnering with AI entrepreneurs from

play00:30

idea to IPO I also host no priors a Tech

play00:33

podcast focused on the AI opportunity

play00:35

ahead of all of us

play00:37

generative AI has taken the World by

play00:39

storm over the past year and is likely

play00:41

top of mind for everyone in this room

play00:43

but what do you need to enable

play00:45

generative AI That's Right data

play00:49

there are no two people better on the

play00:52

entire planet to talk about what

play00:53

generative Ai and data can do for you

play00:55

than Frank Slootman and Jensen Huang the

play00:58

legendary CEOs of snowflake and Nvidia

play01:01

we'll also learn more about their new

play01:03

partnership to provide businesses with

play01:05

an accelerated path to creating custom

play01:07

generative AI applications built with

play01:09

their own proprietary data please

play01:11

welcome to the stage Jensen Huang and

play01:14

Frank slootman sir

play01:19

[Applause]

play01:26

[Applause]

play01:28

[Music]

play01:35

here come here Frank hugs hugs all

play01:37

around yeah

play01:41

he never wants to get left out

play01:45

okay

play01:46

um we will start with a question about

play01:48

what you two are seeing

play01:50

um the hype and excitement around AI is

play01:53

extraordinary this past year uh is it

play01:56

real is it oriented Jensen let's start

play01:58

with you

play02:00

well this is the biggest Computing

play02:02

Revolution we have ever seen

play02:06

um no one has ever seen a technology

play02:08

like this

play02:10

a computer that writes software

play02:13

itself

play02:14

and now the computer can write software

play02:18

for other computers by itself

play02:21

when you take a step back and think

play02:23

about the implications it's really

play02:25

really profound

play02:27

and and all of our experiences every

play02:30

single one of ours I'm sure

play02:32

has proven that this is the easiest

play02:36

application to use in the history of

play02:39

computing because it's so smart

play02:42

it understands what you mean even when

play02:44

you say what you mean poorly

play02:47

and it does things that you tell it to

play02:50

do that exceeds your expectations

play02:53

and you can connect it to almost

play02:55

anything it's a universal translator

play02:58

and so when you when you take a step

play03:01

back and think about its implications

play03:03

I think we can all we can all uh be

play03:06

fairly excited about the all of us in

play03:09

the computer industry and what we've

play03:10

done

play03:12

Frank is It Something That Matters to

play03:14

Enterprises and and should they be

play03:16

paying attention

play03:17

well yes

play03:19

um you know we we have a long strenuous

play03:22

uh relationship with data you know we're

play03:25

old enough actually Jensen and I might

play03:27

be you will not be you know where I was

play03:30

learning COBOL and COBOL stood for

play03:32

common business oriented language and

play03:35

there was nothing common or business

play03:36

oriented about it

play03:38

um you know in the 80s you know we had

play03:41

SQL structured data that helped a lot

play03:43

because data now kind of made sense rows

play03:45

and columns

play03:47

um but still it's it's it's been hard to

play03:49

have a relationship with data that's

play03:51

natural and productive and insightful

play03:54

all these things and all of a sudden you

play03:57

know we have this huge step function

play03:58

from where we've been all this time and

play04:01

it's like it's just an absolute wow

play04:02

which is why we're all so infatuated

play04:04

with the experiences that we're seeing

play04:08

it's a perfect segue to um news hot off

play04:11

the presses in the last two minutes

play04:12

Nvidia and snowflake just announced a

play04:15

massive expansion to their partnership

play04:17

where you know customers can Leverage

play04:19

The Nvidia AI partnership platform on

play04:22

snowflake for developing custom

play04:24

generative AI models on their own

play04:27

proprietary data can you tell us more

play04:30

about this partnership

play04:31

yeah you know you know snowflake

play04:34

the sort of things that we do there's a lot

play04:36

of people trying to kill us so uh

play04:39

finding partners that have natural

play04:41

alignment with us and have extraordinary

play04:43

capabilities

play04:45

um that doesn't come along you know

play04:47

every day and uh you know obviously

play04:50

Nvidia is an extraordinary place you

play04:52

know in time and in history in terms of

play04:54

the role they play in the entire history

play04:56

you know for us you know to be able to

play04:59

bring data and relationships with large

play05:02

Enterprises everybody that you see here

play05:04

in the room if we combine that you know

play05:07

with the Computing fabric that that we

play05:10

need to enable this technology as well

play05:13

as the entire stack of services to use

play05:15

it productively yeah that's you know I

play05:17

don't want to use the say it's a match

play05:19

made in heaven but it's about as close

play05:20

as you as you're going to get we have

play05:22

full alignment there's no conflict and

play05:24

uh that's that's that's a heck of an

play05:26

opportunity for for all of us in the

play05:28

room here so we're lovers not Fighters

play05:35

so

play05:37

I woke up this morning without a voice I

play05:39

have no idea where it went I'm going to

play05:41

find it here somewhere and so but the

play05:43

the big deal here Sarah is that we're

play05:45

we're going to bring the world's best

play05:47

compute engine

play05:49

to the world's most valuable data you

play05:53

know back in the old days when we first

play05:54

started our career and and uh Frank

play05:57

Frank mentioned how long he's been

play05:59

working I've been doing my job a long

play06:01

time I'm not that old you're old

play06:06

they think that's funny well they know

play06:09

it's true and and

play06:11

so

play06:13

so back in the old days we used to bring

play06:16

data to the computer

play06:17

and the reason for that is because

play06:19

there's so little data to move but these

play06:21

days for all the reasons that we know

play06:23

well the data is gigantic the data is

play06:25

valuable it has to be secure

play06:27

it's regulated

play06:29

governance has to be incredible and so

play06:32

it's tough to move data around the data

play06:34

gravity is real and so it's a lot easier

play06:36

for us to bring our compute engine to

play06:39

Snowflake and our partnership is about

play06:42

about accelerating snowflake but it's

play06:45

also about bringing AI to snowflake you

play06:49

know that at the core

play06:51

the big revolution is about the

play06:53

combination of data plus AI algorithms

play06:56

plus compute engine

play06:58

our combination our collaboration here

play07:00

our partnership brings all of those

play07:03

three things together

play07:04

incredibly valuable data incredibly

play07:07

great AI incredibly great compute engine

play07:10

and the thing that we could do together

play07:12

is to help customers use their

play07:15

proprietary data and write AI

play07:18

applications with it you know the big

play07:20

breakthrough here and and this is I'm

play07:22

sure you guys are going to learn a lot

play07:23

about this during this week is that for

play07:26

the very first time

play07:27

you could develop a large language model

play07:30

you stick it in front of your data and

play07:33

you talk to your data

play07:34

you talk to your data like you talk to a

play07:36

person

play07:37

and that data will be augmented with the

play07:40

would augment the large language model

play07:42

and you'll ask it all kinds of questions

play07:44

about what and how and when and why all

play07:47

of the things that you might query the

play07:50

database in the future you'll talk to

play07:52

the database in the future the

play07:54

combination of a large language model

play07:55

plus knowledge base equals an AI

play07:59

application

play08:00

it's as simple as that

play08:02

a large language model turns any data

play08:06

knowledge base into an application and

play08:09

just think about all of the amazing

play08:11

applications that people have written

play08:12

it's always at the core of it some

play08:15

valuable data now you have a query

play08:18

engine Universal query engine in front

play08:19

of it that's super intelligent and you

play08:22

can get it to of course respond to you

play08:24

but you can also connect it to an agent

play08:26

as you know and this is uh the

play08:29

Breakthrough of of Lang chain plus

play08:31

Vector databases Plus data

play08:34

from large language models

play08:36

groundbreaking stuff is happening

play08:38

everywhere now everybody's going to do

play08:39

it and Frank and I are going to help

play08:41

everybody do that yeah we we've seen

play08:44

um you know in our investing a huge

play08:46

explosion of chat Bots search

play08:48

summarization and Frank you and I were

play08:50

talking a few weeks ago about how much

play08:52

the interface to business data can

play08:54

change and what a big opportunity that

play08:55

is for customers is that um something

play08:58

you expect to show up in the snowflake

play09:00

product or or P or applications that

play09:02

people build themselves

play09:03

well we we expect to be you know

play09:06

visiting with our customers at some

play09:08

point you know relatively soon you know

play09:10

I would say I said look you know we're

play09:13

we're going to push a boundary in terms

play09:15

of what kind of questions you know

play09:17

you'll be able to ask of your data I

play09:19

mean just just connecting natural

play09:20

language I think it's amazing and I I

play09:22

love it right because many more people

play09:24

can be productive and all this kind of

play09:25

stuff but asking really hard questions

play09:27

of your business is a whole you know

play09:30

different matter and I I hope you know

play09:32

we're all right here that the

play09:34

intelligence is in the data and the

play09:35

models are able to extract the the

play09:37

reasoning and intelligence from that

play09:39

data and that will lead to those you

play09:41

know incredibly insightful predictive

play09:43

relationships with data but if not you

play09:47

know we're going to sort of unpack and

play09:49

break it down and understand it so we

play09:52

can keep pushing those boundaries people

play09:55

ask very very hard questions of their

play09:57

data sometimes they have to you know

play09:59

launch whole teams for weeks on end to

play10:01

come back with answers that we even have

play10:03

difficulty trusting you know Frank

play10:05

it's like this this is I mean there's

play10:07

there are tons of customers in the

play10:10

audience who have large customer support

play10:12

organizations yeah and all that data is

play10:15

going into snowflake

play10:17

could you imagine if you had hundreds of

play10:20

your customer support agents sitting in

play10:22

front of you and you just asked them a

play10:25

whole bunch of questions

play10:26

that's the future you're going to put it

play10:28

you're going to put a large language

play10:29

model in front of that customer service

play10:31

database and you're going to talk to it

play10:34

yeah you're going to ask you know what

play10:36

are the most frequent things that are

play10:37

happening to our customers that we could

play10:39

improve on what are the things that they

play10:42

seem to find really delightful about

play10:44

products you could ask those questions

play10:46

in normal human language and it will

play10:48

talk back to you that's the miracle

play10:51

you're going to turn customer

play10:53

service data into a customer service

play10:56

application

play10:58

that's the that's the incredible

play11:00

opportunity a lot of the uh you know the

play11:03

originals of large telcos and Banks I

play11:05

mean they're looking to redefine the

play11:07

economics of their businesses which is

play11:09

really something they have not been able

play11:11

to do and they're thinking you know that

play11:14

that opportunity is coming large call

play11:17

centers that's the examples you you just

play11:19

used right sales organizations that feel

play11:22

incredibly unwieldy and expensive you

play11:24

know to them

play11:26

um so redefining cost structures

play11:28

um I mean people get heady just thinking

play11:30

about the uh the possibility and

play11:32

opportunity yeah we're we're going to be

play11:35

part of this of this journey and I it's

play11:38

a good time to be alive in this business

play11:39

you know one of the ways that we um you

play11:42

know as investors think about this

play11:44

transition is going from software uh 1.0

play11:47

which is uh you know very deterministic

play11:50

code written function by function by

play11:52

Engineers to software 2.0 which is you

play11:56

know you're optimizing a neural network

play11:57

with um you know carefully collected

play12:00

labeled training data and I think the

play12:02

opportunity that you guys are really

play12:03

helping people Leverage is what I think

play12:05

of as software 3.0 which is we are

play12:08

working with this set of foundation

play12:09

models that are incredibly capable on

play12:12

their own but they still need to work

play12:14

with Enterprise data and custom data

play12:17

sets it's just much cheaper to go

play12:19

develop those applications against them

play12:21

right you can think of them as the

play12:22

world's best query engine on top of your

play12:25

most important data so I think that's

play12:27

like a really attractive opportunity to

play12:29

how much the cost comes down for

play12:30

customers so actually this is a this

play12:33

raises an interesting question for

play12:35

people who are paying deep attention to

play12:38

this area

play12:39

um you know one question is foundation

play12:41

models are very general uh

play12:44

can they just do everything like why do

play12:46

we need custom models and Enterprise

play12:47

data anyway

play12:49

well I mean there's there's a school of

play12:51

thought in there it's like maybe but it

play12:52

will be incredibly expensive right so we

play12:55

we have you know very broad General

play12:57

models that can do poetry and process

play13:00

the Great Gatsby and summarize it and

play13:02

all these and math problems broken down

play13:04

how I wish I had that as a kid right but

play13:07

you know in business we don't need that

play13:09

right we're looking for a co-pilot that

play13:12

has

play13:13

extraordinary Insight in a very narrow

play13:16

set of data but a very complex Sarah

play13:18

data we need to understand business

play13:19

models and business Dynamics now that's

play13:22

that is computationally not as expensive

play13:24

because you just don't need to be

play13:26

trained on a million things you need to

play13:27

be trained on you know very few and very

play13:30

very deep subject matter so if we think

play13:32

of these models as

play13:34

the general knowledge of the

play13:40

from these AI applications that is not

play13:43

in the general knowledge of the internet

play13:44

yeah well I'll give you an example

play13:47

um you know one of my uh I'm on the

play13:49

board of instacart and you know one of

play13:51

the topics a great customer of ours as

play13:54

well and you know one of the topics that

play13:56

comes up in businesses like doordash and

play13:58

all the others is you know they drive

play14:00

Top Line with huge amount of marketing

play14:02

expense and it's very very hard to do

play14:04

and then they onboard a customer and

play14:06

then the customer places an order and

play14:08

then the customer either doesn't come

play14:10

back or he comes back 90 days later it's

play14:12

very erratic and they call that churn

play14:14

churn is the most you know most

play14:17

incredibly complex problem to analyze

play14:19

because you may not come back for one

play14:21

reason I may not come back for another

play14:22

reason right so people want to find

play14:26

answers to these questions it is in the

play14:28

data it is right but but try to reason

play14:31

your way you know to that right that's

play14:34

that that's an example that is worth an

play14:36

enormous amount of money you know right

play14:38

and the answer to you know why is this

play14:41

instacart customer churning is not in

play14:42

the general knowledge of the internet no

play14:44

somewhere in the instacart data we hope

play14:46

yes it is and we we see other software

play14:50

companies and within in product

play14:52

marketing and so on they try to you know

play14:54

understand behaviors to the point where

play14:56

they can then because if you understand

play14:59

the problem you can do something about

play15:01

it and be productive in changing the

play15:03

turn ratios if you're wrong everything

play15:06

you're doing is is ineffective and that

play15:08

that is literally what's going on you

play15:10

know

play15:11

Jensen part of the partnership is you

play15:12

guys bringing your pre-trained

play15:14

foundation models and expertise

play15:16

to uh you know the snowflake platform

play15:19

right so how do you think about how

play15:21

those models should interact with

play15:22

Enterprise data

play15:25

uh first of all I've never met

play15:30

um an employee that was too smart

play15:34

you know just it embarrassed me how

play15:36

smart they were

play15:37

I've never hired a new college grad I

play15:40

just got too smart too smart

play15:45

so we can make the pre-trained models as

play15:48

smart as we can and then we still have

play15:50

to onboard them we still have to

play15:52

fine-tune them we still have to you know

play15:54

we have to still specialize them into

play15:56

our particular craft and so our strategy

play15:58

and our offering is state-of-the-art

play16:02

pre-trained models of a variety of sizes

play16:04

and sometimes you need to create a very

play16:07

large pre-trained model so that it can

play16:08

generate prompts so that you could teach

play16:10

the smaller models which generates

play16:11

prompts to teach even smaller Pro Models

play16:14

and the smaller models could run almost

play16:16

almost anything and maybe the latency is

play16:19

really really low however it doesn't

play16:21

generalize as well it's you know zero

play16:23

shot capability is probably a lot more

play16:25

limited and so you might have several

play16:28

different types of different different

play16:29

sizes of models but in every single case

play16:32

you have to do supervised fine-tuning

play16:34

you have to do reinforcement learning

play16:36

human feedback so that you could keep it

play16:39

and keep it aligned to your your um are

play16:42

your goals and principle and you need to

play16:44

guard rail it you need to augment it

play16:46

with Vector databases and things like

play16:48

that and and so all of that comes

play16:50

together in a platform and we have the

play16:52

skills and the knowledge and the basic

play16:55

platform to help them create their own

play16:57

AI

play16:58

and um and then connected to the

play17:01

incredible data that Frank has in uh

play17:04

snowflake now ultimately what every

play17:07

Enterprise customer will do is not

play17:10

ultimate their goal shouldn't be to

play17:12

think about how do I build a large

play17:14

language model their goal should be how

play17:15

do I build an AI application that solves

play17:18

this particular problem and it might

play17:21

that application might require 17

play17:24

questions in previous prompts that it

play17:27

finally came to the right answer and

play17:29

then you might say now I would like to

play17:31

take these I would like you to to to

play17:33

write me a program and make it could be

play17:36

a SQL program it could be a Python

play17:37

program so that I could do this

play17:40

automatically in the future and so you

play17:43

still have to guide this AI through as

play17:45

you know so that it could ultimately

play17:47

give you the right answer but then after

play17:49

that you can create an application that

play17:51

could run as an agent 24 7. Looking for

play17:54

circumstances like this and revealing it

play17:56

to you in advance and so our job is to

play18:00

help customers build these AI

play18:01

applications that are guard railed and

play18:03

specific and customized to them and then

play18:06

lastly we're all going to be

play18:08

intelligence manufacturers

play18:10

in the future we'll hire or hire

play18:13

employees of course and um but we're

play18:14

going to create a whole bunch of agents

play18:16

and these agents could be created with

play18:19

Lang chain and or something like that

play18:21

that

play18:22

connects models and knowledge bases and

play18:25

other apis and you deploy in the cloud

play18:30

and connected to all the snowflake data

play18:32

and and you'll operate these AIS operate

play18:36

these AIS at scale and you'll

play18:38

continuously refine these AIS and so

play18:41

every one of us are going to be

play18:42

manufacturing AIS and we're going to be

play18:44

running AI factories

play18:46

we're going to put that infrastructure

play18:48

in Snowflake and so customers could work

play18:51

with their data there

play18:52

train and develop their models there and

play18:54

then operate their AIS there and so

play18:57

snowflake will will be your data

play18:59

repository and your bank

play19:02

and Frank is sitting on a gold mine of

play19:05

data as you guys know your gold mine of

play19:07

data and all and and snowflake is going

play19:10

to help you turn that gold mine of data

play19:13

into incredible intelligence and all of

play19:16

you will be running AI factories all of

play19:18

it in Snowflake that's the goal

play19:21

one of

play19:23

yeah

play19:29

there's a lot of gold mines out there

play19:32

I I actually saw the the Nvidia NeMo

play19:36

stack you know running inside a

play19:38

snowflake on the partner pavilion

play19:40

floor at three o'clock this afternoon so

play19:43

this is not just Pie in the Sky I mean

play19:45

this is literally happening as we sit

play19:47

here so that's what's so exciting I mean

play19:49

the deployability of these Services is

play19:51

quite High yeah

play19:54

one of the things that res

play19:57

ident is talking to some of the

play19:58

technical

play19:59

working on this and I said you know

play20:01

where did the idea come from

play20:02

came from these two leaders but also

play20:05

came from seeing you know large

play20:07

snowflake customers struggled to get

play20:09

data out to GPU Computing to do machine

play20:12

learning and then get it back into

play20:14

Snowflake and you know seeing that you

play20:17

have the most sophisticated customers

play20:18

already going down that path and just

play20:20

making it possible to consolidate in one

play20:22

place with one stack that's obvious to

play20:24

operate on seems like a huge win

play20:27

you should be able to run on Nvidia

play20:29

everywhere and anywhere you know like I

play20:31

said we're lovers and so so if

play20:36

if you if you need if you need Nvidia

play20:39

GPU Computing we'll come find you

play20:42

and all of you are in snowflakes so we

play20:44

found you

play20:47

but I do have one important question on

play20:49

this

play20:49

um if we pay attention to these

play20:52

companies that are you know training

play20:54

this generation of llms it is very

play20:57

expensive to do so on huge numbers of

play20:59

gpus is you know what should customers

play21:02

expect about

play21:03

um you know GPU Computing spend

play21:06

it's not free uh and and uh that's the

play21:09

wrong answer yeah I I know and by the

play21:12

way everybody in this room uh is is

play21:14

running consumption models uh it's very

play21:17

different from a capacity model

play21:19

um you have to really Master the model

play21:21

you have to have levels of discipline

play21:23

around it it's hard to come to terms

play21:25

with so you know when you when you're

play21:27

running workloads queries or AI

play21:30

factories whatever you want to call it

play21:31

there's a cost so in other words it's

play21:34

not just fun and games you can ask the

play21:36

question what are we having for dinner

play21:37

tonight but is that a question worth

play21:39

asking just because it generates an

play21:40

interesting answer right so there

play21:43

there's the consumption model is already

play21:45

you know an enormous evolution in in the

play21:48

world of software this is going to you

play21:50

know take that to another level I mean

play21:52

Nvidia are wonderful products but

play21:55

they're not free last time I checked you

play21:57

know

play21:58

I think Frank better let me let me

play22:00

answer this one

play22:06

sympathetic CEO so so for

play22:09

for first of all he's announcing they're

play22:11

all free actually first of all first of

play22:14

all there's a coupon coming

play22:18

first of all

play22:20

as

play22:21

as you know

play22:22

um we we built we we have five AI

play22:26

Factories at Nvidia and uh four of them

play22:30

are in the world's uh top 500

play22:32

supercomputers and another one is coming

play22:34

online as we speak we use those super

play22:37

computers to do pre-trained models and

play22:40

so when you use our NeMo AI Foundation

play22:43

service inside snowflake the first thing

play22:45

you're going to get

play22:47

is a state-of-the-art pre-trained model

play22:50

that tens of millions of dollars of

play22:53

expense has already gone into it not to

play22:54

mention all the r d that went into it

play22:57

and so it's pre-trained and it's like a

play23:00

new college grad pre-trained super smart

play23:04

and and not so not so smart that you're

play23:06

scared not so smart well not so smart as

play23:09

me but not I'm just

play23:11

smart super smart

play23:14

and so so if they're pre-trained and

play23:17

then there's a whole bunch of other

play23:18

models around it and those models for

play23:20

fine-tuning for uh you know human human

play23:24

feedback reinforcement learning human

play23:25

feedback for augmentation all of those

play23:28

models are a lot more cost effective

play23:31

to train and so now you've adapted that

play23:34

pre-trained model to your functionality

play23:37

to your guard rails to be able to

play23:40

optimize for the type of skills or

play23:43

functionality you would like it to have

play23:45

and and augment it with your data and so

play23:49

that is going to be a lot more cost

play23:50

effective and the the important thing is

play23:52

here what I'm what I hope what I hope

play23:55

and I believe this this will happen is

play23:58

that you'll be you'll be able to develop

play24:00

AI applications connected to your data

play24:02

at snowflake in a matter of days not

play24:06

months

play24:07

you should be able to build AI

play24:09

applications quickly in the future and

play24:11

the reason for that is because we're

play24:12

seeing it happening in real time right

play24:14

now as we speak isn't that right

play24:15

communities are

play24:16

um you know almost enthusiasts and

play24:18

developers are creating all kinds of

play24:19

applications with Lang chain and uh

play24:21

Pinecone and uh and whatever data that

play24:24

they have uh chat chat PDF is pretty

play24:28

pretty terrific you load a bunch of PDFs

play24:31

into it you talk to your PDF and uh

play24:34

instead of reading a research paper this

play24:37

morning I was asking this chat PDF about

play24:40

this particular piece of research which

play24:42

is a lot more interesting having a

play24:44

conversation about it than having to

play24:45

read the whole thing and it happens to

play24:47

be like 49 pages and so being able to

play24:49

ask a few questions about it is really a

play24:51

lot more productive you're going to be

play24:52

able to do this almost everywhere yeah I

play24:54

think one of the things that's

play24:55

encouraging if you just think about

play24:57

software 2.0 versus software 3.0 is one

play25:00

way to think about the foundation models

play25:01

is

play25:02

you know 95 of the training costs has

play25:05

been borne already by somebody else

play25:06

that's right right you know as early

play25:08

stage investors did you hear that

play25:12

thank you thank you Jensen another

play25:14

incredible Frank Jensen yeah incredible

play25:18

it's 95 95 off yeah

play25:24

can't imagine a better deal slightly

play25:26

different than what I said but

play25:28

but I think it's I think it's a real

play25:30

Dynamic right because you know we're

play25:32

investors in companies like seek in the

play25:34

analytics automation space Harvey in the

play25:36

legal space where uh you know these are

play25:38

very young companies but they're

play25:40

applications that have gone to real

play25:41

business value in six months or less

play25:43

yeah and part of it is because they're

play25:45

starting with these pre-trained models

play25:46

and I imagine that's a massive

play25:48

opportunity for the Enterprise yeah

play25:50

every single company will have hundreds

play25:53

if not thousands of AI applications and

play25:56

you're just connected to all kinds of

play25:58

data in your company yeah so all of us

play26:01

have to get good at building these

play26:03

things

play26:04

one of the questions I've been hearing

play26:07

from you know large Enterprise

play26:09

practitioners is

play26:11

we have to go invest in this AI thing do

play26:13

we need a new stack like how should we

play26:15

think about this in relation to our

play26:17

existing data stack how do you how do

play26:20

you think about that you know I think

play26:22

the uh it's it's evolving actually

play26:25

something that we're going to talk about

play26:26

you know tomorrow is to sort of you know

play26:29

show all the models that are becoming

play26:32

possible some are already possible and

play26:34

already being used are in production and

play26:37

then you know they're getting

play26:38

progressively you know tighter more

play26:40

secure more governed

play26:42

um and and and and all of that so you

play26:44

know we don't have a real clear view of

play26:47

this is the reference architecture that

play26:49

everybody's going to use you know some

play26:51

people are going to have some set Of

play26:53

Central Services you know Microsoft has

play26:56

its you know open AI you know version

play26:58

running in Azure and a lot of their

play27:00

customers uh you know are interacting

play27:02

it's Azure to Azure it's not exactly you

play27:04

know uh you know where where we want it

play27:06

to be but we're not clear on what model

play27:09

is going to be sort of the dominance you

play27:12

know form that this is going to take

play27:13

which is why we will be traversing many

play27:16

paths because I think the market is

play27:17

going to sort itself on what is the

play27:19

trade-off between how hard is it and and

play27:22

and how easy is it to use and what does

play27:25

it cost and all these kinds of things so

play27:27

we're we're speculating a little bit we

play27:29

see people rigging things up like that

play27:32

right now and asking cute little

play27:34

questions why is my portfolio down today

play27:36

you know all this kind of stuff that

play27:38

that's all fine right but that's not

play27:39

that's the beginning that's not going to

play27:41

be the end State you know in terms of

play27:43

the sort of production Deployable

play27:46

applications because the whole the whole

play27:48

security side of the house is going to

play27:50

weigh in and and Enterprises like are we

play27:53

violating any copyrights here when we

play27:55

use this this kind of all these

play27:57

questions

play27:58

um you know we're very enthralled right

play28:00

now with the technology and and and for

play28:02

good reason but there's going to be a

play28:03

level of reality that's going to start

play28:05

to settle in as well you know well Sarah

play28:08

you're at the core of your question and

play28:10

and I agree with everything that Frank

play28:11

said

play28:13

um but you know that that we're living

play28:15

through right now the First Fundamental

play28:18

Computing platform change in 60 years

play28:24

if you just read the press release of

play28:25

the IBM system 360 you will just you

play28:28

will hear about central processing units

play28:31

i o subsystems dma controllers virtual

play28:33

memory multitasking it's scalable

play28:36

Computing forward and backwards

play28:37

compatibility you're going to hear about

play28:39

all those literally written in 1964 and

play28:43

those words have carried us with CPU

play28:45

scaling right for

play28:48

60 years but that has run its course we

play28:51

can't scale CPUs anymore and that is now

play28:54

well understood at exactly the same time

play28:56

that all of a sudden software is

play28:59

different the way that software is

play29:00

written the way that software is

play29:02

operated and what software can do is

play29:05

profoundly different than it was before

play29:06

isn't that right you call it software

play29:08

2.0 and now we're talking about software

play29:10

3.0 the fact of the matter is Computing

play29:13

has fundamentally changed and so we're

play29:16

seeing two fundamental Dynamics

play29:17

happening at the same time that's the

play29:20

reason why things are shaken up so hard

play29:21

on the one hand you can't just keep on

play29:24

buying CPUs you know that and the reason

play29:26

for that is because if you bought a

play29:27

whole bunch more CPUs next year your

play29:30

Computing throughput would be no more

play29:32

than if you didn't touch it at all

play29:33

because it's the end of CPU scaling you

play29:36

would spend a whole bunch more money you

play29:37

wouldn't get a lot more throughput and

play29:39

so the answer is you have to go

play29:40

accelerate it the Turing Award winners

play29:43

spoke about acceleration Nvidia has

play29:45

pioneered acceleration and and so

play29:47

accelerated Computing is now here the

play29:50

second part of it of course is the whole

play29:52

operating system on top of that computer

play29:54

is profoundly different we have a layer

play29:57

called Nvidia AI Enterprise Nvidia AI

play30:00

Enterprise data processing training

play30:03

inference deployment that entire

play30:05

end-to-end is now integrated into or in

play30:08

the path of being integrated into

play30:09

snowflake so that the compute engine

play30:12

underneath from the beginning of data

play30:15

processing all the way to the large

play30:16

language models are now going to be

play30:18

completely accelerating we're going to

play30:19

turbocharge The Living Daylights out of

play30:21

snowflake that's our that's our mission

play30:24

and and you'll be able to do more and

play30:26

you'll be able to do more with less and

play30:28

so so the the um and it's it's it's uh

play30:32

very very evident if you go to any of

play30:36

the clouds you will see that Nvidia gpus

play30:38

are the most expensive compute instances

play30:43

but if you put a workload on it you will

play30:45

find

play30:46

that we do it so fast it's as if you've

play30:49

got a 90 discount you said 90 earlier is

play30:53

in fact ten to one and so what's what's

play30:55

interesting is that we're the most

play30:57

expensive instance but we're the most

play30:59

cost effective TCO

play31:02

so if your job is to run workloads then

play31:06

the best way to do it is acceleration an

play31:08

example workload is training large

play31:09

language models an example workload is

play31:12

fine-tuning large language models if you

play31:14

want to do that

play31:15

absolutely do not

play31:18

do not not accelerate it please do

play31:21

accelerate it so accelerate every

play31:23

workload you can and that's that's the

play31:25

reinvention of the whole stack of the

play31:28

processor is different the operating

play31:29

system is different the large language

play31:30

model is different and now the way you

play31:32

write AI applications is different isn't

play31:34

that right software 3.0 you don't have

play31:36

to write it at all

play31:38

AI applications we're all going to write

play31:40

applications we're all going to connect

play31:43

our prompts and our contexts and our few

play31:47

python commands right we're going to

play31:49

connect it to a large language model

play31:50

we're going to connect it to our own

play31:51

database or the corporate database and

play31:53

we'll develop our own application

play31:55

everybody's going to be an application

play31:56

developer

play31:57

but the constant

play31:59

is still your data and you still need to

play32:01

fine tune it yeah you know I I think you

play32:04

know what in the early days of stuff

play32:07

like we really because it's a cloud

play32:09

computing fabric faster is cheaper and

play32:12

faster has always been more expensive

play32:13

and all of a sudden faster is cheaper

play32:15

right it's an inverted

play32:17

mentality so there's sometimes people

play32:20

want to under provision thinking it's

play32:21

cheaper ends up being more expensive

play32:23

right so this this counter

play32:25

counterintuitive the other thing you

play32:27

mentioned in terms of compute paradigms

play32:29

I mean the idea of

play32:31

uh the data the work going to the data

play32:34

instead of the data going to the work we

play32:37

are completely retracing ourselves and

play32:39

you know for

play32:40

60 years pick a number we've been

play32:43

pushing the data to the work and it has

play32:45

led to massive siloing and segmentation

play32:48

that's going to be really difficult if

play32:50

you want to have ai factories right

play32:52

that's right so we we have to bring that

play32:54

work to the data and you know we're

play32:55

doing that now together and I think

play32:57

that's we're doing it the right way you

play32:59

know so yeah that's right yeah

play33:02

uh uh maybe just as advice for the

play33:05

audience about what you're seeing from

play33:07

your own customers or your predictions

play33:08

where do you expect that customers will

play33:12

get the most value from AI most quickly

play33:16

well most quickly it's not the most

play33:19

value but the most quickly yeah two

play33:21

separate tanks I mean most quickly is

play33:23

that

play33:24

um we're going to be you're going to see

play33:26

augmented query all over the place

play33:28

because that's just relatively easy to

play33:31

add you don't even have to be literate

play33:33

to get value out of data now which is

play33:35

quite an amazing uh thing so the

play33:37

ultimate democratization of your

play33:39

relationship and interaction so that's

play33:41

that's all great right it's like search

play33:42

on steroids right and chat and stuff you

play33:45

think that's more people interacting

play33:46

with the data yeah but but you know it's

play33:49

it's like you're at these cute little uh

play33:51

you know questions to your dashboards

play33:53

you know I mean dashboards they are

play33:55

relatively static and the data is one of

play33:58

this but then I can bring these

play33:59

questions to that data and that's that's

play34:02

low-hanging fruit and then people are

play34:04

going to go to that board meeting and

play34:05

say look at what we've got you know

play34:06

everybody will be applauding and this is

play34:09

really exciting

play34:10

um but then I'll be like phase one and

play34:12

then then it really starts you know once

play34:14

we sort of get over to low hanging fruit

play34:15

we start really focusing on a much

play34:18

harder type questions I'm talking about

play34:20

proprietary Enterprise data and

play34:23

combinational structured unstructured

play34:25

all of it right how do we mobilize that

play34:28

what's an example of a hard question in

play34:30

phase two in whatever industry well I I

play34:33

mentioned the problem of churn in

play34:35

consumer Industries the the problems

play34:38

that we have in Supply Chain management

play34:39

because they they when Supply chains can

play34:42

be extraordinarily complex and if there

play34:45

is an event then you know how do we you

play34:48

know re-configure rejigger a supply

play34:50

chain they're running these extremely

play34:52

intense models you have to ask these

play34:54

questions what do I do now and that does

play34:56

the event does happen and by the way Supply

play34:58

chains are comprised of many many

play35:00

different entities it's not a single

play35:01

Enterprise right so by the way this is a

play35:04

problem that has never been solved in

play35:06

the in the history of computing Supply

play35:07

Chain management has never been platform

play35:09

it's pretty much an email you know

play35:11

spreadsheet you know type of a business

play35:12

you know with small exceptions here and

play35:15

there so so it's exciting you know

play35:17

collapsing these these enormous call

play35:20

center Investments that we we have

play35:22

pricing optimizations in the world of

play35:25

retail right like I said it's a

play35:26

redefinition of business models that

play35:29

people are going to see and it's it's

play35:32

exciting but that's that's the real

play35:34

potential that I think you know CEOs of

play35:36

large institutions are after you know

You know, Jensen, where do you see the most impact coming in the next five years across the enterprise?

I would ask myself, number one: what is my single most valuable database? And the second thing I would ask myself is: if I had a super, super smart person sitting in front of it, and everything goes through that scary-smart intelligence, what would I ask that person? Can I guess, is it the customer database? The customer database in Frank's case, because Frank has a lot of customers; I don't have that many customers. But for us it's the asset databases, if you're in manufacturing and supply chain. My supply chain is super complicated, and my design database is super complicated. It's impossible for Nvidia to build our GPUs without AI anymore, because none of our engineers can go through the number of iterations and the amount of exploration that AIs can do for us. So when we were coming up with AI, our first application was ourselves. Hopper couldn't have been designed without AI, and our next generation cannot be designed without AI. None of our optimizing compilers could be done without AI: just optimizing the graph is so complicated, because there are so many targets, so many GPUs, so many different neural network models, so many different constraints you're optimizing for, that no human could possibly compile for that, optimize for that. So we have AI running continuously, looking for the best optimal solutions.
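
As a rough illustration of that kind of continuous automated search, and emphatically not Nvidia's actual compiler stack, here is a toy schedule search over an invented space of tile sizes and fusion choices, scored by a made-up cost model:

```python
# Illustrative sketch only: automated schedule search reduced to a toy. We score a few
# made-up choices (tile size, whether to fuse two ops) for several hypothetical targets
# with a fake cost model; production compilers use learned cost models and far larger
# search spaces.
from itertools import product

TARGETS = {"gpu_small": 48, "gpu_large": 132}   # invented "SM count" per target

def fake_cost(sm_count: int, tile: int, fuse: bool) -> float:
    """Stand-in cost model: smaller is better. Entirely made up for the example."""
    waves = max(1, 4096 // (tile * sm_count))
    return waves * (0.8 if fuse else 1.0) + tile * 0.01

def best_schedule(sm_count: int):
    space = product([32, 64, 128, 256], [False, True])   # tile sizes x fusion choice
    return min(space, key=lambda cfg: fake_cost(sm_count, *cfg))

for name, sms in TARGETS.items():
    tile, fuse = best_schedule(sms)
    print(f"{name}: tile={tile}, fuse={fuse}")
```

The design point is simply that the search runs per target and per model, which is exactly the combinatorial blow-up no human can cover by hand.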

And so we apply our AIs to our own data, of course. Our bugs database is a perfect place, because we have so much code and we've got to figure out where the problems are and how to enable our engineers as quickly as possible. We're a full-stack company: if you look at the amount of code that goes into Nvidia AI, it's some several hundred packages of software that come together so that an application can sit at the top, be accelerated by a factor of 50 on top of a GPU, and then scale across a full data center. That piece of software, called Nvidia AI Enterprise, is insanely complicated. What we need in the future, and these are some of the things we're working on now, is to use an AI to figure out how to security-patch it and how best to maintain it, so that we keep compatibility and don't have to disturb the entire upper layer, while in the meantime supporting backwards compatibility or patching a particular security hole, and so on.

And these are questions that AI is capable of answering for you.

That's right, exactly. We can use a large language model to answer those questions and find the answer for us, or reveal the exposure to us, and then an engineer can go in and fix it.

Yeah, or recommend a fix, and then we can be the human in the loop who acknowledges whether it's a good fix or whether there's a better one.
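
A minimal sketch of that human-in-the-loop pattern, assuming a hypothetical `call_llm` helper and an invented bug record (this is not Nvidia's tooling, just the shape of the workflow described above):

```python
# Illustrative sketch only: the model drafts an assessment and a candidate fix, and an
# engineer approves or rejects it before anything changes. `call_llm` and the bug record
# are hypothetical stand-ins.
from dataclasses import dataclass

@dataclass
class Bug:
    bug_id: int
    component: str
    report: str

def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM call; returns a canned suggestion for the demo."""
    return "Likely null-pointer dereference in init path; guard the handle before use."

def triage(bug: Bug, approve) -> dict:
    suggestion = call_llm(f"Component: {bug.component}\nReport: {bug.report}\nSuggest a fix.")
    return {"bug_id": bug.bug_id,
            "suggestion": suggestion,
            "status": "approved" if approve(suggestion) else "needs-human-rework"}

bug = Bug(4711, "driver/init", "Crash on startup when no display is attached.")
# The "human in the loop": here a trivial rule, in practice an engineer reviewing the diff.
print(triage(bug, approve=lambda s: "guard" in s))
```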

Yeah. When we talk to practitioners, we hear: help us understand our data better across the organization. That's the phase one you talked about, how can we just ask more questions and let more people ask those questions. We also hear: change the margin structure of our business, and then expand the scope of what we are doing for our customers. I think what you're describing, Nvidia actually using it within engineering, across problems that are too complex for humans to handle today, is actually new.

But it makes sense. To be honest, it makes a lot of sense, because you have to traverse so many layers, and no person could see all those layers. So wherever the data is so valuable and so complex, I just ask myself: what's my most valuable data?

Oh, we have great customers. Take auto loans, for example: every time they price a loan, the amount of computation and data that gets involved in pricing that loan, I mean, you're a market of one to them, and one basis point here or there, at the scale they operate, becomes real money, and it's only computationally understandable. Or look at auto insurance: they use telemetry data from the devices in your car to price risk, because that's the data that determines whether you're going to have claims or not. In other words, that business can't be run without telemetry data anymore, because the competition drives prices down while you need to drive profits up at the same time, so you need to really understand the cost of the risks you're taking. We already cannot live without data in many, many of the businesses that we're in. It's impossible.
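
To make the telemetry-pricing example concrete, here is a toy risk score and premium calculation with invented features and weights; real actuarial pricing uses far richer data and is regulated, so treat this purely as an illustration:

```python
# Illustrative sketch only: a toy telemetry-based risk score of the kind described for
# auto insurance. The features, weights, and premium formula are invented.
from dataclasses import dataclass
import math

@dataclass
class Telemetry:
    hard_brakes_per_100km: float
    night_driving_share: float   # 0..1
    avg_speed_over_limit: float  # km/h above the posted limit, averaged

def risk_score(t: Telemetry) -> float:
    """Logistic score in (0, 1): higher means more expected claims."""
    z = (-2.0 + 0.35 * t.hard_brakes_per_100km
         + 1.5 * t.night_driving_share
         + 0.12 * t.avg_speed_over_limit)
    return 1.0 / (1.0 + math.exp(-z))

def monthly_premium(t: Telemetry, base: float = 60.0) -> float:
    return round(base * (1.0 + 1.8 * risk_score(t)), 2)

cautious = Telemetry(0.5, 0.05, 0.0)
risky = Telemetry(6.0, 0.40, 12.0)
print(monthly_premium(cautious), monthly_premium(risky))
```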

I can't let you guys get away without asking at least one more challenging question here. We have a lot of portfolio companies, and they have ML and AI workloads; there's also a series of cloud providers out there. Why Snowflake and Nvidia for their ML workloads?

What was the question? Hang on a second. I think hers is a cocktail. You want this one too? No, no, yours is water; I think mine's not, yeah. I was just saying, I'm the guest, Frank. I also wanted a cocktail. What did you say, what was the last part?

The question is: there's a bunch of cloud providers out there who are vying for your ML, or your AI workload, business. Why Nvidia and Snowflake?

Well, we're the best.

That's a good answer, yes.

And we're going to come to a theater near you and prove it, of course. The proof is in the pudding. We always aim for outcomes, your outcomes, not ours, and we're going to leave it all in the field. I find this is the most interesting time to be alive in this business, because the ability to aim for superlative outcomes is there now. For a long time, we talked about data warehousing, and it used to be about begging for a 2:30 a.m. time slot three months from now; that was the state of the business. Look at where we are now. It's just unbelievable, the types of problems that we're talking about now.

So it's shipping an intelligent application in days and weeks.

Yeah. I think not everybody realizes how much intelligence is embedded in the data they look at every day, how much there is to be unlocked in insights and impact. That's where we are all going to come in and make that happen.

You know, Sarah, there's a reason that everybody is in Snowflake. There's a reason for that: because this is the safest, best place to store your data and to process your data. And so, for everybody who's in Snowflake, we are your best answer. For the very first time, if you have a data warehouse, we're going to connect the AI factory right next to it, and you're going to be producing intelligence, the most valuable commodity the world has ever produced. You are sitting on a gold mine of natural resources: your company's data, proprietary data. We're now going to connect it to an AI engine, and on the other end of that is intelligence spewing out every single day, unbelievable amounts of intelligence coming out the other end, even while you sleep. This is the best thing ever. So if you're a Snowflake customer, as many, many are, Frank and I are going to help you turn your data into intelligence.

That's a great way to summarize it; I can't improve on that. It was a great conversation, guys. Welcome to Snowflake Summit, and congratulations on your partnership. Thank you so much. Thank you so much.

Right here, let's do it. Thank you, great job. Oh, one more thing. Jensen, we're not going to let you go home empty-handed. I mean, this is one of the most valuable... Wow, wow, it's Snowflake and Nvidia branded, look at this, wow.

[Applause]

This is awesome, thanks. Awesome. Let's give it up, everybody, thank you. Okay, you can get on it, you have to stand on it, actually. I don't want to ruin it. What's up?
