Full Keynote: Satya Nadella at Microsoft Build 2024

Microsoft
21 May 2024 · 52:43

Summary

TL;DR: The video script outlines the transformative impact of AI across various sectors, emphasizing the accessibility of expertise through personal AI assistants like GitHub Copilot. It showcases AI's potential in enhancing productivity, creativity, and learning, highlighting innovations from Microsoft, including AI-first PCs, Azure AI services, and tools like Copilot Studio for customizing AI workflows. The script also underscores the importance of security and the democratizing force of AI in creating more accessible and innovative experiences.

Takeaways

  • 📊 The industry is moving towards a common vision of 'information at your fingertips', emphasizing accessibility and expertise through AI.
  • 🌐 AI is becoming ubiquitous, aiding in various sectors such as farming, laboratories, boardrooms, and classrooms, highlighting its universal applicability.
  • 🔧 GitHub Copilot is reported to save developers about 50% of their time, allowing for increased efficiency and innovation.
  • 🧑‍🏫 AI is being utilized as a personal tutor, helping individuals learn new skills, including complex subjects like banking.
  • 🚜 Generative AI is being employed to improve productivity, with examples such as helping farmers analyze the condition of their ponds and boosting agricultural productivity.
  • 🎨 AI is unlocking creativity, providing detailed descriptions that inspire artwork, showcasing its role in the arts and imagination.
  • 👩‍🏫 Teachers are now able to create customized lesson plans with the help of AI, indicating a shift in educational approaches and tools.
  • 🤖 The introduction of Copilot+ PCs represents a new category of AI-first computers, designed to be the fastest for AI operations.
  • 🛠️ Windows Copilot Runtime aims to be as transformative for AI as Win32 was for graphical user interfaces, offering a new layer for developers.
  • 🌟 Microsoft is focusing on democratization and sustainability, expanding AI infrastructure globally while aiming for 100% renewable energy in data centers.
  • 🔬 The tech stack is evolving at every level, from data center power management to edge computing, adapting to new workloads and use cases.

Q & A

  • What is the core vision presented for the industry in the script?

    -The core vision is 'information at your fingertips', which has evolved into 'access to expertise', indicating a shift towards making AI available for everyone, everywhere.

  • How does the speaker describe the impact of AI on productivity and innovation?

    -The speaker mentions that AI, such as GitHub Copilot, can save about 50% of time, allowing individuals to use that time for other innovative tasks.

  • What is the significance of the new era of AI mentioned in the script?

    -The new era of AI is significant because it represents a shift from mere access to information to providing access to expertise, which can be utilized across various sectors like farming, laboratories, boardrooms, and classrooms.

  • How does AI technology facilitate learning according to the script?

    -AI technology serves as a personal tutor, helping individuals learn new skills such as banking, loan applications, and saving money, with the potential to reimagine student learning worldwide.

  • What role does Generative AI play in enhancing productivity and insights?

    -Generative AI can learn from data to improve productivity, for example by helping farmers boost their output, and to provide insights in near real-time through personal coaching.

  • How is AI contributing to creativity, as described in the script?

    -AI is unlocking creativity by providing detailed descriptions that fuel imagination, aiding in activities like painting artwork, and allowing teachers to create lesson plans tailored to specific needs.

  • What is the significance of the introduction of 'Copilot+PCs' in the script?

    -The introduction of 'Copilot+PCs' signifies the creation of the fastest AI-first PCs, which are part of the new platforms being developed to support AI applications and solutions.

  • What is the role of the 'Windows Copilot Runtime' in the tech stack?

    -The 'Windows Copilot Runtime' is introduced to make Windows the best platform for building AI applications, similar to how Win32 was pivotal for the graphical user interface.

  • How does the speaker describe the advancements in AI infrastructure?

    -The speaker describes the AI infrastructure advancements as comprehensive and scalable, with Azure being built as the world's computer, offering the most complete selection of AI accelerators and focusing on sustainability.

  • What is the significance of the partnership with NVIDIA and AMD in the AI infrastructure?

    -The partnership with NVIDIA and AMD is significant as it brings the latest AI accelerators to Azure, offering the best performance and cost for AI workloads, and enabling advanced AI capabilities like confidential compute on GPUs.

  • How does the script emphasize the importance of democratization in technology?

    -The script emphasizes democratization by showcasing how AI models developed in one part of the world can be used to improve lives in another, and by making AI capabilities more accessible to developers globally.

Outlines

00:00

🤖 AI Transformation and Accessibility

The speaker introduces the concept of 'information at your fingertips' and discusses the evolution of AI to become accessible expertise for everyone. They highlight the impact of AI on various sectors like farming, education, and business, and share personal experiences of time saved through GitHub Copilot. The transformative potential of AI in reimagining learning and productivity is emphasized, along with its ability to unlock creativity and enhance daily tasks.

05:02

🚀 Accelerating AI Capabilities

This paragraph delves into the rapid advancements in AI, comparing the scaling laws of Deep Neural Networks (DNNs) to Moore's Law. The speaker discusses the new natural user interface that supports multimodal input and output, the importance of memory in AI for context retention, and the new capabilities in reasoning and planning. They also reflect on the significant changes brought about by developers utilizing AI to improve lives globally, exemplified by a rural Indian farmer's use of GPT-3.5.

10:02

💻 Introducing Copilot+ PCs and Windows AI Integration

The speaker announces the introduction of Copilot+ PCs, highlighting their AI-first design and the integration of AI as a first-class namespace for Windows. They discuss the Windows Copilot Runtime and its significance for building AI applications, as well as the new Windows Copilot library that facilitates AI integration with local APIs. The paragraph also mentions the introduction of native PyTorch support and the WebNN framework through Windows DirectML.

15:04

🌐 Expanding AI Infrastructure and Sustainability

The speaker outlines the expansion of Microsoft's AI infrastructure, emphasizing the company's commitment to a sustainable, scalable, and comprehensive global presence. They discuss the optimization of power and efficiency, the use of AI accelerators, and collaborations with industry leaders like NVIDIA and AMD. The paragraph also touches on the integration of enterprise platforms like Omniverse Cloud and the strategic partnership with OpenAI.

20:04

🛠️ Democratizing AI with Diverse Models and Tools

This paragraph focuses on the democratization of AI through a broad selection of models and tools. The speaker mentions partnerships with various AI model providers and the integration of open-source models via Azure AI. They also discuss the introduction of small language models (SLMs) like the Phi-3 family and their cost-effectiveness and performance, as well as the new partnership with Khan Academy to enhance math tutoring.

25:04

🛡️ Prioritizing AI Safety and Customization

The speaker discusses the importance of AI safety and the new capabilities in Azure AI Studio for developing and safeguarding AI models responsibly. They announce the general availability of Azure AI Studio and introduce Azure AI custom models for training domain-specific models. The paragraph also covers the integration of AI with data platforms and the introduction of real-time intelligence in Microsoft Fabric.

30:05

🔧 Enhancing Developer Experience with GitHub Copilot Extensions

The speaker introduces GitHub Copilot Extensions, which allow for customization of GitHub Copilot with third-party services. They demonstrate how Copilot can assist with coding and non-coding tasks, such as gathering requirements and creating plans, by staying in the flow across the entire development process. The paragraph also showcases how developers can create extensions for any tool in their stack.

35:06

🤝 Empowering Teams with Team Copilot and Security

The speaker announces Team Copilot, an extension of the personal assistant concept to team collaboration. They discuss how Team Copilot can facilitate meetings, manage projects, and serve as a team assistant across various platforms. The paragraph also emphasizes the security measures underlying Microsoft's approach to AI, including the Secure Future Initiative and its core design principles.

40:06

🎨 AI for a More Accessible World

The speaker concludes with an inspiring example of how AI is being used to create audio descriptions for the visually impaired, making media and art more accessible. They highlight the impact of AI in strengthening culture and shared humanity by providing audio descriptions for art galleries and advertising campaigns, showcasing the power of AI to enhance inclusivity and experience.

Keywords

💡AI Transformation

AI Transformation refers to the shift in industries and processes due to the integration of artificial intelligence. In the video, it is the overarching theme where AI is depicted as a catalyst for change, enabling access to expertise and enhancing productivity across various sectors such as farming, education, and business.

💡GitHub Copilot

GitHub Copilot is an AI-powered development tool that assists in coding by predicting what a programmer intends to write next. The script mentions it as a time-saving tool, allowing developers to save roughly 50% of their time, which can then be channeled into other innovative endeavors.

💡Personal Assistant

A personal assistant in the context of the video represents AI systems that provide personalized support to individuals, saving time and enhancing productivity. The script illustrates this with examples such as helping to manage ponds or learn new skills like banking, showcasing the versatility of AI in personal use cases.

💡Generative AI

Generative AI is a subset of AI that can create new content, such as text, images, or music. The video emphasizes its role in learning from data to improve productivity, like helping farmers, and in unlocking creativity, as in the case of generating detailed descriptions for artwork creation.

💡Expertise at Your Fingertips

This concept in the video represents the ease of access to information and knowledge through AI, allowing individuals to make informed decisions quickly. It is exemplified by the script's mention of developers being able to optimize their work with AI and teachers creating lesson plans tailored to needs.

💡AI-first PCs

AI-first PCs, as introduced in the script, are personal computers designed with AI capabilities at their core. They are built to leverage AI for various tasks, signifying a new category of devices that are optimized for AI applications and experiences.

💡Azure

Azure, in the context of the video, refers to Microsoft's cloud computing service that provides a range of AI infrastructure and services. It is highlighted as a platform that supports AI applications, from training to inference, and is integral to the deployment and scaling of AI models.

💡Moore's Law

Moore's Law is a principle observed in the evolution of computing hardware, predicting the doubling of transistor count and computing power every couple of years. The video script refers to it as a historical benchmark for progress, comparing it to the current rapid scaling laws of deep neural networks (DNNs).

💡Distributed Synchronous Data Parallel Workloads

This term from the script describes a type of computing workload where data is processed in parallel across multiple distributed systems, synchronized to handle large-scale AI tasks. It is indicative of the changes in tech stack design influenced by AI and machine learning requirements.
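
As a concrete, simplified illustration of this idea (not code shown in the keynote), the sketch below wraps a PyTorch model in DistributedDataParallel: each process trains on its own shard of the data, and gradients are synchronized across all processes before every optimizer step. It assumes a standard torchrun launch on NVIDIA GPUs; every name in it is illustrative.

```python
# Illustrative only; assumes PyTorch with CUDA and a launch such as:
#   torchrun --nproc_per_node=<num_gpus> ddp_sketch.py
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    dist.init_process_group("nccl")            # join all processes into one synchronized job
    rank = int(os.environ["LOCAL_RANK"])       # set by torchrun, one process per GPU
    torch.cuda.set_device(rank)

    model = DDP(torch.nn.Linear(1024, 1024).cuda(rank), device_ids=[rank])
    opt = torch.optim.SGD(model.parameters(), lr=1e-3)

    for _ in range(10):
        x = torch.randn(32, 1024, device=f"cuda:{rank}")  # each rank trains on its own data shard
        loss = model(x).square().mean()
        loss.backward()                        # gradients are all-reduced across ranks here
        opt.step()
        opt.zero_grad()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```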

💡Natural User Interface

A natural user interface in the video is an interface that supports multiple modes of interaction like text, speech, images, and video. It represents the evolution of how humans interact with machines, moving towards more intuitive and human-like communication methods, facilitated by advances in AI.

💡Democratization of AI

The democratization of AI as mentioned in the script refers to making AI technology accessible to everyone, regardless of their location or background. It is illustrated through examples of developers worldwide using AI to solve local problems and create impactful solutions.

Highlights

The vision of 'information at your fingertips' has evolved into 'access to expertise' in the new era of AI.

AI is transforming various sectors including farming, labs, business, and education by providing personalized assistance and tutoring.

GitHub Copilot is reported to save developers about 50% of their time, enabling them to focus on innovation.

AI's potential to reimagine student learning worldwide through personalized tutoring is highlighted.

Generative AI is aiding farmers by learning from data to improve productivity.

AI is unlocking creativity, providing detailed descriptions that inspire artwork creation.

Teachers can now create lesson plans tailored to needs with AI providing expertise at their fingertips.

Microsoft Build conference marks a fundamental change in the tech industry, with every layer of the tech stack being reshaped by AI.

Breakthroughs in AI have been driven by scaling laws similar to Moore's Law, but with more rapid advancements every six months.

New natural user interfaces support multimodal input and output, including text, speech, images, and video.

Developers are leveraging AI capabilities to make a global impact, as demonstrated by the use of GPT-3.5 in rural India.

Microsoft introduced Copilot+ PCs, the fastest AI-first PCs, and the Windows Copilot Runtime to support AI application development.

Azure's AI infrastructure is expanding globally, with a focus on sustainability and efficiency.

Microsoft is partnering with NVIDIA to offer Azure confidential compute on GPUs and integrate NVIDIA's enterprise platforms with Azure.

AMD's MI300X AI accelerator is now available in Azure, offering the best price-performance ratio for GPT-4 inference.

Microsoft Fabric is introducing Real-Time Intelligence, enabling actionable insights across the entire data estate.

GitHub Copilot Extensions allow for customization with third-party services, enhancing the developer experience.

Team Copilot is announced, extending assistance to team collaborations within Microsoft Teams and other platforms.

Copilot Studio enables the creation of custom AI agents that can work independently and manage tasks asynchronously.

Microsoft's Secure Future Initiative underpins the security of all AI and developer tools, ensuring they are secure by design, default, and operation.

The impact of AI in making the world more accessible, such as through audio descriptions for the visually impaired, is showcased.

Transcripts

play00:06

I think that our industry

play00:08

has to have a common vision.

play00:10

It was a time that connected us

play00:13

to incredible things.

play00:14

My name for this vision is,

play00:17

information at your fingertips.

play00:21

And three decades later,

play00:24

we find ourselves in a new era.

play00:29

One where access to information

play00:31

becomes access to expertise.

play00:34

From the farm, to the lab,

play00:38

from the boardroom, to the classroom,

play00:40

this new generation of AI

play00:43

is for everyone, everywhere.

play00:47

Now, anyone can save time

play00:49

with a personal assistant.

play00:51

With GitHub Copilot,

play00:52

I’m saving about 50% of time.

play00:54

And that's time that I can use

play00:55

to do other innovative things.

play00:58

It allows me to find out

play01:00

the condition of my ponds faster.

play01:03

Anyone can access

play01:05

a personal tutor to learn new skills.

play01:08

We got to learn about banking:

play01:11

How to apply for a loan,

play01:13

how to save money.

play01:14

We learned so much.

play01:16

I think this technology

play01:17

has the potential to completely reimagine

play01:18

the way every single student learns in the world.

play01:21

This is a new way to analyze

play01:23

with a personal coach.

play01:25

We're going to be able to have

play01:26

not only productivity gains,

play01:28

but insights served to us, near real-time.

play01:30

Generative AI can learn from the data

play01:33

to help improve the farmer productivity.

play01:40

AI is unlocking creativity for us all.

play01:44

Descriptions are so detailed,

play01:46

in my imagination

play01:47

I can paint the artwork.

play01:50

Now teachers are free to create

play01:52

lesson plans according to our needs.

play01:56

With expertise at your fingertips.

play01:59

You can build,

play02:01

what matters.

play02:06

Welcome to the age of AI transformation.

play02:17

Good morning.

play02:24

Good morning.

play02:24

It's fantastic to be back here at Microsoft Build.

play02:28

Welcome to everyone here and joining us on the web.

play02:33

You know, developer conferences

play02:35

are always most exciting, most fun

play02:39

when there's these fundamental changes

play02:42

that you can sense in the air.

play02:44

You know, I've marked all my adult life

play02:47

by coming to PDCs and Builds

play02:50

for the last three decades.

play02:52

I still remember,

play02:54

you know, distinctly

play02:55

the first time Win32 was discussed,

play02:58

I guess it was β€˜91,

play03:00

.NET, Azure, right?

play03:03

These are moments that I marked my life with.

play03:07

And

play03:07

it just feels like we're yet again

play03:10

at a moment like that.

play03:12

It's just that

play03:13

the scale,

play03:14

the scope is so much deeper,

play03:17

so much broader this time around, right?

play03:19

Every layer of this tech stack is changing,

play03:22

you know, from everything from the power draw

play03:25

and the cooling layer of the data center

play03:27

to the NPUs at the Edge

play03:30

are being shaped by these new workloads, right?

play03:33

These distributed,

play03:35

synchronous, data parallel workloads

play03:38

are reshaping every

play03:40

layer of the tech stack.

play03:43

But if you think about

play03:45

even going all the way back

play03:47

to the beginning of modern

play03:48

computing, say, 70 years ago

play03:51

there have been two real dreams we've had.

play03:55

First

play03:56

is can computers understand us instead of us

play04:00

having to understand computers?

play04:03

And second,

play04:04

in a world

play04:05

where we have this ever increasing information

play04:09

of people, places and things, right?

play04:12

So, as you digitize more artifacts

play04:14

from people, places and things

play04:16

and you have more information,

play04:18

can computers

play04:20

help us

play04:21

reason, plan

play04:23

and act more effectively on all that information?

play04:26

Those are the two dreams that we've had

play04:28

for the last 70-plus years.

play04:30

And here we are.

play04:31

I think that we have real

play04:33

breakthroughs on both fronts.

play04:37

The core underlying force,

play04:39

one of the questions I always ask myself is like,

play04:41

β€œOkay, this is great.

play04:42

This is like maybe the golden age of systems.

play04:44

What's really driving it?”

play04:47

I always come back to these scaling laws,

play04:49

just like Moore's Law,

play04:50

you know, helped drive the information revolution.

play04:53

The scaling laws of DNNs

play04:56

are really, along with the model architecture,

play04:59

interesting ways to use data,

play05:01

generate data,

play05:02

are really driving this intelligence revolution.

play05:06

You could say Moore's Law was probably,

play05:09

you know, more stable in the sense

play05:10

that it was scaling at maybe 15 months, 18 months.

play05:14

We now have these things that are scaling

play05:16

every six months or doubling every six months.

play05:20

You know, what we have, though,

play05:22

with the effect of these scaling

play05:24

laws is a new natural user interface that's multimodal.

play05:27

That means supports text,

play05:28

speech, images, video as input and output.

play05:31

We have memory

play05:33

that retains important

play05:34

context, recalls

play05:36

both our personal knowledge and data

play05:37

across our apps and devices.

play05:40

We have new reasoning and planning capabilities

play05:43

that helps us understand very complex context

play05:46

and complete complex tasks.

play05:49

While reducing the cognitive load on us.

play05:52

But what stands out for me

play05:55

as I look back at this past year

play05:58

is how you all as developers have taken

play06:03

all of these capabilities

play06:04

and applied them, quite frankly,

play06:06

to change the world around us.

play06:09

I’ll always remember this moment in January 2023

play06:13

when I met a rural Indian farmer

play06:15

who was able to reason over some government

play06:17

farm subsidies

play06:18

that he had heard about on television

play06:20

using GPT-3.5 and his voice.

play06:23

It was remarkable right?

play06:25

For me, it just brought home the power of all of this

play06:28

because a frontier model

play06:30

developed in the West Coast of the United States

play06:32

just a few months

play06:33

earlier was used by a developer in India

play06:36

to directly improve the life of a rural Indian farmer.

play06:40

The rate of diffusion

play06:43

is unlike anything

play06:44

I've seen in my professional career,

play06:46

and it's just increasing.

play06:48

In fact, earlier this month I was in Southeast Asia.

play06:50

I was in Thailand where I met a developer

play06:52

and I was having a great roundtable

play06:54

and he was talking to me

play06:55

about how he's using Phi-3 and GPT-4

play06:58

and he was using Phi-3

play06:59

to just optimize

play07:01

all of the things that he was doing with RAG.

play07:03

I mean, this is crazy I mean, this is unbelievable.

play07:06

It had just launched a few weeks earlier

play07:08

and I was there in Thailand, in Bangkok,

play07:10

listening to a developer talk

play07:12

about this technology as a real expert on it.

play07:14

So it's just great to see the democratization force

play07:18

that we love to talk about

play07:19

but to witness it is just been something.

play07:23

And this is, quite frankly,

play07:24

the impact of why we are in this industry.

play07:28

And it's what gives us,

play07:29

I would say that deep meaning in our work.

play07:31

So I want to start, though, with a very big thank you

play07:36

to every one of you

play07:38

who is really going about bringing

play07:39

about this impact to the world.

play07:42

Thank you all so very much.

play07:50

You know, when I think about

play07:52

what progress we've made

play07:54

even since last time we were here at Build,

play07:56

we built really three platforms.

play07:58

The first is Microsoft Copilot,

play07:59

which is your everyday AI companion.

play08:02

It puts knowledge and expertise

play08:03

at your fingertips, helps you act on it.

play08:06

And we built the Copilot stack

play08:08

so that you can build your AI applications

play08:10

and solutions and experiences.

play08:12

And just yesterday,

play08:13

we introduced a new category of

play08:15

Copilot+PCs, the fastest AI-first PCs ever built.

play08:21

All three of these things are exciting platforms

play08:25

but I want to start with Copilot+ PCs.

play08:28

You know, we're exposing AI

play08:30

as a first-class namespace for Windows.

play08:35

This week

play08:35

we are introducing the Windows Copilot Runtime

play08:38

to make Windows the best platform

play08:40

for you to be able to build your AI applications.

play08:44

Yeah.

play08:50

You know what Win32 was to graphical user interface,

play08:55

we believe the Windows Copilot Runtime

play08:57

will be for AI.

play08:59

It starts with our Windows Copilot library,

play09:02

a collection of these ready-to-use local APIs

play09:06

that help you integrate into your new experiences

play09:09

all of the AI capabilities that we shared yesterday.

play09:13

Now, this includes

play09:15

no code integrations for Studio Effects

play09:18

things like creative filters,

play09:19

teleprompter, voice focus, and much more.

play09:23

But of course, if you want to access these models itself,

play09:26

you can directly call them through APIs.

play09:28

We have 40 plus models available

play09:31

out of the box, including Phi-Silica

play09:34

our newest member

play09:35

of our small language

play09:36

model family,

play09:38

which we specifically designed to run

play09:40

locally on your NPUs, on Copilot+ PCs

play09:44

bringing that

play09:45

lightning-fast local inference to the device.

play09:48

You know, the other thing is the Copilot library

play09:50

also makes it easy for you to incorporate RAG

play09:52

inside of your applications using

play09:55

on-device data.

play09:56

It gives you the right tools to build

play09:58

a vector store within your app.

play10:01

It enables you to do that

play10:02

semantic search that you saw with Recall.

play10:04

But now you can, in your own application, construct

play10:08

these prompts using local data for RAG applications.
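
The Windows Copilot Library APIs for this are not shown in the keynote, but the on-device RAG pattern being described (embed local documents into a small vector store, retrieve the closest chunks, and build a grounded prompt) can be sketched generically. The Python below is a minimal, framework-agnostic illustration; the embed() function is a hypothetical stand-in for whatever local embedding model you would run on the device, not a Windows API.

```python
# Generic on-device RAG sketch; embed() is a hypothetical stand-in for a local embedding model.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Toy embedding for illustration only: hash bytes into a fixed-size, normalized vector."""
    vec = np.zeros(128)
    for i, b in enumerate(text.encode("utf-8")):
        vec[i % 128] += b
    return vec / (np.linalg.norm(vec) + 1e-9)

class LocalVectorStore:
    """Tiny in-memory vector store over local documents, searched by cosine similarity."""
    def __init__(self):
        self.texts, self.vectors = [], []

    def add(self, text: str) -> None:
        self.texts.append(text)
        self.vectors.append(embed(text))

    def search(self, query: str, k: int = 3) -> list[str]:
        q = embed(query)
        scores = [float(q @ v) for v in self.vectors]
        top = np.argsort(scores)[::-1][:k]
        return [self.texts[i] for i in top]

store = LocalVectorStore()
store.add("Meeting notes: the Q3 report is due Friday.")
store.add("Recipe: tomato soup with basil and garlic.")

question = "When is the report due?"
context = "\n".join(store.search(question, k=1))
prompt = f"Answer using only this local context:\n{context}\n\nQuestion: {question}"
print(prompt)  # this grounded prompt is what you would hand to a local or cloud model
```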

play10:13

Now, I’m so thrilled to announce as well today

play10:17

that we will be natively supporting

play10:19

PyTorch and new WebNN framework

play10:22

through Windows DirectML.

play10:29

Native PyTorch support means

play10:31

thousands of OSS models

play10:33

will just work out of the box on Windows,

play10:36

making it easy for you to get started.

play10:39

In fact, with WebNN, web developers

play10:41

finally have a web-native machine learning framework

play10:45

that gives them direct access

play10:46

to both GPUs and NPUs

play10:48

in fact, last night

play10:49

I was playing with it, turning it on in Edge

play10:51

and seeing the WebNN sample code running.

play10:55

It's just so cool to see it

play10:56

you know, now use even the NPUs.

play10:58

Both PyTorch and WebNN are available

play11:02

in Developer Preview today, let's take a look.
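
For a rough sense of what PyTorch on Windows via DirectML looks like in practice, the existing torch-directml plugin already exposes DirectML as a PyTorch device, as sketched below. Whether the developer preview announced here uses exactly this package is an assumption; treat the snippet as an illustration, not the announced API.

```python
# pip install torch torch-directml   (existing PyTorch-on-DirectML plugin; assumed, see note above)
import torch
import torch_directml

dml = torch_directml.device()           # default DirectML device on this Windows machine
x = torch.randn(1024, 1024, device=dml)
w = torch.randn(1024, 1024, device=dml)
y = torch.relu(x @ w)                   # matmul + ReLU execute on the DirectML device
print(y.shape, y.device)
```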

play12:06

And these are just

play12:09

one of the many announcements today.

play12:11

We're introducing more than 50-plus

play12:13

new products and partnerships

play12:15

to create a new opportunity for you.

play12:18

We’ve always been a platform company

play12:20

and our goal is to build the most complete

play12:22

end-to-end stack

play12:23

from infrastructure,

play12:25

to data, to tooling to the application extensibility

play12:29

so that you can apply the power of this technology

play12:33

to build your own applications.

play12:35

And so today I want to highlight

play12:36

our top news for this event

play12:39

across every layer of this Copilot stack.

play12:42

So let's dive right in with infrastructure.

play12:46

You know, we have the most complete

play12:47

scalable AI infrastructure

play12:49

that meets your needs in this AI era.

play12:52

We're building Azure as the world's computer.

play12:54

We have the most comprehensive global

play12:57

infrastructure with more than 60-plus

play12:59

datacenter regions,

play13:00

more than any other cloud provider.

play13:02

Over the past year,

play13:03

we have expanded our datacenter regions

play13:06

and AI capacity from Japan

play13:08

to Mexico, from Spain to Wisconsin.

play13:11

We're making our best-in-class AI

play13:13

infrastructure available everywhere

play13:15

and we're doing this with a focus

play13:17

on delivering on cloud services sustainability.

play13:21

In fact,

play13:22

we're on track to meet our goal

play13:24

to have our data centers powered

play13:26

by 100% renewable energy by next year.

play13:30

Yeah.

play13:35

You know,

play13:36

we’re optimizing power

play13:38

and efficiency across every layer of the stack

play13:41

from the data center to the network.

play13:43

Our latest data center

play13:45

designs are purpose built for these AI workloads

play13:48

so that we can effectively and responsibly use

play13:50

every megawatt of power to drive down the cost of AI

play13:56

and the power draw.

play13:57

And we are incorporating

play13:58

advanced data center cooling techniques

play14:01

to fit the thermal profile of the workloads

play14:04

and match it to the environment

play14:06

in the location where it operates.

play14:09

And at the silicon layer,

play14:11

we are dynamically

play14:12

able to map workloads

play14:14

to the best accelerated AI hardware

play14:17

so that we have the best performance.

play14:19

And our custom IO hardware and server

play14:22

designs allow us to provide dramatically faster

play14:25

networking, remote storage

play14:27

and local storage throughput.

play14:29

You know, this

play14:30

end-to-end approach

play14:33

is really helping us get to the unprecedented scale.

play14:36

In fact, last November

play14:37

we announced the most powerful

play14:39

AI supercomputer in the cloud

play14:41

for training.

play14:42

Using just actually a very small fraction

play14:44

of our cloud infrastructure.

play14:46

And over the past six months

play14:48

we've added 30 times that

play14:51

supercomputing power to Azure.

play14:53

Yeah, it's crazy to see the scale.

play14:57

And of course we're not just scaling our training fleets,

play15:01

we’re scaling our inference fleet

play15:03

around the world, quadrupling the number of countries

play15:07

where Azure AI services are available today

play15:09

and it's great to see that.

play15:13

At the heart of our AI infrastructure

play15:17

are the world's most advanced AI accelerators, right?

play15:20

We offer the most complete selection

play15:22

of AI accelerators, including from NVIDIA and AMD,

play15:26

as well as our own Azure Maia,

play15:28

all dynamically optimized for the workloads.

play15:32

That means whether you're using Microsoft Copilot

play15:35

or building your own Copilot apps,

play15:37

we ensure that you get

play15:38

the best accelerator performance at the best cost.

play15:42

For example, you know,

play15:43

you see this in what has happened with GPT-4, right?

play15:46

It's 12x cheaper

play15:48

and 6x faster since it launched.

play15:50

And that's,

play15:51

you know,

play15:52

the type of progress

play15:53

you can continue to see,

play15:55

how, you know, you continue to see the progress

play15:57

as we evolve the system architecture.

play16:00

It all starts, though,

play16:02

with this very deep, deep partnership with NVIDIA,

play16:05

which spans the entirety of the Copilot

play16:08

stack across

play16:09

both all of their hardware innovation

play16:11

as well as their system software innovation.

play16:13

Together, we offer Azure

play16:16

confidential compute on GPUs to

play16:19

really help you protect sensitive data

play16:22

around the AI models end to end.

play16:24

We're bringing

play16:25

in fact the latest H200s

play16:27

to Azure later this year,

play16:29

and will be among the first cloud providers

play16:32

to offer NVIDIA's

play16:33

Blackwell GPUs in B100

play16:35

as well as GB200 configurations.

play16:38

And we are continuing

play16:41

to work with them to train and optimize

play16:43

both large language models

play16:46

like GPT-4o, as well as small language

play16:48

models like the Phi-3 family.

play16:51

Now beyond the hardware,

play16:54

we are bringing NVIDIA’s key

play16:55

enterprise platform offerings to our cloud,

play16:58

like the Omniverse Cloud

play16:59

and DGX Cloud to Azure

play17:01

with deep integration

play17:03

with even the broader Microsoft Cloud.

play17:05

For example,

play17:06

NVIDIA recently announced

play17:08

that their DGX Cloud integrates

play17:09

natively with Microsoft Fabric.

play17:11

That means you can train those models using

play17:14

DGX Cloud with full access to Fabric data.

play17:18

And Omniverse APIs will be available

play17:20

first on Azure for developers

play17:22

to build their industrial AI solutions.

play17:25

We're also working

play17:25

with NVIDIA’s NIM industry

play17:27

specific developer services

play17:29

and making them fantastic on Azure.

play17:31

So, a lot of exciting work with NVIDIA.

play17:35

Now, coming to AMD,

play17:38

I am really excited to share

play17:40

that we are the first cloud

play17:42

to deliver

play17:42

general availability of VMs based on AMD’s MI300X

play17:47

AI accelerator.

play17:54

It's a big milestone for both AMD and Microsoft.

play17:57

We've been working at it for a while

play17:59

and it's great to see that today

play18:01

as we speak, it offers the best price performance

play18:03

on GPT-4 inference.

play18:06

And we'll continue to move forward with Azure Maia.

play18:09

In fact, our first clusters are live and soon

play18:12

if you're using Copilot

play18:13

or one of the Azure OpenAI services,

play18:15

some of your prompts will be served

play18:18

using Maia hardware.

play18:20

Now beyond AI, our end to end systems

play18:24

optimization also makes cloud-native apps

play18:28

and the development of cloud-native

play18:30

apps better, right?

play18:31

Six months ago

play18:32

is when we announced our first general purpose

play18:35

ARM-based compute processor Microsoft Cobalt.

play18:38

And today

play18:39

I am really excited to announce

play18:41

the public preview of Cobalt-based VMs.

play18:48

You know,

play18:49

Cobalt is being used for video processing

play18:52

and permissions management in Microsoft 365,

play18:55

helping power billions of conversations

play18:58

on services like Microsoft Teams already.

play19:00

And we are delivering that same ARM-based

play19:03

performance and efficiencies to many customers,

play19:05

in fact, including

play19:06

Elastic, Mongo, Siemens, Snowflake and Teradata.

play19:11

In our most recent benchmark

play19:13

data and tests, our Cobalt 100 VMs delivered up to

play19:17

40 percent better performance

play19:19

than any other generally available ARM-based VMs.

play19:22

So we are very very excited about Cobalt

play19:24

getting into the market.

play19:26

Now let's move up the stack to the foundation models.

play19:30

With Azure AI,

play19:32

we offer the broadest selection

play19:33

of frontier and open source models,

play19:36

including LLMs and SLMs,

play19:38

so you can choose the model

play19:39

that makes the most sense for your unique needs

play19:41

and your application needs.

play19:42

In fact,

play19:43

more than 50,000 organizations use Azure AI today.

play19:48

Yeah.

play19:49

It's great momentum

play19:52

and it all starts

play19:54

though, with our most strategic

play19:56

and most important partnership with OpenAI.

play20:00

Just last week, OpenAI announced GPT-4o,

play20:03

their latest multimodal model,

play20:05

which was trained on Azure.

play20:07

It's an absolute breakthrough.

play20:09

It has text, audio, image and video as input and output.

play20:13

It can respond and just have a humanlike conversation

play20:17

that's fast and fluid.

play20:18

It can even be interrupted mid-sentence.

play20:21

GPT-4o is also the top performing model

play20:24

on benchmarks across a variety of modalities

play20:27

and it's always going to get you to the best answer.

play20:30

It has state-of-the-art performance

play20:32

at understanding the information

play20:34

you actually provide in your prompt

play20:35

because that's really what matters.

play20:38

What OpenAI demoed last week,

play20:41

I mean, that was just pretty magical stuff.

play20:43

And yesterday

play20:44

we showed how Copilot will leverage GPT-4o.

play20:49

In the video

play20:50

I'm just going to play, notice

play20:52

that you can share

play20:53

your screen or session as a prompt in Copilot

play20:58

and have Copilot help you in

play21:00

whatever it is that you're doing.

play21:02

Let's roll the video.

play21:07

Hey Copilot, how's it going?

play21:09

Hey, I'm doing great.

play21:11

And it looks like you're about to dive

play21:12

into some Minecraft.

play21:13

Ready to build, explore and maybe dodge a few mobs?

play21:17

Yeah, I'm trying to figure out how to make a sword.

play21:19

My son plays this game all the time

play21:20

and I have no idea what I'm doing.

play21:23

No worries. I'm here to help you impress your son.

play21:26

I see you've got a crafting table in front of you,

play21:28

which is a great start.

play21:30

To craft a sword

play21:31

you'll need some materials.

play21:32

Can you open your inventory?

play21:34

Just press β€œE” on your keyboard.

play21:36

Yeah. Do I have what I need?

play21:39

Let's see.

play21:40

You've got some sticks.

play21:41

Perfect for the sword's handle.

play21:43

But it looks like you're missing the material

play21:45

for the blade.

play21:46

You can use wood, stone, iron, gold or diamond.

play21:51

Let's start by gathering some wood or stone.

play21:53

Shall we head out and find some resources?

play21:56

Yeah, that sounds great.

play21:58

Oh my god, what's that?

play22:00

Oh no, it's a Zombie!

play22:01

Run! You need to get away from it fast.

play22:03

Either sprint away or quickly

play22:05

build a tower of blocks straight up

play22:06

to get out of its reach.

play22:08

If you can find a shelter

play22:09

or dig into the side of a hill, do it now.

play22:12

Just make sure it can't reach you.

play22:15

Alright, I think I might be safe now.

play22:19

That was a close one. Great job finding shelter.

play22:22

It's really cool to see that.

play22:29

You know,

play22:31

the magic here is that

play22:32

Copilot is your personal assistant

play22:34

that's helping you with any task, helping

play22:36

you stay, quite frankly, in the flow in the moment,

play22:39

whether it's a game or a document or a line of code.

play22:43

And as OpenAI innovates,

play22:45

our promise is that we will bring all that innovation

play22:48

to Azure too, in fact,

play22:49

the same day that OpenAI

play22:51

announced GPT-4o,

play22:52

we made the model available

play22:53

for testing on Azure OpenAI Service and today

play22:56

I'm excited to say

play22:58

that it’s generally available on Azure AI.
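
In developer terms, general availability on Azure AI means the model can be called through an Azure OpenAI deployment. The sketch below uses the openai Python SDK's AzureOpenAI client to send a text-plus-image prompt; the endpoint, API version, deployment name ("gpt-4o") and image file are placeholders you would replace with your own, and it assumes a deployment with multimodal input enabled.

```python
# pip install openai   (endpoint, key, API version, deployment name and image are placeholders)
import base64, os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",                            # placeholder; use the version your resource supports
)

image_b64 = base64.b64encode(open("screenshot.png", "rb").read()).decode()

response = client.chat.completions.create(
    model="gpt-4o",  # your Azure *deployment* name; "gpt-4o" is an assumption
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "What is on this screen, and what should I do next?"},
            {"type": "image_url", "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
        ],
    }],
)
print(response.choices[0].message.content)
```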

play23:07

You know, what this means though

play23:09

is that now we can have these groundbreaking apps

play23:12

that all of you can build using this capability.

play23:15

One of the coolest things is that now any app,

play23:18

any website, can essentially be turned

play23:21

into a full multi-modal,

play23:23

full duplex conversational canvas.

play23:26

Let's take a look.

play23:28

Alright, let me ask the agent here.

play23:31

Hey, Seth. How's it going?

play23:33

What can I help you with today?

play23:35

I am fantastic.

play23:36

I'm sorry, I'm in such a hurry.

play23:37

I'm going to go on an

play23:38

overnight camping hike,

play23:39

where we're camping -

play23:40

it's going to be dark, cold.

play23:42

Should I use these shoes?

play23:43

It's the mountains.

play23:44

It's probably going to be cold.

play23:45

Hey, no worries.

play23:47

Let me take a look.

play23:49

Ah, those are the TrekStar hiking sandals.

play23:52

While they're fantastic for summer hikes

play23:54

and keeping your feet cool,

play23:55

they might not be the best choice for cold -

play23:58

Okay, I'm sorry to interrupt you.

play23:59

Pick the best thing that I should use,

play24:01

and can you just add it to my cart?

play24:02

Just one product.

play24:04

No worries.

play24:05

Let's go with the TrekReady

play24:07

hiking boots.

play24:08

They're durable,

play24:09

provide great ankle support,

play24:11

and most importantly,

play24:12

they'll keep your feet warm

play24:13

in the cold mountains.

play24:15

I'll add them to your cart now.

play24:16

Ready for checkout?

play24:17

I see that, amazing.

play24:19

Thank you very much.

play24:20

I hope to have a good time camping.

play24:23

You’re welcome, Seth.

play24:24

Enjoy camping and stay warm in those boots.

play24:28

Good luck and have fun in the mountains.

play24:33

While Seth was,

play24:40

you know, Seth was in a hurry, but there you go.

play24:42

GPT-4o kept up with him.

play24:44

It's incredible.

play24:45

I really want to thank the OpenAI

play24:47

team for their partnership and

play24:49

and really the responsible approach to innovation,

play24:52

helping our industry move forward.

play24:54

Sam will be here in fact, joining Kevin

play24:56

in a little bit to talk a lot more about what's coming

play24:59

because that's the exciting stuff,

play25:00

how do you all sample what comes next.

play25:03

We also,

play25:05

I brought, we're bringing lots

play25:08

and lots of other models

play25:09

as well from Cohere and Databricks and Deci, Meta,

play25:13

Mistral, Snowflake, all through Azure AI.

play25:17

We want to support the broadest set of models

play25:19

from every country, every language.

play25:22

I'm excited to announce,

play25:23

in fact, we're bringing models from Cohere,

play25:25

G42, NTT DATA, Nixtla,

play25:28

as well as many more, as models as a service,

play25:31

because that's the way

play25:32

you can easily get to managed AI models.

play25:35

And we all love open source, too.

play25:38

In fact,

play25:38

two years ago at Build,

play25:40

we were the first to partner

play25:42

with Hugging Face, making it simple

play25:44

for you to access the leading open source library

play25:47

with state-of-the-art language models

play25:49

via Azure AI.

play25:50

And today I'm really excited to announce

play25:53

that we're expanding our partnership,

play25:54

bringing more models from Hugging Face

play25:56

with text generation inference,

play25:58

with text embedding inference

play25:59

directly into Azure AI Studio.

play26:07

And, and we're not stopping there.

play26:08

We are adding

play26:10

not just large language models,

play26:11

but we are also leading the small language revolution.

play26:15

So small language model revolution,

play26:17

you know, our Phi-3 family of SLMs

play26:20

are the most capable and most cost effective.

play26:23

They outperform models of the same size

play26:25

or the next size up

play26:26

even across

play26:27

a variety of language

play26:29

reasoning, coding, as well as math benchmarks.

play26:32

If you think about it by performance

play26:35

to parameter count ratio, it's truly best in class.

play26:38

And today we're adding new models

play26:41

to the Phi-3 family

play26:43

to add even more flexibility

play26:45

across that quality cost curve.

play26:47

We're introducing Phi-3 Vision,

play26:50

a 4.2 billion parameter

play26:52

multimodal model with language

play26:53

and vision capabilities.

play26:55

It can be used to reason over real-world images to

play26:59

generate insights and answer questions about images.

play27:02

As you can see right here. Yeah.

play27:07

And we're also making a 7 billion parameter

play27:10

Phi-3 small and a 14 billion parameter

play27:13

Phi-3 medium models available.

play27:16

With Phi,

play27:17

you can build apps that span the web,

play27:19

your Android, iOS, Windows and the Edge.

play27:23

They can take advantage of local hardware

play27:26

when available and fall back on the cloud.

play27:28

We're now simplifying really

play27:30

all of what

play27:31

we as developers have to do to support

play27:33

multiple platforms using one AI model.
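
As a minimal sketch of running a Phi-3 SLM locally with Hugging Face transformers (one plausible way to take advantage of local hardware), the snippet below loads the publicly listed microsoft/Phi-3-mini-4k-instruct checkpoint; the choice of checkpoint is an assumption, and older transformers releases may additionally need trust_remote_code=True.

```python
# pip install transformers torch accelerate
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"   # assumed checkpoint; -small/-medium/-vision variants also exist
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

messages = [{"role": "user", "content": "In two sentences, what is a small language model?"}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=120, do_sample=False)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```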

play27:36

Now, it's just awesome

play27:38

to see how many developers are already using

play27:41

Phi-3 to, you know, do incredible things.

play27:43

From Amity Solutions, the Thai company

play27:46

that I mentioned earlier,

play27:48

to ITC, which has

play27:51

built a copilot for Indian farmers

play27:53

to ask questions

play27:54

about their crops.

play27:55

Epic in health care

play27:57

which is now using Phi to summarize complex

play27:59

patient histories more quickly

play28:01

and efficiently.

play28:02

And one of the very,

play28:03

very cool use cases in education.

play28:06

Today, I'm very thrilled to announce

play28:08

a new partnership with Khan Academy.

play28:11

We'll be working together to use Phi-3

play28:14

to make math tutoring more accessible.

play28:16

And I'm also excited to share

play28:18

that they'll be making Khanmigo

play28:20

their AI assistant free to all US teachers.

play28:23

Let's roll the video.

play28:28

I felt like I was in a place in my teaching career

play28:32

where I felt like I was kind of losing my sparkle.

play28:35

And I would just feel really defeated

play28:38

when I looked out on the classroom

play28:39

and I would see students

play28:40

that just didn't look engaged.

play28:44

Teachers have an incredibly hard job

play28:45

and what we think we can do

play28:47

is leverage technology

play28:48

to take some of the

play28:49

stuff off of their plate,

play28:50

to really actually humanize the classroom.

play28:52

By some miracle, we became

play28:55

a Khanmigo pilot school.

play28:58

With new advances in generative AI,

play29:00

we launched Khanmigo.

play29:01

The point is to be that personalized tutor

play29:04

for every student

play29:05

and to be a teaching assistant for every teacher.

play29:09

I started to build these more robust lessons

play29:13

and I started to see my students engage.

play29:19

We're working with Microsoft

play29:21

on these Phi models

play29:22

that are specifically tuned for math tutoring.

play29:26

If we can make a small language model like Phi,

play29:29

work really well in that use case,

play29:30

then we would like to, kind of, shift the traffic to Phi

play29:34

in those particular scenarios.

play29:36

Using a small language model,

play29:38

the cost is a lot lower.

play29:42

We're really excited that Khanmigo,

play29:44

and especially in the partnership with Microsoft,

play29:46

being able to give these teacher tools

play29:49

for free, to U.S. teachers

play29:52

is going to make a dramatic impact

play29:53

on U.S. education.

play29:54

I think we're going to make them the innovators,

play29:57

the questioners, isn't that really

play30:00

just why you wake up every morning?

play30:02

Right? Because that's our future,

play30:03

our next generation.

play30:05

And to me, that's everything.

play30:14

You know, I’m super excited to see the impact

play30:17

this all will have and what Khan Academy will do.

play30:20

And Sal is going to, in fact,

play30:21

join Kevin soon to share more.

play30:24

And I'm really thankful for teachers

play30:26

like Melissa and everything that they do.

play30:28

Thank you very much.

play30:30

You know, of course,

play30:31

it's about more than just models.

play30:34

It's about the tools

play30:35

you need to build these experiences.

play30:39

With Azure AI Studio

play30:41

we provide an end-to-end

play30:42

tooling solution to develop and safeguard

play30:45

the copilot apps you build.

play30:48

We also provide tooling and guidance

play30:50

to evaluate your AI models

play30:52

and applications

play30:52

for performance and quality,

play30:54

which is one of the most important tasks

play30:56

as you can imagine with all of these models.

play30:59

And I'm excited to announce

play31:00

that Azure AI Studio now is generally available.

play31:09

It's an end to end

play31:11

development environment to build, train,

play31:13

and fine tune AI models – and do so responsibly.

play31:16

It includes built-in support

play31:19

For what is perhaps the most important feature,

play31:21

which is, in this age of AI,

play31:23

AI safety.

play31:25

Azure AI Studio

play31:26

includes the state of the art safety tooling.

play31:28

You know,

play31:29

for everything from detecting hallucinations

play31:31

in model outputs, to risk and safety monitoring.

play31:34

It helps understand

play31:35

which inputs and outputs are triggering

play31:38

content filters. Prompt shields, by the way,

play31:41

to detect and block these prompt injection attacks.

play31:44

And so today

play31:45

we are adding

play31:46

new capabilities, including custom categories,

play31:48

so that you can create these unique filters

play31:51

for prompts and completions

play31:52

with rapid

play31:53

deployment options,

play31:54

which I think is super important

play31:56

as you deploy these models into the real world.

play31:58

Even when an emerging threat, you know, appears.
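
The custom-categories and prompt-shield features described here are configured through Azure AI Studio and the Content Safety service rather than shown as code in the keynote; as a hedged illustration of the same kind of filtering, the sketch below screens a user prompt with the azure-ai-contentsafety Python SDK's built-in categories. The endpoint, key, and severity threshold are placeholders.

```python
# pip install azure-ai-contentsafety   (endpoint, key and threshold are placeholders)
import os
from azure.core.credentials import AzureKeyCredential
from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeTextOptions

client = ContentSafetyClient(
    endpoint=os.environ["CONTENT_SAFETY_ENDPOINT"],  # e.g. https://<resource>.cognitiveservices.azure.com
    credential=AzureKeyCredential(os.environ["CONTENT_SAFETY_KEY"]),
)

user_prompt = "Example user input to screen before it reaches the model."
result = client.analyze_text(AnalyzeTextOptions(text=user_prompt))

THRESHOLD = 2  # chosen severity cut-off for this sketch
flagged = [c for c in result.categories_analysis if (c.severity or 0) >= THRESHOLD]
if flagged:
    print("Blocked:", [(c.category, c.severity) for c in flagged])
else:
    print("Prompt passed the built-in content filters.")
```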

play32:02

Beyond Azure AI Studio,

play32:04

we recognize that there are advanced applications

play32:06

where you need

play32:07

much more customization

play32:09

of these models for very specific use cases.

play32:12

And today

play32:13

I'm really excited to announce that

play32:15

Azure AI custom models

play32:17

will come, giving you the ability

play32:19

to train a custom model

play32:20

that's unique to your domain, to your data, that's

play32:24

perhaps proprietary.

play32:25

The same builders and data scientists

play32:28

who’ve been working with

play32:29

OpenAI and brought all the Phi

play32:31

advances to you, will work

play32:33

with all of you to be able

play32:34

to build out these custom models.

play32:36

The output will be domain specific.

play32:38

It will be multitask

play32:40

and multimodal,

play32:41

best in class as defined by benchmarks,

play32:44

including perhaps even specific language proficiency

play32:47

that may be required.

play32:49

Now, let's just roll up the stack to data.

play32:53

Ultimately,

play32:55

in order to train

play32:56

fine-tune, ground your models,

play32:59

you need your data to be in its best shape.

play33:02

And to do so, we are building out the full data estate

play33:05

right from operational stores to analytics in Azure.

play33:10

We’ve also added

play33:11

AI capabilities

play33:12

to all of our operational stores,

play33:14

whether it's Cosmos DB or SQL, or PostgreSQL.

play33:17

At the core though,

play33:19

of the Intelligent Data Platform

play33:21

is Microsoft Fabric.

play33:22

We now have over 11,000 customers,

play33:26

including leaders in every industry who’re using Fabric.

play33:29

It's fantastic to see the progress.

play33:34

With Fabric,

play33:36

you get everything you need in a single integrated

play33:41

SaaS platform.

play33:42

It's deeply integrated at its most fundamental level

play33:45

with compute and storage being unified.

play33:48

Your experience is

play33:49

unified, governance is unified, and more importantly,

play33:52

the business model is unified.

play33:54

And what's also great about Fabric

play33:57

is that it works with data anywhere, right?

play33:59

Not just on Azure,

play34:00

but it can be on AWS or on GCP

play34:03

or even in your on-premise data center.

play34:05

And today we are taking the next step.

play34:08

We're introducing Real-Time Intelligence in Fabric.

play34:16

Customers today have

play34:18

more and more of this real-time

play34:19

data coming from your IoT systems,

play34:22

your telemetry systems. In fact, cloud

play34:25

applications themselves are generating lots of data,

play34:28

but with Fabric, anyone can unlock

play34:30

actionable insights across all of your data estate.

play34:34

Let's take a look.

play34:35

Introducing

play34:36

real-time intelligence

play34:38

in Microsoft Fabric,

play34:39

an end-to-end solution

play34:40

empowering you to get instant

play34:42

actionable insights

play34:43

on streaming data.

play34:44

At its heart lies

play34:45

a central place to discover,

play34:47

manage, and consume event data

play34:49

across your entire organization

play34:51

with a rich governed experience.

play34:54

Get started quickly

play34:55

by bringing in data

play34:56

from Microsoft sources

play34:57

and across clouds with a variety

play34:59

of out-of-the-box connectors.

play35:01

Route the relevant data to

play35:02

the right destination in Fabric

play35:04

using a simple

play35:05

drag-and-drop experience.

play35:08

Explore insights on petabytes

play35:10

of streaming data

play35:11

with just a few clicks.

play35:13

Elevate your analysis

play35:14

by harnessing the intelligence

play35:16

of Copilot in Microsoft Fabric

play35:18

using simple natural language.

play35:20

Make efficient business decisions

play35:22

in the moment,

play35:23

with real-time actionable insights

play35:25

and respond to changing landscapes proactively.

play35:29

Allow users to monitor

play35:30

the data they care about,

play35:31

detect changing patterns,

play35:33

and set alerts or actions

play35:34

that drive business value.

play35:36

All your data, all your teams,

play35:39

all in one place.

play35:41

This is Microsoft Fabric.

play35:48

And, we're making it

play35:50

even easier to design, build and interoperate

play35:54

with Fabric with your own applications, right?

play35:57

And in fact, we're building out a new app platform

play36:00

with Fabric Workload Development Kit

play36:02

so that people like ESRI, for example,

play36:05

you know, who have integrated

play36:06

their spatial analytics with Fabric

play36:08

so that customers can generate insights

play36:11

from their own location

play36:12

data using ESRI’s rich tools

play36:15

and libraries, right on Fabric, right.

play36:16

This is just exciting to see

play36:19

It's the first time,

play36:21

you know,

play36:22

where the analytics stack is really a first-class

play36:24

app platform as well.

play36:26

And beyond Fabric,

play36:27

we are integrating the power of AI across

play36:30

the entirety of the data stack.

play36:32

There's no question that RAG is core to any

play36:35

AI-powered application,

play36:36

especially in the enterprise today.

play36:38

And Azure AI Search makes it possible

play36:41

to run RAG at any scale,

play36:43

delivering very highly accurate responses

play36:45

using the state of the art retrieval systems.

play36:48

In fact, ChatGPT's support for GPTs, their Assistants

play36:54

API, are all powered by Azure AI Search today.

play36:58

And with built-in OneLake integration,

play37:01

Azure AI Search will automatically

play37:03

index your unstructured data too.

play37:06

And it's also integrated into Azure

play37:07

AI Studio to support

play37:08

bringing your own embedding model, for example.

play37:11

And so it's pretty incredible

play37:12

to see Azure Search grow over the last year

play37:14

into that very core developing service.
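
A compact sketch of the RAG pattern described here, using the azure-search-documents SDK to retrieve the top matching chunks and an Azure OpenAI chat completion to answer from them. The index name, the "content" field, the API version, and the "gpt-4o" deployment name are assumptions you would replace with your own.

```python
# pip install azure-search-documents openai   (index, field and deployment names are assumptions)
import os
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

search = SearchClient(
    endpoint=os.environ["SEARCH_ENDPOINT"],              # https://<service>.search.windows.net
    index_name="product-docs",                           # assumed index name
    credential=AzureKeyCredential(os.environ["SEARCH_KEY"]),
)
llm = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",                            # placeholder API version
)

question = "How do I rotate my API keys?"
hits = search.search(search_text=question, top=3)        # keyword search; hybrid/vector also possible
context = "\n\n".join(doc["content"] for doc in hits)    # "content" is an assumed field name

answer = llm.chat.completions.create(
    model="gpt-4o",                                       # assumed deployment name
    messages=[
        {"role": "system", "content": "Answer only from the provided context."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ],
)
print(answer.choices[0].message.content)
```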

Now let's go up the stack to developer tools. Nearly 50 years after our founding as a developer tools company, here we are, once again redefining software development. GitHub Copilot was the first, I would say, hit product of this generative AI age, and it's the most widely adopted AI developer tool: 1.8 million subscribers across 50,000 organizations are using it.

And with GitHub Copilot, we're empowering every developer on the planet to be able to access programming languages and programming knowledge in their own native language. Think about that. Any person can start programming, whether it's in Hindi or Brazilian Portuguese, and bring the joy of coding back to their native language.

And with Copilot Workspace, staying in your flow has never been easier. We are an order of magnitude closer to a world where any person can go from idea to code in an instant. You start with an issue; it creates a spec based on its deep understanding of your codebase. It then creates a plan, which you can execute to generate the code across the full repo, that is, multiple files. At every point in this process, from the issue, to spec, to plan, to code, you are in control and you can edit it. And that's really what is fundamentally a new way of building software. We are looking forward to making it much more broadly available in the coming months.

And today, we are taking one more big leap forward: we are bridging the broader developer tools and services ecosystem with Copilot for the first time. We are really thrilled to be announcing GitHub Copilot Extensions. Now you can customize GitHub Copilot with capabilities from third-party services, whether it's Docker, Sentry, and many, many more. And of course, we have a new extension for Azure too: GitHub Copilot for Azure. You can instantly deploy to Azure and get information about your Azure resources just using natural language. And what Copilot did for coding, we are now doing for infra and ops. To show you all of this in action, here is Neha from our GitHub team. Neha, take it away.

Thanks, Satya. GitHub Copilot gives you suggestions in your favorite editor, like here, where I'm writing unit tests. Copilot is great at meeting you where you are, regardless of the language you're most comfortable with. So let's ask for something simple, like how to write a prime number test in Java, but let's converse in Spanish using my voice. How do I check if a given number is a prime number in Java? Look at that. Thank you, Copilot. Copilot is great at turning natural language into code and back again.
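For reference, the kind of suggestion Copilot returns for that prompt looks roughly like the sketch below: a simple trial-division prime check plus a JUnit 5 test. The class and method names are illustrative, not taken from the demo.

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertTrue;

class PrimeCheckerTest {

    // Trial division: check divisors up to the square root of n.
    static boolean isPrime(int n) {
        if (n < 2) {
            return false;
        }
        for (int i = 2; (long) i * i <= n; i++) {
            if (n % i == 0) {
                return false;
            }
        }
        return true;
    }

    @Test
    void recognizesPrimes() {
        assertTrue(isPrime(2));
        assertTrue(isPrime(13));
        assertTrue(isPrime(97));
    }

    @Test
    void rejectsNonPrimes() {
        assertFalse(isPrime(1));
        assertFalse(isPrime(15));
        assertFalse(isPrime(100));
    }
}
```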

But what about beyond the code? With the new GitHub Copilot Extensions, you can now bring the context from your connected systems to you. So now I can ask Azure where my app is deployed. I could ask what my available Azure resources are, or I could diagnose issues with my environment. And this isn't just for Azure. As Satya announced, any developer can now create extensions for GitHub Copilot, and that includes any tool in your stack, including your in-house tools, keeping you in the flow across your entire day.

Actually, 75% of a developer's day is spent outside of coding: gathering requirements, writing specifications, and creating plans. Let's show how GitHub Copilot can help with that, live, on stage, for the first time.

So typically, my day starts by looking at GitHub issues. Looks like we want to support rich text input for our product description. Let's open Workspace and get some help with that. Copilot interprets the intent of the issue to see what's required. It then looks across the entire codebase and proposes what changes should be made. This specification is fully editable, and the whole process is iterative, but actually, this looks pretty good. Copilot can now help us build a plan for how to implement this change.

All right, that's a great start, but we must not forget about our documentation. So let's edit the plan and have Copilot update our readme. And then we can even get Copilot's help in starting to implement the code for us.

Now, this was just a simple example, but in a large enterprise codebase, there are tens of thousands of files and dozens of stakeholders involved. And that means meetings. So many meetings. Workspace helps you focus on what you need to change. And by the way, as a developer, I'm always in control. I can see exactly what changes Copilot is proposing, and I can even get a live preview.

All right, let's test out the input. This looks great. So I can go back and edit my code in VS Code, or I can submit these changes as a pull request to share with my team. GitHub Copilot, Copilot Extensions, and Copilot Workspace help you stay focused on solving problems and keep you in the flow. Back to you, Satya.

Thank you so much, Neha! I tell you, GitHub Copilot and everything that ecosystem is doing is just bringing a lot of fun and a lot of joy back to coding. And really, that feeling of staying in the flow is, I think, what we have all dreamt about, and it's coming back.

That brings us to the very top of the stack: Microsoft Copilot. We built Copilot so that you have the ability to tap into the world's knowledge, as well as the knowledge inside of your organization, and act on it. Now, Copilot has had a remarkable impact. It's democratizing expertise across organizations. It's having a real cascading effect. In fact, it reminds me of the very beginning of the PC era, when work, the work artifact, and the workflow were all changing. And it's really having a broad impact on enterprise business processes. It's lowering, I always say this, it's lowering the floor and raising the ceiling at the same time for anything any one of us can do.

Since no two business processes are the same, with Copilot Studio you can now extend and customize Copilot for your own business processes and workflows. Today we're introducing Copilot connectors in Copilot Studio, so you can ground Copilot with data from across the Microsoft Graph, Power Platform, Fabric, and Dataverse, as well as all the third-party connectors for SaaS applications from Adobe, Atlassian, ServiceNow, Snowflake, and many, many more. This makes grounding Copilot in first-party and third-party line-of-business data a wizard-like experience, enabling you to quickly incorporate your own organizational knowledge and data.

We're also extending Copilot beyond a personal assistant to become a team assistant. I'm thrilled today to announce Team Copilot. You'll be able to invoke Team Copilot wherever you collaborate: it can be in Teams, it can be in Loop, it can be in Planner, and many, many other places. Think about it: it can be your meeting facilitator when you're in Teams, creating agendas, tracking time, and taking notes for you. Or a collaborator, writing chats, surfacing the most important information, tracking action items, and addressing unresolved issues. And it can even be your project manager, ensuring that every project you're working on as a team is running smoothly. These capabilities will all come to you and be available in preview later this year.

And we're not stopping there. With Copilot Studio, anyone can build copilots that have agent capabilities and work on your behalf, independently and proactively orchestrating tasks for you. Simply provide your copilot a job description, or choose from one of our pre-made templates, equip it with the necessary knowledge and actions, and Copilot will work in the background and act asynchronously for you. That's, I think, one of the key things that's really going to change in the next year: you're going to have copilots plus agents with this async behavior. You can delegate authority to copilots to automate long-running business processes. Copilot can even ask for help when it encounters situations that it does not know much about and can't handle. And to show you all of this, let's roll the video.

Redefine business processes with Copilot Studio. Create copilots that act as agents working independently for you. Simply describe what you want your copilot to do. Easily configure your copilot with the details it needs, like instructions, triggers, knowledge, and actions. Quickly test your copilot before you deploy, and seamlessly publish across multiple channels. Watch it use memory for context, reason over user input, and manage long-running tasks. Copilot can learn from feedback to improve, and you're always in control. Put Copilot to work for you with Copilot Studio.

You know, all around this stack is perhaps one of the most important things that we at Microsoft are doing, which is wrapping it with robust security. Security underlies our approach with Copilot, Copilot+ PCs, and the Copilot stack. We're committed to our Secure Future Initiative, and you will see us make rapid progress across each of the six pillars of SFI and its core design principles: secure by design, secure by default, and secure operations. You'll hear about this throughout the conference, and a lot more in Scott's keynote tomorrow, about how it underlies everything that we build and everything that we do.

So, coming to the close: there are many announcements that you will hear about at Build, but I want to go back to what I think is the core of why we chose to be in this industry and why we come to work every day as developers, which is ultimately the mission of empowering every person and every organization. At the end of the day, it's not about innovation that is only useful for a few. It's about really being able to empower everyone, and it comes down to you all as developers and builders of this new world. For us, it's never, never about celebrating tech for tech's sake. It's about celebrating what we can do with technology to create magical experiences that make a real difference in our countries, in our companies, in our communities.

Already, this new generation of AI is having an incredible impact, thanks to all of you, the passion you bring and the hard work you put in. And I want to leave you with one unbelievable example of how you are all building a more accessible world, which means a lot to me, using our platform and tools. Thank you all so very much. Enjoy the rest of Build.

Audio description is something that enables me to watch a program or a film and get as much out of it as everybody else who is sighted.

A white car drives down a road. Hands on a steering wheel.

I see art as a collective good. I think everyone should be able to have access to art. Audio description really helps me get the full experience.

A portrait of a group of 17th-century civic guardsmen in Amsterdam.

The challenge, though, is that there are limited amounts of audio description being incorporated across media and entertainment. Tech and AI have the potential to bring the blind and low-vision community into the fold.

So at WPP, we really care passionately about opening up access to content to people in the way that they want to consume it. The tool that I've made is an application that allows you to upload videos, and on the other end, with GPT-4 with Vision and Azure AI services, you get your video back with spoken narrations over the top.

Kitchen scene with cat and Hellmann's mayonnaise.

This makes audio descriptions cheaper and faster. Our goal is to be able to offer this product as a service for all of our advertisement campaigns.

There are so many artworks in the Rijksmuseum, almost a million. To describe them ourselves would have taken hundreds of years. With AI, we can do this in a matter of hours.

The subject is a male with a reddish beard and mustache; visible brushstrokes add texture and mood.

The first time I heard audio descriptions, it just brought me delight. It was this opportunity of "Oh my gosh, I'm seen."

Through the power of AI, we're able to do things we only dreamt about until recently. When we strengthen our access to culture, we strengthen the culture itself, connecting our shared humanity.
