ChatGPT-4o Comes to Azure AI Studio

Microsoft DevRadio
23 May 2024 · 29:34

Summary

TLDR: This session walks through how Microsoft optimizes power and efficiency at every layer, from the latest data center designs to the network. Data centers purpose-built for AI workloads use power responsibly while driving down the cost and power draw of AI. At the silicon layer, workloads are dynamically mapped to the best AI hardware, and custom I/O hardware and server designs deliver dramatically faster networking, remote storage, and local storage. Azure also offers the most complete selection of AI accelerators to choose from, including Nvidia, AMD, and Microsoft's own Azure Maia. To keep the data needed for training and fine-tuning AI models in its best shape, Microsoft is building out a unified data platform around Microsoft Fabric. On the tooling side, GitHub Copilot assists with code generation, and the new GitHub Copilot Extensions make developers' day-to-day work even more efficient.

Takeaways

  • 🌟 Microsoft optimizes power and efficiency across every layer of the stack, from the data center to the network, with its latest data center designs purpose-built for AI workloads.
  • 💻 Azure announced the most powerful AI supercomputer in the cloud for training, built from only a small fraction of its cloud infrastructure, and has added 30 times that supercomputing power over the past six months.
  • 🔌 Azure has quadrupled the number of countries where its AI services are available, scaling its inference fleet as well as its training fleet.
  • 🚀 Azure offers the most complete selection of AI accelerators, including Nvidia, AMD, and its own Azure Maia, all dynamically optimized for the workloads.
  • 🤖 Azure has built a deep partnership with Nvidia, drawing on the entirety of its hardware and system software innovation to offer Azure Confidential Computing on GPUs.
  • 🎓 Together with AMD, Microsoft has made VMs based on the AMD MI300X AI accelerator generally available, offering the best price-performance for GPT-4 inference.
  • 📚 In the Azure OpenAI Service, Microsoft made GPT-4o, OpenAI's latest multimodal model, available for testing and has now made it generally available.
  • 🛠️ Azure AI Studio is now generally available, providing an end-to-end tooling solution to build, train, and fine-tune AI models.
  • 🔍 Azure AI Search runs RAG (Retrieval-Augmented Generation) at any scale, delivering highly accurate responses, and is integrated with Azure AI Studio.
  • 👨‍💻 GitHub Copilot is an AI developer tool that gives access to programming languages and knowledge, with more than 1.8 million subscribers across over 50,000 organizations.

Q & A

  • What does optimizing power and efficiency at every layer, from the data center to the network, involve?

    - The script describes data center designs purpose-built for AI workloads, advanced cooling techniques, dynamically mapping workloads to the best AI hardware at the silicon layer, and custom I/O hardware and server designs that provide faster networking, remote storage, and local storage.

  • What was the recent announcement about scaling up the AI supercomputer?

    - Last November, Microsoft announced the most powerful AI supercomputer in the cloud for training, and over the past six months it has added 30 times that supercomputing power to Azure.

  • How is the number of countries where Azure AI services are available increasing?

    - The script announces that Azure is quadrupling the number of countries where Azure AI services are available.

  • What choices of AI accelerators are offered?

    - The world's most advanced AI accelerators, including Nvidia, AMD, and Azure Maia, all dynamically optimized for the workloads.

  • What is Azure Confidential Computing on GPUs?

    - Azure Confidential Computing on GPUs, offered together with Nvidia, is designed to help protect sensitive data around AI models end to end.

  • What are Nvidia's Blackwell GPUs and the GB200 configuration?

    - The Blackwell GPUs (B100s) and GB200 configurations are Nvidia's latest GPUs, and Azure will be among the first cloud providers to offer them.

  • What major milestone was reached together with AMD?

    - The general availability of VMs based on the AMD MI300X AI accelerator, which offers the best price-performance for GPT-4 inference.

  • What is Azure Maia?

    - Azure Maia is Microsoft's custom AI accelerator; some prompts for Copilot and the Azure OpenAI services will be served using Maia hardware.

  • What was announced about Microsoft Cobalt-based VMs?

    - The public preview of Cobalt-based VMs was announced; Cobalt is already used for video processing and permissions management in Microsoft 365.

  • What does Azure AI Studio provide, and what are its benefits?

    - Azure AI Studio provides an end-to-end tooling solution to build, train, and fine-tune AI models, including the latest AI safety tooling.

  • What are the benefits of GitHub Copilot Extensions?

    - GitHub Copilot Extensions let you customize GitHub Copilot with capabilities from third-party services and Azure. Developers stay in the flow across their entire environment, using natural language for everything from writing code to infrastructure and operations.

  • What is the new partnership with Khan Academy?

    - The partnership uses Phi-3 to make math tutoring more accessible, and Khan Academy will make Khanmigo, its AI assistant, free to all US teachers, aiming for a major impact on education.

  • What are the main capabilities of Azure AI Search?

    - Azure AI Search runs RAG (Retrieval-Augmented Generation) at any scale, delivering highly accurate responses using state-of-the-art retrieval systems. It is integrated with Azure AI Studio and supports bringing your own embedding model.

  • What is Real-Time Intelligence in Microsoft Fabric?

    - Real-Time Intelligence in Microsoft Fabric is an end-to-end solution for managing event data across an organization and getting instant, actionable insights from streaming data. Users can monitor the data they care about, detect changing patterns, and set alerts or actions.

Outlines

00:00

🔌 Optimizing AI power and efficiency

Microsoft is optimizing power and efficiency across every layer, from the data center to the network, with its latest data center designs purpose-built for AI workloads. Using power effectively drives down the cost and power draw of AI. Data center cooling techniques are evolving to fit the thermal profile of the workloads and the environment where they operate. At the silicon layer, workloads are mapped to the best AI hardware to get maximum performance, while custom I/O hardware and server designs speed up networking, remote storage, and local storage. This end-to-end approach let Microsoft build the most powerful AI supercomputer in the cloud, announced last year, using only a small fraction of its infrastructure, and over the past six months it has added 30 times that supercomputing power to Azure. Beyond training, the inference fleet is also expanding, quadrupling the number of countries where Azure AI services are available. At the heart of the AI infrastructure are the world's most advanced AI accelerators, from Nvidia, AMD, and Azure Maia, all optimized for the workloads, so that whether you use Microsoft Copilot or build your own copilot apps, you get the best accelerator performance at the best cost.

05:02

🤖 Expanding Azure AI models and services

Azure AI offers a broad selection of frontier and open-source models, used by more than 50,000 organizations. Together with its key partner OpenAI, Microsoft trained GPT-4o, the latest multimodal model, on Azure. It takes text, audio, image, and video as input and output, can hold a human-like conversation, and ranks at the top of benchmarks. A video demo shows how a screen or session can be shared as a prompt in Copilot, which then assists with whatever task is at hand. GPT-4o can now be used from Azure AI Studio to build a wide range of applications, turning essentially any app or website into a full-duplex, multimodal conversational canvas.
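As a rough illustration of what calling GPT-4o through the Azure OpenAI Service looks like from code, here is a minimal Python sketch. The endpoint, API version, and deployment name are placeholders you would replace with your own resource's values; the keynote itself did not show this code.

```python
# Minimal sketch: calling a GPT-4o deployment on Azure OpenAI with text + image input.
# Endpoint, key, deployment name, and api_version are illustrative placeholders.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",                            # assumed version string
)

response = client.chat.completions.create(
    model="gpt-4o",  # the name of your GPT-4o deployment
    messages=[
        {"role": "system", "content": "You are a helpful shopping assistant."},
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Are these shoes a good choice for a cold mountain hike?"},
                {"type": "image_url", "image_url": {"url": "https://example.com/sandals.jpg"}},
            ],
        },
    ],
)

print(response.choices[0].message.content)
```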

10:02

👢 Choosing outdoor gear with AI

A scenario shows AI helping a shopper pick outdoor gear: in the video, the agent recommends suitable footwear for a cold mountain hike and adds it to the shopping cart. GPT-4o demonstrates that it can keep up with a hurried user and give quick, appropriate answers, and Microsoft emphasizes that it will keep delivering innovative AI in collaboration with the OpenAI team.
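The "add it to my cart" step in this demo is the kind of action an app would typically expose to the model as a tool. Below is a hedged Python sketch of that pattern using the chat completions tools parameter; the add_to_cart tool and product catalog are made up for illustration and are not taken from the demo.

```python
# Sketch: letting the model trigger an "add to cart" action via tool calling.
# The tool schema is hypothetical; only the API shape is real.
import json
from openai import AzureOpenAI

client = AzureOpenAI(azure_endpoint="https://<resource>.openai.azure.com",
                     api_key="<key>", api_version="2024-06-01")

tools = [{
    "type": "function",
    "function": {
        "name": "add_to_cart",
        "description": "Add a single product to the user's shopping cart.",
        "parameters": {
            "type": "object",
            "properties": {"product_name": {"type": "string"}},
            "required": ["product_name"],
        },
    },
}]

resp = client.chat.completions.create(
    model="gpt-4o",  # your deployment name
    messages=[{"role": "user", "content": "Pick the best boots for a cold hike and add them to my cart."}],
    tools=tools,
)

call = resp.choices[0].message.tool_calls[0]  # assumes the model chose to call the tool
if call.function.name == "add_to_cart":
    args = json.loads(call.function.arguments)
    print(f"Would add to cart: {args['product_name']}")  # your app executes the real action here
```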

15:02

🎓 AI in education

AI is being brought into education through a partnership with Khan Academy that applies it to math tutoring. Using small language models tuned for a specific domain is expected to have a big impact on education, and Khan Academy will make its AI assistant, Khanmigo, free to all US teachers.
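Part of why small language models fit cost-sensitive scenarios like tutoring is that they are small enough to run locally. A minimal sketch of local inference with the Hugging Face transformers library is below, assuming the microsoft/Phi-3-mini-4k-instruct checkpoint; the model id and generation settings are assumptions, not something shown in the keynote.

```python
# Sketch: running a Phi-3 small language model locally with Hugging Face transformers.
# Model id and generation settings are assumptions; requires torch (and accelerate for device_map).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", trust_remote_code=True)

messages = [{"role": "user", "content": "Explain how to factor x^2 + 5x + 6, step by step."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```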

20:02

🛠️ Strengthening Azure AI Studio and extending the data platform

Azure AI Studio is now generally available as an end-to-end tool for developing, training, and fine-tuning AI models, including the latest AI safety tooling. On the data side, Microsoft Fabric has been extended with real-time analytics capabilities that support in-the-moment business decisions.

25:05

💻 GitHub Copilot Extensions and evolving developer tools

GitHub Copilot has evolved as a developer tool, with 1.8 million subscribers across more than 50,000 organizations. It gives developers access to programming languages and knowledge, making it possible to code in their own native language. GitHub Copilot Extensions now let Copilot connect to third-party services and Azure, so developers can stay in their preferred editor and use natural language to do things like retrieve information about their Azure resources.
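Under the hood, a question like "what are my available Azure resources?" maps onto the kind of management-plane query shown in this hedged Python sketch using the Azure SDK. The Copilot for Azure extension's actual implementation is not public, so this only illustrates the underlying API.

```python
# Sketch: listing resource groups and resources with the Azure SDK for Python.
# This illustrates the kind of query a "Copilot for Azure" question resolves to;
# it is not the extension's actual implementation.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

subscription_id = "<your-subscription-id>"  # placeholder
client = ResourceManagementClient(DefaultAzureCredential(), subscription_id)

for group in client.resource_groups.list():
    print(f"Resource group: {group.name} ({group.location})")
    for res in client.resources.list_by_resource_group(group.name):
        print(f"  {res.type}: {res.name}")
```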


Keywords

💡AI workloads

AI workloads refer to the computing demands of running artificial intelligence algorithms. The video explains that data centers are evolving into designs purpose-built for these AI workloads, optimized to use power effectively and responsibly and to drive down the cost and power draw of AI.

💡Data center cooling techniques

Data center cooling techniques efficiently remove the heat generated by equipment inside a data center. The video explains that these techniques are adapted to the thermal profile of the workloads and to the environment and location where they operate.

💡AI accelerators

AI accelerators are specialized hardware for running AI models quickly and accurately. The video presents a selection optimized for the workloads, ranging from leading accelerators from Nvidia and AMD to Microsoft's own Azure Maia.

💡Azure

Azure is Microsoft's cloud computing platform. The video describes Azure providing AI supercomputers and AI services and expanding its global infrastructure.

💡Azure AI Studio

Azure AI Studio is an end-to-end toolchain for developing, training, and fine-tuning AI models. The video highlights that the platform also provides the latest tooling for AI safety.
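One piece of the safety tooling mentioned here is content filtering. As a rough sketch of what programmatic text screening looks like with the Azure AI Content Safety SDK (a related but separate service from the Studio UI), consider the following; the endpoint, key, and severity threshold are illustrative assumptions.

```python
# Sketch: screening a prompt with Azure AI Content Safety before sending it to a model.
# Endpoint/key are placeholders; a severity threshold of 2 is an arbitrary example.
from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeTextOptions
from azure.core.credentials import AzureKeyCredential

client = ContentSafetyClient("https://<resource>.cognitiveservices.azure.com",
                             AzureKeyCredential("<key>"))

result = client.analyze_text(AnalyzeTextOptions(text="User prompt to screen goes here"))

flagged = [c for c in result.categories_analysis if c.severity and c.severity >= 2]
if flagged:
    print("Blocked:", ", ".join(c.category for c in flagged))
else:
    print("Prompt passed the content filter.")
```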

💡OpenAI

OpenAI is an organization focused on artificial intelligence research and development. The video notes that the partnership between OpenAI and Microsoft is driving progress in AI, with GPT-4o, the latest multimodal model, trained on and made available through Azure.

💡GitHub Copilot

GitHub Copilot is a tool that uses AI to support developers as they write code. The video explains that it gives developers access to programming languages and development knowledge and plays a key role in supporting them.

💡Azure Confidential Computing

Azure Confidential Computing protects applications and data while they run in the cloud. The video notes that this technology runs on Nvidia GPUs and helps protect sensitive data around AI models.

💡Azure AI Search

Azure AI Search is the search service for AI-powered applications in the enterprise. The video describes it running RAG (Retrieval-Augmented Generation) at any scale and delivering highly accurate responses.
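To make the retrieve-then-generate flow described here concrete, below is a minimal RAG sketch with Azure AI Search and Azure OpenAI. Index name, field names, and deployment names are assumptions; a real application would also typically use vector or hybrid queries rather than plain keyword search.

```python
# Sketch: retrieval-augmented generation with Azure AI Search + Azure OpenAI.
# Index name, field names, and deployments are illustrative placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

search = SearchClient("https://<search>.search.windows.net", "product-docs",
                      AzureKeyCredential("<search-key>"))
llm = AzureOpenAI(azure_endpoint="https://<aoai>.openai.azure.com",
                  api_key="<aoai-key>", api_version="2024-06-01")

question = "Which boots are best for cold-weather hiking?"

# 1) Retrieve the top matching documents (keyword search; hybrid/vector is also possible).
docs = search.search(search_text=question, top=3)
context = "\n\n".join(d["content"] for d in docs)  # assumes a 'content' field in the index

# 2) Ground the model's answer in the retrieved context.
answer = llm.chat.completions.create(
    model="gpt-4o",  # your deployment name
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ],
)
print(answer.choices[0].message.content)
```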

💡GitHub Copilot Extensions

GitHub Copilot Extensions extend GitHub Copilot so it can connect to other services and tools. The video explains that this keeps developers working consistently across their entire development environment.

Highlights

The latest data center designs are purpose-built for AI workloads, using power efficiently while driving down cost.

Advanced data center cooling techniques are matched to the thermal profile of the workloads.

At the silicon layer, workloads are dynamically mapped to the best AI hardware for top performance.

Custom I/O hardware and server designs dramatically speed up networking, remote storage, and local storage.

Last November Microsoft announced the most powerful AI supercomputer in the cloud, and in six months it has grown that power 30-fold.

The number of countries where Azure AI services are available has quadrupled as the infrastructure scales.

Nvidia, AMD, and Azure Maia AI accelerators are dynamically optimized to deliver the best cost-performance.

Since GPT-4 launched, it has become 12x cheaper and 6x faster.

A deep partnership with Nvidia brings Azure Confidential Computing to GPUs, providing end-to-end data protection for AI models.

Nvidia's latest Blackwell GPUs, B100s and GB200 configurations, are coming to Azure later this year.

Azure is the first cloud provider to offer general availability of VMs based on the AMD MI300X AI accelerator.

Some prompts will soon be served using Azure Maia hardware.

Azure AI Studio is generally available, providing end-to-end tooling.

Microsoft Fabric adds Real-Time Intelligence, turning streaming data into instantly actionable insights.

GitHub Copilot gives every developer access to programming knowledge and tools, supporting coding in their native language.

Transcripts

We're optimizing power and efficiency across every layer of the stack, from the data center to the network. Our latest data center designs are purpose-built for these AI workloads, so that we can effectively and responsibly use every megawatt of power to drive down the cost of AI and the power draw. And we are incorporating advanced data center cooling techniques to fit the thermal profile of the workloads and match it to the environment and the location where it operates.

At the silicon layer, we are dynamically able to map workloads to the best accelerated AI hardware so that we have the best performance, and our custom I/O hardware and server designs allow us to provide dramatically faster networking, remote storage, and local storage throughput. This end-to-end approach is really helping us get to unprecedented scale. In fact, last November we announced the most powerful AI supercomputer in the cloud for training, using just a very small fraction of our cloud infrastructure, and over the past six months we've added 30 times that supercomputing power to Azure. It's crazy to see the scale.

And of course we're not just scaling training fleets; we're scaling our inference fleet around the world, quadrupling the number of countries where Azure AI services are available today. It's great to see that.

At the heart of our AI infrastructure are the world's most advanced AI accelerators. We offer the most complete selection of AI accelerators, including from Nvidia and AMD as well as our own Azure Maia, all dynamically optimized for the workloads. That means whether you're using Microsoft Copilot or building your own copilot apps, we ensure that you get the best accelerator performance at the best cost. For example, you see this in what has happened with GPT-4: it's 12x cheaper and 6x faster since it launched, and that's the type of progress you can continue to see as we evolve the system architecture.

It all starts, though, with this very deep partnership with Nvidia, which spans the entirety of the Copilot stack, across both their hardware innovation and their system software innovation. Together we offer Azure Confidential Computing on GPUs, designed to really help you protect sensitive data around the AI models end to end. We're bringing the latest H200s to Azure later this year, and we will be among the first cloud providers to offer Nvidia's Blackwell GPUs, B100s as well as GB200 configurations, and we're continuing to work with them to train and optimize both large language models like GPT-4o as well as small language models like the Phi-3 family.

Now, beyond the hardware, we are bringing Nvidia's key enterprise platform offerings to our cloud, like Omniverse Cloud and DGX Cloud, to Azure, with deep integration with the broader Microsoft Cloud. For example, Nvidia recently announced that their DGX Cloud integrates natively with Microsoft Fabric. That means you can train those models using DGX Cloud with full access to Fabric data, and Omniverse APIs will be available first on Azure for developers to build their industrial AI solutions. We're also working with Nvidia NIM industry-specific developer services and making them fantastic on Azure. So, a lot of exciting work with Nvidia.

Now, coming to AMD: I am really excited to share that we are the first cloud to deliver general availability of VMs based on the AMD MI300X AI accelerator. It's a big milestone for both AMD and Microsoft. We've been working at it for a while, and it's great to see that today, as we speak, it offers the best price-performance on GPT-4 inference.

And we'll continue to move forward with Azure Maia. In fact, our first clusters are live, and soon, if you're using Copilot or one of the Azure OpenAI services, some of your prompts will be served using Maia hardware.

Now, beyond AI, our end-to-end systems optimization also makes cloud-native apps, and the development of cloud-native apps, better. Six months ago is when we announced our first general-purpose Arm-based compute processor, Microsoft Cobalt, and today I am really excited to announce the public preview of Cobalt-based VMs.

Cobalt is being used for video processing and permissions management in Microsoft 365, helping power billions of conversations on services like Microsoft Teams already. And we're delivering that same Arm-based performance and efficiency to many customers, including Elastic, MongoDB, Siemens, Snowflake, and Teradata. In our most recent benchmark data and tests, our Cobalt 100 VMs delivered up to 40% better performance than any other generally available Arm-based VM. So we are very, very excited about Cobalt getting into the market.

Now let's move up the stack to the foundation models. With Azure AI, we offer the broadest selection of frontier and open-source models, including LLMs and SLMs, so you can choose the model that makes the most sense for your unique needs and your application needs. In fact, more than 50,000 organizations use Azure AI today. It's great momentum.

And it all starts with our most strategic and most important partnership, with OpenAI. Just last week OpenAI announced GPT-4o, their latest multimodal model, which was trained on Azure. It's an absolute breakthrough. It has text, audio, image, and video as input and output. It can respond and just have a human-like conversation that's fast and fluid; it can even be interrupted mid-sentence. GPT-4o is also the top-performing model on benchmarks across a variety of modalities, and it's always going to get you to the best answer. It has state-of-the-art performance at understanding the information you actually provide in your prompt, because that's really what matters. What OpenAI demoed last week was pretty magical stuff, and yesterday we showed how Copilot will leverage GPT-4o. In the video I'm just going to play, notice that you can share your screen or session as a prompt in Copilot and have Copilot help you in whatever it is that you're doing. Let's roll the video.

"Hey Copilot, how's it going?"

"Hey, I'm doing great, and it looks like you're about to dive into some Minecraft. Ready to build, explore, and maybe dodge a few mobs?"

"Yeah, I'm trying to figure out how to make a sword. My son plays this game all the time and I have no idea what I'm doing."

"No worries, I'm here to help you impress your son. I see you've got a crafting table in front of you, which is a great start. To craft a sword you'll need some materials. Can you open your inventory? Just press E on your keyboard."

"Yeah. Do I have what I need?"

"Let's see. You've got some sticks, perfect for the sword's handle, but it looks like you're missing the material for the blade. You can use wood, stone, iron, gold, or diamond. Let's start by gathering some wood or stone. Shall we head out and find some resources?"

"Yeah, that sounds great. Oh my god, what's that?"

"Oh no, it's a zombie! Run! You need to get away from it fast. Either sprint away or quickly build a tower of blocks straight up to get out of its reach. If you can find a shelter or dig into the side of a hill, do it now. Just make sure it can't reach you."

"All right, I think I might be safe now."

"Cool, that was a close one. Great job finding shelter."

It's really cool to see that. The magic here is that Copilot is your personal assistant that's helping you with any task, helping you stay, quite frankly, in the flow in the moment, whether it's a game or a document or a line of code. And as OpenAI innovates, our promise is that we will bring all that innovation to Azure too. In fact, the same day that OpenAI announced GPT-4o, we made the model available for testing on Azure OpenAI Service, and today I'm excited to say that it's generally available on Azure AI.

What this means is that now we can have these groundbreaking apps that all of you can build using this capability. One of the coolest things is that now any app, any website, can essentially be turned into a full multimodal, full-duplex conversational canvas. Let's take a look.

"All right, let me ask the agent here."

"Hey Seth, how's it going? What can I help you with today?"

"I am fantastic. I'm sorry, I'm in such a hurry. I'm going to go on an overnight camping hike. Where we're camping it's going to be dark, cold. Should I use these shoes? It's the mountains, it's probably going to be cold."

"Hey, no worries, let me take a look. Ah, those are the TrekStar hiking sandals. While they're fantastic for summer hikes and keeping your feet cool, they might not be the best choice for cold..."

"Oh my gosh, okay, I'm sorry to interrupt you. Pick the best thing that I should use, and can you just add it to my cart? Just one product."

"No worries, let's go with the TrekReady hiking boots. They're durable, provide great ankle support, and most importantly, they'll keep your feet warm in the cold mountains. I'll add them to your cart now. Ready for checkout? I see that..."

Amazing. Seth was in a hurry, but there you go, GPT-4o kept up with him. It's incredible. I really want to thank the OpenAI team for their partnership and really the responsible approach to innovation, helping our industry move forward. Sam will be here, in fact, joining Kevin in a little bit to talk a lot more about what's coming, because that's the exciting stuff: how do you all sample what comes next?

We're also bringing lots and lots of other models as well, from Cohere and Databricks and Deci, Meta, Mistral, Snowflake, all to Azure AI. We want to support the broadest set of models from every country, every language. I'm excited to announce, in fact, that we're bringing models from Cohere, G42, NTT DATA, Nixtla, as well as many more, as models as a service, because that's the way you can easily get to managed AI models.

And we all love open source too. In fact, two years ago at Build we were the first to partner with Hugging Face, making it simple for you to access the leading open-source library with state-of-the-art language models via Azure AI. And today I'm really excited to announce that we're expanding our partnership, bringing more models from Hugging Face, with text generation inference and text embedding inference, directly into Azure AI Studio.

And we're not stopping there. We are adding not just large language models; we're also leading the small language model revolution. Our Phi-3 family of SLMs are the most capable and most cost-effective. They outperform models of the same size and even the next size up, across a variety of language, reasoning, coding, as well as math benchmarks. If you think about it, by performance-to-parameter-count ratio, it's truly best in class. And today we are adding new models to the Phi-3 family to add even more flexibility across that quality-cost curve. We're introducing Phi-3-vision, a 4.2 billion parameter multimodal model with language and vision capabilities. It can be used to reason over real-world images, or generate insights and answer questions about images, as you can see right here.

And we're also making a 7 billion parameter Phi-3-small and a 14 billion parameter Phi-3-medium model available. With Phi-3 you can build apps that span the web, Android, iOS, Windows, and the edge. They can take advantage of local hardware when available and fall back on the cloud when not, simplifying all of what developers have to do to support multiple platforms using one AI model.

Now, it's just awesome to see how many developers are already using Phi-3 to do incredible things: from Amity Solutions, the Thai company that I mentioned earlier, to ITC, which has built a copilot for Indian farmers to ask questions about their crops, to Epic in healthcare, which is now using Phi to summarize complex patient histories more quickly and efficiently. And another very cool use case is in education. Today I'm very thrilled to announce a new partnership with Khan Academy. We'll be working together to use Phi-3 to make math tutoring more accessible, and I'm also excited to share that they'll be making Khanmigo, their AI assistant, free to all US teachers. Let's roll the video. (Applause.)

(Khan Academy video.) "I felt like I was in a place in my teaching career where I was kind of losing my sparkle, and I would just feel really defeated when I looked out on the classroom and I would see students that just didn't look engaged."

"Teachers have an incredibly hard job, and what we think we can do is leverage technology to take some of the stuff off of their plate, to really, actually humanize the classroom."

"By some miracle, we became a Khanmigo pilot school."

"With new advances in generative AI, we launched Khanmigo. The point is to be that personalized tutor for every student and to be a teaching assistant for every teacher."

"I started to build these more robust lessons, and I started to see my students engage."

"We're working with Microsoft on these Phi models that are specifically tuned for math tutoring. If we can make a small language model like Phi work really well in that use case, then we would like to shift the traffic to Phi in those particular scenarios. Using a small language model, the cost is a lot lower."

"We're really excited that Khanmigo, and especially the partnership with Microsoft, being able to give these teacher tools for free to US teachers, is going to make a dramatic impact in US education."

"I think we're going to make them the innovators, the questioners. Isn't that really just why you wake up every morning? Because that's for our future, our next generation, and to me that's everything."

You know, I'm super excited to see the impact this all will have and what Khan Academy will do, and Sal's going to, in fact, join Kevin soon to share more. And I'm really thankful to teachers like Melissa for everything that they do. Thank you very much.

Of course, it's about more than just models. It's about the tools you need to build these experiences. With Azure AI Studio we provide an end-to-end tooling solution to develop and safeguard the copilot apps you build. We also provide tooling and guidance to evaluate your AI models and applications for performance and quality, which is one of the most important tasks, as you can imagine, with all these models. And I'm excited to announce that Azure AI Studio is now generally available.

It's an end-to-end development environment to build, train, and fine-tune AI models, and do so responsibly. It includes built-in support for what is perhaps the most important feature in this age of AI, which is AI safety. Azure AI Studio includes state-of-the-art safety tooling for everything from detecting hallucinations in model outputs to risk and safety monitoring. It helps you understand which inputs and outputs are triggering content filters, and Prompt Shields, by the way, detect and block prompt injection attacks. And today we are adding new capabilities, including custom categories, so that you can create unique filters for prompts and completions, with rapid deployment options, which I think is super important as you deploy these models into the real world and an emerging threat appears.

Beyond Azure AI Studio, we recognize that there are advanced applications where you need much more customization of these models for very specific use cases, and today I'm really excited to announce that Azure AI custom models will be coming, giving you the ability to train a custom model that's unique to your domain and to your data, which is perhaps proprietary. The same builders and data scientists who have been working with OpenAI and brought all the Phi advances to you will work with all of you to build out these custom models. The output will be domain-specific, it'll be multitask and multimodal, best in class as defined by benchmarks, including perhaps even specific language proficiency that may be required.

Now let's just go up the stack to data. Ultimately, in order to train, fine-tune, and ground your models, you need your data to be in its best shape, and to do so we are building out the full data estate, from operational stores to analytics, in Azure. We've also added AI capabilities to all of our operational stores, whether it's Cosmos DB or SQL or PostgreSQL. The core of the intelligent data platform, though, is Microsoft Fabric. We now have over 11,000 customers, including leaders in every industry, who are using Fabric. It's fantastic to see the progress.

With Fabric you get everything you need in a single, integrated SaaS platform. It's deeply integrated at the most fundamental level, with compute and storage being unified. Your experience is unified, governance is unified, and, more importantly, the business model is unified. What's also great about Fabric is that it works with data anywhere, not just on Azure: it can be on AWS or on GCP or even in your on-premises data center. And today we are taking the next step: we're introducing Real-Time Intelligence in Fabric.

Customers today have more and more of this real-time data coming from your IoT systems and your telemetry systems; in fact, cloud applications themselves are generating lots of data. But with Fabric, anyone can unlock actionable insights across all of your data estate. Let's take a look.

(Video.) "Introducing Real-Time Intelligence in Microsoft Fabric, an end-to-end solution empowering you to get instant, actionable insights on streaming data. At its heart lies a central place to discover, manage, and consume event data across your entire organization, with a rich, governed experience. Get started quickly by bringing in data from Microsoft sources and across clouds with a variety of out-of-the-box connectors. Route the relevant data to the right destination in Fabric using a simple drag-and-drop experience. Explore insights on petabytes of streaming data with just a few clicks. Elevate your analysis by harnessing the intelligence of Copilot in Microsoft Fabric using simple natural language. Make efficient business decisions in the moment with real-time, actionable insights, and respond to changing landscapes proactively. Allow users to monitor the data they care about, detect changing patterns, and set alerts or actions that drive business value. All your data, all your teams, all in one place. This is Microsoft Fabric."

And we're making it even easier to design, build, and interoperate with Fabric with your own applications. In fact, we're building out a new app platform with the Fabric Workload Development Kit, so that partners like Esri, for example, who have integrated their spatial analytics with Fabric, can let customers generate insights from their own location data using Esri's rich tools and libraries right on Fabric. This is just exciting to see. It's the first time where the analytics stack is really a first-class app platform as well.

And beyond Fabric, we're integrating the power of AI across the entirety of the data stack. There's no question that RAG is core to any AI-powered application, especially in the enterprise today, and Azure AI Search makes it possible to run RAG at any scale, delivering very highly accurate responses using state-of-the-art retrieval systems. In fact, ChatGPT's GPTs and their Assistants API are all powered by Azure AI Search today. And with built-in OneLake integration, Azure AI Search will automatically index your structured data too, and it's also integrated into Azure AI Studio to support bringing your own embedding model, for example. So it's pretty incredible to see Azure AI Search grow over the last year into that very core developer service.

Now let's go up to developer tools. Nearly 50 years after our founding as a developer tools company, here we are, once again redefining software development. GitHub Copilot was the first, I would say, hit product of this generative AI age, and it's the most widely adopted AI developer tool: 1.8 million subscribers across 50,000 organizations are using it.

And with GitHub Copilot we are empowering every developer on the planet to access programming languages and programming knowledge in their own native language. Think about that: any person can start programming, whether it's in Hindi or Brazilian Portuguese, and bring back the joy of coding in their native language. And with Copilot Workspace, staying in your flow has never been easier. We are an order of magnitude closer to a world where any person can go from idea to code in an instant. You start with an issue; it creates a spec based on its deep understanding of your code base; it then creates a plan, which you can execute to generate the code across the full repo, that is, multiple files. At every point in this process, from the issue to spec to plan to code, you are in control and you can edit it, and that's really what is fundamentally a new way of building software. We're looking forward to making it much more broadly available in the coming months.

And today we're taking one more big leap forward: we are bridging the broader developer tools and services ecosystem with Copilot. For the first time, we're really thrilled to be announcing GitHub Copilot Extensions. Now you can customize GitHub Copilot with capabilities from third-party services, whether it's Docker, Sentry, and many, many more. And of course we have a new extension for Azure too: GitHub Copilot for Azure. You can instantly deploy to Azure and get information about your Azure resources just using natural language, and what Copilot did for coding we're now doing for infra and ops. To show you all this in action, here is Neha from our GitHub team. Neha, take it away.

"Thanks, Satya. GitHub Copilot gives you suggestions in your favorite editor, like here, where I'm writing unit tests. Copilot is great at meeting you where you're at, regardless of the language you're most comfortable with. So let's ask for something simple, like how to write a prime number test in Java, but let's converse in Spanish, using my voice, in Java. Look at that. Gracias, Copilot.

Copilot is great at turning natural language into code and back again, but what about beyond the code? With the new GitHub Copilot Extensions, you can now bring the context from your connected systems to you. So now I can ask Azure where my app is deployed, I could ask what my available Azure resources are, or I could diagnose issues with my environment. And this isn't just for Azure. As Satya announced, any developer can now create extensions for GitHub Copilot, and that includes any tool in your stack, including your in-house tools, keeping you in the flow across your entire day.

Actually, 75% of a developer's day is spent outside of coding: gathering requirements, writing specifications, and creating plans. Let's show how GitHub Copilot can help with that, live on stage for the first time. So typically my day starts by looking at GitHub issues. Looks like we want to support a rich text input for our product description. Let's open Workspace and get some help with that. Copilot interprets the intent of the issue to see what's required, and it then looks across the entire code base and proposes what changes should be made. This specification is fully editable, and the whole process is iterative. But actually, this looks pretty good. Copilot can now help us build a plan on how to implement this change. All right, that's a great start, but we must not forget about our documentation, so let's edit the plan and have Copilot update our readme. And then we can even get Copilot's help in starting to implement the code for us.

Now, this was just a simple example, but in a large enterprise codebase there are tens of thousands of files and dozens of stakeholders involved, and that means meetings, so many meetings. Workspace helps you focus on what you need to change, and, by the way, as a developer I'm always in control: I can see exactly what changes Copilot is proposing, and I can even get a live preview. All right, let's test out the input. This looks great. So I can go back and edit my code in VS Code, or I can submit these changes as a pull request to share with my team. GitHub Copilot, Copilot Extensions, and Copilot Workspace help you stay focused on solving problems and keep you in the flow."


Related Tags
AI workloads, cloud computing, data centers, power optimization, hardware acceleration, networking, storage, custom AI, hybrid cloud, enterprise AI, development tools