Apple Intelligence: Use Cases You Should Know About

The AI Advantage
10 Jun 2024 · 18:30

Summary

TL;DR: Apple's WWDC keynote unveiled a plethora of AI features, including an on-device AI model for personalized experiences, writing tools, summaries, and image generation capabilities. The update will roll out in beta this summer and reach supported devices (iPhone 15 Pro and Pro Max, plus iPads and Macs with M1 chips or newer) in Fall 2024. Privacy is emphasized with on-device processing and a Private Cloud Compute option. Siri will see a major overhaul with improved contextual understanding and ChatGPT integration, enhancing voice commands and third-party app accessibility.

Takeaways

  • 🍏 Apple has announced new AI features at WWDC, including a partnership with OpenAI for ChatGPT and enhancements to Siri, which will be available in beta over the summer and on supported devices in Fall 2024.
  • 📅 The new features will be available only on devices with recent chips: iPhone 15 Pro or Pro Max, iPads with M1 chips or newer, and Macs with an M-series chip.
  • 🧠 Apple's own AI models will focus on on-device AI and image generation, while also integrating with ChatGPT for certain tasks.
  • 📱 The new AI will offer personalized experiences, including prioritizing notifications based on user context gathered from messages, calendar, notes, photos, and videos.
  • ✍️ Writing tools will be improved with features like tone adjustment, text summarization, and an option to describe changes in a custom way.
  • 📧 Summarization capabilities will extend across the OS, allowing AI to surface essential information from emails and messages.
  • 📞 Transcription features will allow summarization of phone calls and other audio inputs.
  • 🖼️ An image generator will be introduced, enabling image creation in three styles (animation, illustration, and sketch), with the ability to create custom emojis.
  • 🎨 AI will also enhance image editing, allowing users to transform sketches into detailed images and automatically arrange photos into movies based on context.
  • 🔒 Privacy is a key focus, with Apple emphasizing that all AI features are secure and private, processing data on-device where possible and using Private Cloud Compute for cloud tasks.
  • 🤖 Siri will be overhauled to be more intuitive and context-aware, powered by large language models, and integrating with third-party apps and services.

Q & A

  • What major AI announcements did Apple make during their WWDC keynote?

    -Apple announced several AI features, including their own proprietary models for on-device AI and image generation, as well as a partnership with OpenAI for ChatGPT integration, all set to release in beta over the summer and roll out to supported iPhones, iPads, and Macs in the fall.

  • When will the new Apple AI features be available to users?

    -The beta versions of the new Apple AI features will be available in the summer, with full releases planned for Fall 2024, likely around the release of the new iPhone in September.

  • Which devices will support the new Apple AI features?

    -The new features will be supported on the iPhone 15 Pro and Pro Max, iPads with M1 chips or newer, and Macs with an M-series chip. Previous iPhone models and devices without the specified chips will not support these features.

  • What does 'on-device AI' mean in the context of Apple's announcements?

    -On-device AI refers to AI processes that run directly on the user's device, such as an iPhone or iPad, rather than relying on cloud computing. This allows for faster and more personalized experiences while maintaining privacy.

  • How will the new Apple AI features enhance user notifications?

    -The new AI features will prioritize notifications based on the user's personal context, gathered automatically from messages, calendar, notes, photos, and videos, to provide a more personalized experience.

  • What writing tools will be integrated into iPhones, Macs, and iPads as part of the AI update?

    -The writing tools will include features for changing the tone of text to be more conversational, shortening text, one-tap proofreading and text correction, and a custom 'describe your change' option for free-form edits, all accessible system-wide and in third-party apps.

  • What is the significance of the AI-generated summaries feature?

    -AI-generated summaries will provide users with essential information from emails or messages, allowing for quicker understanding and response without having to read through entire texts. This feature will be available across the entire OS and for third-party apps.

  • What is the 'Reduce Interruptions' feature and how does it work?

    -The 'Reduce Interruptions' feature uses Apple's AI models to determine the relevance of incoming notifications based on the user's current context, such as calendar events or ongoing conversations, and only surfaces notifications that are deemed important.
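Apple has not published how this relevance judgment works; a useful mental model is a per-notification relevance score computed from the user's current context, with only notifications above a threshold breaking through. The sketch below is purely illustrative; every signal, weight, and name is invented:

```python
from dataclasses import dataclass, field

@dataclass
class Context:
    # Hypothetical user context the scorer consults.
    in_meeting: bool = False
    active_threads: set = field(default_factory=set)

def is_relevant(sender: str, mentions_urgent: bool, ctx: Context,
                threshold: float = 1.0) -> bool:
    """Score a notification against the current context; surface only above threshold."""
    score = 0.0
    if sender in ctx.active_threads:
        score += 1.0   # part of an ongoing conversation
    if mentions_urgent:
        score += 1.5   # message text signals urgency
    if ctx.in_meeting:
        score -= 0.5   # raise the bar while the calendar says "busy"
    return score >= threshold

ctx = Context(in_meeting=True, active_threads={"alice"})
print(is_relevant("alice", False, ctx))  # ongoing thread, but in a meeting: suppressed
print(is_relevant("bob", True, ctx))     # urgent message: surfaced
```

In practice the interesting part is exactly what the toy hides: how the model infers "urgent" and "busy" from free-form text and calendar data, which is where the on-device LLM comes in.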

  • How will the new image generation features work with Apple's devices?

    -The image generation features will allow for the creation of custom emojis, detailed sketches from simple drawings, and automatic enhancements of photos and videos with AI. These features will be integrated into various apps and will be accessible with a simple interface or automatically in the background.

  • What is the 'Smart Replies' feature and how will it assist users?

    -Smart Replies is a feature that allows users to quickly select from predefined responses to messages using a toggle interface, with the AI drafting a response that the user can edit or send directly.

  • How does Apple plan to address privacy concerns with the new AI features?

    -Apple has emphasized that all AI features are secure and private, with data processed on-device whenever possible. For cloud-based computations, they introduced 'Private Cloud Compute,' ensuring data is never stored and is auditable by experts, all handled on custom Apple servers.
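Apple has not disclosed its dispatch logic, but conceptually each request goes to the smallest capable model: the local model first, Private Cloud Compute only when the task exceeds what the device can run. A hypothetical sketch of that routing decision (the task names and capability table are invented):

```python
from enum import Enum

class Target(Enum):
    ON_DEVICE = "on-device model"
    PRIVATE_CLOUD = "Private Cloud Compute"

# Invented capability table: task classes small enough for the local model.
ON_DEVICE_TASKS = {"proofread", "summarize_short", "smart_reply", "notification_ranking"}

def route(task: str) -> Target:
    """Prefer local processing; fall back to Apple's private cloud for heavy tasks.
    Per Apple's description, cloud requests are stateless: processed, then discarded."""
    return Target.ON_DEVICE if task in ON_DEVICE_TASKS else Target.PRIVATE_CLOUD

print(route("smart_reply"))       # Target.ON_DEVICE
print(route("long_doc_summary"))  # Target.PRIVATE_CLOUD
```

The privacy claim rests on the cloud branch behaving like the local one from the user's perspective: no retained state, and server software that outside experts can audit.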

  • What is the significance of Siri's integration with the new AI features?

    -The updated Siri will act as an interface to the new AI features, allowing voice commands and access to advanced capabilities like the ChatGPT integration. Siri will also be able to understand context and perform actions based on the user's screen and calendar, making it a more powerful and intuitive assistant.

  • How will the collaboration with OpenAI affect Apple's AI capabilities?

    -The collaboration with OpenAI will bring ChatGPT's advanced features into Apple's ecosystem, enhancing Siri's capabilities and allowing for seamless integration of AI functionalities across Apple devices, providing users with more advanced and intuitive AI experiences.

  • What opportunities does Apple's AI update present for third-party developers?

    -Apple's AI update includes a developer kit called SiriKit, which allows third-party developers to integrate AI features deeply into their own applications, enhancing the capabilities of apps in the App Store and providing users with more AI-powered tools.

Outlines

00:00

📱 Apple's AI Innovations and Upcoming Features

Apple's WWDC keynote unveiled a plethora of AI features, including a partnership with OpenAI for ChatGPT and proprietary models for on-device AI and image generation. These features are set to roll out in beta over the summer and will be available on supported iPhones, iPads, and Macs in Fall 2024. Availability is restricted to newer models, specifically the iPhone 15 Pro or Pro Max, iPads with M1 chips or newer, and Macs with an M-series chip. The AI advancements aim to offer a personalized user experience by prioritizing notifications, enhancing writing tools, and providing system-wide integration across third-party apps. The video also teases the overhaul of Siri with improved capabilities and a new look.

05:00

🎨 Apple's Image Generation and Smart Features

Apple introduced an image generation model capable of creating images in three styles (animation, illustration, and sketch) but not photorealistic images, thus avoiding deepfake concerns. The model includes a fine-tuning feature for personalized image generation and the ability to create custom emojis. Additionally, Apple's AI will offer smart features across various apps, such as turning sketches into detailed images and automatically generating movie montages from photos. The company also highlighted a new semantic search capability for photos and videos, promising to make the vast amount of personal data on devices more accessible through AI advancements.
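As the transcript guesses, semantic search like this is typically built on embeddings: each photo or video is encoded as a vector by an image-text encoder (CLIP-style), and a text query is matched by cosine similarity. A minimal illustration in which tiny hand-made vectors stand in for a real encoder:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy "embeddings": in a real system these come from an image-text encoder.
library = {
    "beach_2019.jpg":    [0.9, 0.1, 0.0],   # sand, sea
    "birthday_cake.jpg": [0.0, 0.2, 0.95],  # party, food
    "surfing.mp4":       [0.8, 0.3, 0.1],   # sea, action
}

def search(query_vec, k=2):
    """Return the k library items most similar to the query vector."""
    ranked = sorted(library, key=lambda name: cosine(query_vec, library[name]),
                    reverse=True)
    return ranked[:k]

print(search([1.0, 0.0, 0.0]))  # an "ocean"-like query ranks beach and surfing first
```

The transcript's scaling worry maps directly onto this sketch: the hard part is computing and storing one vector per item for tens of thousands of photos and videos, on device, without draining the battery.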

10:00

🔒 Privacy and Security in Apple's AI Features

Apple emphasized the security and privacy of their new AI features, stating that all computations possible on the device will be processed locally. For cloud-based computations, Apple introduced 'Private Cloud Compute,' ensuring data is never stored and is auditable. The company differentiates between local and cloud models, noting that while smaller models can run on local devices, larger models like GPT-4o require cloud processing. Apple's approach aims to balance AI capabilities with user privacy, providing a seamless experience without compromising data security.

15:01

🔊 Siri's Enhancements and ChatGPT Integration

Siri is set to become more powerful and intuitive with the integration of large language models (LLMs), allowing for more natural and forgiving voice commands. Siri will now be able to understand context, perform complex tasks, and make better recommendations by analyzing on-screen content and calendar events. Apple also announced a collaboration with OpenAI to integrate ChatGPT natively into iOS, enhancing Siri's capabilities further. This integration is expected to bring advanced ChatGPT features to Apple devices, making AI more accessible and mainstream for users.

πŸ› οΈ Developer Opportunities and AI's Mainstream Adoption

Apple's AI announcements at WWDC extend to developers, offering SiriKit for third-party integration, allowing developers to enhance their apps with AI capabilities. This move signals AI's mainstream adoption, as Apple devices will run large language and image generation models by Fall 2024. The video concludes by highlighting the significance of these innovations, making AI accessible to a broader audience and promising further in-depth coverage and tutorials to help users make the most of these new technologies.
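Apple's actual mechanism for this is the App Intents/SiriKit framework, written in Swift; purely to illustrate the general pattern, the Python toy below shows apps registering named actions that an assistant can discover and dispatch once it has parsed a request. All names here are invented:

```python
# Toy action registry: apps expose named actions; the assistant looks them up
# and invokes them with parameters parsed from the user's request.
REGISTRY = {}

def register(app: str, action: str):
    """Decorator an app uses to publish one of its actions to the assistant."""
    def wrap(fn):
        REGISTRY[(app, action)] = fn
        return fn
    return wrap

@register("Photos", "find")
def find_photos(subject: str):
    return f"photos matching '{subject}'"

@register("Mail", "summarize")
def summarize_mail(thread: str):
    return f"summary of '{thread}'"

def assistant(app: str, action: str, **params):
    """Dispatch a parsed voice command to the registered app action."""
    handler = REGISTRY.get((app, action))
    if handler is None:
        raise LookupError(f"{app} does not expose '{action}'")
    return handler(**params)

print(assistant("Photos", "find", subject="beach sunsets"))
```

The registry is what makes third-party apps first-class: the assistant never needs app-specific code, only the declared action and its parameter names.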

Keywords

💡 AI features

AI features refer to the various capabilities and functionalities that incorporate artificial intelligence into technology products. In the context of the video, Apple has announced several AI features during their WWDC keynote, which are set to enhance user experience across their devices. Examples from the script include improved Siri, ChatGPT integration, and on-device AI for image generation.

💡 WWDC keynote

The WWDC keynote is the opening presentation at Apple's annual Worldwide Developers Conference, where the company typically unveils new products, technologies, and features. The script discusses the AI announcements made during this event, indicating the significance of these updates for the tech community and Apple users.

💡 On-device AI

On-device AI refers to artificial intelligence processes that occur directly on the user's device, as opposed to relying on cloud computing. The script mentions Apple's proprietary models for on-device AI, emphasizing local processing for tasks like image generation and notification prioritization, which can enhance privacy and responsiveness.

💡 Beta release

A beta release is a version of a product that is almost ready for release but is still being tested by a select group of users before being made available to the public. The script states that the new Apple AI features will be in beta over the summer, indicating that users will soon have the opportunity to try these features before their official launch in the fall.

💡 Personalized experience

A personalized experience is tailored to an individual's preferences, behavior, or context. The video script discusses how Apple's AI will provide a personalized experience by analyzing various aspects of a user's device usage, such as messages, calendar events, and photos, to prioritize notifications and generate summaries.

💡 Large language models (LLMs)

Large language models are AI systems trained on vast amounts of text data that can generate human-like text based on the input they receive. The script mentions the integration of LLMs like GPT-4o for various writing tools and Siri's enhanced capabilities, demonstrating the application of these models in improving user interaction with devices.

💡 Image generation

Image generation is the process by which AI creates visual content based on given prompts or data. The script describes Apple's new image generation features that allow for the creation of custom emojis and the enhancement of sketches into detailed images, showcasing the creative potential of AI integration.

💡 Privacy

Privacy in the context of the video refers to the protection of users' personal data and the secure handling of information processed by AI features. Apple emphasizes the privacy of their AI features, stating that on-device processing and private cloud compute ensure data is not stored and remains secure.

💡 ChatGPT

ChatGPT is a conversational AI built on a large language model, capable of engaging in dialogue with users. The script discusses a partnership with OpenAI to integrate ChatGPT into Apple's ecosystem, enhancing Siri's capabilities and offering a more natural and context-aware interaction with users.

💡 Siri

Siri is Apple's voice-activated virtual assistant, which is set to receive significant upgrades as discussed in the script. The new Siri will be powered by LLMs, offering a more intuitive and context-aware experience, including the ability to make corrections and understand natural language more effectively.

💡 Developers conference

A developers conference is an event focused on engaging and informing software developers about new tools, technologies, and opportunities. The script mentions that the AI features announced at Apple's WWDC are also made available to third-party developers through SiriKit, allowing them to enhance their apps with these advanced AI capabilities.

Highlights

Apple announced a range of AI features at WWDC, including a ChatGPT partnership and other AI advancements.

New AI features will be released in beta over the summer and arrive on supported iPhones, iPads, and Macs in the fall.

Apple's AI advancements fall into two categories: the company's own proprietary models and features powered by ChatGPT.

Beta availability is planned for the summer, with full deployment in Fall 2024, coinciding with the new iPhone release.

Only the latest iPhone models (15 Pro or Pro Max), iPads with M1 chips or newer, and Macs with an M-series chip will support Apple Intelligence features.

Apple's new AI will offer personalized experiences by analyzing messages, calendar, notes, photos, and videos.

Writing tools will be integrated into iPhones, Macs, and iPads, allowing for more conversational tones and text summarization.

AI-generated summaries will provide essential information from emails and messages.

Transcription features will allow summarization of phone calls and other audio inputs.

A new 'Reduce Interruptions' feature will surface only relevant notifications based on context and calendar.

Smart replies will allow for quick message responses with minimal input from the user.

Apple's image generation features will support three styles: animation, illustration, and sketch.

AI will enable the creation of custom emojis and enhance images within notes and other apps.

A new cleanup tool will allow for easy removal of unwanted subjects from photos.

AI will enable semantic search across photos and videos, allowing users to find media based on content.

Apple emphasized privacy, stating that all AI features are secure and prioritize on-device processing.

For cloud processing, Apple introduced 'Private Cloud Compute,' ensuring data is never stored and is auditable.

Siri will be overhauled with new capabilities powered by large language models, offering more natural interactions.

Siri will integrate with ChatGPT, bringing advanced conversational AI to Apple devices.

Third-party developers will be able to enhance their apps with Apple's AI features through SiriKit.

Transcripts

play00:00

oh boy oh boy Apple just came out with

play00:02

their WWDC keynote and they announced a

play00:04

bunch of AI features including a chat

play00:07

GPT partnership and so much more there's

play00:09

actually way more to cover here than I

play00:10

expected initially and we're going to

play00:12

break down everything you need to know

play00:15

concerning all brand new Apple AI

play00:17

announcements that are releasing in beta

play00:19

over the summer and are going to be on

play00:20

all iPhones iPads and Mac devices in

play00:24

September so let's take this step by

play00:26

step because as I mentioned there really

play00:28

is a lot to unpack here okay and there

play00:30

categories of AI announcements they just

play00:31

made during this event the very first

play00:34

one being their very own model they did

play00:37

not explicitly say this but it was clear

play00:39

that they segmented what they can do by

play00:41

themselves as apple with their

play00:42

proprietary models with stuff that

play00:44

they're using cat G pt4 so their own

play00:47

model they're using for the on device Ai

play00:49

and image generation so let's talk about

play00:52

what exactly that means for you as a

play00:54

user of all this but hold up before we

play00:56

get into all the details here one super

play00:58

important thing I need to address when

play01:00

is this becoming available and what

play01:02

devices will be able to use everything

play01:04

we're going to talk about here like the

play01:05

improved Siri or cat GPT generation well

play01:08

this graphic on their website says it

play01:11

all almost in the presentation they

play01:13

pointed out that the beta is going to

play01:14

become available this summer and then

play01:16

all of these Apple intelligence features

play01:18

including Siri chat GPT everything else

play01:20

are going to be coming to all of these

play01:22

devices in Fall 2024 probably somewhere

play01:25

around the release of the new iPhone end

play01:27

of September and there's one thing I

play01:29

need to highlight here which makes sense

play01:31

from Apple's perspective but a lot of

play01:33

people are not going to like this no

play01:35

previous iPhones except of the 15 Pros

play01:37

or the pro Max are going to have apple

play01:40

intelligence meaning even if you

play01:41

upgraded to a new iPhone this year and

play01:43

you didn't end up getting the pro

play01:45

version well if you want these features

play01:46

you'll have to upgrade this

play01:48

year as of iPads you need M1 chips or

play01:51

newer and with all the Macs it's pretty

play01:52

straightforward you need an mchip to run

play01:55

all this okay so with that out of the

play01:56

way let's talk about all these new

play01:58

features starting with the Apple models

play02:01

and they don't call it this if you're

play02:02

following this channel we cover all the

play02:03

iterations in the various models Opus

play02:06

coming out GPT 40 coming out being

play02:08

better than gbd4 these are various large

play02:10

language models that have different

play02:11

capabilities Apple chose a different

play02:13

style of communication as they're

play02:14

talking to a billion people their user

play02:16

base everyday consumers that are not

play02:18

that deep into this stuff so what we got

play02:20

here is a multitude of models they did

play02:22

not give us the details because they

play02:24

don't really matter it matters what

play02:25

you're going to be able to do with it

play02:26

and there's a lot matter of fact there's

play02:28

more than I expected and no worries

play02:29

we're we're going to talk about all the

play02:30

specific use cases in a second here but

play02:32

before that I got to add that everything

play02:34

we talk about here is going to be

play02:36

systemwide not just limited to Apple

play02:38

apps all right meaning all your favorite

play02:40

third party apps will have access to

play02:42

these features too amazing so what can

play02:44

this new Apple intelligence actually do

play02:46

with your phones and Macs so first

play02:47

things first and this one got me very

play02:49

excited is that it will be able to

play02:50

prioritize your notifications for you in

play02:52

other words depending on your personal

play02:54

context that the phone will

play02:55

automatically gather what this means in

play02:57

practice is that it will be looking at

play02:59

your messages your calendar your notes

play03:01

even your photos and videos to give you

play03:03

a better personalized experience this is

play03:05

a word that you'll hear a lot today

play03:06

personalized because by looking at all

play03:08

of these different parts of your phone

play03:10

it can prioritize certain notifications

play03:12

over others other things are important

play03:14

if you're at work versus when you're at

play03:15

dinner with your family the next one

play03:17

that they'll be integrating into iPhones

play03:19

Macs iPads is various writing tools and

play03:21

these are very basic but they're also

play03:23

the most used ones if you're a power

play03:24

user of tools like GPT 40 it's changing

play03:27

the tone of things to be more

play03:28

conversational or shortening text

play03:30

there's only a handful of these and I

play03:31

would like to highlight this option to

play03:33

describe your change where you're

play03:34

essentially prompting it in a custom way

play03:36

so even though they had a quote in a

play03:38

presentation saying no need to engineer

play03:39

the perfect prompt yeah there's no need

play03:42

but doesn't mean there's no value left

play03:43

to it I've been saying this since a

play03:45

while but if you want this intuitive

play03:46

user interface it is there for you one

play03:48

funny side note is that they sort of

play03:49

revealed one of the prompts they're

play03:50

using here for the proof reading so

play03:52

basically what happens if you hit this

play03:53

button that it checks the text you wrote

play03:55

for grammar spelling and sentence

play03:56

structure something up on allow you had

play03:58

to prompt manually now it's a button and

play04:00

this theme spans throughout the whole

play04:01

announcement all of these improvements

play04:03

what you see right here are accessible

play04:05

in the entire operating system also for

play04:07

third party app so this is not going to

play04:09

be limited just to their Notes app or

play04:10

their mail app no all of your favorite

play04:12

apps will be able to introduce some of

play04:13

the features we talk about here but now

play04:15

on to the next one which is summaries

play04:17

and these span really across the entire

play04:19

OS whether you receive an email or a

play04:21

message it's going to be able to

play04:22

summarize everything for you to get you

play04:24

the essential information I really like

play04:26

this in the context of emails because a

play04:27

lot of times the sender gets to decide

play04:29

what part of the email they show you now

play04:31

this is going to be replaced by AI

play04:33

generated summaries that happen on your

play04:35

device by the way we'll talk about

play04:36

privacy in a second here I I know many

play04:38

people are concerned and rightfully so

play04:40

but look there's more there's not just

play04:41

this writing assistant there's also the

play04:43

ability to transcribe wherever you are

play04:45

now so there's examples of phone calls

play04:47

happening where you can just summarize

play04:48

what the phone call was about all of

play04:50

these AI capabilities that you had to do

play04:51

manually now will be integrated right

play04:53

there it's just going to be the Press of

play04:54

a button or not even that it's just

play04:56

going to happen automatically in the

play04:58

background and it can look like this

play05:00

feature that they implemented I'm a big

play05:01

fan of this one by the way if you know

play05:03

me personally there must have been a

play05:04

point in time where you were frustrated

play05:05

with the fact that I set my phone to do

play05:07

not disturb way too often I just can't

play05:09

focus on any work if there's

play05:10

notifications popping up and there's

play05:12

this new mod which is called reduce

play05:14

interruptions which is a middle ground

play05:16

between having notifications on and

play05:17

having do not disturb on because these

play05:19

Apple models will look at the message

play05:21

they will look at the context of it

play05:23

while considering your calendar or other

play05:25

conversations that you're having and

play05:26

it's only going to surface the

play05:27

notification if it's relevant to you how

play05:30

well will this work in practice we shall

play05:31

see everybody will have access to this

play05:33

by September which is not too far out

play05:35

but I really like this idea it feels

play05:36

like something that I would have set up

play05:38

on my phone most of the time I want to

play05:40

reduce interruptions in my life pretty

play05:41

much all the time and one more feature

play05:43

related to this text and summarization

play05:44

capabilities is the fact that you can

play05:46

have Smart replies so basically if you

play05:48

want to reply to a message you can now

play05:49

have a little toggle interface where you

play05:51

basically can say Hey will your partner

play05:52

be joining yes or no or will you be

play05:54

taking an Uber or driving no need to

play05:56

type out everything you can do it with

play05:58

one hand pick the reply and the large

play05:59

language model drafts a response for you

play06:01

and you can edit it or send it right

play06:03

away okay so those are some of the text

play06:04

and transcription capabilities pretty

play06:06

nice pretty nice not going to lie I'm

play06:07

looking forward to some of those

play06:08

especially the new notifications and the

play06:10

prioritization but there is more way

play06:12

more because they released some new

play06:14

image generation features that are going

play06:15

to be implemented we'll talk about those

play06:17

now and they overhauled Siri even

play06:19

offering a chat GPT integration no

play06:22

worries but we'll talk about it all but

play06:23

we need to understand these image

play06:25

generation capabilities before we move

play06:26

on to Siri because Siri brings it all

play06:28

together okay so what is this image

play06:30

generator all about well edit core it's

play06:32

a decent image generation model that can

play06:34

do freest Styles none of those are

play06:36

realistic okay concretely they animation

play06:38

illustration and sketch important side

play06:41

note none of these are photo realistic

play06:43

so you know deep fakes are not an issue

play06:45

but they also circumvented the biggest

play06:46

challenge in AI image generation which

play06:48

currently not many models got right

play06:50

how's the quality of these models it's

play06:52

okay obviously nowh close to leaders in

play06:54

the space like M Journey but hey all of

play06:56

this will be free and integrated right

play06:57

on your device if you have to write

play06:59

device by the way it also does something

play07:01

that is referred to as fine-tuning where

play07:02

it regenerates images of you with the AI

play07:05

image generator I mean to be fair it

play07:07

sort of looks like you turning her into

play07:09

a pretty generic but close enough type

play07:11

of image like this and again this

play07:12

integrates into everything so you get to

play07:14

write messages with them you get to add

play07:16

it as a contact and there's actually one

play07:17

kind of fun thing here that I think

play07:19

people will be using a lot and that is

play07:21

creating custom Emojis with AI they call

play07:23

as gen emojis but basically you can

play07:25

express any sort of mood you might be in

play07:27

with the power of these image generators

play07:30

this is what that would look like custom

play07:31

emojis for you that's kind of fun but

play07:33

there's more actually there's a whole

play07:34

set of smart features that are

play07:36

integrated into various apps so for

play07:37

example this shows a Notes app where you

play07:39

have a sketch that you just quickly

play07:40

threw up with your pen and using the

play07:42

image generation model you can turn the

play07:44

sketch into a more detailed version of

play07:46

it great even better than that it

play07:48

supports a feature where you can just

play07:49

draw on a certain part of your nodes and

play07:51

it will generate the image in there

play07:54

depending on what it sees on the screen

play07:56

right it takes the context around it

play07:58

into a account so in other words because

play08:00

this text is talking about architecture

play08:02

in India it will generate an appropriate

play08:04

image that fits here just imagine you

play08:06

creating some sort of presentation or

play08:07

Word document it's going to be really

play08:09

easy to enhance them with AI Imaging now

play08:11

for everyday users no need to round trip

play08:13

to Discord or installing a local model

play08:15

it's just right there all you need to do

play08:17

is circle and it works this is what AI

play08:19

adoption looks like in practice nice

play08:21

they also sh off this cleanup tool this

play08:23

is very simply described and actually

play08:24

Google photos released the same thing

play08:26

last week if you're following our Friday

play08:28

show where we update you on the new AI

play08:30

use cases that come out every single

play08:31

week you basically roughly draw around

play08:33

the subject and it just figures out what

play08:35

to do in order to remove it oh yeah and

play08:37

there's also this feature where they

play08:38

look at your photos and actually

play08:40

understand what's in them and then if

play08:41

you want to edit them into a little

play08:42

movie montage then you can do that with

play08:45

the power of AI see you just give it a

play08:47

little prompt there at the bottom as

play08:49

into what you want to create and it

play08:50

picks and arranges them for you in a way

play08:52

that is apparently better than what we

play08:54

have right now very much looking forward

play08:55

to trying this out this is really a baby

play08:57

step towards full AI video editing

play08:59

capabil abilities and that brings me to

play09:00

the next point, and this one was so surprising to me I still haven't wrapped my head around it, to be fair: they're going to allow you to semantically search over all of your photos and videos. In other words, you can tell your phone about any subject in any picture or video that you have on your phone, and it will pull up those videos. And this surfaces one of the big questions I have with all of this stuff: how deep does the personal context it looks into go? If I'm going to be receiving a new email, will it be looking at videos from 2009? Probably not, but it could, right? Not to get too technical, but it seems almost impossible to create embeddings for all the photos and videos that people have on their phones. I mean, people use their phones as external hard drives these days; they just never clean them out. And now all of that is going to be accessible by these new large language models as they help you in your everyday life. I don't know, we'll have to wait to see how deep that personal context really goes.
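To make "embeddings" concrete, here is a minimal sketch of semantic search over a photo library. In a real system a CLIP-style model would map each photo and the text query into the same vector space; the file names and the tiny three-dimensional vectors below are made up purely for illustration.

```python
import math

# Hypothetical, precomputed embeddings; real ones would come from an image encoder.
photo_embeddings = {
    "beach_2019.jpg":   [0.9, 0.1, 0.0],
    "dog_park.jpg":     [0.1, 0.9, 0.2],
    "receipt_scan.jpg": [0.0, 0.1, 0.9],
}

def cosine_similarity(a, b):
    # Standard cosine similarity: dot product over the product of vector lengths.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def search(query_embedding, library, top_k=1):
    # Rank every photo by similarity to the query vector and keep the best matches.
    ranked = sorted(library.items(),
                    key=lambda item: cosine_similarity(query_embedding, item[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_k]]

# A query like "dog playing outside" would embed near the second photo.
print(search([0.2, 0.95, 0.1], photo_embeddings))  # ['dog_park.jpg']
```

The lookup itself is cheap; the hard part at this scale is computing and storing a vector for every photo and video on-device in the first place, which is exactly the question raised above.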

play09:47

But that's a big deal, because there's a lot of data on your phone, and we haven't even considered the data that comes from the Apple Watch or your everyday usage of your computer. They haven't talked about that, but it's sort of implied; I mean, if it's looking at your calendar and all your videos... Well, in the same

play10:00

breath, we should also talk about privacy, which they made a big deal out of. They have a brand new animation where the Apple logo kind of unlocks; that's pretty neat. And they made it clear that all of these Apple Intelligence features are extremely secure and private. Matter of fact, they stated that everything that can be processed on-device is processed that way. But obviously a lot of these computations, especially when we talk about the ChatGPT integration in a second here, will not be able to happen on-device. You just cannot run GPT-4o on an iPhone; it's too large, too demanding. So for that you need to go out and send a request to the cloud, and when that happens you kind of lose control over your data. Now, they have an answer

play10:34

to that too: they call it Private Cloud Compute, and what they promise with it is that whenever data goes out to the cloud, it is never stored, and it is auditable by experts. They stated this multiple times, backing it up with the fact that all of this is going to be happening on custom Apple servers built specifically for this, resulting in all of this Apple Intelligence being aware of your personal context without collecting any data. And for anybody who's new around

play10:57

here, just a quick primer on local versus cloud models. You can run a lot of the AIs that we have these days on your local device, even if you don't have a very beefy device. Models with smaller parameter counts, like Meta's Llama 3 8B (I think that would be kind of the king in terms of performance versus size right now), you could run on most MacBooks; matter of fact, I have a video on the channel showing you exactly how to do that, and it works even if you turn off your internet, because you have the model locally. It only has 8 billion parameters,

play11:22

but the big models have way more than that. Well, we don't know exactly what the numbers are; OpenAI hasn't published them. Something like GPT-4o is massive; they haven't even released how large exactly it is. But a competitor from Meta that is training right now will be 400 billion parameters: that's going to be their Llama 3 400B. Now, if you want to fit the model onto the phone, meaning you do not need the internet and you do not need to send anything to the cloud to run the model locally, you're going to be looking at something like an 8-billion-parameter model.
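Some rough back-of-the-envelope arithmetic (my own numbers, not Apple's or Meta's) shows why the parameter count decides where a model can run: at 16-bit precision each parameter takes two bytes of memory, and even aggressive 4-bit quantization only divides that by four.

```python
def model_memory_gb(params_billions, bits_per_param):
    """Approximate memory just to hold the weights.

    Ignores activations, KV cache, and runtime overhead, so real usage is higher.
    """
    bytes_total = params_billions * 1e9 * bits_per_param / 8
    return bytes_total / 1e9  # decimal gigabytes

# An 8B model: ~16 GB at 16-bit, ~4 GB quantized to 4-bit -- phone/laptop territory.
print(model_memory_gb(8, 16), model_memory_gb(8, 4))    # 16.0 4.0
# A 400B model: ~800 GB at 16-bit, still ~200 GB at 4-bit -- server territory.
print(model_memory_gb(400, 16), model_memory_gb(400, 4))  # 800.0 200.0
```

That two-orders-of-magnitude gap between 8B and 400B is the whole reason a local/cloud split exists at all.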

play12:00

Now, these small models are very limited; they don't have as much or as deep a knowledge of the world as the larger models, but you don't need that if you're just summarizing or if you're just generating one tiny image. And that's why having local models for everything that can be done locally, and then going to the cloud if necessary, is a fantastic combination. But the privacy implications are the concern, so I'm definitely curious to hear more details about their privacy approach.
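Apple hasn't published how the on-device/cloud split is actually decided, but the behavior described in the keynote amounts to a router: local first, Private Cloud Compute when the task needs more, and explicit permission before anything goes to ChatGPT. A hypothetical sketch, with the task names and the rule itself being my assumptions:

```python
# Tasks small enough for an ~8B on-device model (assumed list, not Apple's).
LOCAL_TASKS = {"summarize", "rewrite", "notification_priority", "genmoji"}

def route(task, user_approved_external=False):
    if task in LOCAL_TASKS:
        return "on-device model"          # data never leaves the phone
    if user_approved_external:
        return "external model"           # e.g. ChatGPT, after the user says yes
    return "ask user for permission"      # nothing sent out silently

print(route("summarize"))                    # on-device model
print(route("world_knowledge_qa"))           # ask user for permission
print(route("world_knowledge_qa", True))     # external model
```

The point of the sketch is just that the user never has to make this decision per request; the system makes it, which is why the routing details matter so much for privacy.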

play12:20

But again, this type of stuff didn't even show up in the presentation, because it doesn't matter at the end of the day: you're going to hit a button and it's going to summarize; you're going to hit another one and it's going to do something more advanced; you're not going to know if it's going to the cloud or not. I like going into that sometimes so you can understand how these things work under the hood, in order for you to get the most out of all of these tools that are coming our way. Oh, and if you're enjoying this video, don't forget to hit the like button; it really does help out the channel. Okay, so now let's bring it

play12:42

all together and talk about Siri, because that's exactly what Siri does: it brings it all together. You're going to be able to use a voice interface; by the way, no new voices. A lot of people expected something like the GPT-4o announcement; no, same old Siri voices, which are decent, but the capabilities changed a lot. First of all, the new Siri has a look: whenever Siri is working, you can see this little pink-purplish glow to indicate that you're using Apple Intelligence. Gosh, "Apple Intelligence": people are barely getting used to the term "artificial intelligence" and now they're changing the definition, or what? Anyway, this is the new logo for Siri, and everything we talked about will be accessible through it, plus so much more, because Siri already integrates with actions; they're actually called Shortcuts on an iPhone, you might be familiar with them. These are automations that can happen on your phone, and they are already there today.

play13:26

It's just not easy to access them with the current Siri, because it's pretty clunky; you've got to be super specific. Now Siri is going to be powered by LLMs, meaning you can make mistakes as you speak, you can speak at different paces, you can leave out certain words; it's going to understand the context of the sentence, and you don't have to get every single thing right. So something like "uh, Siri, set an alarm for, um, oh wait, no, set a timer for 10 minutes, actually make that five" is going to be a command that it understands, whereas as of now Siri would refuse to collaborate with you, and that's the reason why I have her turned off on my phone right now: it's just not good enough, except for setting timers maybe. And you can start making these

play14:02

requests like you would to a human, which just didn't work up until now. If you say "show the files June sent me last week", it needs to understand who June is in your contacts, which June you may be referring to, and it needs to know about all the files and email communication from last week. Now that it has all this access, and not just that, all this understanding of the context, it is able to perform actions like this for you.
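That "actually make that five" example implies one specific kind of robustness: when the user self-corrects, the last value stated has to win. As a toy illustration of just that behavior (a real assistant would resolve this with the language model itself, not a regex; the number-word list here is a made-up subset):

```python
import re

# Tiny assumed vocabulary of spoken number words, for illustration only.
NUMBER_WORDS = {"one": 1, "two": 2, "three": 3, "four": 4, "five": 5, "ten": 10}

def resolve_timer_minutes(utterance):
    """Collect every duration mentioned and keep the last one: the correction wins."""
    mentions = []
    for token in re.findall(r"[a-z]+|\d+", utterance.lower()):
        if token.isdigit():
            mentions.append(int(token))
        elif token in NUMBER_WORDS:
            mentions.append(NUMBER_WORDS[token])
    return mentions[-1] if mentions else None

print(resolve_timer_minutes(
    "set a timer for 10 minutes actually make that five"))  # 5
```

The old Siri would choke on the hesitation and the correction; an LLM-backed parser handles both because it models the whole utterance, not a fixed command grammar.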

play14:22

Now, yes, this is a very first look at this exciting agentic feature, where our devices perform some of the work for us and don't just assist us in performing the work. Oh, and if you thought that Apple already has enough data on you: well, Siri is also looking at your screen. So when you're using her, she will be able to see what you're doing right now and use that context, recognizing the images, the people, and the context of that within your day (because it also sees your calendar), and it will be able to consider all that to make better recommendations or take better actions,

play14:50

right? Now, we've seen something like this during OpenAI's GPT-4o announcement; if you're not up to speed on that, you definitely have to catch up, because those capabilities are even more advanced than what we see here. But you're going to have them in here too: you can just install the ChatGPT app on your iPad and use it as part of your workflow, and then the even more advanced voice assistant, which is able to pick up on tonality and assist you with the multimodal model, is going to integrate in here as well. Matter of fact, you might not even need an external app, because they announced a collaboration with OpenAI where ChatGPT will be seamlessly integrated into iPhones, Macs, iPads, all of it. So look at that: there's no icon

play15:24

of it so look at that there's no icon

play15:25

here when I open my cat GPT app there's

play15:27

an icon here on top but they're showing

play15:29

it's natively integrated into the OS

play15:31

making Siri even better because all the

play15:33

stuff that we talked about here you

play15:35

might have caught the fact that there's

play15:36

no Advance features like data analysis

play15:38

or writing or accessing all of the deep

play15:41

knowledge that GPT 40 has these local

play15:43

models they're not going to excel at

play15:45

writing backstories or simulating

play15:47

conversations between some of the grades

play15:49

in history this is stuff that gbd4 is

play15:51

good at and now it's going to integrate

play15:53

into everything just because opening

play15:55

eyes so far ahead Apple just had to sort

play15:56

of partner with them and I'm really

play15:58

looking forward to that because all of

play15:59

their Innovation is going to go right

play16:01

back into my devices today as a consumer

play16:04

that's great, if they get the privacy piece right, and they do seem to be on the right track there. And one thing, as a ChatGPT power user (I think that's fair to say; I mean, it's sort of all I do, and then I teach you guys what I find along the way on this channel): as a power user, I'm really excited that all the advanced features are actually going to be coming to this integration too. So if you have a paid plan, when the voice assistant rolls out it's going to be natively integrated, and my guess is that the chat histories will just carry over from my Mac mini to my phone to my MacBook to my iPad.

play16:33

And yeah, if you didn't know, I'm a heavy Apple user; I really like the convenience, I do a lot of creative tasks, and I just like the ease of use. With that being said, for most of my life I always had a PC on the side for specific applications, and now I'm looking at it to run some of these more advanced large language models locally with RAG, so I really have full control. Anyway, these are the Apple innovations, but there's one more thing we need to point out, and that is that all of this is going to be available to developers. I

play16:55

mean, heck, this was the developers' conference. It makes sense in hindsight, but I was still sort of surprised that they opened all of this up to third-party developers, meaning you can enhance your own apps with these integrations; there's a whole developer kit they call SiriKit, just so you can integrate that deeply into your own applications. So as a consumer, this means that by Fall 2024 all of our Apple devices are going to be running large language models and image generation models, with all these little quality-of-life improvements across most relevant apps in the App Store. You're going to be able to use all of this for free, if you have one of the compatible devices, that is. And that's it:

play17:31

a summary of the event that announced AI features for the largest number of devices, and therefore people, in human history. So yeah, it's official: AI is going mainstream, and they even have this tagline saying "AI for the rest of us". In other words, this is AI for all the people who are not watching this type of channel, because we do like to go deeper, we do like to do custom prompts, and we look at all the bleeding-edge innovations. But now they're all being natively integrated into the devices that you might already have. That doesn't mean I'm

play17:56

stopping with the in-depth coverage and the tutorials on how to get the most out of this technology; matter of fact, it means the exact opposite. Once we have this, I'll show you how to get the most out of it and how to create the custom shortcuts. We're always looking at how to get the most out of ChatGPT and similar applications, so subscribe for more content like this if you want to stay on top of this technological revolution happening right in front of us. And if you're not sure where to begin learning more about all this, we have a newsletter that comes with a massive template you get for free on sign-up; that newsletter and the template are my best attempt at helping you with your first steps and staying up to date with all this madness. All right, that's it for today. Have a good one.
