Apple Intelligence EXPLAINED: No iPhone 16 Needed!

Arthur Winer
27 Sept 2024 · 16:49

Summary

TL;DR: The video explores Apple's new AI features, focusing on how a hybrid AI system balances on-device processing and cloud computing for privacy and efficiency. The new capabilities include a smarter Siri, notification summarization, and on-device text rewriting. It explains how these features improve the user experience without compromising security, and discusses Apple's approach to handling AI tasks. The video also suggests alternatives like ChatGPT and Google Lens for those who want similar features on older iPhones, concluding that while the upgrade isn't necessary, Apple's integration delivers a more seamless experience.

Takeaways

  • 📱 Apple's upcoming iPhone models are set to feature advanced AI capabilities, but the specifics of how they operate have been somewhat unclear.
  • 🤖 Apple uses a hybrid AI system that balances tasks between on-device processing and cloud computing to optimize performance and security.
  • 🔒 Privacy is a key focus, with Apple's AI models designed to handle complex tasks locally, reducing the need to send data to external servers.
  • 📊 On-device AI features like notification summarization and emoji generation are powered by lightweight models that run directly on the iPhone.
  • 📝 The standout feature, the on-device writing tools, uses a dedicated model to proofread and rewrite text in different styles, limited by the iPhone's RAM and processing power.
  • 🌐 Third-party apps like ChatPlayground offer additional AI capabilities, such as an AI browser co-pilot that simplifies information gathering and text generation.
  • 📝 Apple provides AI with clear, detailed instructions to guide its behavior, aiming to keep it on track and prevent it from 'hallucinating' or making up factual information.
  • 💡 Apple's on-device intelligence relies on a model called OpenELM, which is small but efficient, requiring far less RAM than larger models like GPT-4.
  • 🛠️ Adapters and quantization techniques allow Apple to further optimize AI models for specific tasks, reducing memory usage and improving speed.
  • ☁️ Private Cloud Compute (PCC) is Apple's cloud-based AI that steps in when on-device models can't handle a task, with a focus on privacy and security.
  • 🚀 Siri has been significantly upgraded to access on-device data, apps, and personal contacts, providing more integrated and personalized assistance.
  • 📸 Visual intelligence, a feature of the new iPhone, allows users to point their camera at objects to get instant insights, similar to Google Lens but with deeper system integration.
  • 🖼️ Image Playground is an on-device image generation tool that can create images from text prompts or suggestions based on personal context, although its styles are currently limited.
  • 🧼 The 'clean up' feature in Photos, which removes unwanted objects from images, can be replicated using third-party apps like Google Photos, negating the need for a new iPhone upgrade.

Q & A

  • What is Apple's approach to AI processing on devices?

    -Apple uses a hybrid AI system where simple tasks are handled on the device, and more complex tasks are processed by Apple's Private Cloud Compute, which is designed to keep data secure even when a request has to leave the device.
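As a rough illustration of how such a router might decide (the task names and token threshold below are invented for the sketch; Apple has not published its orchestration logic):

```python
# Illustrative sketch only -- not Apple's actual API. Simple, known-light
# tasks stay on device; anything else escalates to the private cloud.
ON_DEVICE_TASKS = {"summarize_notification", "generate_emoji", "proofread"}

def route_task(task: str, estimated_tokens: int,
               max_on_device_tokens: int = 2048) -> str:
    """Return where a request should run: 'on_device' or 'private_cloud'."""
    if task in ON_DEVICE_TASKS and estimated_tokens <= max_on_device_tokens:
        return "on_device"      # lightweight local model handles it
    return "private_cloud"      # too big or unknown: send to cloud compute
```

The point of the sketch is simply that the routing decision happens before any data leaves the phone.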

  • How does Apple's notification summarizer work?

    -The notification summarizer uses a lightweight AI model to analyze all notifications and sorts them by urgency or priority, providing a summary of important notifications without needing cloud processing.
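As a toy illustration of priority sorting (Apple's summarizer is a learned language model, not a keyword rule; the scores below are invented):

```python
# Toy priority scorer: rank notifications by made-up urgency keywords.
URGENT_WORDS = {"urgent": 3, "asap": 3, "now": 2, "reminder": 1}

def priority(notification: str) -> int:
    """Score a notification by summing matched keyword weights."""
    text = notification.lower()
    return sum(score for word, score in URGENT_WORDS.items() if word in text)

def summarize(notifications: list[str], top_n: int = 3) -> list[str]:
    """Return the top_n most urgent notifications, most urgent first."""
    return sorted(notifications, key=priority, reverse=True)[:top_n]
```

A real on-device model would classify urgency from meaning, not keywords, but the shape of the pipeline (score, sort, surface the top few) is the same.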

  • What is the role of 'adapters' in Apple's AI system?

    -Adapters are small, task-specific tweaks that can be added to the base AI model. They allow the model to switch between different tasks efficiently without needing separate models for each job.
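The adapter idea can be sketched in plain NumPy: one frozen base weight matrix shared by every task, plus a small low-rank update per task. Dimensions and values here are toy assumptions, not Apple's model sizes:

```python
import numpy as np

d, r = 16, 2                      # toy model width and adapter rank
rng = np.random.default_rng(0)
W = rng.standard_normal((d, d))   # frozen base weights, shared by all tasks

def make_adapter(rank: int = r) -> tuple[np.ndarray, np.ndarray]:
    """A task-specific adapter: only 2*d*rank parameters instead of d*d."""
    return (rng.standard_normal((d, rank)) * 0.01,
            rng.standard_normal((rank, d)) * 0.01)

def forward(x: np.ndarray, adapter: tuple[np.ndarray, np.ndarray]) -> np.ndarray:
    """Base behavior plus the low-rank task-specific tweak A @ B."""
    A, B = adapter
    return x @ (W + A @ B)

proofread_adapter = make_adapter()   # "switching hats" = swapping adapters
summarize_adapter = make_adapter()
```

Swapping `proofread_adapter` for `summarize_adapter` changes the model's behavior while the large matrix `W` stays in memory untouched, which is the memory saving the answer describes.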

  • How does Apple's AI system handle personalization?

    -Apple's AI system uses personal context from stored data like messages, contacts, and calendar events to customize responses specifically for the user, ensuring privacy by keeping all data on the device.

  • What is the significance of 'quantization' in Apple's AI models?

    -Quantization is a technique used to shrink AI models by lowering the precision of their parameters, allowing the model to trade off some accuracy for speed and efficiency while reducing memory usage.
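A minimal sketch of what quantization does, assuming a simple 8-bit scheme with a single scale factor (Apple's dynamic scheme is more sophisticated):

```python
import numpy as np

def quantize(weights: np.ndarray) -> tuple[np.ndarray, float]:
    """Map float32 weights onto int8 plus one scale: ~4x less memory."""
    scale = float(np.abs(weights).max()) / 127.0
    return np.round(weights / scale).astype(np.int8), scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights; rounding error is bounded by scale/2."""
    return q.astype(np.float32) * scale

w = np.random.default_rng(1).standard_normal(1024).astype(np.float32)
q, s = quantize(w)
w_approx = dequantize(q, s)
```

The int8 array occupies a quarter of the float32 array's memory, and the reconstruction error is bounded, which is exactly the accuracy-for-efficiency trade the answer describes.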

  • How does Apple's Private Cloud Compute (PCC) work?

    -PCC is Apple's cloud-based AI that kicks in when the on-device model can't handle a task. An orchestration layer on the phone decides whether to keep the task on the device or push it to the cloud, with privacy and security safeguards on what gets sent.

  • What is the new Siri's capability in terms of accessing personal data?

    -The new Siri can access on-device data, apps, and personal contacts, providing more integrated and personalized assistance compared to previous versions.

  • How does Apple's 'Visual Intelligence' feature differ from Google Lens?

    -Apple's Visual Intelligence is integrated into the system and uses a hybrid approach of on-device processing and cloud computing. It syncs with Siri and personal contacts, offering insights into objects, text, or landmarks by pointing the iPhone at them.

  • What is 'Image Playground' and how does it work?

    -Image Playground is an image generation tool that can be accessed from iMessage or as a standalone app. It generates images using text prompts or suggestions based on recent activities and personal context.

  • How can users replicate Apple's AI features on older iPhones?

    -Users can replicate some of Apple's AI features on older iPhones by using standalone apps like ChatGPT's DALL·E for image generation or Google Lens for visual intelligence.

  • What is the main advantage of Apple's approach to AI compared to standalone apps?

    -The main advantage of Apple's approach is the seamless integration of AI features across the entire system, providing a cohesive user experience that standalone apps may not match.

Outlines

00:00

📱 Apple's AI Features and Hybrid System

Apple is introducing new AI features with its upcoming devices, focusing on a hybrid AI system that balances cloud computing with on-device processing to enhance privacy and efficiency. The company has been teasing features like notification summarization and emoji generation, which run on lightweight models directly on the device. For more complex tasks, Apple's private compute is used. This approach aims to keep user data secure without sacrificing functionality. The video will explore whether these AI features are worth upgrading to a new iPhone and how they can be utilized on older models.

05:01

🤖 On-Device AI Models and Efficiency

Apple's on-device intelligence is centered around a model called OpenELM, which is more efficient than larger models due to its smaller size and the use of adapters for specific tasks. This approach conserves memory and processing power. Apple also employs quantization to further reduce the model's memory requirements. The video discusses how Apple's AI system works, including the use of private cloud computing for tasks that exceed on-device capabilities, ensuring data privacy and security through encrypted connections and minimal data sharing.

10:03

🔍 Siri's Enhanced Capabilities and Integration

Siri has been significantly upgraded to leverage on-device data and personal contacts, providing real-time answers and deeper system integration. Apple's approach to AI includes clear instructions to guide AI behavior and prevent 'hallucinations,' or the creation of false information. The video highlights how Siri's new capabilities can make the iPhone upgrade worthwhile for heavy users, while also suggesting workarounds for those who wish to replicate similar functionality on older iPhones.

15:08

🖼️ Image Generation and Visual Intelligence

The video discusses Apple's 'Image Playground' feature, which allows for image generation using text prompts or suggestions based on personal context. While this feature is fun, it is not groundbreaking and can be replicated using other apps like ChatGPT's DALL·E. Additionally, 'Visual Intelligence' is Apple's version of Google Lens, providing instant insights into objects, text, or landmarks by combining on-device processing with cloud computing. The video suggests that for most users, free apps can offer similar functionality without the need to upgrade to the latest iPhone.

📸 Photo Cleanup and Weighing Upgrade Options

The final paragraph discusses the 'cleanup' feature in Photos, which allows for the easy removal of unwanted objects from images. This feature, however, will not be available until October. The video suggests that there is no need to rush to purchase a new iPhone for this feature, as similar functionality can be found in existing apps. It concludes by encouraging viewers to consider whether the new AI features are worth the upgrade cost, especially when standalone apps can provide similar experiences.

Keywords

💡Apple intelligence

Apple intelligence refers to the suite of AI features that Apple is integrating into its devices. In the context of the video, it encompasses a range of functionalities from on-device processing to cloud-based AI services. The script mentions Apple's approach to AI as a hybrid system that balances local device processing with cloud computing to optimize performance and security.

💡Hybrid AI system

A hybrid AI system is a combination of on-device and cloud-based AI processing. The video explains how Apple uses this approach for efficiency and data security. Simple tasks are handled locally on the phone, while more complex tasks are offloaded to Apple's private compute for heavy lifting, ensuring that user data remains secure without being sent to external servers.

💡On-device processing

On-device processing is a method where AI tasks are performed directly on the user's device, such as the iPhone. The video gives examples like notification summarization and emoji generation, which are handled by lightweight models running locally without the need for cloud intervention.

💡Private compute

Private compute is Apple's term for the cloud-based AI processing that complements on-device tasks. The video script explains that for more complex AI operations, Apple's private compute is engaged to ensure robust performance without compromising user data privacy.

💡Efficient models

Efficient models in the video refer to AI algorithms designed to perform tasks using minimal system resources. Apple's use of efficient models allows for features like notification summarization and audio recording summarization to run directly on the device without significant strain on the phone's hardware.

💡ChatPlayground

ChatPlayground is mentioned as a standalone app that uses AI models to assist with tasks like web browsing and information gathering. The video highlights its utility in making life easier by leveraging AI to process and summarize web content, offering a glimpse into how AI can enhance productivity.

💡Quantization

Quantization in the context of the video is a technique used to reduce the size of AI models by lowering the precision of their parameters. Apple applies quantization to make its on-device AI models more efficient, trading off some accuracy for significant gains in speed and reduced memory usage.

💡Semantic index

Semantic index is a system that organizes a user's data, such as messages, photos, and calendar events, to enable AI like Siri to provide more contextually relevant responses. The video suggests that this feature enhances Siri's capabilities, allowing it to handle more complex requests by accessing personal data stored on the device.
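A toy sketch of the idea: index personal items so a query can pull in relevant context. Real semantic indexes use learned embeddings; this illustrative version just scores word overlap:

```python
# Toy semantic index: store each item with its word set, retrieve by overlap.
# Real systems embed text into vectors; overlap stands in for similarity here.

def build_index(items: list[str]) -> list[tuple[set[str], str]]:
    """Index each personal item by its lowercase word set."""
    return [(set(item.lower().split()), item) for item in items]

def lookup(index: list[tuple[set[str], str]], query: str,
           top_n: int = 1) -> list[str]:
    """Return the top_n items sharing the most words with the query."""
    q = set(query.lower().split())
    ranked = sorted(index, key=lambda entry: len(entry[0] & q), reverse=True)
    return [item for _, item in ranked[:top_n]]
```

With flight details indexed from a message, a question about that flight retrieves the right context, which is the kind of lookup the "when is Dad landing" example relies on.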

💡Visual intelligence

Visual intelligence, as discussed in the video, is Apple's version of image recognition technology, similar to Google Lens. It allows users to point their iPhone at objects, text, or landmarks to receive instant information. The video positions this as a selling point for the iPhone, showcasing Apple's integration of AI into everyday tasks.

💡Adapters

Adapters in the video are described as task-specific tweaks to the base AI model, allowing it to handle different tasks efficiently. Apple uses a technique called low-rank adaptation (LoRA) to load these adapters only when needed, which helps conserve memory and processing power on the iPhone.

💡Siri

Siri, Apple's voice assistant, is highlighted in the video as being significantly enhanced with new AI capabilities. It now has deeper integration with on-device data and apps, allowing for more personalized and complex interactions. The video suggests that these improvements could be a compelling reason for users to upgrade their iPhones.

Highlights

Apple's upcoming AI features are designed to be both powerful and secure.

Apple uses a hybrid AI system combining on-device processing and cloud computing.

Lightweight AI models run directly on the iPhone for tasks like notification summarizing.

For more complex tasks, Apple's private compute is used to ensure data security.

The new iPhone's AI features include on-device text summarization and style rewriting.

Apple's AI models are limited by the iPhone's RAM and processing power.

ChatPlayground is an AI browser co-pilot that uses AI models to simplify tasks.

Apple's AI system is designed to follow clear, detailed instructions to guide its behavior.

Apple is cautious about AI 'hallucinations' and instructs its AI not to make up factual information.

Apple's on-device intelligence is powered by a model called OpenELM.

Apple uses adapters for task-specific tweaks to the base AI model, improving efficiency.

Quantization is used to shrink the AI model for speed and efficiency.

Private Cloud Compute (PCC) is Apple's cloud-based AI for tasks that exceed on-device capabilities.

PCC ensures privacy and security by only sending necessary data to the cloud and deleting it after use.

Apple's transparency allows independent audits of PCC to verify its privacy and security claims.

The new Siri leverages on-device data and personal contacts for more personalized assistance.

Visual intelligence on the iPhone 16 offers instant insights into objects, text, and landmarks.

Apple's image playground allows for image generation using text prompts or activity suggestions.

Most of Apple's new AI features can be replicated on older iPhones with standalone apps.

The real value of Apple's AI features lies in their seamless integration across the system.

Transcripts

play00:00

so Apple intelligence is dropping soon

play00:02

and the features they are teasing look

play00:04

super cool you might even be thinking do

play00:07

I need to upgrade just for this but

play00:09

Apple's been kind of vague on how all

play00:10

this actually works so I'm going to break it

play00:12

down for you and by the end of this

play00:14

video we'll figure out whether these AI features

play00:16

alone are worth dropping cash on a new

play00:19

iPhone 16 16 Pro 15 Pro whatever oh and

play00:23

also how you can use these AI features

play00:25

and older iPhone

play00:28

models so Apple is using a hybrid AI system

play00:32

usually when you are running AI stuff on

play00:34

your phone you've got two options either

play00:36

it all happens in the cloud where your

play00:38

data gets sent to be processed or it's

play00:41

done directly on your device Apple's

play00:43

found a middle ground for simple tasks

play00:45

it's handled on your phone but for heavy

play00:47

lifting that's where Apple's private

play00:49

compute kicks in to handle the more complex

play00:51

work without shipping all your info to

play00:53

some random server it's designed to keep

play00:56

your data secure without losing any of

play00:58

the cool features

play01:01

most of the cool AI features they showed

play01:04

off at the keynote yeah those are

play01:06

running on small efficient models right on

play01:09

your device take the notification

play01:10

summarizer for example it uses a

play01:12

lightweight model to analyze all your

play01:14

notifications and sort them by urgency

play01:17

or priority then gives you a summary of

play01:19

what's important no need for the cloud

play01:21

here your phone handles it all by itself

play01:23

and those Genmojis you don't need the

play01:25

cloud to generate funny little emojis

play01:27

Apple's AI models can run directly on

play01:30

the phone without breaking a sweat same

play01:32

deal with summarizing audio recordings

play01:34

but the real standout is the on device

play01:37

writing tools from what I've dug up there

play01:39

is a dedicated model for that which can

play01:41

proofread your text and even rewrite it in

play01:44

different styles but since this is all

play01:46

running locally we're limited by the

play01:48

iPhone's RAM and processing power that's

play01:51

probably why there aren't too many

play01:52

options for writing styles the iPhone

play01:54

just doesn't have the muscle to run more

play01:56

sophisticated models without slowing

play01:58

down you know even though we've got AI

play02:01

baked right into our devices these days

play02:03

there's still some Standalone apps out

play02:05

there that actually bring something new

play02:07

to the table one of those is ChatPlayground

play02:09

an AI browser co-pilot that's all

play02:11

about using the latest AI models and

play02:13

custom prompt workflows to make your life

play02:16

easier now we've all been there trying

play02:17

to gather a bunch of info for a new

play02:19

project video or whatever and it's all

play02:21

over the place ChatPlayground's web

play02:23

co-pilot makes that process way easier I

play02:26

just open it up type in the request and

play02:28

pick from different AI models to get the

play02:30

best answers I can also drag an image

play02:32

from a web page right into the chat to

play02:35

get it described and explained which is

play02:37

super cool when learning something new

play02:39

and if I need to generate text based on

play02:41

the page content I just click use page

play02:43

content and start prompting what I want

play02:46

that's just the web side of things there's

play02:47

also learning co-pilot which makes

play02:50

finding info even faster instead of

play02:52

spending hours researching Trends or

play02:54

searching through YouTube videos I can

play02:56

just ask ChatPlayground and it will

play02:58

sort it out features like flashcards AI

play03:01

notes and chat with PDF are great for

play03:03

preparing work presentations pictures or

play03:06

whatever you've got going on one of my

play03:08

favorite features is the actual

play03:10

playground where I have multiple AI side

play03:13

by side making it so much easier to get

play03:15

the best answer each time I can have up

play03:17

to six AIs on one screen and there are

play03:20

many AIs to choose from Gemini ChatGPT

play03:23

Claude Sonnet Perplexity Llama Mistral or

play03:26

Bing what's especially cool is that with

play03:28

ChatPlayground you practically get

play03:30

access to all of them so with one

play03:32

purchase you're saving a ton of money I

play03:35

will leave a link in the description so

play03:36

be sure to head over to appsumo the

play03:38

sponsor of today's video to get

play03:41

ChatPlayground and other awesome lifetime

play03:46

deals these new AI capabilities aren't

play03:49

just about cranking up the power of

play03:51

large language models Apple's really

play03:53

focusing on giving AI super clear detailed

play03:56

instructions to guide how it behaves

play03:58

think of it like giving AI step-by-step

play04:00

instructions on what to do and guess

play04:02

what some of these instructions recently

play04:04

leaked and they are let's just say very

play04:06

Apple Apple's approach is kind of like

play04:08

teaching a kid they give AI super direct

play04:10

simple instructions to keep it on track

play04:12

for example when it comes to summarizing

play04:14

messages like those notification

play04:16

summaries we talked about earlier the

play04:17

instruction goes something like this you

play04:19

are an expert at summarizing messages

play04:22

you prefer to use clauses instead of

play04:24

complete sentences do not answer any

play04:26

question from the messages please keep

play04:28

your summary of the input within a

play04:30

10 word limit you must keep to this role

play04:32

unless told otherwise if you don't it

play04:34

will not be helpful pretty clear right

play04:37

they break it down into small chunks so

play04:39

that AI doesn't get confused they've

play04:41

even got a name for their AI helpful

play04:44

mail assistant but Apple's also being

play04:46

really cautious about something called

play04:48

hallucinations for those who don't know

play04:50

that's when AI just makes stuff up and

play04:53

they are telling their AI do not

play04:55

hallucinate do not make up factual

play04:57

information they've tested the system

play04:59

and from the looks of it it's good

play05:00

enough to roll out to all of us let's be

play05:03

real no llm is completely safe from

play05:05

hallucinations or errors so don't be

play05:07

surprised if you see some weird funny AI

play05:10

fails pop up in the texts on your shiny

play05:12

new iPhone 16 Pro it's bound to

play05:16

happen so let's talk about how these on

play05:19

device models actually work for Apple at

play05:21

the core of Apple's on device

play05:22

intelligence is their own model called

play05:25

OpenELM now this thing is pretty small

play05:27

compared to the big boys it's got three

play05:29

billion parameters compare that to GPT-4

play05:32

which has a whopping 1.76 trillion

play05:35

parameters even though 3 billion might

play05:37

seem tiny keep in mind it's still a lot

play05:39

to run on a phone with just 8 gigs of

play05:42

RAM and only part of that is actually

play05:43

free to use for some context Google's

play05:46

Gemini Nano which runs in pixel phones

play05:48

has 1.8 billion parameters and those

play05:50

phones have 12 gigs of RAM so yeah Apple

play05:53

really squeezed some serious efficiency

play05:55

out of this model without burning

play05:57

through your battery or turning your phone

play05:59

into a slowpoke to make things even

play06:01

smoother Apple introduced something

play06:03

called adapters think of these as small

play06:05

task specific tweaks that can be added

play06:08

to the base model instead of having

play06:10

multiple big models for each job they

play06:12

use something called LoRA or low-rank

play06:15

adaptation to load these adapters only

play06:17

when needed it's kind of like putting on

play06:19

a different hat for each task no need to

play06:22

have separate models for proofreading an

play06:24

email summarizing the voice memo or

play06:26

whatever so the model just switches

play06:28

adapters on the fly for example when

play06:30

you're having your email proofread Apple

play06:32

intelligence loads the proofreading

play06:34

adapter then when you switch to

play06:36

summarizing the voice memo it ditches

play06:38

the previous adapter and loads the one

play06:40

for summarization this keeps things

play06:42

super efficient cutting down on how

play06:44

much memory and power the AI needs to

play06:47

run directly on your iPhone Apple's also

play06:49

using this adapter trick for hyper

play06:51

personalization or what they call

play06:54

personal context since all your personal

play06:56

data messages contacts calendar events

play06:58

is stored locally the AI can use it to

play07:01

customize its responses specifically for

play07:03

you and because it's all on device your

play07:06

privacy stays intact but Apple didn't

play07:08

stop there they came up with something

play07:10

called quantization basically it's a way

play07:12

to shrink the model even more by

play07:15

lowering the Precision of its parameters

play07:17

in plain English it means that the model

play07:19

can trade off a bit of accuracy for

play07:22

Speed and efficiency reducing the memory

play07:24

it needs by up to four times and while

play07:26

that might sound like it would hurt

play07:28

performance Apple figured out how

play07:30

to adjust the quantization dynamically

play07:32

dialing up or down the precision based

play07:34

on the task so for simpler jobs the

play07:37

model doesn't need to be as precise and

play07:39

for more complex stuff it bumps the

play07:41

accuracy back

play07:42

[Music]

play07:45

up another key piece of Apple

play07:48

intelligence is Private Cloud Compute PCC

play07:51

this is Apple's cloud-based AI that

play07:53

kicks in when the on device model just

play07:55

can't handle something just think of it

play07:57

like a backup when the task is too big

play08:00

like asking Siri for some super detailed

play08:02

info or trying to generate an image in

play08:04

Image Playground your iPhone taps into

play08:06

PCC Apple uses this orchestration layer

play08:09

to decide whether to keep the task on

play08:11

your device or push it to the cloud

play08:13

what's cool about PCC is how it handles

play08:16

privacy and Security First off every

play08:18

time your iPhone reaches out to PCC it

play08:21

sets up an encrypted connection so your

play08:23

data is safe during transmission but it

play08:25

gets better your phone decides what info

play08:27

to share with the cloud and only sends

play08:30

the bare minimum nothing personal

play08:32

nothing sensitive just enough for PCC to

play08:34

do its job plus once the task is done

play08:37

whatever data was used is instantly

play08:40

deleted from Apple's servers nothing

play08:42

sticks around hopefully and those PCC

play08:45

servers are running on Apple silicon

play08:47

chips giving them added layers of

play08:49

security like secure enclave and secure

play08:52

boot just like your iPhone that makes

play08:53

the entire process even more locked down

play08:56

what really sets PCC apart is

play08:59

Apple's transparency they've made it so

play09:01

that independent parties can audit the

play09:03

system to verify it's working exactly how

play09:06

Apple says it is so if Apple is claiming

play09:08

that no user data is being stored or

play09:10

mishandled there's a way to actually

play09:12

check that pretty rare for a tech

play09:14

company to invite this kind of scrutiny

play09:16

but Apple's betting on the fact that

play09:18

their system can back up their

play09:23

promises all these advancements and

play09:25

clever techniques are what make apple

play09:27

intelligence possible but hands down the

play09:30

most exciting part is the new Siri let's

play09:32

be real Siri used to be a joke setting

play09:34

timers and maybe turning on the flashlight

play09:37

and that's about it but now Siri leveled

play09:40

up it can access your on-device data apps

play09:43

and personal contacts in ways it never

play09:45

could before this is thanks to something

play09:48

called semantic index basically it

play09:50

organizes your data messages photos

play09:53

calendar events so Siri can actually use

play09:55

it to handle requests like now Siri can

play09:58

combine info from your messages and

play10:00

calendar with web searches giving you

play10:03

Real Time answers like when is Dad

play10:05

landing at the airport and don't worry

play10:07

all that personal data stays on your

play10:10

device it's not getting sent to the

play10:12

cloud hopefully Siri is really the

play10:14

showcase for Apple's whole on-device

play10:16

versus Cloud balance simple stuff done

play10:19

locally bigger more complex tasks that's

play10:21

where PCC comes in and for super complex

play10:24

requests Siri can actually tap into

play10:27

ChatGPT of course this only happens with

play10:29

your permission and mainly when Siri

play10:31

needs to pull in World Knowledge that's

play10:34

outside your personal context this makes

play10:36

Siri pretty comparable to Google's

play10:38

Gemini which dropped with pixel 9 sure

play10:41

Gemini can set alarms and hit up the web

play10:43

too but it doesn't have the same deep

play10:45

integration into the system and apps

play10:48

like Siri Apple really turn Siri into

play10:51

digital assistant that's way more than

play10:54

just a voice it's becoming a personal AI

play10:56

that simplifies life in a way we haven't

play10:58

seen before now I will be honest I'd

play11:01

never really used Siri for more than

play11:03

alarms or checking the weather but I

play11:05

know plenty of people who use it all the

play11:07

time and for them all these new

play11:09

capabilities could make the iPhone

play11:10

upgrade totally worth it but let's say

play11:12

you don't want to Shell out for a new

play11:14

iPhone just for the new Siri can you

play11:16

replicate it well kind of the best

play11:18

workaround I found is combining Siri

play11:21

with ChatGPT you can use Siri for basic

play11:23

tasks like alarms and weather then

play11:26

create a shortcut in the shortcuts app

play11:28

to start a voice conversation with

play11:30

ChatGPT you could even assign that shortcut

play11:32

to double tap on the back glass that

play11:34

gives you a similar experience minus the

play11:37

Deep system integration and personal

play11:39

context not perfect but it works now

play11:41

while the new Siri is tough to

play11:43

replicate without a supported iPhone

play11:45

what about other Apple intelligence

play11:46

features you can get those in older

play11:48

iPhones like the XR or XS so it's not

play11:51

all locked behind the latest

play11:54

Hardware replicating the image

play11:56

playground feature is actually pretty

play11:58

simple even though it won't be a

play12:00

complete one-to-one experience so what's

play12:02

image playground it's this built-in

play12:05

image generation tool and not only is it

play12:07

a standalone app but you can also fire

play12:09

it up straight from iMessage which is a

play12:11

nice touch with image playground you can

play12:13

generate images using text prompts or by

play12:15

choosing from suggestions based on your

play12:17

recent activities like conversations and

play12:19

messages or your Safari searches yeah it

play12:21

Taps into that personal context again so

play12:24

if you've been texting your friends

play12:25

about a hiking trip it might suggest

play12:27

prompts related to Nature mountains

play12:29

forests that kind of stuff what's cool

play12:31

is that it tries to handle the image

play12:34

generation locally on your device but to

play12:36

keep things running smoothly it only

play12:38

supports three Styles sketch

play12:40

illustration and animation honestly all

play12:43

three look pretty basic nothing

play12:44

groundbreaking I'm guessing we'll get

play12:46

more styles with iOS 19 the image

play12:48

generation itself isn't anything we

play12:50

haven't seen before it's pretty much

play12:52

like Midjourney or DALL·E it's not that

play12:54

hard to replicate outside of Apple's

play12:56

ecosystem if you want a similar

play12:59

experience just download the ChatGPT

play13:01

app on your phone or Mac and use DALL·E

play13:04

there to generate your images it's quick

play13:06

simple and gets the job done and if

play13:08

ChatGPT isn't your thing there are tons of

play13:10

other image Generation apps available

play13:13

just beware most of those are pretty

play13:15

aggressive with pay walls and limiting

play13:16

features unless you pay up in my opinion

play13:19

sticking with Chad gbt's dolly is your

play13:21

best bet it's reliable and relatively

play13:23

hassle-free


Another big feature, and honestly a selling point for the iPhone 16, is Visual Intelligence. If you missed the keynote, this is basically the ultimate introvert's tool: why ask someone about the breed of their dog when you can just snap a pic and let your phone tell you? Yeah, Apple has basically reinvented Google Lens, but with a twist. So what does Visual Intelligence do? You point your iPhone at an object, text, or even a landmark, and boom, you get instant insights. If it's a restaurant, you will see its card in Maps along with the menu and opening hours. If it's something like a bike, you can search for similar ones. But yeah, we've all seen this before, courtesy of Google Lens. The only real differences with Apple's take? Well, first, it's integrated into the system, so it syncs up with Siri and your personal context. And second, it's using a hybrid approach of on-device processing and cloud computing to handle tasks. Now, if you want basically the same thing without upgrading, just install the Google app on your iPhone, create a shortcut for visual search, and you're set: it will work almost identically, analyzing objects, finding restaurants, all that jazz. The only real downside is that Google Lens won't have the same system-wide integration or access to your personal context, but for most people it will feel 99% the same.
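The hybrid approach described here, running a task on the device when it's small enough and handing it to the cloud when it isn't, can be sketched as a toy dispatcher. Everything below (the memory budget, the per-task cost table, the `route_task` function) is a hypothetical illustration of the general idea, not Apple's actual routing logic:

```python
# Toy sketch of hybrid AI routing: run a task on-device when its estimated
# memory cost fits the phone's budget, otherwise fall back to the cloud.
# All names and numbers are illustrative assumptions, not Apple's values.

DEVICE_RAM_BUDGET_MB = 2000  # hypothetical slice of RAM reserved for AI tasks

# Hypothetical memory estimates for tasks a small on-device model could run
TASK_COST_MB = {
    "summarize_notification": 300,
    "rewrite_text": 450,
    "generate_image": 1500,
    "complex_reasoning": 8000,  # too large for the device in this sketch
}

def route_task(task: str) -> str:
    """Return 'on-device' if the task's estimated cost fits the local
    memory budget, otherwise 'cloud'. Unknown tasks go to the cloud."""
    cost = TASK_COST_MB.get(task)
    if cost is not None and cost <= DEVICE_RAM_BUDGET_MB:
        return "on-device"
    return "cloud"
```

The point of a split like this is exactly what the video highlights: lightweight jobs such as notification summaries never leave the phone, while only the heavy requests get shipped off to a server.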

Alternatively, you can use the ChatGPT app. It has good image recognition built in, and if you stick with one chat, it even remembers what you've looked at. For 99% of users, that's more than enough to get the job done. So yeah, even your iPhone XR can suddenly feel like a mini AI powerhouse with the right tools.

But probably the easiest feature to replicate is Clean Up in Photos. It's super simple: you tap Edit on the picture, grab the eraser from the menu, and boom, just like that, the object you don't want is gone. Clean and easy. But here's the thing: this isn't dropping until October. However, you don't need to wait, or even buy a new iPhone, to do this. If you install the Google Photos app right now, it has a similar feature called Magic Eraser that does the exact same thing. So no need to rush and drop cash on the new iPhone just for that.

If you're someone who's heavy on Siri and thinks a smarter, more AI-driven assistant is going to make your life a lot better, yeah, maybe consider the upgrade. But if you are the kind of person who only asks Siri to set a timer once a week, it's probably not worth the splurge. Same goes for the whole Visual Intelligence thing: if a free app like Google Photos can erase stuff from pictures just as easily, then honestly, is this really a feature to upgrade for? And as for Image Playground, it's fun, but honestly, ChatGPT and other apps can generate way more interesting stuff in a wider range of styles. So yeah, you can easily replace most of these new AI features with standalone apps, but Apple is about the seamless experience. They want all these features to just work together across the whole system, and that's where the real value is. So dropping a few hundred on a new iPhone might not be the worst idea if you're into that kind of integration. But hey, if you're still on the fence, check out my other video where I break down some killer iPhones you can buy instead of the iPhone 16. Trust me, it's worth watching. Thanks for tuning in, and I will see you there.
