Siri is broken - Here's how Apple plan to fix it... (Next-Gen Siri)

Proper Honest Tech
29 Jan 2024 · 16:55

Summary

TL;DR: The video examines the past, present, and future of Apple's Siri voice assistant. It explores how Siri has fallen behind competitors, become unreliable and frustrating for users, and lost top talent. The video speculates that Apple is developing a next-generation version of Siri powered by AI, aimed at making Siri a truly useful digital assistant. There is skepticism that Apple may restrict this improved Siri to only their latest iPhone models, perhaps to boost sales by gating a desirable feature behind new hardware. Overall, the video is hopeful Apple can turn Siri into an incredible product, but uncertain if or when that may happen.

Takeaways

  • 😟 Siri is outdated technology that frustrates users and fails to meet expectations
  • 😠 Internal issues at Apple have stifled Siri's progress over the years
  • 🤔 Siri's core technology of rules and databases prevents real assistant capabilities
  • 😎 New large language models power the latest AI assistants with more advanced abilities
  • 👎 Losing top talent to companies like Google has hurt Apple's voice assistant efforts
  • 💡 Siri should understand users, summarize information, book appointments, and more
  • 📱 Samsung's and other companies' voice assistants showcase more useful features
  • 😕 The presence of old Siri on Apple's new Vision headset is puzzling
  • 🤞 Edge AI computing could enable an advanced Siri while protecting user privacy
  • 😮 A next gen iPhone may launch with a dramatically updated Siri later this year

Q & A

  • How old is the core technology behind Siri?

    -The core technology behind Siri dates back to 2011, making it over 10 years old now.

  • What are some of the main complaints about Siri from users?

    -Common complaints include Siri not understanding users properly, especially those with non-neutral accents, and Siri often just providing web search results rather than answering questions directly.

  • Why has Siri fallen so far behind other voice assistants?

    -Reasons include the older technology at its core compared to more advanced AI like LLMs, Apple's reluctance to lose control over Siri's responses, and top talent leaving Apple to work on AI at other companies.

  • What does the author think Apple should do to improve Siri?

    -The author suggests improvements like better speech recognition, being a true digital assistant by summarizing emails and other information, and having memory to recall facts the user has mentioned previously.

  • How might Apple restrict access to a new version of Siri?

    -The author speculates Apple may limit the new Siri to only their latest iPhone models, rather than making it widely available, either to contain bugs during a smaller rollout or to encourage people to upgrade.

  • What is edge AI and why does it matter for Siri?

    -Edge AI allows AI processing to happen on the device rather than sending data to remote servers. This improves privacy and allows Siri to work offline.

  • When does the author expect Apple to unveil its new Siri capabilities?

    -At Apple's Worldwide Developer Conference (WWDC) in June 2024, along with the reveal of iOS 18.

  • How could a more advanced Siri benefit Apple Watch and HomePod users?

    -With better natural language capabilities, Siri on those devices could become much more useful, responsive and intelligent.

  • Why might restricting new Siri to new iPhones be problematic?

    -It risks annoying users who own recent but now-incompatible iPads, Macs, etc., forcing them to upgrade again after only a short time just to get access to the new Siri.

  • What does the author think is the main thing Apple needs to get right with Siri?

    -Understanding users and interpreting speech correctly, since if Siri gets that part wrong, everything after will also fail.

Outlines

00:00

😀 Siri's Capabilities and Shortcomings

This paragraph provides background on the author's experience creating content about Siri. It notes that Siri is still considered a flagship Apple product but using it feels outdated, like 2011 technology on a 2024 phone. The author considers Siri an enigma - touted by Apple but frustrating to actually use.

05:00

😊 Promoting BetterHelp Mental Health Services

This paragraph promotes BetterHelp, an online therapy service. It notes how the new year can bring stress and pressure, and how talking to a therapist at BetterHelp can provide helpful, unbiased advice. It provides details on how BetterHelp works and notes that viewers can get 10% off their first month.

10:02

😟 Explaining Why Siri Has Gotten So Bad

This paragraph explores reasons why Siri has declined in quality, including its old technology compared with modern AI, Apple's desire for control and fear of embarrassment, and top talent leaving for better opportunities elsewhere. It suggests the most likely reason is that Siri's software is outdated and can't be tweaked to compete with modern assistants.

15:04

😃 Envisioning What Siri Should Be

This paragraph discusses what the author wants in a voice assistant. Key points include understanding users, summarizing important information intelligently, making recommendations based on preferences, and generally behaving more like an actual assistant.

🤔 Speculating on Apple's Plans for Siri in 2024

This final paragraph speculates how Apple will handle Siri in 2024. It suggests Siri 2.0 will be announced at WWDC in June but may only work on new iPhone models. It also discusses the role of on-device AI and hopes Apple doesn't make major upgrades exclusive to push new hardware sales.

Keywords

💡Siri

Siri is Apple's voice assistant technology. The video focuses heavily on Siri, its history, capabilities, limitations, and future. Siri is described as outdated compared to competitors, frustrating for users, and in need of a 'ground-up redesign'. The narrator speculates Apple is working on a next-generation Siri using AI and edge computing.

💡AI

AI, or artificial intelligence, refers to the technology powering modern voice assistants like Siri. The video contrasts Siri's 'database' approach with newer AI techniques like the large language models (LLMs) behind competing assistants and chatbots. The narrator says Apple needs to rebuild Siri using more advanced AI to compete.

💡LLMs

LLMs or large language models are a type of AI that can understand natural language, generate human-like responses, and power conversational assistants. According to the video, LLMs make assistants like Google Assistant more capable than Siri. Apple will likely need to adopt LLMs to rebuild Siri.

💡Edge AI

Edge AI refers to on-device artificial intelligence, where processing happens locally on the user's device. The video suggests Apple may use edge AI to power a new version of Siri for privacy reasons, without needing the cloud.
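
As a rough, unofficial illustration of what on-device processing looks like in practice, the Swift sketch below loads a bundled Core ML model and asks for it to run on the CPU and Neural Engine only, so nothing is sent to a remote server. The model name is a made-up placeholder, and this is not a description of how Apple builds Siri.

```swift
import CoreML

// Minimal sketch of "edge AI" on Apple hardware: load a bundled Core ML model
// and ask for it to run on the CPU / Neural Engine only, so no request ever
// leaves the device. "AssistantModel" is a hypothetical placeholder model.
func loadOnDeviceModel() throws -> MLModel {
    let config = MLModelConfiguration()
    config.computeUnits = .cpuAndNeuralEngine   // keep inference local (iOS 16+)

    guard let url = Bundle.main.url(forResource: "AssistantModel",
                                    withExtension: "mlmodelc") else {
        throw CocoaError(.fileNoSuchFile)
    }
    return try MLModel(contentsOf: url, configuration: config)
}
```

Any predictions made through a model loaded this way work offline and keep the user's data on the phone, which is the privacy benefit the video describes.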

💡Privacy

Privacy is a major concern for Apple. The video suggests a rebuilt Siri using edge AI could alleviate privacy concerns associated with cloud-based assistants that process user data externally.

💡Vision Pro

Apple's Vision Pro headset will apparently launch with the current version of Siri. The video sees this as evidence of Apple's next gen Siri not being ready yet, predicting it will come later in 2024.

💡WWDC

WWDC is Apple's annual developer conference where they announce new software like iOS and new hardware. The narrator predicts the next major version of Siri will be announced at WWDC in June.

💡iPhone 16

The narrator speculates the new Siri could be exclusive to the iPhone 16 range expected in September, to limit availability while Apple perfects it.

💡Digital assistant

The video emphasizes Siri should function as a true digital assistant - understanding the user, completing tasks, being helpful. This is contrasted with the current limited version of Siri.

💡User experience

The poor user experience of the current Siri is a recurring theme. Users are frustrated by limitations, lack of comprehension of accents, and preference for web search over answering questions.

Highlights

Siri is a bit of an enigma, still touted as a flagship Apple product but using it feels like 2011 technology on a 2024 phone.

Common complaints are that Siri often fails to replicate demonstrated functionality and heavily relies on web search instead of answering questions directly.

Within Apple, things are allegedly so bad that employees struggle to see how Siri can be fixed without a ground-up redesign.

Siri is essentially built around a limited database, so when asked something outside its knowledge set, problems arise.

Apple's strict control and past practice of manually scripting Siri responses could discourage a more flexible, ChatGPT-style approach.

The worse Siri has gotten, the more top talent has left for better AI opportunities at other companies.

Siri likely can't be meaningfully improved through tweaks - a wholesale replacement is needed.

The next voice assistant needs to deeply understand users and tasks to become an actual digital assistant.

Competing products like Samsung's Galaxy AI already offer advanced functionality like real-time translation.

It's confusing that Vision Pro runs the outdated Siri despite being Apple's biggest launch in a decade.

A new Siri likely exists but may not have been ready in time for Vision Pro's launch.

I think a completely new Siri will launch at WWDC in June and initially be exclusive to the iPhone 16 Pro.

On-device processing via Apple's Neural Engine chips may power new privacy-focused AI capabilities.

Restricting new Siri risks frustration, but could be a strategy to smoothly roll it out or spur iPhone upgrades.

If executed perfectly, new Siri could make Apple Watch and HomePod Mini the best Apple products ever.

Transcripts

play00:00

I’ve found some web results,  I’ll send them to Tom’s iPhone

play00:03

OK, I’ve found this on the web for “it’s a bit  

play00:06

of an enigma in one way it’s  still touted as a flagship…”

play00:09

This is a video that I’ve wanted  to make for a long time now. As  

play00:12

someone who has created a lot of  content about Siri over the years,  

play00:15

I feel like I've got a pretty good idea of its capabilities, but also of its shortcomings,

play00:19

and the things about it that frustrate  people, of which there are many.

play00:23

Siri is a bit of an enigma to me. In one way,  it’s still touted as a flagship Apple product,  

play00:28

talked about in each iOS release each year,  given new features and capabilities. But yet  

play00:33

using it feels like stepping into the past, it’s  like running 2011 technology on a 2024 phone.

play00:40

Which begs the question, how has  it been able to get this bad? Does  

play00:43

Apple actually acknowledge that there’s  a problem, and are they doing anything to  

play00:46

fix it? Can they even fix it? These are the  questions I’ll try to answer in this video.

play00:52

OK, let’s get into it.

play00:54

To find out how bad things have gotten with  Apple’s voice assistant, as I’ll call it as  

play00:58

much as possible from this point on to avoid  setting your devices off, I’d personally start  

play01:03

by taking a look at some of the comments on the  numerous videos I’ve made about it over the years.

play01:08

A quick glance through the comments pages  makes for pretty dire reading. A common  

play01:12

complaint is that people watch something on  my video, and try to replicate it at home,  

play01:16

only for it not to work for them, for  whatever mysterious reason, possibly  

play01:20

an issue understanding the user or a feature  that’s been localised to only certain regions.

play01:25

But another common complaint is Siri’s reliance  on pointing people to articles on the web,  

play01:30

rather than trying to answer the question.  I almost see this as a bit like a child  

play01:34

asking their parent a question. What the  child wants in that moment is an answer,  

play01:39

not pointing to the local library where they can  research the answer for themselves. When this  

play01:43

happens to people using Siri, they understandably  get frustrated with it, and go and find the answer  

play01:48

some other way. If this happens too many times,  the average person will simply stop using it,  

play01:52

hence why so many people you talk to will probably  tell you that they just don’t bother using it.

play01:57

Things aren’t just restricted to being  bad for consumers though. Within Apple,  

play02:01

things are allegedly so bad that employees have  pretty much given up on it, suggesting that they  

play02:05

struggle to see how Apple can fix their voice  assistant, without going for a ground-up redesign.

play02:10

In an article from The Information last year,  they reported that the team working on Vision  

play02:14

Pro were left totally underwhelmed by the  demonstrations that the Siri Team gave them  

play02:19

on how the voice assistant could control the  headset, even going so far at one point as to  

play02:23

consider building an alternative method  of controlling the device using voice.

play02:28

So how has it gotten so bad?

play02:31

There isn’t really a clear answer to this, but  when you do a bit of research you can kind of  

play02:34

put the pieces of the puzzle together to  come up with some possible explanations.

play02:39

The first is that of the technology itself.  Quite simply, the software running Apple’s voice  

play02:44

assistant is old technology, when compared  with that which you’d find in the LLMs,  

play02:48

the Large Language Models of  today’s AI powered assistants.

play02:52

An LLM is an AI algorithm that  uses deep learning techniques,  

play02:55

along with massive datasets to not only  comprehend what people are asking it,  

play03:00

but also to generate human-like responses.  They’re trained on huge amounts of data,  

play03:04

and can recognise, understand, translate and  even predict text, making conversing with them  

play03:09

really simple, and increasing the likelihood  of them being able to answer your question.

play03:13

Siri on the other hand, whilst still being  a form of AI, is essentially built around a  

play03:18

database. An article in Cult of Mac last year  spoke with former Siri engineer John Burkey,  

play03:24

who explained that Siri is built  around a database of words. Ultimately,  

play03:28

it knows what it knows, and if you’re  asking it something that falls outside  

play03:31

of it’s limited knowledge set, this is  where you’re going to run into problems.

play03:35

It’s January as I’m making this video, and that  means new year, new goals, new challenges. But  

play03:40

with that often comes more pressure to succeed,  along with all the stress we already face  

play03:44

everyday. I know that’s how I feel at the start of  each year, trying to improve upon the last, trying  

play03:48

to grow my business whilst also helping to raise  my young family can be emotionally exhausting and  

play03:54

totally overwhelming at times. I’m pretty good at  talking to my friends when I’m feeling burnt out,  

play03:58

but they’ve all get their stresses too, so I  sometimes feel like I don't have anyone I can  

play04:02

talk to about the challenges I'm facing. Even if  I did, sometimes it's hard to share what's going  

play04:07

on in your life - even with close friends,  who might not always know how to advise us.

play04:12

BetterHelp, who is a paid partner of this  video, connect you with a credentialed  

play04:16

therapist who is trained to listen and provide  helpful, unbiased advice. With BetterHelp,  

play04:20

you can organise therapy sessions through  phone calls, video chats, or even messaging,  

play04:24

depending on your preference and comfort level.  To start, you’ll fill out a questionnaire to  

play04:29

help assess your specific needs. Once you’ve  done that, you’ll be matched with a therapist,  

play04:33

in most cases within 48 hours. You can schedule  your sessions at a convenient time for you,  

play04:38

and if you feel that the therapist you’ve  been matched with isn’t the right fit,  

play04:41

which is common when starting therapy, you can  easily switch to a new one at no additional cost. 

play04:46

We all like to talk about the importance of  getting in the gym or getting out for exercise  

play04:50

every day, so why not give your mind that same  kind of care? Over 4 million people have used  

play04:54

BetterHelp to start living a healthier and happier  life. If you think you might benefit from therapy,  

play05:00

give BetterHelp a try. Click the  link in the description or visit  

play05:03

[betterhelp.com/properhonesttech](http://betterhelp.com/properhonesttech)  

play05:05

to get started. Doing so not only supports  this channel, it also gets you 10% off your  

play05:10

first month, so you can connect with  a therapist and see if it helps you.

play05:14

Apple are also a company infamously devoted  to control, and not the kind of company that  

play05:19

appreciate being made to look foolish. There was  an article here in the UK this week about delivery  

play05:24

company DPD, who use an AI assisted Chatbot  to help people looking for their parcels.  

play05:29

A customer decided to do some testing with  the Chatbot to see what he could make it do,  

play05:33

beyond just asking about his delivery.  He managed to get it to tell him a joke,  

play05:37

but then managed to get it to swear at him, and  even write a poem about how bad DPD are as a  

play05:42

delivery firm. The company blamed it on an update  that went wrong and claim that it’s now fixed,  

play05:47

but when you consider that Apple once admitted  that they’d hired a team of writers to write Siri  

play05:51

responses, rather than leaving it to chance  and letting Siri think up answers itself,  

play05:56

you can understand why a ChatGPT style  approach just wouldn’t work for Apple.

play06:01

And to add to Apple’s problems here,  the worse things have gotten with Siri,  

play06:05

the more this has pushed their top talent to  jump ship, in search of better opportunities  

play06:09

elsewhere. And you can understand why. If you’re  someone who operates at the bleeding edge of AI,  

play06:14

are you going to work for someone like OpenAI, who will essentially write you a blank cheque

play06:18

and allow you to create the kind of AI you’ve  always wanted to build, or are you going to  

play06:22

work for Apple, who will very much restrict  what you can do? The answer is pretty clear,  

play06:26

in that Apple lost some of their best AI talent  last year, to companies like Google and Microsoft.

play06:32

The most likely reason for things being bad, in  my opinion, is that things genuinely can’t get any  

play06:37

better for Siri. What I mean by that is, we're no longer at a point where the software can be

play06:42

tweaked and tuned to make it into something  that people would actually want to use when  

play06:46

compared with the likes of Bard and ChatGPT.  This isn’t a case of opening the bonnet and  

play06:51

tweaking the engine - this is essentially  a write-off, and something else is needed.

play06:56

Thankfully, I do believe that Apple is working on  that, but we’ll talk about that more in a moment.

play07:01

So what SHOULD Siri be? This is a really important  question in my opinion. It’s all well and good  

play07:07

criticising Siri for what it’s not, but what  do we actually want from our voice assistant?

play07:11

We’ll start with the obvious - we want it to  understand us, because if a voice assistant  

play07:15

gets this bit wrong, everything thereafter is  also going to be wrong. And while this might  

play07:20

sound obvious, check online, ask your friends -  people with anything other than a neutral accent  

play07:25

often struggle to get Siri to understand  them. Here in the UK, we’ve got something  

play07:29

like 40 different dialects, with multiple accents  branching off from those. And we’re a tiny nation,  

play07:35

imagine what this would look like in the US.  Whatever comes next, it’s got to be a night  

play07:39

and day improvement when it comes to that most  basic task of hearing us, and understanding us.

play07:44

With that out of the way, we want our voice  assistants to be actual assistants. If you  

play07:49

think about it, that was the whole point of Siri  all along. Help me to get more done in my day,  

play07:53

by being my digital assistant. And in some ways,  

play07:56

it does. But the limitations are  what’s stopping it from being amazing.

play08:00

Let’s take a simple example. You wake up in  the morning, and as a busy businessperson,  

play08:04

you’ve woken up to 150 unread messages in your  inbox. You ask your iPhone voice assistant about  

play08:10

your email, and it begins to read you your  most recent emails, in chronological order.  

play08:15

My guess is that, pretty quickly, you tell  it to stop, and you go and grab your phone.

play08:19

Let’s say instead that you’re using ‘Siri 2.0’ as  we’ll call it, the AI powered version. You ask it  

play08:24

about your email, and it tells you that you’ve  received 150 emails overnight. But out of those  

play08:29

emails, the most important is a complaint from one  of your customers, who’s angry because the service  

play08:34

you’ve signed them up to isn’t working. Siri has  recognised this, and has automatically drafted a  

play08:39

reply back to the customer, letting them know  how sorry you are and that you’re looking into  

play08:42

the issue, all you have to do is approve it, edit  if you like and then ask it to send it. It’s also  

play08:47

booked a call with one of your engineers for  that morning so you can talk through the issue  

play08:51

with them. It then proceeds to summarise the  10 other ‘mid level’ importance emails for you.

play08:56

One of those examples is actually useful, and is  the way in which a real assistant would function.

play09:01

And you can take that functionality, and apply  it to pretty much any other area of the iPhone’s  

play09:06

ecosystem. It should be able to not only  capture notes in audio format from you,  

play09:11

but transcribe those notes, format them correctly,  and then provide a summary of said notes.

play09:16

It should be able to summarise my favourite  podcasts for me, let me know about new music  

play09:20

that I might want to listen to based  on a deep understanding of my tastes,  

play09:24

it should be able to think on its feet when we're out in the car using Maps,

play09:27

and quickly divert me away where needed  based on realtime data, not just traffic.

play09:33

One thing I really think it should  be able to do is remember things,  

play09:36

and recall them back to me. What I mean by that,  is let’s say that I’m getting fitted for a suit,  

play09:41

and I get measured. That information is useful  to remember so that I can order things online  

play09:45

in the correct size, but it’s a pain having to  write a note each time. Plus that note now just  

play09:51

lives forever in my Notes app, when chances are  I’m only going to look at it one more time, when  

play09:55

the information is needed. It would be much easier  to say, “remember my collar measurement is xyz”,  

play10:02

and then when I need the information, be able to  say “remind me what my collar measurement is”,  

play10:06

and it simply gives you the answer. This to  me is a much more ‘assistant’ way of working.
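
As a purely hypothetical sketch (the type and labels below are invented, and this says nothing about how Apple would actually build it), the "remember and recall" behaviour described above boils down to a labelled key-value store kept on the device:

```swift
import Foundation

// Toy sketch of the "remember X / remind me of X" idea described above.
// It is just a labelled key-value store, not Apple's implementation.
struct AssistantMemory {
    private let store = UserDefaults.standard

    // "remember my collar measurement is 16 inches"
    func remember(_ fact: String, forLabel label: String) {
        store.set(fact, forKey: "memory.\(label)")
    }

    // "remind me what my collar measurement is"
    func recall(_ label: String) -> String? {
        store.string(forKey: "memory.\(label)")
    }
}

let memory = AssistantMemory()
memory.remember("16 inches", forLabel: "collar measurement")
print(memory.recall("collar measurement") ?? "I don't have that stored yet.")
```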

play10:11

And while this might sound farfetched, much  of this is already possible on competing  

play10:15

systems. Samsung have for example,  just announced their Galaxy AI,  

play10:19

which can do much of this, including amazing things like real-time translation of phone calls,

play10:24

automatically summarising the contents  of group chats, and circle to search,  

play10:28

where you can literally circle anything on your  phone’s screen, and find out more about it.

play10:33

There’s one other product that I wanted to mention  in this video, because it adds to my confusion  

play10:38

with Siri, and that’s Vision Pro. Apple’s  VR/AR/Spatial computing headset is their biggest  

play10:43

launch in a decade, possibly of Tim Cook’s tenure  at Apple. It’s the bleeding edge of AR technology,  

play10:49

and yet, unless we find out otherwise when people get theirs on the 2nd of February,

play10:53

it runs Siri, software that has only somewhat improved since its launch in 2011.

play10:59

This is weird to me, but there are a few  reasons why Apple may have chosen to do this.

play11:03

One, is that the next generation Siri that  we’re all hoping for either doesn’t exist,  

play11:07

or it doesn’t exist in a state suitable  to be put into Vision Pro just yet. The  

play11:12

idea of it not existing at all  is very unlikely to me. I think  

play11:16

it’s more likely that it’s not in a  state that’s ready for general usage.

play11:19

Plus, if Apple does release a Siri 2.0, I think  it’s the kind of product that they’re going to  

play11:24

want some fanfare around, which means a major  launch. I’d expect it to be announced at WWDC  

play11:29

this June, and launched in September alongside  new iPhones, which of course doesn’t coincide  

play11:34

with Vision Pro. And whilst that might sound  like an odd choice, keep in mind that Apple  

play11:38

have very conservative estimates about how many  Vision Pros they’re going to ship this year,  

play11:43

so the number of people even likely to  experience Siri on Vision Pro is tiny,  

play11:47

by Apple numbers. They can simply add  it to Vision Pro 2 or 3 or whatever.

play11:52

The other explanation, and one which I’m  not so keen on, but absolutely do believe  

play11:56

as being possible, is that this is going into a  dedicated ‘AI Phone’, for release in September.

play12:02

And that leads us to the next  talking point of the video.

play12:05

Up until recently, I had no idea what Edge AI was,  

play12:08

but it’s potentially the most important  part of this whole discussion, in terms  

play12:12

of Siri seeing actual improvements.  Edge AI is essentially ‘on device’ AI.

play12:18

So the problem with services like ChatGPT  for example, from Apple’s perspective,  

play12:23

is that it involves you submitting requests to  remote servers for processing, with responses then  

play12:28

being generated back. You don’t know this while  you’re doing it of course, but it’s the reason  

play12:32

why you a) always need an internet connection  to use ChatGPT and b) can run ChatGPT on pretty  

play12:38

much anything with an internet connection. It’s  also a privacy concern, from Apple’s perspective.

play12:43

On-device AI allows for AI related tasks to  be dealt with entirely locally, without ever  

play12:49

leaving your device. This both negates the need  for an internet connection, but it also alleviates  

play12:53

those privacy concerns, because the data stays  on your device, in a secure part of the phone’s  

play12:58

chip. This will almost certainly be handled by  a beefed up version of Apple’s Neural Engine,  

play13:03

a component of their silicon that exists on their  A and M series chips, designed specifically for  

play13:08

these sorts of tasks. Right now, the Neural  Engine is used to power features like Face ID,  

play13:14

Memojis, offline dictation and OCR, and pretty  much everything computational with photos and  

play13:19

videos. Quite simply, Apple already do a LOT when  it comes to AI, they just don’t call it AI, they  

play13:25

call it Machine Learning, and it’s not as flashy  as what some of their competition are doing.

play13:29

So, here’s how I think the rest  of the year is going to play out.

play13:32

Vision Pro will launch, and  it will be what it will be,  

play13:35

I’m probably going to make a separate video  about that. But importantly for this topic,  

play13:38

it will launch with regular Siri, and Apple  will get away with it, because while Siri  

play13:43

will be annoying to use on Vision Pro, the hype of  Vision Pro will divert from much of the criticism.

play13:48

At WWDC in June, Apple will unveil  their take on generative AI. Whether  

play13:53

this is seen as two separate products,  one for AI and one for Siri, I’m unsure,  

play13:58

I think they’re more likely to package it  altogether as ‘All New Siri’ or something similar.

play14:02

But here’s the thing. I think that for most of  their WWDC Keynote, it won’t get a mention. I  

play14:08

think that iOS 18 will be shown, with its new features and functionality, like any

play14:12

other WWDC. And that’s because I think that ‘All  New Siri’ will only run on brand new hardware. I  

play14:18

think it will be exclusive to the iPhone 16 Pro,  perhaps even the long speculated iPhone 16 Ultra.

play14:24

And I think that the only way you’ll  be able to enjoy All New Siri,  

play14:28

is by upgrading to the latest and greatest iPhone.

play14:30

There are a few reasons why I think this.

play14:33

There is the first possibility, which is  that it will only be able to run on the  

play14:37

processor that they put in their latest  phone. I don’t buy this, personally,  

play14:41

Apple silicon for phones and tablets is  already way overpowered for most use cases,  

play14:46

but it could be something that Apple  claim, it could even be factual.

play14:49

The optimist in me thinks that it’s because Apple  want to test this out on a much smaller subset of  

play14:54

users, before pushing out to the wider audience.  There are around 2 billion iPhone owners in the  

play14:59

world, and if you assume that half of those have  phones that could be capable of running the new  

play15:03

Siri, that’s potentially a lot of people to report  on possible negative experiences, while they iron  

play15:09

out the inevitable kinks. Remember I told you that  Apple don’t like being embarrassed, and they hate  

play15:13

being out of control. Restrict it to only the  people who bought the new model, the people who  

play15:18

are likely your biggest fans anyway, and you can  make changes in a much less public environment.

play15:23

The cynic in me thinks this is about money.  iPhone sales have plateaued in recent years,  

play15:27

Mac sales are down, despite the incredible  advancements Apple have made in their own silicon.  

play15:33

In many ways, the quality of Apple’s devices has  become their number one problem - people don’t see  

play15:37

a reason to upgrade so often, with the average  user holding onto their phones now for 4, up to  

play15:42

5 years, computers even longer. Apple need to  fix this, and one way to fix it is to lock out  

play15:48

an undeniably killer feature, restricting  it to only their top of the range phones.

play15:53

I hope I’m wrong. Because my concern is if it’s  restricted to only the top of the range iPhones,  

play15:58

what about the rest of the lineup? I have  an M2 iPad Pro - do I have to replace that?  

play16:03

I’ve got an M2 Max MacBook Pro and an M2  Ultra Mac Studio, blazingly powerful Mac  

play16:08

computers that are both less than a year old.  Will they run new Siri, or am I going to be  

play16:13

having to replace those next year? I hope  this isn’t the case, because anything else  

play16:17

is going to feel like one hell of a cash grab,  admittedly by the richest company in the world.

play16:21

That said, if Apple get this  right, then the Apple Watch,  

play16:24

and the HomePod Mini could be about to become the  most incredible Apple products ever. Just imagine,  

play16:29

next generation Siri capabilities  on your Watch. Now that’s exciting.

play16:34

I’d love to hear your thoughts on this, so  drop me a comment and let me know. Also,  

play16:38

regular channel viewers, what do you think of the  

play16:40

new ‘video essay’ format? Tell me what  you think, I’ve got loads more planned.

play16:44

And as ever, if you found this video useful,  do please consider leaving me a like,  

play16:48

and subscribing to my channel for  more content like this in the future.

play16:52

See you on the next video.