Siri is broken - Here's how Apple plan to fix it... (Next-Gen Siri)
Summary
TL;DR: The video examines the past, present, and future of Apple's Siri voice assistant. It explores how Siri has fallen behind competitors, become unreliable and frustrating for users, and lost top talent. The video speculates that Apple is developing a next-generation version of Siri, powered by AI and aimed at making Siri a truly useful digital assistant. There is skepticism that Apple may restrict this improved Siri to only its latest iPhone models, perhaps to boost sales by locking a desirable feature to new hardware. Overall, the video is hopeful Apple can turn Siri into an incredible product, but uncertain if or when that may happen.
Takeaways
- 😟 Siri is outdated technology that frustrates users and fails to meet expectations
- 😠 Internal issues at Apple have stifled Siri's progress over the years
- 🤔 Siri's core technology of rules and databases prevents real assistant capabilities
- 😎 New large language models power the latest AI assistants with more advanced abilities
- 👎 Losing top talent to companies like Google has hurt Apple's voice assistant efforts
- 💡 Siri should understand users, summarize information, book appointments, and more
- 📱 Samsung's and other companies' voice assistants showcase more useful features
- 😕 The presence of old Siri on Apple's new Vision headset is puzzling
- 🤞 Edge AI computing could enable an advanced Siri while protecting user privacy
- 😮 A next gen iPhone may launch with a dramatically updated Siri later this year
Q & A
How old is the core technology behind Siri?
-The core technology behind Siri dates back to 2011, making it over 10 years old now.
What are some of the main complaints about Siri from users?
-Common complaints include Siri not understanding users properly, especially those with non-neutral accents, and Siri often just providing web search results rather than answering questions directly.
Why has Siri fallen so far behind other voice assistants?
-Reasons include the older technology at its core compared to more advanced AI like LLMs, Apple's reluctance to lose control over Siri's responses, and top talent leaving Apple to work on AI at other companies.
What does the author think Apple should do to improve Siri?
-The author suggests improvements like better speech recognition, being a true digital assistant by summarizing emails and other information, and having memory to recall facts the user has mentioned previously.
How might Apple restrict access to a new version of Siri?
-The author speculates Apple may limit the new Siri to only their latest iPhone models, rather than making it widely available, either to restrict bugs or encourage people to upgrade.
What is edge AI and why does it matter for Siri?
-Edge AI allows AI processing to happen on the device rather than sending data to remote servers. This improves privacy and allows Siri to work offline.
When does the author expect Apple to unveil its new Siri capabilities?
-At Apple's Worldwide Developers Conference (WWDC) in June 2024, along with the reveal of iOS 18.
How could a more advanced Siri benefit Apple Watch and HomePod users?
-With better natural language capabilities, Siri on those devices could become much more useful, responsive and intelligent.
Why might restricting new Siri to new iPhones be problematic?
-It risks annoying users who own recent but now-incompatible iPads, Macs, etc., forcing them to upgrade again after short periods, just for Siri access.
What does the author think is the main thing Apple needs to get right with Siri?
-Understanding users and interpreting speech correctly, since if Siri gets that part wrong, everything after will also fail.
Outlines
😀 Siri's Capabilities and Shortcomings
This paragraph provides background on the author's experience creating content about Siri. It notes that Siri is still considered a flagship Apple product but using it feels outdated, like 2011 technology on a 2024 phone. The author considers Siri an enigma - touted by Apple but frustrating to actually use.
😊 Promoting BetterHelp Mental Health Services
This paragraph promotes BetterHelp, an online therapy service. It notes how the new year can bring stress and pressure, and how talking to a therapist at BetterHelp can provide helpful, unbiased advice. It provides details on how BetterHelp works and notes that viewers can get 10% off their first month.
😟 Explaining Why Siri Has Gotten So Bad
This paragraph explores reasons why Siri has declined in quality, including old technology compared to modern AI, Apple's desire for control vs embarrassment, and top talent leaving for better opportunities elsewhere. It suggests the most likely reason is that Siri's software is outdated and can't be tweaked to compete with modern assistants.
😃 Envisioning What Siri Should Be
This paragraph discusses what the author wants in a voice assistant. Key points include understanding users, summarizing important information intelligently, making recommendations based on preferences, and generally behaving more like an actual assistant.
🤔 Speculating on Apple's Plans for Siri in 2024
This final paragraph speculates how Apple will handle Siri in 2024. It suggests Siri 2.0 will be announced at WWDC in June but may only work on new iPhone models. It also discusses the role of on-device AI and the hope that Apple doesn't make major upgrades exclusive in order to push new hardware sales.
Keywords
💡Siri
💡AI
💡LLMs
💡Edge AI
💡Privacy
💡Vision Pro
💡WWDC
💡iPhone 16
💡Digital assistant
💡User experience
Highlights
Siri is a bit of an enigma, still touted as a flagship Apple product but using it feels like 2011 technology on a 2024 phone.
Common complaints are that Siri often fails to replicate demonstrated functionality and heavily relies on web search instead of answering questions directly.
Within Apple, things are allegedly so bad that employees struggle to see how Siri can be fixed without a ground-up redesign.
Siri is essentially built around a limited database, so when asked something outside its knowledge set, problems arise.
Apple's strict control and past practice of manually scripting Siri responses could discourage a more flexible, ChatGPT-style approach.
The worse Siri has gotten, the more top talent has left for better AI opportunities at other companies.
Siri likely can't be meaningfully improved through tweaks - a wholesale replacement is needed.
The next voice assistant needs to deeply understand users and tasks to become an actual digital assistant.
Competing products like Samsung's Galaxy AI already offer advanced functionality like real-time translation.
It's confusing that Vision Pro runs the outdated Siri despite being Apple's biggest launch in a decade.
A new Siri likely exists but may not have been ready in time for Vision Pro's launch.
I think a completely new Siri will launch at WWDC in June and initially be exclusive to the iPhone 16 Pro.
On-device processing via Apple's Neural Engine chips may power new privacy-focused AI capabilities.
Restricting new Siri risks frustration, but could be a strategy to smoothly roll it out or spur iPhone upgrades.
If executed perfectly, new Siri could make Apple Watch and HomePod Mini the best Apple products ever.
Transcripts
I’ve found some web results, I’ll send them to Tom’s iPhone
OK, I’ve found this on the web for “it’s a bit
of an enigma in one way it’s still touted as a flagship…”
This is a video that I’ve wanted to make for a long time now. As
someone who has created a lot of content about Siri over the years,
I feel like I’ve got a pretty good idea of its capabilities, but also of its shortcomings,
and the things about it that frustrate people, of which there are many.
Siri is a bit of an enigma to me. In one way, it’s still touted as a flagship Apple product,
talked about in each iOS release each year, given new features and capabilities. But yet
using it feels like stepping into the past, it’s like running 2011 technology on a 2024 phone.
Which raises the question: how has it been able to get this bad? Does
Apple actually acknowledge that there’s a problem, and are they doing anything to
fix it? Can they even fix it? These are the questions I’ll try to answer in this video.
OK, let’s get into it.
To find out how bad things have gotten with Apple’s voice assistant, as I’ll call it as
much as possible from this point on to avoid setting your devices off, I’d personally start
by taking a look at some of the comments on the numerous videos I’ve made about it over the years.
A quick glance through the comments pages makes for pretty dire reading. A common
complaint is that people watch something on my video, and try to replicate it at home,
only for it not to work for them, for whatever mysterious reason, possibly
an issue understanding the user or a feature that’s been localised to only certain regions.
But another common complaint is Siri’s reliance on pointing people to articles on the web,
rather than trying to answer the question. I almost see this as a bit like a child
asking their parent a question. What the child wants in that moment is an answer,
not pointing to the local library where they can research the answer for themselves. When this
happens to people using Siri, they understandably get frustrated with it, and go and find the answer
some other way. If this happens too many times, the average person will simply stop using it,
hence why so many people you talk to will probably tell you that they just don’t bother using it.
Things aren’t just restricted to being bad for consumers though. Within Apple,
things are allegedly so bad that employees have pretty much given up on it, suggesting that they
struggle to see how Apple can fix their voice assistant, without going for a ground-up redesign.
In an article from The Information last year, they reported that the team working on Vision
Pro were left totally underwhelmed by the demonstrations that the Siri Team gave them
on how the voice assistant could control the headset, even going so far at one point as to
consider building an alternative method of controlling the device using voice.
So how has it gotten so bad?
There isn’t really a clear answer to this, but when you do a bit of research you can kind of
put the pieces of the puzzle together to come up with some possible explanations.
The first is that of the technology itself. Quite simply, the software running Apple’s voice
assistant is old technology, when compared with that which you’d find in the LLMs,
the Large Language Models of today’s AI powered assistants.
An LLM is an AI algorithm that uses deep learning techniques,
along with massive datasets to not only comprehend what people are asking it,
but also to generate human-like responses. They’re trained on huge amounts of data,
and can recognise, understand, translate and even predict text, making conversing with them
really simple, and increasing the likelihood of them being able to answer your question.
Siri on the other hand, whilst still being a form of AI, is essentially built around a
database. An article in Cult of Mac last year spoke with former Siri engineer John Burkey,
who explained that Siri is built around a database of words. Ultimately,
it knows what it knows, and if you’re asking it something that falls outside
of its limited knowledge set, this is where you’re going to run into problems.
It’s January as I’m making this video, and that means new year, new goals, new challenges. But
with that often comes more pressure to succeed, along with all the stress we already face
everyday. I know that’s how I feel at the start of each year, trying to improve upon the last, trying
to grow my business whilst also helping to raise my young family can be emotionally exhausting and
totally overwhelming at times. I’m pretty good at talking to my friends when I’m feeling burnt out,
but they’ve all got their own stresses too, so I sometimes feel like I don't have anyone I can
talk to about the challenges I'm facing. Even if I did, sometimes it's hard to share what's going
on in your life - even with close friends, who might not always know how to advise us.
BetterHelp, who are a paid partner of this video, connect you with a credentialed
therapist who is trained to listen and provide helpful, unbiased advice. With BetterHelp,
you can organise therapy sessions through phone calls, video chats, or even messaging,
depending on your preference and comfort level. To start, you’ll fill out a questionnaire to
help assess your specific needs. Once you’ve done that, you’ll be matched with a therapist,
in most cases within 48 hours. You can schedule your sessions at a convenient time for you,
and if you feel that the therapist you’ve been matched with isn’t the right fit,
which is common when starting therapy, you can easily switch to a new one at no additional cost.
We all like to talk about the importance of getting in the gym or getting out for exercise
every day, so why not give your mind that same kind of care? Over 4 million people have used
BetterHelp to start living a healthier and happier life. If you think you might benefit from therapy,
give BetterHelp a try. Click the link in the description or visit
[betterhelp.com/properhonesttech](http://betterhelp.com/properhonesttech)
to get started. Doing so not only supports this channel, it also gets you 10% off your
first month, so you can connect with a therapist and see if it helps you.
Apple are also a company infamously devoted to control, and not the kind of company that
appreciates being made to look foolish. There was an article here in the UK this week about delivery
company DPD, who use an AI-assisted chatbot to help people looking for their parcels.
A customer decided to do some testing with the chatbot to see what he could make it do,
beyond just asking about his delivery. He managed to get it to tell him a joke,
but then managed to get it to swear at him, and even write a poem about how bad DPD are as a
delivery firm. The company blamed it on an update that went wrong and claim that it’s now fixed,
but when you consider that Apple once admitted that they’d hired a team of writers to write Siri
responses, rather than leaving it to chance and letting Siri think up answers itself,
you can understand why a ChatGPT style approach just wouldn’t work for Apple.
And to add to Apple’s problems here, the worse things have gotten with Siri,
the more this has pushed their top talent to jump ship, in search of better opportunities
elsewhere. And you can understand why. If you’re someone who operates at the bleeding edge of AI,
are you going to work for someone like OpenAI, who will essentially write you a blank cheque
and allow you to create the kind of AI you’ve always wanted to build, or are you going to
work for Apple, who will very much restrict what you can do? The answer is pretty clear,
in that Apple lost some of their best AI talent last year, to companies like Google and Microsoft.
The most likely reason for things being bad, in my opinion, is that things genuinely can’t get any
better for Siri. What I mean by that is, we’re no longer at a point where the software can be
tweaked and tuned to make it into something that people would actually want to use when
compared with the likes of Bard and ChatGPT. This isn’t a case of opening the bonnet and
tweaking the engine - this is essentially a write-off, and something else is needed.
Thankfully, I do believe that Apple is working on that, but we’ll talk about that more in a moment.
So what SHOULD Siri be? This is a really important question in my opinion. It’s all well and good
criticising Siri for what it’s not, but what do we actually want from our voice assistant?
We’ll start with the obvious - we want it to understand us, because if a voice assistant
gets this bit wrong, everything thereafter is also going to be wrong. And while this might
sound obvious, check online, ask your friends - people with anything other than a neutral accent
often struggle to get Siri to understand them. Here in the UK, we’ve got something
like 40 different dialects, with multiple accents branching off from those. And we’re a tiny nation;
imagine what this would look like in the US. Whatever comes next, it’s got to be a night
and day improvement when it comes to that most basic task of hearing us, and understanding us.
With that out of the way, we want our voice assistants to be actual assistants. If you
think about it, that was the whole point of Siri all along. Help me to get more done in my day,
by being my digital assistant. And in some ways,
it does. But the limitations are what’s stopping it from being amazing.
Let’s take a simple example. You wake up in the morning, and as a busy businessperson,
you’ve woken up to 150 unread messages in your inbox. You ask your iPhone voice assistant about
your email, and it begins to read you your most recent emails, in chronological order.
My guess is that, pretty quickly, you tell it to stop, and you go and grab your phone.
Let’s say instead that you’re using ‘Siri 2.0’ as we’ll call it, the AI powered version. You ask it
about your email, and it tells you that you’ve received 150 emails overnight. But out of those
emails, the most important is a complaint from one of your customers, who’s angry because the service
you’ve signed them up to isn’t working. Siri has recognised this, and has automatically drafted a
reply back to the customer, letting them know how sorry you are and that you’re looking into
the issue. All you have to do is approve it, edit it if you like, and then ask it to send. It’s also
booked a call with one of your engineers for that morning so you can talk through the issue
with them. It then proceeds to summarise the 10 other ‘mid level’ importance emails for you.
One of those examples is actually useful, and is the way in which a real assistant would function.
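That morning briefing boils down to a triage step: rank the inbox by importance, surface the single most urgent message, and batch the mid-level ones for summarising. Here's a minimal sketch of that ranking logic in Python, using a hypothetical `Email` type and a naive keyword score where a real assistant would use an LLM:

```python
from dataclasses import dataclass

@dataclass
class Email:
    sender: str
    subject: str
    body: str

# Crude stand-in for real language understanding: terms that suggest urgency.
URGENT_TERMS = ("complaint", "angry", "not working", "refund")

def importance(email: Email) -> int:
    """Score an email by counting urgent terms in its subject and body."""
    text = f"{email.subject} {email.body}".lower()
    return sum(text.count(term) for term in URGENT_TERMS)

def triage(inbox: list[Email], top_n: int = 10) -> dict:
    """Split an inbox into the single most important email plus a
    'mid-level' batch to summarise, mirroring the briefing scenario."""
    ranked = sorted(inbox, key=importance, reverse=True)
    return {
        "total": len(inbox),
        "most_important": ranked[0] if ranked else None,
        "to_summarise": ranked[1:1 + top_n],
    }
```

The interesting design choice isn't the scoring (which is deliberately naive here), it's the shape of the output: one thing to act on, a handful to summarise, and everything else left alone.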
And you can take that functionality, and apply it to pretty much any other area of the iPhone’s
ecosystem. It should be able to not only capture notes in audio format from you,
but transcribe those notes, format them correctly, and then provide a summary of said notes.
It should be able to summarise my favourite podcasts for me, let me know about new music
that I might want to listen to based on a deep understanding of my tastes,
it should be able to think on its feet when we’re out in the car using Maps,
and quickly divert me away where needed based on realtime data, not just traffic.
One thing I really think it should be able to do is remember things,
and recall them back to me. What I mean by that, is let’s say that I’m getting fitted for a suit,
and I get measured. That information is useful to remember so that I can order things online
in the correct size, but it’s a pain having to write a note each time. Plus that note now just
lives forever in my Notes app, when chances are I’m only going to look at it one more time, when
the information is needed. It would be much easier to say, “remember my collar measurement is xyz”,
and then when I need the information, be able to say “remind me what my collar measurement is”,
and it simply gives you the answer. This to me is a much more ‘assistant’ way of working.
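The remember/recall idea above is, at its core, a small persistent key-value store sitting behind two natural-language patterns. A rough sketch, using a hypothetical `AssistantMemory` class, with regex matching standing in for the proper intent parsing a real assistant would need:

```python
import json
import re
from pathlib import Path

class AssistantMemory:
    """Toy 'remember X / recall X' store. The regexes are a stand-in
    for real language understanding, not a serious intent parser."""

    def __init__(self, path=None):
        # Pass a file path to persist facts between sessions.
        self.path = Path(path) if path else None
        self.facts = {}
        if self.path and self.path.exists():
            self.facts = json.loads(self.path.read_text())

    def handle(self, utterance: str) -> str:
        # "remember my collar measurement is 16 inches"
        store = re.match(r"remember (?:that )?my (.+?) is (.+)", utterance, re.I)
        if store:
            key, value = store.group(1).lower(), store.group(2).strip().rstrip(".")
            self.facts[key] = value
            if self.path:
                self.path.write_text(json.dumps(self.facts))
            return f"OK, I'll remember that your {key} is {value}."
        # "remind me what my collar measurement is" / "what is my collar measurement?"
        recall = re.match(r"(?:remind me what my|what(?:'s| is) my) (.+)", utterance, re.I)
        if recall:
            key = recall.group(1).strip().rstrip("?").strip()
            if key.lower().endswith(" is"):
                key = key[: -len(" is")].strip()
            value = self.facts.get(key.lower())
            return f"Your {key} is {value}." if value else f"I don't have a {key} stored."
        return "Sorry, I didn't catch that."
```

The storage part is trivial; the hard part, as with everything else in this video, is the understanding layer in front of it.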
And while this might sound far-fetched, much of this is already possible on competing
systems. Samsung, for example, have just announced their Galaxy AI,
which can do much of this, including amazing things like real-time translation of phonecalls,
automatically summarising the contents of group chats, and circle to search,
where you can literally circle anything on your phone’s screen, and find out more about it.
There’s one other product that I wanted to mention in this video, because it adds to my confusion
with Siri, and that’s Vision Pro. Apple’s VR/AR/Spatial computing headset is their biggest
launch in a decade, possibly of Tim Cook’s tenure at Apple. It’s the bleeding edge of AR technology,
and yet, unless we find out otherwise when people get theirs on the 2nd of February,
it runs Siri, software that has only somewhat improved since its launch in 2011.
This is weird to me, but there are a few reasons why Apple may have chosen to do this.
One, is that the next generation Siri that we’re all hoping for either doesn’t exist,
or it doesn’t exist in a state suitable to be put into Vision Pro just yet. The
idea of it not existing at all is very unlikely to me. I think
it’s more likely that it’s not in a state that’s ready for general usage.
Plus, if Apple does release a Siri 2.0, I think it’s the kind of product that they’re going to
want some fanfare around, which means a major launch. I’d expect it to be announced at WWDC
this June, and launched in September alongside new iPhones, which of course doesn’t coincide
with Vision Pro. And whilst that might sound like an odd choice, keep in mind that Apple
have very conservative estimates about how many Vision Pros they’re going to ship this year,
so the number of people even likely to experience Siri on Vision Pro is tiny,
by Apple numbers. They can simply add it to Vision Pro 2 or 3 or whatever.
The other explanation, one which I’m not so keen on but absolutely do believe
is possible, is that this is going into a dedicated ‘AI Phone’, for release in September.
And that leads us to the next talking point of the video.
Up until recently, I had no idea what Edge AI was,
but it’s potentially the most important part of this whole discussion, in terms
of Siri seeing actual improvements. Edge AI is essentially ‘on device’ AI.
So the problem with services like ChatGPT for example, from Apple’s perspective,
is that it involves you submitting requests to remote servers for processing, with responses then
being generated back. You don’t know this while you’re doing it of course, but it’s the reason
why you a) always need an internet connection to use ChatGPT and b) can run ChatGPT on pretty
much anything with an internet connection. It’s also a privacy concern, from Apple’s perspective.
On-device AI allows for AI related tasks to be dealt with entirely locally, without ever
leaving your device. This negates the need for an internet connection, and it also alleviates
those privacy concerns, because the data stays on your device, in a secure part of the phone’s
chip. This will almost certainly be handled by a beefed up version of Apple’s Neural Engine,
a component of their silicon that exists on their A and M series chips, designed specifically for
these sorts of tasks. Right now, the Neural Engine is used to power features like Face ID,
Memojis, offline dictation and OCR, and pretty much everything computational with photos and
videos. Quite simply, Apple already do a LOT when it comes to AI, they just don’t call it AI, they
call it Machine Learning, and it’s not as flashy as what some of their competition are doing.
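The on-device-first idea can be sketched as a simple routing rule: prefer a local model, and only touch the network when no local model is available and the user has opted in. The function names here are purely illustrative, not any real Apple API:

```python
def answer(query: str, local_model=None, cloud_model=None,
           allow_cloud: bool = False) -> str:
    """Route a request to an on-device model first; fall back to the
    network only with explicit permission. Both models are stand-ins
    for whatever inference engine is actually available."""
    if local_model is not None:
        return local_model(query)      # data never leaves the device
    if cloud_model is not None and allow_cloud:
        return cloud_model(query)      # requires network + user consent
    raise RuntimeError("No permitted model available for this request")
```

This is the privacy argument in three branches: if the Neural Engine can handle it locally, the cloud never sees the request at all.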
So, here’s how I think the rest of the year is going to play out.
Vision Pro will launch, and it will be what it will be,
I’m probably going to make a separate video about that. But importantly for this topic,
it will launch with regular Siri, and Apple will get away with it, because while Siri
will be annoying to use on Vision Pro, the hype of Vision Pro will divert from much of the criticism.
At WWDC in June, Apple will unveil their take on generative AI. Whether
this is seen as two separate products, one for AI and one for Siri, I’m unsure,
I think they’re more likely to package it all together as ‘All New Siri’ or something similar.
But here’s the thing. I think that for most of their WWDC Keynote, it won’t get a mention. I
think that iOS 18 will be shown, with its new features and functionality, like any
other WWDC. And that’s because I think that ‘All New Siri’ will only run on brand new hardware. I
think it will be exclusive to the iPhone 16 Pro, perhaps even the long speculated iPhone 16 Ultra.
And I think that the only way you’ll be able to enjoy All New Siri,
is by upgrading to the latest and greatest iPhone.
There are a few reasons why I think this.
There is the first possibility, which is that it will only be able to run on the
processor that they put in their latest phone. I don’t buy this, personally,
Apple silicon for phones and tablets is already way overpowered for most use cases,
but it could be something that Apple claim, it could even be factual.
The optimist in me thinks that it’s because Apple want to test this out on a much smaller subset of
users, before pushing out to the wider audience. There are around 2 billion iPhone owners in the
world, and if you assume that half of those have phones that could be capable of running the new
Siri, that’s potentially a lot of people to report on possible negative experiences, while they iron
out the inevitable kinks. Remember I told you that Apple don’t like being embarrassed, and they hate
being out of control. Restrict it to only the people who bought the new model, the people who
are likely your biggest fans anyway, and you can make changes in a much less public environment.
The cynic in me thinks this is about money. iPhone sales have plateaued in recent years,
Mac sales are down, despite the incredible advancements Apple have made in their own silicon.
In many ways, the quality of Apple’s devices has become their number one problem - people don’t see
a reason to upgrade so often, with the average user now holding onto their phone for 4 to
5 years, computers even longer. Apple need to fix this, and one way to fix it is to lock out
an undeniably killer feature, restricting it to only their top of the range phones.
I hope I’m wrong. Because my concern is if it’s restricted to only the top of the range iPhones,
what about the rest of the lineup? I have an M2 iPad Pro - do I have to replace that?
I’ve got an M2 Max MacBook Pro and an M2 Ultra Mac Studio, blazingly powerful Mac
computers that are both less than a year old. Will they run new Siri, or am I going to be
having to replace those next year? I hope this isn’t the case, because anything else
is going to feel like one hell of a cash grab, admittedly by the richest company in the world.
That said, if Apple get this right, then the Apple Watch,
and the HomePod Mini could be about to become the most incredible Apple products ever. Just imagine,
next generation Siri capabilities on your Watch. Now that’s exciting.
I’d love to hear your thoughts on this, so drop me a comment and let me know. Also,
regular channel viewers, what do you think of the
new ‘video essay’ format? Tell me what you think, I’ve got loads more planned.
And as ever, if you found this video useful, do please consider leaving me a like,
and subscribing to my channel for more content like this in the future.
See you on the next video.