Apple has unveiled its long-rumoured suite of AI features, which it's calling Apple Intelligence. Sporting the strapline 'AI for the rest of us', Apple Intelligence is the company's attempt to capitalise on the AI craze that has captivated Wall Street – if not the consumer.
Although Apple has been criticised by some for its slow response to AI mania, this approach to integrating AI is exactly what I would have expected from the company. Apple is widely known for being more deliberate in its approach to features, often observing what competitors are doing and then refining the experience. Apple Intelligence appears to have been developed around a set of guiding tenets: anything the company creates must be "powerful, intuitive, integrated, personal, private."
At first glance, Apple Intelligence delivers on those principles. It is deeply integrated into the operating systems and harnesses the power of Apple's exceptionally potent processors. Siri has finally been given a makeover and can now complete tasks in-app or provide you with instructions on how to do something yourself, for example, scheduling a text message. Siri also has a better contextual understanding of what you're looking at on screen, making answers more relevant. A lot of the processing will be completed on-device, but more compute-intensive tasks will be handed off to Apple silicon-powered servers – a system Apple calls 'Private Cloud Compute'.
Also included is a new feature called 'Genmoji', which allows you to create new emoji on the fly, and a DALL·E-style feature called Image Playground, which can generate images based on prompts, although it is unable to create photorealistic imagery. As you might expect, Apple Intelligence can also be used to generate or summarise text, and to adjust the tone of your writing based on prompts.
Additionally, Apple announced a new partnership with OpenAI, allowing Siri to defer to ChatGPT (powered by GPT-4o) if users choose. OpenAI kickstarted the AI craze with the release of ChatGPT back in November 2022, and its models are still treated as the gold standard for how an AI model could and should behave, but Apple's decision to fold them in is nonetheless an interesting concession.
It's an admission that Apple isn't able to outperform OpenAI's language models. Apple is happy to Sherlock a feature or product if it thinks it can do something better (or not, if the Apple Maps launch was anything to go by), but the partnership with OpenAI suggests not only that Apple believes it can't compete, but that it has no interest in doing so. Senior VP of software engineering Craig Federighi even stated that Apple is looking to explore integrating other models in the future – leaving the door open for Google's Gemini to have a near-native experience on iPhone.
All of this sounds impressive, and when presented in Apple's sleek infotainment style, you would be forgiven for getting swept up in the hype. CEO Tim Cook even said, "All of this goes beyond artificial intelligence, it's personal intelligence, and it's the next big step for Apple." But let's summarise what Apple actually announced: text and image generation, the ability for your devices to take action based on your personal information and context, and a chatbot interface for GPT-4o.
Those are all table stakes in the world of AI nowadays, and while Apple may have put in the work to add an enhanced layer of privacy, the reality is it hasn't moved the needle at all in terms of what AI can do and is doing for consumers. Google has had similar generative text features in its Workspace suite of apps for nearly a year, and Samsung's Bixby has been able to carry out on-device functions since 2017. Apple Intelligence is flashy, and very Apple-like in the way it's so neatly integrated into the company's various operating systems, but it is simply a repackaging of old ideas.
I can't help but come back to the strapline: AI for the rest of us.
Watching the keynote at the time, I thought it was a strange position to take, especially given Apple's ubiquity. Who are the rest of us? And what AI features does Apple think the "rest of us" need? In announcing Apple Intelligence, I think Apple might have – knowingly or unknowingly – given away the answer.
There is no refuting that AI has some groundbreaking, truly life-advancing applications. But I think those will be in manufacturing, science, and computer engineering. For "the rest of us", i.e. the everyday person, AI is going to be a 'nice to have' rather than essential. Something we might occasionally call upon to help summarise a long email or article, or to play around with when messaging friends. It's not going to transform how we live our lives, nor how we interact with friends, colleagues, or loved ones. Apple even displays a warning when ChatGPT is used, telling users to "check important info for mistakes," acknowledging these models' ongoing issues with hallucination. So, if the features are the same as the competition's, and the chatbot element remains as fallible as ever, what's left?
Simply put, it's the 'Narrative'. Apple – minus a few hiccups – is a marketing expert, and what we saw during the WWDC keynote, and reiterated during the iPhone 16 announcement, was perhaps the most coherent and compelling sales pitch yet for these AI features. Google's glut of AI-feature announcements at I/O, while more technically impressive, was dense and inaccessible. By keeping it user-oriented and focusing on the story of its product, Apple has managed, as it so often does, to simplify the complexity of LLMs and LAMs to the point where an everyday user can understand how they work and how they could fit into their lives.
Apple has come the closest to selling AI, but it's yet to be seen if the market is really buying it.