Online exclusive
Peter Griffin’s consumer tech columns appear fortnightly on listener.co.nz
Apple may have been slow to embrace, let alone mention, artificial intelligence in recent years, happy for its competitors to jump on the generative AI bandwagon.
But that changed this month when the smartphone giant revealed that its brand of AI, dubbed Apple Intelligence, will be powered by ChatGPT creator OpenAI and built into Apple’s high-end and late-model devices when its latest operating system, iOS 18, debuts later this year.
Apple has been in the AI game for some time, allowing certain apps to process AI workloads on its devices. But with iOS 18, Apple infuses AI into its operating system at a fundamental level, which will make Apple’s own apps more intuitive, give its Siri voice assistant a significant upgrade, and let third-party app makers take advantage of AI on the device to supercharge their own apps.
Apple’s key pitch to iPhone users is that processing AI tasks on the phone avoids sending sensitive user data to cloud computing platforms, making it less likely to be hacked, stolen or inappropriately accessed. It is hoping to build on its reputation for prioritising security and privacy.
There’s a lot to unpack about Apple Intelligence but here are five key things you need to know about it, particularly if you will be in the market for a new iPhone, Mac computer or tablet in 2024.
1. It won’t be available on most iPhones
Let’s get the bad news out of the way upfront. The new AI features that will debut on the iPhone won’t be compatible with at least 90% of existing iPhone models. Apple Intelligence will only be compatible with the latest high-end models, including the iPhone 15 Pro (from $2099) and the iPhone 15 Pro Max (from $2499), which contain Apple’s A17 Pro chip. That’s just two of the 24 models that will be compatible with iOS 18.
The situation is better when it comes to Apple Mac computers, MacBooks and iPads. Apple Intelligence will be available on computers with Apple’s M1 processor or higher. That equates to five of the 15 iPad models that are compatible with the new iPadOS 18 operating system, and 13 of the 18 Mac computers compatible with macOS 15 Sequoia.
Apple hopes its AI features will revive its falling iPhone sales by triggering an upgrade cycle in which iPhone users will buy premium models to take advantage of Apple Intelligence. To get the sales boost it desires, Apple will need to make Apple Intelligence available across a broader range of devices. I’d expect to see that happen either in September, with the launch of the iPhone 16 series, or next year at the latest.
2. Initial AI features are interesting but not revolutionary
Apple’s demo of its AI features felt a bit familiar because Apple’s competitors have already released them in one form or other. But with its reputation for simplicity and user-friendliness, Apple has the potential to take AI accessibility to the next level and build it into everyday use.
ChatGPT built-in
Apple will integrate ChatGPT functionality into its operating system. Users can select ChatGPT as their default intelligent chatbot when composing messages, working on documents and using the Siri voice assistant. This won’t come at an extra cost; it is included for Apple users armed with a compatible device.
AI-powered writing
Apple’s spelling and grammar-checking features are fairly barebones, which is why many people working on documents use third-party apps like Grammarly or ProWritingAid. Apple Intelligence will include system-wide writing tools. These will allow you to summarise documents and edit and proofread them automatically.
By selecting portions of text, you can automatically turn them into lists of key points, or include them in tables. You can rewrite a document in a different tone of voice. Users of Microsoft Copilot will be familiar with these features, but this is the first time they will come to Apple, native to the device.
Message summaries
Priority Messages, a panel at the top of the Apple Mail inbox that displays urgent messages, isn’t necessarily a new thing. But now you’ll see summaries of the messages, so you don’t have to open them to find out what they contain.
Long email threads can also be summarised quickly so you can get up to speed with their contents. Smart Reply will suggest email responses, but these won’t be simple canned replies: it will read the email, figure out the context and answer the questions posed by the sender.
Auto transcription
Journalists will applaud this new AI feature. In the Notes and Phone apps on the iPhone, you’ll be able to record, transcribe and summarise audio. To get around privacy issues, all parties to the call are notified when the call is recorded. At the end of the call, the user receives a summary recapping the key points covered. As a user of Otter and other transcription tools, I can see this being hugely valuable in helping people remember the contents of calls and follow-up action points.
Better options for generating images
It has never really appealed to me as someone who likes to express myself primarily with words, but iPhone owners are big users of emojis and customised photos to amuse themselves and their network of contacts.
This becomes much easier and more interactive with Image Playground, a way to quickly generate images in messages based on text prompts and on suggestions drawn from the contents of the conversation. Genmoji will do the same for emojis, letting you create an emoji featuring yourself or a contact, drawing on images to come up with relevant results.
3. Siri is getting a big overhaul
Apple’s voice assistant Siri has ticked over for 13 years now and won legions of fans. But it hasn’t become any more intelligent or useful in recent years. It reads out calendar appointments, finds tidbits of information from the web, helps you turn on lights and adjust your air conditioning, and gives you directions when you are driving.
But with Apple Intelligence, Siri gains the ability to see data stored in your apps, including third-party apps, and can describe text and images on your screen. This means you will be able to make more complicated requests of it.
This is useful for Apple’s own apps, such as Music, Calendar and Mail. But it will really come into its own as third-party developers enhance their apps to work with the new Siri. That’s when interacting with many of the apps that sit on your iPhone will become much more intuitive.
The example Apple gave during its demo last week was: “For example, a user can say, ‘Play that podcast that Jamie recommended,’ and Siri will locate and play the episode, without the user having to remember whether it was mentioned in a text or an email.”
That would be very useful indeed. Siri will also draw on ChatGPT if you request it to, which will allow for much more sophisticated responses, given ChatGPT is one of the most sophisticated AI chatbots around.
4. Not everything AI will be handled on the device
Apple will use the processing power of the iPhone, iPad or Mac to perform an AI task on the device if it can. For simple tasks like summarising emails or searching Notes, that should be fairly quick and useful. However, if Apple Intelligence determines that it needs more information, either general knowledge from the web, or additional data owned by an Apple user, it will need to look beyond the device for answers.
In the first instance, it will connect to new servers that Apple has dubbed Private Cloud Compute, which run large models on the computer chips Apple has created itself. This is a bit like what happens when an Apple user connects to their iCloud account, but with a layer of AI included, too.
If Apple Intelligence needs a lot of external knowledge to answer the question, it will draw on ChatGPT, in which case the request is sent to OpenAI, Apple’s artificial intelligence partner. That was enough to have SpaceX and Tesla founder Elon Musk threatening to ban iPhones from his workplaces, dubbing them a security risk.
But Apple says that ChatGPT requests originating in Siri will have their IP (internet protocol) addresses obscured and will not be stored by OpenAI. Users will also be notified that ChatGPT is being drawn on for the answer, so can decide not to use that method.
5. No big bang, but a gradual roll-out of AI
Since Apple Intelligence was launched last week, it has emerged that Apple will be taking a cautious approach to its rollout in the coming months. Some features may not arrive until next year.
As Bloomberg reported: “When the software ultimately launches in the fall [NZ’s spring], it will arrive as a preview, signalling to users that it’s not quite ready for prime time. It will only work on a subset of Apple’s devices and only in American English. In some cases, users may even have to join a waitlist to use features.”
Users will be able to access many of the upgrades to Apple’s operating systems, such as the ability to summarise emails and create Genmoji. But key Siri improvements will come later, as will aspects of the ChatGPT integration.
In essence, it may be well into 2025 before Apple users with compatible devices get the full Apple Intelligence experience. The staggered rollout means that Apple can test the waters, stress-test features and scale up its private cloud capacity as required.
Will it be simple enough?
Apple Intelligence represents a huge upgrade for Apple and the user experience across its key platforms of iOS (iPhone), iPadOS (iPad) and macOS (Macs and MacBooks).
The staggered roll-out suggests Apple is wary of biting off more than it can chew. There’s a lot of complexity involved with a new partnership with OpenAI, Apple’s new private cloud infrastructure, a supercharged Siri and a large number of AI-powered features being turned on.
There’s a risk that a misstep, whether a security or privacy lapse or AI going rogue, could taint the Apple Intelligence brand. An equally big risk is that Apple Intelligence launches without nailing Apple’s trademark simplicity and user-friendliness. If it is too fiddly to use, confusing or turns out to be more hassle than what it replaces, Apple is in trouble.
Many Apple users will miss out on the experience anyway, until they upgrade to hardware compatible with Apple Intelligence. As such, Apple’s AI revolution won’t really take off until late 2025, by which point it will have had time to smooth out the wrinkles and deliver an AI experience worth having.