Apple Intelligence is arriving without many of the most hyped features that Apple announced in June. Although the company struck a deal with OpenAI to include ChatGPT in its software, the chatbot will not be part of this initial release. Siri also isn’t smart enough (yet) to do things like stitch together data from multiple apps to tell you whether a last-minute meeting will make you late for your child’s play. Apple said those features and others would be gradually rolled out through next year.
To get a sneak preview, I tested an early version of Apple Intelligence over the last week. The new features were a little tricky to find – they have been integrated into different parts of Apple’s software, including its buttons for editing text and photos.
I found a few features, including tools for proofreading text and transcribing audio, to be very handy. Others, like a tool for generating summaries of web articles and a button for removing unwanted distractions from photos, were so hit or miss that they should be ignored.
This is all to say that Apple Intelligence is worth watching over the next few years to see whether it evolves into a must-have product, but that it’s not a compelling reason to splurge on new hardware.
Apple Intelligence will work on the latest iPhone 16s and last year’s iPhone 15 Pro, as well as on some iPads and Macs released in the last four years. Here are the tools that will be most useful and the ones you can skip when the software lands on devices this month.
Apple Intelligence tools that are useful
Transcribe audio recordings
Apple Intelligence delivers a feature that feels long overdue: when you use the Voice Memos app to record audio, the app will now automatically produce a transcript alongside the file.
As a journalist who regularly records interviews, I was gung-ho about trying this tool and pleased that it worked well. When I met with a tech company last week, I pressed the record button in the app, and after I hit stop, the transcript was ready for me. Apple Intelligence detected when a different person was speaking and started a new paragraph in the transcript accordingly. It transcribed some words incorrectly when a speaker mumbled, but overall, the transcript made it easy to look up a keyword and pull up a portion of the conversation.
Ask Siri for help with an Apple product
While any smartphone or tablet may be easy to pick up, Apple’s software has grown increasingly complex over the years, and many of its most useful features are hard to find. Apple Intelligence has imbued Siri with the ability to offer help with navigating Apple products.
I can never remember, for the life of me, how to run two apps side by side on the iPad, for instance. So I asked Siri, “How do I use split screen on the iPad?” Siri quickly showed me a list of instructions, which involved tapping a button at the top of an app.
Ironically, Siri could not offer help on how to use Apple Intelligence to rewrite an email. Instead, it loaded a list of Google search results showing other websites with the steps.
Speed through writing
Speaking of email, Apple Intelligence includes writing tools to edit your words, and it can even generate canned email responses.
I used the automatic response tool to quickly shoo away a salesperson at a car dealership: “Thanks for reaching out. I’m no longer interested in purchasing a vehicle at this time.”
As for editing text, I highlighted an email I had quickly written to a colleague and hit the “Proofread” button. Apple Intelligence edited the text to insert punctuation that I had skipped.
Apple AI tools you can ignore
Removing distractions from photos
One of Apple Intelligence’s most anticipated features is the ability to automatically edit a photo to remove a distraction, such as a photo bomber in an otherwise perfect family portrait. Plenty of people will want to try this tool, called Clean Up, but prepare to be disappointed.
To try it, I opened a photo I had shot of family members at an outdoor wedding a few years ago. I hit the “Clean Up” button, hoping to remove the people sitting on lawn chairs in the background. The software deleted the people and lawn chairs, but replaced them with an unintelligible jumble of black-and-white pixels.
I tried the tool again on a photo of my corgi, Max, sleeping on my couch next to a blanket. Apple Intelligence removed the blanket and tried to reproduce the couch cushion. Instead, it generated a deep, unflattering butt groove.
Summarising text
Apple seems to think that the internet is filled with too many words. One of Apple Intelligence’s most prominent features is its ability to generate summaries of text in many applications, including emails, web articles and documents.
By pressing the “Summarise” button in the Safari browser, I got a three-sentence summary of a 1200-word New York Times article about the pros and cons of eating tuna. Apple Intelligence summed up the premise of the article – that tuna is a nutritious food that can be high in mercury, and that consumers should consider species of tuna with lower mercury levels.
Unfortunately, in its summary, Apple Intelligence recommended that people consume albacore, one of the species listed in the article as having the highest levels of mercury. This is what’s known in the tech industry as a hallucination, a common problem in which an AI system fails to produce the correct answer and fabricates information instead.
The tool also fell short when summarising my notes. Recently, to prepare for an office meeting, I took notes on three colleagues I was going to meet with. Instead of producing a tight dossier on each person, the tool generated a summary of only one person’s role.
Apple declined to comment.
In summary, you can skip this tool.
This article originally appeared in The New York Times.
Written by: Brian X. Chen
Photographs by: Derek Abella
©2024 THE NEW YORK TIMES