Apple Intelligence is here, but there’s still a lot to learn (Is Apple too late to make its leap in artificial intelligence?)
Some of what’s coming looks genuinely promising, and Apple is catching up to other companies in the field of artificial intelligence. But no phone maker has yet created a cohesive set of time-saving AI tools. Apple might be arriving late, but the game is just getting started.
Other features, like smart replies, do what they promise but lack a human touch. I didn’t send any of the suggested responses because none of them conveyed the right feeling. If I’m going to take the time to respond to a text, I might as well just write “That’s tough” myself rather than have AI do it, you know? Isn’t that part of the point of texting someone? I also prompted Photos to create a memory of moments with my kid, which it did, but it titled the result the eerily impersonal “Joyous Moments with Child.”
There’s also an upgraded Siri. It looks different, but you don’t have to use it for long to realize it’s mostly the same old assistant with the same features. It handles natural language better and includes more product knowledge to help you find settings on your iPhone, but that’s about it right now. Apple has promised big updates for Siri down the road, and features like a ChatGPT extension are scheduled to arrive by the end of the year. The ability to take action inside apps will be one of the biggest developments in the foreseeable future.
Source: Apple Intelligence is here, but it still has a lot to learn
Clean Up: Using AI on the iPhone to Remove Objects from Photos and Summarize Texts and Doorbell Notifications
The new Clean Up tool lives in your photo editing options. It’s designed to remove objects from a scene in seconds: you tap, highlight, or circle the objects you want removed, and because it runs on-device, you only have to wait a few moments before the selected object (mostly) disappears.
For smaller objects in the background, the tool does a good job. But it’s only about as good as Google’s older Magic Eraser tool in Google Photos; occasionally it’s better, but it’s no match for Google’s Magic Editor, which uses generative AI for remarkably convincing object removal. Magic Editor runs in the cloud, though, so that’s a bit of an apples-to-oranges comparison. I can use Google Photos’ on-device Magic Eraser tool on my four-year-old iPhone 12 Mini, and the results are pretty close to what I get with Clean Up running on the iPhone 16, which is not a great argument for the AI phone upgrade cycle.
There’s something undeniably ironic about artificial intelligence summarizing a string of gossipy text messages or notifications from your doorbell. But it also surfaced important information in a string of texts from a friend when I wasn’t able to read them right away, and that was genuinely helpful.
In the Mail app, AI summaries appear where the first line of an email would normally show up when you’re viewing an entire inbox; there’s also an option to summarize individual emails. Maybe it’s a reflection of how useless email has become, but I didn’t find either of these features terribly helpful. We already have something that summarizes an email. The subject line. Most of the time, I get emails that are short and to the point. Maybe Tim Cook saves himself a lot of time reading long emails, but personally, I could live without a little summary of every email the DNC sends me asking for three dollars by midnight.
Apple uses AI to summarize groups of notifications so you can catch up on what you missed faster, and a new Focus mode tries to let through only the notifications it deems important. After a week of using these features, I don’t feel like they saved me much time or energy.
One summary of my work emails included the phrase “medical emergency,” so I checked my inbox to see what was going on. It turned out someone was running late due to a medical emergency but assured everyone they were fine. Glad to hear it, but it wasn’t an urgent work email, and the summary sent me to my inbox when I didn’t need to go. More than once, I found myself clicking into notifications because Apple Intelligence highlighted something that sounded important but wasn’t.
Summaries are a big part of Apple Intelligence. You can use them to get an overview of web pages and even your notifications. If you have a pile of messages from a group chat, the summary will highlight the important things that were said, and you can click in to see the full details. I have yet to get much use out of this, as my summaries are often garbled.
Messages and Mail offer Smart Replies based on the context of the conversation, like “Thank you” or “Sounds good.” This can be useful, but it’s hard to get excited about a feature that email services like Gmail have offered for years.
Typing to Siri, though, is new: you can now type your requests instead of speaking them, finally catching up with other assistants that have had this capability for a while. Siri also has a new design, with a glowing effect around the edges of the screen, and it’s a little better at understanding queries, even if you trip up while asking the question. But it feels almost the same in day-to-day use despite the new coat of paint, and you might come away feeling a bit let down.
So what can Apple Intelligence do right now? Writing Tools can help you rewrite, proofread, or summarize your text anywhere in the operating system. Rewrite changes a sentence’s tone from casual to professional, for example, while Proofread fixes typos and improves grammar. Too bad it’s nearly impossible to remember this feature exists, because it only shows up when you highlight words. Perhaps Writing Tools would be better as a little button built into the virtual keyboard.