
Apple iPhone 16 Pro and iPhone 16 Pro Max review: The smarter iPhones

Creating summaries seems to be the thing everyone wants AI to do, and Apple Intelligence is happy to oblige. It can summarize your emails, messages, and even notifications from third-party apps. Some of this is handy, like when the Mail app surfaces an urgent-sounding email in its summary that I would have missed if I had just glanced at my overflowing inbox. But most of the time I just swipe the summary away and dive into all the notifications anyway.

Speaking of which, there is a summary feature in Safari, but you have to put the web page into Reader mode first. It's little things like this that make these smart features hard to find and easy to forget. At least I was able to summarize an 11,000-word story and get the gist when I didn't have time to sit down and read it. (Sorry.) I'll forgive you if you summarize this review.

Arguably the most helpful Apple Intelligence features for me, a journalist who attends multiple briefings a month, are the new transcription tools in the Notes, Voice Memos, and even the Phone app. Press record in Voice Memos or Notes and the apps will transcribe conversations in real time. If you're on a call, tap the record button; after both parties are notified, the call will begin recording, and a transcription will be saved in your Notes app.

With all of these recordings, a lot depends on the microphone quality of the person on the other end. Still, it's definitely better than no transcription at all. Too bad there are no speaker labels, like in Google's Recorder app. You also can't search through these recordings to find a specific quote. (Technically, you can if you add the transcription to a note in the Notes app, but that's an extra step.)

The Photos app also gets an Apple Intelligence infusion, and the highlight is the Clean Up feature. Much like Magic Eraser, which Google brought to its Pixel phones more than three years ago, Clean Up lets you erase unwanted objects from the background of your iPhone photos. It works pretty well in my experience, though I'm a little surprised Apple gives you so much freedom to delete just about anything. I completely erased an eye from a selfie. I erased all the fingers from my hand. (Google's feature doesn't let you erase part of a person's face.)

Video: Julian Chokkattu

Next, I deleted the cup that was in front of my face as I took a sip, and Clean Up tried to reconstruct the part of my face it had been hiding, with some appalling results. (I tried this on the Pixel 9, by the way, and the results were just as bad, although Google did give me more options.) As my colleague put it in Slack, "They both seem to have been trained on pictures of Bugs Bunny."

Apple Intelligence will offer even more down the line. Image Playground will let you generate images, Genmoji will let you create new kinds of emoji that currently exist only in your head, and Siri will get better at providing contextual information. But I'll have to revisit Apple Intelligence when those features arrive later this year. Just a reminder: Apple Intelligence comes as part of the next iOS 18 update, but it's only available on select devices: the iPhone 15 Pro, 15 Pro Max, and the entire iPhone 16 lineup.
