A first look at Apple Intelligence and its (slightly) smarter Siri

In the latest developer preview of iOS 18, Siri takes center stage. Like, the whole phone glows around the edges when you summon Siri.

A home screen reintroduces you to the virtual assistant once you activate Apple Intelligence, an early version of which is now available on the iPhone 15 Pro and Pro Max in a developer beta. You’ll know Siri is listening when the edges of the screen glow, making it pretty obvious that something different is going on.

The big Siri AI update is still months away. This version comes with significant improvements in language understanding, but future updates will add features like recognizing what’s on the screen and the ability to take action on your behalf. Meanwhile, the rest of the Apple Intelligence feature set introduced in this update feels like a party waiting for the guest of honor.

That said, the Siri improvements in this update are useful. Double-tapping the bottom of the screen opens a new way to interact with the assistant: through text. It’s also much better at parsing natural language, waiting patiently through hesitations and “um”s as I stumble through questions. It also understands when I’m asking a follow-up question.

Double-tapping the bottom of the screen opens a text box that you can use to talk to Siri.

The new Siri understands context in follow-up questions, like this one after I asked about the weather in Olympia.

Outside of Siri, it’s something of an Easter egg hunt to find bits of Apple’s intelligence scattered throughout the operating system. They’re in the email app, now with a summary button at the top of each email. And anywhere you can type and highlight text, you’ll find a new option called “writing tools” with AI proofreading, writing suggestions, and summaries.

“Help me write something” is a pretty common generative AI feature these days, and Apple Intelligence does it as well as anyone. You can make your text more friendly, professional, or concise. You can also summarize it, either as a bulleted list of key points or as a table.

I find these tools most useful in the Notes app, which now supports voice recordings. In iOS 18, voice recordings finally come with automatic transcriptions, which is not an Apple Intelligence feature, as it also works on my iPhone 13 Mini. But Apple Intelligence will let you turn a recording transcript into a summary or checklist. This is useful if you just want to free-associate while recording a memo, listing a bunch of things you need to pack for an upcoming trip; Apple Intelligence turns it into a list that actually makes sense.

Honestly, this transcription is really good.

Apple Intelligence turned my disjointed list into a sleek little table.

These writing tools are tucked out of the way, and if you didn’t go looking for them, you might miss them completely. The most obvious new AI features are in the Mail app. Apple Intelligence displays what it considers important emails in a card marked as priority that sits above the rest of your inbox. Below that, emails show a brief summary in place of the first line or two of text you would normally see.

There’s something charming about the AI’s sincere attempt to summarize promotional emails, trying to extract details like “Backpacks and lunch boxes ship for FREE” and “Organic white nectarines are sweet and juicy, in season now.” But the descriptions in my inbox were accurate—helpful in some cases and harmless at worst. And the emails it prioritized were genuinely important, which is promising.

The Photos app’s search tool now uses AI to understand more complicated requests. You can ask for photos of a certain person wearing glasses or all the food you ate in Iceland, all in natural language.

Despite the light show, Siri is almost the same as always

And it’s very good. Results come back quickly and are generally reliable. It found the photo I had in mind of my son wearing a pair of stupid glasses, although it also surfaced photos of him with someone else who was wearing glasses. Still, I think it’s a feature people will get used to immediately and won’t think twice about using — intuitive and obviously useful.

But despite the light show, Siri is almost the same as always. For the most part, it remains a “let me Google that for you” machine. The most significant changes are yet to come, when Siri becomes aware of what’s on your screen and can take action in apps for you. In theory, you could have Siri pull information from Messages and turn it into calendar events, or retrieve information from an email without having to dig through your inbox.

That’s what I’m most excited about, and all the pieces of Apple Intelligence available so far could be the building blocks for a better Siri. Apple’s AI is capable of understanding the content of an email or photo. Likewise, Siri better understands how humans speak. For Apple Intelligence to prove its worth, Siri needs to connect the dots.
