Introducing Apple Intelligence: A Sneak Peek at the Enhanced Siri


In the latest developer preview of iOS 18, Siri receives a notable upgrade. The new feature includes an eye-catching visual cue: the edges of your iPhone light up when you activate Siri.

With the introduction of Apple Intelligence, now available in a developer beta on the iPhone 15 Pro and Pro Max, Siri’s listening mode is more visually distinct. The glowing edges make it clear when Siri is active, signaling that something has changed.

Although the major Siri AI overhaul is still several months away, the current update brings substantial improvements in language comprehension. Future iterations promise even more advanced features, such as contextual awareness of your screen and the ability to perform tasks on your behalf. For now, the Apple Intelligence features included in this update feel like a preview of what’s to come.

Among the enhancements, Siri's ability to interpret natural language has notably improved. A new text-based interaction mode, accessed by double-tapping the bottom of the screen, adds flexibility. Siri also handles pauses and hesitations more gracefully, and it now comprehends follow-up questions more effectively.

Beyond Siri, Apple Intelligence is subtly integrated throughout iOS. For instance, in the Mail app, a new “summarize” button provides quick overviews of emails. The “writing tools” feature, available in text input areas, offers AI-powered proofreading, writing suggestions, and summaries.

The “Help me write something” function, a staple of generative AI, is present and effective. You can refine your text to be more friendly, professional, or concise, and generate summaries or key points. In the Notes app, where voice recordings are now automatically transcribed (a feature available on older iPhones as well), Apple Intelligence can convert these transcriptions into summaries or checklists. This is particularly useful for organizing thoughts or creating task lists from recorded memos.

While these writing tools are somewhat tucked away, the more prominent AI features are found in the Mail app. Here, important emails are highlighted in a priority card above your inbox, and email summaries replace the initial lines of text you’d normally see. The AI’s attempts to summarize promotional emails are both charming and functional, extracting relevant details effectively.

The Photos app now includes an AI-driven search tool that understands more complex queries, like finding photos of a person wearing glasses or all the food you ate in Iceland. This feature is intuitive and generally accurate, quickly locating the images you’re looking for.

Despite these improvements, Siri remains largely the same in its core functionality, often resorting to “let me Google that for you” responses. The most exciting updates, such as Siri’s ability to understand the content of your screen and take actions within apps, are still on the horizon. These features have the potential to make Siri a more integrated and capable assistant, leveraging Apple Intelligence’s advancements in understanding and processing information.
