Everything to know about Apple's AI features coming to iPhones, Macs, and iPads

During WWDC 2024, Apple poured a big vat of artificial intelligence onto expectant viewers, leaving us drenched in new AI features under the banner of Apple Intelligence. But how do all these features work?

Siri is getting an upgrade, inside and out. The virtual assistant has a new look and a new set of features. The Siri logo has been redesigned, and instead of the familiar Siri orb appearing when users talk to the assistant, a glowing, colorful light will wrap around the edges of the display.

Looks aside, Siri will now understand and respond more naturally, making interactions feel more human-like. The AI assistant will also be able to maintain context from one request to the next to answer follow-up questions accurately. 

Also: Every iPhone model that will get Apple’s iOS 18 (and which ones won’t)

With awareness of what's on your screen and the content on your iPhone, iPad, or Mac, Siri can also draw inferences from things you've received in email, photos in your library, or messages. Apple shared an example of Siri adding an address to a contact card after a friend shared it in a text message.

The voice assistant will also be able to perform hundreds of new actions across Apple and third-party apps, like opening articles from a Reading List or looking up a specific photo in your library. 
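
Apple hasn't spelled out exactly how third-party apps will expose these actions to Siri, but they are widely expected to build on the existing App Intents framework, which already powers Shortcuts. As a rough sketch, an app might declare an action like this; the intent name, parameter, and behavior below are hypothetical, not Apple's actual Reading List code:

```swift
import AppIntents

// Hypothetical example: "OpenArticleIntent" and its parameter are
// illustrative stand-ins, not Apple's actual implementation.
struct OpenArticleIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Article"

    @Parameter(title: "Article Title")
    var articleTitle: String

    func perform() async throws -> some IntentResult {
        // A real app would look up the saved article and navigate to it here.
        print("Opening article: \(articleTitle)")
        return .result()
    }
}
```

Intents declared this way are how apps already surface actions to Shortcuts; the expectation is that Siri will be able to discover and invoke those same intents.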

Apple is also adding the ability to type to Siri, allowing users to type or speak their requests as needed.

For months, Apple was rumored to be working on different ways to keep its AI running strictly on device for security and privacy. However, Apple Intelligence is expected to rely on the cloud for at least some tasks, though the company is prioritizing on-device processing for enhanced privacy. Whether a specific task will be processed on-device or in the cloud will depend on task complexity, resource availability, data privacy considerations, and latency requirements.

Essentially, if a task is simple enough to run on the device's own processor and requires immediate results, it is more likely to be handled on-device. Tasks involving sensitive data could also stay on-device, given Apple's emphasis on data privacy.

Also: Why are Apple’s AI features not coming to lower-end iPhones? Here’s my guess as an IT expert

Cloud-based AI processing, in turn, requires sending data from the device to remote servers that can handle complex or computationally heavy tasks. In Apple's case, that could include requests that involve large amounts of data or up-to-date models, such as intricate analysis and advanced generative AI requests.

Apple is using what it calls Private Cloud Compute for complex tasks that require cloud servers. These requests draw on larger server-based models while protecting user privacy: the servers are built on Apple Silicon, and user data is never stored in the cloud.

An algorithm will determine, based on a task's complexity and system requirements, whether it should be processed on-device or offloaded to the cloud. Simpler tasks, like a basic Siri request or other lightweight natural language processing (NLP), can be handled on-device. More complex tasks, like generating a detailed summary of a large document, will be sent to the cloud, where more robust processing can occur.
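
Apple hasn't published that routing logic, so the following is only a minimal sketch of the kind of heuristic described above. Every name and threshold here, from TaskProfile to the complexity cutoff, is a hypothetical stand-in, not Apple's implementation:

```swift
// Hypothetical sketch of the on-device vs. cloud decision described
// above. All names and thresholds are illustrative, not Apple's.
struct TaskProfile {
    let complexity: Int            // e.g. 1 (basic Siri request) to 10 (long-document summary)
    let involvesSensitiveData: Bool
    let needsImmediateResult: Bool
    let needsLargeServerModel: Bool
}

enum ProcessingTarget {
    case onDevice
    case privateCloudCompute
}

func route(_ task: TaskProfile) -> ProcessingTarget {
    // Sensitive data stays local whenever the device can handle the job.
    if task.involvesSensitiveData && !task.needsLargeServerModel {
        return .onDevice
    }
    // Simple, latency-sensitive tasks run on-device.
    if task.complexity <= 5 && task.needsImmediateResult && !task.needsLargeServerModel {
        return .onDevice
    }
    // Everything else is offloaded to Private Cloud Compute.
    return .privateCloudCompute
}

// A basic Siri request stays local:
print(route(TaskProfile(complexity: 2, involvesSensitiveData: false,
                        needsImmediateResult: true, needsLargeServerModel: false)))

// Summarizing a large document goes to the cloud:
print(route(TaskProfile(complexity: 9, involvesSensitiveData: false,
                        needsImmediateResult: false, needsLargeServerModel: true)))
```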
