Discover the New Privacy Features in Apple Intelligence

Apple announced "Apple Intelligence" at its Worldwide Developers Conference (WWDC) this week, officially joining the artificial intelligence race. Apple Intelligence will help you edit photos, improve emails, create emojis, and enhance Siri. However, the most impressive additions may be the enhanced privacy features.

Apple’s new privacy features will change the way you use your iPhone, iPad, and Mac by making them smarter and more secure. Here’s what you need to know!

What is Apple Intelligence?

Apple Intelligence is a set of new features that use artificial intelligence (AI) to make your Apple devices more helpful and personalized. This means your device can better understand your needs and help you with tasks like editing photos, sending emails, and even finding your favorite songs. But what makes Apple Intelligence special is its focus on privacy and security.

Privacy at the Core

Apple Intelligence is designed with privacy in mind. Craig Federighi, Apple's senior vice president of Software Engineering, said that Apple Intelligence has "groundbreaking privacy protection." He and his team made sure data privacy is foundational to Apple Intelligence, claiming that "your personal information remains entirely yours and under your control. And no one, not even Apple, would have any visibility into that information, even if our data center was processing your request." (Fast Company).

Great soundbites, but what does this mean for you?

On-Device Processing

When you enter a query in Google, DuckDuckGo, ChatGPT, Gemini, or basically any other search platform or AI chatbot, your query is sent to a data processing center (the “cloud”) where it is processed and returned to your device. The app on your device provides limited security and formats the returned data. None of the "heavy lifting" is done on your device.

Apple Intelligence changes this by processing some queries directly on your device, an approach called “on-device processing.” Because the work happens locally, your sensitive and private information stays on your device and under your control.

For example, if you ask Siri to find a photo or edit a picture, your device will handle the task without sending your data to the cloud. Because the picture never leaves your phone, it is much less likely to be hijacked by hackers, stolen, or displayed without your permission. (See Hunter Biden).
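To make the contrast concrete, here is a purely illustrative Swift sketch, not Apple's actual code. The endpoint, types, and captions are hypothetical: the cloud path ships your query off the device, while the on-device path only ever touches data already on your phone.

```swift
import Foundation

// Purely illustrative; not Apple's actual implementation.
// The endpoint, types, and captions below are hypothetical.

struct PhotoQuery {
    let text: String
}

// Cloud path: the query leaves the device and is processed on a remote server.
func processInCloud(_ query: PhotoQuery) async throws -> String {
    var request = URLRequest(url: URL(string: "https://example.com/photo-search")!)
    request.httpMethod = "POST"
    request.httpBody = Data(query.text.utf8)
    let (data, _) = try await URLSession.shared.data(for: request)
    return String(decoding: data, as: UTF8.self)
}

// On-device path: the query is matched against captions stored locally,
// so nothing about your photo library ever leaves the phone.
func processOnDevice(_ query: PhotoQuery, localCaptions: [String]) -> [String] {
    localCaptions.filter { $0.localizedCaseInsensitiveContains(query.text) }
}
```

The privacy difference is in the first line of each function: one builds a network request, the other never needs one.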

The Problem with On-Device Processing

On-device processing is limited by the computing power of your device. The iPhone 15 Pro Max processes data on a gigaFLOPS scale (billions of floating-point operations per second). In contrast, the server-grade infrastructure that ChatGPT runs on processes data on a petaFLOPS scale, roughly 1,000,000 times greater. All else being equal, an average Siri query processed on-device could take a million times longer than the same query processed in the cloud. Who has the patience for that?
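That “million times” figure is just the ratio between the two prefixes. A quick back-of-the-envelope check, taking the article's numbers at face value:

```swift
// Taking the figures above at face value:
let gigaFLOPS = 1e9     // ~10^9 operations per second (device-class, per the article)
let petaFLOPS = 1e15    // ~10^15 operations per second (server-class)
print(petaFLOPS / gigaFLOPS)   // 1000000.0 — the "million times" gap cited above
```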

Private Cloud Compute

Apple addresses this computing-power problem with "Private Cloud Compute." When you ask your device to do something it can’t handle on-device (in a reasonable amount of time), the request is sent to Apple’s secure servers to get the job done. But don’t worry, Apple has put safety measures in place to protect your data (a simplified sketch of the idea follows the list):

  1. Limited Data Sharing: Only the necessary information is sent to the cloud, reducing the amount of data that leaves your device.

  2. No/Limited Data Retention: Apple promises that it will not keep your data after the task is completed.

  3. Data Encryption: All data sent to and from the servers is encrypted, making it very hard for anyone to access your information without permission. (ZDNet).
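To make those three safeguards a little more concrete, here is a simplified Swift sketch of the general shape of such a flow. Every name is hypothetical, and the real Private Cloud Compute protocol (key exchange, server attestation, transport) is far more involved:

```swift
import Foundation
import CryptoKit

// Simplified sketch only — not Apple's actual Private Cloud Compute protocol.
// All names are hypothetical; key exchange and attestation are omitted.

struct CloudTask {
    // 1. Limited data sharing: only the snippet needed for this task is included,
    //    never the whole mailbox or photo library.
    let relevantSnippet: Data
}

// 3. Data encryption: the payload is sealed on-device before it is transmitted.
func prepareForCloud(_ task: CloudTask, key: SymmetricKey) throws -> Data {
    let sealed = try AES.GCM.seal(task.relevantSnippet, using: key)
    return sealed.combined!   // nonce + ciphertext + auth tag (always present with the default nonce)
}

// 2. No/limited data retention: the server decrypts, computes, and returns a
//    result without writing the plaintext anywhere persistent.
func handleOnServer(_ payload: Data, key: SymmetricKey) throws -> Data {
    let box = try AES.GCM.SealedBox(combined: payload)
    let plaintext = try AES.GCM.open(box, using: key)
    return runModel(on: plaintext)   // plaintext goes out of scope when this returns
}

// Stand-in for the actual model inference.
func runModel(on data: Data) -> Data { data }
```

The point of the sketch is the flow, not the code: minimize what leaves the device, encrypt it in transit, and keep nothing once the answer comes back.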

Ability to Opt Out

These features definitely sound like they will improve your data security. But let’s be real: Apple is partnering with OpenAI, a company that has been the subject of privacy and data-collection concerns since ChatGPT was released.

Like OpenAI, Apple gives you the option to opt out of Apple Intelligence features. According to Apple, using the new AI features is entirely your choice, and if you change your mind, you can opt out at any time.

Apple states that ChatGPT will be used only for isolated tasks like email composition and other “writing tools,” and that the data sent to ChatGPT’s servers will be limited to what is necessary to complete the task, reducing the exposure of your personal data.

Third-Party Verification

In addition to the other security features, Apple will publish its Private Cloud Compute software for third-party verification. This means that data privacy experts and researchers can access the software and subsets of the actual code, giving them what they need to run their own security tests and uncover vulnerabilities. Apple is also offering incentives for experts who find problems through the Apple Security Bounty program. This “verifiable transparency” is meant to ensure that Apple’s systems meet high privacy standards. (Apple).

Why This Matters

Apple has historically made privacy and security a priority in its products. These new privacy features set a new standard for how AI can be used safely and responsibly. Apple’s focus on privacy means you can enjoy the benefits of AI without worrying about your personal data being misused. As Tim Cook, Apple’s CEO, said, Apple Intelligence aims to “apply this technology in a responsible way.” (The Guardian).

As Cliff Steinhauer, director of information security and engagement at the National Cybersecurity Alliance, notes, “Apple is saying a lot of the right things, but it remains to be seen how it’s implemented.” Assuming Apple follows through on its focus on data privacy and security, the features included with Apple Intelligence are a step in the right direction.
