Apple's Private Cloud Compute: What you need to know
Aditi
Jun 26, 2024
TL;DR
"You shouldn't have to hand over all the details of your life to be warehoused and analysed in someone’s AI cloud," says Craig Federighi, Apple's Senior Vice President of Software Engineering. while introducing Private cloud compute. Private Cloud Compute (PCC) is a privacy-centric AI system to keep your digital life secure and private, away from AI to train and store your personal information.
What's the Big Deal about PCC?
AI is like a super-smart assistant, but it needs your data to learn and get better. That's where things get tricky. Powerful AI hardware in the data center can fulfill a user's request with large, complex machine learning models, but it needs to see the user's data unencrypted to do so. Apple's PCC, on the other hand, is like a private vault for your data. It runs on Apple's own custom hardware, so no one else gets a way in. Encryption and layered security measures make it nearly impossible for anyone, including Apple, to access your information.
How it works:
Think of your Apple device as having a built-in brain – it's pretty smart, but sometimes it needs help with tougher tasks. That's where Private Cloud Compute (PCC) comes in.
Here's how it happens (a rough code sketch follows the list):
You ask Siri for restaurant recommendations or use an AI-powered feature on your iPhone.
Your device tries to figure out the answer on its own. If it can, great! Your data stays right where it is – safe and sound on your phone.
If the task is a bit too complex, your device calls in PCC, the AI expert.
Your request doesn't go straight to PCC. It's routed indirectly, through an independent relay, so no one can tell who sent it or where it came from.
Only the bare minimum information needed to answer your question gets sent to PCC.
Once Private Cloud Compute has done its job, it deletes the user's data; no user data is retained in any form after the response is returned.
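To make the flow concrete, here's a minimal Swift sketch of that device-first, cloud-fallback pattern. Every type, name, and threshold in it is hypothetical (none of this is an Apple API); it just mirrors the steps above: try the local model first, and only hand the bare minimum to PCC when the device can't manage on its own.

```swift
import Foundation

// Hypothetical types for illustration only; none of these are Apple APIs.
struct AssistantRequest {
    let prompt: String
}

struct OnDeviceModel {
    // Returns nil when the task is too complex for the local model
    // (the length check is just a stand-in for a real capability decision).
    func respond(to request: AssistantRequest) -> String? {
        request.prompt.count < 200 ? "answered locally" : nil
    }
}

struct PrivateCloudClient {
    // Sends only the minimum data needed for this one request, over an
    // encrypted connection routed through a relay that hides the sender.
    func respond(to request: AssistantRequest) async throws -> String {
        // ... anonymized, encrypted network call would happen here ...
        // Per Apple's stated design, the server deletes the request data
        // as soon as the response is returned; nothing is retained.
        return "answered by PCC"
    }
}

func handle(_ request: AssistantRequest) async throws -> String {
    // 1. Try to answer on-device; in this path your data never leaves the phone.
    if let local = OnDeviceModel().respond(to: request) {
        return local
    }
    // 2. Otherwise fall back to Private Cloud Compute.
    return try await PrivateCloudClient().respond(to: request)
}
```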
Apple Intelligence architecture in a nutshell:
There are two on-device models, each with roughly 3 billion parameters: one for language and actions, another for image generation. Around them sits the rest of the stack (a rough code sketch follows the list):
Larger Apple models running on Private Cloud Compute servers.
An orchestrator that directs Siri queries to the appropriate models (device-based, server-based, or third-party like ChatGPT).
A semantic index that catalogs all local data for retrieval-augmented generation (RAG).
An app intents toolbox that indexes all available actions.
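To show how these pieces could fit together, here's a rough Swift sketch of the orchestrator and the semantic index. The type names, routing heuristics, and keyword-based retrieval are placeholders of mine, not Apple's implementation; they only illustrate routing a query between device, server, and third-party models, and doing a RAG-style lookup over locally indexed data.

```swift
import Foundation

// Illustrative sketch only; these type names and heuristics are assumptions,
// not Apple's actual implementation.
enum ModelRoute {
    case onDeviceLanguage   // ~3B-parameter local language/action model
    case onDeviceImage      // local image-generation model
    case privateCloud       // larger Apple models running on PCC
    case thirdParty         // e.g. ChatGPT, only with explicit user consent
}

struct Orchestrator {
    // Decide where a query should run.
    func route(query: String, isImageRequest: Bool, needsWorldKnowledge: Bool) -> ModelRoute {
        if isImageRequest { return .onDeviceImage }
        if needsWorldKnowledge { return .thirdParty }
        // Stand-in heuristic: long or complex prompts go to the server models.
        return query.count > 500 ? .privateCloud : .onDeviceLanguage
    }
}

struct SemanticIndex {
    // Catalog of local data (mail, notes, messages, ...) used for
    // retrieval-augmented generation.
    let documents: [String]

    // Naive keyword match standing in for real embedding search.
    func retrieve(relevantTo query: String, limit: Int = 3) -> [String] {
        let words = query.split(separator: " ")
        let matches = documents.filter { doc in
            words.contains { doc.localizedCaseInsensitiveContains($0) }
        }
        return Array(matches.prefix(limit))
    }
}
```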
Apple Intelligence takes a unique approach: most AI tasks are handled by the on-device models, and only the tougher ones are handed off to Private Cloud Compute. Because a large foundation model has to see the data to process a request, full end-to-end encryption isn't possible. Instead, Private Cloud Compute enforces strict rules about how it handles user data (sketched in code after the list):
User data is sent to PCC only to fulfill the request and is deleted immediately after.
PCC does not retain user data once the request is completed.
User data is never accessible to Apple, even to staff with administrative access.
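In code terms, those guarantees amount to a stateless request handler: the data exists in memory only for the duration of the call, and nothing is logged or written to disk. Here's a conceptual Swift sketch of that contract; the "PCCNode" type and its parameters are hypothetical, not Apple's code.

```swift
import Foundation

// Conceptual sketch of the stateless-handling guarantee; not Apple's code.
struct InferenceRequest {
    let encryptedPayload: Data
}

struct PCCNode {
    // Handles exactly one request: decrypt, run inference, return, keep nothing.
    func handle(_ request: InferenceRequest,
                decrypt: (Data) -> String,
                runModel: (String) -> String) -> String {
        // The user's data exists only in memory, only for this call.
        let userData = decrypt(request.encryptedPayload)
        let response = runModel(userData)
        // No logging, no disk writes, no retained copy: once this function
        // returns, the data is gone, and Apple states that even staff with
        // administrative access cannot reach it while it is being processed.
        return response
    }
}
```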
Is it really that safe?
To make Private Cloud Compute harder to subvert, Apple has adopted an approach it calls "verifiable transparency". While no system is completely foolproof, Apple has gone to great lengths to make PCC as secure as possible: it will publish every production Private Cloud Compute software image for independent binary inspection and offer rewards to anyone who finds a flaw. So while there's always some residual risk, PCC is about as close as you can get in the digital world right now.
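Conceptually, verifiable transparency means a client can check the software measurement a PCC node attests to against the public log of published images before sending it anything. The Swift sketch below shows that check with data structures I've invented for illustration; Apple's real attestation protocol is considerably more involved.

```swift
import Foundation
import CryptoKit

// Hypothetical structures, for illustration only.
struct Attestation {
    // Measurement of the software image reported by the node's boot chain.
    let softwareImageHash: String
}

struct TransparencyLog {
    // Hashes of every production PCC software image published for inspection.
    let publishedImageHashes: Set<String>
}

// Refuse to send user data to any node whose software was never published
// for independent binary inspection.
func shouldSendRequest(to node: Attestation, log: TransparencyLog) -> Bool {
    log.publishedImageHashes.contains(node.softwareImageHash)
}

// One way a log entry might be derived from the released image bytes.
func measurement(of imageBytes: Data) -> String {
    SHA256.hash(data: imageBytes).map { String(format: "%02x", $0) }.joined()
}
```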
Why Should You Care?
Apple's emphasis on privacy contrasts sharply with Microsoft's Recall AI privacy blunders, and PCC is a step in the right direction. It means you can enjoy the benefits of AI without sacrificing your personal information. Plus, it sets a new standard for the industry, showing that it's possible to build powerful AI systems that respect your privacy. The big players should start paying attention to this approach; let's hope they step up!