While Apple’s technology could be called innovative and privacy-preserving in a narrow sense, its implementation inside a closed ecosystem has dangerous lock-in effects. A properly open, standards-based approach could deliver the same benefits while preserving the user agency and platform choice that actual safety requires. Having led the design and delivery of encrypted virtual machines in 2010 and initiated field-level client-side encryption for NoSQL databases in 2019, I can say with confidence that there are better fundamental paths for AI safety than the one Apple is offering.
Security and Privacy Checklist
Here’s a set of tests to expose systems designed with too much lock-in:
- Does it use open standards for homomorphic encryption in AI applications?
- Does it use interoperable protocols that allow encrypted data sharing across platforms?
- Does it have community-audited implementations?
- Does it use decentralized approaches where encrypted processing happens on edge devices?
- Does it use peer-to-peer networks for sharing encrypted embeddings?
- Are encryption keys and processing user-controlled?
- Are there clear user controls over data usage and sharing?
- Is there an ability to revoke access to historical data?
- Are there options for local-only processing?
Critical Analysis of Apple’s AI Encryption Strategy
I couldn’t help noticing how Apple frames its belief in privacy while leaving the really important part, the barriers to exit, unsaid.
> At Apple, we believe privacy is a fundamental human right [that you lose if you jump ship].
In the 1700s the philosopher David Hume warned that consent is meaningless when the only alternative is leaping overboard: a vendor that lets you leave the ship only in the middle of the ocean is not offering a real choice.
Problem 1: Centralized Infrastructure Control
- While the data remains encrypted, Apple maintains complete control over the homomorphic encryption infrastructure
- Users are dependent on Apple’s proprietary implementation and cannot easily migrate to alternative systems
- The “networked” benefits are confined within Apple’s ecosystem
Problem 2: Encryption Implementation
- The security relies entirely on Apple’s proprietary implementation of homomorphic encryption
- There’s no way to independently audit or verify the encryption process
- Users must trust Apple’s claim that the data remains truly encrypted and private
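The contrast with an open scheme is easy to demonstrate. With a community-audited, additively homomorphic cryptosystem such as Paillier, anyone can verify the core privacy claim (that computation on ciphertexts matches computation on plaintexts) without trusting the operator. A stdlib-only toy sketch with tiny hardcoded primes, deliberately insecure and for illustration only:

```python
import math
import random

# Toy Paillier cryptosystem (additively homomorphic).
# Tiny hardcoded primes -- NOT secure, purely to show the math is auditable.
p, q = 293, 433
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)  # simplified decryption constant, valid since g = n + 1

def encrypt(m: int) -> int:
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    # L(x) = (x - 1) // n, then scale by mu mod n
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# The auditable property: multiplying ciphertexts adds plaintexts,
# so a third party can check correctness without any secret knowledge.
a, b = 1234, 5678
assert decrypt((encrypt(a) * encrypt(b)) % n2) == (a + b) % n
```

Nothing here depends on trusting a vendor: the scheme, the parameters, and the homomorphic property are all open to inspection, which is exactly what a proprietary, unauditable implementation forecloses.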
Problem 3: Platform Lock-in Effects
- By creating a powerful network effect around encrypted data sharing, Apple strengthens its ecosystem lock-in
- The more users contribute encrypted data, the more valuable the system becomes, creating high switching costs
- Competitors would struggle to build comparable systems without similar scale
Problem 4: Data Sovereignty Issues
- Even though data is encrypted, users still lose direct control over how their data moves and is processed
- The evaluation function and global POI database are controlled entirely by Apple
- Users cannot opt out of specific data uses while maintaining platform benefits
Problem 5: Future Risks
- If Apple’s homomorphic encryption is ever compromised, it could expose historical user data
- Apple could potentially modify the system to reduce privacy protections in the future
- Users have no guarantee of long-term data portability