In essence, Apple seeks to balance its privacy pledge with its AI goals by leveraging a strategy centered on on-device processing and a unique Private Cloud Compute architecture.
On-Device Processing
Minimizes Data Collection: The core principle is to keep as much data as possible on the user's device, processing AI tasks locally without sending personal information to Apple's servers.
Examples: Features like Face ID, Touch ID, and Photos' facial recognition are performed entirely on the device, ensuring the data never leaves it.
Private Cloud Compute (PCC)
Handles Complex Requests: For more complex AI tasks that require greater computational power, Apple Intelligence utilizes Private Cloud Compute.
Extends Privacy to the Cloud: PCC is designed to extend the privacy and security of Apple devices into the cloud, ensuring user data is protected even when processed remotely.
Key Privacy Principles of PCC:
Stateless Processing: User data sent to PCC is used only to fulfill the specific request and is never stored on the servers.
No Apple Access: Apple employees do not have privileged access to user data on PCC servers.
Verifiable Transparency: Apple publishes the software running on PCC servers for independent security researchers to inspect and verify the privacy claims.
Additional privacy-enhancing techniques
Differential Privacy: Apple employs differential privacy to learn about overall usage trends without being able to identify individual users or their specific data. A controlled amount of statistical noise is added to each contribution before aggregation, so individual data points cannot be reliably reconstructed from the aggregate.
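The core mechanism behind differential privacy can be sketched in a few lines. The example below is a generic illustration of the standard Laplace mechanism for a counting query, not Apple's actual implementation (the function names and parameters are our own):

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    if u == -0.5:  # avoid log(0) at the boundary (probability ~2**-53)
        u = 0.0
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count: int, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one user
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon suffices.
    """
    return true_count + laplace_noise(1.0 / epsilon)

# Example: report how many users enabled a feature, privately.
noisy = private_count(true_count=10_000, epsilon=1.0)
```

Each individual report is perturbed, yet the noise averages out across millions of users, which is why aggregate trends remain accurate while any single user's value stays hidden.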
Homomorphic Encryption: This advanced encryption technique allows computations to be performed on encrypted data without decrypting it first. This allows Apple to use AI features, such as Enhanced Visual Search in Photos, without ever seeing the raw image content.
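To make the idea of computing on encrypted data concrete, here is a toy implementation of the Paillier cryptosystem, which is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of their plaintexts. This is a teaching sketch with tiny hard-coded primes, not Apple's protocol (Enhanced Visual Search reportedly uses a different, lattice-based scheme), and real deployments use 2048-bit keys:

```python
import math
import random

p, q = 61, 53                 # toy primes -- never use sizes like this in practice
n = p * q                     # public modulus
n2 = n * n
g = n + 1                     # standard generator choice for Paillier
lam = math.lcm(p - 1, q - 1)  # private key

def L(x: int) -> int:
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # precomputed decryption factor

def encrypt(m: int) -> int:
    """Encrypt m (0 <= m < n) with fresh randomness r."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return (L(pow(c, lam, n2)) * mu) % n

# Homomorphic property: the server can add values it can never read.
c1, c2 = encrypt(12), encrypt(30)
assert decrypt((c1 * c2) % n2) == 42
```

The point of the sketch is the last three lines: a server holding only `c1` and `c2` can produce an encryption of `12 + 30` without ever learning either input, which is the property that lets a cloud service answer queries over data it cannot see.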
User Control and Transparency: Apple provides granular settings to control which AI features access data, transparency reports showing what data leaves the device (for example, the Apple Intelligence Report), and requires user opt-in for certain privacy-affecting features like sharing Siri audio recordings.
By prioritizing on-device processing, employing Private Cloud Compute for more demanding tasks, and utilizing advanced privacy techniques like differential privacy and homomorphic encryption, Apple aims to deliver personalized AI experiences while maintaining its strong privacy stance and not "gobbling up" user data indiscriminately.
However, Apple's privacy-preserving approach may come with trade-offs, such as slower feature development compared to competitors with less restrictive data practices. Put bluntly, AI service providers other than Apple are exploiting valuable user information to profit from it quickly. The "black box" problem in AI also makes it hard to understand how some models process data and reach their decisions, raising transparency concerns.
These privacy-neglecting AI companies have faced accusations and lawsuits over unauthorized data collection and misuse, underscoring the importance of robust data protection practices. If AI is truly powerful enough to replace us, let that at least happen without criminal activity as its foundation.
The open-source path chosen by Chinese AI companies such as DeepSeek lets users download models to their own computers. This matters for companies that need to keep trade secrets. Mr. Altman cannot offer such a service, because he craves user data to train his models. Mr. Altman may yet share the fate of the dodo. The poor dodo vanished.
“AI is a risk to Apple's way of doing things in more ways than one.
The technology doesn't just open the door to possibly replace the iPhone. It seriously threatens Tim Cook's ethos around user privacy.
The Current Thing in Silicon Valley, after all, is the idea that personalized artificial intelligence will live and breathe on consumer devices carried every day and everywhere. Listening, watching, recording.
Whether it be smartglasses as Mark Zuckerberg envisions for Meta Platforms or something inspired by sci-fi as Sam Altman has hinted at for OpenAI, these new personal computers are expected to work in large part because they will gobble up user data to provide highly tailored AI services -- part friend, part assistant, part AI god, part a person stealing our most valuable information.
That effectively is the antithesis of Apple. Since the days of the late Steve Jobs, the company has prided itself on user privacy.
Under Cook, those efforts have only intensified. This has put Apple at odds with its tech peers -- it disrupted Facebook's ad business with rules that limited tracking Apple users' internet usage.
It also locked the company in a battle years ago with the Justice Department over unlocking a gunman's iPhone.
"None of us should accept that the government or a company or anybody should have access to all of our private information," Cook once said. "This is a basic human right."
Yet Cook is now balancing investors' and users' demands for AI and products that supposedly will run better with more and more of our most sensitive and mundane information.
To appease investors worried about Apple's investment and pace of AI innovation, Cook last month emphasized the company's increased spending in the area and work to spread "AI features across platforms that are deeply personal, private and seamlessly integrated."
The emphasis being on private.
The question hanging over Apple is whether it will be able to offer truly personalized AI that competes with rivals taking different approaches to privacy.
Apple's early efforts in AI illustrate how it's trying to balance things. Executives have emphasized how certain iPhone AI features that use personal data are processed using computing power on the device. That is in contrast to rivals who ship the data off through the internet to remote servers, or cloud computers.
For more complex needs, Apple says, its AI can use what it calls Private Cloud Compute. This sends user data to servers that process the information but never store it. The effort comes after Apple has worked to create end-to-end encrypted iCloud backups for users.
But Apple's AI features are still limited. Some of them have been delayed because they haven't worked as developers had first promoted.
In other cases, Apple is using OpenAI's ChatGPT to augment chatbot capabilities that it doesn't have. Apple requires users to opt into those offerings.
"If you don't sign in to ChatGPT, your requests are anonymous and won't be used to train OpenAI models," a dialogue box tells users.
Already, more broadly, ChatGPT users are unloading personal thoughts and feelings to the chatbot in detailed terms. So much so that Altman, OpenAI's chief executive, has been warning that they shouldn't expect the same sorts of privacy protections that come with intimate conversations with psychologists or lawyers.
"People talk about the most personal s--- in their lives to ChatGPT," Altman said during a podcast appearance in July. "If you go talk to ChatGPT about your most sensitive stuff and then there's like a lawsuit or whatever, like we could be required to produce that and I think that's very screwed up. I think we should have like the same concept of privacy for your conversations with AI that we do with a therapist or whatever."
It's an issue that has garnered attention, in part, because of a lawsuit against OpenAI over alleged copyright infringement and a judge ordering chats to be preserved as evidence.
Chatbots are only the tip of Altman's plans. He has teamed with former Apple chief designer Jony Ive to develop an AI device that's reportedly meant to be fully aware of a user's surroundings and life.
Similarly, Zuckerberg has been talking about how near advancements in AI will soon enable smartglasses to have superintelligence. He's called glasses the ideal device for AI, allowing the technology to hear and see a user's life.
"I wear contact lenses, I feel like if I didn't have my vision corrected, I'd be sort of at a cognitive disadvantage going through the world," Zuckerberg said recently. "And I think in the future, if you don't have glasses that have AI or some way to interact with AI, I think you're kind of similarly . . . at a pretty significant cognitive disadvantage compared to other people and who you're working with, or competing against."
Microsoft, too, has talked about making its AI offerings more personal, which will require more user data. Mustafa Suleyman, the company's AI chief, told the "Bold Names" podcast he expects much of the data generated for making such systems workable to become ephemeral and encrypted.
"I don't necessarily think there are going to be these persistent large historical personal logs," he said. "Your AI is going to learn the sort of essence of your history of your data, and that in itself will still be personally identifying, so it will be valuable and personalized, but it won't necessarily need the full raw form."
While Apple's approach with AI has frustrated some on Wall Street who see the company going too slowly, its adherence to the ethos of privacy has won it some praise.
Cook is clearly betting that amid a new wave of technology, his old beliefs will still be a powerful selling tool, even if Apple is late to the party.” [1]
1. Higgins, Tim. "Exchange --- Technology: Apple's Privacy Pledge At Odds With AI Goals --- Can the tech giant offer personalized artificial intelligence without gobbling up user information?" Wall Street Journal, Eastern edition; New York, N.Y., 09 Aug 2025: B3.