“DURING Apple’s annual developer conference, which began on June 9th, the tech giant’s bosses were in their happy place. On home turf in Cupertino, California, they unveiled a glossy visual overhaul of Apple’s operating systems and showed off new features that pull its devices into ever-closer harmony. However, although the new “liquid glass” styling may give its software a new sheen, beneath the window-dressing things are not going well.
Nearly all of Apple’s products are made in Asia, so President Donald Trump’s tariffs threaten to crush its margins in America. Its tight grip on its software ecosystem has got it into trouble with regulators; after a bust-up with a judge during its long-running fight with Epic Games, the maker of “Fortnite”, over how it runs its App Store, Apple was slapped with a court order that jeopardises the $30bn it takes each year in app-related fees. And in the vital field of artificial intelligence (AI), Apple is floundering. No wonder, then, that its share price is down by almost a fifth this year, the most of any of America’s five biggest tech firms.
The company’s struggles to adapt to the AI boom are deep-rooted. Even before OpenAI launched ChatGPT in late 2022, Apple was falling behind, as Siri, its voice assistant, proved to be notably less capable than the alternatives from other firms. At last year’s conference Apple previewed a new version of Siri that could combine data from different apps to handle complex requests. It looked impressive, but it never shipped.
As in the fight with Epic Games, Apple’s difficulties come down to control. The firm has long differentiated its products by enabling users to keep their personal data private [1]. It can afford to do this because it makes most of its money selling hardware—unlike rivals such as Google and Meta, whose business models depend on collecting and analysing data in order to sell personalised ads.
The rise of AI has turned Apple’s control-freakery from a strength to a weakness. The plan, announced last year, was to deploy the company’s own AI model directly on users’ devices, where it could gain access to personal data (such as emails, messages and calendars) to answer queries and perform tasks, without compromising privacy. The problem is that this doesn’t seem to work: a small model running on a smartphone cannot compete with a much more powerful one running in the cloud. Surely Apple could develop its own big cloud-based model, to compete with ChatGPT, Claude or Gemini? Maybe. Catching up might be possible if Apple dipped into the rich trove of its users’ data. But it has promised not to.
As a result, it is now seeking outside help. Already, Siri can offer to hand off more complex queries to ChatGPT, though it must clunkily ask permission each time. This week Apple announced a deeper partnership with OpenAI. With users’ permission, ChatGPT will be given more access to their devices, for example to answer queries about what is on their screens. ChatGPT will also be baked into Apple’s programming tools.
This is a step in the right direction, but Apple needs to go further. Rather than trying to control what AI can and cannot do on its devices, Apple should let users decide. This would go against Apple’s instincts for control, which have only intensified under the leadership of Tim Cook. Yet openness may not be as scary as Apple fears.
Think back to when Apple launched the iPhone in 2007, and refused to let anyone else build native apps for it. Apple changed its mind the following year, allowing others to build apps on its terms and unleashing a surge of new tools, games and services. It should now apply the same approach to AI. Opening up the App Store helped make the iPhone the world’s most successful consumer product. Opening up to others’ AI models is Apple’s best chance of keeping it that way.” [2]
1. AI is so hungry for users' data that if Apple let other companies' AI freely take information from our phones, it would kill Apple's competitive advantage in protecting our privacy.
It is true that AI often requires large amounts of data to function effectively, and this raises valid concerns about privacy.
Apple has indeed emphasized privacy as a key differentiator, particularly with the introduction of Apple Intelligence. Their approach centers on minimizing the need for user data to leave the device and implementing strong privacy protections when cloud processing is necessary.
Here's how Apple addresses privacy in the context of Apple Intelligence:
On-device processing: Whenever possible, Apple processes AI tasks, such as facial recognition in Photos or simple Siri requests, directly on your device, so your data stays local and secure.
Private Cloud Compute (PCC): For tasks that require more computational power, Apple routes requests to PCC, which extends the privacy and security of your device to the cloud: data is used only for the specific request and is neither stored nor made accessible to Apple (a sketch of this tiered routing appears after this list).
Transparency and verification: Apple allows independent security researchers to inspect the code running on their servers to verify their privacy claims. They also provide users with an Apple Intelligence Report, which shows requests sent to PCC, offering transparency into how their data is handled.
User control and consent: Apple empowers users with granular controls over which AI features can access their data. This includes features like opting out of data sharing with Apple Intelligence and explicitly consenting to interactions with third-party AI like ChatGPT.
No user data for training: Apple states they do not use users' private personal data or interactions when training their foundation models.
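To make that tiered approach concrete, here is a minimal Swift sketch of how such routing could work in principle: try the local model first, fall back to PCC for heavier requests, and reach a third-party model only with explicit consent. Every type and function here (Tier, Request, Consent, route) is a hypothetical illustration of the idea described above, not Apple's actual API.

```swift
import Foundation

// Hypothetical processing tiers, mirroring the list above.
enum Tier {
    case onDevice        // data never leaves the phone
    case privateCloud    // PCC: stateless, request-scoped processing
    case thirdParty      // e.g. ChatGPT, only with explicit consent
}

struct Request {
    let prompt: String
    let estimatedComplexity: Int   // crude stand-in for a real router's signal
}

struct Consent {
    var allowsThirdPartyAI: Bool   // user-facing toggle, off by default
}

func route(_ request: Request, consent: Consent) -> Tier {
    // Prefer the local model whenever it is likely to be good enough.
    if request.estimatedComplexity < 5 {
        return .onDevice
    }
    // Heavier requests go to Private Cloud Compute, which (per Apple's
    // claims) processes them statelessly and does not retain the data.
    if !consent.allowsThirdPartyAI {
        return .privateCloud
    }
    // Only an explicit, per-feature opt-in ever sends a request
    // to a third-party model.
    return .thirdParty
}

// Example: a complex query from a user who has not opted in to ChatGPT.
let tier = route(Request(prompt: "Summarise my week", estimatedComplexity: 8),
                 consent: Consent(allowsThirdPartyAI: false))
print(tier)   // privateCloud
```

The point of the sketch is the ordering of the checks: privacy-preserving options are the default, and the most data-hungry option is reachable only by deliberate user choice.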
While Apple's approach to AI privacy is innovative and positions them as a leader in this area, it's also important to consider potential trade-offs:
Performance limitations: On-device processing may limit the capabilities of AI features compared to cloud-based alternatives that leverage vast amounts of data.
Feature delays: Engineering strong privacy protections can potentially slow down the development and deployment of new AI features compared to competitors with less stringent data practices.
Transparency challenges: The complexity of AI systems can make it difficult for users to fully understand how their data is being used, even with transparency controls.
In conclusion, Apple's strategy is to offer powerful AI features while minimizing data collection and implementing strong privacy safeguards, aiming to maintain its competitive advantage in privacy in the age of AI. However, this approach comes with challenges related to performance, development speed, and communicating complex technical details to users.
2. “Glasnost in Cupertino.” The Economist (London), Vol. 455, Iss. 9452 (Jun 14, 2025): 10, 9.