How does Apple Intelligence protect user privacy?

Apple’s new iPhone 16 lineup features new colors, a new camera button and – perhaps most noteworthy – a new artificial intelligence system.
The tech giant is set to roll out features from “Apple Intelligence,” a personal intelligence system that can take action across different apps, in beta next month. The company says the tech can help users craft friendly-sounding emails, create emojis, and have Siri pull together information to answer questions like when a family member’s flight is expected to land.
But what do the new features mean for user privacy? 
Apple recently outlined its plans to keep data safe when the AI system launches. Tech privacy experts and advocates told USA TODAY that these ideas look innovative, but they’re waiting to see exactly how they play out.
“While there’s a lot of innovation and thinking about privacy with the new system, there are still open questions on how effective these interventions will ultimately be, especially if we see more examples from other companies trying to do similar things,” said Miranda Bogen, director of the Center for Democracy and Technology’s AI Governance Lab.
Apple Intelligence will only be available on devices with compatible chips, including the iPhone 16 lineup, the iPhone 15 Pro and Pro Max, and iPads and Macs with M1 and later chips. 
The first part of the rollout begins next month when Apple launches iOS 18.1, iPadOS 18.1 and macOS Sequoia 15.1.
New features will include:  
◾ A “more natural and flexible” Siri with “richer language-understanding capabilities” to help the AI assistant follow along when users stumble over words and maintain context between multiple requests.
◾ New writing tools that help users rewrite, proofread and summarize text in Mail, Notes, Pages and third-party apps. 
◾ Updates to Photos, including a “Memories” feature that lets users create movies “simply by typing a description,” improved search functions and a “Clean Up” tool that finds and removes objects in the background of photos. 
◾ Updates to the Notes and Phone apps, which let users record, transcribe and summarize audio. The Phone app notifies participants when a recording is initiated, and AI summarizes the key points in the conversation after the call.  
Apple said more features will follow in the months to come, including the option to access OpenAI’s ChatGPT from “several experiences” within the operating system.
Wedbush Securities analyst Dan Ives said Apple could sell more than 240 million iPhones in fiscal 2025 thanks to the new AI features. 
“We believe (iPhone 16) will be the most successful iPhone unit launch in its history as Apple Intelligence will be the launching pad for the consumer AI Revolution globally,” Ives said in a Monday note. 
Apple says its AI system is “designed to protect users’ privacy at every step.” 
Apple Intelligence will handle prompts in two ways. The first is on-device processing, which is generally the preferred method for user privacy.
“On-device processing can be used to ensure that sensitive, personal information remains within an individual’s control,” said John Verdi, senior vice president of policy at the Future of Privacy Forum, a think tank focused on data privacy.
For more complex requests that can’t be handled on-device, Apple is launching Private Cloud Compute. The cloud intelligence system is designed for private AI processing, and Apple says users’ data is never stored or shared with the company when it is sent to the cloud. 
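As a rough illustration of that two-path design, the sketch below shows how a prompt might be routed between on-device and cloud processing. The type names and the token-count threshold are hypothetical stand-ins for explanation only; Apple has not published this routing as a developer-facing API.

```swift
import Foundation

// Hypothetical sketch of the two-path idea described above.
// The routing enum, type names, and token threshold are illustrative
// assumptions, not Apple's actual API.

enum ProcessingRoute {
    case onDevice      // prompt is handled entirely on the device
    case privateCloud  // prompt is sent to cloud nodes for processing
}

struct PromptRouter {
    // Stand-in heuristic: keep short prompts local, send larger,
    // more complex requests to the cloud tier.
    let onDeviceTokenLimit = 512

    func route(promptTokenCount: Int) -> ProcessingRoute {
        promptTokenCount <= onDeviceTokenLimit ? .onDevice : .privateCloud
    }
}

let router = PromptRouter()
print(router.route(promptTokenCount: 40))     // onDevice
print(router.route(promptTokenCount: 2_000))  // privateCloud
```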
Private Cloud Compute makes sure personal user data “isn’t accessible to anyone other than the user – not even to Apple,” the company said in a June blog post. “We believe PCC is the most advanced security architecture ever deployed for cloud AI compute at scale.” 
Apple also said it plans to make available software images of every production build of Private Cloud Compute so security researchers can verify its functionality and identify any issues.  
The transparency “is a good thing,” but users shouldn’t expect immediate results from this sort of digging, according to Thorin Klosowski, a security and privacy activist at the Electronic Frontier Foundation, a nonprofit digital rights group.
“It will just take some time before we have a really good idea of what they’re doing, how they’re doing it and if it’s working,” he said, cautioning users to avoid offering “too deep” of personal and private information.
He added: “I think conceptually, it looks good.” 
“There are a lot of technical questions and details around how (the Private Cloud Compute) works and whether that’s effective, but it is no doubt an acknowledgment that confidentiality is a major area of concern,” said Alan Butler, executive director and president of the Electronic Privacy Information Center, a nonprofit research center focused on privacy protection.
As for Apple users who plan to access ChatGPT through Siri or Writing Tools, Apple says their IP addresses will be obscured, any data being shared will be clearly visible to users, and OpenAI won’t store requests or use that data for training. ChatGPT’s own data-use policies apply to users who choose to connect a ChatGPT account, but Apple users can access ChatGPT without creating one.
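The sketch below captures, in broad strokes, the kind of relayed request that description implies: only the prompt travels, routed through an intermediary so the provider sees the relay’s address rather than the user’s, with no account identifier attached. The endpoint URL, payload shape, and types are assumptions made for illustration; this is not Apple’s or OpenAI’s actual interface.

```swift
import Foundation

// Hypothetical sketch of a relayed ChatGPT-style request.
// The relay endpoint and payload shape are illustrative assumptions.

struct RelayedChatRequest: Encodable {
    let prompt: String
}

func makeRelayedRequest(prompt: String) -> URLRequest? {
    // Hypothetical relay: the AI provider would see the relay's address,
    // not the user's IP.
    guard let url = URL(string: "https://relay.example.com/chat") else { return nil }
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    // Only the prompt is included: no device ID, account token, or location.
    request.httpBody = try? JSONEncoder().encode(RelayedChatRequest(prompt: prompt))
    return request
}
```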
Apple has not yet said when this feature will launch. 
“Any time Apple partners with a car company, or an AI company or anyone else, it does take you out of that insular Apple universe,” said Ryan Calo, a professor and co-director of the University of Washington Tech Policy Lab. “But at the end of the day, that’s not concerning to me unless we feel like Apple is doing the wrong thing with the data.”    
