Apple shuts AI out: iOS apps just got more private
If you hate the idea of your information being used to train AI, you're going to love the small but significant tweak Apple just made to the iOS App Store.
"You must clearly disclose where personal data will be shared with third parties, including with third-party AI," the company told app developers — adding that all apps must "obtain explicit permission before doing so."
The updated language — Apple's first guidance on third-party AI — is part of a document called App Review Guidelines. And lest the name fool you, the introduction makes clear that adhering to these guidelines is pretty much mandatory.
"We will reject apps for any content or behavior that we believe is over the line," Apple tells developers later in the guidelines. "What line, you ask? Well, as a Supreme Court Justice once said, 'I'll know it when I see it.' And we think that you will also know it when you cross it."
The update, which dropped last week, marks the first time AI has been mentioned in the guidelines at all. Apple under CEO Tim Cook has been notably cautious about AI: slow to add AI features to Siri, and sometimes reluctant even to say "AI," with Cook preferring the related term "machine learning" in past keynotes.
Sourcing data to train AI models has become one of the most legally contentious activities in Silicon Valley. (Disclosure: Ziff Davis, Mashable’s parent company, filed a lawsuit in April against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)
And even Apple, the AI laggard that is reportedly going to use Google Gemini to power Siri soon, isn't immune.
Last month saw two lawsuits alleging that Apple improperly used other people's work for its own AI training. In separate filings, two neuroscientists and two authors said Cook's company had drawn on "shadow libraries," online repositories of pirated content.
While Apple's response remains to be seen, the legal landscape doesn't look all that promising for the company. AI giant Anthropic settled a class-action lawsuit over shadow library usage in September for $1.5 billion.
But at least Apple can now legitimately claim to be protecting its users from AI data-scraping within its apps.
Chris is a veteran tech, entertainment and culture journalist, author of 'How Star Wars Conquered the Universe,' and co-host of the Doctor Who podcast 'Pull to Open.' Hailing from the U.K., Chris got his start as a sub editor on national newspapers. He moved to the U.S. in 1996, and became senior news writer for Time.com a year later. In 2000, he was named San Francisco bureau chief for Time magazine. He has served as senior editor for Business 2.0, and West Coast editor for Fortune Small Business and Fast Company. Chris is a graduate of Merton College, Oxford and the Columbia University Graduate School of Journalism. He is also a long-time volunteer at 826 Valencia, the nationwide after-school program co-founded by author Dave Eggers. His book on the history of Star Wars is an international bestseller and has been translated into 11 languages.