
Apple’s privacy reputation is at risk with the changes it announced this week

A monorail train displaying Google signage moves past a billboard advertising Apple iPhone security during the 2019 Consumer Electronics Show (CES) in Las Vegas, Nevada, U.S., on Monday, Jan. 7, 2019.
Bloomberg | Getty Images

Apple announced a system this week that will enable it to flag images of child exploitation uploaded to iCloud storage in the U.S. and report them to authorities.

The move was hailed by child protection advocates. John Clark, the CEO of the National Center for Missing and Exploited Children, a nonprofit created through a congressional mandate, called it a “game changer” in a statement.

But the new system, which is being tested in the U.S. now, was also strongly opposed by privacy advocates, who warned it represents a slippery slope and could be tweaked and further exploited to censor other kinds of content on people’s devices.

Apple isn’t unique in its efforts to rid its cloud storage of illegal child pornography. Other cloud services already do this. Google has used hashing technology since 2008 to identify illegal images on its services, and Facebook said in 2019 that it removed 11.6 million pieces of content related to child nudity and child sexual exploitation in just three months.

Apple says its system is an improvement over current industry-standard approaches to removing child pornography because it uses its control of the hardware and sophisticated mathematics to learn as little as possible about the images on a person’s phone or cloud account while still flagging illegal child pornography on its cloud servers. It doesn’t scan the actual images; it compares only hashes, the unique numbers that correspond to image files.
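To make the hash-matching idea concrete, here is a deliberately simplified sketch in Python. Apple’s actual system uses a perceptual hash (NeuralHash) and cryptographic protocols such as private set intersection, neither of which is reproduced here; this example uses plain SHA-256 digests and a hypothetical KNOWN_BAD_HASHES list purely to illustrate that only digests, not image contents, are compared.

```python
import hashlib
from pathlib import Path

# Hypothetical set of digests of known illegal images, as would be
# supplied by a clearinghouse such as NCMEC. The value below is a
# placeholder, not a real entry.
KNOWN_BAD_HASHES = {
    "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef",
}

def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def should_flag(path: Path) -> bool:
    """Flag an upload if its digest matches a known-bad hash.

    The image content itself is never interpreted; the check is only
    whether the digest appears in the match list.
    """
    return file_digest(path) in KNOWN_BAD_HASHES

if __name__ == "__main__":
    import sys
    for name in sys.argv[1:]:
        print(name, "-> match" if should_flag(Path(name)) else "-> no match")
```

A cryptographic digest like SHA-256 changes completely if a single pixel changes, which is why production systems use perceptual hashes such as PhotoDNA or NeuralHash that are designed to survive resizing and re-encoding. The matching logic, however, remains the same hash-in-set test sketched above.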

But privacy advocates see the move as the beginning of a policy change in which Apple could be pressured by foreign governments to, for example, repurpose the system to quash political speech by asking Apple to flag photos of protests or political memes. Skeptics aren’t worried about how the system works today and aren’t defending people who collect known images of child exploitation. They’re worried about how it might develop in the coming years.

Skeptics worry about how the system could evolve

“Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow,” NSA whistleblower Edward Snowden tweeted.

The Electronic Frontier Foundation (EFF), which has supported Apple’s policies on encryption and privacy in the past, slammed the move in a blog post, calling it a “backdoor,” or a system built to give governments a way to access encrypted data.

“Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor,” the influential nonprofit wrote.

Apple’s new system has also been criticized by the company’s competitors, including Facebook subsidiary WhatsApp, which also uses end-to-end encryption for some of its messages and has faced pressure to provide more access to people’s content to prevent child exploitation.

“Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven’t shared with anyone,” WhatsApp CEO Will Cathcart tweeted on Friday. He said WhatsApp won’t adopt a similar system. “That’s not privacy.”

Privacy has become a core part of iPhone marketing. Apple has been public about the security architecture of its systems and is one of the most vociferous defenders of end-to-end encryption, which means it doesn’t even know the content of messages or other data stored on its servers.

Most notably, in 2016, it faced off against the FBI in court to protect the integrity of its encryption systems in the investigation of a mass shooter.

Apple has taken heat for this stance. Law enforcement officials around the world have pressured the company to weaken its encryption for iMessage and other software services like iCloud to investigate child exploitation or terrorism.

Apple sees it as a win-win

Apple sees the new system as part of its privacy-protecting tradition: a win-win situation in which it’s protecting user privacy while eliminating illegal content. Apple also claims the system can’t be repurposed for other kinds of content.

But that’s also the reason privacy advocates see the new system as a betrayal. They feel they’ve lost an ally that built computers designed to prevent — as much as possible — data leaks to governments, Apple and other businesses. Now they see, as Snowden put it, a system that compares user photos against a “secret blacklist.”

That’s because of Apple’s own marketing. In 2019, it bought a giant billboard in Las Vegas during the CES trade show with the slogan “What happens on your iPhone, stays on your iPhone.”

Apple CEO Tim Cook has addressed the “chilling effect” of knowing that what’s on your device may be intercepted and reviewed by third parties. Cook said a lack of digital privacy could prompt people to censor themselves even if the person using the iPhone has done nothing wrong.

“In a world without digital privacy, even if you have done nothing wrong other than think differently, you begin to censor yourself,” Cook said in a 2019 commencement speech at Stanford University. “Not entirely at first. Just a little, bit by bit. To risk less, to hope less, to imagine less, to dare less, to create less, to try less, to talk less, to think less. The chilling effect of digital surveillance is profound, and it touches everything.”

Apple’s pivot to privacy has been successful for the company. This year, it introduced paid privacy features such as Private Relay, which hides users’ IP addresses and, by extension, their location.

Privacy has also been part of the sales pitch as Apple breaks into lucrative new industries like personal finance with its Goldman Sachs-powered credit card, and healthcare with software that allows users to download medical records to their iPhones.

But reputations can be tarnished quickly, especially when a company appears to contradict its previous public stances. Privacy and security are complicated and aren’t accurately conveyed by marketing slogans. The critics of Apple’s new plan to eliminate child exploitation don’t see a better-engineered system that improves on what Google and Microsoft have been doing for years. Instead, they see a significant shift in policy from the company that said “what happens on your iPhone stays on your iPhone.”
