Apple staff concerned over iPhone scans for child abuse images

Apple employees are speaking out against the company’s move to scan iPhones and computers in the US for child sex abuse images, fearing the feature could be exploited by repressive governments to find material for censorship or arrests.

After the plan was announced a week ago, employees flooded a Slack channel with more than 800 messages expressing their objections, an anonymous worker told Reuters.

The move has also drawn protests from leading technology policy groups, a notable development for a company long known for its secretive culture.

Staff said past security changes at Apple had also prompted internal concern, but the new plan drew surprisingly heated debate.

Pushback inside Apple

Slack, rolled out at Apple a few years ago, became popular among teams during the pandemic. Employees mostly used it to socialise while working from home, but it has also sparked several serious discussions.

In the Slack thread on the photo-scanning feature, some employees, mainly from outside the security and privacy units, pushed back against the criticism, while others said Slack was not the proper place for such discussions.

Core security staff thought Apple’s solution was a sound response to the fight against illegal material.

Other employees hoped the scanning would be a step toward fully encrypting iCloud for customers who want it.

Protest letter on the way

Outside supporters who previously backed Apple are objecting strongly to the plan, saying the company is going against its well-marketed history of privacy protection and faces potentially dire consequences.

Sources close to the matter said a coalition of policy groups is drafting a protest letter to send to Apple in the coming days, demanding that the plan be suspended.

This week, the Electronic Frontier Foundation (EFF) and the Center for Democracy and Technology (CDT) also voiced complaints over Apple’s plan.

“What Apple is showing with their announcement last week is that there are technical weaknesses that they are willing to build in,” CDT project director Emma Llanso said in an interview.

“It seems so out of step from everything that they had previously been saying and doing.”

Apple declined to comment for this story, but has said it will refuse requests from governments to use the system to check phones for anything except illegal child sexual abuse imagery.

Apple could be forced to change policy

Outsiders and employees recalled Apple’s successful fight in 2016 against a court order to develop a new tool to crack into a terrorism suspect’s iPhone.

At the time, the company argued that such a tool would inevitably be used to break into other devices for other reasons. To its surprise, that stance was not universally welcomed: some users globally have since called for better monitoring of private communications.

Apple said the scanning will apply only in the US and in other countries to be added one by one, and only to images that are uploaded to iCloud and identified by the National Center for Missing and Exploited Children and a small number of other groups.

According to critics, legislatures and courts in other countries, such as China, could demand that Apple expand the capability, while police and other agencies, including those in the UK and Australia, could invoke “technical assistance” powers for criminal investigations.

“The infrastructure needed to roll out Apple’s proposed changes makes it harder to say that additional surveillance is not technically feasible,” wrote EFF General Counsel Kurt Opsahl.
