Release of proposed Apple surveillance software to fight child sexual abuse delayed after nationwide privacy protests
First posted November 18, 2021 10:24am EST
Last updated November 29, 2021 11:14am EST
All Associated Themes:
- Protest Politics
- Social Media
External References
People protest against CSAM on iOS outside multiple Apple Stores in the US, 9to5Mac
Apple’s next event will be met with protests against its privacy hypocrisy, Input
Apple’s controversial new child protection features, explained, The Verge
Apple will scan photos stored on iPhones and iCloud for child abuse imagery, The Verge
Researchers show that Apple’s CSAM scanning can be fooled easily, BleepingComputer
Hoping to limit the spread of child sexual abuse material (CSAM) on its devices, Apple proposed new technology that would analyze users’ photos, messages, and Siri searches for potential CSAM. Fearing an encroachment on privacy, protesters rallied outside Apple Stores across the country the day before Apple’s iPhone 13 launch event.
Key Players
Apple is a multinational technology company headquartered in Cupertino, Calif., that specializes in consumer devices, software, and online services. It is the world’s most valuable technology company and one of its largest.
Founded in 1990, the Electronic Frontier Foundation (EFF) is a San Francisco-based nonprofit digital rights organization that promotes civil liberties on the Internet. It works to hold both the government and technology companies accountable while supporting user rights. According to its website, EFF aims to defend free speech online, stop illegal surveillance, advocate for users and innovators, and support freedom-enhancing technology by using the “unique expertise of leading technologists, activists, and attorneys.”
Further Details
On Aug. 5, 2021, Apple announced it would roll out new software for iMessage, iCloud, and Photos that could detect child abuse imagery. According to The Verge, if the software detected such content, moderators would then review the material and could report it to the National Center for Missing and Exploited Children (NCMEC).
The software uses neuralMatch, a system designed to scan for and detect CSAM. Trained on more than 200,000 images from the NCMEC, it can scan photos stored on or sent to Apple devices.
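Apple has not published neuralMatch’s internals, but fingerprint-style image matching of this kind is typically built on perceptual hashing. The sketch below is a generic, simplified illustration of that idea, not Apple’s algorithm; the function names, the 8x8 averaging scheme, and the match threshold are illustrative assumptions only.

```python
# Generic illustration of hash-based image matching, NOT Apple's neuralMatch,
# whose implementation has not been published. An image is reduced to a short
# fingerprint, which is then compared against a database of known fingerprints.

def average_hash(pixels, width, height):
    """Compute a 64-bit 'average hash' from grayscale pixel values (0-255).

    The image is treated as an 8x8 grid; each cell becomes 1 if its average
    brightness exceeds the overall mean, else 0. Real systems use far more
    robust perceptual or learned hashes.
    """
    cell_w, cell_h = width // 8, height // 8
    cells = []
    for cy in range(8):
        for cx in range(8):
            total = 0
            for y in range(cy * cell_h, (cy + 1) * cell_h):
                for x in range(cx * cell_w, (cx + 1) * cell_w):
                    total += pixels[y * width + x]
            cells.append(total / (cell_w * cell_h))
    mean = sum(cells) / len(cells)
    bits = 0
    for value in cells:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits


def hamming_distance(a, b):
    """Number of differing bits between two 64-bit fingerprints."""
    return bin(a ^ b).count("1")


def matches_known_database(image_hash, known_hashes, threshold=4):
    """Flag an image whose fingerprint is within `threshold` bits of a known one."""
    return any(hamming_distance(image_hash, h) <= threshold for h in known_hashes)
```

The key property of such a scheme is that fingerprints are compared with a small tolerance (the Hamming-distance threshold), so resized or lightly recompressed copies of a known image can still be matched.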
While Apple maintained it would uphold its encryption policy and not retain user data, opponents said the software gave the company too much power because it could effectively decrypt data. Like other major cloud storage providers, Apple already scans cloud files for child abuse images, but this software would reach locally stored files as well. Many were concerned this could lead to greater censorship and surveillance, potentially empowering governments around the world to dig into citizens’ personal, locally stored files, according to The Verge.
Apple stated its primary goal was “to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of [CSAM].”
As The Verge reported, Apple had previously touted its privacy protections. Following the 2015 San Bernardino attack, which included a mass shooting and an attempted bombing, Apple refused to comply with the FBI’s demand that the company create a “backdoor” in its iPhone operating system so the bureau could access the encrypted phone of one of the shooters.
Outcome
Apple Delays Rollout of CSAM Identification Software
On Sept. 3, 2021, Apple announced it was pausing its rollout of the software, saying it was listening to the advice of critics, experts, customers, advocacy groups, and researchers. While Apple maintained its desire to push for new safety features, the company said it was gathering input to make improvements before updating any devices.
Protesters rally outside Apple stores
On Sept. 13, 2021, EFF and Fight for the Future, another digital rights advocacy group, organized protests in San Francisco, New York, Boston, Chicago, and other cities across the country to stand against the software, and even flew a banner above Apple’s headquarters reading “APPLE: DON’T SCAN OUR PHONES.” Joe Mullin, an EFF policy analyst and activist, said at a San Francisco protest that “Apple told the whole world that iPhone is all about privacy. But faced with government pressure, they caved.”
Experts release studies on dangers and shortcomings of CSAM technology
On Oct. 15, 2021, 14 cybersecurity researchers from prominent universities released a 46-page study detailing the risks of new surveillance technology. According to the report, technology that surveils personal, locally stored data could easily be abused, even when deployed to protect children or uncover illegal activity. The researchers also argued that monitoring personal data would neither guarantee crime prevention nor stop such surveillance from expanding beyond the search for illegal content.
On Nov. 10, 2021, BleepingComputer reported that researchers at Imperial College London had devised a method for images to evade Apple’s CSAM detection. According to the researchers, the detection algorithm can be fooled 99.9% of the time simply by applying a hashing filter to an image: the filter changes the image’s fingerprint so it registers as a different picture, even though the processed image looks identical to the original.
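The weakness the researchers describe is easiest to see with a toy example. The sketch below uses a hypothetical four-region “image” and a deliberately simplistic threshold hash, not Apple’s system or the study’s actual method: dimming a single region by a visually negligible amount shifts the overall mean, flips one fingerprint bit, and the image no longer matches its original hash.

```python
# Toy, self-contained illustration of why threshold-based image fingerprints
# are fragile. This is a conceptual demo only; the Imperial College work
# targeted real perceptual hashes with visually imperceptible filters.

def region_hash(region_means):
    """One bit per region: 1 if the region is brighter than the overall mean."""
    overall = sum(region_means) / len(region_means)
    return tuple(1 if m > overall else 0 for m in region_means)

# Hypothetical average brightness of four regions of an image (0-255 scale).
print(region_hash([100, 124, 128, 140]))  # (0, 1, 1, 1)
# Dim the second region by 4 gray levels (~1.6%) -- visually negligible.
print(region_hash([100, 120, 128, 140]))  # (0, 0, 1, 1): one bit flipped, no match
```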
As of Nov. 12, 2021, Apple had not responded to the Imperial College study and still planned to roll out the CSAM technology in 2022.