Apple must drop plans to scan messages for abuse images, policy groups urge
Over 90 policy and rights groups around the world have published an open letter urging Apple to abandon plans for scanning children’s messages for nudity and the phones of adults for images of child sex abuse.
Earlier this month, Apple announced it would install surveillance software that conducts on-device scanning in Messages and Photos. Now, a coalition of rights groups has said in an open letter that although the new features could protect children and reduce the spread of child sexual abuse material (CSAM), they might also create new risks for children, censor speech, and threaten the privacy and security of people around the world.
The international coalition comprises over 90 civil society groups and is the largest campaign to date over an encryption issue at a single company. The open letter campaign was organised by the US-based non-profit the Center for Democracy & Technology (CDT).
Some overseas signatories are worried about the impact of the changes in nations with different legal systems, including some already hosting heated fights over encryption and privacy.
The letter also warned that the scan-and-alert feature in Messages could trigger notifications that threaten the safety and wellbeing of some young people. LGBTQ+ youths with unsympathetic parents are particularly at risk, it highlighted.
“It’s so disappointing and upsetting that Apple is doing this because they have been a staunch ally in defending encryption in the past,” said Sharon Bradford Franklin, co-director of CDT’s Security & Surveillance Project.
An Apple spokesperson said the firm had addressed privacy and security concerns in its ‘Security Threat Model Review of Apple’s Child Safety Features’ document published last week, which outlined why the complex architecture of the scanning software should resist attempts to subvert it.
Those signing included multiple groups in Brazil, where courts have repeatedly blocked Facebook’s WhatsApp for failing to decrypt messages in criminal probes, and where the Senate has passed a bill that would require traceability of messages, which would in turn require somehow marking their content. A similar law was passed in India this year.
“Our primary concern is the consequence of this mechanism, how this could be extended to other situations and other companies,” said Flavio Wagner, president of the independent Brazil chapter of the Internet Society, which signed. “This represents a serious weakening of encryption.” Other signatories on the letter were in India, Mexico, Germany, Argentina, Ghana and Tanzania.
Apple said it would refuse demands to expand the image-detection system beyond pictures of children flagged by clearing houses in multiple jurisdictions, though it has not gone as far as saying it would pull out of a market rather than obey a court order.
While most of the objections so far have been over device scanning, the coalition’s letter also faults a change to iMessage in family accounts, which would try to identify and blur nudity in children’s messages, letting them view it only if parents are notified.
The signers said the step could endanger children in intolerant homes or those seeking educational material. More broadly, they said the change will break end-to-end encryption for iMessage, which Apple has staunchly defended in other contexts.
“Once this back-door feature is built in, governments could compel Apple to extend notification to other accounts and to detect images that are objectionable for reasons other than being sexually explicit,” the letter stated.
Other groups that signed include the American Civil Liberties Union, Electronic Frontier Foundation, Access Now, Privacy International, and the Tor Project.