
Saturday, August 14, 2021

RSN: FOCUS: Dear Apple, an Open Letter Against Apple's Privacy-Invasive Content Scanning Technology

Reader Supported News
14 August 21


A woman uses her phone. (photo: Kiyoshi Ota/Getty Images)

ALSO SEE: Apple Frequently Forced to Give Customer iCloud Data to Police

FOCUS: Dear Apple, an Open Letter Against Apple's Privacy-Invasive Content Scanning Technology
Privacy Experts and Apple Consumers, Reader Supported News
Excerpt: "Child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple's proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products."

On August 5th, 2021, Apple Inc. announced new technological measures meant to apply across virtually all of its devices under the umbrella of “Expanded Protections for Children”. While child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple's proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products.

Apple's proposed technology works by continuously monitoring photos saved or shared on the user's iPhone, iPad, or Mac. One system alerts the authorities when a certain number of objectionable photos is detected in the user's iCloud storage. Another notifies a child's parents if iMessage is used to send or receive photos that a machine learning algorithm considers to contain nudity.

Because both checks are performed on the user's device, they have the potential to bypass any end-to-end encryption that would otherwise safeguard the user's privacy.
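To make the mechanism concrete, here is a minimal, hypothetical sketch of the kind of client-side matching described above: the device fingerprints each photo, compares it against a provider-supplied database, and triggers a report once a threshold of matches is reached. It uses a cryptographic hash purely as a stand-in for Apple's perceptual fingerprint, and every name in it (KNOWN_FINGERPRINTS, REPORT_THRESHOLD, report_to_provider) is illustrative rather than Apple's actual API.

```python
import hashlib
from pathlib import Path

# Stand-in for a provider-supplied fingerprint database. In Apple's proposal
# this would hold perceptual hashes of known CSAM; here it is just an opaque
# set of hex strings that the device cannot inspect in any meaningful way.
KNOWN_FINGERPRINTS: set = set()

# The proposal only alerts once "a certain number" of matches accumulate;
# the threshold is chosen by the provider, not by the device owner.
REPORT_THRESHOLD = 30  # hypothetical value


def fingerprint(photo_path: Path) -> str:
    """Stand-in fingerprint. Apple's system uses a perceptual hash that
    survives resizing and re-encoding, not a cryptographic hash."""
    return hashlib.sha256(photo_path.read_bytes()).hexdigest()


def report_to_provider(matches: list) -> None:
    # Placeholder: in the deployed system the matches would be surfaced
    # for human review and onward reporting to the authorities.
    print(f"threshold reached: {len(matches)} matching photos")


def scan_library(photo_paths: list) -> None:
    """Client-side scan: every photo is checked on the device itself,
    before (and regardless of) any end-to-end encryption."""
    matches = [p for p in photo_paths if fingerprint(p) in KNOWN_FINGERPRINTS]
    if len(matches) >= REPORT_THRESHOLD:
        report_to_provider(matches)
```

The point of the sketch is that both the database and the threshold sit entirely on the provider's side of the design; the device owner can neither inspect nor constrain what is being matched.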

Immediately after Apple's announcement, experts around the world sounded the alarm on how Apple's proposed measures could turn every iPhone into a device that continuously scans all photos and messages passing through it in order to report any objectionable content to law enforcement. This sets a precedent in which our personal devices become a radical new tool for invasive surveillance, with little oversight to prevent eventual abuse or an unreasonable expansion of the scope of surveillance.

The Electronic Frontier Foundation has said that “Apple is opening the door to broader abuses”:

“It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses […] That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.”

The Center for Democracy and Technology has said that it is “deeply concerned that Apple’s changes in fact create new risks to children and all users, and mark a significant departure from long-held privacy and security protocols”:

“Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the U.S., but around the world,” says Greg Nojeim, Co-Director of CDT’s Security & Surveillance Project. “Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services.”

Dr. Carmela Troncoso, a leading research expert in Security & Privacy and professor at EPFL in Lausanne, Switzerland, has said that while “Apple's new detector for child sexual abuse material (CSAM) is promoted under the umbrella of child protection and privacy, it is a firm step towards prevalent surveillance and control”.

Dr. Matthew D. Green, another leading research expert in Security & Privacy and professor at Johns Hopkins University in Baltimore, Maryland, has said that “yesterday we were gradually headed towards a future where less and less of our information had to be under the control and review of anyone but ourselves. For the first time since the 1990s we were taking our privacy back. Today we’re on a different path”, adding:

“The pressure is going to come from the UK, from the US, from India, from China. I'm terrified about what that's going to look like. Why would Apple want to tell the world, ‘Hey, we've got this tool’?”

Sarah Jamie Lewis, Executive Director of the Open Privacy Research Society, has warned that:

“If Apple are successful in introducing this, how long do you think it will be before the same is expected of other providers? Before walled gardens prohibit apps that don't do it? Before it is enshrined in law? How long do you think it will be before the database is expanded to include "terrorist" content? "harmful-but-legal" content? state-specific censorship?”

Dr. Nadim Kobeissi, a researcher in Security & Privacy issues, warned:

“Apple sells iPhones without FaceTime in Saudi Arabia, because local regulation prohibits encrypted phone calls. That's just one example of many where Apple's bent to local pressure. What happens when local regulations in Saudi Arabia mandate that messages be scanned not for child sexual abuse, but for homosexuality or for offenses against the monarchy?”

The Electronic Frontier Foundation's statement on the issue supports the above concern with additional examples on how Apple's proposed technology could lead to global abuse:

“Take the example of India, where recently passed rules include dangerous requirements for platforms to identify the origins of messages and pre-screen content. New laws in Ethiopia requiring content takedowns of “misinformation” in 24 hours may apply to messaging services. And many other countries—often those with authoritarian governments—have passed similar laws. Apple’s changes would enable such screening, takedown, and reporting in its end-to-end messaging. The abuse cases are easy to imagine: governments that outlaw homosexuality might require the classifier to be trained to restrict apparent LGBTQ+ content, or an authoritarian regime might demand the classifier be able to spot popular satirical images or protest flyers.”

Furthermore, the Electronic Frontier Foundation insists that it's already seen this mission creep in action: “one of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of “terrorist” content that companies can contribute to and access for the purpose of banning such content. The database, managed by the Global Internet Forum to Counter Terrorism (GIFCT), is troublingly without external oversight, despite calls from civil society.”

Fundamental design flaws in Apple's proposed approach have also been pointed out by experts, who have claimed that “Apple can trivially use different media fingerprinting datasets for each user. For one user it could be child abuse, for another it could be a much broader category”, thereby enabling selective content tracking for targeted users.
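As a rough illustration of that claim, and assuming hypothetical names throughout, the sketch below shows how a provider could hand each device a different fingerprint set keyed by user ID; because the set is opaque to the device, a targeted user has no practical way to notice the substitution.

```python
# Hypothetical server-side selection of a fingerprint database per user.
# The device only ever receives an opaque set of fingerprints, so it cannot
# tell whether it was given the CSAM set or a broader, targeted one.
FINGERPRINT_SETS = {
    "csam_only": {"fp-001", "fp-002"},                        # placeholder entries
    "broader_category": {"fp-001", "fp-002", "fp-900", "fp-901"},
}

TARGETED_USERS = {"user-1234"}  # hypothetical targeting list


def database_for(user_id: str) -> set:
    """Return a per-user fingerprint set; targeted users silently get more."""
    key = "broader_category" if user_id in TARGETED_USERS else "csam_only"
    return FINGERPRINT_SETS[key]
```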

The type of technology that Apple is proposing for its child protection measures depends on an expandable infrastructure that can't be monitored or technically limited. Experts have repeatedly warned that the problem isn't just privacy, but also the lack of accountability, the absence of technical barriers to expansion, and the lack of any analysis or even acknowledgement of the potential for errors and false positives.

Kendra Albert, a lawyer at the Harvard Law School's Cyberlaw Clinic, has warned that “these "child protection" features are going to get queer kids kicked out of their homes, beaten, or worse”, adding:

“I just know (calling it now) that these machine learning algorithms are going to flag transition photos. Good luck texting your friends a picture of you if you have "female presenting nipples."”

Our Request

We, the undersigned, ask that:

  1. Apple Inc.'s deployment of its proposed content monitoring technology be halted immediately.

  2. Apple Inc. issue a statement reaffirming its commitment to end-to-end encryption and to user privacy.

Apple's current path threatens to undermine decades of work by technologists, academics, and policy advocates toward making strong privacy-preserving measures the norm across consumer electronic devices and use cases. We ask that Apple reconsider its technology rollout, lest it undo that important work.
