
If You Build It, They Will Come: Apple Has Opened the Backdoor to Increased Surveillance and Censorship around the World

TECHNOLOGY, 16 Aug 2021

Kurt Opsahl | Electronic Frontier Foundation - TRANSCEND Media Service

11 Aug 2021 – Apple’s new program for scanning images sent on iMessage steps back from the company’s prior support for the privacy and security of encrypted messages. The program, initially limited to the United States, narrows the understanding of end-to-end encryption to allow for client-side scanning. While Apple aims at the scourge of child exploitation and abuse, the company has created an infrastructure that is all too easy to redirect to greater surveillance and censorship. The program will undermine Apple’s defense that it can’t comply with the broader demands.
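
To see why client-side scanning "narrows the understanding of end-to-end encryption," consider a deliberately simplified sketch. Apple's real system is far more elaborate (an on-device machine-learning classifier for iMessage, NeuralHash perceptual hashing with threshold reporting for photos); the function names and blocklist below are hypothetical. The ordering, however, is the point: the content check happens before encryption ever occurs.

```python
# A minimal sketch of the scan-then-encrypt pattern, NOT Apple's actual
# design. The structural point survives the simplification: the check
# runs on the device before encryption, so the transport can remain
# "end-to-end encrypted" while the plaintext is still inspected.
import hashlib

# Hypothetical vendor-supplied database of forbidden-content hashes.
# Whoever controls this set controls what gets flagged.
BLOCKLIST = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def report_to_vendor(digest: str) -> None:
    # Stand-in for a real reporting channel (parent, vendor, or state).
    print(f"flagged: {digest}")

def scan_then_encrypt(plaintext: bytes, encrypt) -> bytes:
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in BLOCKLIST:
        report_to_vendor(digest)   # the flag fires before any encryption
    return encrypt(plaintext)      # encryption only sees already-scanned data
```

Nothing in the encryption itself changes; the inspection simply moves to the one place encryption cannot protect: the endpoint.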

For years, countries around the world have asked for access to and control over encrypted messages, asking technology companies to “nerd harder” when faced with the pushback that access to messages in the clear was incompatible with strong encryption. The Apple child safety message scanning program is currently being rolled out only in the United States.

The United States has not been shy about seeking access to encrypted communications, pressuring the companies to make it easier to obtain data with warrants and to voluntarily turn over data. However, the U.S. faces serious constitutional issues if it wanted to pass a law that required warrantless screening and reporting of content. Even if conducted by a private party, a search ordered by the government is subject to the Fourth Amendment’s protections. Any “warrant” issued for suspicionless mass surveillance would be an unconstitutional general warrant. As the Ninth Circuit Court of Appeals has explained, “Search warrants . . . are fundamentally offensive to the underlying principles of the Fourth Amendment when they are so bountiful and expansive in their language that they constitute a virtual, all-encompassing dragnet[.]” With this new program, Apple has failed to hold a strong policy line against U.S. laws undermining encryption, but there remains a constitutional backstop to some of the worst excesses. But U.S. constitutional protection may not necessarily be replicated in every country.

Apple is a global company, with phones and computers in use all over the world, and it faces the pressure from many governments that comes along with that. Apple has promised it will refuse government “demands to build and deploy government-mandated changes that degrade the privacy of users.” It is good that Apple says it will not, but this is not nearly as strong a protection as saying it cannot, which could not honestly be said about any system of this type. Moreover, if it implements this change, Apple will need to not just fight for privacy, but win in legislatures and courts around the world. To keep its promise, Apple will have to resist the pressure to expand the iMessage scanning program to new countries, to scan for new types of content, and to report outside parent-child relationships.

It is no surprise that authoritarian countries demand companies provide access and control to encrypted messages, often the last best hope for dissidents to organize and communicate. For example, Citizen Lab’s research shows that—right now—China’s unencrypted WeChat service already surveils images and files shared by users, and uses them to train censorship algorithms. “When a message is sent from one WeChat user to another, it passes through a server managed by Tencent (WeChat’s parent company) that detects if the message includes blacklisted keywords before a message is sent to the recipient.” As the Stanford Internet Observatory’s Riana Pfefferkorn explains, this type of technology is a roadmap showing “how a client-side scanning system originally built only for CSAM [Child Sexual Abuse Material] could and would be suborned for censorship and political persecution.” As Apple has found, China, with the world’s biggest market, can be hard to refuse. Other countries are not shy about applying extreme pressure on companies, including arresting local employees of the tech companies.
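
What Citizen Lab describes is ordinary server-side keyword filtering. A minimal sketch of that relay pattern (all names and keywords invented here) shows how little machinery censorship requires once messages pass through an inspectable chokepoint:

```python
# Minimal sketch of server-side keyword filtering on an unencrypted
# relay, in the style Citizen Lab describes for WeChat. All names and
# keywords here are invented for illustration.
BLACKLISTED_KEYWORDS = {"banned slogan", "forbidden meeting"}

def deliver(recipient: str, text: str) -> None:
    print(f"delivered to {recipient}: {text!r}")

def log_for_training(sender: str, text: str) -> None:
    # Flagged traffic is retained, e.g. to train censorship algorithms.
    print(f"retained from {sender}: {text!r}")

def relay(sender: str, recipient: str, text: str) -> bool:
    """Forward the message only if no blacklisted keyword appears."""
    if any(kw in text.lower() for kw in BLACKLISTED_KEYWORDS):
        log_for_training(sender, text)
        return False               # silently dropped before delivery
    deliver(recipient, text)
    return True
```

Client-side scanning changes only where this check runs, not whether it exists; the blacklist remains a policy input that whoever controls it can change at will.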

But potent pressure to access encrypted data also comes from democratic countries that strive to uphold the rule of law, at least at first. If companies fail to hold the line in such countries, the changes made to undermine encryption can easily be replicated by countries with weaker democratic institutions and poor human rights records, often using similar legal language but with different ideas about public order and state security, as well as about what constitutes impermissible content, from obscenity to indecency to political speech. This is very dangerous. Those countries will nevertheless contend that they are no different: they are sovereign nations, and will see their public-order needs as equally urgent. They will contend that if Apple provides access to any nation-state under that state’s local laws, Apple must also provide access to other countries, at least on the same terms.

‘Five Eyes’ Countries Will Seek to Scan Messages 

For example, the Five Eyes—an alliance of the intelligence services of Canada, New Zealand, Australia, the United Kingdom, and the United States—warned in 2018 that they will “pursue technological, enforcement, legislative or other measures to achieve lawful access solutions” if the companies didn’t voluntarily provide access to encrypted messages. More recently, the Five Eyes have pivoted from terrorism to the prevention of CSAM as the justification, but the demand for unencrypted access remains the same, and the Five Eyes are unlikely to be satisfied unless the changes also extend to terrorism and other criminal investigations.

The United Kingdom’s Investigatory Powers Act, following through on the Five Eyes’ threat, allows its Secretary of State to issue “technical capability notices,” which oblige telecommunications operators to maintain the technical capability of “providing assistance in giving effect to an interception warrant, equipment interference warrant, or a warrant or authorisation for obtaining communications data.” As the UK Parliament considered the IPA, we warned that a “company could be compelled to distribute an update in order to facilitate the execution of an equipment interference warrant, and ordered to refrain from notifying their customers.”

Under the IPA, the Secretary of State must consider “the technical feasibility of complying with the notice.” But the infrastructure needed to roll out Apple’s proposed changes makes it harder to say that additional surveillance is not technically feasible. With Apple’s new program, we worry that the UK might try to compel an update that would expand the current functionality of the iMessage scanning program, with different algorithmic targets and wider reporting. As the iMessage “communication safety” feature is entirely Apple’s own invention, Apple can all too easily change its own criteria for what will be flagged for reporting. Apple may receive an order to adopt its hash matching program for iCloud Photos into the message pre-screening. Likewise, the criteria for which accounts will apply this scanning, and where positive hits get reported, are wholly within Apple’s control.
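
To make concrete what “wholly within Apple’s control” means in engineering terms: everything a compelled update would need to change can be expressed as configuration values rather than new code. The field names below are hypothetical, not Apple’s:

```python
# Hypothetical sketch: the scope of a client-side scanning program
# expressed as configuration. None of these fields are Apple's real
# parameters; the point is that widening the program is a data change,
# not a redesign.
from dataclasses import dataclass

@dataclass(frozen=True)
class ScanPolicy:
    hash_database: str      # which content is matched
    scanned_accounts: str   # whose devices scan
    report_to: str          # where positive hits are sent

# The program as announced...
announced = ScanPolicy("csam_hashes_v1", "child_accounts", "parents")

# ...and what a compelled update could look like: the same code path,
# different values.
compelled = ScanPolicy("state_blocklist_v1", "all_accounts", "state_agency")
```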

Australia followed suit with its Assistance and Access Act, which likewise allows for requirements to provide technical assistance and capabilities, with the disturbing potential to undermine encryption. While the Act contains some safeguards, a coalition of civil society organizations, tech companies, and trade associations, including EFF and—wait for it—Apple, explained that they were insufficient.

Indeed, in Apple’s own submission to the Australian government, Apple warned that “the government may seek to compel providers to install or test software or equipment, facilitate access to customer equipment, turn over source code, remove forms of electronic protection, modify characteristics of a service, or substitute a service, among other things.” Apple would do well to remember that these very techniques could also be used in an attempt to mandate or change the scope of its scanning program.

While Canada has yet to adopt an explicit requirement for plain text access, the Canadian government is actively pursuing filtering obligations for various online platforms, which raise the spectre of a more aggressive set of obligations targeting private messaging applications.

Censorship Regimes Are In Place And Ready to Go

For the Five Eyes, the ask is mostly for surveillance capabilities, but India and Indonesia are already down the slippery slope to content censorship. The Indian government’s new Intermediary Guidelines and Digital Media Ethics Code (“2021 Rules”), which came into effect earlier this year, directly imposes dangerous requirements for platforms to pre-screen content. Rule 4(4) compels content filtering, requiring that providers “endeavor to deploy technology-based measures,” including automated tools or other mechanisms, to “proactively identify information” that has been forbidden under the Rules.

India’s defense of the 2021 Rules, written in response to criticism from three UN Special Rapporteurs, highlights the very real dangers to children while skipping over the much broader mandate of the scanning and censorship rules. The 2021 Rules impose proactive and automatic enforcement of their content takedown provisions, requiring the proactive blocking of material previously held to be forbidden under Indian law. Those laws broadly include ones protecting “the sovereignty and integrity of India; security of the State; friendly relations with foreign States; public order; decency or morality.” This is no hypothetical slippery slope; it is not hard to see how this language could be dangerous to freedom of expression and political dissent. Indeed, India’s track record with its Unlawful Activities Prevention Act, which has reportedly been used to arrest academics, writers and poets for leading rallies and posting political messages on social media, highlights this danger.

It would be no surprise if India claimed that Apple’s scanning program was a great start towards compliance, with a few more tweaks needed to address the 2021 Rules’ wider mandate. Apple has promised to protest any expansion, and could argue in court, as WhatsApp and others have, that the 2021 Rules should be struck down, or that Apple does not fit the definition of a social media intermediary regulated under these 2021 Rules. But the Indian rules illustrate both the governmental desire and the legal backing for pre-screening encrypted content, and Apple’s changes make it all the easier to slip into this dystopia.

This is, unfortunately, an ever-growing trend. Indonesia, too, has adopted Ministerial Regulation MR5 to require service providers (including “instant messaging” providers) to “ensure” that their system “does not contain any prohibited [information]; and […] does not facilitate the dissemination of prohibited [information]”. MR5 defines prohibited information as anything that violates any provision of Indonesia’s laws and regulations, or creates “community anxiety” or “disturbance in public order.” MR5 also imposes disproportionate sanctions, including a general blocking of systems for those who fail to ensure there is no prohibited content and information in their systems. Indonesia may also see the iMessage scanning functionality as a tool for compliance with Regulation MR5, and pressure Apple to adopt a broader and more invasive version in their country.

Pressure Will Grow

The pressure to expand Apple’s program to more countries and more types of content will only continue. In the fall of 2020, a series of leaked documents from the European Commission foreshadowed an anti-encryption law coming to the European Parliament, perhaps this year. Fortunately, there is a backstop in the EU: under Article 15 of the e-Commerce Directive (2000/31/EC), EU Member States may not impose a general obligation on providers to monitor the information that users transmit or store. Indeed, the Court of Justice of the European Union (CJEU) has stated explicitly that intermediaries may not be obliged to monitor their services in a general manner in order to detect and prevent illegal activity of their users; such an obligation would be incompatible with fairness and proportionality. Despite this, in a leaked internal document published by Politico, the European Commission committed itself to an action plan for mandatory detection of CSAM by relevant online service providers (expected in December 2021) that pointed to client-side scanning as the solution, one that could potentially apply to secure private messaging apps, seizing upon the notion that client-side scanning preserves the protection of end-to-end encryption.

For governmental policymakers who have been urging companies to nerd harder, wordsmithing harder is just as good. The end result of access to unencrypted communication is the goal, and if that can be achieved in a way that arguably leaves a more narrowly defined end-to-end encryption in place, all the better for them.

All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, the adoption of the iCloud Photos hash matching for iMessage, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. Apple has a fully built system just waiting for external pressure to make the necessary changes. China and doubtless other countries already have hashes and content classifiers to identify messages impermissible under their laws, even if they are protected by international human rights law. The abuse cases are easy to imagine: governments that outlaw homosexuality might require a classifier to be trained to restrict apparent LGBTQ+ content, or an authoritarian regime might demand a classifier able to spot popular satirical images or protest flyers.
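
The matching machinery itself is content-agnostic, which is what makes those existing hashes and classifiers immediately usable. A hypothetical illustration (the hash values are placeholders, not real data):

```python
# The matcher does not know what its hashes mean; it flags whatever set
# it is loaded with. All values below are placeholders for illustration.
def matches_blocklist(content_hash: str, blocklist: set[str]) -> bool:
    return content_hash in blocklist

csam_hashes = {"hash-of-known-abuse-image"}        # the announced purpose
satire_hashes = {"hash-of-popular-protest-image"}  # a regime's substitution

# Identical call, entirely different political meaning:
matches_blocklist("hash-of-popular-protest-image", satire_hashes)  # True
```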

Now that Apple has built it, they will come. With good intentions, Apple has paved the road to mandated security weakness around the world, enabling and reinforcing the arguments that, should the intentions be good enough, scanning through your personal life and private communications is acceptable. We urge Apple to reconsider and return to the mantra Apple so memorably emblazoned on a billboard at 2019’s CES conference in Las Vegas: What happens on your iPhone, stays on your iPhone.

_________________________________________


Kurt Opsahl is the Deputy Executive Director and General Counsel of the Electronic Frontier Foundation. In addition to representing clients on civil liberties, free speech and privacy law, he counsels on EFF projects and initiatives.  For his work responding to government subpoenas, Opsahl is proud to have been called a “rabid dog” by the Department of Justice. He co-authored Electronic Media and Privacy Law Handbook in 2007 and was named as one of the “Attorneys of the Year” by California Lawyer magazine for his work on the O’Grady v. Superior Court appeal. Email: kurt@eff.org

Go to Original – eff.org

