Is Mass Surveillance Coming to Europe?

'Going Dark', chat control and the looming threat of client-side message scanning.

Photo by David East on Unsplash

Going Dark

The notion of 'going dark' was popularized by FBI Director James Comey in a 2014 speech at the Brookings Institution. It describes the growing problem that encryption poses for law enforcement agencies trying to (lawfully) intercept electronic communications.

In words heaped with platitudes (the phrase "bad guys" is used five times; "good people" comes up twice), Director Comey warns us that encryption will eventually "lead ... to a very dark place." From beneath a cloak of apparent reasonableness ("tough issues", "find the balance we need") he deploys arguments designed to appeal to base instincts - such as repeatedly conjuring images of children in danger. To quote one example: "When a city posts police officers at a dangerous playground, security has promoted liberty—the freedom to let a child play without fear." A reasonable reader might also conclude that, in a world where a playground can be considered dangerous enough to warrant posting a police officer, encrypted communication should probably be the least of our concerns.

A few years later, it emerged that the FBI had deliberately exaggerated the "going dark" problem. Notably, the Justice Department's own Inspector General found that "FBI staff ignored available methods to access encrypted content on an iPhone, presumably to press its legal case to compel Apple to decrypt the data." (In 2015-16, the FBI was embroiled in a legal dispute with Apple concerning access to the iPhone of a suspect in the San Bernardino, California terror attack. The FBI eventually found a third party to help it access the phone, but ultimately found nothing of relevance to the investigation on it.) In other words, the FBI had revealed itself to be the very embodiment of state overreach.

Governments and their law enforcement agencies continue to have an uneasy relationship with encryption. While an outright ban on encryption or government-mandated backdoors seem unlikely for the moment, lawmakers are constantly exploring new methods of shining a light into the Internet's darker corners.

The EU’s Proposed CSA Regulation

Photo by Christian Lue on Unsplash

Under the noble guise of protecting children, the EU is seeking to implement a sweeping surveillance dragnet for Europe's digital communication. In May 2022, the European Commission (the EU's executive body) published regulatory proposal COM(2022) 209 Final (the CSA Regulation) which "[lays] down rules to prevent and combat child sexual abuse". The proposal is currently being debated in the European Parliament (the EU legislature).

The CSA Regulation raises concerns because of the requirements it imposes on hosting and communication service providers. Article 4 states that "providers of hosting services and providers of interpersonal communications services shall take reasonable mitigation measures" to prevent the distribution of child abuse material. Such measures can include "reinforcing the provider’s internal processes or the internal supervision of the functioning of the service."

To put themselves in a position to mitigate risks, providers will have to closely and invasively monitor how their platforms are being used. Because the providers alone are responsible for deciding which risk mitigation measures to deploy, they can also be held legally liable. This all but guarantees that providers will use the most heavy-handed methods at their disposal to meet regulatory standards.

The CSA Regulation will also impose age verification requirements "for any hosting or interpersonal communication service where there is a risk of solicitation". (Given the text's broad formulation, almost any platform could meet these criteria.) This will lead to the collection of additional sensitive data (e.g. date of birth, copies of ID cards or passports, biometric data), putting all users at increased risk in case of a data breach.

The proposal provoked fiery condemnation from digital rights advocates. In June 2022, a coalition of 118 organizations (including the Chaos Computer Club and the Electronic Frontier Foundation) penned an open letter to the European Commissioners outlining the glaring backwardness of their proposal:

In recent years, the EU has fought to be a beacon of the human rights to privacy and data protection, setting a global standard. But with the proposed CSA Regulation, the European Commission has signalled a U-turn towards authoritarianism, control, and the destruction of online freedom.

This will set a dangerous precedent for mass surveillance around the world.

Open Letter to European Commissioners (8 June 2022)

The CSA Regulation is of particular concern for end-to-end encrypted (E2EE) communication services (WhatsApp, Facebook Messenger, Signal, etc.) which, by design, have no visibility into the content of the communications taking place on their platforms. In fact, merely by virtue of being encrypted, they could be considered at particularly high risk for the distribution of abuse material. The fear is that providers will have to either downgrade their security (i.e. disable or weaken encryption) or install backdoors that would allow them to monitor user activity in some fashion.

One possible consequence of the CSA Regulation dreaded by many privacy advocates is the widespread introduction of client-side message scanning (sometimes referred to as 'chat control' in Europe). This is a form of generalized surveillance in which content is inspected directly on the user's device, before it is ever encrypted. While not a traditional backdoor per se (indeed, supporters tout the fact that client-side scanning doesn't technically break encryption as its chief benefit), the method is unreliable and has the potential to be weaponized.
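That "doesn't break encryption" argument becomes clearer with a minimal sketch of where the scan sits in an E2EE pipeline. Everything below is hypothetical: the function names and blocklist are illustrative stand-ins, and a toy XOR cipher takes the place of a real protocol such as the Signal protocol.

```python
# Hypothetical sketch of client-side scanning in an E2EE messenger.
# None of this is any real app's code.

BLOCKLIST = {"example-banned-content"}  # placeholder for a hash database

def scanner_flags(plaintext: str) -> bool:
    # Stand-in for matching content against known abuse material.
    return any(item in plaintext for item in BLOCKLIST)

def encrypt(plaintext: str, key: int) -> bytes:
    # Toy XOR "cipher", purely for illustration.
    return bytes(b ^ key for b in plaintext.encode())

def send_message(plaintext: str, key: int = 0x42) -> bytes:
    # The scan happens on-device, BEFORE encryption - which is why
    # proponents can claim the encryption itself is never broken.
    if scanner_flags(plaintext):
        print("match: content would be reported to the provider")
    return encrypt(plaintext, key)  # encryption proceeds as normal

ciphertext = send_message("hello world")
```

The encryption is indeed left intact; the content is simply read before it is locked.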

Apple's Failed Attempt at Client-Side Scanning

Photo by kuu akura on Unsplash

In December 2022, Apple quietly shuttered its own client-side scanning initiative, which it had announced just the year before. The software, called NeuralHash, would locally scan photos and check them against a database of known abuse material provided by the National Center for Missing and Exploited Children and similar organizations. It would do this using a technique called hashing, which "is a way of converting one set of data, like a picture, into a different unique representation, such as a string of numbers." If the hash of a photo on your device matched the hash of known abuse content from the database, a flag would be raised. Raise enough flags (around 30) and your account would be reviewed by an Apple employee - and then possibly blocked and reported to the authorities depending on the outcome of their investigation.
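The matching logic can be sketched in a few lines. To be clear, this is not Apple's implementation - the production design used blinded hashes, private set intersection and threshold secret sharing so that matches were only revealed server-side once the threshold was crossed - and every name and value below is a placeholder.

```python
# Toy sketch of threshold-based hash matching; NOT Apple's actual system.
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash such as NeuralHash. A real
    # perceptual hash maps visually similar images to the same value;
    # the cryptographic SHA-256 used here does not have that property.
    return hashlib.sha256(image_bytes).hexdigest()

# Hashes of known abuse material (supplied by NCMEC in Apple's design).
KNOWN_HASHES: set[str] = set()  # placeholder contents

FLAG_THRESHOLD = 30  # roughly the review threshold Apple described

def should_escalate(photo_library: list[bytes]) -> bool:
    """Return True once enough photos match the database to warrant
    human review of the account."""
    matches = sum(1 for img in photo_library
                  if image_hash(img) in KNOWN_HASHES)
    return matches >= FLAG_THRESHOLD
```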

Questions were immediately raised regarding the reliability of NeuralHash: How secure is the hash database? Can those who maintain and update the database be trusted? What about false positives? Has the system been thoroughly tested? These important questions were never adequately addressed by Apple.

University researchers were also able to demonstrate the susceptibility of NeuralHash to so-called hash collision attacks. This type of attack involves crafting a seemingly innocuous image that matches the hash value of known harmful content. The image could be sent to unsuspecting users who would have no idea what they were looking at. (The image would likely be something unrecognizable - perhaps just a jumble of shapes and colors.) Imagine for a moment that you are part of a Telegram or Discord group with hundreds or thousands of participants. Any image sent to that group could be downloaded to your device and scanned. NeuralHash would then detect the image as a match and flag your device.
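The intuition behind such an attack can be demonstrated against a deliberately weakened toy hash. (The published attacks on NeuralHash used gradient descent against the extracted neural network rather than brute force; the 16-bit hash and random "images" below are purely illustrative.)

```python
# Toy demonstration of the idea behind a hash-collision attack:
# keep generating inputs until one matches a chosen target hash.
import hashlib
import os

def toy_hash(data: bytes) -> bytes:
    # Truncated to 2 bytes (16 bits) so collisions are cheap to find.
    return hashlib.sha256(data).digest()[:2]

# The hash of "known harmful content" that sits on the blocklist.
target = toy_hash(b"known harmful content")

attempts = 0
while True:
    candidate = os.urandom(32)  # stand-in for an innocuous-looking image
    attempts += 1
    if toy_hash(candidate) == target:
        print(f"collision after {attempts} attempts: {candidate.hex()}")
        break
```

With a 16-bit hash, a collision turns up after roughly 65,000 random tries; NeuralHash's much longer output makes brute force infeasible, which is why the real attacks optimized the image directly against the network instead.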

Apple pulled the plug on NeuralHash at around the same time that it introduced a new "Advanced Data Protection" feature in iOS. The feature is essentially end-to-end encryption for most iCloud data, including device backups, photos, notes and iMessage histories. The move is noteworthy because Apple had previously used the availability of unencrypted iCloud backups as an argument against implementing backdoors on devices. Apple reasoned that law enforcement could always request a suspect's iCloud data should it need access to information on the suspect's device. With "Advanced Data Protection", this is no longer possible. Apple is stepping up its commitment to privacy - likely due in no small measure to changing consumer demands.

Signal Threatens to Leave the UK if the Online Safety Bill Passes

Signal is a popular encrypted messaging app. It's available on multiple platforms and has been downloaded hundreds of millions of times. It is owned by the non-profit Signal Foundation, which relies on donations to operate.

The Online Safety Bill is the UK’s equivalent of the EU’s CSA Regulation. According to the UK government, the bill (introduced under Boris Johnson's premiership) is intended to “protect children and adults online” and “make social media companies more responsible for their users’ safety on their platforms”. It's expected to become law by autumn 2023.

As with the CSA Regulation, certain provisions in the Online Safety Bill could force communication service providers to weaken or remove encryption and could eventually lead to the introduction of client-side message scanning.

Signal Foundation president Meredith Whittaker recently told the BBC that if the Online Safety Bill passes in its current form, the company would "100% walk rather than ever undermine the trust that people place in us to provide a truly private means of communication." WhatsApp also said it would refuse to lower the security of its app "for any government".

CSA Regulation: The Current State of Play

Recent discussions between EU member states about the proposed CSA Regulation have revealed wildly differing views on encryption and client-side scanning.

France and Germany both strongly favor protecting encryption. Germany went so far as to suggest forbidding the use of any technology that could weaken, break or otherwise circumvent encryption. (German language link.) The Netherlands, Austria and Latvia are also in favor of granting encryption special protection.

On the other hand, the European Commission along with Croatia, Greece, Lithuania, Spain and Cyprus argue against exempting encryption from the CSA Regulation or otherwise affording it special protection. They claim that doing so would create considerable gaps in the legislation's coverage. Following their reasoning, providers would be able to avoid their legal obligations by simply encrypting their platforms.

On 1 March, the German Parliament's Digital Committee held a hearing focused on the proposed CSA Regulation. Nine experts (German language link.) from law enforcement, privacy groups and child protection organizations were invited to testify. They were largely united in their criticism of the proposal: "All experts, including child protector organizations[!], agree that the EU proposal goes too far and that it would undermine fundamental human rights protected by the EU Constitution."

Law enforcement was represented at the 1 March hearing by Martin Hartmann, a senior public prosecutor for cybercrime: "There is no prosecution at any cost." Lawmakers, he continued, must "balance the pursuit of criminal justice with fundamental rights. The 'going dark' scenario has been slightly overblown."