Europe’s online child abuse law will make us all less safe

EuroActiv Politico News

Iverna McGowan is director of the Centre for Democracy & Technology’s Europe Office. 

Would you trust Europol with pictures of your children? With pictures of your pets, friends and everything else on your camera roll? 

In a misguided effort to combat child sexual abuse material (CSAM), the European Commission has come up with a new proposal that could lead to exactly that. Ostensibly designed to detect “any potential abuse,” the proposal, if passed, would essentially create a mass surveillance regime across the European Union. 

If this is news to you, it’s likely because lawmakers are framing the legislation solely as a crackdown on CSAM — which is a very important goal. Unfortunately, the proposal is breathtakingly broad in scope, and it would actually make everyone in the EU, including children, less safe. 

As currently written, the draft law obliges all hosting services and providers of interpersonal communications to scan all content, and then make a judgment on what to hand over to law enforcement. According to the proposal, providers of applications will face legal consequences unless they report “any information indicating potential online child sexual abuse.” 

So, not only would providers be required to scan all communications, but to avoid liability, they would most likely err on the side of caution and massively over-report people’s communications. 

Obviously, such an approach would have a serious impact on free expression and association across Europe: you would never again be able to communicate electronically with confidence that your information isn’t being intercepted by a social media company, messaging provider or law enforcement. 

Despite the strong political momentum on both sides of the Atlantic to rein in the power of social media giants, this proposal would grant them state-like powers to control and surveil all users’ communications and information. 

The premise of this proposal — that filtering technologies and general monitoring obligations are core to eradicating CSAM online — is fundamentally flawed. 

As a result of this legislation, huge amounts of data, including text and images from private communications, would end up in a Europol database — an agency that has already come under fire from the EU’s data protection watchdog over its previous mismanagement of large datasets. Privacy and security apps that use end-to-end encrypted communications would also be effectively banned, or their encryption backdoored — and not just for Europeans but for everyone. 

The innocent picture you take of your baby in the bath and send to their grandparents could end up in a law enforcement database or, worse, in the hands of child abusers who could manipulate that image. 

This proposal also seeks to use technology to identify never-before-seen examples of child sexual abuse material. But this technology is nowhere near advanced enough to be reliable or accurate. Instead, it’s prone to error.

Consider this: When Tumblr rolled out its system designed to detect “adult content,” it mistakenly flagged art, advocacy, memes, photos of people’s dogs and posts about design patents. Facebook also uses classifiers to enforce its ban on Adult Nudity and Sexual Activity, and has mistakenly been blocking news, journalism and health information.  

This proposed EU surveillance regime, which would invariably make similar serious errors, would be devastating for those involved. The use of such technology poses significant risks of flagging content in ways that are both overbroad and underinclusive, and it has been shown, time and time again, to disproportionately impact vulnerable and marginalized populations.  

The law would also break end-to-end encryption, ending the possibility of anyone securely communicating online. It would empower government agencies and administrative bodies to issue detection orders, so they can gain access to communications without a warrant. 

Imagine how this could play out in Viktor Orbán’s Hungary, where investigative journalists and human rights defenders are already at risk. And even in other EU countries, it would make it far too easy for governments to gain access to citizens’ communications on the pretense of combatting child abuse. 

This move would also open the door for authoritarian governments around the world, which could compel communication service providers to use the same types of technology deployed in service of this regulation to hunt down and prosecute their political opponents, trade union organizers or anyone who dares to dissent. This would clearly undermine the EU’s foreign policy efforts to combat unlawful surveillance and support human rights defenders worldwide. 

Sexual abuse thrives where there is abuse of power, lack of accountability and fear of dissent and truth. To prevent abuse, and to get truth and justice for victims and survivors, we need transparency, accountability and strong independent institutions that rebalance power. We simply can’t afford for the online element of such efforts to lack these very safeguards.  

Additionally, the EU’s approach to CSAM shouldn’t be limited to the online world: Thousands of victims have been denied justice due to statutes of limitations, or through institutions like the Catholic Church escaping liability. Underinvestment in child protection and well-being is an ongoing challenge across the EU, and not all member countries have even adopted national child protection laws.  

There are better, more holistic ways to protect children’s rights in the EU. Putting in place a mass surveillance system that endangers everyone’s human rights — including children’s rights — isn’t the right place to start.