British intelligence officials rehash the "think of the children" argument to demand weaker encryption, rekindling fears that this stance will be adopted globally


Two UK security officials have released a document which once again suggests that weakening encryption would be good for society. One of the reasons put forward is the need to protect children from online sexual abuse. The picture revives concerns about this posture spreading on a global scale. Are we heading toward an Internet run on the Chinese model everywhere on the planet?

Almost four years ago, the same pair of British intelligence officials published a paper arguing for weakened encryption as a way to protect children from online sexual abuse. They are trying their luck again with a new article that makes a very similar argument while acknowledging its flaws.

"This document is not a rigorous security analysis, but aims to show that there are ways today to address much of the harm associated with the online sexual exploitation of children, and also to show the scale and scope of the work that remains to be done in this area," they write.

"We have not identified any technique capable of detecting child sexual abuse material as accurately as content scanning, and while the privacy considerations raised by this type of technology should not be overlooked, we have presented arguments suggesting that it should be possible to deploy it in configurations that address many of the most serious privacy concerns," they add.

The duo argues that, to protect children from sexual abuse and the material it produces, it is in everyone's best interest for law enforcement to have some access to private communications. The same argument has been made many times before, usually invoking terrorists, drug traffickers and organized crime.

Their proposal is to revive attempts to automatically filter content posted on online platforms. Providers would be asked to take part in verifying that child sexual abuse material is not being posted online. This verification could be performed by an artificial intelligence trained to detect this type of material. Law enforcement could then be notified and work with these companies to combat the scourge. In the first quarter of this year, the Electronic Frontier Foundation took a position against a similar approach being developed on the US side.
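The kind of scanning described above usually boils down to matching uploads against a database of hashes of known material. The following is a minimal sketch under simplifying assumptions: the hash list is hypothetical, and an exact cryptographic hash is used for clarity, whereas deployed systems (e.g. Microsoft's PhotoDNA) use perceptual hashes that survive resizing and re-encoding.

```python
import hashlib

# Hypothetical database of hashes of known illegal images.
# Real systems use perceptual hashes robust to re-encoding;
# SHA-256 is used here purely for illustration.
KNOWN_HASHES = {
    # SHA-256 of the stand-in payload b"foo"
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def scan_upload(content: bytes) -> bool:
    """Return True if the uploaded content matches a known hash,
    in which case a provider would flag it for human review."""
    digest = hashlib.sha256(content).hexdigest()
    return digest in KNOWN_HASHES

print(scan_upload(b"foo"))            # True  -> would be flagged
print(scan_upload(b"holiday photo"))  # False -> passes through
```

Note that this design only catches already-known material; the AI classifiers mentioned in the article aim to detect previously unseen content, which is where most of the false-positive risk comes from.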

The EFF’s full position

A recent campaign in the UK demonizes encryption as something only a criminal would want to use, and the reasoning behind the EARN IT bill currently being debated in the US Senate is largely the same. Everywhere we turn, we find senators and think tanks saying that governments around the world need to hold big tech companies accountable, and that a key step in that process is banning end-to-end encryption. Criminals, they say, should not be able to shield their communications from scrutiny. It is not surprising to hear governments invoking crime to justify encroachments on individual liberty, or to hear them use loaded words like "hide". Are you "hiding" when you lock the door to your house simply because the government has no right to enter without a warrant? Are you "hiding" something when you seal the envelope of the card you send your valentine? Even if you accept that this is concealment, end-to-end encryption is not used solely, or even primarily, to hide from mass government surveillance.

As lawmakers begin to consider the supposed misuse of end-to-end encryption, they would do well to think about the positive ways it is used every day. Much time is spent (rightfully so) on how it protected whistleblowers like Edward Snowden, but end-to-end encryption has vital applications much closer to home. Free and encrypted messaging, for example, helps protect young gay people from violence born of intolerance (at home and abroad, e.g. in Ghana). At the same time, in a world where abusers can track their victims simply by stashing an AirTag in their bag, end-to-end encryption plays a direct role in helping victims escape these relationships by giving them the ability to contact friends for help. There are as many use cases for end-to-end encryption as there are people using it. Claiming otherwise reflects not only a lack of imagination, but also a discourse that can only be held from a position of power and privilege.

Our campaigns and the history of the Free Software Foundation (FSF) show that we don't just want to hold big tech companies accountable. Beyond that somewhat misleading expression, we believe the companies that make up "Big Tech", such as Apple, Microsoft and Amazon, must be radically reshaped around respect for users' freedom. And while we appreciate that companies like Apple and Meta advertise end-to-end encryption heavily, we must remember that these companies are by no means friends of user freedom. As activists, we need to help them understand that their support for technologies such as end-to-end encryption should extend to other important aspects of software and user freedom.

With initiatives like the EARN IT Act, Congress is only going after "Big Tech" in a very specific sense, namely by seriously jeopardizing the privacy of all computer users. At the same time, any legal measure regulating encryption will affect how encryption is implemented in free software and how free software users rely on it. Once a backdoor is created, there is no way to control how it is used or by whom. That is, by definition, a security risk.
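The structural problem with backdoors can be sketched with a toy example. This uses a deliberately insecure XOR "cipher" purely for illustration, not real cryptography: the point is that if every ciphertext is also readable under a single escrow key, whoever obtains that one key, lawfully or otherwise, can read everyone's messages.

```python
import secrets

# A single escrow ("backdoor") key, held by the provider or a government.
ESCROW_KEY = secrets.token_bytes(16)

def xor(data: bytes, key: bytes) -> bytes:
    """Toy XOR keystream; stands in for a real cipher."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def backdoored_encrypt(message: bytes, user_key: bytes) -> tuple[bytes, bytes]:
    # Ciphertext for the recipient, plus a second copy under the escrow key.
    return xor(message, user_key), xor(message, ESCROW_KEY)

def escrow_decrypt(escrow_copy: bytes) -> bytes:
    # No user key needed: the escrow key alone recovers ANY message.
    return xor(escrow_copy, ESCROW_KEY)

alice_key = secrets.token_bytes(16)
_, escrow_copy = backdoored_encrypt(b"private message", alice_key)
print(escrow_decrypt(escrow_copy))  # b'private message' -- without Alice's key
```

Every user's security now rests on one secret that the user neither holds nor chose, which is exactly the single point of failure the text describes.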

In a world under EARN IT, the simple act of using encryption could be grounds for taking you to court. As privacy activists, one of our responses should be to normalize encryption as much as possible. Whistleblowers matter to many freedoms, including user freedom, and we need to ensure that people who report violations of the General Public License (GPL), Digital Restrictions Management (DRM) rootkits, or any other technological injustice do not stand out from the crowd for their use of encryption. That people take seriously the idea that only criminals use encryption should give us pause: it means we have failed in our goal of bringing privacy to the majority of users, and that we have to do more.

The first step to doing more? Stop EARN IT in its tracks. The FSF urges all of its supporters to contact their members of Congress and urge them to vote against the EARN IT Act. Wherever you are in the world, stand firm in defending the right of whistleblowers and ordinary people to protect their communications from mass surveillance. After all, today's average citizen could become tomorrow's whistleblower.

In a similar vein

The EU is considering requiring online service providers to automatically scan for suspicious content in all private chats, messages and emails, generally and indiscriminately. The goal: to combat the sexual abuse of children online. The bill is controversial, given that opinion polls published by the Commission show that the public sees it as the introduction of mass surveillance.

The European Union took a decisive step in its fight against pedophilia, child pornography and other forms of abuse children can be exposed to online by approving the ePrivacy Derogation midway through last year: a majority of MEPs passed the law allowing service providers to scan all private correspondence. A follow-up Commission proposal will turn the screw: Chatcontrol 1.0 provided for online service providers to search chats, messages and private emails on a voluntary basis; Chatcontrol 2.0 (text pending presentation) obliges them to do so and extends the obligation to encrypted communications.

The consequences of the possible adoption of this project are:

  • all online conversations and emails will be automatically scanned for suspicious content. Nothing will remain confidential or secret. No court order or initial suspicion will be required to search messages;
  • if an algorithm classifies the content of a message as suspicious, private or intimate images may be viewed by staff and contractors of international companies and law enforcement agencies. The same content may be seen by unknown persons or end up in the hands of ill-intentioned people;
  • intimate conversations can be read by staff and contractors of international companies and law enforcement agencies, as text recognition filters targeting "child solicitation" often falsely flag intimate conversations;
  • third parties may be falsely reported and investigated for allegedly disseminating child sexual exploitation material. Message and chat monitoring algorithms have been known to flag perfectly legal vacation photos of children on a beach, for example. According to the Swiss Federal Police, 86% of all machine-generated reports turn out to be unfounded;
  • you may run into serious problems during a trip abroad: machine-generated reports about your communications could be transferred to other countries, such as the United States, where data confidentiality remains very poorly regulated, with unpredictable results.

Source: GCHQ

And you?

Is an Internet based on the Chinese model, that is, controlled by governments, inevitable?
From a technical standpoint, what do you think of backdoors as a way of weakening encryption? Are they a viable solution?

Also see:

DOJ plans to crack down on encryption while the techlash iron is hot, hoping Australia's anti-encryption law will make it easier to pass a similar law in the US
EFF Report Reveals How Big Tech Personal Data Trackers Hide in Social Media and Websites, Invading Users’ Privacy with Every Click
The US Congress may pass the EARN IT bill to stop online encryption, despite opposition from tech companies
US senators introduce child protection bill calling for an end to online encryption, seen by critics as a Trojan horse

Leave a Comment