11.05.2023

eco: “CSAM Regulation severely curtails civil liberties and is set to fail before the ECJ at the latest”

  • CSAM Regulation curtails right to privacy of minors as well as adults
  • Network of hotlines must be embedded in regulatory plans

Since its publication in May 2022, the EU Commission’s proposal for a “Regulation laying down rules to prevent and combat child sexual abuse” (the CSAM Regulation) has sparked much discussion, not to mention severe criticism. Media reports have stated that the legal service of the EU Council of Ministers has, in an internal report, expressed serious reservations about the EU Commission’s controversial draft. According to this assessment, the indiscriminate chat control envisaged is not a targeted measure but a general detection obligation, and it violates the fundamental right to respect for private life; as such, the CSAM Regulation is doomed to collapse before the European Court of Justice (ECJ) at the latest. In a position paper, the German federal government also reserves the right to make “further calls” on numerous points of dispute, and plans to continue its “active” participation in the negotiations.

Commenting on the current status of negotiations on the CSAM Regulation, the Head of the eco Complaints Office Alexandra Koch-Skiba has the following to say: “The protection of children and young people from sexual violence is a critical goal to which eco has been strongly committed for years through its Complaints Office’s activities. Nevertheless – or, rather, precisely for this reason – we are critical of the EU Commission’s draft regulation. The obligations it currently contains would ultimately lead to mass surveillance and undermine important end-to-end security technologies. They would also favour the blocking of Internet content instead of consistent work on deleting depictions of abuse on the basis of expanded and intensified cooperation. While we welcome the fact that the German federal government’s statement now opposes detection obligations for encrypted communication and client-side scanning, this by no means addresses all of our concerns regarding detection obligations for companies and chat control. Non-encrypted private communication, as well as the use of (cloud) storage services, must also be regarded as worthy of protection.”

CSAM Regulation curtails the right to privacy

In view of the ongoing European legislative process and the German federal government’s further votes on the CSAM Regulation, the Association of the Internet Industry appeals for a fundamental reconsideration of the detection obligations, or for these to be explicitly designed as an “ultima ratio” – linked to strict and clear preconditions, more strictly limited in terms of content, and restricted exclusively to known depictions of abuse:

“According to the CSAM Regulation, online providers – when ordered to do so – will have to actively search for known or unknown depictions of child sexual abuse as well as for grooming activities. Incorporating unknown material and grooming, however, poses the grave risk that content that is not legally objectionable will also be reported – for example, erotic content shared in the context of permissible sexting. This leads not only to an unnecessary overload of law enforcement authorities, but also to ‘false suspicions’ and to the ‘viewing’ of permissible private communication. The CSAM Regulation thus seriously curtails privacy, especially in the intimate sphere, one of the most protected areas of the right to privacy. This would affect minors as well as adults.”

This is one of the reasons why eco is critical of the draft report on the CSAM Regulation, published on 26 April 2023 by the Committee on Civil Liberties, Justice and Home Affairs (LIBE). Although the committee proposes to make the detection obligations more of a “last resort”, it does not currently plan to limit detection to already known depictions of abuse.

Network of hotlines must be embedded in regulatory plans

The Association of the Internet Industry is also critical of the European plan to establish new public authorities or institutions for combating child abuse; instead, established, functioning structures should be more closely embedded in the current plans, and existing cooperation and synergies should be promoted and expanded.

“Here we would have liked to see an even stronger commitment by the German federal government to involving the hotlines that have already been established. For many years now in Germany, the eco Complaints Office, FSM and jugendschutz.net hotlines have been important partners of the German Federal Criminal Police Office (BKA) in combating depictions of abuse; for over 15 years, their collaboration has also been based on a written cooperation agreement. These hotlines are also an enormously important contact point for Internet users wanting to report depictions of abuse, especially because they allow anonymous reports. This important bridging function should not be jeopardised under any circumstances!” says Head of the eco Complaints Office Koch-Skiba.

For the finalisation of the report on the CSAM Regulation, the Association of the Internet Industry calls on the LIBE Committee to more strongly underscore the important role of the hotlines and the INHOPE network, and to explicitly earmark them as cooperation partners and important actors within the framework of the CSAM Regulation.

Here you can read the detailed eco STATEMENT on the EU Commission’s proposal for a regulation laying down rules to prevent and combat child sexual abuse: https://international.eco.de/download/204062/

 

 

Alexandra Koch-Skiba