Chat control: Internet industry demands protection for encryption

In two letters, IT industry associations from across Europe level massive criticism at the chat control proposal. The plan, they argue, is rushed and endangers everyone's privacy. On encryption and client-side scanning, they therefore take a firm stand.

Camera on a keyboard: IT associations warn against chat control. (Symbolic image) – All rights reserved IMAGO / YAY Images

Shortly before the final negotiations on chat control in the coming weeks, IT associations from across Europe have raised serious concerns. In two open letters to the EU Council Presidency, the EU Commissioner for Home Affairs and Members of the European Parliament, the organisations warn against undermining encryption and privacy. They demand that the legislative package must protect encryption. A total of 26 organisations signed the letters, among them the German association Eco and the Austrian provider association ISPA.

Both letters unanimously warn against a rushed adoption of the Regulation to prevent and combat child sexual abuse. The Council of Ministers wants to adopt its position in September, the responsible Law Enforcement Working Party meets on 20 September, and the European Parliament will deliberate in October.

Client-side scanning undermines privacy and security

The associations are particularly concerned about encryption and privacy. The letter from the mostly Central and Eastern European digital associations calls for recognising the "crucial role" that encryption technologies, including end-to-end encryption, play in providing private and secure communications for users, including children.

The second letter, signed primarily by Western European associations, also criticises the circumvention of encryption. Client-side scanning "would seriously undermine the robust protection that end-to-end encryption provides to people's privacy and security, and increase the likelihood of unjustified privacy breaches." Cybersecurity experts, they write, have repeatedly warned that weakening any part of an encrypted system would compromise the safety and security of everyone, everywhere.
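
To illustrate the mechanism the letters criticise, here is a minimal conceptual sketch in Python (not taken from the letters): with client-side scanning, the message is checked on the device before end-to-end encryption is applied. The hash set, function names and the stubbed "encryption" are hypothetical and do not describe any existing provider's implementation or the technologies envisaged by the Regulation.

```python
import hashlib

# Hypothetical hash list of "known material" (placeholder value, purely illustrative).
KNOWN_HASHES = {hashlib.sha256(b"example of known material").hexdigest()}


def client_side_scan(plaintext: bytes) -> bool:
    """Hash the *unencrypted* message on the user's device and compare it
    against a database of known material. This step necessarily sees the plaintext."""
    return hashlib.sha256(plaintext).hexdigest() in KNOWN_HASHES


def send_message(plaintext: bytes) -> str:
    """Simplified send path of a hypothetical messenger with client-side scanning."""
    if client_side_scan(plaintext):
        # In a real deployment this would trigger a report; here we only flag it.
        print("flagged on the device, before any encryption")
    # Only after the scan is the message "end-to-end encrypted" (stubbed below),
    # so the confidentiality guarantee of E2EE no longer covers the scanning step.
    ciphertext = plaintext[::-1].hex()  # stand-in for real encryption, NOT secure
    return ciphertext


if __name__ == "__main__":
    send_message(b"hello")                      # passes the scan, then sent
    send_message(b"example of known material")  # flagged before encryption
```

The scan necessarily operates on the plaintext on the user's device, which is why the associations argue it undermines the guarantees of end-to-end encryption regardless of how the transport itself is secured.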

Criticism of automated detection

In the view of the IT associations, detection measures can only be a last resort. They should only come into play once it is clear that a provider has failed to exhaust all mitigation measures. Furthermore, caution is warranted when it comes to detecting previously unknown depictions of child abuse and so-called "grooming", meaning adults initiating contact with children with the intent to abuse them. Detecting both, the associations write, is "technically and operationally difficult" and "requires the involvement of humans and analysis of the context of the communication".

Both groups of associations also urge that orders must be limited to services that are technically able to implement them. Moreover, providers of cloud infrastructure and app stores, for example, carry only a low risk of misuse. The term "hosting service" in the current legislative text, they add, is too broadly defined.

The Central and Eastern European associations have praise for the European Parliament: the amendments proposed there would mean far more workable solutions and would not stifle innovation. The Parliament's position is not yet final, however. Before a vote takes place in October, the Committee on Civil Liberties, Justice and Home Affairs (LIBE) must first issue its recommendation. Once the Council and Parliament have adopted their positions, the trilogue negotiations with the EU Commission will begin.

 


Here is the letter from the mostly Central and Eastern European digital associations ("Joint letter"), extracted from the PDF:


  • Date: September, 2023
  • To: Spanish Presidency of the Council
  • To: European Commissioner for Home Affairs Ylva Johansson and DG HOME
  • To: MEPs
  • Subject: Joint letter from European industry players

The Child Sexual Abuse Regulation must be balanced, technologically neutral and future-proof

Dear Madam/Sir,

The undersigned organizations and their members have long been committed to combating online child sexual abuse (CSA) and fully share the objectives of the European Commission to prevent and fight these crimes. Cooperation between regulators, tech companies, law enforcement agencies, governments and civil society is crucial in the process – and so is a sound legal framework that is effective, balanced, technologically neutral and future-proof.

We welcome the progress made in the European Parliament so far. We believe these improvements maintain the essence of the proposal while providing more feasible options for the industry to continue to innovate in this space while also further scaling their efforts.

While we applaud the Spanish Presidency’s commitment to reaching a compromise, considering the complexities of the proposal we warn against adopting a rushed general approach in the EU Council without addressing the most critical aspects of the legislation. In this regard, we share the concerns expressed by the European Data Protection Board and the European Data Protection Supervisor about the impact of the proposal on fundamental rights, particularly the rights to privacy and the protection of personal data, as well as about the lack of legal clarity related to detection and delisting orders.

To address our concerns, we propose the following five points that can contribute to creating a sound and proportionate legal framework, with the ultimate goal of benefitting society and keeping children safe:

1. Given the extremely high stakes, it is critical to find the right balance between protecting children and privacy. We believe it is possible to strike such a balance by carefully evaluating the feasibility of technological solutions, allowing innovation to continue in this space and by avoiding enshrining broad legal actions that would violate fundamental rights. Any technology should be developed and discussed continuously in close cooperation with industry. The required technical solutions should be implementable on a technical level without interfering with digital infrastructure and networks. The proposal should recognize the crucial role that encryption technologies, including end-to-end encryption, play in providing private and secure communications for users, including children. Strong encryption, including end-to-end encryption, protects users’ sensitive data – including that of individuals, corporations, and governments. Requiring providers to engineer vulnerabilities into products and services would undermine the security and privacy of customers’ data. In cases where encryption is used in the cloud, any request to break encryption could also undermine the information technology infrastructure and leave customers with sensitive data exposed.

2. Without voluntary actions and an appropriate derogation from the relevant provisions of the e-Privacy Directive, some services will not be able to proactively search for illegal content. Providers may need to stop detection of online child sexual abuse in certain services while they wait for a detection order, which could lead to a gap in online child safety. We propose providing a clear legal basis and long-term derogation from the ePrivacy Directive to allow providers of electronic communications services to continue to innovate and carry out voluntary detection efforts. We believe that the current e-Privacy Directive derogation is transparent, proportionate and reliable.

3. Detection and delisting orders must be a measure of last resort. They need to be flexible, with appropriate safeguards. The text should clarify that services should exhaust all risk mitigation measures before a detection or delisting order can be issued. This is why voluntary efforts must be maintained. In addition, it is unclear how providers could meet this obligation without breaching the prohibition of general monitoring obligations, since human review remains essential and detecting these types of CSA cannot be fully automated through to enforcement decision-making. Legal mandates should only be used as last-resort measures where voluntary efforts and other risk mitigation measures are deemed insufficient and, then, only when such orders do not violate the prohibition on general monitoring.

4. Narrow the scope of detection orders to appropriate service providers: the legislation should focus on services that present a high risk of abuse due to their nature. Software application stores, search engines, number-based interpersonal communication services and cloud infrastructure providers are examples with low risk due to the nature of the service. Similarly, interpersonal communication services and cloud infrastructure providers are not well placed to take action due to technical and contractual limitations. We propose keeping only high-risk service providers that are able to act in scope, while the order should be directed first and foremost to the hosting service to which the content was uploaded.

5. The regulation should be aligned with other pieces of EU legislation to prevent the fragmentation of the digital single market. The legislation should build upon the recently adopted Digital Services Act (DSA) and its risk assessments. Governments should avoid any action requiring companies to create security vulnerabilities in their products and services, preserve the DSA's ban on general monitoring, and carefully map out the interplay with other pieces of legislation.

We are aware of the high stakes this legislation involves – fulfilling the goals of the legislation is crucial for all stakeholders, and above all, for children. We believe that our proposed solutions help find such a framework.

We remain committed to fighting child sexual abuse online and to helping the co-legislators reach strong, long-term legislation that tackles it in the most effective, balanced, technologically neutral and future-proof way.

The Confederation of Industry of the Czech Republic, on behalf of the undersigned organizations.

  • Adigital
  • ASIC
  • CCIA
  • Digital Poland
  • Danish Entrepreneurs
  • Confederation of Danish Industry
  • Confederation of Industry of the Czech Republic
  • IAB Polska
  • Infobalt
  • ITI
  • Lewiatan
  • LITKA
  • SAPIE

 

 


Here is the letter from the mostly Western European digital associations ("Joint industry call"), extracted from the PDF:


Joint industry call for protecting encryption and limiting detection orders in the EU Regulation laying down rules to prevent and combat child sexual abuse

September 6, 2023

The undersigned industry associations representing technology companies remain deeply committed to making the digital space safer for everyone and in particular to protecting children online. We firmly stand behind the European Commission’s overarching objective to prevent and combat child sexual abuse.

As such, we believe certain improvements need to be introduced in the proposal for an EU Regulation laying down rules to prevent and combat child sexual abuse (CSA Regulation) in order to achieve a legislative framework that recognises our industry’s efforts to safeguard children and one that better ensures the prosecution of perpetrators.

To this end, we call on EU policymakers to 1) defend the rights to privacy and confidentiality of communications through the specific protection of encryption; 2) make sure that detection orders are a last resort measure; and 3) limit detection orders to those with the ability to act.

1. Safeguard encrypted communications
Encryption (including end-to-end encryption of data) plays a key role in the provision of private and secure communications, including those of minors (1).

The importance of encryption technologies, and end-to-end encryption in particular, in safeguarding the security and confidentiality of users’ communications is already acknowledged in currently applicable EU legislation (2) and should not be undermined by the CSA Regulation. Governments and public authorities rightly want to protect children from harm, but weakening encryption is the wrong approach and would put millions of EU citizens at risk of hacking, fraud and identity theft.
These risks have been highlighted by many actors (3), who have drawn attention to the privacy and security implications of scanning the content of encrypted communications to detect child sexual abuse.

The proposed Regulation fails to clearly exclude end-to-end encrypted services from an obligation to scan message contents and could instead mandate providers to deploy certain potentially invasive technological solutions – such as client-side scanning (CSS) – to execute detection orders. This would seriously undermine the robust protection that end-to-end encryption provides to people’s privacy and security, and increase the likelihood of unjustified privacy breaches. Cybersecurity experts have repeatedly warned that weakening any part of an encrypted system would decrease the safety and security of everyone, everywhere (4).

That is why the EU co-legislators should make sure that the obligations in the CSA Regulation are proportionate to the known risks and explicitly exclude any prohibition or weakening of encryption, including access by any third party to communications and digital data which are not meant to be accessed, read or edited.
In order to respect users’ privacy and ensure children’s safety, encrypted services should be permitted to tackle child sexual abuse without accessing message contents. This should include approaches like product design, the analysis of unencrypted surfaces, and metadata processing.

2. Ensure that mandatory detection is targeted and issued as a last resort measure
The CSA Regulation is an opportunity to build on existing efforts to fight against child sexual abuse. These well-established efforts include detection using high-quality databases of known child sexual abuse material (CSAM), voluntary prevention and detection measures, as well as the development of novel technologies. These tools already result in many actionable reports to law enforcement as well as in the successful prosecution of offenders worldwide. It is of the utmost importance that measures proven to be effective are preserved.

In this context, the CSA Regulation should ensure that the issuance of detection orders remains a last resort measure, enforced only after finding that the provider has failed to take all reasonable and proportionate mitigation measures to address the risk of their services being potentially misused for the purpose of online child sexual abuse. This approach would ensure continuity with existing targeted activity and support law enforcement authorities in investigating and prosecuting offences.
The proposal should therefore, first, enable providers to continue deploying proactive voluntary actions for the prevention, detection and removal of child sexual abuse as a mitigation measure under Article 4 and, second, ensure that detection orders are only activated once it is clear that a certain provider has failed to appropriately mitigate the risks.
Further, caution should be exercised for detection orders of previously unknown CSAM and the solicitation of children (so-called ‘grooming’), given the technical and operational difficulties with detecting this type of content, which requires human confirmation and review of contextual communication (5).

To support this approach, the CSA Regulation also needs to provide clarity on how providers should implement the detection orders, while staying in line with the principle of no general monitoring or active fact-finding obligations, as recently reconfirmed in the Digital Services Act (DSA).

3. Limit detection orders to providers with the ability to act
The CSA Regulation refers to the term ‘hosting service’, which is very broad and encapsulates a variety of service providers with different technical and operational capabilities. For example, certain providers may use cloud computing services to store content uploaded by their users. In this case, both the service provider and the cloud hosting provider would qualify as ‘hosting services’ under this Regulation, even though cloud providers lack full visibility over users’ content and are unable to apply detection orders in a way that is proportionate and safeguards privacy.

Requiring providers like cloud computing services to detect online child sexual abuse would show a disregard for their capabilities and disrupt the confidentiality of data belonging to their customers, which could include businesses and governmental organisations. Co-legislators should introduce language in the CSA Regulation clarifying that detection orders should only be issued to those downstream providers with the technical and operational ability to act, so as to prevent and minimise any possible negative effects on the availability and accessibility of information, in line with Recital 27 of the DSA.

Conclusions
We, the below-mentioned signatories, fully support the proposal’s objective to fight child sexual abuse, and to strengthen existing efforts and ongoing cooperation between national authorities. The new rules need to be proportionate and preserve the privacy of communications, while still allowing for innovation in the fight against child sexual abuse in the EU and beyond. To achieve this, lawmakers need to ensure that the proposal specifically protects encrypted communications and that detection orders are the last step of the process, targeting those providers with the technical and operational ability to act.

While time is of the essence, the CSA Regulation should not be rushed without carefully considering balanced and future-proof solutions to achieve its intended goals. Only in this way can a robust legislative framework that stands the test of time be put in place. The undersigned associations are eager to continue engaging with policymakers and other relevant stakeholders in order to secure ongoing and future efforts to combat child sexual abuse online, while at the same time safeguarding the fundamental right to privacy of EU citizens.

Signatories (in alphabetical order):

  • AFNUM (Alliance Française des Industries du Numérique) – Registered in France under 438608630 (HATVP)
  • CISPE.cloud (Cloud Infrastructure Services Providers in Europe) – 041495920038-44
  • Computer & Communications Industry Association (CCIA Europe) – 15987896534-82
  • CZ.NIC (Czech Internet Association) – Registered in the Czech Republic under 67985726
  • Developers Alliance – 135037514504-30
  • DOT Europe – 53905947933-43
  • Eco (Verband der Internetwirtschaft e.V.) – 483354220663-40
  • EuroISPA (European Internet Services Providers Association) – 54437813115-56
  • FiCom (Finnish Federation for Communications and Teleinformatics) – 29762326480-22
  • Freedom Internet – Registered in the Netherlands under 74768573
  • i2Coalition (Internet Infrastructure Coalition) – 722865639438-43
  • ISPA Austria (Internet Service Providers Austria) – 56028372438-43
  • ITI (Information Technology Industry Council) – 061601915428-87

 

(1) As stated in UNICEF’s White Paper on Encryption, Privacy and Children’s Right to Protection from Harm, “Encryption is also critical to ensure children’s safety. Their digital devices and communications contain personal information that could compromise both their privacy and safety if it fell into the wrong hands.”

(2) Including in Regulation (EU) 2021/1232 (Interim Regulation on a temporary derogation of the ePrivacy Directive to combat online child sexual abuse), in Directive (EU) 2022/2555 (NIS 2 Directive) and in Regulation (EU) 2023/1543 (e-Evidence).

(3) Joint Opinion 04/2022 by the European Data Protection Board and the European Data Protection Supervisor and Open Letter by Academics and Researchers on CSA Regulation.

(4) Paper ‘Bugs in our Pockets: The Risks of Client-Side Scanning’; Paper ‘Keys Under Doormats: Mandating Insecurity By Requiring Government Access to All Data And Communications’; Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression; Electronic Frontier Foundation explained ‘Why Adding Client-Side Scanning Breaks End-To-End Encryption’; and Internet Society factsheet ‘Client-Side Scanning’.

(5) As highlighted in the complementary impact assessment by the European Parliament Research Service.


