Tech Companies Urge EU Council to Preserve Voluntary Detection of Online Child Sexual Abuse Content
Tech companies and platforms are urging EU Council member states to support a new proposal aimed at combating online child sexual abuse while maintaining the voluntary detection of illegal content. This would enable the continued use of technologies like CSAI Match and PhotoDNA, which automatically identify abusive images shared on online platforms.
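For readers unfamiliar with how such tools operate, the sketch below illustrates the general idea of matching uploads against a list of known abusive-image fingerprints. It is a simplified, hypothetical example: real systems such as PhotoDNA and CSAI Match use proprietary perceptual hashes that survive resizing and re-encoding, whereas this sketch uses a plain SHA-256 digest as a stand-in, and the hash list shown is invented.

```python
# Illustrative sketch only: flag uploads whose hash appears in a list of known hashes.
# PhotoDNA/CSAI Match use perceptual hashing; SHA-256 here is a simplified stand-in.
import hashlib
from pathlib import Path

# Hypothetical hash list supplied by a clearinghouse; the value below is made up.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def flag_upload(path: Path) -> bool:
    """Return True if the uploaded file matches a known hash and should be reported."""
    return sha256_of(path) in KNOWN_HASHES
```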
Currently, online companies can flag illegal images under a temporary exemption from the EU’s ePrivacy rules, set to expire in April 2026. The tech industry is advocating for the inclusion of this exemption in the final legislation and, in the interim, for its extension, since negotiations on the permanent rules are expected to run past that deadline.
“While the extension is merely a temporary fix, we strongly support establishing a legal basis for voluntary detection, as proposed by the Polish Presidency in its latest compromise proposal,” they stated.
Ella Jakubowska, Head of Policy at the digital rights NGO EDRi, believes the extension is likely. “With member states still at an impasse, EU lawmakers may have no choice but to extend the current derogation,” she told Euronews.
The proposal has been under discussion since 2022, with various rotating presidencies struggling to achieve consensus. After unsuccessful attempts by the Czech, Spanish, Belgian, and Hungarian presidencies, Poland has adopted a new approach to reach a compromise.
The European Commission’s original proposal included controversial “detection orders” that would allow authorities to access private communications. However, Poland’s latest proposal removes this provision, focusing instead on voluntary detection. Detection orders would only be applied as a last resort to platforms classified as “high risk” that do not take action against online child abuse.
Emily Slifer, director of policy at Thorn, a child protection NGO that develops abuse-detection technologies, acknowledged that while voluntary detection is not ideal, it is currently the most politically feasible option. “When voluntary detection ceased in 2021 due to legal uncertainties, reports of child sexual abuse material (CSAM) plummeted by 58% overnight—removing it from the legislation would severely hinder child protection efforts,” Slifer said.
ECPAT, an international organization combating child sexual exploitation, has also expressed support for the current proposal but stressed the need for both voluntary and mandatory detection systems to avoid gaps in child protection.
A particularly contentious issue has been the potential scanning of end-to-end encrypted communications. Messaging services like WhatsApp, Signal, Telegram, and Messenger use encryption to ensure that only the sender and recipient can access the messages. Both the Commission’s proposal and earlier drafts of the Council’s text included provisions for scanning encrypted communications, raising significant privacy concerns.
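To illustrate why scanning such messages is technically fraught, the minimal sketch below shows the core property of end-to-end encryption: two parties derive a shared key from exchanged public keys, and only holders of that key can read the content. This is a hypothetical demonstration using the third-party Python `cryptography` package, not the actual protocol (such as the Signal protocol) used by the services named above.

```python
# Minimal end-to-end encryption sketch: X25519 key agreement plus AES-GCM.
# Assumes the 'cryptography' package is installed; not the real messaging protocols.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates a key pair; only the public keys are exchanged.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

# Both sides derive the same shared secret from their own private key and the peer's public key.
shared = alice_priv.exchange(bob_priv.public_key())
assert shared == bob_priv.exchange(alice_priv.public_key())

# Derive a symmetric message key from the shared secret.
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None, info=b"demo").derive(shared)

# The sender encrypts; the platform relaying the ciphertext cannot read it without the key.
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"hello", None)
plaintext = AESGCM(key).decrypt(nonce, ciphertext, None)
```

Because the platform only ever handles the ciphertext, any requirement to detect content in such messages would mean scanning on the user’s device or weakening the encryption itself, which is the crux of the privacy objections.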
In a positive development for privacy advocates, the latest proposal excludes scanning encrypted messages. Former MEP Patrick Breyer called this a “breakthrough in preserving our right to confidential digital correspondence,” referring to the new proposal as the “Half-good Polish Chat Control proposal.”
ECPAT has urged the Polish presidency and EU member states to swiftly adopt a robust legal framework to address the escalating online child sexual abuse crisis. As cases of online abuse continue to rise, the organization emphasizes the urgent need for action to protect vulnerable children.
With member states still unable to reach consensus, a revised version of the proposal, incorporating feedback from diplomats, will be presented in hopes of achieving a final compromise. The regulation was first proposed by the European Commission in May 2022, and the European Parliament has since adopted its negotiating position. Once the Council reaches an agreement, the long-awaited interinstitutional negotiations, known as trilogues, can commence.
SOURCE: GBCGHANAONLINE