‘Irresponsible failure’: Google, Meta, Snap and Microsoft slam EU over child sexual abuse law lapse

The European parliament has blocked the extension of a law that permits big tech firms to scan for child sexual exploitation on their platforms, creating a legal gap that child safety experts say will lead to crimes going undetected.

The law, which was a carve-out of the European Union’s ePrivacy Directive, was put in place in 2021 as a temporary measure allowing companies to use automated detection technologies to scan messages for harms, including child sexual abuse material (CSAM), grooming and sextortion. However, it expired on 3 April, and the EU parliament decided not to vote to extend it, amid privacy concerns from some lawmakers.

The regulatory gap has created uncertainty for big tech companies: while scanning for harms on their platforms is now illegal, they remain liable under a different law, the Digital Services Act, to remove any illegal content hosted on their platforms. Google, Meta, Snap and Microsoft said in a joint statement posted on a Google blog that they would continue to voluntarily scan their platforms for CSAM.

“We are disappointed by this irresponsible failure to reach an agreement to maintain established efforts to protect children online,” the statement said.

The European parliament said in a statement that it was prioritizing its work on legislation to prevent and combat child sexual abuse online, and that negotiations on a permanent legal framework were ongoing, though the body had offered no timeline for agreement or implementation.

Child protection advocates had warned that allowing the legislation to lapse would probably trigger a steep fall in reports of child sexual abuse. They point to a similar legal gap in 2021, when reports of such material from EU-based accounts to the National Center for Missing and Exploited Children (NCMEC) fell by 58% over a period of 18 weeks.

“When detection tools are disrupted, we lose visibility that directly impacts our ability to find and protect child sexual abuse victims,” said John Shehan, vice-president at the NCMEC, a US-based organisation that acts as a clearinghouse for child abuse reports, which it forwards to relevant law enforcement agencies around the world.

“When detection goes dark, the abuse doesn’t stop.”

In 2025, the NCMEC received 21.3m reports from around the world that included more than 61.8m images, videos and other files suspected of being related to child abuse. About 90% of these reports related to countries outside the US.

A spokesperson for the EU parliament declined to comment on whether the legislative body had conducted any assessments to determine the consequences of the lapse of the law.

The EU’s decision to prohibit scanning will have ripple effects in other regions around the world, child safety experts said. Many internet crimes are cross-border, with perpetrators sending illegal images to people or targeting children in other countries. “Sextortionists”, who pose as romantic interests to trick people into sending intimate photographs before making blackmail attempts, may also capitalise on the law change, Shehan said.

“The offender can be anywhere in the world, but they could have unfettered access to minors in Europe now that there’s legal uncertainty around those safeguards and protections to identify when a child is being groomed,” Shehan said.

For the past four years, the proposed child sexual abuse regulation has been under negotiation, with contention arising because it would obligate companies to take measures to minimise risks on their platforms, said Hannah Swirsky, head of policy and public affairs at the Internet Watch Foundation, a UK-based child safety non-profit.

Privacy advocates argue that big tech scanning messages for child abuse threatens fundamental privacy rights and data security for EU citizens, equating these measures to “chat control” that could lead to mass surveillance and false positives.

“There are claims of surveillance or infringement of privacy,” Swirsky said. “Blocking CSAM is not an invasion of privacy. Free speech does not include sexual abuse of children.”

The scanning technology uses machine learning that performs pattern detection to identify known images or videos of abuse, as well as language associated with child exploitation, and does not store any data, said Emily Slifer, director of policy at Thorn, a non-profit that builds technology to detect online child abuse, which is commonly used by companies and law enforcement.

The system works by having trained analysts review known CSAM obtained from external sources, such as reports from police, the public or investigations into websites known for hosting child abuse material. When analysts confirm that content is illegal child sexual abuse, they generate a unique digital fingerprint – known as a hash value – that identifies that exact image. Lists of hash values are then shared with platforms, which use automated systems to scan uploads and block matching content instantly, without the need for a human to view it.

“The technology doesn’t find babies in bathtubs and things like that. If you just think of what an image of abuse would look like versus what consensual content would look like: those are two very different pieces of material, and technology can determine those patterns between them,” Slifer said.

While the EU has blocked scanning for child abuse, it has allowed tech companies to voluntarily scan messages for the detection of terrorist content under legislation adopted in 2021, she said.

“The EU is effectively risking open doors for predators,” Swirsky said. “If the EU is serious about protecting children online, then it needs to agree on a permanent legislative framework for safeguarding children and for enabling detection.”

This article was amended on 10 April 2026 to correct the name of the EU’s ePrivacy Directive.
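The hash-matching workflow described above – fingerprint confirmed material, share the hash list, compare uploads against it automatically – can be sketched in a few lines. This is a simplified illustration, not any platform’s actual implementation: the example values are made up, and it uses a cryptographic hash (SHA-256), which only matches byte-identical files, whereas production systems use perceptual hashes such as PhotoDNA that also match visually similar copies.

```python
import hashlib

def fingerprint(file_bytes: bytes) -> str:
    """Generate a unique digital fingerprint (hash value) for a file."""
    return hashlib.sha256(file_bytes).hexdigest()

# Hypothetical hash list shared with a platform; the entry here is
# derived from a harmless placeholder file for illustration only.
known_hashes = {fingerprint(b"confirmed-illegal-example")}

def should_block(upload: bytes) -> bool:
    """Block an upload if its fingerprint appears on the shared hash list.

    No human views the content and no data is stored: only the
    fingerprint is computed and compared.
    """
    return fingerprint(upload) in known_hashes

print(should_block(b"confirmed-illegal-example"))  # exact match: blocked
print(should_block(b"ordinary-holiday-photo"))     # no match: allowed
```

The key property is that platforms never exchange the images themselves, only fingerprints, which is why a single confirmed review by a trained analyst can propagate instant blocking across many services.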



Elon Musk’s xAI sues Colorado over new rules for artificial intelligence

Elon Musk’s artificial intelligence company, xAI, has filed a lawsuit against the state of Colorado over a new AI law set to take effect in June. The suit seeks to block the state from enforcing the law, which would impose new requirements on AI systems to protect state residents from “algorithmic discrimination” in sectors such as education, employment, healthcare, housing and financial services. Colorado was the first state to pass a comprehensive bill to regulate AI. The company claims the law infringes on its first amendment free-speech protections and would force xAI to “promote the state’s ideological views on various matters, racial justice in particular”, according to the Financial Times, which first reported the lawsuit. “Its provisions prohibit developers of AI systems from producing speech that the state of Colorado dislikes.”


Amazon upsets ebook lovers by ending support for old Kindle devices

Amazon is to stop supporting older Kindle models, leaving longtime ebook fans unable to access new content from the Kindle store. Devices released during or before 2012 will no longer receive updates from 20 May, affecting owners of older Kindles, including the earliest models such as the Touch and some Fire tablets. It is thought that 2m e-readers could be affected. Users will still be able to read ebooks they have downloaded, and their accounts and their Kindle library will remain accessible on mobile and desktop apps. Active users have been offered discounts to help “transition to newer devices”.


OpenAI shelves Stargate UK in blow to Britain’s AI ambitions

OpenAI has put on hold plans for a landmark UK investment, citing high energy costs and regulation, in a blow to the government, which has put AI at the centre of its growth strategy. Stargate UK was part of the UK-US AI deal announced last September, in which US companies appeared to commit £31bn to the UK’s tech sector, part of a larger series of investments intended to “mainline AI” into the British economy. It came as the Labour government seeks to make AI and datacentres the engine of its growth plans, alongside closer ties with Europe and regional growth. “This is a wake-up call for the government to manage energy costs in the UK and foundation infrastructure,” said Victoria Collins MP, the Liberal Democrat spokesperson for science, innovation and technology. “We cannot be dependent on US tech companies to build our own sovereign capabilities – whether that’s energy cost, supply or even data and phone signal.”


British computer scientist denies he is bitcoin developer Satoshi Nakamoto

A British computer scientist has insisted he is not the elusive developer of bitcoin, after a report claimed to unmask him as its creator. A story in the New York Times details a years-long effort to unmask Satoshi Nakamoto, the mysterious author of the bitcoin white paper which laid the theoretical foundations for modern digital currencies. It names Adam Back, a London-born computer scientist and entrepreneur. In a thread on X, Back promptly denied being the mysterious – and presumably ultra-wealthy – technologist. “I also don’t know who satoshi is, and i think it is good for bitcoin that this is the case, as it helps bitcoin be viewed [as] a new asset class, the mathematically scarce digital commodity,” he wrote.


Britons warned about Russian hackers targeting internet routers for espionage

Russian hackers are exploiting commonly sold internet routers to harvest information for espionage purposes, the UK’s cybersecurity agency has said. The hack could allow attackers to obtain users’ credentials, redirect them to fake sites, and potentially access other devices on their home network such as phones and PCs, said Alan Woodward, a professor at the University of Surrey. The National Cyber Security Centre said on Tuesday the operations were “believed to be opportunistic in nature, with the actor targeting a wide pool of victims and then likely filtering down for users of potential intelligence value at each stage of the exploitation chain”. It follows a common pattern of cyber-actors targeting edge devices – hardware such as internet routers or internet-connected security cameras – that act as a bridge between users and the cloud. Woodward said: “It’s not the first time that warnings have come out about routers.”