The new law will give regulators the power to force tech companies to tackle child sexual abuse on their platforms.
The amendments to the Online Safety Bill, announced today by the Home Office, would allow Ofcom to require big tech companies like Facebook and Google to use their “best efforts” to prevent, identify and eliminate child sexual abuse.
The move was welcomed by the National Society for the Prevention of Cruelty to Children (NSPCC), which said it would help stem what it called an “online child abuse tsunami”.
The amendment is a small but significant enhancement of the powers of Ofcom, which would become a technology and social media regulator if the proposed Online Safety Bill becomes law.
It will allow Ofcom to demand evidence that child sexual abuse is being addressed, even as the technology behind a platform changes.
Meta, which owns Facebook, WhatsApp and Instagram, has announced plans to effectively lock down direct messages on Facebook Messenger and Instagram using end-to-end encryption, a technology that keeps chats secure but can also make them inaccessible to anyone trying to keep users safe.
Pros and cons of encryption
Home Secretary Priti Patel condemned Meta’s encryption plans in the strongest possible terms, calling them “morally wrong and dangerous”, and law enforcement agencies including Interpol and Britain’s National Crime Agency (NCA) have criticised the technology.
But Whitehall officials insist they are not against encryption itself, only the problems it poses for law enforcement and police forces, who rely on direct evidence of child sexual abuse to initiate investigations and make arrests.
Last year, the Internet Watch Foundation successfully blocked 8.8 million attempts by UK internet users to access videos and images of abused children.
Faced with exploitation on this scale, officials say they must at least maintain current levels of access, which rely on tech companies reporting abuse cases to the authorities.
The case of David Wilson, for example, who impersonated girls online to elicit explicit images from young men, began with a report from Meta. Wilson was jailed for 25 years in 2021 after admitting 96 counts.
The new law will give Ofcom the power to require tech companies both inside and outside the UK to identify and remove child sexual abuse content, potentially giving the UK regulator global reach over encrypted services.
However, officials say this doesn’t mean other apps and services can’t be encrypted, and argue that technologies exist that could allow police forces to access the material they require without compromising privacy.
The new law would require tech companies to take action against child sexual abuse “where it is proportionate and necessary to do so”, giving Ofcom the ability to strike a balance between privacy for users and safety for children.
However, while the move may seem like a peaceful resolution of the vexing encryption problem, it may not mark the end of the conflict.
‘Online child abuse tsunami’
Apple’s attempt to scan iPhone images for child sexual abuse images was delayed last year following outcry from privacy campaigners.
The system, called NeuralHash, was designed to identify images in a privacy-preserving way by performing analysis locally on phones rather than in Apple’s data centres, but privacy campaigners warned that the software could be abused by governments or authoritarian regimes.
Whitehall officials say the fear is overblown, pointing to the results of the Safety Tech Challenge Fund, a government-funded partnership with industry to produce technology that can “keep children safe in end-to-end encrypted environments” – such as an algorithm that automatically shuts down the camera when nudity is detected.
The announcement of the law change comes as police data obtained by the NSPCC reveals what the charity describes as an “online child abuse tsunami”.
Freedom of Information requests filed by the charity revealed that sexual contact offences against children had increased by 80% in four years, rising to 6,156 in the last year – an average of almost 120 offences a week.
Sir Peter Wanless, chief executive of the NSPCC, welcomed the change to the Online Safety Bill, saying it would strengthen protections around private messaging.
He told Sky News: “This positive step shows that there is no need to trade off privacy with the detection and disruption of child abuse material and grooming.”