Britain admits encryption hurdles in online safety law after WhatsApp and Signal threatened to withdraw

The United Kingdom acknowledged potential technical hurdles to its planned crackdown on illegal online content after encrypted messaging companies including WhatsApp threatened to withdraw their services from the country.

Regulator Ofcom will only be able to force tech companies to scan their platforms for illegal content such as child sexual abuse images where it is “technically feasible”, Culture Secretary Stephen Parkinson told the House of Lords on Wednesday as the chamber debated the government’s Online Safety Bill. He said the regulator would work closely with companies to develop and procure new solutions.

“If there is no suitable technology that meets these requirements, Ofcom cannot require its use,” Parkinson said. Ofcom “cannot require companies to use proactive technology on private communications in order to meet the safety duties of the bill”, he added.

The comments are intended to allay concerns among technology companies that scanning their platforms for illegal content could jeopardize privacy and the encryption of user data, handing hackers and spies a backdoor into private communications. WhatsApp, owned by Meta Platforms, threatened in March to pull out of the UK.

“Today it really does seem that the Department for Science, Innovation and Technology is offering some language to the messaging companies so that they can save face and avoid having to follow through on their threats to leave the UK, their second-largest market in the G7,” said Andy Burrows, a tech accountability campaigner who previously worked for the National Society for the Prevention of Cruelty to Children (NSPCC).

Protecting children
The sweeping law, which aims to make the internet safer, is in its final stages in Parliament after six years in the making. Parkinson said Ofcom could still require companies to “develop or procure a new solution” to enable them to comply with the bill.

“It is right that Ofcom can require technology companies to use their significant resources and expertise to develop the best possible protection for children in encrypted environments,” he said.

Meredith Whittaker, president of encrypted messaging app Signal, had earlier welcomed a Financial Times report suggesting the government was backing down from its standoff with tech companies, which quoted unnamed officials as saying no technology exists today that can scan messages without compromising privacy.

However, Security Minister Tom Tugendhat and a government spokesperson said it was wrong to claim the policy had changed.

Feasibility
“As has always been the case, as a last resort, on a case-by-case basis and only where stringent privacy safeguards have been met, Ofcom will be able to direct companies to either use, or make their best efforts to develop or procure, technology to identify and remove illegal child sexual abuse content, which we know can be developed,” the spokesperson said.

On Tuesday ministers met in Westminster with major technology companies such as TikTok and Meta.

The government has pointed to the question of technical feasibility before. In July, Parkinson told Parliament: “Ofcom can only require the use of technology on an end-to-end encrypted service where it is technically feasible.”

The NSPCC, a key supporter of the UK’s approach, said the government’s statement “consolidates the status quo in the bill and the legal requirements for technology companies remain the same.”

Accredited technology
Ultimately, the text of the law leaves it up to the government to decide what is technically feasible.

Once the bill comes into force, Ofcom will be able to serve a company with a notice requiring it to “use accredited technology” to detect and prevent child sexual abuse or terrorist content, or face fines, according to a draft of the law published in July. No technology has yet been accredited, as the process of identifying and approving services will not begin until the bill comes into force.

Previous attempts to resolve the dilemma have revolved around so-called client-side, or on-device, scanning. But in 2021, Apple Inc. delayed rolling out such a system, which would have scanned photos on devices for signs of child sexual abuse, after intense criticism from privacy advocates who feared it could pave the way for other forms of surveillance.

Andy Yen, founder and CEO of privacy-focused VPN and messaging company Proton, said: “As it stands, the bill still allows a legally binding requirement to be imposed that would effectively ban end-to-end encryption in the UK, undermining citizens’ fundamental right to privacy, and it leaves it to the government to define what is ‘technically feasible’.”

“Whatever the good intentions behind today’s statement, without additional safeguards in the Online Safety Bill, all it takes is for a future government to change its mind and we are back where we started,” he said.

© 2023 Bloomberg LP

