Client-side scanning to detect harmful child abuse material • The Register

Exclusive Legislation has been proposed in the UK and Europe that would give authorities the power to undermine strong end-to-end encryption in pursuit of what they consider justice.

If adopted, these rules would – according to a leading UK computer security expert – allow the reading and analysis of people’s previously private communications with a view to potentially preventing the dissemination of child sexual abuse material and terrorist communications.

Ross Anderson, a professor of security engineering at the University of Cambridge’s Department of Computer Science and Technology, argues that these proposed regulations – which rely on technical measures such as on-device message scanning and machine learning to fight crime, using algorithms instead of police, social workers, and teachers – amount to magical thinking and unsound policy.

In a paper entitled Chat Control or Child Protection?, to be distributed via arXiv, Anderson rebuts arguments put forward in July by UK government cyber and intelligence experts Ian Levy, technical director of the UK National Cyber Security Centre, and Crispin Robinson, technical director for cryptanalysis at Government Communications Headquarters (GCHQ), the UK equivalent of the NSA.

This pro-snoop paper written by Levy and Robinson, titled Thoughts on Child Safety on Commodity Platforms, was cited by EU Home Affairs Commissioner Ylva Johansson before the European Parliament’s Civil Liberties Committee (LIBE) on Monday in support of the EU Child Sexual Abuse Regulation (2022/0155), according to Anderson.

The debate is spurred by the approaching expiry, on 3 August 2024, of an EU law that authorizes online service providers to voluntarily detect and report child sexual abuse material in users’ communications and files. Without replacement rules, proponents of the proposed child protection regulation argue, harmful content will go undetected.

But online rights groups claim the legislation under consideration would cause its own harm.

“The proposed EU Child Sexual Abuse Regulation is a draft law designed to help combat the dissemination of child sexual abuse material,” said the European Digital Rights Initiative (EDRi) in response to Johansson’s proposal.

“Instead, the providers of all our digital chat, messaging and email will be forced to know what we are typing and sharing at all times. It will remove the possibility of anonymity from many legitimate places online. And it may also require dangerous software to be downloaded onto any digital device.”

Meanwhile, the UK is considering its own online safety law that would allow encryption to be bypassed through on-device scanning. Similar proposals, such as the EARN IT Act, keep resurfacing in the US.

Levy and Robinson’s paper — itself a response to a paper opposing device-side scanning that Anderson co-authored with 13 other security experts in 2021 — outlines the different types of harm children can face online: consensual peer-to-peer sharing of indecent images; viral image sharing; offender-to-offender sharing of indecent images/videos; offender-to-victim grooming; offender-to-offender communication; offender-to-offender group communication; and streaming of on-demand contact abuse.

Anderson argues that this taxonomy of harm reflects the interests of law enforcement officers rather than the welfare of children. “From a child protection and rights perspective, we need to look at the actual harms and then the practical priorities for policing and social work interventions that can minimize them,” he says.

Anderson questions the data used to stoke media outrage and political concern about harm to children. Citing the 102,842 reports sent by the National Center for Missing and Exploited Children (NCMEC), the US-based non-profit that coordinates child abuse reports from tech firms, to the UK’s National Crime Agency (NCA), he estimates that these led to about 750 prosecutions for indecent imagery, “well under 3 percent of the 27,233 prosecutions for indecent image crimes in 2019, of which 26,124 involved images of children.” And the number of such prosecutions peaked in 2016 and has been declining since, he says.
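The proportion quoted above is easy to verify from the figures given in the article. A quick sketch of the sum (all numbers are those cited here, not independently sourced):

```python
# Rough check of the figures Anderson cites (numbers from the article).
ncmec_reports = 102_842          # NCMEC reports forwarded to the UK's NCA
prosecutions_from_reports = 750  # Anderson's estimate of resulting prosecutions
total_prosecutions_2019 = 27_233 # all indecent-image prosecutions in 2019

share = prosecutions_from_reports / total_prosecutions_2019
print(f"{share:.1%}")  # prints "2.8%" -- consistent with "well under 3 percent"
```

In other words, on these figures, the industrial-scale reporting pipeline accounts for only a small slice of actual prosecutions.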

“In short, the data does not support claims of large-scale escalating harms initiated online that are preventable through image scanning,” Anderson says.

The danger of relying on questionable evidence

But real damage is done by false positives, he notes, pointing to Operation Ore, a crackdown on online child abuse that began two decades ago and led to false accusations.

Levy and Robinson propose to “run language models entirely locally on the client to recognize the language associated with grooming”. They compare this approach to the on-device CSAM scanning that Apple proposed in the US (and later shelved, at least for the time being). While acknowledging the issues raised at the time — false positives, mission creep, vulnerability to tampering — they claim, “Through our research, we have found no reason why client-side scanning techniques cannot be implemented securely in many of the situations encountered.”

According to Anderson, law enforcement long ago gave up scanning emails for keywords like “bomb” because it doesn’t work and because traffic analysis, which doesn’t require access to content, is more effective. And he doesn’t expect natural language processing (NLP) models to do any better.

“Using modern NLP models to detect illegal speech — whether it’s sexual grooming, terrorist recruitment, or hate speech — is very error-prone,” he says. “Our research group has many years of experience in searching for violent political extremism online, as well as fraud and spam. Scanning for textual content alone tends to be difficult, depending on the type of material sought, with error rates well over 5-10 percent.”

With a 5 percent false alarm rate, Anderson suggests, each of Europe’s 1.6 million police officers would have to deal with 625 alerts a day – hardly a practical scenario. That’s not to say there are no options, just that technical fixes that break encryption aren’t fit for purpose.
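A back-of-the-envelope version of that workload arithmetic is below. Note the daily message volume is an assumption chosen to reproduce the article's figure of 625 alerts per officer; Anderson's paper derives its own estimate.

```python
# Hypothetical reconstruction of the false-alarm workload figure.
false_positive_rate = 0.05         # 5 percent, per the quoted error rates
messages_per_day = 20_000_000_000  # assumed EU-wide daily messages (hypothetical)
eu_police_officers = 1_600_000     # roughly 1.6 million, per the article

false_alarms = messages_per_day * false_positive_rate
alerts_per_officer = false_alarms / eu_police_officers
print(round(alerts_per_officer))  # prints 625
```

The point of the exercise: at population scale, even a modest error rate produces an alert volume no police force could triage.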

In an email to The Register, Anderson pointed out that the private sector has shown interest in helping governments get into content scanning.

“There is a company called Thorn that is pushing for the scanning contract and would love a government mandate to install its software in your chat clients,” he said.

“Chat operators would hate that, which could be one reason why Apple produced its own client-side scanning software, which caused a storm last year before some of it was withdrawn. There are also some UK startups that GCHQ and the Home Office have funded to produce prototypes. Perhaps that is just to get Big Tech to do the work themselves.

“The big weak point, however, is how Big Tech handles user reports, which ranges from poor (Facebook) to almost non-existent (Twitter). There is real reason for governments to mandate better performance here, as the paper points out and as I also discuss in my article on the UK Online Safety Bill that came out last week.”

Anderson, in his paper on chat control, suggests that child safety and privacy activists could join forces to push rules that force online service providers to remove illegal content when it’s reported.

“Currently, tech companies pay attention to takedown requests from the police and copyright lawyers, because ignoring them can be expensive – but they ignore ordinary users, including women and children,” he says. “This needs to be fixed, whether through criminal sanctions or substantial fines.”

Anderson’s recommendations for dealing with child abuse focus on traditional, harder approaches: high-quality, community-based policing rather than push-button solutions; social engagement; empowering young people; and respect for human rights.

“The idea that complex social problems are amenable to cheap technological solutions is the siren song of software vendors and has lured many a gullible government agency onto the rocks,” says Anderson. “When ministers buy the idea of a magical software ‘solution’, as the industry likes to call its products, the results are often disappointing and sometimes catastrophic.”

In addition, Anderson says that pervasive surveillance violates human rights without justification. “The rule of law must take precedence over ‘national security,’” he concludes. “We must retain a moral advantage over competing authoritarian states, not just a military and technological advantage. End-to-end encryption must therefore remain available for moral reasons.”

And, he says, encryption must remain in place for valid cybersecurity reasons, as Levy and Robinson previously acknowledged and as he and his engineering colleagues argued in their earlier paper. ®

