Australia’s eSafety Commissioner, a government agency responsible for keeping citizens safe online, has released a report on how seven technology platforms work to protect children from online sexual abuse – and found that most are slow to respond or lack appropriate processes.
The Commissioner oversees Australia’s Basic Online Safety Expectations, which set out how Australia expects online platforms to behave. At their core, the expectations require that platforms do their best to weed out unlawful or harmful material, allow users to report it, and respond to the Commissioner’s requests for information.
In August 2022, the Commissioner sent transparency requests asking seven service providers – Apple, Meta, WhatsApp, Microsoft, Snap, Skype and anonymous chat service Omegle – to explain the tools, policies and processes they use to counter child sexual exploitation and abuse (CSEA) material and activity. The Commissioner asked how they address the dissemination of such abhorrent material, the online grooming of children, and the use of video calling and conferencing services to provide live feeds of child abuse.
Among its findings on the organizations’ responses, the Commissioner noted that Microsoft does not use PhotoDNA image recognition technology – which it helped develop and promote – as a tool to stop the dissemination of child sexual abuse images online.
The commissioner also criticized Apple and Microsoft, saying that “despite the wide availability of PhotoDNA detection technology, they do not seek to proactively detect child abuse material stored on their widely used iCloud and OneDrive services.”
“Apple and Microsoft also reported that they don’t use technology to detect live streaming of child sexual abuse in video chats on Skype, Microsoft Teams or FaceTime,” despite long-standing and widespread use of Skype in particular for this crime.
Microsoft offered an explanation for why it does not monitor Teams video for CSEA.
Microsoft reported that the average response time for reports about CSEA in Teams was two days – the same as for OneDrive and one day longer than for its Xbox Live services. The report notes, however, that some Teams review queues had taken as long as 19 days to consider matters.
Other platforms fared better: Meta reported that Instagram can detect and remove CSEA within two hours and forty seconds of on-device detection, and that it processes reports from Instagram users in about 40 minutes.
But the report is also full of evasions, distractions and excuses for why more comprehensive measures to detect and eliminate CSEA are not in place.
For example, WhatsApp does not share information about banned users with Instagram or Facebook. A user banned for CSEA on Facebook, therefore, is not necessarily banned on Instagram.
Snap and Microsoft do not attempt to detect previously unseen CSEA material.
Omegle is the only one of the seven that tries to detect CSEA in live streams, video calls, or video conferences. Snap and Apple do not attempt to identify the grooming of minors. Apple does not provide in-service reporting tools on its online services.
We could go on, but you get the idea. Throughout the 63 pages of the report, readers will find many examples of inaction that, if corrected, would provide children with greater protection.
eSafety Commissioner Julie Inman Grant pointed out that the report is not comprehensive – it only provides answers to questions her agency asked seven specific service providers. However, some of the responses provided describe the global capabilities of respondents, meaning this report is at least a window into how some of the most powerful tech companies are tackling — or not tackling — the horrors of CSEA. ®
https://www.theregister.com/2022/12/16/esafety_comissioner_csea_report/ Microsoft Teams is a vector for child sexual abuse material • The Register