Social media giant Meta, formerly known as Facebook, has announced plans to launch a new safety tool aimed at blocking children from receiving nude images and discouraging them from sending them. The tool, which will be optional and available to adults as well, will be introduced on Instagram and Facebook later this year.

The move follows criticism from governments and police forces over Meta's decision to encrypt Messenger chats by default, which critics argue will make it harder to detect child abuse. The new feature is designed to protect users, particularly women and teenagers, from being sent explicit images or pressured into sending them. Meta has also announced that, by default, minors will be unable to receive messages from strangers on Instagram and Messenger. That change comes after police chiefs in the UK said that the sending of nude images by youngsters has contributed to an increase in sexual offences committed by children.

Legal filings made public as part of a US lawsuit against Meta allege that an estimated 100,000 teenage users of Facebook and Instagram are sexually harassed online every day. Meta has been accused of mischaracterising its work in response to the lawsuit.

The tech giant's decision to protect Facebook Messenger chats with end-to-end encryption (e2ee) has faced criticism from various parties who argue that it hinders the detection of child abuse material. Other messaging apps, such as Apple's iMessage, Signal, and Meta-owned WhatsApp, already use e2ee and have defended the technology. Critics, however, argue that platforms should employ client-side scanning to detect child abuse in encrypted apps: scanning messages on a user's device for known child abuse images before they are encrypted and sent, and reporting any suspected illegal content to the company. The NSPCC, a children's charity, has suggested that Meta's new system demonstrates that compromises between safety and privacy in end-to-end encrypted environments are possible.
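To make the client-side scanning debate concrete, the approach described above can be sketched roughly as follows. This is a hypothetical illustration, not Meta's or anyone's actual implementation: real systems such as PhotoDNA use perceptual hashes that tolerate re-encoding and resizing, whereas this sketch substitutes an exact SHA-256 match, and the hash list, function names, and callbacks are all invented for the example.

```python
import hashlib

# Hypothetical list of hashes of known illegal images, distributed to
# the device. SHA-256 is a simplified stand-in for a perceptual hash.
KNOWN_HASHES = {
    # SHA-256 of the placeholder bytes b"foo", used purely for illustration
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def client_side_scan(image_bytes: bytes) -> bool:
    """Return True if the outgoing image matches a known hash."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

def send_image(image_bytes: bytes, encrypt, report):
    """Hypothetical send path: scan on-device BEFORE encryption.

    The scan runs against plaintext on the user's device, so it works
    even though the server only ever sees ciphertext -- which is exactly
    what privacy advocates object to in this design.
    """
    if client_side_scan(image_bytes):
        report(image_bytes)          # suspected content flagged to the operator
        return None                  # message is not sent
    return encrypt(image_bytes)      # clean content is encrypted and sent
```

The key point of contention is visible in `send_image`: the reporting hook sits before encryption, meaning the end-to-end guarantee no longer covers everything the user sends.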
According to Meta, its new feature does not involve client-side scanning, as it believes this undermines the privacy protection provided by encryption. Instead, the system will use machine learning to identify nudity and will operate entirely on the user’s device. Meta argues that using machine learning to identify child abuse is much more challenging and carries a serious risk of errors when applied to billions of users, potentially resulting in innocent people being reported. Meta has implemented over 30 tools and resources to enhance child safety and has introduced several new child safety features, including default settings that prevent children from receiving messages from people they do not follow or are not connected to. The company’s parental supervision tools will also give parents the ability to deny teenagers’ requests to change their default safety settings.
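The on-device design Meta describes can be contrasted with client-side scanning in a short sketch. Everything here is an assumption for illustration: the classifier is a stub standing in for a small ML model shipped with the app, and the function and field names are invented. The point the sketch makes is architectural: the verdict is computed and acted on locally, and nothing is reported back to the company.

```python
from dataclasses import dataclass

@dataclass
class ScanResult:
    blur: bool   # blur the image for the recipient
    warn: bool   # show a warning before sending/viewing

def on_device_nudity_score(image_bytes: bytes) -> float:
    """Stub for a hypothetical local ML model that estimates the
    probability an image contains nudity. A real implementation would
    run a small neural network bundled with the app; this stub always
    returns 0.0 so the sketch stays self-contained."""
    return 0.0

def handle_image(image_bytes: bytes, threshold: float = 0.8) -> ScanResult:
    # The decision is made entirely on the user's device. Unlike
    # client-side scanning, there is no reporting callback here: no
    # image, score, or verdict leaves the phone.
    score = on_device_nudity_score(image_bytes)
    flagged = score >= threshold
    return ScanResult(blur=flagged, warn=flagged)
```

This also illustrates Meta's error-rate argument: a probabilistic classifier applied to billions of users will inevitably misfire sometimes, which is tolerable when the only consequence is a local blur or warning, but not if a false positive triggered a report about an innocent person.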