Meta continues to face scrutiny over its efforts to combat child sexual abuse material (CSAM) on its platforms. Recent investigations have revealed that CSAM is still being distributed across Meta’s networks, despite the company’s attempts to address the issue.
According to The Wall Street Journal, independent research groups have identified groups distributing child sexual abuse content on both Facebook and Instagram. The problem extends beyond Instagram to the broader universe of Facebook Groups, including large groups explicitly centered on sexualizing children. Meta has acknowledged that its progress in addressing the issue has not been as rapid as it would have liked.
One investigation found that CSAM networks on Instagram, some with millions of followers, have continued to live-stream videos of child sexual abuse in the app, even after being repeatedly reported to Meta’s moderators.
Meta has stated that it is working with other platforms to enhance collective enforcement efforts and has improved its technology to identify offensive content. The company is also expanding its network detection efforts to prevent pedophiles from connecting with each other in its apps.
However, CSAM actors continue to revise their approaches to evade detection, posing a constant challenge for Meta and other platforms. The responsibility is even greater for Meta, given its size and reach.
Meta’s own statistics on the detection and removal of child abuse material highlight the company’s significant role in addressing the issue. In 2021, Meta detected and reported 22 million pieces of child abuse imagery to the National Center for Missing & Exploited Children (NCMEC), which reinforces concerns about the prevalence of the issue on Meta’s platforms.
The company’s gradual shift toward enabling end-to-end encryption by default across all of its messaging apps has also raised concerns about the potential for expanded CSAM distribution. While encryption provides more privacy for users, it could also limit Meta’s ability to monitor and prevent the distribution of CSAM.
Regulators are grappling with these complex considerations as Meta continues to progress with its encryption project.