Mastodon, the decentralized network seen as a viable alternative to Twitter, is full of child sexual abuse material (CSAM), according to a new study from Stanford’s Internet Observatory (via The Washington Post). In just two days, the researchers found 112 instances of known CSAM in 325,000 posts on the platform – with the first instance appearing after a five-minute search.
To conduct its research, the Internet Observatory scanned the 25 most popular Mastodon instances for CSAM. The researchers also used Google’s SafeSearch API to identify explicit images, along with PhotoDNA, a tool that helps detect flagged CSAM. During its search, the team found 554 pieces of content that matched hashtags or keywords commonly used by child sexual abuse groups online, all of which were flagged as explicit with the “highest confidence” by Google SafeSearch.
Open posting of CSAM is “disturbingly widespread”
There were also 713 uses of the top 20 CSAM-related hashtags across the Fediverse in posts containing media, as well as 1,217 text-only posts pointing to “off-site CSAM trading or grooming of minors.” The study called the open posting of CSAM “disturbingly widespread.”
One example cites the extended mastodon.xyz server outage we noted earlier this month, an incident caused by CSAM posted to Mastodon. In a post about the incident, the server’s sole maintainer said they were alerted to content containing CSAM, but noted that moderation is handled in their spare time and can take up to a few days – it’s not a giant operation like Meta with a worldwide team of contractors, it’s just one person.
While the maintainer said they took action against the content in question, the host of the mastodon.xyz domain suspended it anyway, making the server inaccessible to users until they were able to reach someone to restore its listing. After the issue was resolved, the administrator of mastodon.xyz said the registrar added the domain to a “false positive” list to prevent future deletions. However, as the researchers point out, “what triggers the action is not a false positive.”
“We got more PhotoDNA hits in a two-day period than we’ve had in the entire history of our organization doing any kind of social media analysis, and it’s not even close,” David Thiel, one of the report’s researchers, said in a statement to The Washington Post. “Much of this is simply a result of what appears to be a lack of the tooling that centralized social media platforms use to address child safety concerns.”
While decentralized networks like Mastodon are growing in popularity, so are concerns about their safety. Decentralized networks don’t use the same moderation approach as mainstream sites like Facebook, Instagram, and Reddit. Instead, each decentralized instance controls its own moderation, which creates inconsistency across the Fediverse. That’s why the researchers suggest that networks like Mastodon employ more robust tools for moderators, along with PhotoDNA integration and CyberTipline reporting.