YouTube's recommendations steer kids toward videos about school shootings and other gun-related content, according to a new report from the Tech Transparency Project (TTP), a nonprofit watchdog group. The report says YouTube's recommendation algorithm is "driving boys interested in video games" to school shooting scenes, instructions on how to use and modify weapons, and other gun-centric content.
The researchers behind the report set up four new YouTube accounts, identifying them as two 9-year-old boys and two 14-year-old boys. All of the accounts viewed playlists of content about popular video games, such as Roblox, Lego Star Wars, Halo and Grand Theft Auto. The researchers then tracked the accounts' recommendations over a 30-day period in November.
"The study finds that YouTube pushed content on shootings and weapons to all of the gamer accounts, but at a much higher volume to the users who clicked on the YouTube-recommended videos," wrote TTP. "These videos included scenes depicting school shootings and other shooting events; graphic demonstrations of how much damage guns can inflict on the human body; and instructions on how to convert a handgun into a fully automatic weapon."
As the report notes, many of the recommended videos appear to violate YouTube's own policies. The recommendations included a video of a young girl firing a gun and tutorials on converting handguns into "fully automatic" weapons, among other modifications. Some of these videos were also monetized with ads.
In a statement, a YouTube spokesperson pointed to the YouTube Kids app and its in-app tools, which “create a safer experience for tweens and teens” on its platform.
"We welcome research into our recommendations, and we are exploring more ways to bring in academic researchers to study our systems," the spokesperson said. "But in reviewing the methodology of this report, it's difficult for us to draw strong conclusions. For example, the study doesn't provide context on how many overall videos were recommended to the test accounts, and it doesn't explain how the test accounts were set up, including whether YouTube's supervised experience tools were applied."
The TTP report is far from the first time researchers have raised questions about YouTube's recommendation algorithm. The company has also spent years working to keep so-called "borderline" content (videos that don't violate its rules but may not be suitable for mass distribution) from showing up in recommendations. And last year, the company said it was considering disabling sharing for some of that content.