The YouTube algorithm recommended hundreds of videos about guns and gun violence to accounts of kids interested in video games, according to a new study. Some of the suggested content gave instructions on how to convert pistols into automatic weapons or showed school shootings.
The investigation was conducted by the Tech Transparency Project (TTP), a nonprofit that monitors the practices of Big Tech. For this study, analysts created four YouTube accounts: two identified as nine-year-olds and two identified as 14-year-olds. They then posed as minors who liked video games, playing videos on the subject, with a key difference: half of the profiles watched only the material recommended by the platform, while the rest never interacted with the recommendations.
The study found that YouTube pushed gun and shooting content to all of the accounts, but in a much higher volume to the users who clicked on the recommended videos, in some cases up to 10 times more. In other words, if a minor showed interest in the videos suggested by YouTube, the algorithm served more and more content related to real-world violence.
One video showed an elementary school-age girl holding a gun. In another, a man fired a .50 caliber pistol at a dummy head filled with lifelike blood and brains. The team also documented videos showing real footage of school shootings and similar events.
Gun videos and YouTube policies
The study's findings contradict the assurances YouTube has offered about the safety of its algorithms. “YouTube recommendations aren’t actually driving viewers toward extreme content,” wrote Cristos Goodrow, then the company’s vice president of engineering, in a 2021 post on the platform’s blog. “Delivering responsible recommendations is our top priority,” he said at the time.
YouTube sent 382 videos of real firearms to the account posing as a nine-year-old that watched only recommended content. All of this happened in a single month, an average of more than 12 per day. The volume was much higher for the corresponding 14-year-old account, which was shown 1,325 videos of real firearms, an average of more than 44 per day.

The accounts that ignored the suggestions still received some gun-related videos, but far fewer: 34 for the nine-year-old and 172 for the 14-year-old.
“Video games are one of the most popular activities for children. You can play a game like Call of Duty without ending up at a gun shop, but YouTube is taking them there,” Katie Paul, director of TTP, told the AP. “It’s not the video games, it’s not the kids. It’s the algorithms.”
The danger beyond the screen

On two different days during the investigation, YouTube recommended more than 100 gun videos in a single day to the 14-year-old account. Some content appeared in the suggestions multiple times, including one video titled “12- and 14-year-olds have HUGE fight with police,” which shows footage taken by authorities of a late-night shooting.
Investigators also found a video called “How to Put a Switch on a Glock (Educational Purposes Only),” among several others that explained how to use a gun. YouTube removed it after a short time, but the owner of the channel uploaded the material again with slight changes to the title. As of this week, the video was still available, and it was monetized with advertising.
In May 2022, a shooting in Buffalo, New York, left 10 people dead and three injured. Most of the victims were Black. The 18-year-old shooter had learned how to improve his aim, reload firearms faster, and modify his rifle by watching YouTube videos, according to a report from the Everytown for Gun Safety Support Fund.
A 2022 Pew Research Center survey found that YouTube, owned by Google, is the most popular social media platform among children between the ages of 13 and 17 in the United States: 95% of respondents said they used it. The closest competitor is TikTok, at 65%. Despite its popularity among young people, TTP maintains, YouTube has largely escaped the intense scrutiny that social networks like TikTok and Instagram have faced.