YouTube is the single biggest video streaming platform on the internet. Everyone knows that. Anyone can create an account, upload videos, and voila: they have a channel. What's more, people can easily monetize their videos if the quality is good enough. The platform makes it easy for advertisers to connect with creators, which is one of the reasons many people now make a living out of producing YouTube videos.

YouTube bans gun-related content

Being the dominant platform also means that when rules and regulations change, content creators can see their source of income compromised en masse.

Take, for example, the fact that YouTube is demonetizing videos that feature guns, or references to guns.

The rationale is that some people believe that, one way or another, less exposure to guns and the violence associated with them might curb violent tendencies. We understand that. But these restrictions on gun portrayal are also interfering with video game streamers' ability to earn money from their work.

See, a huge percentage of video game content creators on YouTube upload footage from shooters. "Destiny," "Call of Duty," "Battlefield," and of course "Counter-Strike" all portray guns. It becomes confusing for creators who had been monetizing their videos without issue and are now being filtered for the same kind of content they have been uploading for months, if not years.

Algorithms for demonetization

A "demonetized" video stays on the site and people can still view it, but its creator cannot earn any money from it. According to a report by Forbes, many gamers have already been demonetized by AI bots that scan video titles for the word "gun." User TotalBiscuit ran an experiment to test how the algorithms decide which videos to demonetize: he asked his followers to upload copies of the same video under different titles.

Sure enough, the copies with "gun-based" titles were almost instantly demonetized, while the AI took a good while to catch those masked with unrelated titles and unidentifiable thumbnails.

YouTube's statement on the issue is self-protective. The company says these "gun-related" videos are not advertiser-friendly and are therefore excluded from the advertising pool.

That is reasonable, and one could say YouTube is protecting advertisers this way as well. But the company had better put an algorithm in place that can tell video game footage apart from real gun content, or it risks losing a significant part of its user base.