When YouTube was created 11 years ago, it was praised as a forum for public expression and free speech. It was a haven where content creators could safely share their ideas on any issue and express themselves to the world without fear of repercussions or censorship.
Sadly, this open forum has begun to change at the company Google purchased in 2006. As of Aug. 11, YouTube added new content restrictions that exclude videos covering controversial topics from being monetized. Enforcement takes the form of “demonetization,” the suspension of ad revenue from a video.
With many large YouTube content creators, such as “nigahiga,” “PewDiePie” and “vlogbrothers,” having over a million subscribers and making a living from the ad revenue their videos generate, these restrictions will have a huge negative impact on the free flow of ideas on the site.
The aftermath of these new restrictions is evident in the case of YouTube news reporter Philip De Franco, who makes videos about current events and pop culture and, according to Vox, had at least 12 of his videos demonetized as a result of this decision.
The ridiculousness of these guidelines was revealed when De Franco’s video about social justice warriors got dinged for being controversial.
This event revealed YouTube’s illogical standards for judging what counts as controversial content. According to the new guidelines, videos containing “war, political conflicts, natural disasters and tragedies” are deemed “controversial,” meaning not suitable for all audiences, and as a result their creators won’t make money on them. Broader guidelines covering violence, nudity and hateful content have always been part of YouTube’s policies.
While the new rules also reduce controversy and hate speech in the comment sections of videos by allowing users to moderate comments, they have serious implications. The restrictions completely erase YouTube’s original appeal as a forum that allowed for a diversity of ideas and content, regardless of how controversial some of those ideas were.
The company also introduced a new program known as “YouTube Heroes,” which essentially incentivizes the reporting, and thus the restriction, of videos. Through the program, users can flag inappropriate videos, add captions and subtitles to videos and share their knowledge of the site in the YouTube Help Forum.
Each of these actions counts as a “contribution” and earns the user a certain number of points. Once a certain threshold of points has been reached, the user gains perks such as helping moderate content in the “YouTube Heroes Community” and even earning an internship at YouTube.
These perks essentially give individuals the powers of moderators, meaning that instead of just being able to flag videos, they can now review content and decide whether or not it is controversial. All of that seems innocent enough.
The problem with granting these perks and moderator powers, however, is that individuals can abuse them and are susceptible to personal bias. YouTube creators can suffer if their videos are wrongly demonetized or taken down as a result of one person’s bias or opinion.
These efforts ultimately destroy much of the appeal that YouTube has had. The site’s content creators came to YouTube in search of an opportunity to express themselves freely in the public eye and earn a living — or at least some money — doing it.
Although YouTube CEO Susan Wojcicki has not personally addressed the situation, the company has denied that it is targeting controversial content and shutting out creators.
To remain an open and fair forum, YouTube should return to what it was originally made to be: a platform without needless restrictions. If these problems are not addressed, YouTube risks losing creators and viewers over time, the very basis of the company's existence.