YouTube is changing the way it moderates violent video game content

Joanna Estrada
December 3, 2019

YouTube is making a significant change in how it will moderate content that shows video game violence.

"We've heard loud and clear that our policies need to differentiate between real-world vs. simulated violence, and we're updating our enforcement to reflect that", the YouTube Gaming team wrote in a tweet Monday.

YouTube currently has a policy that prohibits violent or graphic content. In some cases, however, the platform elects to age-restrict content rather than remove it, meaning the video can stay on the platform but viewers must sign in to their accounts to watch it.

Google will continue to protect viewers from videos of real-world violence, however.

According to the company, the change means fewer gaming videos on YouTube will be age-gated, allowing more people to see them.

In its update today, YouTube said that "scripted or simulated violent content found in video games will be treated the same as other types of scripted content." That means future uploads featuring gaming content may not be age-restricted even if they feature simulated or scripted violence. Exceptions remain, though: a video that focuses entirely on the most graphically violent part of a video game, for example, may still be age-gated.

The revised guidelines are broadly in line with the existing advertiser-friendly recommendations, which say that "violence in the normal course of video gameplay is generally acceptable for advertising, but montages where gratuitous violence is the focal point is not" - though that standard has been somewhat erratically enforced.
