In a “Policy enforcement update” posted earlier today, YouTube announced a series of changes to the site’s content moderation, most of them centred on a new distinction between actual real-world violence and the “scripted or simulated violence” seen in video games.
Starting today, “scripted or simulated violent content found in video games will be treated the same as other types of scripted content,” rather than as depictions of actual violence. Uploads made in the future that depict video game violence may also be “approved instead of being age-restricted.”
This basically means that fewer videos will be age-gated going forward for simply showing scenes of video game violence. There is a catch, though: clips focusing on a gory part of a game, or where violence is the sole focus, may still be age-restricted.
YouTube’s advertising policy, meanwhile, remains exactly the same, so violent videos can still have their ability to run ads limited.