If you ever upload anything to YouTube (and someone's doing it, given how much video gets added daily), you'll have noticed in recent months that the service has been pressing you to declare whether your content is kid-friendly. The point, as YouTube explains it, is to let creators restrict content themselves, since… well, you know what you put in the video. Now, though, YouTube is scaling things up a little by also letting an AI make the call on age-restricting videos.
YouTube said that before year's end it will increasingly use machine learning to identify which videos contain content that should be stuck behind a wall. A wall that requires a signed-in YouTube account belonging to someone over eighteen to climb over, so not a terribly large wall. But it's still a hurdle that needs to be cleared.
But if your video is flagged as needing an age restriction and you believe it's an error, you'll be able to appeal the decision. YouTube hasn't detailed how that will work or how effective it will be, but the appeals process is bound to get a workout: machine-learning methods, no matter how good they are, tend to make mistakes. This job is currently done by humans on the company's Trust and Safety team; they'll likely move over to checking on the AI as it locks videos up at high speed.
The automated age restrictions should have limited impact on people who make a living uploading content to the service. Those users tend to know what they're doing and avoid uploads of the sort that could kill their revenue streams. “For creators in the YouTube Partner Program, we expect these automated age-restrictions to have little to no impact on revenue, as most of these videos also violate our advertiser-friendly guidelines and therefore have limited or no ads,” the service said. So everything should be fine on that front. Unless, that is, the automated system makes a few monumental stuff-ups.