
YouTube is the cesspool of the internet as recent scandals around bigoted videos – and their recommendations – have shown

I’d never heard of Carlos Maza until this month, when the Vox video producer made a supercut of all the homophobic and racist abuse directed at him by a popular YouTuber called Steven Crowder.

The right-wing pundit, who has 3.8m subscribers to his YouTube channel, has attacked Maza repeatedly, calling him an “anchor baby, a lispy queer, [and] a Mexican”.

Amazingly, given the long-standing vitriol, racism and homophobia, YouTube decided this did not contravene its “community” standards. These supposedly prohibit content that is “deliberately posted in order to humiliate someone, make hurtful and negative personal comments/videos about another person, or incite others to harass or threaten individuals on or off YouTube”.

But clearly, they do not, because the videos have remained online. It’s the latest, and most high-profile, case of YouTube’s bizarre response to abuse on its platform, where hate speech runs rife.

As I have written before, YouTube is the cesspool of the internet. It’s filled with vicious and callous denial of historically and scientifically verified truths. The Holocaust happened, as did the Sandy Hook and Parkland school shootings. Don’t even get me started on the anti-vaxxers, who care more for discredited conspiracy theories than their own children’s health.

And yet, YouTube allows content portraying these patent falsehoods to remain on its site.

Why? I suspect because people watch that drivel. And believe it. And the more controversial, the more they watch it and the more advertising YouTube can show its viewers. It’s a vicious circle created by YouTube’s own business model. Stopping the abuse is directly at odds with its own rationale for revenue generation.

People like Crowder, and Alex Jones before him, and countless others have been able to continue their factually inaccurate, morally repugnant ranting to large audiences, seemingly because of the volume of advertising these “alt-right” fascists’ videos attract. How else do you explain YouTube’s slow response?

As if this isn’t bad enough, this month also saw some scary revelations about how YouTube’s algorithms serve up innocent children’s videos to paedophiles. Researchers found that, in some cases, the algorithms recommended these kids’ clips after people had watched sexually themed videos. That’s right: YouTube dug them up from its own archive and played them.

Makes you think twice about putting your kids’ videos online, doesn’t it?

This follows another outrage last month, when it emerged that the video site’s comments section was rife with paedophiles commenting on otherwise innocuous footage.

Worse still, YouTube’s bosses knew about it and did nothing, according to a Bloomberg investigation in April.

In a blog post last week about its “ongoing work to tackle hate”, YouTube said it would update its “hate speech policy by specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status”. This includes videos that glorify Nazi ideology and those that deny “well-documented violent events” like the Holocaust or the Sandy Hook shooting.

Time will tell if this new moral code trumps its advertising-driven business model.

I’d like to remind everyone that Google’s original mantra was: “Don’t be evil”.

How the mighty have fallen.

This column first appeared in Financial Mail
