
Industry News
2020/03

YouTube Says Automated Bots Will Remove Content That Doesn’t Violate Community Guidelines

YouTube is reportedly cutting back on in-office staff on its content review team, and the company admits that, with automated bots taking over the review process, more content than ever could be removed, including content that doesn’t even violate its guidelines.

On March 16th, 2020, the Team YouTube Twitter account posted the following messages.

Many people called out YouTube over this because, as CEO Susan Wojcicki has admitted, the company uses “raters” for content curation, and those raters work remotely. In other words, there’s no reason to cut back on the staff rating content as “inappropriate” or “borderline” if they don’t even work from YouTube’s main offices.

For those of you who don’t remember when she made these comments, it was during a Q&A with Re/Code that took place back on June 11th, 2019. The relevant portion of the quote is below, where Wojcicki stated…

“We offer – as you know and you’ve researched – a broad range of opinions. And we have looked in all of these different areas across the board and we see that we are offering a diversity of opinions. So when people go and they look up one topic – whether it’s politics, or religion, or knitting – we’re going to offer a variety of other content associated with it.

“But we have taken these radicalization concerns very seriously, and that’s why at the beginning of January we introduced some new changes in terms of how we handle recommendations. I know you’re familiar with them; you’ve referenced them. But what we do – just for [a bit] of background for other people – is that we basically have an understanding of what’s content that’s borderline. When that content is borderline we determine that based on a series of different interv– [cuts off] raters that we use that are representative of different people across the U.S., and when we determine that we’re able to build a set of understanding of what’s considered borderline content and then reduce our recommendations.

“So we have reduced by 50% the recommendations that we’re making of borderline content. And we’re planning to roll that out to all countries – to 20 more countries – this rest of the year.”

What’s more, these raters affect a channel’s P-Score, a hidden metric that YouTube uses to determine whether a channel’s content is demonetized, whether it’s flagged for automated community guideline violations, or whether it’s put on the chopping block for termination.

While many Centrists™ are in the Twitter thread sucking up to YouTube and trying to stay on good terms with their corporate slavers, at least Styxhexenhammer666 called a spade a spade and publicly reminded people that YouTube – as admitted by Wojcicki – employs remote raters to review content.

YouTube is essentially throwing everyone for a loop, and most people seem to begrudgingly accept its response.

Most people should be holding YouTube’s feet to the fire, since – as pointed out by a few people in the thread, like JackSepticEye – there are people who make a living from YouTube, and having that revenue cut off at the whim of a bot on a censorship rampage is a scary thing to have to deal with.

Smart content creators will have already built up an audience on alternative platforms like BitChute or DLive, because relying on YouTube for your livelihood is like playing Russian roulette blindfolded while your opponent keeps adding bullets to the chamber.

(Thanks for the news tip, Clownfish TV)
