Google says 10,000 staff will moderate content

More than 10,000 staff will moderate YouTube videos in 2018

By Alexander J Martin, Technology Reporter

Google says it will have more than 10,000 members of staff monitoring content on platforms including YouTube next year.

That number covers teams across Google and includes not only content reviewers but also the company's engineers, lawyers and operations staff.

YouTube has been criticised for failing to adequately safeguard children and for allowing extremist content, including Islamic terror-related and white supremacist videos, to be shared.

Hundreds of accounts which had posted lewd comments beneath benign videos, such as content children had uploaded of themselves performing gymnastics, have also been suspended.

Although Google, which is YouTube's parent company, employs machine learning algorithms to automatically flag videos which may breach its rules, the ultimate decision to remove content is made by humans.

In a statement from YouTube's chief executive, Susan Wojcicki, the company claimed to have reviewed almost two million videos and removed 150,000 since June.

In August, YouTube was criticised for deleting video evidence relating to potential war crimes in Syria as part of its work to remove terrorist content and propaganda from the platform.

A range of private and public sector organisations suspended their advertisements from YouTube in March amid concerns they were appearing beside inappropriate content.

Ms Wojcicki said she had seen how YouTube's open platform "has been a force for creativity, learning and access to information" and how it had been used by activists to "advocate for social change, mobilise protests, and document war crimes".

"I've also seen up-close that there can be another, more troubling, side of YouTube's openness. I've seen how some bad actors are exploiting our openness to mislead, manipulate, harass or even harm," she warned.

According to the statement, 98% of the videos that YouTube removes for violent extremism are flagged by its machine-learning algorithms, and nearly 70% of these are removed within eight hours of upload.

The machine-learning classifiers YouTube uses to identify violent content are more sophisticated than those it uses to spot inappropriate content involving children.

Google said it remains committed to using humans – who are good at judgement, understanding context and nuance – to tackle these issues.

Source – News.sky.com
