Google Brings Back Human Moderators for YouTube Content

Joanna Estrada
September 24, 2020

The former moderator now claims to suffer from anxiety, panic attacks, and a fear of open spaces where mass shootings might occur, and says she is unable to be in crowded places.

The AI systems were programmed to err on the side of caution, which led to content that merely came close to breaking YouTube's rules being removed from the site.

A large number of videos were removed from YouTube because of this heavy reliance on AI models for moderation.

YouTube is being accused of not doing enough to safeguard the mental health of its content moderators.

The former employee is seeking medical treatment for the psychological trauma she has endured, compensatory damages for her injuries, and an award of attorney's fees, according to the lawsuit.

According to CNBC, the woman, identified in the lawsuit as "Jane Doe", worked in 2018 and 2019 as a contractor for Collabera, a firm contracted by YouTube. Neither Google nor Collabera responded to requests for comment.

YouTube said that before year's end it will increasingly use machine learning to identify which videos contain content that should be placed behind an age-restriction wall. These automated systems also had the authority to take videos down entirely, including content such as hate speech and misinformation.

YouTube, which is owned by Google, also allegedly decides whether moderators view blurred content and how long they must watch it as part of their review process, meaning employees often do not know what they will have to look at. Earlier this year, The Washington Post wrote about a former Facebook moderator who sued the company; in May, Facebook agreed to pay $52 million to ex-moderators who said they had developed PTSD.

The suit specifically alleges that YouTube, in violation of California law, failed to create a safe work environment and did not properly consider the mental health needs of employees handling such graphic content. The reliance on automation, meanwhile, caused more videos to be deleted than necessary. Moderators themselves must pass a test in which they determine whether certain content violates YouTube's rules.

According to the lawsuit, YouTube also did not do enough to support these employees after they started the job. The company allows workers to speak with wellness coaches, but the coaches have no medical expertise and are not available to moderators who work at night. The plaintiff says one coach advised her to cope by taking illegal drugs, while another told her to "trust in God", the suit says. In a statement on its plans, YouTube said: "Going forward, we will build on our approach of using machine learning to detect content for review, by developing and adapting our technology to help us automatically apply age-restrictions".

If content has been labeled as age-restricted, users will need to sign in to YouTube.

The moderators are also expected to maintain an "error rate" of between two and five per cent across the 100 to 300 pieces of content they review daily, the suit claims.
