Facebook sued for exposing content moderators to Facebook

Updated A Facebook contractor hired to keep the network free of beheadings, rape, torture and the like has sued the tech giant and its contracting firm, Pro Unlimited, for allegedly failing to protect workers from the psychological trauma arising from exposure to disturbing content.

The complaint, filed on behalf of San Francisco resident Selena Scola on Friday in a San Mateo County court in California, describes content moderation work at Facebook as witnessing a stream of beheadings, every minute of every day.

Facebook, the complaint explains, employs thousands of contractors to sanitize the millions of videos, images and livestreamed broadcasts sent to the site every day. Many of these contain particularly gruesome, upsetting content.

“As a result of constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace, Ms Scola developed and suffers from significant psychological trauma and post-traumatic stress disorder,” the complaint says.

Psychological trauma appears to be a common affliction for those exposed to images of graphic violence and abuse. Last year, former members of Microsoft’s Online Safety Team sued the software giant over similar stress.

In its most recent Transparency Report, Facebook said it removed 3.4 million pieces of graphic violence content in Q1 2018, up from 1.2 million in Q4 2017. The Silicon Valley titan attributes the increase to improved detection technology (~70 per cent), a fix for a bug that had prevented graphic photos from receiving warning banners (~13 per cent), and a surge of content depicting graphic violence (~17 per cent).

The biz says its systems find about 86 per cent of graphic violence uploaded to the site and users report the other 14 per cent. Its hate speech detection algorithms are less adept, catching only 38 per cent of violating content, with users alerting the site 62 per cent of the time.

The complaint says Facebook moderators are asked to review 10 million posts for potential rules violations each week.

Small army of human filters

According to CEO Mark Zuckerberg, Facebook expects to have 20,000 people handling security and content review by the end of this year, up from 15,000 in April. The complaint says about 7,500 presently handle content review.

The lawsuit, alleging negligence and failure to maintain a safe workplace, does not go into specifics about what Scola saw “because Ms Scola fears that Facebook may retaliate against her using a purported non-disclosure agreement.”

Facebook, the lawsuit says, actually helped develop some workplace safety standards to protect content moderators. The social ad biz, along with Adobe, Apple, Dropbox, GoDaddy, Google, Kik, Microsoft, Oath, PayPal, Snapchat, and Twitter, is part of a group called the Technology Coalition, which in 2013 and 2015 released reports titled “Employee Resilience Guidebook for Handling Child Sex Abuse Images.” The publications describe how companies should help employees deal with the emotional distress of exposure to child abuse images.

However, according to the complaint, Facebook doesn’t take its own advice.

“Facebook ignores the workplace safety standards it helped create,” the complaint says. “Instead, the multibillion-dollar corporation affirmatively requires its content moderators to work under conditions known to cause and exacerbate psychological trauma.”

The Register asked Facebook for comment, and we’ve not heard back. ®

Updated to add

In a statement provided to The Register after this story was filed, Bertie Thomson, director of corporate communications, said:

“We are currently reviewing this claim. We recognize that this work can often be difficult. That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources.”

“Facebook employees receive these in house and we also require companies that we partner with for content review to provide resources and psychological support, including onsite counseling (available at the location where the plaintiff worked) and other wellness resources like relaxation areas at many of our larger facilities,” Thomson concluded.
