Ex-Facebook content moderator claims disturbing images gave her PTSD; sues

Desiree Burns
September 25, 2018

A former Facebook employee is suing the company, saying her job as a content moderator left her with lasting psychological trauma and post-traumatic stress disorder, according to Business Insider.

In a statement, Facebook said it takes "the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources". The lawsuit alleges negligence and failure to maintain a safe workplace on the part of Facebook and Pro Unlimited. "Instead, the multibillion-dollar corporation affirmatively requires its content moderators to work under conditions known to cause and exacerbate psychological trauma", the suit alleges.

The lawsuit does not go into further detail about the particular experience of the plaintiff, Selena Scola, because she signed a non-disclosure agreement that limits what employees can say about their time on the job.

"From her cubicle in Facebook's Silicon Valley offices, Ms. Scola witnessed thousands of acts of extreme and graphic violence," the court documents read.

The job of content moderator is notoriously unpleasant, and at Facebook is usually handled by contract workers.



The woman reportedly worked for Facebook through a third party contracting company, Pro Unlimited, Inc., which is based in Boca Raton, Fla.

Facebook is under intense scrutiny from lawmakers, who have taken its top executives to task in two high-profile Capitol Hill hearings this year and are considering new regulations that would hold social media companies to a more stringent standard of responsibility for illegal content posted on their platforms.

Why would Facebook be liable?

Facebook's director of corporate communications, Bertie Thomson, said in an emailed statement that the company is "currently reviewing this claim" and recognizes that "this work can often be hard". The lawsuit counters: "By requiring its content moderators to work in risky conditions that cause debilitating physical and psychological harm, Facebook violates California law".

"Our client is asking Facebook to set up a medical monitoring fund to provide testing and care to content moderators with PTSD", said Steve Williams, one of Scola's lawyers from the Joseph Saveri Law Firm, in the release.
