Selena Scola scrolled through Facebook like millions of other people, except it was her job -- and it made her sick.
She worked as a content moderator, looking for objectionable content to flag for removal. What she saw -- "videos, images and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder" -- caused her psychological trauma and post-traumatic stress disorder.
In a new lawsuit filed in California, Scola alleges the social media giant is to blame for negligently failing to maintain a safe workplace. She's not the only one; it's a class action.
"Keeping People Safe"
CEO Mark Zuckerberg has tried to keep Facebook family-friendly, employing up to 7,500 content reviewers to ensure it is "keeping people safe on Facebook." The company pledged to double the number of people working on safety and security to 20,000 this year.
But it's too late for Scola and other content moderators, the suit alleges. They want Facebook to fund a medical monitoring program to screen content moderators for psychological injuries.
"Facebook is ignoring its duty to provide a safe workplace and instead creating a revolving door of contractors who are irreparably traumatized by what they witnessed on the job," said attorney Korey Nelson.
According to the plaintiff's firm, Facebook and other internet service providers established standards for training, counseling, and supporting content moderators more than a decade ago. The lawsuit claims Facebook doesn't follow those standards.
Short-Term, Big Impact
Scola worked at Facebook for nine months through a staffing company, Pro Unlimited. Both companies are named as defendants in the lawsuit.
Nelson, with Burns Charest, said his client "witnessed thousands of acts of extreme and graphic violence" from her cubicle.
The attorneys filed their complaint, Scola v. Facebook, in the Superior Court for the County of San Mateo.