A Blog by Jonathan Low

 

Sep 27, 2018

Ex-Content Moderator Sues Facebook, Claiming Viewing Violent Images Caused PTSD

The line between life online and off continues to narrow. The dispute here gets to the larger question of what responsibility internet platforms have to manage the information on their sites - and to protect both their own workforce and society at large.

Social and legal norms are playing catch-up, but there appears to be a growing consensus that 'something must be done.' The issue is whether that inchoate 'something' can be defined, agreed upon and legislated in ways that meet the approval of the various commercial and individual constituencies involved - and whether society is willing to accept trade-offs that may limit the speed, convenience and universality that the internet has provided in this first generation of its existence. JL


Sandra Garcia reports in the New York Times:

A former content moderator who worked on contract for Facebook has filed a lawsuit saying that being bombarded with thousands of violent images on her computer led her to develop post-traumatic stress disorder. The former moderator argues Facebook failed to protect her and other contractors as they viewed distressing videos and photos of rapes, suicides, beheadings. “You’d go into work every morning, turn on your computer and watch someone have their head cut off. Every day, every minute, that’s what you see. Heads being cut off.”
A former content moderator who worked on contract for Facebook has filed a lawsuit against the company saying that being bombarded with thousands of violent images on her computer in Silicon Valley led her to develop post-traumatic stress disorder.
The former moderator, Selena Scola, argues that Facebook failed to protect her and other contractors as they viewed distressing videos and photographs of rapes, suicides, beheadings and other killings, according to the complaint, filed on Friday in San Mateo County Superior Court.
Ms. Scola, who worked on behalf of the company for nine months, said in the complaint that her post-traumatic stress disorder was set off “when she touches a computer mouse, enters a cold building, watches violence on television, hears loud noises or is startled.”
Facebook’s 7,500 moderators around the world sift through 10 million potentially rule-breaking posts per week, the lawsuit says.


The company relies on its two billion users to report inappropriate content. The moderators then employ the hundreds of rules Facebook has developed to determine if the content violates its policies.
“We recognize that this work can often be difficult,” Bertie Thomson, the director of corporate communications at Facebook, said in a statement. “That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources.”
Ms. Scola is urging Facebook to establish a fund to create a testing and treatment program through which current and former content moderators — including moderators employed by a third party — can receive medical testing and monitoring, including psychiatric treatment. She is also asking that her legal fees be paid by Facebook.
Lawyers for Ms. Scola said their client was not currently giving interviews.
“What can cause PTSD has been the subject of countless articles and speculation long before it was an official diagnosis,” said Dr. Elspeth Cameron Ritchie, a psychiatrist and retired Army colonel who was formerly a senior Pentagon adviser on mental health issues. “In the vast majority of people, just seeing violent images is not enough, but in some people it could be.”
“People who operate drones and watch it blow things up suffer from PTSD even though they are not in the same room,” she added. “I’m not saying it could not happen, but we don’t see it very much.”
Facebook employees receive psychological support in-house, according to Ms. Thomson.
“We also require companies that we partner with for content review to provide resources and psychological support, including on-site counseling — available at the location where the plaintiff worked, and other wellness resources like relaxation areas at many of our larger facilities,” Ms. Thomson said in a statement.
In May 2017, Mark Zuckerberg, Facebook’s chief executive, acknowledged that violent content was a problem for the company. He pledged to hire 3,000 additional people to more closely moderate what was posted on Facebook and said the company was creating more tools to simplify how users report content. “If we’re going to build a safe community, we need to respond quickly,” he said.
Violent acts that have been disseminated on Facebook include:
• A father’s killing of his 11-month-old daughter in April in Thailand, which he live-streamed before hanging himself
• The suicide of a 14-year-old who lived in a foster home in Florida
• A Minnesota police officer fatally shooting Philando Castile
Many content moderators have publicly discussed how difficult their job can be.
“You’d go into work at 9 a.m. every morning, turn on your computer and watch someone have their head cut off,” a man who chose to remain anonymous but was quoted in the lawsuit told The Guardian last year. “Every day, every minute, that’s what you see. Heads being cut off.”
