Former Facebook content moderators are seeking compensation for psychological injuries they claim to have suffered as a direct result of exposure to extreme online content at work.
Several employees have started legal action against Facebook, first in California and now in Ireland, where Facebook has its EMEA headquarters.
In September 2019, the Personal Injuries Assessment Board in Ireland gave the go-ahead for former employees to take their case against Facebook to the High Court. The legal action commenced on December 4, 2019, against Facebook and CPL Resources, one of the third-party companies Facebook uses to provide its content moderators. Former Facebook content moderator Chris Gray is named as lead plaintiff.
Facebook content moderators perform a vital job for the social media platform. The job involves viewing content posted by Facebook users and determining whether it should remain on the social network or be filtered out or deleted. Without their efforts, the platform would be awash with extreme content.
According to Facebook’s Community Standards Enforcement Report, in the first quarter of 2019 the company removed 5.4 million posts that violated its standards on child sexual abuse and exploitation and a further 33.6 million posts depicting violent and graphic content. That content must be manually reviewed by an army of content moderators.
Facebook content moderators are often paid little more than minimum wage, and working conditions are difficult. Many struggle with the job due to the pressure to meet targets and the relentless stream of extremely disturbing content they must moderate. Facebook maintains that, given the nature of the job, its content moderators are provided with access to support services and wellness resources. The Facebook content moderators in Ireland tell a different story: they say they were not properly trained to deal with the content they saw and lacked necessary support, such as access to counselors and mental health services, both on the job and after leaving.
Chris Gray claims his job involved repeated exposure to graphic and often violent content, much of it extremely disturbing. Gray was employed by CPL Resources as a contractor for 10 months between 2017 and 2018 and claims he suffered psychological injuries as a result of the relentless stream of images and videos he had to view on a daily basis. He was later diagnosed with PTSD.
Gray faced a barrage of highly distressing material every day and had to make decisions on each item. He was exposed to a wide range of extreme content, including stonings, stabbings, beatings, beheadings, child abuse videos, animal torture, and extreme sexual content, including bestiality and child pornography.
For instance, he viewed people being shot at point-blank range with machine guns, videos of the massacre of the Rohingya people in Myanmar, and footage of the torture and abuse of migrants from Libya. The extreme content was relentless.
Viewing such extreme content left Gray numb and desensitized, and it started to have a major impact on his life. He found his personal and political views changing “in a slow creep,” and he experienced extreme emotions, ranging from oversensitivity to irritability and anger. He became increasingly aggressive and argumentative outside of work.
There was no respite even in sleep, as Gray found himself dreaming about some of the things he had seen at work. The situation became so bad that he could no longer discuss his concerns and struggles with his superiors in a reasonable, calm, and professional manner.
On top of the content he had to view, he faced extreme pressure to ensure content was correctly categorized. Facebook demanded a 98% accuracy rate on audited decisions; as Gray explained, that left room for just four misclassifications a month. The pressure of maintaining that level of accuracy, combined with the huge volume of content he was required to assess, also affected his mental health and stress levels, disrupting his sleep. He often woke frightened that he had made a mistake at work.
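As a rough back-of-envelope check, if the four-mistakes figure refers to a monthly audit of a moderator's decisions (an inference from the reported numbers, not a detail stated in the case), the 98% target would imply an audited sample of roughly 200 decisions a month:

```python
# Illustrative sketch only: the audit-sample size below is inferred
# from the reported figures, not stated in the case.
ACCURACY_TARGET = 0.98   # Facebook's reported quality bar
ALLOWED_MISTAKES = 4     # permitted misclassifications per month, per Gray

error_budget = 1 - ACCURACY_TARGET             # 2% of audited decisions
implied_sample = ALLOWED_MISTAKES / error_budget

print(f"Implied audited decisions per month: {implied_sample:.0f}")
# -> Implied audited decisions per month: 200
```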
A spokesperson for Facebook said, “reviewing certain types of content can sometimes be difficult,” but maintained that all staff had been provided with extensive training and that all content moderators were given full-time support. Measures had also been implemented to limit exposure to graphic, extreme content as far as was possible. However, Gray claims he was not given adequate support or training.
He also claims there is no screening process to determine whether candidates are suited to the job and whether viewing such extreme content is likely to affect their mental well-being. He says there was also a lack of on-the-job monitoring to identify individuals struggling to cope with the content, the stress of the workload, or the working conditions.
Gray is not an isolated example. Sean Burke, another former Facebook content moderator, told Vice in an interview, “My first day on the job, I witnessed someone being beaten to death with a plank of wood with nails in it and repeatedly stabbed.” At least a dozen former Facebook content moderators are taking legal action against Facebook in Ireland.
This is the first case of its type to go before a European court, but it is unlikely to be the last given the number of individuals employed in the role. Gray’s legal team says he is one of around 15,000 individuals employed as Facebook content moderators around the world through third-party companies.
Many former Facebook content moderators are now speaking out about the poor working conditions, the extreme pressure to reach targets, and the psychological effects of viewing extreme content day in, day out. Some have had to take antidepressants to cope; others speak of abusing alcohol to help them sleep and block out the images that plague them at night; and several have been diagnosed with PTSD.
One of the reasons so few former employees have spoken out is that they have signed non-disclosure agreements. Violating the terms of an NDA could result in legal action and would make it difficult to find other work in the tech industry. Those NDAs also place the mental health of employees at risk: feeling unable to talk about their work even with friends and family, they end up bearing the burden alone.
Gray and the other plaintiffs are seeking compensation for psychological distress, but they also want Facebook to take action to prevent others from suffering similar injuries. They want to ensure that working conditions change, exposure to extreme content is limited, better support is provided and, given the nature of the work, greater care is taken in selecting the right individuals for the job.
Gray’s legal team is attempting to compel Facebook to provide data on the content employees have been exposed to and the volume of extreme content they had to moderate on a daily basis. If that information is disclosed, which is likely in Ireland, Facebook could well be forced to pay considerable compensation to its content moderators. Another question that will need to be answered is who at Facebook knew the job was causing PTSD, and what, if anything, was being done to address those injuries in the workplace.
As more people speak out and the case receives wider press coverage, the number of individuals seeking content moderator compensation is expected to grow, not just in Ireland but throughout Europe. Gray’s legal team is already liaising with groups of content moderators in Barcelona and Berlin, and has also heard from former Facebook content moderators in Sweden who are interested in seeking compensation for psychological injuries sustained through their work.