Half of all Facebook moderators may develop mental health issues
In November 2018 I received a message that changed my life. A person working as a moderator for Facebook in Phoenix through a company called Cognizant asked to get on the phone and talk about some of what he was seeing there. His experiences shocked me, and after I wrote about what he and his colleagues were going through in The Verge, they would go on to shock a lot more people.
It was an office where moderators would have panic attacks while still in training, traumatized by daily exposure to gore and other disturbing posts. Where ever-shifting content policies, and demands for near-perfect accuracy, could make the job itself impossible. And where months of sifting through conspiracy theories led some moderators to embrace fringe viewpoints, walking through the building insisting that the earth is flat.
I wrote about the experiences of a dozen current and former moderators at the Phoenix site last February. A few months later, after hearing from employees that conditions at Cognizant’s Tampa site were even more grim, I traveled there and talked to a dozen more workers. There I learned of a stressed-out moderator who died of a heart attack at his desk at the age of 42. I learned of multiple sexual harassment suits that had been filed against various workers at the site. And I met three brave former moderators who violated their non-disclosure agreements to describe their working conditions on camera.
By then a lawsuit by a former moderator named Selena Scola, which accused Facebook of creating an unsafe workplace that had caused her mental health problems, was working its way through the courts. And on Friday, lawyers filed a preliminary settlement in the case. I wrote about it today at The Verge:
In a landmark acknowledgment of the toll that content moderation takes on its workforce, Facebook has agreed to pay $52 million to current and former moderators to compensate them for mental health issues developed on the job. In a preliminary settlement filed on Friday in San Mateo Superior Court, the social network agreed to pay damages to American moderators and provide more counseling to them while they work.
Each moderator will receive a minimum of $1,000 and will be eligible for additional compensation if they are diagnosed with post-traumatic stress disorder or related conditions. The settlement covers 11,250 moderators, and lawyers in the case believe that as many as half of them may be eligible for extra pay related to mental health issues associated with their time working for Facebook, including depression and addiction.
“We are so pleased that Facebook worked with us to create an unprecedented program to help people performing work that was unimaginable even a few years ago,” said Steve Williams, a lawyer for the plaintiffs, in a statement. “The harm that can be suffered from this work is real and severe.”
After a year of reporting on the lives of these moderators — I also profiled people who do the work for Google and YouTube — it seemed clear to me that some percentage of people who work as moderators will suffer long-term mental health consequences. But what is that percentage?
Last year I published some leaked audio from a Facebook all-hands meeting in which CEO Mark Zuckerberg acknowledged this range of experiences. “Within a population of 30,000 people, there’s going to be a distribution of experiences that people have,” Zuckerberg said, referring to the number of people Facebook has working on trust and safety issues around the world. “We want to do everything we can to make sure that even the people who are having the worst experiences, that we’re making sure that we support them as well as possible.”
One of the most interesting aspects of today’s news is that it begins to answer the question of how many moderators are affected. Designing a settlement required that lawyers for Facebook and the plaintiffs estimate how many people would make claims. And the number is much higher than I had imagined.
This lawsuit covers only people who have worked for Facebook through third-party vendors in the United States from 2015 until today, a group whose size is estimated to be 11,250 people. (A similar lawsuit is still pending in Ireland covering European workers.) Both Facebook and plaintiffs’ lawyers consulted with experts in post-traumatic stress and vicarious trauma. Based on those discussions, a lawyer for the plaintiffs told me, as many as half of the members of the class are expected to qualify for additional payments.
In other words, if you become a moderator for Facebook, a legal precedent suggests you have a one in two chance of suffering negative mental health consequences for doing the work.
Perhaps those odds will come down as Facebook implements some of the other changes it agreed to in the settlement, such as providing more counseling and offering workers tools to adjust the content that they’re viewing — turning it black and white, turning off audio by default, and so on. But the risk to human lives is real, and it’s not going away.
Another aspect to consider: how much will the average moderator get paid as a result of the settlement? The $52 million figure is less impressive when you consider that fully 32.7 percent of it has been earmarked for the lawyers in the case, leaving $35 million for everyone else.
The settlement was designed to compensate moderators in tiers. The first tier grants $1,000 to everyone, in the hopes that moderators use the money to get a mental health checkup from a doctor. For those who are either newly diagnosed or already have diagnoses, the settlement provides an additional $1,500 to $6,000 based on the severity of their cases. And then moderators can also submit evidence of distress suffered as a result of their work to win up to $50,000 per person in damages.
The sums could all be much smaller depending on how many members of the class apply and are found eligible for benefits beyond the first $1,000. If half the class were found eligible for additional mental health benefits and received equal compensation — which will not be the case but may be useful for ballpark-estimation purposes — there would be $4,222.22 available per moderator.
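For readers who want to check that ballpark figure, here is a minimal sketch of the arithmetic described above, assuming the numbers reported in this piece (roughly $35 million left after the 32.7 percent attorney share, an 11,250-person class, a $1,000 baseline payment, and half the class qualifying for more) and the same simplifying assumption of an equal split among eligible members:

```python
# Ballpark estimate of per-moderator compensation under the settlement,
# using the figures reported above. The equal-split assumption is a
# simplification for estimation only, not how the tiered payouts work.

fund_after_fees = 35_000_000     # roughly what remains after the 32.7% attorney share
class_size = 11_250              # moderators covered by the settlement
baseline_payment = 1_000         # minimum paid to every class member
eligible_fraction = 0.5          # share expected to qualify for additional payments

fund_after_baseline = fund_after_fees - class_size * baseline_payment
eligible_members = class_size * eligible_fraction
extra_per_eligible_member = fund_after_baseline / eligible_members

print(f"Fund after baseline payments: ${fund_after_baseline:,.0f}")       # $23,750,000
print(f"Extra per eligible moderator: ${extra_per_eligible_member:,.2f}")  # $4,222.22
```

The same arithmetic also makes clear how sensitive the per-person figure is to the eligibility rate: if far fewer than half of the class files successful claims, the available amount per eligible moderator rises accordingly, and vice versa.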
In my Twitter replies, lots of folks objected to the size of the payment, arguing it should have been much higher. Here, for example, is a person who called the settlement “a day’s worth of random market fluctuation in profits for Facebook.” I won’t argue here — many of these content moderation roles are essentially first responder jobs, not nearly as different from police officers and paramedics as you might think, and they deserve compensation and benefits more closely in line with the service they provide and the risks that they take.
I called up Shawn Speagle, a former Facebook moderator who worked at the Tampa site, to hear what he thought. Speagle, who was not involved in the lawsuit, worked for Cognizant from March to October 2018. During that time, he was exposed to videos of extreme violence and animal abuse on a near-daily basis, and he began to overeat and experience night terrors. After being fired, he was diagnosed with PTSD.
He said that a year of psychiatric care had helped him significantly with his symptoms, but also that the things he had seen continue to haunt him. “It’s been a very long ride,” Speagle told me Tuesday. “It’s been very difficult to forget about a lot of that stuff. You never do — it just sticks with you forever. Even though it was just seen over a screen, those lives are never coming back. I just wish that Facebook would recognize that.”
Speagle said that he sometimes felt embarrassed describing his PTSD to others, worrying they wouldn’t quite believe a person could develop the condition by reviewing Facebook posts. “There were a lot of times when it was humiliating,” he said. But psychiatrists helped him to understand that the phenomenon known as vicarious trauma — watching others experience pain — is real, and can be dangerous. He has since become a public school teacher.
I asked him what he thought of the payout he might now be eligible for.
“I would be fine getting no money,” Speagle told me. “I just wanted to bring this forward. When I did the job at Facebook, I was told I was making the world a better place for animals and young people. The reason I came forward was to stay true to that. Money and a lawsuit have nothing to do with what I did.”
The Ratio
Today in news that could affect public perception of the big tech platforms.
Trending up: Instagram is launching new features aimed at making the platform a more positive space. The company says it’s rolling out the ability to delete up to 25 comments at once and also block or restrict multiple accounts at the same time. (Ashley Carman / The Verge)
Trending up: Jack Dorsey gave $15 million to San Francisco’s coronavirus relief fund. The money will help undocumented and low-income households struggling with the pandemic. (Dominic Fracassa / San Francisco Chronicle)
Virus tracker
Total cases in the US: More than 1,354,300
Total deaths in the US: At least 80,600
Reported cases in California: 69,520
Total test results (positive and negative) in California: 991,897
Reported cases in New York: 342,267
Total test results (positive and negative) in New York: 1,204,651
Reported cases in New Jersey: 139,945
Total test results (positive and negative) in New Jersey: 425,933
Reported cases in Illinois: 79,123
Total test results (positive and negative) in Illinois: 442,425
Data from The New York Times. Test data from The COVID Tracking Project.
Governing
⭐ Facebook is working to launch a new political advocacy group that would combat regulators trying to rein in the tech industry. The move escalates Silicon Valley’s battle with Washington at a time when government officials are still threatening to break up large companies. Tony Romm at The Washington Post has the story:
The organization is called American Edge, and it aims through a barrage of advertising and other political spending to convince policymakers that Silicon Valley is essential to the U.S. economy and the future of free speech, according to three people familiar with the matter as well as documents reviewed by The Washington Post. The people spoke on the condition of anonymity to describe the group because it hasn’t officially been announced.
In December, American Edge formed as a nonprofit organization, and last month, it registered an accompanying foundation, according to incorporation documents filed in Virginia. The setup essentially allows it to navigate a thicket of tax laws in such a way that it can raise money, and blitz the airwaves with ads, without the obligation of disclosing all of its donors. Many powerful political actors — including the National Rifle Association — similarly operate with the aid of “social welfare” groups.
Facebook released a new report detailing how it uses a combination of artificial intelligence and human fact-checkers to enforce its community standards. The company is relying more heavily on AI for its moderation efforts. But some pieces of content, like memes and videos, can be difficult for AI to parse. (Nick Statt / The Verge)
Anti-lockdown protests in Australia were fueled by a conspiracy theory Facebook group. The members have suggested, falsely, that Bill Gates is behind the coronavirus pandemic, and chanted “arrest Bill Gates” at a recent rally. (Cameron Wilson / BuzzFeed)
Misinformation researcher Renée DiResta says she wishes tech companies would expand their definition of who counts as an authoritative source. In this Q&A she discusses how platforms are managing coronavirus misinformation and where they could be doing more. (Berkman Klein Center)
Coronavirus infection rates are spiking in heartland communities, according to an unreleased White House report. The news is at odds with President Trump’s declaration earlier this week that “all throughout the country, the numbers are coming down rapidly.” (Jonathan Allen, Phil McCausland and Cyrus Farivar / NBC)
Three days after a top White House aide tested positive for COVID-19, government officials working in the West Wing are being instructed to wear masks. The rule does not apply to President Trump, however. (Ashley Parker, Josh Dawsey and Philip Rucker / The Washington Post)
Here’s what it’s like being a contact tracer in the United States. These people are tasked with tracking down those who might have been infected with COVID-19. (James Temple and Bobbie Johnson / MIT Technology Review)
Nearly 40 percent of Icelanders are using a contact tracing app, the highest rate of any country. Despite the widespread use, experts say the impact of the app has been small compared with manual tracing techniques like phone calls. As we told you it would be a month ago. (Bobbie Johnson / MIT Technology Review)
The Federal Trade Commission indicated that it’s looking into privacy complaints related to Zoom. Lawmakers have been expressing concerns about how Zoom collects and stores user data. (Reuters)
How big would a hologram of Joe Biden have to be for every person in the continental US to see him? Roughly 1,400 miles tall, or 255 Mount Everests stacked on top of each other. If you’re not clear on why this is currently a subject of debate, you may want to remain uninvolved. (Makena Kelly / The Verge)
Industry
⭐ Jack Dorsey told Twitter employees they’ll be allowed to work from home forever, even after the coronavirus pandemic has passed. Some jobs that require physical presence, such as maintaining servers, will still require employees to come in. This is a good thing, but also a few years from now Twitter is definitely going to realize that it has several hundred employees on the payroll who have not worked for the company in ages. Here’s Alex Kantrowitz at BuzzFeed:
Twitter encouraged its employees to start working from home in early March as the coronavirus began to spread across the US. Several other tech companies did the same, including Microsoft, Google, and Amazon.
That month, Twitter human resources head Jennifer Christie told BuzzFeed News the company would “never probably be the same” in the structure of its work. “People who were reticent to work remotely will find that they really thrive that way,” Christie said. “Managers who didn’t think they could manage teams that were remote will have a different perspective. I do think we won’t go back.”
Twitter added former Google AI chief Fei-Fei Li to its board of directors. Li left Google after coming under fire for her role in the company’s military AI contract, which sparked employee protests and eventually led Google to abandon it. (Tyler Sonnemaker / Business Insider)
Facebook shut down Instagram Lite, its app aimed at emerging markets. The company — which has always characterized the app as a “test” — is planning to take what it has learned over these past years to develop a new version of the app. (Sarah Perez / TechCrunch)
Facebook News publishers are worried that the project might have slid out of the company’s top priorities due to the coronavirus pandemic. The COVID-19 Information Center in particular seems to have been rolled out to solve precisely the type of problem the News tab was originally intended to solve. (Max Willens / Digiday)
The coronavirus pandemic and subsequent shutdowns have brought back bartering as people trade goods in Facebook groups. (Rachel Lerman / The Washington Post)
In an interview that was immediately read and shared by every working journalist, Quibi co-founder Jeffrey Katzenberg blamed the streaming app’s rough start on the coronavirus pandemic. “I attribute everything that has gone wrong to coronavirus,” he said. “Everything.” Same, Jeff, same. (Nicole Sperling / The New York Times)
Hackers are impersonating Zoom, Microsoft Teams, and Google Meet for phishing scams. As significantly more people are using the tools during the COVID-19 pandemic, the domains could be used to trick people into downloading malware. (Jay Peters / The Verge)
Ordering food from local businesses is a nice way to support the restaurant industry in your area. But it’s a bit more complicated than you might think. One Grubhub user tried to order pizza from a supposedly local restaurant that turned out to be an undercover Chuck E. Cheese. For shame, Chuck E., for shame. (Jelisa Castrodale / Food and Wine)
Google Stadia is a lonely place. Google’s cloud streaming service feels like a virtual stadium that is still being built, and the crowds haven’t arrived. (Tom Warren / The Verge)
Charli and Dixie D’Amelio have left TikTok’s Hype House. The sisters say they don’t have any beef with the house members, though Charli and one of the founders did just break up. (Starr Bowenbank / Cosmopolitan)
LinkedIn launched a virtual events tool to let people create and broadcast online gatherings on the platform. (Ingrid Lunden / TechCrunch)
Things to do
Stuff to occupy you online during the quarantine.
Read the books these tech executives are into right now.
Try not to murder the person you live with over a petty gripe. Reading about others’ petty gripes might help.
Listen to this Vergecast episode with me, Verge editor in chief Nilay Patel, and former Facebook chief security officer Alex Stamos. We talked about fixing Zoom, misinformation on social platforms, election security, and much more.
Attend a three-day virtual festival on Houseparty with your friends. The lineup is very good and includes Keegan-Michael Key, Tinashe, Zooey Deschanel, and Doja Cat.
Those good tweets
I think I speak for everyone when I say that with each passing day in quarantine, I feel mentally, physically, and emotionally stronger
— Bridger Winegar (@bridger_w) May 9, 2020
Companies be like “ Now more than ever , we will let you buy our product”
— cancela lansbury (@gossipbabies) May 5, 2020
And finally …
This is just another tweet, but I wanted it to have a place of honor today.
Talk to us
Send us tips, comments, questions, and class action settlements: casey@theverge.com and zoe@theverge.com.