
TikTok sued by content moderator who claims to have developed PTSD while reviewing disturbing content

(CNN) – A content moderator for TikTok is suing the social media platform, saying she developed psychological trauma as a result of her job, which she said required her to review videos featuring graphic violence, conspiracy theories and other disturbing imagery.

Candie Frazier, a Las Vegas-based contractor who moderates content for TikTok parent company ByteDance, alleges that she and other content moderators often spend 12 hours a day reviewing disturbing content. She claims TikTok and ByteDance fail to provide adequate protections and psychological support to content moderators, according to the complaint.

“Plaintiff Frazier viewed videos of the Myanmar genocide, mass shootings, raped children and mutilated animals,” the complaint states. “Due to constant, unmitigated exposure to highly toxic and extremely disturbing images in the workplace, Ms. Frazier has developed and suffers from significant psychological trauma, including anxiety, depression and posttraumatic stress disorder.”

The proposed class action lawsuit, filed last week in federal court in California, is likely to intensify scrutiny of problematic content and moderation practices at TikTok. The short video platform had previously flown under the radar compared with bigger rivals such as Facebook and YouTube, but has drawn attention from critics and lawmakers in recent months after exploding in popularity, especially among young people, during the pandemic. The company said in September that it had reached 1 billion monthly users.

A spokesperson for TikTok said the company does not comment on pending litigation.

“We strive to promote a caring working environment for our employees and contractors,” the spokesperson said. “Our safety team partners with third-party companies for the essential work of helping protect the TikTok platform and community, and we continue to expand a range of wellness services so that moderators feel supported mentally and emotionally.”

Frazier is not an employee of TikTok or ByteDance; instead, she works for a Canadian company called Telus International, which provides content moderation workers to TikTok and other social media platforms. But Frazier alleges in the lawsuit that her work is dictated and supervised by TikTok and ByteDance. A spokesperson for Telus, which is not named as a party to the lawsuit, said Frazier never raised concerns about her work and that “her claims are grossly inconsistent with our policies and practices.”

“We have a robust resilience and mental health program in place to support all of our team members, as well as a comprehensive benefits package for access to personal health and wellness services,” the Telus spokesperson said. “Our team members can raise questions and concerns about any aspect of their work through multiple internal channels, all of which the company takes very seriously.”

Facebook faced a similar lawsuit in 2018 from a content moderator who said he developed PTSD after being exposed to content featuring rape, suicide and violence at work. Among the criticisms Facebook faced over its content moderation practices was the fact that moderation contractors did not receive the same benefits as corporate employees, despite being tasked with such a trying job. The social media giant ultimately agreed to a $52 million class action settlement, which included payments and funding for content moderators' mental health treatment, as well as workplace changes.

A TikTok executive testified on Capitol Hill for the first time in October and acknowledged the need to increase protections for young users on the platform. “We seek to earn trust through a higher level of action, transparency and accountability, as well as the humility to learn and improve,” Michael Beckerman, TikTok's vice president and head of public policy, told a Senate subcommittee. But Frazier's lawsuit may point to the challenges of improving those protections.

The complaint alleges that problematic content is reviewed by moderators only after it has been uploaded to the platform and reported by a user. Due to the sheer volume of content they are tasked with reviewing, moderators have only 25 seconds to review each video and watch “three to ten videos at the same time,” the complaint says. (TikTok did not immediately respond to a request for comment regarding these allegations.)

“These videos include cruelty to animals, torture, suicides, child abuse, murder, beheadings and other graphic content,” according to the complaint. “The videos are each sent to two content moderators, who review the videos and determine whether the video should remain on the platform, be removed from the platform, or have its audio muted.”

Theo Bertram, then TikTok’s public policy director for Europe, the Middle East and Africa, told British lawmakers in September 2020 that the company had 10,000 people working on its “trust and safety” team worldwide. Earlier this year, TikTok also launched an automated moderation system to analyze and remove videos that violate its policies at the point of upload, though the feature is only available for certain categories of content.

The system handles “the categories of content where our technology has the highest degree of accuracy, starting with violations of our policies on minor safety, adult nudity and sexual activities, violent and graphic content, and illegal activities and regulated goods,” a July blog post from Eric Han, TikTok’s head of US safety, reads. “We hope this update also supports resilience within our safety team by reducing the volume of distressing videos moderators view and enabling them to spend more time in highly contextual and nuanced areas.”

TikTok says 93% of the violating videos removed between April and June 2021 were taken down within 24 hours of being posted, the majority of which had no views and were flagged by its automated system rather than reported by a user, according to a Community Guidelines Enforcement Report published in October. (TikTok has not commented on Frazier’s claim that content moderators only review videos after they have been flagged by a user.)

Frazier also alleges that content moderators are required to sign nondisclosure agreements that “exacerbate the harm” caused by the job, according to the complaint. The practice of requiring workers to sign nondisclosure agreements has recently come under criticism in the tech industry amid employee disputes at Pinterest, Apple and other big tech companies. TikTok did not immediately respond to a request for comment regarding its NDA practices.

With the lawsuit, Frazier seeks to have TikTok pay damages (in an amount to be determined later) to herself and other content moderators, and to establish a “medical monitoring fund” to cover the cost of screening, diagnosing and treating the psychological issues of such workers, according to the complaint.

The-CNN-Wire
™ & © 2021 Cable News Network, Inc., a WarnerMedia Company. All rights reserved.