
A Leaked Memo Shows TikTok Knows It Has a Labor Problem

Jul 21, 2023 7:00 AM

Internal TikTok documents reveal that the company is preparing for scrutiny of its supply chain. This comes after a Kenyan judge cleared the way for moderators to sue Meta.

TikTok logo on headquarters building in Culver City, California. Photograph: AaronP/Getty Images

Last month, a court in Kenya issued a landmark ruling against Meta, owner of Facebook and Instagram. The US tech giant was, the court ruled, the “true employer” of the hundreds of people employed in Nairobi as moderators on its platforms, trawling through posts and images to filter out violence, hate speech, and other shocking content. That means Meta can be sued in Kenya for labor rights violations, even though moderators are technically employed by a third-party contractor.

Social media giant TikTok was watching the case closely. The company also uses outsourced moderators in Kenya, and in other countries in the global south, through a contract with Luxembourg-based Majorel. Leaked documents obtained by the NGO Foxglove Legal, seen by WIRED, show that TikTok is concerned it could be next in line for possible litigation.

“TikTok will likely face reputational and regulatory risks for its contractual arrangement with Majorel in Kenya,” the memo says. If the Kenyan courts rule in the moderators’ favor, the memo warns, “TikTok and its competitors could face scrutiny for real or perceived labor rights violations.”

The ruling against Meta came after the tech company tried to get the court to dismiss a case brought against it and its outsourcing partner, Sama, by South African moderator Daniel Motaung, who was fired after trying to form a union in 2019.

Motaung said the work, which meant watching hours of violent, graphic, or otherwise traumatizing content daily, left him with post-traumatic stress disorder. He also alleged that he hadn’t been fully informed about the nature of the work before he’d relocated from South Africa to Kenya to start the job. Motaung accuses Meta and Sama of several abuses of Kenyan labor law, including human trafficking and union busting. Should Motaung’s case succeed, it could allow other large tech companies that outsource to Kenya to be held accountable for the way staff there are treated, and provide a framework for similar cases in other countries.

“[TikTok] reads it as a reputational threat,” says Cori Crider, director of Foxglove Legal. “The fact that they are exploiting people is the reputational threat.”

TikTok did not respond to a request for comment.

In January, as Motaung’s lawsuit progressed, Meta attempted to cut ties with Sama and move its outsourcing operations to Majorel—TikTok’s partner.

In the process, 260 Sama moderators were expected to lose their jobs. In March, a judge issued an injunction preventing Meta from terminating its contract with Sama and moving the work to Majorel until the court could determine whether the layoffs violated Kenyan labor laws. In a separate lawsuit, Sama moderators, some of whom spoke to WIRED earlier this year, alleged that Majorel had blacklisted them from applying for the new Meta moderation jobs in retaliation for pushing for better working conditions at Sama. In May, 150 outsourced moderators working for TikTok, ChatGPT, and Meta via third-party companies voted to form and register the African Content Moderators Union.

Majorel declined to comment.

The TikTok documents show that the company is considering an independent audit of Majorel’s site in Kenya. Majorel has sites around the world, including in Morocco, where its moderators work for both Meta and TikTok. Such an exercise, which often involves hiring an outside law firm or consultancy to conduct interviews and deliver a formal assessment against criteria like local labor laws or international human rights standards, “may mitigate additional scrutiny from union representatives and news media,” the memo said.

Paul Barrett, deputy director of the Center for Business and Human Rights at New York University, says these audits can be a way for companies to look like they are taking action to improve conditions in their supply chains without having to make the drastic changes that are actually needed.

“There have been instances in a number of industries where audits have been largely performative, just a little bit of theater to give a global company a gold star so that they can say they’re complying with all relevant standards,” he says, noting that it’s difficult to tell in advance whether a potential audit of TikTok’s moderation operations would be similarly cosmetic.

Meta has conducted multiple audits, including one in 2018, when it contracted the consultancy Business for Social Responsibility to assess its human rights impact in Myanmar in the wake of a genocide that UN investigators alleged was partially fueled by hate speech on Facebook. Last year, Meta released its first human rights report. The company has, however, repeatedly delayed the release of the full, unredacted copy of its human rights impact report on India, commissioned in 2019 following pressure from rights groups that accused it of contributing to the erosion of civil liberties in the country.

The TikTok memo says nothing about how the company might use such an assessment to help guide improvements in the material conditions of its outsourced workers. “You'll note, the recommendations are not suddenly saying, they should just give people access to psychiatrists, or allow them to opt out of the toxic content and prescreen them in a very careful way, or to pay them in a more equal way that acknowledges the inherent hazards of the job,” says Crider. “I think it's about the performance of doing something.”

Barrett says that there is an opportunity for TikTok to approach the issue in a more proactive way than its predecessors have. “I think it would be very unfortunate if TikTok said, ‘We’re going to try to minimize liability, minimize our responsibility, and not only outsource this work, but outsource our responsibility for making sure the work that's being done on behalf of our platform is done in an appropriate and humane way.’”


Vittoria Elliott is a reporter for WIRED, covering platforms and power. She was previously a reporter at Rest of World, where she covered disinformation and labor in markets outside the US and Western Europe. She has worked with The New Humanitarian, Al Jazeera, and ProPublica.

