Facebook confirmed on Tuesday that it is ending its contract with Sama, the outsourcing company responsible for moderating graphic content in East Africa.
The news, first reported by Time Magazine, follows a lawsuit filed by a former content moderator employed by Sama in Nairobi, Kenya, alleging severe mental trauma as a result of the work, along with other labor violations.
According to Foxglove Legal, a nonprofit legal organization that investigates Big Tech companies, Majorel, a $2.2 billion European outsourcing company, will take over the contract. But Foxglove said Majorel is no better than Sama in its treatment of moderators, a claim echoed by Insider's reporting last summer on the harsh treatment of Majorel content moderators working in Morocco for TikTok's parent company, ByteDance.
It is a sign of how troubled the content moderation industry is. Despite several such lawsuits against social media companies including Facebook, YouTube, TikTok, and Reddit around the world, workers are still often made to spend hours per shift moderating some of the most graphic content on the internet, from child sexual abuse material to videos of horrific accidents and beheadings, often with very few protections in place for their mental well-being.
In August, Insider investigated the working conditions of TikTok content moderators working for Majorel in Morocco, the hub of ByteDance's content moderation operation in the Middle East and North Africa. Workers told us they often worked 12-hour shifts flagging videos of animal abuse, sexual violence, and other horrific content. They had fewer breaks than their American counterparts and said the company's "wellness counselors" were of little help.
Social media companies claim to use sophisticated algorithms that help clean up people’s feeds, but that masks the grim reality of how almost all social media companies operate. Behind the scenes, a global workforce of tens of thousands of people filters loathsome content so it doesn’t end up in front of your eyes.
In recent years, Facebook has settled lawsuits with moderators who reported PTSD as a result of their work for the company and has promised to improve working conditions. But as Vittoria Elliott and I reported in 2020, those relatively meager concessions rarely reach workers in India, the Philippines, or Kenya.
Experts like Foxglove Legal have called on social media companies like Meta to bring their global content moderation staff in-house. It may be the only way to ensure that dealing with the worst elements of social media is handled by people who are truly accountable to the company and its users. Until then, contractors like those at Sama or Majorel, or at dozens of other outsourcing companies, will pay the price.