A recent controversy involving Sama, a Kenya-based contractor for Meta, has sparked global concern after claims emerged that the company was negligent in addressing threats made against its content moderators by Ethiopian rebel groups. These moderators were reportedly tasked with policing harmful content related to the ongoing Tigray conflict, a highly sensitive and volatile situation.
According to internal sources, Sama allegedly dismissed direct warnings from moderators about credible threats to their safety. The threats reportedly stemmed from the moderators' role in removing incendiary posts by rebel groups, work that made them targets amid the conflict. Critics accuse the contractor of failing to provide adequate security or psychological support for its employees, who were routinely exposed to traumatic content.
Former moderators allege that they raised these concerns with management but that their appeals were downplayed, leaving them vulnerable. Legal experts and labor activists have since condemned the alleged inaction, arguing that Meta and its contractors have a responsibility to ensure the safety of their workers.
This issue is part of a broader debate about how tech giants like Meta manage content moderation in conflict zones. Many argue that the platforms are outsourcing this responsibility to contractors ill-equipped to handle such risks, while continuing to profit from operations in volatile regions.
Meta and Sama have yet to release comprehensive statements addressing these allegations. The situation underscores the ethical dilemma tech firms face when their services intersect with politically charged and violent conflicts.