Wednesday, April 1, 2026
  • RED-TAGGING
  • DISINFORMATION
  • BUSINESS AND HUMAN RIGHTS

    Social Media Platforms Are Putting Human Rights Defenders at Risk: UN Expert

    A new report by a UN Special Rapporteur finds that Meta/Facebook, X, TikTok, YouTube and other platforms activists rely on to expose abuse are increasingly being used to silence them — and the tech companies responsible appear to be paying less and less attention.
    THE UNITED Nations’ top expert on human rights defenders sent formal letters of concern to Meta and X Corp. at the end of last year, warning that the companies’ policies were actively endangering the people who use their platforms to document injustice and hold governments accountable. Three months later, neither company has written back.

    That silence sits at the center of a scoping paper released today, March 31, by the office of the UN Special Rapporteur on Human Rights Defenders, titled “From Visibility to Vulnerability: When Social Media Fails Human Rights Defenders.” The report, drawn from consultations with civil society organizations across every region in the world, paints a detailed picture of how platforms like Facebook, Instagram, X, TikTok, and YouTube have become something of a double-edged sword for activists: essential for their work, but increasingly hostile to their safety and their voices.

    “Social media platforms contribute, intentionally or otherwise, to silencing, deplatforming and endangering human rights defenders,” the report states.

    The Special Rapporteur — Ireland’s Mary Lawlor, who has worked on human rights issues for roughly four decades — dispatched the official communications to Meta and X on Dec. 30, 2025. The letters detailed specific concerns about each company’s policies and their real-world effects on defenders. The report notes plainly that no response had been received from either company by the time of publication.

    Gutting the Teams That Were Supposed to Help

    The report’s findings take on added weight given what has been happening inside these companies over the past several years. According to the scoping paper, Meta, X, YouTube, and TikTok have all carried out significant layoffs of the staff most responsible for handling exactly the kind of problems it documents — the teams focused on trust and safety, human rights, and countering misinformation.

    X, which was acquired by Elon Musk in 2022, disbanded its safety advisory council shortly after the takeover. Meta has pulled back from third-party fact-checking, replacing it with a “community-based” moderation model that critics say opens the door to wider spread of false information. YouTube has rolled back its policies on election misinformation. TikTok, meanwhile, saw its content moderators in Germany stage a strike in August 2025 over mass layoffs and plans to hand their jobs to automated systems.

    The report describes the cumulative effect of these decisions bluntly: they have “drastically weakened any avenues of dialogue human rights defenders and civil society organisations previously had.”

    The consequences show up in response times. One civil society organization told the UN’s office that in 2025, harmful content targeting women defenders in the Middle East and North Africa was sometimes left up on major platforms for 10 to 15 days after being reported — even in cases involving serious, credible threats. The same organization said platforms were “far less responsive than they used to be to public pressure.”

    Others said engagement with the platforms on human rights matters had substantially declined over the past three years, with companies showing diminishing interest in hearing from civil society at all.

    A System That Wasn’t Working Well to Begin With

    What makes this retreat particularly troubling, the report argues, is that the system was already failing. A survey by Global Witness — cited in the scoping paper — found that 92% of land and environmental human rights defenders had experienced online abuse, coming from individuals, bots, and coordinated harassment campaigns. Of those who reported the abuse to social media platforms, just 12% said they were satisfied with the response they received.

    RELATED STORY: Philippines Is Ground Zero for Disinformation. Facebook’s Fix Will Make it Worse.

    The Philippines offers a sharp illustration of how badly that system can break down. Facebook’s own leadership once described the country as the world’s “patient zero” for disinformation — and the numbers bear that out. According to the Reuters Institute’s 2025 Digital News Report, 67% of Filipinos said they were concerned about online misinformation, a record high and well above the global average of 58%. 

    Within 12 hours of former President Rodrigo Duterte’s arrest by the International Criminal Court last year, some 200 Facebook pages flooded the platform with false narratives — including an AI-generated video targeting a drug war victim’s family member who had spoken publicly about a loved one’s killing. Now, as Rights Report Philippines reported this week, Meta is moving to replace the professional fact-checkers who have served as one of the few remaining safeguards in that environment with a crowd-sourced tool that Meta’s own independent oversight body says is structurally unequipped to handle fast-moving, coordinated disinformation — the very kind the Philippines produces at scale. 

    Meta’s fact-checking partnerships with Vera Files, Rappler, and Agence France-Presse remain in place for now, though Meta has already scrapped the program in the U.S. and fact-checkers globally expect the phase-out to follow. “In what can only be seen as an indication of their eroding commitment to human rights, mainstream social media companies have started to gradually pull back from collaboration with human rights defenders and NGOs” — precisely when and where that collaboration is needed most, Lawlor’s report notes.

    In Bangladesh, one organization found that more than 90% of abuse reports made to Facebook on behalf of human rights defenders were either ignored outright or, worse, resulted in action being taken against the person who filed the complaint rather than the person who carried out the harassment, a finding also cited in the UN report.

    Globally, the report found, defenders received more abuse on Facebook than on any other platform, followed by X and Instagram.

    Platforms have pointed to their trusted partner programs, channels meant to give established civil society organizations a faster, more direct line to raise cases. The report is skeptical. One organization described trying to secure the reinstatement of a Kazakh human rights defender’s account by going through Meta’s official channels and receiving no follow-up at all, despite repeated attempts. Another said that even when Meta’s Trusted Partner channel did respond, the replies were often automated and failed to acknowledge the broader context of what had happened.

    “Results were never guaranteed,” one organization told the Special Rapporteur’s office.

    Filling the Gap — at Their Own Expense

    In the absence of reliable support from the platforms themselves, the burden of protection has shifted, largely invisibly, onto civil society organizations that were never designed to carry it. Groups are spending significant time and resources trying to navigate complaint systems on behalf of defenders, documenting harms, and pushing platforms for answers that often don’t come.

    The report calls this “a flawed and unsustainable model.” It places the weight of a structural problem on organizations with limited funding and capacity, while the companies whose platforms created the problem face no meaningful obligation to fix it.

    The scoping paper calls on social media companies to reverse recent rollbacks in hate speech, misinformation, and privacy policies, restore human rights and trust and safety staffing, and commit to faster responses in urgent cases. It asks states to hold platforms legally accountable for enabling online harassment, and to stop pressuring companies to censor or surveil their users.

    Whether any of that happens may depend, at least in part, on whether Meta and X eventually choose to respond to letters that have now gone unanswered for three months. (Rights Report Philippines)