Letter to Meta Calling for Trust, Safety, and Transparency Policies as Threads Hits 100 Million Users

24 civil rights and consumer advocacy organizations call on Meta to develop and share robust and equitable community trust, safety, and transparency policies for Threads.

July 13, 2023

Mark Zuckerberg
Chief Executive Officer
One Hacker Way
Menlo Park, CA 94025

Adam Mosseri, Head of Instagram
Roy Austin, Vice President of Civil Rights, Meta

Delivered via electronic mail

Dear Mark:

We, the undersigned 24 civil rights, digital justice, and pro-democracy organizations, call on you to develop and share robust and equitable community trust, safety, and transparency policies specific to the use of Threads. As we’re sure you understand, stewardship of this new platform is a serious responsibility with implications for democracy and human and civil rights around the world. 

Warning signs are already flashing. Since Threads launched, new users have been testing the boundaries of the platform’s moderation and enforcement, posting neo-Nazi rhetoric, bigoted slurs, election denial, COVID-19 conspiracies, climate change denialism, misogyny, targeted harassment of trans individuals and denial of their existence, and more. Much of this content remains on Threads, indicating gaps both in Meta’s Terms of Service and in their enforcement, which is unsurprising given your long history of inadequate rules and inconsistent enforcement across other Meta properties. 

Rather than strengthening your policies for Threads, Meta has done the opposite: it purposely declined to extend Instagram’s fact-checking program to the platform, capitulated to bad actors, and removed a policy that warned users when they attempted to follow a serial misinformer. Without clear guardrails against future incitement of violence, it is unclear whether Meta is prepared to protect users from high-profile purveyors of election disinformation who violate the platform’s written policies. To date, the platform lacks even the most basic tools for researchers to analyze activity on Threads. Finally, Meta rolled out Threads at the same time that it has been laying off the content moderators and civic engagement teams meant to curb the spread of disinformation on its platforms.

Following Threads’ launch last week, we urge you to prioritize user safety and transparency. That is why we, the undersigned, have developed three overarching recommendations for your team to implement common-sense guardrails that help ensure Threads is a safe space for users and brands. 

  1. Immediately Implement Robust Policies to Keep Violence & Hate Off Threads: Implement strong policies unique to Threads that meet the needs of a rapidly growing text-based platform, including strong policies against hate speech to protect marginalized communities. Ensure application of policies to mitigate hate and lies, coupled with diligent review of and updates to written policies in order to limit the spread and visibility of dangerous speech, lies, hate, harassment, and extremism. 
  2. Invest in Robust Protections Against Algorithmic Manipulation & Equitable Policy Enforcement: Prioritize safety and equity by taking a proactive, human-centered approach to preventing machine learning bias and other AI malfeasance, including investments in human moderation methods and equitable enforcement of policies across languages. Ensure that context is considered in content moderation enforcement, as specified in your Terms of Service.
  3. Transparency & Engagement With Civil Society: Implement governance and leadership practices to engage regularly with civil society, including transparent and accessible data and methods for researchers to analyze Threads’ business models, content, and moderation practices. Continue to engage in open discourse about your commitment to interoperability and open-source protocols.

For the safety of brands and users, Threads must implement guardrails that stem extremism, hate, and anti-democratic lies. Doing so isn’t just good for people: it’s good for business. We look forward to discussing this with you further.


18 Million Rising
Access Now
Accountable Tech
Action for the Climate Emergency (ACE)
Center for Countering Digital Hate
Common Cause
DemCast USA
Fair Vote UK
Free Press
Friends of the Earth
Global Project Against Hate and Extremism
Greenpeace USA
Media Matters for America
ProgressNow New Mexico
Public Citizen
Tech Transparency Project (TTP)
The Real Facebook Oversight Board
The Tech Oversight Project
Win Without War