Staff Posts

Why Accountable Tech is Leaving X

Accountable Tech is leaving X, but we’re still committed to advocating for safer digital spaces elsewhere.

Kenya Juarez
Nov 15, 2024

Two years ago, we worked to hold Twitter accountable to its own community standards and to protect the safety of its users. At the time, we saw the platform as a place where people could connect, engage, and advocate for a better world.

Two years later, the platform is a shell of what it once was. So, we’re stepping away from X. Here’s why.

A Platform in Decline

Under new leadership, X has deteriorated rapidly. With content moderation teams dismantled and platform policies drastically changed, hate speech, harassment, and misinformation are now prevalent. This lack of oversight has left users vulnerable to abuse and transformed X into a hostile space, particularly for marginalized communities, who are disproportionately targeted by harassment.

New Terms of Service: Compromising Privacy for AI Training

X’s new Terms of Service, which take effect on November 15, allow the platform to use all user content to train AI models without explicit user consent. This policy shift marks a disturbing departure from the privacy standards users deserve and underscores the platform’s willingness to exploit user data.

Weakening of Basic Safety Tools

X has undermined even basic tools meant to protect users. The block feature, which once allowed users to shield themselves from harassment, has been modified so that blocked users can still view public posts. This erosion of safety tools is emblematic of a platform that no longer prioritizes user security, making it harder for individuals to protect themselves from unwanted interactions.

Why We’re Leaving X

We stayed on the platform because we believed in its potential as a powerful tool for raising awareness and advocating for a safer internet. But the platform we once supported no longer aligns with our mission or values.

Continuing to engage here risks feeding an algorithm that amplifies anger and hate, exposing our followers and allies to abuse. We can’t, in good conscience, continue to bring our community into such a toxic environment.

Staying Connected

Our work to advocate for safer, more responsible digital spaces continues, and we invite you to stay engaged with us on other social media platforms. We know it’s ironic to ask you to follow us on the platforms we’re fighting to hold accountable—but they are… well… dominant, and to organize for a safer digital world, we need to reach people where they’re at.

You can follow Accountable Tech on Bluesky, Instagram, TikTok, Threads, LinkedIn, Facebook, and YouTube as we continue to push for accountability in the tech industry. Together, we can work toward a future where technology serves the public good, fostering healthier, more inclusive online spaces.

More Staff Posts

Glints of Gratitude: AT’s Biggest Moments in 2024
Robbie Dornbush
Dec 19, 2024

In 2024, Accountable Tech had a big year of campaigns to protect kids online, amplify the real-world dangers of AI, defend democracy, and safeguard the privacy of abortion seekers.

Bluesky's the Limit
Nicole Gill
Dec 17, 2024

Elon Musk’s race to the bottom for social media companies may finally be turning around thanks to fresh competition from Bluesky.

The Future of Music and AI: A Conversation with Kevin Erickson
Giliann Karon
Dec 16, 2024

A discussion with Kevin Erickson from Future of Music Coalition about AI’s threats to musicians, the gatekeeping status of streaming platforms, and recommendations for lawmakers, regulators, and AI developers.

Join the fight to rein in Big Tech.

Big Tech companies are some of the most powerful and profitable companies in history, presenting new threats to the safety of communities and the health of democracy. We’re taking them on through legislation, regulation, and direct advocacy.