
Obama Outlines Existential Threat Big Tech Poses to Democracy

For Immediate Release
April 21, 2022
Contact: press@accountabletech.org


Today, in his keynote address at Stanford University’s “Challenges to Democracy in the Digital Information Realm” Symposium, President Barack Obama outlined Big Tech’s role in supercharging disinformation and undermining democracy across the globe. His pointed but hopeful call to action marked the President’s most robust argument yet for tech regulation, and closely aligned with Accountable Tech’s founding mission and flagship campaigns for reform. Former Secretary of State Hillary Clinton also weighed in on the issue, encouraging the European Union to bolster democracy by pushing the landmark Digital Services Act (DSA) across the finish line as high-stakes trilogue negotiations take place tomorrow.

[READ: Accountable Tech’s Memo on How Big Tech Boosts Autocrats]

“It’s great to see President Obama speak so forcefully about the danger Big Tech’s grow-at-all-costs business model poses to democracy in America and across the globe,” said Jesse Lehrich, co-founder of Accountable Tech. “Social media giants have played a unique role in the erosion of our consensus reality. They have stoked division, delusion, and demagoguery – and they’ve done this by design to maximize engagement and profits. But as President Obama also made clear, none of this is inevitable. It’s incumbent upon all of us – from advocates and policymakers to tech workers and advertisers – to do our part to right the ship before it’s too late.”

Throughout his speech, President Obama pinpointed the systemic problems and structural reforms that Accountable Tech has centered in its work. Below are some key takeaways.

Big Tech’s surveillance advertising business model erodes our consensus reality

“For more and more of us, search and social media platforms aren’t just our window into the internet. They serve as our primary source of news and information. No one tells us that the window is blurred, subject to unseen distortions and subtle manipulations. All we see is a constant feed of content, where useful, factual information and happy diversions flow alongside lies, conspiracy theories, junk science, quackery, racist tracts and misogynist screeds. Over time, we lose our capacity to distinguish between fact, opinion, and wholesale fiction – or maybe we just stop caring.”

“But as more and more ad revenue flows to the platforms that disseminate news instead of the newsrooms that report it, publishers, reporters and editors all feel the pressure to maximize engagement in order to compete.”

Tech companies boost harmful disinformation and extremism by design

“It’s not just that these platforms have, with narrow exceptions, been largely agnostic regarding the kind of information available and connections made on their sites. It’s that in the competition between truth and falsehood, cooperation and conflict, the very design of these platforms seems to be tilting us in the wrong direction. And we’re seeing the results. Take COVID, for example… People are dying because of disinformation.”

“In Myanmar, it’s been well-documented that hate speech shared on Facebook played a role in the murderous campaign targeting the Rohingya community. Social media platforms have been similarly implicated in fanning ethnic violence in Ethiopia and far-right extremism across Europe.”

Regulating Big Tech companies is consistent with freedom of speech

“The First Amendment is a check on the power of the state – it doesn’t apply to private companies like Facebook or Twitter, any more than it applies to editorial decisions made by the New York Times or Fox News. Social media companies already make choices about what is or is not allowed on their platforms and how that content appears – both explicitly through content moderation and implicitly through algorithms.”

Black box algorithms are manipulating public discourse 

“Beyond that, tech companies should be more transparent about how they operate. So much of the conversation around disinformation has focused on what people post, but the bigger issue is what content certain companies promote. Algorithms have evolved to a point where no one can accurately predict what they’ll do – even the people who built them.”

“For example, the way content looks on your phone, as well as the veil of anonymity that platforms provide their users, can make it impossible to tell the difference between a peer-reviewed article by Dr. Anthony Fauci and a miracle cure being pitched by a huckster. Meanwhile, sophisticated actors, from political consultants to commercial interests to the intelligence arms of foreign powers, can game platform algorithms or artificially boost the reach of deceptive or harmful messages.”

Democratic governments must mandate transparency

“In a democracy, we can rightly expect companies to subject the design of consumer products and services to some level of scrutiny. At minimum, they should have to share that information with researchers and regulators who keep the rest of us safe. If a meatpacking company has a proprietary technique to keep our meat clean, they don’t have to tell the world, but they do have to let the meat inspector in. Similarly, tech companies should be able to protect their intellectual property while also following certain safety standards that we as a country have agreed are necessary for the greater good.”

“This is part of the Platform Accountability and Transparency Act being proposed by a bipartisan group of Senators here in the U.S., and negotiated in Europe as part of the European Union’s Digital Services Act.”

“As the world’s leading democracy, we need to set a better example. Right now, Europe is forging ahead with some of the most sweeping legislation in years to regulate the abuses of big tech companies. Their approach might not be the exact right model for America. But it points to the need for us to coordinate our work. We should find our voice in the global conversation.”

###
