
The EU is Rewriting the Rules of the Digital World: Key Pillars of the DSA/DMA and Pertinent US Tech Policy Proposals

Aditi Ramesh and Jesse Lehrich

This is the second Staff Post in a series exploring what the DSA and DMA mean for the global tech accountability movement, examining key provisions of the legislative package, and comparing them to pertinent proposals before Congress in the US.


The Digital Services Act (DSA) sets out to protect people’s fundamental rights online and foster a safer, more accountable internet. It accomplishes this not by restricting speech, but by establishing obligations for digital services proportionate to their size, role, and impact on the online ecosystem. While all intermediaries must meet basic requirements – publishing clear terms of service and responding to court orders to remove illegal content – the largest online platforms must also assess and mitigate systemic harms inherent to their products, share access to data with researchers, open up black box algorithms, and more.

Below is a summary of the DSA’s key pillars and relevant US corollaries:

Key Provisions of the Digital Services Act (DSA) and US Corollaries

Risk Assessments and Mitigation with Independent Auditing. The requirement for large platforms to assess and mitigate risks inherent to their services – subject to independent auditing – may be the DSA’s most important feature. It will hold tech giants accountable for the extent to which things like their core business model, product design, content policies, data practices, and ad targeting drive societal harms.
US Corollary: The bipartisan Kids Online Safety Act – introduced by Sens. Blumenthal (D-CT) and Blackburn (R-TN) – would require covered platforms to publish annual reports, informed by independent audits, assessing systemic risks to minors and their mitigation measures, among other protections.

Access to Data for Researchers and Civil Society. From COVID conspiracism to Kremlin propaganda, the proliferation of harmful online disinformation is killing people, yet only the platforms themselves know the scope and shape of the problem – and they have a vested interest in keeping it that way. The DSA will finally give researchers and NGOs access to the data necessary to counter deadly lies.
US Corollary: Sens. Coons (D-DE), Portman (R-OH), and Klobuchar (D-MN)’s Platform Accountability and Transparency Act would require large social media platforms to provide vetted researchers access to data for NSF-approved projects, and empower the FTC to require that certain information be made proactively available on an ongoing basis (e.g., a comprehensive ad library with data on targeting and user engagement).

Algorithmic Transparency and User Choice. Big Tech platforms are incentivized to maximize engagement above all else, so they employ powerful algorithms to predict what content is most likely to keep each user clicking and serve it up accordingly – often without our awareness or consent.
US Corollary: The bipartisan Filter Bubble Transparency Act would require large online platforms that utilize user-specific data and automated content curation systems to clearly notify users about the use of those algorithms, and allow users to easily switch to a transparent ranking system not based on profiling, such as a chronological feed.

Banning Surveillance Advertising Aimed at Minors or Using Sensitive Data. Upending Big Tech’s surveillance-advertising business model is critical to tackling the myriad harms it incentivizes.
US Corollary: Sen. Markey (D-MA) and Sen. Cassidy (R-LA)’s ‘COPPA 2.0’ bill would ban targeted ads to kids, while the bicameral Banning Surveillance Advertising Act would prohibit the practice altogether.

Ensuring What’s Illegal Offline is Illegal Online. While the DSA maintains broad intermediary liability exemptions similar to Section 230, it establishes a regime to strike a balance between promoting the timely takedown of explicitly illegal content and ensuring that platforms aren’t incentivized to over-censor. It also gives users clarity throughout the process and mechanisms for appeal.
US Corollary: Sens. Schatz (D-HI) and Thune (R-SD)’s PACT Act would require large online platforms to remove court-determined illegal content and activity within four days; establish a complaint system that processes reports and notifies users of moderation decisions within 21 days; and produce a biannual transparency report on content moderation actions.

In March 2022, the European Union (EU) secured an agreement on the DSA’s companion bill, the Digital Markets Act (DMA), which puts forward a robust but measured series of “do’s” and “don’ts” designed to rein in some of the most egregious abuses of monopoly power deployed by gatekeeper platforms to entrench their dominance at the expense of consumers, competition, and innovation. The key elements of the DMA track closely with bipartisan, bicameral antitrust proposals moving through Congress, especially the American Innovation and Choice Online Act and the Open App Markets Act, which advanced out of the Senate Judiciary Committee early this year.

Below is a summary of the DMA’s key pillars and relevant US corollaries:

Key Provisions of the Digital Markets Act (DMA) and US Corollaries

Prohibiting Gatekeepers from Rigging Marketplaces in Their Own Favor. At the core of the DMA is a common-sense principle: the world’s largest platforms should not be allowed to rig the markets they operate to further entrench their monopoly power.
US Corollary: The bipartisan American Innovation and Choice Online Act outlines a set of new rules to curtail dominant tech platforms from rigging the marketplaces they operate. It would prohibit these gatekeepers from unfairly boosting their own products or kneecapping rivals; exploiting nonpublic data or hindering businesses’ access to their own data; conditioning access or placement on the use of services not intrinsic to the covered platform; or preventing users from uninstalling apps.

Reining In App Store Monopolies. In addition to its broader self-preferencing prohibitions, the DMA includes distinct provisions designed to free app developers and consumers alike from monopoly abuses of app store operators.
US Corollary: The bipartisan Open App Markets Act, which sailed through committee with a 20-2 vote, would require operators of dominant app stores to allow users to install third-party apps and app stores from outside their walled gardens and set them as defaults. It would prohibit those gatekeepers from conditioning developers’ access to app stores on the use of the operators’ payment services; boosting their own apps in search results; or exploiting nonpublic data from other apps for competitive gain.