The Digital Services Act’s lesson for U.S. policymakers: Co-regulatory mechanisms

The Digital Services Act (DSA), a landmark piece of legislation that updates the E-Commerce Directive of 2000, is in the final stages of approval by the institutions of the European Union (EU). While much of the Act would not survive a First Amendment test, parts of it offer surprising lessons for policymakers in the United States, where bills focus mostly on misguided attempts to strike down intermediary liability protections.

Of course, the text of the DSA is yet to be finalized, and parts of the complex process that produced it have been decried as opaque and shady, with articles added or removed without external stakeholders’ involvement or even knowledge. However, the main thrust of the bill will likely remain the same. Some would rightfully argue that the DSA’s specific obligations, rights, and duties are its unique attributes. The takeaway for U.S. policymakers, though, is something else they can adapt far more easily, without fear of First Amendment pushback: the DSA’s co-regulatory mechanisms. A co-regulatory mechanism is a governance structure in which government involvement exists but is limited, and most actions are taken by other stakeholder groups, usually under the oversight of one or more governmental bodies. Such a mechanism is fundamentally transparent, can be more inclusive than the normal policymaking process, and comes with built-in buy-in from industry.

The European Union, famous for its antitrust case against Microsoft, has expressly pointed out that the DSA package is a tool not just to protect the rights of users but also to “establish a level playing field,” decrying the outsized role of “a few large platforms.” In fact, the extra obligations for very large online platforms (VLOPs) and search engines, a clear instance of asymmetric regulation, apply much sooner than those for other platforms and intermediaries. In short, this bill and its partner in the Digital Services Package, the Digital Markets Act, are built with the express purpose of countering the power of the large companies colloquially known as “Big Tech.”

Even so, beyond other important obligations, like mandating researcher access to data, the DSA relies heavily on co-regulatory mechanisms with intense oversight to achieve its goal. It separates very large online platforms from the rest of the online speech ecosystem, defining them generally as platforms with more than 45 million monthly active users, a threshold the Commission can adjust. Although VLOPs must adhere to the rules that apply to all other online service providers, they also have additional obligations, a significant chunk of which are co-regulatory mechanisms. The three most important co-regulatory mechanisms the DSA uses are assessments, codes of conduct, and audits. We’ll go through each of them and discuss how they function in the DSA.

Assessments

The annual obligation to perform risk assessments is a linchpin of the regulation. These assessments are loosely structured by the government and aim to uncover systemic risks stemming from illegal content and from negative effects on fundamental rights, civic discourse, public security, electoral processes, gender-based violence, and public health. They focus on how the platforms’ choices about recommendation, moderation, terms and conditions, ads, and data practices influence those systemic risks. Beyond that, the DSA is not prescriptive in how it defines the risk assessments, and VLOPs are to design them on their own. The assessments are not done for their own sake: based on them, platforms are supposed to establish mitigation measures. Platforms are not left to their own devices, though. The DSA positions the Commission and the newly created national-level regulators as backstops, giving them the power to issue guidelines and the responsibility to publish comprehensive yearly reports on risks and mitigations.

Codes of Conduct

While risk assessments are clearly a tool to make sure platforms are actively focusing on the issues the DSA drafters have currently identified as important, the legislation builds in another co-regulatory mechanism. It allows multistakeholder structures to come together, assess industry-wide “significant systemic risk” related to several VLOPs or very large online search engines (VLOSEs), and build voluntary commitments to mitigation measures, along with a reporting framework and metrics. The DSA notes that the VLOPs and VLOSEs in question, along with other parts of the industry, are necessary to kick-start the process. Other stakeholder groups, including civil society and relevant competent authorities, are also enumerated as potential participants at the discretion of the EU Commission. While drafting is to be done by these participants, the government would take an active role to ensure the codes of conduct are written properly and, especially, consider the needs and interests of EU citizens. The codes of conduct thus involve significant government oversight but remain a co-regulatory mechanism. Some existing EU codes, like the Code of Practice on Disinformation, will likely be recognized as codes of conduct under the DSA. Separately, the DSA also calls for voluntary codes on advertising and accessibility, but those are not restricted to VLOPs alone.

Audits

The DSA requires VLOPs to undergo a yearly independent audit of their compliance with the obligations set out in the DSA and with those set out in the codes of conduct. While the legislation has specific guidelines on who the auditors should be, and a very general minimal framework for the audit reports themselves, it does not design the audits in any way. In fact, it explicitly supports the development of voluntary, presumably industry-based, standards for these audits. The Commission still has an oversight role in these processes, but it is not directly involved.

From intermediary liability to co-regulation in the United States

Platform governance is a tricky endeavor, not just because of the underlying complexities of these technologies and their role in society, but also because legislating it in the United States is fraught with either adversarial animosity or sweetheart deals. Increasingly, the “do-something” point of view dominates, and it invariably focuses on changing the limited legislation already in place, mainly Section 230, which deals with intermediary liability. This would at best be an indirect and highly risky way of getting to a solution. While some proposals tackle adjacent and crucial aspects, like researcher access to data, gutting Section 230 has unfortunately risen to the top of the pile of legislative proposals in the United States. Constructive, sustainable alternatives exist in other jurisdictions, and they don’t have to be gifts to industry. The European Union has no issue being proactive in legislating and has a long history of going after large (U.S.-based) technology companies, but it has eschewed reforming its intermediary liability statutes as a way of protecting consumers.

Instead, the DSA uses innovative tools like co-regulatory mechanisms to achieve some of its goals. It tailors these mechanisms to the large platforms that prompted the desire for regulation and makes sure not to burden the rest of the ecosystem. The upsides are clear: transparent and standardized industry action, deeper involvement of civil society and other third parties, and external oversight, respectively. These co-regulatory mechanisms can take many forms and can be tweaked to have a far easier time surviving First Amendment scrutiny in the U.S. than changes to Section 230 would.

In fact, somewhat similar governance structures already exist in the U.S. The Digital Trust and Safety Partnership (DTSP), founded in 2020, is an industry initiative that aims to build best practices (similar to codes of conduct) and subsequent internal and third-party assessments based on those best practices. Beyond the big names one would expect—Meta, Google, Twitter, Microsoft, and Apple—smaller players like Vimeo, Bitly, and Pinterest are involved as well. DTSP is primarily an industry-based structure, but it has taken input from civil society actors. It has released a first iteration of best practices, structured as five commitments covering product development, governance, enforcement, improvement, and transparency. This shows that, with or without legislation, there is likely industry appetite for action. Institutions like the DTSP can build these mechanisms themselves or can be partners in co-regulatory mechanisms.

With their prominent inclusion in the DSA, co-regulatory mechanisms are now legitimate tools to tackle online harms and risks. They provide flexibility that traditional legislation usually lacks: they systematize and publicize industry action through assessments, invite civil society to help shape mitigation strategies through codes of conduct, and ensure independent audits of platforms. There are important caveats to note. First, co-regulatory mechanisms are not, by themselves, sufficient to tackle the wide range of risks, harms, and other issues that stem from online platforms, and they should not be used to replace other legislation. Complementary measures are important, and the DSA is a mammoth piece of legislation that designs many other mechanisms; similar choices should be contemplated in the U.S. Second, the DSA marks the first time co-regulation has been used in this context. The eventual success or efficacy of any industry-focused structures, be they co-regulatory or similar to the DTSP, is entirely unknown, and potential limits, like audit capture, may materialize.

However, instead of arguing over untenable reform of intermediary liability as the first step in regulating the online ecosystem and reining in major players, U.S. policymakers should take a page out of the EU’s book and try to understand the limits, scope, and role of co-regulatory mechanisms.


Microsoft, Meta, Google, and Apple are general, unrestricted donors to the Brookings Institution. The findings, interpretations, and conclusions posted in this piece are solely those of the author and not influenced by any donation.

The author would like to thank Alex Engler for his editorial assistance. 