
Commentary

How a compliance mindset undermines antitrust reform proposals

Facebook CEO Mark Zuckerberg speaks via video conference during a House Antitrust, Commercial and Administrative Law Subcommittee hearing on Capitol Hill in Washington, July 29, 2020, on "Online Platforms and Market Power: Examining the Dominance of Amazon, Facebook, Google and Apple."

The CEOs of America’s most powerful technology companies went before Congress recently to answer questions about their growing role in the U.S. economy. Lawmakers grilled the CEOs on their business practices and whether it is time to curb their companies’ market power. But for antitrust to work in the digital era, it must go beyond its traditional focus on market power to consider questions of public interest.

For years, technology ethicists have considered how to square the interests of major companies with the interests of society as a whole, and recent approaches to ethics in the technology industry provide a cautionary tale for antitrust policy. Our data ethics research shows that Big Tech companies tend to approach challenges from the perspective of compliance: that as long as a company ticks the boxes on a checklist, it is in the clear. While checklists can highlight concerns, they do not necessarily lead to ethical actions. Current approaches to antitrust threaten to extend a compliance mindset, which will likely result in efforts that ostensibly tackle market power but fail to protect consumers. 

Building a compliance mindset

In 1993, the Harvard Business Review chastised out-of-touch business ethicists for approaching their field too broadly, being too theoretical and impractical, and failing to offer concrete solutions to thorny real-world problems. In response, "values" like not being evil or doing good for the community flooded corporate websites and mission statements. Following the WorldCom and Enron accounting scandals, Congress in 2002 mandated that publicly traded companies disclose their business ethics statements. A newfound need to operationalize ethics fit the mindset of the software engineers and corporate managers running technology companies.

Software engineers like solvable problems, and they tend to think technology can solve those problems; ethics is no exception. Statistics might indicate a project is 76.3% ethical, for example. But ethical questions do not have such tidy answers. For instance, after a Facebook study published in 2014 was found to have manipulated the emotions of the platform's users, the company pledged in 2016 to apply ethics reviews to its research and engineering processes, promoting concepts like beneficence, justice, and respect. Those reviews, however, did not extend to the company's broader business decisions.

Managers, meanwhile, want to compare results and turn ethical values into measurable objectives. Ethics codes that might otherwise state the highest professional aspirations boil down to following the law: adherence to a list of regulatory obligations. Such an approach to regulation, sometimes called "managerialism," reduces the most complex questions to the most measurable ones. Publicly traded companies simply extend their existing regulatory compliance regimes to cover codes of business conduct.

Nowadays, technology companies are busy trying to fit the ethics of emerging technologies into their compliance-driven approach to data ethics. Google's "Responsible AI Practices" document uses both broad value statements and practical, specific performance goals to describe its commitment to ethics, primarily through fairness and privacy. Similarly, Microsoft Research has released an AI ethics checklist for the industry, while its corporate parent has pledged to add an AI ethics review step to its existing checklists on privacy, security, and accessibility.

Compliance is satisfying for managers and technologists, but over-reliance on it turns data ethics into a hollow performance, leading some critics to argue that technology companies are actively attempting to "ethics-wash" their businesses. Business ethics, at its core, should clarify the reasoning behind management decisions and technical choices in order to broaden understanding of how the industry affects society. Checklists do the exact opposite: they obscure the logic behind those decisions.

The phenomenon of "compliance above all else" is not confined to ethics. Privacy law has devolved into a series of automated compliance tools. Organizations can dutifully report that they protect consumer privacy when, in actuality, legal departments are ticking boxes to lower exposure to regulatory action. For instance, the General Data Protection Regulation (GDPR), the landmark EU privacy law that entered into effect in May 2018, sets strict requirements for how users give their consent. But even those requirements fit neatly within the compliance mindset: when a user accepts a website's cookies, the company can claim to be GDPR compliant, which carries a patina of doing the right thing regardless of its substantive practices.

Ethical statements and audits reduce complex matters of societal power to a series of checkboxes, obscuring other questions and unchosen alternatives. Over-reliance on compliance mechanisms to police anti-competitive behavior may likewise diminish the trust necessary for companies to act in the public interest. Compliance is a necessary governance tool for meeting legal and contractual obligations, but it cannot rein in business models that exploit the trust of users.

The risk for antitrust 

Antitrust legislation could easily fall into the same trap as technology ethics and fail to protect the public interest through an over-reliance on compliance mechanisms.  

One popular antitrust enforcement mechanism is to split up large companies. But dividing a large company into many smaller ones will not address the issue at hand, as the business practices of the dominant player are likely to be replicated throughout the industry. In fact, it may solidify the same market logic across all competitors to the detriment of consumers.

A second antitrust remedy is to increase the regulatory burden on large companies by requiring them to monitor for anti-competitive behavior: companies must consistently signal that they are not using their market dominance to stifle competition. Substituting regulatory mandates for genuine consumer protection is like constantly monitoring a patient's temperature without trying to understand and stop the fever. Performative in nature and limited in scope, this remedy falls short of legitimate reform or corporate oversight.

We argue that antitrust policymakers should be aware of three interconnected concerns:

1. Complex regulatory regimes can only be satisfied by dominant players 

The CEOs of both Facebook and Google have repeatedly asked the government to increase regulation. That is probably because only the largest organizations have the capacity to translate policy into practice and manage regulatory risk, relying on growing in-house legal teams and a blossoming auditing industry.

Smaller firms will find it increasingly difficult to comply, which would be disastrous for competition. Many tech industry successes start with little capital: a programmer in her basement writing code. In a space where innovation is critical, a complex regulatory machine may itself be anti-competitive. Any regulatory regime must be applicable and manageable across a range of firm sizes.

Some antitrust proposals would restrict a very large company from merging with or acquiring smaller companies, or bar two moderately large companies from merging. But proposals to control mergers and acquisitions based on size do little to address whether smaller firms can comply with complex regulatory regimes in the first place.

2. Too big to audit 

While episodic antitrust monitoring was possible in other industries, lawmakers should be wary of applying the old regulatory logic to digital markets. Just as some banks are considered too big to fail, the technology industry may be too big to audit without the right approach. 

Given the scale of digital markets, reporting will need to be high-level enough for an outsider to track activity without getting bogged down in details. It is no trivial matter to devise ways of tracking 2 billion users, 63,000 searches per second, 400,000 app downloads, or a million dollars spent per minute. Digital markets leverage economies of scale at an entirely new level of complexity.

Though the Federal Trade Commission has a mandate to enforce antitrust laws, given the general lack of technical expertise in the public policy world, it is unclear who in government could provide effective oversight. Governments will continue to struggle to ask the right regulatory questions and to digest the responses in a meaningful way, a gap that is more likely to encourage rote, checkbox thinking.

3. Compliance might outweigh the public interest 

Ethics statements and audits have come to represent an attempt to demonstrate "doing good," but their content and application often fail to address genuine concerns. Debates over free speech, electoral politics, and safety for under-represented groups remain unresolved despite the technology industry's attempts to serve the public interest. One small step is to decouple ethics leadership from legal compliance departments within technology corporations in order to better address the broader societal concerns that are driving interest in antitrust.

Making data available

In preparation for last week’s hearing before the antitrust subcommittee of the House Judiciary Committee, the companies involved were asked to answer basic questions about competitors, revenues, and products for each product line. That kind of basic information is usually only available to regulators as part of a merger or acquisition. Congress’s data gathering process thus illustrated what regulators currently lack: the data necessary to write informed regulations.

We recommend establishing a mechanism for the consistent reporting of market information by tech companies. This is where open data can do what regulators alone cannot. Making sure that basic market information is regularly published in comparable formats will benefit everyone: regulators, potential competitors, and the public. With expertise on digital markets both inside and outside the public sector, regularly reported high-level data can help other stakeholders, such as activists and journalists, provide nimble, targeted, and contextualized accountability without hurting competitors or entrenching the dominant players. While not a panacea, clarifying the size and scope of the industry can be a powerful control mechanism for future mergers and acquisitions, one that reaches beyond the malaise of compliance.
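To make the idea of "comparable formats" concrete, the minimal sketch below shows one hypothetical way a company could publish a quarterly market report as machine-readable open data. The schema, field names, and figures are invented for illustration only; they are not a format proposed here or one used by any regulator.

```python
# A minimal, hypothetical sketch of a comparable open-data market report.
# Every field name and number below is an invented illustration.
import json
from dataclasses import dataclass, asdict

@dataclass
class MarketReport:
    company: str                # reporting firm
    product_line: str           # e.g., "search", "app-store", "cloud"
    period: str                 # reporting period, e.g., "2020-Q2"
    revenue_usd: int            # revenue attributable to this product line
    monthly_active_users: int   # scale of the user base
    named_competitors: list     # competitors the firm itself identifies

report = MarketReport(
    company="ExampleCo",
    product_line="app-store",
    period="2020-Q2",
    revenue_usd=1_250_000_000,
    monthly_active_users=900_000_000,
    named_competitors=["RivalOne", "RivalTwo"],
)

# Publishing every report as JSON against one fixed schema is what would
# make the figures comparable across firms and over time.
print(json.dumps(asdict(report), indent=2))
```

The point of the sketch is not the particular fields but the design choice: a fixed, published schema lets regulators, journalists, and potential competitors line up figures across companies and quarters, rather than parsing one-off disclosures produced only during merger reviews.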

Open data will not, on its own, make large companies less powerful, but it is a better tool than an ethics checklist for examining company behavior and realigning it with the public interest. In the recent antitrust hearings, the most probing questions were based not on data openly available to regulators but on company practices uncovered by investigative journalism. If more open data were available, regulators would better understand how these companies operate and would be better able to conduct oversight that encourages public-interest behavior. It is hard to prescribe a more concrete course of action than open data precisely because the markets in which huge technology companies operate remain so poorly understood.

Regulation for the future 

The next generation of antitrust law must consider the anti-competitive behavior that makes it possible to build audiences and monopolize the attention of billions without meaningful competition. As Congress debates how to apply antitrust law to the current giants of Silicon Valley, what is at stake is more than the fate of a few large companies. The rapid expansion of online platforms into almost all parts of American life has made their missteps both hugely consequential and highly visible. Poor policy, design, or execution at a given platform can destabilize large swaths of society, influencing economies, local and national politics, and communities. For that reason alone, antitrust legislation must avoid rote, narrow regulations that could turn the tech industry into one where only the largest can succeed.

The need for nuanced regulation and legislation could not be clearer. We hope our recommendations can be a starting point for moving beyond the compliance thinking that has permeated both industry and Capitol Hill. To actually serve the public interest, companies must go beyond the letter of the law and the regulatory checkbox.

David Morar is a visiting scholar with the Digital Trade and Data Governance Hub at the Elliott School of International Affairs and a fellow at the Digital Interests Lab.

Anne Washington is an assistant professor of data policy at New York University.

Facebook, Google, and Microsoft provide financial support to The Brookings Institution, a nonprofit organization devoted to rigorous, independent, in-depth public policy research. 
