
Framing a privacy right: Legislative findings for federal privacy legislation

INTRODUCTION

Since debate on privacy legislation began in earnest following the 2018 midterm elections, members of Congress have released over 20 comprehensive information privacy bills or drafts. Most of these bills do not include proposed legislative findings or statements of policy that explain the overarching foundations of the legislation. Such declarations are primarily rhetorical rather than operational, but this rhetoric can serve several important functions. In this light, as the House and Senate and a new presidential administration prepare for the 117th Congress, we are proposing a set of legislative findings for federal privacy legislation as a template for the coming debate.

A handful of bills, like Sen. Kirsten Gillibrand’s (D-N.Y.) Data Protection Act and Sen. Sherrod Brown’s (D-Ohio) Data Accountability and Transparency Act, include statements of legislative findings or policy—but the majority do not. Most notably, legislative findings are absent from the bills that are the most likely starting points for legislation in 2021: Sen. Maria Cantwell’s (D-Wash.) Consumer Online Privacy Rights Act, Sen. Roger Wicker’s (R-Miss.) SAFE DATA Act, and the House Energy and Commerce “bipartisan staff draft” (although the latter does include a bracketed placeholder).

These omissions are understandable. The core provisions of privacy legislation present plenty of challenging and contested issues that we explored in our report “Bridging the gaps: A path forward to federal privacy legislation” last June. That report analyzed the Wicker and Cantwell bills in particular and suggested ways to reconcile differences, including on the polarized issues of preemption and private right of action. To make our recommendations concrete, we also drafted complete legislative text. Complete, that is, except for legislative findings. Like the House bipartisan staff draft, our proposed legislative text included only a placeholder. The “Bridging the gaps” report commented briefly on the significance of findings and referred to “an ongoing project” on the content of such findings. Here, we present legislative findings and policy statements as the end product of that effort. They seek to provide an overarching foundation for our June report and to fill in the placeholders in the draft privacy legislation. We hope to motivate further discussion about articulating the foundational principles and aims of privacy legislation.

“Legislative findings present an argument for a bill that can help build congressional and public support.”

Legislative findings present an argument for a bill that can help build congressional and public support. More significantly, they build a record of congressional intent that can guide interpretation by courts, administrative agencies, and affected parties, as well as enunciate grounds to uphold legislation against constitutional challenges. Such a record will be significant for the inevitable challenges to privacy legislation, especially on First Amendment commercial speech protections or Article III standing questions. As Congress continues to debate privacy legislation going into 2021, legislative findings will provide an opportunity to make a statement about the significance of privacy—one that can not only inform judges, regulators, and lawyers applying a privacy law, but can also declare American values to the world.

What do legislative findings accomplish?

Writing in the University of Chicago Law Review, Brigham Young University law professor Jarrod Shobe conducted a rare and thorough study of the use and legal significance of legislative findings and purposes in enacted statutes. In a review of 30 years of legislation appearing in the Statutes at Large, the chronological collection of laws passed by Congress, Shobe finds that the majority of “significant bills” included findings. Nevertheless, he points out, courts and legal academics have given them little attention. This inattention is a mistake, he argues, because “[e]nacted findings and purposes are law just like any other law, and there is no reason why they should not be given the full weight of the law.”

Shobe points to institutional reasons why legislative findings are underestimated: they are often published only in the Statutes at Large, stripped out when legislation is codified in the United States Code, and discouraged by the House and Senate Offices of Legislative Counsel. In short, their omission from the Code does not indicate a lack of legal import, but rather reflects a mere matter of practice by the Office of the Law Revision Counsel, a nonpartisan congressional office charged by statute with “[c]lassifying newly enacted provisions of law to their proper positions in the Code.” If Congress wants its text to matter, perhaps it should change this practice of relegating enacted findings and purpose statements to either hard-to-find statutory notes or the Statutes at Large.

Professor Shobe’s groundbreaking argument makes sense. The late Supreme Court Justice Antonin Scalia’s most widely accepted contribution to American law has been establishing textualism as the dominant mode of statutory interpretation (not to be confused with a second and more debated Scalia contribution, originalism in constitutional interpretation). From his first term on the Supreme Court, Scalia waged a battle against the use of legislative history to interpret statutes, dismissing statements of sponsors and even committee reports as irrelevant to the meaning of the text and a dubious reflection of the intent of Congress as a whole. As a result, as Jonathan Siegel puts it, “[w]e are all textualists now compared with the 1960s and 1970s.” As Shobe points out, legislative findings are a different order of congressional statement. “The text is the law,” Scalia argued—and legislative findings adopted by Congress are an integral part of that text, enacted by the full Congress the same as the other parts of a bill.

The need for legislative findings in privacy legislation

Legislative findings will matter in privacy legislation. Privacy legislation addresses the use of personal information, and any legislation that regulates information may implicate the First Amendment. The Supreme Court has extended First Amendment protection to encompass commercial speech, in particular advertising and marketing, albeit under a more lenient standard than for non-commercial speech. Under this standard, any government restrictions on commercial speech must “directly advance” a “substantial” government interest, whereas restrictions on other speech must be “narrowly tailored” to interests that are “compelling.” Commercial speech cases raise questions about how privacy legislation may limit businesses that share personal information with third parties, track and profile online users for advertising and marketing, serve ads based on personal information, and collect and disseminate information that may be considered public. Inevitably, some of these issues will become judicial questions.

In particular, the Supreme Court’s 2011 decision in Sorrell v. IMS Health may increase the likelihood that limitations on advertising and marketing resulting from privacy legislation will be challenged on First Amendment grounds. Sorrell arose from a Vermont law that prohibited pharmacies from disclosing information linking doctors with drug prescriptions to “data miners,” and barred pharmaceutical manufacturers from using such information to contact doctors for marketing purposes. The Court saw the statute as discriminating against pharmaceutical manufacturers and marketers, because others were allowed to use the same prescription information for non-marketing purposes. It therefore struck down the statute under the stricter standard that applies to discrimination in expression (though it said the same result would follow under the more lenient commercial speech standard).

This narrow rationale was complicated by the statute’s confusing history—leaving the impact of the Court’s decision on privacy limits open to interpretation. Notably, the Court contrasted Vermont’s narrowly-targeted law with the broader protection of the Health Insurance Portability and Accountability Act (HIPAA) and suggested that a HIPAA-like statute “would present quite a different case than the one presented here.” Even so, the potential impact of privacy legislation on commercial speech has been raised in the privacy debate, and challengers would likely cite Sorrell as support for their claims. Indeed, a Maine law establishing privacy requirements specifically for broadband providers is being challenged on the theory that it discriminates among First Amendment speakers, and facial-recognition systems provider Clearview AI is defending a suit under Illinois’s Biometric Information Privacy Act on similar grounds. Legislative findings will help to meet the Supreme Court’s effective guidance in Sorrell—that Congress should explain the broad societal goals that baseline privacy legislation seeks to address.

Privacy legislation also needs to address the Supreme Court’s 2016 decision on standing in Spokeo v. Robins. This class action case was brought under the first federal privacy statute—the Fair Credit Reporting Act of 1970—by a plaintiff who alleged that a “people search engine” used to assess individual credit contained inaccuracies about him, and who sought damages. The Court sent the case back to the lower courts to determine whether allegations of intangible harm were “particularized” and “concrete” enough to meet the requirement of a case or controversy to sue in federal court under Article III of the Constitution. In discussing these requirements, the Court noted that injury to rights like free speech and free exercise of religion can be concrete for these purposes even though such injuries are intangible. Although the Court ruled that not every inaccuracy or procedural violation under FCRA amounts to concrete harm, it acknowledged that when considering “whether an intangible harm constitutes an injury in fact, both history and the judgment of Congress are instructive.” The Spokeo decision specifically recognized that “Congress is well positioned to identify intangible harms that meet minimum Article III requirements …” This directly invites Congress to identify and articulate privacy harms.

“Legislative findings in privacy legislation will amount to a congressional brief to the Supreme Court.”

In the context of issues like these, legislative findings in privacy legislation will amount to a congressional brief to the Supreme Court that can articulate “substantial” or “compelling” interests—and how the statute directly advances these interests as well as limits harms that privacy violations can cause to individuals. With the force of law, such a brief would be far more persuasive than any “appellate counsel’s post-hoc rationalizations for agency action.”

Our proposed legislative findings

With these considerations in mind, we have drafted a set of legislative findings and policy conclusions to anticipate and address these likely challenges. Our proposal is intended to inform interpretation by courts, the Federal Trade Commission, and the many other parties that would apply privacy legislation and need to understand congressional intent. These proposed findings are more expansive and detailed than what is typical in most legislation (though far more concise than the 173 recitals and 31 pages that explain the European Union’s General Data Protection Regulation). Unlike Members of Congress, however, we do not seek to win the votes of a majority of colleagues or to declare a set of conclusions. Instead, as think tank scholars, our goal is to provide a comprehensive outline for Congress and stakeholders to consider, and arguments to back up the recommendations in our June report. We leave it to Congress to distill such ideas into punchy declarations about enacted legislation as the privacy debate builds on current bills in the next Congress.

In drafting these suggested findings, we have drawn wherever possible on existing declarations in enacted legislation. Endnotes to the suggested language indicate these and other sources of the language.

The findings below fall into five segments. We begin with a brief statement of the legal, moral, and historical foundations of privacy in America, demonstrating that privacy is a value deeply embedded in American law and society and describing some of the history that the Supreme Court alluded to in Spokeo. Second, we cover recent technology developments that underlie the need for legislation, especially the explosion of data and widespread collection and sharing of personal information, as background for the legislative purposes that follow.

“Privacy is a value deeply embedded in American law and society.”

The third and fourth segments lay out these purposes. The third identifies effects of these developments that privacy legislation aims to address and the fourth explains how it aims to address them. Finally, we conclude with a set of policy declarations that express key governmental objectives (which may be somewhat broader than the findings and specific provisions of the legislation), as Congress did in both the Fair Credit Reporting Act of 1970 (FCRA) and “Section 230” of the Communications Act of 1934, as amended (47 U.S.C. § 230). FCRA was a foundational national privacy law not only in the United States but around the world, and Section 230 is aimed at some of the same sectors most affected by privacy legislation. The proposed findings address reasons for the potential compromises framed in our “Bridging the gaps” report: tailored federal preemption, individual rights aimed at recognized privacy harms, and a graduated approach to risk and obligations centered on duties of loyalty and care that balance prescription and flexibility.

PROPOSED FINDINGS

The legal, moral, and historical foundations of privacy in America

  • The right to privacy is a personal and fundamental right protected by the Constitution of the United States.
  • Americans cherish privacy as an essential element of their personal and social lives, and our system of self-government. It serves essential human needs by sheltering zones for individual liberty, autonomy, seclusion, and self-definition, including the exercise of free expression; for family life, intimacy and other relationships; and for physical and moral space and security, among other values.
  • Privacy also advances societal interests in the protection of marginalized or vulnerable individuals or groups, the safeguarding of foundational values of democracy, and the integrity of democratic institutions and processes including elections.
  • The United States has protected aspects of privacy since the Nation’s founding. The Constitution protects various privacy interests through the First, Third, Fourth, Fifth, Ninth, and Fourteenth Amendments, and protection of individual privacy helps enable the exercise of these fundamental civil rights and fundamental freedoms of all Americans.
  • The United States has a history of leadership in privacy rights since that time. It enacted some of the first privacy laws anywhere beginning in the 18th century, it gave birth to the legal concept of a “right to privacy” in the 19th century and, in the 20th century, it adopted one of the first national privacy and data protection laws as well as “fair information practice principles” that influenced laws and privacy practices worldwide. The United States should continue to be a leader in protecting privacy rights in the 21st century.
  • The right to privacy is widely recognized in international legal instruments that the United States has endorsed, ratified, or promoted.

The development of a digital information society and economy

  • Throughout the Nation’s history, economic growth, opportunity, and leadership have been propelled by technological innovations. Since the late 20th century, digital and communications technologies and networks have become integral to economic competitiveness, social and political discourse, and the flow of information, ideas, and innovation in the United States and around the world.
  • The expansion of computers, Internet connectivity, mobile telephones, and other digital information and communications technology has magnified the risks to individuals’ privacy that can occur from collection, processing, storage, or dissemination of personal information.
  • Digital network connectivity has become essential for full engagement in modern life.
  • As of 2019, more than 90 percent of Americans possess mobile telephones and approximately 80 percent own smartphones equipped with powerful computers, immense storage capacity, arrays of sensors, and the capacity to transmit information around the globe instantaneously. Many individuals use these devices continuously and store on them a digital record of nearly every aspect of their lives.
  • An increasing number of individuals have smart consumer devices such as automobiles, televisions, home appliances, and wearable accessories that collect, process, and transmit information linked to these individuals and their activities.
  • In addition to these personal devices, a growing number of interconnected sensors in public spaces collect, process, and transmit personal information linked or linkable to individuals, often without their knowledge or control. The number of such devices is likely to expand faster with increased deployment of smart public and private infrastructure and systems and advances in network technology.
  • These ubiquitous and always-connected devices have exploded the volume and variety of personal information collected, stored, and analyzed by a wide variety of entities. Such information is often available not only to service providers with which the individuals affected have some relationship, but also to networks of applications providers, websites, advertisers, data brokers, and additional parties that are able to collect, process, and transmit the information for purposes that may be unexpected and unrelated to the reason for which this information originally was shared or collected.
  • The aggregation of personal information from many different sources across these networks, coupled with the increasing power of data science, enables a wide variety of entities to make connections, inferences, or predictions regarding individuals with levels of power and granularity far beyond what the individuals linked to this information reasonably know or expect. These include the ability to link information to specific individuals even in the absence of explicit identifying information, and to derive conclusions about individuals that a reasonable person would consider sensitive.

The impact of information technology on individuals

  • Surveys demonstrate that most individuals do not read or understand published privacy policies.
  • Even if they do, the increased velocity, complexity, and opacity of data collection, aggregation, and use have rendered individual control or consent a futile exercise.
  • Numerous surveys of consumer attitudes on privacy and security also indicate that a majority of Americans lack confidence in industry to handle personal information and keep it secure, and also believe they lack control and knowledge of information collected about them.
  • Some use of personal information in advertising and marketing provides benefits to businesses and consumers by disseminating information about products, services, and public issues; supporting the delivery of news and other content; and enabling free services. However, increases in precise targeting of individuals and automated advertising exchanges have enabled sharing of personal information for advertising that can be unwanted, intrusive, manipulative, discriminatory, or unfair.
  • With the development of artificial intelligence and machine learning, the potential to use personal information in ways that replicate existing societal biases has increased in scale. Algorithms use personal information to guide decisionmaking related to critical issues—such as credit determination, housing advertisements, and hiring processes—and can result in differing accuracy rates among demographic groups. Such outcomes may violate federal and state anti-discrimination laws or result in diminished opportunities for members of some groups. The covered entities that use these algorithms should have the responsibility to show that the algorithms do not cause discriminatory effects.
  • The majority of Americans have experienced losses of personal information linked to them due to data breaches that have occurred at numerous businesses and institutions. Personal information increasingly is a target of malicious actors, including nation-states and organized criminals anywhere in the world.
  • The aggregation of increasing volumes of data among many different entities expands the attack surface exposed to malicious actors in cyberspace and the availability of personal information to such actors.
  • The risks of harm from privacy violations are significant. Unwanted or unexpected disclosure of personal information and loss of privacy can have devastating effects for individuals, including financial fraud and loss, identity theft and the resulting loss of personal time and money, destruction of property, harassment, and even potential physical injury. Other effects such as reputational or emotional damage can be equally or even more substantial.
  • Individuals need to feel confident that data that relates to them will not be used or shared in ways that harm themselves, their families, or society.
  • As with all forms of commerce, trust is an essential element for broad consumer use and acceptance of goods and services offered in the digital economy, and a growing lack of trust in online services harms the interstate and foreign commerce of the United States. Trust is also important to social and political discourse, and a growing lack of trust in online communications impairs American democracy and society.

The need for federal privacy legislation

  • As enterprises use technology to collect, retain, and process more and more personal information, laws and regulations protecting individual privacy must keep pace to protect users and businesses and sustain the Nation’s digital economy and society.
  • Current laws and regulations governing the use of personal information do not sufficiently protect individual privacy because they do not cover many new and expanding types of information and uses of such information.
  • In addition, they rely substantially on “notice and choice” for individuals. This places the burden of protecting privacy on individuals instead of on the companies that use and collect data, and permits the companies to set the boundaries for what information they collect and how they use or share it, with little meaningful understanding on the part of the individuals whose data is collected.
  • Entities that collect, use, process, and share personal information should be subject to meaningful and effective boundaries on such activities. They should be obligated to take reasonable steps to protect the privacy and security of personal information, and to act with loyalty and care toward individuals linked or linkable to such information.
  • Privacy risk and harms must be mitigated and addressed up front, because in the digital era, data harms are often unforeseen and compounded almost instantaneously. Information leakage usually cannot be undone, and it is often difficult to make victims of privacy harms whole after the fact.
  • There is a need for a national solution to ensure that entities that collect, process, and transmit personal information do so in ways that respect the privacy interests of individuals linked or linkable to that information and do not cause harm to these individuals or their families and communities.
  • States have a patchwork of differing laws and jurisprudence relating to the privacy of their citizens. A robust and comprehensive federal privacy law will ensure that all Americans have the benefit of the same privacy protections regardless of where they live and can rely on the entities they deal with to handle personal information consistently regardless of where these entities are located.
  • The need for consistent federal privacy protection is heightened by the interstate and global nature of the information economy in which few online products and services are targeted toward specific states. Instead, many such services are offered to users anywhere in the United States (and often around the world) who can access the Internet. Consistent federal privacy protection will facilitate entry and competition in interstate commerce for millions of small businesses for which compliance with multiple state laws could present barriers to entry.
  • Consistent and robust data security practices will enhance the privacy of individuals as well as the collective security of U.S. information and communications networks.
  • Privacy laws must be backed by strong enforcement agencies and tools. To provide such enforcement, the Federal Trade Commission needs adequate resources and legal authority at least equal to that of other leading privacy regulators, reinforced by authorized state officials.
  • Individuals should have recourse through the federal courts for privacy harms that have been commonly compensable under existing laws, including anti-discrimination laws, as well as violations of federal privacy law that cause “actual” harm.
  • Technology will continue to evolve and change. Any new privacy laws therefore must be flexible and technology-neutral, so that the laws’ protections may apply not only to the technologies and products of today, but to those of tomorrow.
  • A comprehensive federal privacy law will enable the United States to take steps toward ensuring that Americans’ privacy is appropriately protected internationally, while increasing the flow of information and promoting greater trust in American commerce abroad.

“Any new privacy laws must be flexible and technology-neutral, so that the laws’ protections may apply not only to the technologies and products of today, but to those of tomorrow.”

PROPOSED POLICY STATEMENTS

  • In order to protect the privacy of individuals, it is necessary and proper for Congress to regulate the collection, use, processing, and sharing of personal information.
  • There is a compelling national interest in providing meaningful and effective boundaries on the collection, use, storage, and sharing of personal information so all individuals linked or linkable to such information have a basis to trust that such information will be handled in ways consistent with their privacy and other interests.
  • There is a compelling national interest in empowering individuals through meaningful and effective rights with respect to personal information linked to them so that those individuals who want to can ensure this information is used and shared in ways consistent with their privacy and other interests.
  • It is the policy of the United States to provide a consistent national approach to the collection, processing, storage, and sharing of personal information, but also to preserve the existing fabric of state and local statutory and common law protecting privacy to the extent it does not interfere with the comprehensive operation of federal law.
  • It is the policy of the United States to provide individuals with meaningful remedies for privacy harms, whether those harms are financial, physical, reputational, emotional, or other kinds; and to ensure that an exclusive federal remedy for violation of privacy rights vindicates interests that have long been protected by other privacy laws.
  • It is the policy of the United States to ensure that protections for users’ privacy can remain up-to-date, and continue to evolve as technology, innovation, and services—and risks to privacy—evolve.


Footnotes
    1. Privacy Act of 1974, 5 U.S.C. § 552a; Data Accountability and Transparency Act of 2020, Discussion Draft, 116th Cong. (2020), Section 2(1) (“Congress finds that—privacy is a fundamental individual right protected by the Constitution of the United States”); Data Protection Act of 2020, S. ___, 116th Cong. (2020), Section 2(a)(1) (“Congress finds the following: Privacy is an important fundamental individual right protected by the Constitution of the United States.”)
    2. See Sec. 3(a) of the Obama Administration’s Consumer Privacy Bill of Rights Act of 2015 Discussion Draft: “Americans cherish privacy as an element of their individual freedom,” https://obamawhitehouse.archives.gov/sites/default/files/omb/legislative/letters/cpbr-act-of-2015-discussion-draft.pdf.
    3. Kyllo v. United States, 533 U.S. 27 (2001) (“Where, as here, the Government uses a device that is not in general public use, to explore details of the home that would previously have been unknowable without physical intrusion, the surveillance is a ‘search’ and is presumptively unreasonable without a warrant”); Collins v. Virginia, No. 16-1027, 584 U.S. ___, 138 S.Ct. 1663 (2018) (“The protection afforded the curtilage is essentially a protection of families and personal privacy in an area intimately linked to the home, both physically and psychologically, where privacy expectations are most heightened” (quoting California v. Ciraolo, 476 U.S. 207, 212–213 (1986))).
    4. Data Accountability and Transparency Act of 2020, Discussion Draft, 116th Cong. (2020), Section 2(4), (“[P]rivacy protections not only protect and benefit the individual, but also advance other societal interests, including—the protection of marginalized and vulnerable groups of individuals”); Data Protection Act of 2020, S. ___, 116th Cong. (2020), Section 2(a)(1) (“Privacy protections not only protect and benefit the individual, but they also advance other societal interests, including the protection of marginalized and vulnerable groups of individuals.”)
    5. Boyd v. United States, 116 U.S. 616, 630 (1886) (Fourth Amendment principles “apply to all invasions on the part of the government and its employees of the sanctity of a man’s home and the privacies of life”); Olmstead v. United States, 277 U.S. 438, 478 (1928) (Brandeis, J., dissenting) (“The makers of our Constitution undertook to secure conditions favorable to the pursuit of happiness. They recognized the significance of man’s spiritual nature, of his feelings and of his intellect. They knew that only a part of the pain, pleasure and satisfactions of life are to be found in material things. They sought to protect Americans in their beliefs, their thoughts, their emotions and their sensations. They conferred, as against the Government, the right to be let alone—the most comprehensive of rights and the right most valued by civilized man”); Feldman v. United States, 322 U.S. 487, 489-490 (1944) (“We are immediately concerned with the Fourth and Fifth Amendments, intertwined as they are and expressing as they do supplementing phases of the same constitutional purpose—to maintain inviolate large areas of personal privacy”); Wolf v. Colorado, 338 U.S. 25, 27-28 (1949) (“The security of one’s privacy against arbitrary intrusion by the police—which is at the core of the Fourth Amendment—is basic to a free society. It is therefore implicit in ‘the concept of ordered liberty,’ and, as such, enforceable against the States through the Due Process Clause”); Griswold v. Connecticut, 381 U.S. 479, 484 (1965) (“Various guarantees create zones of privacy. The right of association contained in the penumbra of the First Amendment is one, as we have seen. The Third Amendment, in its prohibition against the quartering of soldiers ‘in any house’ in time of peace without the consent of the owner, is another facet of that privacy. The Fourth Amendment explicitly affirms the ‘right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures.’ The Fifth Amendment, in its Self-Incrimination Clause, enables the citizen to create a zone of privacy which government may not force him to surrender to his detriment. The Ninth Amendment provides: ‘The enumeration in the Constitution, of certain rights, shall not be construed to deny or disparage others retained by the people’”); Stanley v. Georgia, 394 U.S. 557, 564, 566, 567 (1969) (“For also fundamental is the right to be free, except in very limited circumstances, from unwanted governmental intrusions into one’s privacy. … Whatever the power of the state to control public dissemination of ideas inimical to the public morality, it cannot constitutionally premise legislation on the desirability of controlling a person’s private thoughts …”); McIntyre v. Ohio Elections Commission, 514 U.S. 334, 357 (1995) (“Under our Constitution, anonymous pamphleteering is not a pernicious, fraudulent practice, but an honorable tradition of advocacy and of dissent. Anonymity is a shield from the tyranny of the majority”); Carpenter v. United States, 585 U.S. __, __ (2018) (“[W]hat [one] seeks to preserve as private, even in an area accessible to the public, may be constitutionally protected” (quoting Katz v. United States, 389 U.S. 347, 351–352 (1967))).
    6. See Olmstead, supra (Brandeis, J. dissenting) (privacy is “the most comprehensive of rights and the right most valued by civilized man”).
    7. In establishing the United States Postal Service, the Postal Service Act of 1792 prohibited postal officials from reading the contents of letters. See 1 Stat. 354, https://www.cantwell.senate.gov/news/press-releases/cantwell-senate-democrats-unveil-strong-online-privacy-rights; The Fair Credit Reporting Act of 1970 was among the first modern national privacy and data protection laws. 15 U.S.C. § 1681.
    8. Samuel D. Warren and Louis D. Brandeis, “The Right to Privacy,” Harvard Law Review, Vol. 4, No. 5 (Dec. 15, 1890), pp. 193-220, https://www.jstor.org/stable/1321160?seq=1#metadata_info_tab_contents.
    9. Health: Health Insurance Portability and Accountability Act of 1996, Pub. L. 104-191, 110 Stat. 1936; Health Information Technology for Economic and Clinical Health Act (“HITECH”), Pub. L. 111-5, codified at 42 U.S.C. §§ 17901 et seq.; Genetic information: Genetic Information Nondiscrimination Act of 2008, Pub. L. 110-233, codified at 42 U.S.C. Chapter 21F; Financial records: Fair Credit Reporting Act of 1970, Pub. L. 91-508 as amended, codified at 15 U.S.C. §§ 1681 et seq.; Financial Services Modernization Act of 1999 (Gramm-Leach-Bliley Act), Pub. L. 106-102, codified at 15 U.S.C. §§ 6801 et seq., 6821 et seq.; Government records: Driver’s Privacy Protection Act, 18 U.S.C. §§ 2721 et seq.; 42 U.S.C. § 405(c)(2)(C)(vi)(II) (States “may not display a social security account number … on any driver’s license, motor vehicle registration, or personal identification card” or code the number by any “means of communication which conveys such number …”); Telemarketing and e-mail communications: Do-Not-Call Implementation Act, 15 U.S.C. §§ 6101-02; CAN-SPAM Act of 2003, 15 U.S.C. §§ 7701 et seq.
    10. See “Records, Computers, and the Rights of Citizens,” U.S. Department of Health, Education, and Welfare, Advisory Committee on Automated Personal Data Systems, 1973. Also see: Robert Gellman, “Fair information practices: A basic history – Version 2.19,” October 7, 2019, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2415020.
    11. Data Accountability and Transparency Act of 2020, Discussion Draft, 116th Cong. (2020), Section 2(2) (“[T]he right of privacy is widely recognized in international legal instruments that the United States has endorsed, ratified, or promoted”); Data Protection Act of 2020, S. __, 116th Cong. (2020), Section 2(a)(1) (“[T]he right of privacy is widely recognized in international legal instruments that the United States has endorsed, ratified, or promoted.”); Universal Declaration of Human Rights, Article 12, adopted by the United Nations General Assembly on December 10, 1948, https://www.un.org/en/universal-declaration-human-rights/; International Covenant on Civil and Political Rights, Article 17, adopted by the United Nations on December 16, 1966, and ratified by the United States in 1992, https://treaties.un.org/doc/Publication/UNTS/Volume%20999/volume-999-I-14668-English.pdf.
    12. Communications Act of 1934, as amended in 1996 (47 U.S.C. § 230): “The rapidly developing array of Internet and other interactive computer services available to individual Americans represent an extraordinary advance in the availability of educational and informational resources to our citizens.”; “The Internet and other interactive computer services offer a forum for a true diversity of political discourse, unique opportunities for cultural development, and myriad avenues for intellectual activity.”; “The Internet and other interactive computer services have flourished, to the benefit of all Americans, with a minimum of government regulation.”; “Increasingly Americans are relying on interactive media for a variety of political, educational, cultural, and entertainment services.”
    13. Privacy Act of 1974, 5 U.S.C. § 552a (“The increasing use of computers and sophisticated information technology, while essential to the efficient operations of the Government, has greatly magnified the harm to individual privacy that can occur from any collection, maintenance, use, or dissemination of personal information.”).
    14.  “Mobile fact sheet,” Pew Research Center, June 12, 2019, www.pewresearch.org/internet/fact-sheet/mobile/.
    15. Riley v. California, 573 U.S. 373 (2014) (“These cases require us to decide how the search incident to arrest doctrine applies to modern cell phones, which are now such a pervasive and insistent part of daily life that the proverbial visitor from Mars might conclude they were an important feature of human anatomy.… Modern cell phones are not just another technological convenience. With all they contain and all they may reveal, they hold for many Americans ‘the privacies of life.’”).
    16. Aaron Smith, “Half of online Americans don’t know what a privacy policy is,” Pew Research Center, December 4, 2014, https://www.pewresearch.org/fact-tank/2014/12/04/half-of-americans-dont-know-what-a-privacy-policy-is/.
    17. Brooke Auxier et al., “Americans and privacy: Concerned, confused, and feeling lack of control over their personal information,” Pew Research Center, November 15, 2019, https://www.pewresearch.org/internet/2019/11/15/americans-and-privacy-concerned-confused-and-feeling-lack-of-control-over-their-personal-information/; Rafi Goldberg, “Lack of trust in internet privacy and security may deter economic and other online activities,” NTIA, May 13, 2016, https://www.ntia.gov/blog/2016/lack-trust-internet-privacy-and-security-may-deter-economic-and-other-online-activities; see also “Consumer intelligence series: Protect.me,” PwC, 2017, https://www.pwc.com/us/en/advisory-services/publications/consumer-intelligence-series/protect-me/cis-protect-me-findings.pdf.
    18. Danielle Keats Citron and Frank A. Pasquale, “The scored society: Due process for automated predictions,” Washington Law Review, Vol. 89, 2014, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2376209.
    19. Aaron Smith, “Americans and cybersecurity,” Pew Research Center, January 26, 2017, https://www.pewresearch.org/internet/2017/01/26/americans-and-cybersecurity/.
    20. Maureen K. Ohlhausen, “Painting the Privacy Landscape: Informational Injury in FTC Privacy and Data Security Cases,” September 19, 2017, https://www.ftc.gov/public-statements/2017/09/painting-privacy-landscape-informational-injury-ftc-privacy-data-security.
    21. See “Intel drafts model legislation to spur data privacy discussion,” November 8, 2018, https://newsroom.intel.com/news/intel-drafts-model-legislation-spur-data-privacy-discussion/.
    22. Fair Credit Reporting Act of 1970, 15 U.S.C. § 1681 (“The banking system is dependent upon fair and accurate credit reporting. Inaccurate credit reports directly impair the efficiency of the banking system, and unfair credit reporting methods undermine the public confidence which is essential to the continued functioning of the banking system.”).
    23. Fair Credit Reporting Act of 1970, 15 U.S.C. § 1681 (“There is a need to ensure that consumer reporting agencies exercise their grave responsibilities with fairness, impartiality, and a respect for the consumer’s right to privacy.”).
    24. Children’s Online Privacy Protection Act of 1998, 15 U.S.C. §§ 6501, et seq. (“To date, while industry has developed innovative ways to help parents and educators restrict material that is harmful to minors through parental control and self-regulation, such efforts have not provided a national solution ….”).
    25. CAN-SPAM Act of 2003, 15 U.S.C. § 7701 (“Many States have enacted legislation intended to regulate or reduce unsolicited commercial electronic mail, but these statutes impose different standards and requirements. As a result, they do not appear to have been successful in addressing the problems associated with unsolicited commercial electronic mail, in part because, since an electronic mail address does not specify a geographic location, it can be extremely difficult for law-abiding businesses to know with which of these disparate statutes they are required to comply.”).
    26. See “FTC report on resources used and needed for protecting consumer privacy and security,” Federal Trade Commission, 2020, https://www.ftc.gov/system/files/documents/reports/reports-response-senate-appropriations-committee-report-116-111-ftcs-use-its-authorities-resources/p065404reportresourcesprivacydatasecurity.pdf. The Commission has about 40 Full Time Equivalent employees dedicated to data protection—in comparison, the United Kingdom’s Information Commissioner’s office employs around 700 individuals and the Irish Data Protection Commissioner employs around 18.
    27. Data Accountability and Transparency Act of 2020, Discussion Draft, 116th Cong. (2020), Section 2(1) (“[I]n order to protect the privacy of individuals, groups of individuals, and support society, it is necessary and proper for Congress to regulate the collection, maintenance, use, processing, storage, and dissemination of information”); Data Protection Act of 2020, S. ___, 116th Cong. (2020), Section 2(a)(1) (“In order to protect the privacy of individuals, it is necessary and proper for Congress to regulate the collection, maintenance, use, processing, storage, and dissemination of information.”).