FTC Launches Commercial Surveillance and Data Security Rulemaking, Holds a Public Forum, and Seeks Public Input
Client Alert | September 27, 2022
On August 11, 2022, the Federal Trade Commission (the “FTC” or the “Commission”) launched one of the most ambitious rulemaking processes in agency history with its 3-2 vote to issue an Advance Notice of Proposed Rulemaking (“ANPR”) on “commercial surveillance” and data security.[1] On September 8, the Commission continued the rulemaking process by hosting a virtual “Commercial Surveillance and Data Security Public Forum” (the “Public Forum”) to gather public feedback on the proposed rulemaking.[2]
As explained in more detail in our prior article, the ANPR lays out a sweeping project to rethink the regulatory landscape governing nearly every facet of the U.S. internet economy, from advertising to anti-discrimination law, and even to labor relations. Any entity that uses the internet, even for internal purposes, is likely to be affected by this FTC action.
FTC Rulemaking Process
The FTC is undertaking this rulemaking under Section 18 of the FTC Act (also known as “Magnuson-Moss”), a hybrid rulemaking process that goes beyond the Administrative Procedure Act’s standard notice-and-comment procedures.[3] The FTC may promulgate a trade regulation rule to define acts or practices as unfair or deceptive “only where it has reason to believe that the unfair or deceptive acts or practices which are the subject of the proposed rulemaking are prevalent.” 15 U.S.C. § 57a(b)(3) (emphasis added). The FTC may make a determination that unfair or deceptive acts or practices are prevalent only if: “(A) it has issued cease and desist orders regarding such acts or practices, or (B) any other information available to the Commission indicates a widespread pattern of unfair or deceptive acts or practices.” 15 U.S.C. § 57a. That means that the agency must show (1) the prevalence of the practices, (2) how they are unfair or deceptive, and (3) the economic effect of the rule, including on small businesses and consumers.
Since the FTC published the ANPR, the Commission has posted 123 comments received thus far.[4] The Commission will continue to accept public comments until October 21. After the FTC reviews the comments, the next step in the Magnuson-Moss rulemaking process would be to publish a Notice of Proposed Rulemaking (“NPR”), which would set forth the proposed rule text, a description of the reasons supporting the proposed rule, any alternatives, and a preliminary regulatory analysis assessing the costs and benefits of the proposal and the alternatives. The proposal would be submitted to Congress 30 days before public issuance. The FTC would then be required to provide a public comment period after the issuance of the NPR and to give interested parties an opportunity for an informal hearing to present their views and resolve disputed factual issues. Finally, the FTC would publish its Final Rule, accompanied by a Statement of Basis and Purpose detailing the prevalence of the practices being regulated, how they are unfair or deceptive, and the economic effect of the rule, including an assessment of the rule’s costs and benefits and why it was chosen over alternatives. Any person could then seek review of the rule in the U.S. Court of Appeals for the D.C. Circuit within 60 days of promulgation. If an NPR is published, challenges are likely.
Commercial Surveillance and Data Security Public Forum
The September 8 Public Forum included (i) statements from Chair Lina M. Khan, Commissioners Rebecca Slaughter and Alvaro Bedoya, and the Commission’s Assistant General Counsel Josephine Liu; (ii) a panel of industry representatives; (iii) a panel of consumer advocates; and (iv) over 65 public commenters.
Key topics discussed during the Public Forum included data minimization, data security, algorithmic discrimination and ethical Artificial Intelligence (“AI”), and the protection of teenagers over 13 years old, among others.
Below are highlights from the sessions:
Commissioner Statements.
- Chair Lina Khan noted that the Public Forum would help inform whether the agency proceeds with the rulemaking process. She observed that the FTC has a long record of using its enforcement tools to combat commercial surveillance and “lax” data security practices where they are illegal, but that the FTC is “seeking to determine whether unfair or deceptive data practices may now be so prevalent that we need to move beyond case by case adjudication and instead have market wide rules.” She explained that the public record will be “critical” for the Commission to determine whether it has the evidentiary basis to proceed with rulemaking and can meet the legal requirements to craft those rules. Chair Khan also stated that these issues are “urgent” given companies’ ability to track and surveil individuals throughout their day-to-day lives, without transparency for the average consumer regarding the data collection and use, and without any real power for Americans to opt out of that surveillance.
- The Commission’s Assistant General Counsel Josephine Liu provided an overview of the rulemaking process, and in particular highlighted three of the questions from the ANPR on which the Commission most wants public input:
- Which practices used to surveil consumers are most prevalent? She explained that this question will help the FTC focus on particular areas of concern, both for enforcement purposes and for determining whether rulemaking will occur. To proceed with the rulemaking, the FTC must have reason to believe such surveillance practices are prevalent.
- How should the Commission identify and evaluate commercial surveillance harms or potential harms? Public input on this will help the FTC identify and address specific ways Americans are being harmed.
- Lastly, which areas or kinds of harm has the FTC failed to address through enforcement? Public input on this will provide the FTC with evidence about the areas in which it has less enforcement experience, and areas that rulemaking may better address.
- Commissioner Rebecca Slaughter remarked that she supports strong federal privacy legislation, but until it is passed, the Commission has a duty to act to address and investigate unlawful behavior. She encouraged industry representatives to engage with the Commission to ensure that the rules are effective and not merely a burdensome compliance exercise.
- Commissioner Alvaro Bedoya emphasized that the Commission is not just looking for a collection of “expert” opinions, but instead wants to hear from the public how it has been impacted by commercial surveillance and poor data security practices. He also noted that the ANPR goes beyond the conception of notice and choice, the usual “caricature” of American privacy law.
Commissioners Phillips and Wilson did not participate in the Public Forum.
Industry Representative Panel.
In addition to the Commissioners’ remarks, the FTC convened a panel of industry representatives moderated by Professor Olivier Sylvain, now Senior Advisor on Technology to Chair Khan. Professor Sylvain, whose academic work has focused on Section 230 of the Communications Decency Act, joined the FTC in 2021 from Fordham University where he served as Professor of Law.
Panelists included four senior executives and policy counsel from (1) a trade association for the digital content industry; (2) a web browser provider; (3) a retail trade association; and (4) a nonprofit coalition researching the use of artificial intelligence. Below are key themes from the industry panel:
- Context Matters. The panel’s key theme was that the Commission should calibrate any future rules to the different levels of risk presented by particular types of data collection and use. Specifically, several panelists emphasized the need for future regulations to treat first-party data collected and used by consumer-facing apps and websites differently from data collected by third parties for behavioral advertising. Panelists urged the Commission to take care not to craft regulations so broad that they inadvertently interfere with consumer freedoms and choices on the Internet.
- Shift Away From Behavioral Advertising. Relatedly, panelists emphasized the need to move away from behavioral advertising entirely and recommended shifting toward other advertising methods that rely on first-party data.
- Big Data. One panelist mentioned that the “terms” of data use are established by “just a few big companies,” and that special attention needs to be paid to the dominant companies in the industry, who can set the tone for how rules are interpreted and implemented.
- Best Practices. The Commission moderator asked panelists what “best practices” and business models have been developed to mitigate consumer harm and protect data. Responses included: (i) maintaining internal and public-facing documentation and benchmarking across the AI lifecycle; and (ii) implementing risk assessment processes and basic security controls, such as encryption in transit, strong access controls (e.g., multi-factor authentication and robust password requirements), and security awareness training (an illustrative technical sketch of these access controls follows this list).
- Global Insight. Panelists encouraged the FTC to review global legislation, such as the EU’s General Data Protection Regulation (“GDPR”) and the UK’s Children’s Code, for guidance on what has and has not worked globally.
- Protecting Teens Over 13. Protecting teens over 13 years old online, who have aged out of the protections of the Children’s Online Privacy Protection Act (“COPPA”), was another key theme. Panelists urged the Commission to ensure that any rules do not merely create child safety “theater.”
- Global Privacy Control/Single Opt-Out. Lastly, a key theme was implementing a browser setting, called a Global Privacy Control, that lets consumers communicate their privacy preferences to websites through a single opt-out, without having to manually reach out to, or make choices on, each website (a brief technical sketch of how a website might detect this signal also follows this list). Some panelists touted the Global Privacy Control as an important measure to protect privacy and choice. Others, however, worried that a single opt-out could frustrate consumer choice, and businesses’ efforts to serve customers, where consumers wish to specifically consent to data collection and use by particular businesses.
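To illustrate the basic access controls the industry panelists referenced, the following is a minimal, hypothetical TypeScript sketch of a password-policy check combined with a multi-factor authentication gate. The interface, function names, and thresholds are illustrative assumptions, not anything prescribed at the Public Forum or in the ANPR.

```typescript
// Hypothetical sketch of two basic access controls referenced by the industry panel:
// a password-strength policy and a multi-factor authentication (MFA) gate.
// All names and thresholds here are illustrative assumptions.

interface LoginAttempt {
  password: string;
  mfaVerified: boolean; // result of a separate MFA step (e.g., a TOTP check)
}

// Illustrative policy: minimum length plus basic character-class requirements.
function meetsPasswordPolicy(password: string): boolean {
  return (
    password.length >= 12 &&
    /[a-z]/.test(password) &&
    /[A-Z]/.test(password) &&
    /[0-9]/.test(password)
  );
}

// Both controls must pass before access is granted.
function allowLogin(attempt: LoginAttempt): boolean {
  return meetsPasswordPolicy(attempt.password) && attempt.mfaVerified;
}

// Example usage.
console.log(allowLogin({ password: "Correct-Horse-42", mfaVerified: true })); // true
console.log(allowLogin({ password: "short", mfaVerified: true }));            // false
```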
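As a rough illustration of how the Global Privacy Control operates in practice, the sketch below shows a plain Node.js HTTP server (in TypeScript) checking for the “Sec-GPC: 1” request header that GPC-enabled browsers send. The recordOptOut helper and the choice to treat the signal as a site-wide opt-out are assumptions made for illustration, not requirements drawn from the ANPR or the Public Forum.

```typescript
// Rough sketch: detecting the Global Privacy Control (GPC) browser signal.
// GPC-enabled browsers send a "Sec-GPC: 1" request header (and expose
// navigator.globalPrivacyControl to page scripts). The recordOptOut helper
// below is a hypothetical stand-in for a site's own preference store.
import { createServer, IncomingMessage, ServerResponse } from "http";

// Hypothetical helper: persist the visitor's opt-out preference.
function recordOptOut(visitorId: string): void {
  console.log(`Recorded opt-out of data sale/sharing for ${visitorId}`);
}

const server = createServer((req: IncomingMessage, res: ServerResponse) => {
  // Node lower-cases header names; a value of "1" indicates the opt-out signal.
  if (req.headers["sec-gpc"] === "1") {
    // Illustrative choice: treat the signal as a single, site-wide opt-out.
    recordOptOut(req.socket.remoteAddress ?? "unknown");
  }
  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end("ok");
});

server.listen(3000);
```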
Consumer Advocate Panel.
The Consumer Advocate Panel was moderated by Rashida Richardson, an Attorney Advisor to Chair Khan. This panel included members of non-profits and think tanks focused on consumer privacy and digital innovation. In general, the moderator’s questions assumed that data use has harmful impacts on consumers.
- Algorithmic Discrimination. Panelists urged the FTC to protect disadvantaged communities, claiming that targeted advertisements exacerbate barriers in housing and employment.
- Sensitive Information and Dark Patterns. The Supreme Court decision in Dobbs was discussed extensively. Panelists raised concerns about data brokers being able to sell consumer data to foreign governments, and about consumers allegedly being harmed by their inability to opt out of data collection and by companies selling sensitive health information.
- API Misuse. The panelists stated that unwanted observation – through a single Software Development Kit (“SDK”) that can be found in hundreds of apps – can lead to sensitive data being transferred across many companies without consent. Alleged associated harms include data breaches, misuse, unwanted secondary data uses, and inappropriate government access.
- Data Minimization and Targeted Advertisements. Data minimization, increased transparency, and regulation of third-party targeted advertisements were raised throughout the panel as potential avenues for FTC action in this area. However, one panelist highlighted that targeted advertisements can play a positive role in society, such as building community, mobilizing voters, and disseminating health information to the groups most likely to be affected. In this view, while data minimization is positive in theory, “color blindness” toward all data collection and use is not always the answer, as data can be used for good.
- Harm to Minors. Panelists raised the harms of targeted advertising to teens who allegedly cannot distinguish between commercial content and entertainment content online. A key recommendation was raising protections for minors beyond COPPA, in line with global trends, such as instituting a mechanism for teens to easily delete their online data.
- Consent Framework. Panelists generally expressed that, in their view, the concept of “notice and consent” is not a useful framework given the alleged power dynamics between consumers and those collecting their data online, and the purportedly asymmetric information provided to consumers when making those choices.
- Concepts Missing From the ANPR. In response to the moderator’s question on whether the ANPR was missing anything, panelists suggested that the FTC should: (i) explore enumerating a list of sensitive categories of data, and define how precise location data must be for its collection to count as “unfair”; (ii) promulgate rules regulating service provider relationships; (iii) set forth standards for data deidentification; and (iv) implement rules to prevent discrimination against marginalized communities, combined with strengthening the FTC’s civil rights expertise.
Public Commenters.
- The FTC presented an array of public commenters after the two panels. Commenters included individuals from organizations such as the U.S. Chamber of Commerce Technology Engagement Center, TechFreedom, the Centre for Information Policy Leadership, the Center for Democracy and Technology, Human Rights Watch, and the Electronic Privacy Information Center (“EPIC”). Some commenters expressed deep concern about what they viewed as a broad-based expansion of the FTC’s enforcement authority, while others argued that such an expansion is necessary given the privacy harms the public allegedly suffers.
- Industry participants emphasized that the FTC would be mandating economy-wide changes relating to privacy, data security, and algorithms, which would encroach on Congressional authority. According to these participants, such a rule would implicate the Supreme Court’s major questions doctrine because the FTC lacks clear authorization from Congress to issue such a broad-based rule.
- Other members of the public urged the FTC to take far-reaching action to protect personal data, with an emphasis on controls to safeguard children’s, student, health, and education data.
The ANPR and the Public Forum are just initial steps in the lengthy FTC rulemaking process. Given the broad scope of the potential rules, the rulemaking process will be closely watched and analyzed. Gibson Dunn attorneys are closely monitoring these developments and are available to discuss these issues as applied to your particular business.
__________________________
[1] Federal Trade Commission Press Release, FTC Explores Rules Cracking Down on Commercial Surveillance and Lax Data Security Practices (Aug. 11, 2022), https://www.ftc.gov/news-events/news/press-releases/2022/08/ftc-explores-rules-cracking-down-commercial-surveillance-lax-data-security-practices.
[2] Federal Trade Commission Event, Commercial Surveillance and Data Security Public Forum (Sept. 8, 2022), https://www.ftc.gov/news-events/events/2022/09/commercial-surveillance-data-security-anpr-public-forum.
[3] Magnuson-Moss Warranty Federal Trade Commission Improvement Act, 15 U.S.C. § 57a(a)(1)(B). The FTC had largely abandoned the promulgation of new trade regulation rules because the Magnuson-Moss process was perceived as too cumbersome and the agency generally preferred case-by-case enforcement over rulemaking. The Biden Administration, however, has revitalized interest in promulgating trade regulation rules to “provide much needed clarity about how our century-old statute applies to contemporary economic realities [allowing] the FTC to define with specificity what acts or practices are unfair or deceptive under Section 5 of the FTC Act.” Statement of Commissioner Rebecca Kelly Slaughter, Regarding the Adoption of Revised Section 18 Rulemaking Procedures (July 1, 2021).
[4] Public comments are available at https://www.federalregister.gov/documents/2022/08/22/2022-17752/trade-regulation-rule-on-commercial-surveillance-and-data-security.
This alert was prepared by Svetlana S. Gans, Samantha Abrams-Widdicombe, and Kunal Kanodia.
Gibson Dunn lawyers are available to assist in addressing any questions you may have about these developments. Please contact the Gibson Dunn lawyer with whom you usually work, the authors, or any member of the firm’s Privacy, Cybersecurity & Data Innovation practice group:
United States
Matthew Benjamin – New York (+1 212-351-4079, [email protected])
Ryan T. Bergsieker – Denver (+1 303-298-5774, [email protected])
S. Ashlie Beringer – Co-Chair, PCDI Practice, Palo Alto (+1 650-849-5327, [email protected])
David P. Burns – Washington, D.C. (+1 202-887-3786, [email protected])
Cassandra L. Gaedt-Sheckter – Palo Alto (+1 650-849-5203, [email protected])
Svetlana S. Gans – Washington, D.C. (+1 202-955-8657, [email protected])
Lauren R. Goldman – New York (+1 212-351-2375, [email protected])
Stephenie Gosnell Handler – Washington, D.C. (+1 202-955-8510, [email protected])
Nicola T. Hanna – Los Angeles (+1 213-229-7269, [email protected])
Howard S. Hogan – Washington, D.C. (+1 202-887-3640, [email protected])
Robert K. Hur – Washington, D.C. (+1 202-887-3674, [email protected])
Kristin A. Linsley – San Francisco (+1 415-393-8395, [email protected])
H. Mark Lyon – Palo Alto (+1 650-849-5307, [email protected])
Vivek Mohan – Palo Alto (+1 650-849-5345, [email protected])
Karl G. Nelson – Dallas (+1 214-698-3203, [email protected])
Rosemarie T. Ring – San Francisco (+1 415-393-8247, [email protected])
Ashley Rogers – Dallas (+1 214-698-3316, [email protected])
Alexander H. Southwell – Co-Chair, PCDI Practice, New York (+1 212-351-3981, [email protected])
Deborah L. Stein – Los Angeles (+1 213-229-7164, [email protected])
Eric D. Vandevelde – Los Angeles (+1 213-229-7186, [email protected])
Benjamin B. Wagner – Palo Alto (+1 650-849-5395, [email protected])
Michael Li-Ming Wong – San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, [email protected])
Debra Wong Yang – Los Angeles (+1 213-229-7472, [email protected])
Europe
Ahmed Baladi – Co-Chair, PCDI Practice, Paris (+33 (0) 1 56 43 13 00, [email protected])
James A. Cox – London (+44 (0) 20 7071 4250, [email protected])
Patrick Doris – London (+44 (0) 20 7071 4276, [email protected])
Kai Gesing – Munich (+49 89 189 33-180, [email protected])
Bernard Grinspan – Paris (+33 (0) 1 56 43 13 00, [email protected])
Joel Harrison – London (+44 (0) 20 7071 4289, [email protected])
Vera Lukic – Paris (+33 (0) 1 56 43 13 00, [email protected])
Penny Madden – London (+44 (0) 20 7071 4226, [email protected])
Michael Walther – Munich (+49 89 189 33-180, [email protected])
Asia
Kelly Austin – Hong Kong (+852 2214 3788, [email protected])
Connell O’Neill – Hong Kong (+852 2214 3812, [email protected])
Jai S. Pathak – Singapore (+65 6507 3683, [email protected])
© 2022 Gibson, Dunn & Crutcher LLP
Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.