
July 6, 2020
Supreme Court Upholds TCPA’s Robocall Ban, But Strikes Government-Debt Exception As Unconstitutional Under First Amendment

Decided July 6, 2020

Barr v. American Association of Political Consultants, Inc., No. 19-631

Today, the Supreme Court held 6-3 that the government-debt exception to the TCPA’s robocall ban violates the First Amendment, but also held 7-2 that the proper remedy is to sever the exception—leaving in place the entirety of the TCPA’s 1991 ban on robocalls.

Background: The Telephone Consumer Protection Act of 1991 (TCPA) generally prohibits robocalls to cell phones and home phones. In 2015, Congress amended the Act to exempt robocalls to cell phones for collecting debts owed to or guaranteed by the federal government—including student-loan and mortgage debts—from the TCPA’s general prohibition.

Plaintiffs—a group of political and nonprofit organizations seeking to make robocalls—sued the U.S. Attorney General arguing that the 2015 government-debt exception violates the First Amendment by unconstitutionally favoring debt-collection speech over political and other speech. As relief, the plaintiffs sought to invalidate the TCPA’s entire robocall ban for cell phones, rather than only the 2015 government-debt exception. Plaintiffs’ theory was that the exception undermines the credibility of the purported privacy interest supporting the entire robocall ban.

The district court held that the 2015 government-debt exception was a content-based speech regulation, but that it survived strict scrutiny given the government’s compelling interest in collecting debt. The Fourth Circuit reversed, holding that the government-debt exception failed strict scrutiny. Applying traditional severability principles, the Fourth Circuit then concluded that the government-debt exception should be severed from the statute, leaving the TCPA’s robocall ban in effect.

Issues: 1. Whether the government-debt exception from the TCPA’s robocall ban for cell phones violates the First Amendment.

2. If so, whether the TCPA’s entire robocall ban is unconstitutional.

Court's Holding: 1. The government-debt exception is a content-based speech restriction that impermissibly favors debt-collection speech over political and other speech in violation of the First Amendment.

2. The TCPA’s robocall ban stands because the government-debt exception is severable from the remainder of the statute.

“Congress has impermissibly favored debt-collection speech over political and other speech . . . As a result, plaintiffs still may not make political robocalls to cell phones, but their speech is now treated equally with debt-collection speech.”

Justice Kavanaugh, writing for a plurality of the Court

What It Means:
  • The TCPA’s robocall ban remains in effect as it existed before 2015, prohibiting virtually all automated voice calls and text messages to cell phones. Six Justices (writing a total of three opinions) agreed that the 2015 government-debt exception was content based and that the government, in attempting to defend the content-based speech restriction, failed to sufficiently justify treating government-debt-collection speech differently from other important categories of robocall speech, such as political speech and issue advocacy.
  • Seven Justices agreed that the 2015 government-debt exception could be severed from the remainder of the statute to preserve the underlying 1991 robocall restriction.  Not only has the Communications Act (of which the TCPA is part) had an express severability clause since 1934, the Court explained, but also, even without the severability clause, the presumption of severability would still apply—and the remainder of the restriction is capable of functioning independently without the narrow government-debt exception.
  • As in Seila Law LLC v. Consumer Financial Protection Bureau (No. 19-7), Justices Gorsuch and Thomas dissented from the Court’s severability holding.  Justice Gorsuch wrote, “[s]evering and voiding the government-debt exception does nothing to address the injury” of barring plaintiffs from engaging in political speech robocalls.  Slip op. 6 (Gorsuch, J., concurring in the judgment in part and dissenting in part).  Justice Gorsuch and Justice Thomas argued that the Court should reconsider its severability doctrine.

The Court's opinion is available here.

Gibson Dunn’s lawyers are available to assist in addressing any questions you may have regarding developments at the Supreme Court.  Please feel free to contact the following practice leaders:

Appellate and Constitutional Law Practice

Allyson N. Ho +1 214.698.3233 aho@gibsondunn.com
Mark A. Perry +1 202.887.3667 mperry@gibsondunn.com

Related Practice: Privacy, Cybersecurity and Consumer Protection

Alexander H. Southwell +1 212.351.3981 asouthwell@gibsondunn.com
Ahmed Baladi +33 (0) 1 56 43 13 00 abaladi@gibsondunn.com
Timothy W. Loose +1 213.229.7746 tloose@gibsondunn.com
© 2020 Gibson, Dunn & Crutcher LLP Attorney Advertising:  The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.

July 1, 2020
As California Consumer Privacy Act Enforcement Commences, a Tougher New Data Privacy Law Will Go Before California Voters in November

Today marks the date that the California Consumer Privacy Act (“CCPA”) becomes enforceable. Just one week ago, however, on June 24, 2020, a new and tougher proposed privacy law, the California Privacy Rights Act (“CPRA”), cleared the final hurdle to appear on the November 3, 2020 ballot in California. The CPRA ballot initiative represents an effort to address the perceived inadequacies of the CCPA, which, according to some, was hastily enacted into law by the California state legislature to avoid its enactment as a ballot initiative. The ballot initiative also reflects its proponents’ fear that the California legislature will make compromises that are, by the proponents’ standards, unacceptable. Hence, unlike the CCPA, the CPRA, if enacted by California voters in November, could not be amended by the state legislature. Given the significance of the proposed privacy provisions (one of which would create a new state privacy enforcement agency to replace the Attorney General as the enforcer of privacy rights), the prospects of the CPRA on the November ballot should be monitored. Below we highlight a few notable aspects of the CPRA, including important dates.

Background

In September 2019, before the CCPA even went into effect, Alastair Mactaggart and the Californians for Consumer Privacy (the non-profit group behind the original CCPA initiative in 2018) filed the new ballot initiative, the CPRA (referred to by many as “CCPA 2.0”). If enacted in November, the CPRA would become state law as written, and could be amended only by another ballot initiative, not the state legislature, as noted above. By mid-March 2020, the Californians for Consumer Privacy reported that it had collected more than enough signatures (roughly 930,000) to appear on the November 2020 ballot.
Though delays in the counties’ official signature counting process nearly derailed the CPRA as a viable ballot initiative (prompting Californians for Consumer Privacy to file a motion for writ of mandate to order the Secretary of State to direct the counties to complete the process by the deadline), two of the final three counties alone reported 718,233 verified signatures, more than the required 675,000 signatures needed to put the initiative on the November 2020 ballot. This final verification occurred just one day before the June 25, 2020 deadline.

Brief Overview of CPRA and its Interaction with CCPA

If the CPRA is approved by California voters in November, it will go into effect on January 1, 2023. Until that time, the CCPA will remain in full force and effect, and compliance with the CCPA will be critical. Indeed, under Section 1798.185(c) of the CCPA, the California Attorney General is authorized to enforce the CCPA starting today, July 1, 2020.[1] The CPRA would impose new obligations that would apply only to personal information (“PI”) collected after January 1, 2023, except that the right to access personal information would extend to personal information collected on or after January 1, 2022. The CPRA would grant the California Attorney General the power at the outset to adopt regulations to expand upon and update the CCPA until July 1, 2021, at which point a newly created California Privacy Protection Agency would assume responsibility for administering the law. In addition, the final regulations arising from the CPRA would need to be adopted by July 1, 2022, a full year before the CPRA goes into effect. Importantly, the CPRA would also extend the current moratoria on the application of the CCPA to PI collected in the employee/job applicant and business-to-business contexts until January 1, 2023, allowing the legislature time to consider addressing those categories in a separate bill.
This extension would be effective immediately, should the ballot measure pass, and the extended timeline should give businesses the time necessary to prepare for the compliance challenges that might arise with respect to these categories of PI. Among the other significant changes that the CPRA would effectuate are:
  • clarification of the definition of “sale” of PI and related obligations (e.g., to explicitly include the “sharing” of PI for monetary or other valuable consideration, and clarifying obligations regarding “cross-context behavioral advertising”);
  • the expansion of consumer rights to include the right to correct PI and limit the use of sensitive PI (the definition of which the CPRA seeks to amend);
  • data retention limitation requirements;
  • service provider obligations to assist businesses with CPRA compliance; and
  • the expansion of the private right of action to cover breach of an email address in combination with a password and security question and answer permitting access to the email account.
Notably, the CPRA does not add a comprehensive private right of action for any other violations, leaving that enforcement to the proposed California Privacy Protection Agency.

Looking Forward

Because the initiative was certified only four days ago, its prospects in the November election are unclear. It can be expected, however, that the initiative will garner significant support. The CPRA joins nearly a dozen other initiatives that will also be on the ballot in California in November. As the possibility of the CPRA moves closer to reality, we will provide additional information on how it will change data privacy and cybersecurity regulation in California. In the meantime, if you are interested in hearing more about the most notable provisions, and their application to your particular concerns, we are happy to discuss. Please do not hesitate to contact anyone in the list below with your questions.
_____________________    [1]   California Attorney General Xavier Becerra recently denied requests to consider a 6-month enforcement delay to January 2, 2021, due to challenges and disruptions presented by the coronavirus pandemic, including a request from a coalition of more than 60 businesses led by the Association of National Advertisers. Attorney General Becerra’s office noted in an email to Forbes, “Right now, we're committed to enforcing the law upon finalizing the rules or July 1, whichever comes first. . .We're all mindful of the new reality created by COVID-19 and the heightened value of protecting consumers' privacy online that comes with it. We encourage businesses to be particularly mindful of data security in this time of emergency.” See Marty Swant, “Citing COVID-19, Trade Groups Ask California’s Attorney General To Delay Data Privacy Enforcement,” Forbes (Mar. 19, 2020), available at: https://www.forbes.com/sites/martyswant/2020/03/19/citing-covid-19-trade-groups-ask-californias-attorney-general-to-delay-data-privacy-enforcement/#1ecf88de5c30.


The following Gibson Dunn lawyers assisted in the preparation of this client update: Alexander Southwell, Benjamin Wagner, Cassandra Gaedt-Sheckter, and Lisa Zivkovic.

Gibson Dunn’s lawyers are available to assist in addressing any questions you may have regarding these developments.  Please contact the Gibson Dunn lawyer with whom you usually work, or any member of the firm’s California Consumer Privacy Act Task Force or its Privacy, Cybersecurity and Consumer Protection practice group:

California Consumer Privacy Act Task Force:
Ryan T. Bergsieker – Denver (+1 303-298-5774, rbergsieker@gibsondunn.com)
Cassandra L. Gaedt-Sheckter – Palo Alto (+1 650-849-5203, cgaedt-sheckter@gibsondunn.com)
Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com)
H. Mark Lyon – Palo Alto (+1 650-849-5307, mlyon@gibsondunn.com)
Alexander H. Southwell – New York (+1 212-351-3981, asouthwell@gibsondunn.com)
Deborah L. Stein (+1 213-229-7164, dstein@gibsondunn.com)
Eric D. Vandevelde – Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com)
Benjamin B. Wagner – Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com)

Please also feel free to contact any member of the Privacy, Cybersecurity and Consumer Protection practice group:

United States
Alexander H. Southwell – Co-Chair, PCCP Practice, New York (+1 212-351-3981, asouthwell@gibsondunn.com)
Debra Wong Yang – Los Angeles (+1 213-229-7472, dwongyang@gibsondunn.com)
Matthew Benjamin – New York (+1 212-351-4079, mbenjamin@gibsondunn.com)
Ryan T. Bergsieker – Denver (+1 303-298-5774, rbergsieker@gibsondunn.com)
Howard S. Hogan – Washington, D.C. (+1 202-887-3640, hhogan@gibsondunn.com)
Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com)
Kristin A. Linsley – San Francisco (+1 415-393-8395, klinsley@gibsondunn.com)
H. Mark Lyon – Palo Alto (+1 650-849-5307, mlyon@gibsondunn.com)
Karl G. Nelson – Dallas (+1 214-698-3203, knelson@gibsondunn.com)
Deborah L. Stein (+1 213-229-7164, dstein@gibsondunn.com)
Eric D. Vandevelde – Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com)
Benjamin B. Wagner – Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com)
Michael Li-Ming Wong – San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, mwong@gibsondunn.com)

Europe
Ahmed Baladi – Co-Chair, PCCP Practice, Paris (+33 (0)1 56 43 13 00, abaladi@gibsondunn.com)
James A. Cox – London (+44 (0)20 7071 4250, jacox@gibsondunn.com)
Patrick Doris – London (+44 (0)20 7071 4276, pdoris@gibsondunn.com)
Bernard Grinspan – Paris (+33 (0)1 56 43 13 00, bgrinspan@gibsondunn.com)
Penny Madden – London (+44 (0)20 7071 4226, pmadden@gibsondunn.com)
Michael Walther – Munich (+49 89 189 33-180, mwalther@gibsondunn.com)
Kai Gesing – Munich (+49 89 189 33-180, kgesing@gibsondunn.com)
Alejandro Guerrero – Brussels (+32 2 554 7218, aguerrero@gibsondunn.com)
Vera Lukic – Paris (+33 (0)1 56 43 13 00, vlukic@gibsondunn.com)
Sarah Wazen – London (+44 (0)20 7071 4203, swazen@gibsondunn.com)

Asia
Kelly Austin – Hong Kong (+852 2214 3788, kaustin@gibsondunn.com)
Jai S. Pathak – Singapore (+65 6507 3683, jpathak@gibsondunn.com)


June 23, 2020
GDPR Update: French Administrative Supreme Court Upholds 50 Million Euro Fine Against Google LLC

On June 19, 2020, the French Administrative Supreme Court (“Conseil d’Etat”) dismissed Google LLC’s appeal against the French Data Protection Authority’s decision of January 21, 2019, imposing a fine of 50 million euros on Google LLC. This fine is, to date, the highest fine issued under the GDPR in the European Union. The decision is now final with no further possibility of appeal before French courts. This Client Alert lays out the key aspects and implications of the decision.

I. Context of the decision  

On January 21, 2019, the French Data Protection Authority (“CNIL”) imposed a fine of 50 million euros on Google LLC for breach of EU transparency and information obligations and lack of valid consent to process data for targeted advertising purposes under the General Data Protection Regulation (EU) 2016/679 (“GDPR”). Google filed an appeal against the CNIL’s decision with the French Administrative Supreme Court, which issued its final ruling on June 19, 2020. Google raised two requests for preliminary rulings from the Court of Justice of the European Union (one in relation to the jurisdiction of the CNIL and another in relation to the consent mechanism the company had used) but the French Administrative Supreme Court determined that there was no need to refer such questions to the Court of Justice of the European Union.

II. CNIL’s jurisdiction

In its ruling, the French Administrative Supreme Court first confirmed the CNIL’s jurisdiction. To challenge the CNIL’s jurisdiction, Google claimed that its main establishment, as defined under the GDPR, was Google Ireland Limited, which is its head office in Europe, has human and financial resources, and assumes responsibility for “many organizational functions” in Europe. In doing so, Google tried to demonstrate that the Irish supervisory authority (the DPC) should have had jurisdiction in this matter considering the one-stop-shop mechanism provided by the GDPR, under which an organization established in multiple EU Member States shall have, as its sole interlocutor, the supervisory authority of its “main establishment” (also called the “lead supervisory authority”). Under the GDPR, the “main establishment” should correspond to the place of the central administration in the EU, unless decisions on the purposes and means of data processing are taken in another establishment which has the power to have such decisions implemented, in which case the latter establishment should be considered the main establishment. The French Administrative Supreme Court found that Google Ireland Limited did not exercise, at the time of the challenged conduct, control over the other European affiliates of the company, so that it could not be regarded as the “central administration,” and that Google LLC alone determined the purposes and means of the processing. The Court noted that Google Ireland Limited was assigned new responsibilities in relation to data processing in Europe, but highlighted that this new scope of responsibility became effective, in any event, only after the date the CNIL issued its decision.
The Court also pointed out that while the CNIL cooperated with other supervisory authorities in the EU in relation to its jurisdiction, none of them raised a concern with respect to the CNIL’s exercise of jurisdiction, and the Irish supervisory authority even publicly stated at that time that it was not the lead supervisory authority of Google LLC. Therefore, the Court rejected Google’s jurisdictional arguments, including the request for preliminary rulings from the Court of Justice of the European Union regarding that issue.

III. GDPR violation

The French Administrative Supreme Court also confirmed the breaches identified by the CNIL regarding Google’s transparency and information obligations, as well as the lack of valid consent to process its users’ personal data for targeted advertising purposes. First, the Court found that Google’s consumer disclosures were scattered, thus hindering the accessibility and clarity of information for users, while the data processing carried out was particularly intrusive. Furthermore, with respect to the validity of the consent collected, the Court confirmed that the information Google provided to consumers regarding targeted advertising was not presented in a sufficiently clear and distinct manner for the user’s consent to be valid. In particular, consent was collected in a global manner for various purposes and through a pre-ticked box, neither of which meets the requirements of the GDPR. In that respect, the French Administrative Supreme Court also determined that there was no need to refer a request for preliminary rulings to the Court of Justice of the European Union. Finally, the French Administrative Supreme Court stated that the administrative fine of 50 million euros was not disproportionate and confirmed its amount.

IV. Conclusion

This decision is an important reminder that clear disclosures and valid consent mechanisms are key obligations to be complied with when a company’s processing of personal data is subject to the GDPR, as shortcomings in those areas may lead to significant monetary sanctions. For organizations with multiple subsidiaries or affiliates in the EU, this decision also illustrates the importance of clarifying their corporate organization, identifying their main establishment in the EU, and ensuring that this main establishment satisfies the criteria set out in the GDPR in order to benefit from the one-stop-shop mechanism.
The following Gibson Dunn lawyers prepared this client alert: Ahmed Baladi, Vera Lukic, Adelaide Cassanet, Clemence Pugnet, and Ryan T. Bergsieker. Please also feel free to contact the Gibson Dunn lawyer with whom you usually work, the authors, or any member of the Privacy, Cybersecurity and Consumer Protection Group:

Europe
Ahmed Baladi – Co-Chair, PCCP Practice, Paris (+33 (0)1 56 43 13 00, abaladi@gibsondunn.com)
James A. Cox – London (+44 (0)20 7071 4250, jacox@gibsondunn.com)
Patrick Doris – London (+44 (0)20 7071 4276, pdoris@gibsondunn.com)
Penny Madden – London (+44 (0)20 7071 4226, pmadden@gibsondunn.com)
Michael Walther – Munich (+49 89 189 33-180, mwalther@gibsondunn.com)
Kai Gesing – Munich (+49 89 189 33-180, kgesing@gibsondunn.com)
Alejandro Guerrero – Brussels (+32 2 554 7218, aguerrero@gibsondunn.com)
Vera Lukic – Paris (+33 (0)1 56 43 13 00, vlukic@gibsondunn.com)
Sarah Wazen – London (+44 (0)20 7071 4203, swazen@gibsondunn.com)

Asia
Kelly Austin – Hong Kong (+852 2214 3788, kaustin@gibsondunn.com)
Jai S. Pathak – Singapore (+65 6507 3683, jpathak@gibsondunn.com)

United States
Alexander H. Southwell – Co-Chair, PCCP Practice, New York (+1 212-351-3981, asouthwell@gibsondunn.com)
Debra Wong Yang – Los Angeles (+1 213-229-7472, dwongyang@gibsondunn.com)
Matthew Benjamin – New York (+1 212-351-4079, mbenjamin@gibsondunn.com)
Ryan T. Bergsieker – Denver (+1 303-298-5774, rbergsieker@gibsondunn.com)
Howard S. Hogan – Washington, D.C. (+1 202-887-3640, hhogan@gibsondunn.com)
Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com)
Kristin A. Linsley – San Francisco (+1 415-393-8395, klinsley@gibsondunn.com)
H. Mark Lyon – Palo Alto (+1 650-849-5307, mlyon@gibsondunn.com)
Karl G. Nelson – Dallas (+1 214-698-3203, knelson@gibsondunn.com)
Deborah L. Stein (+1 213-229-7164, dstein@gibsondunn.com)
Eric D. Vandevelde – Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com)
Benjamin B. Wagner – Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com)
Michael Li-Ming Wong – San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, mwong@gibsondunn.com)

June 12, 2020
California Consumer Privacy Act Update: Attorney General Finalizes Regulations and Provides Interpretive Guidance

The Office of the California Attorney General (“OAG”) announced on June 1, 2020, that it submitted the final proposed regulations for the California Consumer Privacy Act (“CCPA”), and related documents, to the California Office of Administrative Law. The final text of the proposed regulations remains substantively unchanged from the March 11, 2020 version published by the OAG (please see our prior alerts regarding the substance of the previous versions here, here, and here). However, the package of documents submitted to the Office of Administrative Law contained a significant amount of information, including the final draft of the regulations,[1] a final statement of reasons supporting the changes from the original proposed draft of regulations promulgated on October 11, 2019,[2] the comments received by the OAG, the OAG’s responses to those comments,[3] and a historical log of the prior versions, notices, transcripts from public hearings, preliminary activities, and supporting documents. These documents are available on the Attorney General’s CCPA website: https://oag.ca.gov/privacy/ccpa. Due to Executive Order N-40-20 related to the COVID-19 pandemic, the Office of Administrative Law has 30 working days and an additional 60 calendar days to approve the regulations, or until September 14, 2020, at which time the final regulations will be filed and become enforceable.
However, the OAG included in its CCPA package to the Office of Administrative Law a request for expedited review, stating that “[w]hile the Attorney General is mindful of the challenges imposed by COVID-19 and Governor Newsom’s Executive Order N-40-20 granting additional time to finalize proposed regulations, the Attorney General respectfully requests that the Office of Administrative Law complete its review within 30 business days, given the statutory mandate for regulations [by July 1, 2020].” If the Office of Administrative Law takes the full time permitted by Executive Order N-40-20 to review the proposed regulations, the enforcement deadline for the regulations would be September 14, 2020. The Attorney General would still be permitted to enforce the statutory text of the CCPA, by its own provisions, starting July 1, 2020, but would not be permitted to enforce the regulations until they are approved. While it is possible the OAG will choose to wait until the regulations are in place, businesses should plan for CCPA enforcement to begin as stated in the statute—on July 1, 2020.

While much of the information in the package is not new, the Final Statement of Reasons (“FSOR”) provides justifications for the changes made to the original proposed regulations (the original proposed regulations were supported by the Initial Statement of Reasons, which is incorporated by reference into the FSOR). In addition, the OAG’s responses to all comments that were submitted (FSOR Appendices A, C, and E) provide valuable insight through text clarifications, and reasons for accepting or rejecting certain comments. The responses also postpone resolution of a few issues, stating that the OAG needed to prioritize those issues that were required for operationalizing the CCPA, and would consider others further.[4] These documents are available on the OAG’s website, and total over 500 pages. Below, we highlight a few notable points from each.
If you are interested in hearing more about the most notable comments, and their application to your particular concerns, we are happy to share more complete information and discuss with you upon request.

Final Statement of Reasons

  • Definitions. The OAG clarified that it revised the initial proposed definition of and sections relating to “price or service difference” as it related to financial incentives to confirm that differences in price or quality of the goods or services offered to the consumer must implicate consumers’ rights under the CCPA (i.e., are “related to the collection, retention, or sale of personal information”) to be of concern. Separately, with respect to questions about whether an entity “does business in California,” the OAG declined to issue any regulation, stating that “[i]n the absence of a specific definition [it] should be given meaning according to the plain language of the words and other California law.” As a result, businesses can consider whether they are “doing business” under California tax law, for example (see, e.g., https://www.ftb.ca.gov/file/business/doing-business-in-california.html, describing relevant thresholds of California sales, property or payroll pursuant to tax law). Of course, even if an entity does business in California, it must also meet the other threshold requirements to be subject to CCPA, including being a for-profit entity meeting certain size requirements.
  • Notice at collection of personal information. The OAG imposes important additional notice requirements and explains that the notice at collection “shall be made readily available where consumers will encounter it at or before the point of collection of any personal information” (emphasis added). This change is considered necessary to “encompass a variety of contexts . . . such as notices delivered online regarding online collection or orally when information is collected by telephone, and physical proximity, such as notices delivered by signage in retail environments.” For example, the OAG specifies that a business collecting information offline should have flexibility regarding the manner in which it points consumers to an online notice (e.g., the change to the regulations “responds to comments noting that the prior language was overly prescriptive and that the OAG should allow for a QR code or other ways to direct the consumer to the text of the notice”). In addition, the OAG confirms that for a business that does not collect personal information directly from a consumer, the OAG considered and rejected requiring such businesses to “post[] . . . an online privacy policy.” On the other hand, the OAG confirms that the regulations impose certain specific additional requirements for the notice, including that if any changes are made to the business’s practices that are materially different than what was disclosed previously, the regulations “require explicit consent [to] put the consumer in the same position they would have been had the material change been disclosed during the consumer’s first engagement with the business.” The OAG explains that “[s]imply updating an online privacy policy or providing notice without explicit consent” would be insufficient, but “[b]usinesses have discretion to determine the manner in which to notify the consumer and obtain consent within the framework of the CCPA and the regulations.” Additionally, a “just-in-time” notice is required for personal information collected from a mobile device that a consumer “would not reasonably expect.” The OAG justifies these additional requirements as consistent with the purpose of the CCPA to provide sufficient transparency, and that it is “[i]nherent in [its] authority . . . to adopt regulations that fill in details not specifically addressed by the CCPA, but fall within the scope of the CCPA.”
  • Responding to requests to know and requests to delete. The OAG explained that it added Section 999.313(c)(3) (alleviating a business’s obligation to search for personal information under certain conditions) in order to decrease a business’s “burden or inability to search unstructured data for a consumer’s personal information” in response to a request to know, provided that the consumer is informed of the categories of records that may contain personal information that it did not search because it meets certain conditions. The OAG believes this balances the stakeholders’ interests, as it provides a consumer with information that the “business may have other personal information about them but assures them that this information is only maintained by the business in an unsearchable or inaccessible format, solely for legal or compliance purposes, and is not being used for the business’s commercial benefit.” With respect to requests to delete, the OAG noted that it considered and rejected not requiring businesses to inform consumers whose requests to delete were denied of their right to opt-out of the sale of personal information, because informing consumers of that right allows them to control the proliferation of their personal information in the marketplace.  Additionally, the OAG explained that this regulation in fact lessens the burden on businesses because otherwise it might require businesses to treat these denials as requests to opt-out.
  • Service providers. The OAG explains that the modifications to the regulations pertaining to service providers were designed to facilitate the engagement of service providers, on the one hand, and prevent businesses from using service providers to shirk their obligations under the CCPA to consumers, on the other.  For instance, the OAG modified the definition of “service provider” under the regulations to clarify that businesses that are engaged as service providers for non-profits or public entities, which are not “businesses” under the CCPA and thus not subject to the related obligations, are still service providers.  The OAG explained that non-profits and public entities might not otherwise employ service providers for fear of incurring unnecessary and burdensome costs related to CCPA compliance.  Similarly, the OAG clarified that businesses that collect personal information directly from consumers (or about consumers) or that render services to a third party at the direction of another “second” business are still service providers, allowing businesses to be engaged as service providers for the initial collection of personal information without being considered “businesses” under the CCPA.  On the other hand, certain modifications were designed to prevent businesses from evading their obligations under the CCPA by engaging service providers.   For instance, the OAG modified the regulations to clarify that a service provider’s failure to provide services required by the CCPA pursuant to a written contract will constitute a violation of the CCPA that is enforceable by the OAG and not simply a breach of contract.  The OAG noted that this modification was necessary to ensure that service providers comply with the restrictions set forth in their service-provider contracts even if the business does not enforce those restrictions.  
Lastly, the OAG specified that service providers must comply with requests to opt-out of the sale of personal information in order to prevent businesses from engaging service providers to avoid having to comply with such requests.
  • Requests to opt-out. The OAG’s modifications to the regulations pertaining to the requests to opt-out require that businesses provide consumers with “easy” mechanisms that “require minimal steps” to opt-out of the sale of their personal information and treat user-enabled global privacy controls as valid requests to opt-out.  In the first instance, the OAG explained that the modification was necessary to avoid the possibility that some businesses may create confusing or complex mechanisms for consumers to exercise their rights under the CCPA.  In the second, the OAG explained that the modification is “forward-looking” and counterbalances the ease with which businesses collect personal information.  Requiring businesses to treat user-enabled global privacy controls as valid requests to opt-out is forward-looking because it encourages the “development of technological solutions to facilitate and govern the submission of requests to opt-out.”  Furthermore, the OAG explained that its experience as the enforcer of the California Online Privacy Protection Act (“CalOPPA”), whereby businesses must state how they treat “do-not-track” signals, discouraged it from making this provision discretionary.  The majority of businesses, the OAG explained, disclose that they do not respond to “do-not-track” signals because compliance with such signals is discretionary.  Moreover, the alternative methods for opting-out that were proposed, such as using a business’s designated methods for submitting requests to know or delete, were insufficient on a global scale and did not adequately “counterbalance the ease and frequency by which personal information is collected and sold in online contexts, such as when a consumer visits a website.”
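The regulations do not prescribe a specific technical format for a user-enabled global privacy control, but the Global Privacy Control (GPC) proposal that has since emerged conveys such a signal in the `Sec-GPC` HTTP request header. As a minimal sketch only, assuming a business chooses to honor that particular signal server-side, the check might look like this (the function name and the decision to treat only the value "1" as an opt-out are illustrative assumptions, not anything the OAG has mandated):

```python
def is_opt_out_signal(headers: dict) -> bool:
    """Illustrative check for a user-enabled global privacy control.

    Assumes the signal arrives as the `Sec-GPC` HTTP header (per the
    Global Privacy Control proposal), where a value of "1" indicates the
    user has opted out of the sale of personal information. The CCPA
    regulations do not prescribe a signal format; this is one sketch of
    a possible server-side treatment, not a compliance determination.
    """
    # HTTP header names are case-insensitive, so normalize keys first.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"
```

A business honoring the signal would then route such a request into the same workflow used for requests to opt-out submitted through its designated methods.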

Responses to Comments Submitted

  • No extensions of enforcement. The OAG rebuffed repeated requests to delay CCPA enforcement and the rollout of the finalized regulations, particularly those that insisted that businesses must focus on the COVID-19 pandemic and might face special burdens with much of their workforce on “work from home.” The OAG explained that “[t]he proposed rules were released on October 11, 2019, with modifications made public on February 10, 2020 and March 11, 2020. Thus, businesses have been aware that these [related] requirements could be imposed as part of the OAG’s regulations.” However, the OAG also indicated that it would exercise “prosecutorial discretion if warranted, depending on the particular facts at issue. Prosecutorial discretion permits the OAG to choose which entities to prosecute, whether to prosecute, and when to prosecute.” This suggests that the OAG will take a more flexible tack toward enforcement, even if it will not budge on when the law will come into effect. Specifically as to COVID-19, the OAG also argued that “any delays in implementation of the regulation will have a detrimental effect on consumer privacy as more and more Californians are using online resources to shop, work, and go to school.”
  • CCPA compliance is fact-specific and contextual. In response to many comments regarding applicability of the CCPA and whether certain scenarios are compliant, the OAG noted that there was no clear answer, and that a “fact-specific and contextual determination” in consultation “with an attorney who is aware of all pertinent facts and relevant compliance concerns” is required. While such answers were presumably not particularly satisfying for many commenters, the consistent refrain leaves room for interpretation on various issues and supports a principled, risk-based approach by businesses rather than a one-size-fits-all one.
  • IP addresses and other information not necessarily maintained as “personal information.” In explaining the deletion of former Section 999.302, which had elaborated on the definition of “personal information” and created a safe harbor as to IP addresses not linked to particular consumers, the OAG indicated that “[t]he OAG [] deleted this provision to prioritize the implementation of regulations that operationalize and assist in the immediate implementation of the law.” However, the OAG also noted that “[f]urther analysis is required on this issue.” This suggests that similar guidance might ultimately return in some form. In the meantime, the OAG again stated that whether information is “personal information” is a “fact-specific and contextual determination,” requiring consultation “with an attorney who is aware of all pertinent facts and relevant compliance concerns.” While the OAG stated that “personal information” is defined broadly, and IP addresses are included in the definition, the CCPA also has provisions that “do not require a business to collect, retain, or otherwise reidentify or link information if the information is maintained in a manner that would not be considered personal information. See Civ. Code §§ 1798.100(e), 1798.110(d), 1798.145(k).”
  • Definition of “sale” is still open for interpretation. In response to specific questions regarding the definition of “sale,” and whether it includes or excludes, for example, “real-time bidding in online advertising” and “the passing of information for targeted advertising”—an issue that has been widely debated—the OAG did not provide a clear response. Instead, the OAG stated that whether these particular situations constitute a sale requires a fact-specific determination, “including whether or not the parties involved are third parties or service providers.” Further, in discussing service providers’ ability to use information internally, and in the advertising context, the OAG stated that “[t]he CCPA allows a service provider to furnish advertising services to the business that collected personal information from the consumer, and such ads may be shown to the same consumer on behalf of the same business on any website. See Civ. Code § 1798.140(d)(5). Prohibiting a service provider from placing such ads is [] unnecessary because the CCPA would not prohibit the business’s own marketing department from placing the same ads itself.”
  • Businesses should consider multiple notices. Although it is clear from the statutory text that various disclosures are required, the OAG’s comments suggest that businesses should consider displaying additional notices where necessary (again, “ultimately a fact-specific determination”)—though having an omnibus privacy policy may be sufficient in certain circumstances. For instance, the OAG responded to many comments that “consumers [must] be given a notice at collection, notice of right to opt-out, and notice of financial incentive. These requirements are separate and apart from the CCPA’s requirements for the disclosures in a privacy policy.” While the OAG confirms that businesses do have the discretion to “have all the information contained in the different notices in one place through the privacy policy,” a business must still “comply[] with its statutory requirements to separately provide” the other three notices. For example, the notice at collection can point to the appropriate section of the more detailed privacy policy, just as a notice of financial incentive could point to the section of the policy addressing the financial incentive when a consumer signs up for a loyalty program. Businesses should thus make a “fact-specific determination” as to whether separate just-in-time notices, specific references to the privacy policy, or other notifications are required to satisfy CCPA and regulation requirements.
  • Notice need not be provided to consumers if personal information is not directly collected from them. The OAG clarified that a business does not need to provide a notice at collection to a consumer if it does not collect personal information directly from them and does not sell that consumer’s personal information. In particular, the OAG specified that this exemption applies to situations where personal information is included in user-generated content, a consumer uploads another consumer’s personal information, and employees provide information about their family members.
  • CCPA retroactivity clarified. The OAG responded to comments relating to whether certain provisions are retroactive, giving some insight into what was required as of January 1, 2020. For example, the OAG clarified that the notice at collection does not need to cover personal information obtained before the CCPA’s January 1, 2020 effective date (if it will not be collected going forward), and that disclosures in response to a request to know do need to cover the preceding 12 months, regardless of whether that means disclosing personal information collected before the effective date. Most importantly, the OAG confirmed that it “cannot bring enforcement actions based on conduct occurring before the effective date of the CCPA.” This last comment should provide some comfort to businesses concerned about security incidents that occurred prior to the enforcement date.
  • Cookie banner not necessary. Some comments exhibited a concern that the notice at collection requirement imposed an obligation to have a pop-up or “European-style cookie banner” for cookie data collection. The OAG responded that “[t]he provision does not require a cookie banner, but rather leaves it to businesses to determine the formats that will best achieve the result in particular environments,” and that “businesses have discretion to determine how to provide notice in compliance with § 999.305, which requires that the notice be readily available where consumers will encounter it at or before the point of collection.”
  • Imposition of burden on businesses has been carefully weighed. When confronted with the argument that a particular provision would impose undue burdens on businesses, the OAG repeatedly asserted that “[a]ny potential competitive harm is speculative, and in any case, the potential for harm is further mitigated because all similarly situated competitors in California will be bound by the same disclosure requirements.” This approach suggests a willingness to impose costs as long as they are broadly shared.
  • Financial incentives and loyalty programs require calculating the value of the consumer’s data. Many comments opposed the regulation requiring businesses to calculate the value of consumer data when offering a financial incentive, arguing that the requirement is burdensome, unclear, and would not provide critical insight to consumers. The OAG doubled down on the requirement, stating that the regulations only require a good-faith estimate of the value, and that such an estimate is necessary for providing consumers with the material terms of any financial incentive program, as required by the CCPA. Moreover, the OAG noted that “the disclosed value of a consumer’s data to a business could . . . be relevant to enforcement of the CCPA . . . because any financial incentive or price or service difference must be reasonably related to the value of the consumer’s data.” As a result, businesses offering any financial incentive, or price or service difference, in connection with the collection, retention, or sale of data (e.g., collection of an email address in exchange for coupons, or a loyalty program) should consider the requisite notice and calculation of the value of consumers’ data.
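The regulations permit several calculation methods for the good-faith estimate, and none of them reduces to a single formula. Purely as an illustrative sketch, assuming a business chose an average-revenue method, the arithmetic might look like the following (the function names, the average-revenue approach, and the "tolerance" multiplier are all hypothetical illustrations, not a legal test):

```python
def good_faith_data_value(revenue_from_data: float, num_consumers: int) -> float:
    """Hypothetical good-faith estimate: average revenue attributable to
    consumer data, per consumer. The regulations allow several calculation
    methods; this average-revenue approach is one illustrative choice."""
    if num_consumers <= 0:
        raise ValueError("need at least one consumer")
    return revenue_from_data / num_consumers


def incentive_reasonably_related(incentive_value: float,
                                 data_value: float,
                                 tolerance: float = 2.0) -> bool:
    """Toy internal screen: flag incentives far out of proportion to the
    estimated data value. The tolerance multiplier is an assumption for
    illustration only; 'reasonably related' is a legal standard, not a
    fixed ratio."""
    return incentive_value <= data_value * tolerance
```

For example, a business attributing $500,000 of annual revenue to data from 100,000 loyalty-program members would estimate roughly $5 per consumer, then compare any coupon or discount value against that figure.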
  • Intellectual property rights may not be a valid reason not to provide information in a request to know. Several comments urged the OAG to promulgate a regulation allowing businesses to deny a request to know on the basis of preserving their intellectual property rights, such as a business’s trade secrets. In response, the OAG stated that while it has the authority to enact such an exception, “the comment[s] do[] not show that [it] is necessary to comply with state or federal law,” that “consumer personal information is itself a protected form of intellectual property,” or that “a consumer’s personal information collected by the business could be subject to the business’s copyright, trademark, or patent rights.” In addition, the OAG stated that there has not been a showing of harm in disclosing that information to the consumer directly. The OAG concludes that “any potential competitive harm is speculative,” that “potential for harm is further mitigated because all similarly situated competitors in California will be bound by the same disclosure requirement,” and that “a blanket exemption from disclosure for any information a business deems could be a trade secret or another form of intellectual property would be overbroad and defeat the Legislature’s purpose of providing consumers with the right to know information businesses collect from them.”

* * *

We stand ready and available to guide companies through the issues raised in the regulations and statute, and any inquiries or concerns left unanswered. Please do not hesitate to contact anyone in the list below with your questions. __________________________ [1] The Office of the Attorney General, Final Text of Proposed Regulations (June 1, 2020), available at: https://oag.ca.gov/privacy/ccpa. [2] The Office of the Attorney General, Final Statement of Reasons (June 1, 2020), available at https://oag.ca.gov/privacy/ccpa. [3] The Office of the Attorney General, Appendix A. Summary and Response to Comments Submitted during 45-Day Period (June 1, 2020), available at https://oag.ca.gov/privacy/ccpa; The Office of the Attorney General, Appendix C. Summary and Response to Comments Submitted during 1st 15-Day Period (June 1, 2020), available at https://oag.ca.gov/privacy/ccpa; The Office of the Attorney General, Appendix E. Summary and Response to Comments Submitted during 2nd 15-Day Period (June 1, 2020), available at https://oag.ca.gov/privacy/ccpa. [4] Indeed, the comments frequently provided responses that deferred making any decisions, referred businesses to an attorney, determined comments were beyond the scope of the regulations, or stated that comments were not more likely to result in effectuating the CCPA’s purpose. For example, the following phrases occurred dozens of times in the responses: “The comment raises specific legal questions that may require a fact-specific determination. The commenter should consult with an attorney who is aware of all pertinent facts and relevant compliance concerns. The regulations provide general guidance for CCPA compliance;” “To meet the July 1, 2020 deadline set forth by the CCPA, the OAG has prioritized the drafting of regulations that operationalize and assist in the immediate implementation of the law. Further analysis is required to determine whether a regulation is necessary on this issue.”
The following Gibson Dunn lawyers assisted in the preparation of this client update: Alexander Southwell, Mark Lyon, Ryan Bergsieker, Cassandra Gaedt-Sheckter, Daniel Rauch, Lisa Zivkovic and Tony Bedel. Gibson Dunn's lawyers are available to assist in addressing any questions you may have regarding these developments.  Please contact the Gibson Dunn lawyer with whom you usually work, or any member of the firm's California Consumer Privacy Act Task Force or its Privacy, Cybersecurity and Consumer Protection practice group: California Consumer Privacy Act Task Force: Ryan T. Bergsieker - Denver (+1 303-298-5774, rbergsieker@gibsondunn.com) Cassandra L. Gaedt-Sheckter - Palo Alto (+1 650-849-5203, cgaedt-sheckter@gibsondunn.com) Joshua A. Jessen - Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com) H. Mark Lyon - Palo Alto (+1 650-849-5307, mlyon@gibsondunn.com) Alexander H. Southwell - New York (+1 212-351-3981, asouthwell@gibsondunn.com) Deborah L. Stein (+1 213-229-7164, dstein@gibsondunn.com) Eric D. Vandevelde - Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com) Benjamin B. Wagner - Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com) Please also feel free to contact any member of the Privacy, Cybersecurity and Consumer Protection practice group: United States Alexander H. Southwell - Co-Chair, PCCP Practice, New York (+1 212-351-3981, asouthwell@gibsondunn.com) Debra Wong Yang - Los Angeles (+1 213-229-7472, dwongyang@gibsondunn.com) Matthew Benjamin - New York (+1 212-351-4079, mbenjamin@gibsondunn.com) Ryan T. Bergsieker - Denver (+1 303-298-5774, rbergsieker@gibsondunn.com) Howard S. Hogan - Washington, D.C. (+1 202-887-3640, hhogan@gibsondunn.com) Joshua A. Jessen - Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com) Kristin A. Linsley - San Francisco (+1 415-393-8395, ) H. Mark Lyon - Palo Alto (+1 650-849-5307, mlyon@gibsondunn.com) Karl G. 
Nelson - Dallas (+1 214-698-3203, knelson@gibsondunn.com) Deborah L. Stein (+1 213-229-7164, dstein@gibsondunn.com) Eric D. Vandevelde - Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com) Benjamin B. Wagner - Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com) Michael Li-Ming Wong - San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, mwong@gibsondunn.com)

Europe Ahmed Baladi - Co-Chair, PCCP Practice, Paris (+33 (0)1 56 43 13 00, abaladi@gibsondunn.com) James A. Cox - London (+44 (0)20 7071 4250, jacox@gibsondunn.com) Patrick Doris - London (+44 (0)20 7071 4276, pdoris@gibsondunn.com) Bernard Grinspan - Paris (+33 (0)1 56 43 13 00, bgrinspan@gibsondunn.com) Penny Madden - London (+44 (0)20 7071 4226, pmadden@gibsondunn.com) Michael Walther - Munich (+49 89 189 33-180, mwalther@gibsondunn.com) Kai Gesing - Munich (+49 89 189 33-180, kgesing@gibsondunn.com) Alejandro Guerrero - Brussels (+32 2 554 7218, aguerrero@gibsondunn.com) Vera Lukic - Paris (+33 (0)1 56 43 13 00, vlukic@gibsondunn.com) Sarah Wazen - London (+44 (0)20 7071 4203, swazen@gibsondunn.com)

Asia Kelly Austin - Hong Kong (+852 2214 3788, kaustin@gibsondunn.com) Jai S. Pathak - Singapore (+65 6507 3683, jpathak@gibsondunn.com)

© 2020 Gibson, Dunn & Crutcher LLP Attorney Advertising:  The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.

May 22, 2020 |
Webcast: Preparing for the California Consumer Privacy Act (CCPA)

The California Consumer Privacy Act (CCPA) is in effect, and we are already seeing the first class action lawsuits. The Attorney General will also have enforcement power starting July 1, 2020, which is quickly approaching. Given the multitude of health and economic concerns facing all of us in the current environment, CCPA compliance may have fallen to the back burner. To introduce some topics you may not be thinking about, and help you prioritize your next steps in preparation for the AG enforcement date and the expected onslaught of private lawsuits, please join us for a two-hour webinar featuring two programs: For our first hour, Mark Lyon and Cassandra Gaedt-Sheckter, from our Palo Alto office, will present CCPA and the Dawn of Enforcement: Regulations, Global Privacy for the Future, and Where We Are Today, where they will discuss:

  • the latest round of the California Attorney General’s draft regulations;
  • other states’ proposals, and how to implement a global privacy policy in light of conflicting, and rapidly changing laws;
  • looking forward to compliance in 2021, what you should begin to think about now; and
  • CCPA compliance in light of the COVID-19 pandemic.
For the second hour, Eric Vandevelde and Jeremy Smith, from our Los Angeles office, will present Cybersecurity and the CCPA:  Litigation, “Reasonable” Security, and Crisis Planning, where they will discuss:
  • the first data breach class actions, already filed, seeking statutory damages of $750 per person, and the novel legal questions they will present;
  • what the statute means by “reasonable” security and what steps can be proactively taken to prove that your organization is in compliance;
  • the role of the Attorney General in non-data breach cases; and
  • how CCPA and the threat of private and government lawsuits should inform crisis planning and communications with the public.
View Slides (PDF)

PANELISTS:  Mark Lyon Partner, Gibson Dunn Eric Vandevelde Partner, Gibson Dunn Cassandra Gaedt-Sheckter Associate, Gibson Dunn Jeremy Smith Associate, Gibson Dunn
MCLE INFORMATION:  This program has been approved for credit in accordance with the requirements of the New York State Continuing Legal Education Board for a maximum of 2.0 credit hours, of which 2.0 credit hours may be applied toward the areas of professional practice requirement.  This course is approved for transitional/non-transitional credit. Attorneys seeking New York credit must obtain an Affirmation Form prior to watching the archived version of this webcast. Please contact Victoria Chan (Attorney Training Manager) at vchan@gibsondunn.com to request the MCLE form. Gibson, Dunn & Crutcher LLP certifies that this activity has been approved for MCLE credit by the State Bar of California in the amount of 2.0 hours. California attorneys may claim “self-study” credit for viewing the archived version of this webcast.  No certificate of attendance is required for California “self-study” credit.

April 30, 2020 |
Webcast: Returning to Work: Health, Employment, and Privacy Considerations and Constraints as Businesses Resume Post-Quarantine Operations in the U.S.

As businesses plan to resume or expand operations in a post-quarantine COVID-19 world, they face a complex, and sometimes conflicting, patchwork of public health, employment, and privacy considerations requiring them simultaneously to:

  • Develop, implement, and continue to evaluate infection control programs—including PPE use, cleaning and disinfection protocols, social distancing and hand hygiene programs, and return to work policies—to reduce illness and transmission risk and keep up with evolving community health and industry standards.
  • Evaluate, implement, and document enhanced worker screening and contact tracing programs to identify, respond to, and understand the root cause of worker illnesses.
  • Implement screening and other programs with an eye to privacy, balancing the need to collect information with applicable and potentially conflicting privacy obligations arising under state constitutional and common law; statutes including the California Consumer Privacy Act, California’s Confidentiality of Medical Information Act, the Illinois Biometric Information Privacy Act, and various tracking and data breach statutes; and evolving general privacy principles of transparency, data minimization, confidentiality, and data security.
  • Remain compliant with wage and hour obligations in a “new normal” of altered work schedules and arrangements and new activities ancillary to workers’ regular shifts that may include PPE use, additional personal hygiene steps, or employee screening requirements.
  • Navigate the framework of federal and state employment law protecting employee rights, including those protecting potentially higher risk workers based on age or disability, worker health and safety obligations, and paid and unpaid leave rights, and be prepared to respond to employee concerns (and potential reluctance to work) while remaining sensitive to whistle-blower, anti-retaliation, and worker-speech protections.
View Slides (PDF)

PANELISTS: Karl Nelson is a Gibson Dunn partner who advises and represents employers across the country in connection with employment law compliance and litigation, including with respect to fair employment practices, benefits issues, worker health and safety, whistle-blower claims, and collective bargaining rights and obligations.  He has been actively involved as part of the firm’s COVID-19 Response Team in guiding clients across a range of industries in responding to the recent health crisis. Katherine V.A. Smith is a partner in Gibson Dunn’s Los Angeles office whose practice focuses on high stakes employment litigation matters such as wage and hour class actions, representative actions brought under the California Private Attorney General Act (“PAGA”), whistleblower retaliation cases, and executive disputes.  In addition to litigation, Ms. Smith also dedicates a significant portion of her practice to advising employers on nearly all aspects of employment law, including those arising from the COVID-19 crisis. Alexander H. Southwell is a nationally-recognized technology investigations lawyer and counselor, serving as global Co-Chair of Gibson, Dunn & Crutcher’s Privacy, Cybersecurity, and Consumer Protection Practice Group.   He represents a wide-range of leading companies, counseling on privacy, information technology, data breach, theft of trade secrets and intellectual property, computer fraud, national security, and network and data security issues, including handling investigations, enforcement defense, and litigation.  Recently, he has focused on advising clients on cybersecurity and privacy issues relating to COVID-19 crisis management programs and has led a number of COVID-related pro bono projects. 
Cassandra Gaedt-Sheckter is a senior associate in Gibson Dunn’s Palo Alto office who focuses on cutting-edge privacy law compliance concerns for clients in a broad range of industries, including relating to federal, state, and international privacy and cybersecurity laws, and representing companies in technology-related privacy class action and IP litigation matters.   She is a leader of the firm’s CCPA Task Force, and has been particularly dedicated in recent months to advising clients on privacy and cybersecurity issues relating to businesses’ implementation of COVID-19 crisis management and prevention programs. Dr. Christopher Kuhlman is a board certified toxicologist (DABT) and industrial hygienist (CIH) with CTEH. Dr. Kuhlman specializes in toxicology, risk assessment, toxicity evaluations, and emergency response toxicology. Recently, he has been working with employers around the globe to meet the ongoing challenges of the outbreak of COVID-19.
MCLE CREDIT INFORMATION: This program has been approved for credit in accordance with the requirements of the New York State Continuing Legal Education Board for a maximum of 1.0 credit hour, of which 1.0 credit hour may be applied toward the areas of professional practice requirement.  This course is approved for transitional/non-transitional credit. Attorneys seeking New York credit must obtain an Affirmation Form prior to watching the archived version of this webcast. Please contact Victoria Chan (Attorney Training Manager) at vchan@gibsondunn.com to request the MCLE form. Gibson, Dunn & Crutcher LLP certifies that this activity has been approved for MCLE credit by the State Bar of California in the amount of 1.0 hour. California attorneys may claim “self-study” credit for viewing the archived version of this webcast.  No certificate of attendance is required for California “self-study” credit.

April 28, 2020 |
European Perspective on Tracing Tools in the Context of COVID-19

Click for PDF As part of the fight against the spread of COVID-19 and the desire to lift lockdowns effectively, governments and private companies around the world are considering the use of data-driven digital tools, in particular digital tracing solutions. Such tracing technology may serve multiple purposes; among the main ones are: (i) collecting mobile location data in order to model the spread of the virus and measure the effectiveness of confinement measures; and (ii) contact tracing, in order to alert individuals that they have been in close proximity to someone who has tested positive for COVID-19. Various initiatives to develop tracing solutions are currently being pursued, both public and private, which praise the merits of tracing solutions that have been rolled out successfully in the Asia-Pacific region. The use of such tracing tools in the context of the pandemic requires the collection of personal data such as health data and potentially location data, which has prompted data protection authorities in Europe to stress the need to comply with the fundamental privacy rules set forth in the GDPR[1] and the ePrivacy Directive[2]. Debates have arisen in various European jurisdictions on how to reconcile the use of digital tracing solutions, which could be perceived as intrusive, with the need to guarantee individuals’ rights such as privacy and data security. In order to help European countries navigate these complex issues, on April 21, 2020 the European Data Protection Board (“EDPB”) adopted its guidelines on the use of location data and contact tracing tools in the context of the COVID-19 outbreak (“Guidelines”).[3] This Client Alert summarizes the key privacy implications of collecting personal data through tracing tools in Europe.

1. Positions and Guidance of the EU Institutions regarding COVID-19 Tracing Applications

Since the beginning of the pandemic, the European Commission and its various institutions have been supportive of private and public initiatives to create tracing applications capable of contributing to the containment of COVID-19. The European Commission has compiled an inventory of the initiatives carried out in Europe and worldwide to develop and offer digital tracing tools in response to the COVID-19 pandemic[4]. However, the use of data-intensive technologies and applications, as well as the artificial intelligence underpinning them, has also raised concerns from a data privacy and cybersecurity perspective, which have led both the European Commission and the EDPB to adopt guidelines:
  • Taking the GDPR and the ePrivacy Directive as references, the European Commission and the EDPB began publishing their opinions on the compliance of tracing applications with EU privacy and cybersecurity rules in mid-March and throughout April. The EDPB first stressed that the GDPR should not hinder the capacity of the EU Member States to fight the pandemic through the processing of mobile location data, provided that such data were anonymized and collected in an aggregated manner.[5] The European Commission published its guidance on applications[6] setting out the main requirements that applications must meet to ensure compliance with data protection regulations (e.g., retention of control by users, applicable legal basis, data minimization principle). The European Commission further indicated in its guidance that app developers should aim to exploit the latest privacy-enhancing technological solutions, such as Bluetooth proximity technology, in order to provide contact tracing features without allowing applications to track individuals’ locations.
  • Based on the European Commission’s guidance, the EDPB finally adopted its Guidelines on April 21, 2020 on the use of location data and contact tracing tools in the context of the COVID-19 outbreak.[7]

The EDPB emphasized that preference should always be given to the processing of anonymized data rather than personal data of identified or identifiable persons. It also recalled that anonymization[8] processes have to comply with strict requirements and pass a “reasonableness test”, which assesses the effectiveness of anonymization tools taking into account both objective aspects (e.g., the technical means available to re-identify individuals) and factual elements (e.g., the nature and volume of the data that would need to be processed to re-identify individuals).

As to the use of tracing tools, the EDPB confirmed that the use of contact tracing applications should be voluntary. Accordingly, in order to ensure that individuals have a real choice, those who decide not to (or cannot) use such tracing applications should not suffer any negative consequence as a result of that decision or situation. In the words of the EDPB, individuals must have full control over their personal data at all times, and should be able to choose freely whether to use any such application.

The EDPB also recalled general data protection principles that should be complied with in this context, notably:

Accountability: The EDPB indicates that the controller of the contact tracing application should be clearly identified. In this respect, it considers that national health authorities could act as controllers for applications developed by public administrations, but other controllers may also be envisaged.

Purpose limitation: The Guidelines specify that the purposes pursued by tracing applications should exclude objectives or uses of data that are unrelated to the management of the COVID-19 crisis (for example, commercial purposes).

Data minimization and principles of privacy by design and by default: According to these GDPR principles, any personal data processed should be reduced to the strict minimum. Rather than collecting and sharing location data via the application, contact tracing applications should be based on proximity communication technologies that enable the broadcasting and receipt of data among users (e.g., proximity data based on Bluetooth Low Energy). The EDPB recommends that this data should be subject to regular pseudonymisation, and that measures should be implemented to prevent re-identification.

Lawfulness of processing: Different legal bases are considered by the EDPB depending on the data collected and the entity providing the application (e.g., consent, performance of a task for public interest). The choice of the most appropriate legal basis will largely depend on whether an application is developed and offered by private parties or by public administrations.

Storage limitation: The Guidelines recommend that personal data should be erased or anonymized immediately after the COVID-19 crisis.

Data security: The EDPB recommends using pseudonymous identifiers and state-of-the-art cryptographic techniques to ensure data security.
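The rotation of pseudonymous identifiers contemplated under the data minimization and data security principles can be sketched in a few lines. This is a minimal illustration only, assuming an HMAC-based derivation of short-lived Bluetooth broadcast identifiers; it does not reproduce any particular protocol's specification.

```python
import hmac
import hashlib
import os

EPOCH_SECONDS = 15 * 60  # rotate the broadcast identifier every 15 minutes

def ephemeral_id(daily_key: bytes, epoch: int) -> bytes:
    """Derive a short, unlinkable identifier for one broadcast epoch.

    The identifier is an HMAC of the epoch counter under a device-local
    daily key, truncated to 16 bytes, so passive observers cannot link
    two broadcasts from the same device without knowing the key.
    """
    mac = hmac.new(daily_key, epoch.to_bytes(4, "big"), hashlib.sha256)
    return mac.digest()[:16]

# A device draws a fresh random daily key and broadcasts a new
# identifier each epoch; only the rotating identifier ever leaves
# the device, in line with the data-minimization principle.
daily_key = os.urandom(32)
ids = [ephemeral_id(daily_key, e) for e in range(3)]
assert len(set(ids)) == 3           # identifiers differ across epochs
assert all(len(i) == 16 for i in ids)
```

Because each identifier is derived from a key that never leaves the device, an eavesdropper who records broadcasts cannot link them to one another, or to a person, without that key.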

Finally, the EDPB considers that a data protection impact assessment must be carried out before implementing a tracing tool, as the data processing involved in the functioning of such an application is likely to result in a high risk to the rights and freedoms of individuals.

As an annex to these Guidelines, the EDPB provided practical guidance to developers and users of contact tracing applications. The Guidelines note, for example, that the source code of the application and of its back-end should be open.

It is worth mentioning that the EDPB adopted two letters on April 24, 2020 in response to members of the European Parliament[9]. In these letters, the EDPB notably recalled that data protection law already allows for the data processing necessary to fight an epidemic and that “there is no need to lift GDPR provisions but just to observe them”. The EDPB mainly referred to its recently adopted Guidelines and specified that it took into account concerns from the stakeholders involved, as well as the general public, when drafting the Guidelines.

2. Specific Positions of Member States regarding COVID-19 Tracing Applications

Certain EU Member States and other states of the European Economic Area have already deployed tracing tools in their respective territories (e.g., Austria, the Czech Republic, Iceland and Poland). Other EU Member States are still considering the development and roll-out of tracing applications, but the conditions under which such applications would contribute to the fight against COVID-19 have not yet been clearly established. In this context, EU Member States’ Data Protection Authorities (“DPAs”) have also issued their own opinions in recent weeks, applicable to applications used in their respective territories. While these DPA opinions should follow and build on the basic principles set at EU level, each DPA has taken its own position on the various data privacy implications of tracing tools in its Member State.
  • Belgium
In Belgium, the main initiative to develop a contact tracing application has come from the public administration. By mid-April, it was reported that the Belgian State was working on a public application that would enable the tracking of infected individuals and the issuing of warnings to other individuals who had crossed paths with infected persons within a given period of time. However, speaking to the press on April 23, 2020, the Belgian Minister of Digital Agenda indicated that Belgium did not need, for the time being, an application for automated contact tracing. Instead, the Minister expressed a preference for confinement rules and for manual tracing by health services. To support this position, the Minister also referred to the high utilization rate that a tracing application would require to be effective (60%), and the low download rates recorded in other European countries where an application had already been launched (e.g., Austria, where only 400,000 downloads had been registered in a country with a population of 8.9 million people, i.e., a 4.5% usage rate).

Notwithstanding the position adopted by the Belgian Government, on April 8, 2020 the Belgian Data Protection Authority (“Belgian DPA”) published a press release[10] emphasizing basic principles for the processing of personal data by contact tracing applications. First, the Belgian DPA recommends not processing any personal data that is not required to offer the services to users (e.g., name, e-mail address, mobile number). However, the provision of certain applications necessarily implies the processing of personal data in order to offer the service (e.g., IP addresses). In such situations, applications should only process personal data to the extent needed to ensure their proper functioning in light of the objective pursued.
Any data inputted may continue to be processed by the app provider depending on whether the user intends the service to continue after he or she has finished using the application.
  • France
On April 24, 2020, the French DPA (“CNIL”) adopted a decision on the proposed mobile application “StopCovid”[11] initiated by the French government, a contact tracing application based on Bluetooth technology (not on geolocation technology). First, in accordance with the purpose limitation principle, the CNIL notes that the tracing tool may only be used to inform its users of contact with an individual who has tested positive for COVID-19, and not for other purposes such as monitoring compliance with confinement measures. The CNIL also welcomes the fact that the intended tool would be based on a voluntary approach. As recommended by the EDPB Guidelines, the CNIL notes that individuals who decide not to download or use the “StopCovid” application should not suffer any negative consequences (such as a prohibition on taking public transportation). With respect to the lawfulness of the processing, again in line with the EDPB Guidelines, the CNIL considers that the performance of a task carried out in the public interest would be the most appropriate legal basis where the processing is carried out by public authorities (Art. 6(1)(e) of the GDPR). For the specific processing of health data, the CNIL considers that the processing carried out in the context of the “StopCovid” application would be necessary for reasons of public interest in the area of public health (Art. 9(2)(i) of the GDPR). With this in mind, the CNIL recommends that the use of a voluntary contact tracing application be governed by a specific legal provision in French law. In addition, the authority recalls the principles of data minimization and storage limitation, according to which the data shall be kept only for as long as needed for the use of the application. Finally, the CNIL provided specifications on the application’s configuration.
As to the accountability principle, the authority considers that the controller should be the French Health Ministry or any other health authority involved in the management of the health crisis. It also recalls the need to carry out a data protection impact assessment, as recommended by the EDPB. The importance of data accuracy and data security, as well as respect for data subjects’ rights, should also be taken into account. The French Parliament will debate in the coming weeks whether to implement this application. If, after the debate, it is decided to deploy the application, the CNIL has asked to be consulted again in order to give its opinion on the final design of the “StopCovid” application.
  • Germany
In Germany, one initiative emerged that initially garnered the strongest State support: the Pan-European Privacy-Preserving Proximity Tracing (“PEPP-PT”) initiative[12]. Composed of a consortium of over 130 members, including telecommunications operators, health service providers, scientists and other relevant actors and stakeholders, the PEPP-PT initiative was created on March 31, 2020 in order to develop and offer a tracing technology that would comply with EU privacy and data protection rules and would be effective in States’ containment efforts against COVID-19. However, the PEPP-PT initiative has recently come under strong criticism for its centralized structure, which requires users to upload contact logs to a central reporting server, thereby allegedly exposing users to direct State control.[13] The PEPP-PT protocol is allegedly supported by the UK, France[14] and, until recently, Germany.[15]

Another initiative, Decentralized Privacy-Preserving Proximity Tracing (“DP-3T”), has also garnered strong support in the EU. Backed by Switzerland, Austria and Estonia,[16] in cooperation with companies such as Apple and Google, DP-3T would reportedly abide more strictly by the guidance offered by EU authorities, in particular as regards the reliance on proximity data technology and the absence of location tracking. Its decentralized structure does not require users to upload contact logs (which remain on the users’ devices), and the processing of data to inform users of contacts with infected individuals occurs locally. Further, under the decentralized DP-3T approach users may opt to voluntarily share their phone number and details of their symptoms with the authorities, but this does not occur automatically, unlike under the centralized structure of the PEPP-PT initiative.
Germany was one of the main supporters of the PEPP-PT initiative until April 26, 2020, when it backed away from a centralized approach in favor of a decentralized system architecture. The German data protection commissioner[17] indicated on his website that contact tracing tools should be implemented in a transparent manner and on a voluntary basis. According to the commissioner, individual tracking and subsequent re-identification should be ruled out.
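The architectural difference between the two protocols can be made concrete with a short sketch of decentralized matching, loosely in the spirit of DP-3T. This is an illustrative simplification under assumed primitives (HMAC-derived ephemeral identifiers), not the published protocol: an infected user voluntarily publishes only a key from which his or her past broadcast identifiers can be re-derived, and every other handset performs the matching locally.

```python
import hmac
import hashlib
import os

def ephemeral_ids(daily_key: bytes, n_epochs: int) -> set:
    """Re-derive the set of identifiers a device broadcast from one key."""
    return {hmac.new(daily_key, e.to_bytes(4, "big"), hashlib.sha256).digest()[:16]
            for e in range(n_epochs)}

# --- on Alice's phone: record identifiers heard over Bluetooth ---
bob_key = os.urandom(32)
carol_key = os.urandom(32)
observed = ephemeral_ids(bob_key, 96) | ephemeral_ids(carol_key, 96)

# --- Bob tests positive and voluntarily uploads only his daily key ---
published_keys = [bob_key]

# --- matching happens locally on Alice's handset, not on a server ---
exposed = any(observed & ephemeral_ids(k, 96) for k in published_keys)
assert exposed  # Alice is warned; Carol's identifiers never leave her phone
```

The design choice is that the server only ever sees keys of users who tested positive and consented; the contact graph itself is never uploaded anywhere.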
  • Spain
In Spain, regions such as Madrid, Catalonia, the Basque Country and Valencia have offered publicly sponsored applications to trace infected individuals. Building on the tracing application developed by the Madrid Region, the Spanish Government launched a nation-wide application, initially covering a limited number of regions. These applications aim at tracing users and their contacts in order to alert them to potential COVID-19 contagion and spread. However, the overlapping uses of the different applications and their limited uptake (not exceeding 10% of the population in the respective regions) have put their effectiveness into question. Recently, it was reported that Spain is participating in the PEPP-PT initiative, although it is unclear whether this position will shift to the DP-3T initiative backed by other Member States. Once the technology becomes available, Spain would require the cooperation of both public administrations and private entities to launch the automated tracing application[18].

On March 26, 2020, the Spanish Data Protection Agency (“AEPD”) published a communication on self-evaluation and contact tracing applications to fight COVID-19[19]. While the AEPD acknowledged that the GDPR and Spanish data protection rules cannot serve as an obstacle limiting the effectiveness of any measure, it noted that fundamental data protection and privacy rights still need to be complied with. As regards the legal bases available to offer contact tracing applications, the AEPD indicates that data processing by national and regional health authorities may be carried out in the public interest and to protect the vital interests of individuals. Applications developed and operated by private entities need to rely on another legal basis in order to process personal data (e.g., consent).
Any data collected may only be processed for purposes related to the control of the COVID-19 epidemic (e.g., to offer information on the use and control of the self-evaluation applications, or to compile statistics with aggregated geolocation data to offer maps informing users of high- and low-risk areas). The AEPD also reminded app developers that parental authorization is required for users under the age of 16.
  • United Kingdom
The United Kingdom DPA (“ICO”) published, on April 17, 2020, an opinion[20] on the Apple and Google joint initiative (called the Contact Tracing Framework) to enable the use of Bluetooth technology to help governments and public health authorities reduce the spread of the virus. The ICO indicates that the proposals for this initiative appear to be aligned with the principles of data protection by design and by default. It also specifies in its opinion that organizations designing contact tracing applications are responsible for ensuring that the application complies with data protection law and that such organizations act as controllers. The ICO also published a series of questions[21] to be taken into account to ensure that privacy concerns are properly considered when using digital tracing tools. Finally, it is worth noting that the ICO revealed[22] that it has been working with the National Health Service (“NHS”) on the development of a contact tracing application in order to help ensure a high level of transparency and governance. The NHS has emphasized its commitment to transparency, security and privacy, and its collaboration with health data privacy stakeholders and advisers in developing the application[23]. However, the NHS’s proposed application differs from the Apple-Google model, in particular by using a structure centralized within the NHS: the matching process, which works out which phones should receive alerts, happens on a central server rather than on the handsets themselves. While it is hoped that this will make it easier for the NHS to notify people appropriately and to adapt the system as knowledge improves, there may be a trade-off in terms of the central repository’s vulnerability to hackers.
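By contrast, a centralized architecture of the kind described for the NHS application can be sketched as follows. This is a hypothetical, heavily simplified model (the class and method names are our own invention, not any NHS API): handsets upload their contact logs to a central server, and it is the server that computes which users to alert.

```python
# Centralized matching (illustrative only): handsets upload contact
# logs, and the server decides whom to alert.

class CentralServer:
    def __init__(self):
        self.contact_logs = {}   # user -> set of pseudonymous ids heard

    def upload_log(self, user: str, heard_ids: set) -> None:
        """A handset uploads the identifiers it heard over Bluetooth."""
        self.contact_logs[user] = set(heard_ids)

    def report_infection(self, infected_ids: set) -> list:
        """Return the users who heard any of the infected device's ids."""
        return [u for u, heard in self.contact_logs.items()
                if heard & set(infected_ids)]

server = CentralServer()
server.upload_log("alice", {"id-1", "id-2"})
server.upload_log("carol", {"id-9"})

# Bob tests positive; the server, not the handset, computes the matches,
# which centralizes both the notification logic and the privacy risk.
assert server.report_infection({"id-2"}) == ["alice"]
```

The sketch makes the trade-off visible: the operator can tune and audit the matching logic centrally, but the server holds everyone's contact logs, creating the single repository whose vulnerability the ICO and commentators have flagged.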

***

As can be seen, the EU institutions, the EDPB and the EU Member State DPAs have published their guidelines and made clear the red lines to be observed in preparing and offering contact tracing tools. Regardless of whether such tools result from public or private initiatives, these rules and the principles enshrined in the GDPR should drive their development and offering. The effectiveness of contact tracing applications relies on their wide adoption by users in a territory, which largely depends on the strength of the State sponsorship received or on the ability of companies to advertise their use. From a technical standpoint, research projects and initiatives like PEPP-PT and DP-3T have emerged that aim at developing national applications based on a standardized approach. The infrastructure of the tools being developed under such protocols, whether centralized or decentralized, may come under scrutiny by the European Commission and the DPAs. However, given the importance of State sponsorship in the adoption of protocols for particular territories, and the apparently divergent approaches followed by different EU Member States, it is likely that both protocols will co-exist within a non-harmonized EU approach.

We will continue to monitor privacy and cybersecurity developments related to COVID-19 in Europe and around the world, and will provide further communications as developments warrant. Gibson Dunn's lawyers are also available to assist with any questions you may have regarding privacy implications of tracing tools in the United States.

____________________
[1] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC.
[2] Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector.
[3] Guidelines 04/2020 on the use of location data and contact tracing tools in the context of the COVID-19 outbreak, adopted on April 21, 2020.
[4] Commission’s Inventory of Mobile Solutions against COVID-19.
[5] EDPB Statement on the processing of personal data in the context of the COVID-19 outbreak, adopted on March 19, 2020.
[6] Commission Guidance of April 17, 2020 on Apps supporting the fight against the COVID-19 pandemic in relation to data protection. Please note that, considering the urgency of the current situation, these guidelines will not be subject to public consultation.
[7] Guidelines 04/2020 on the use of location data and contact tracing tools in the context of the COVID-19 outbreak, adopted on April 21, 2020.
[8] The EDPB defined anonymization as “the use of a set of techniques in order to remove the ability to link the data with an identified or identifiable individual against any “reasonable” effort”.
[9] Letter of the EDPB to Sophie in ’t Veld dated April 24, 2020; Letter of the EDPB to Ms Ďuriš Nicholsonová and Mr Jurzyca dated April 24, 2020.
[10] Publication on the website of the Belgian DPA.
[11] Decision n° 2020-046 of April 24, 2020 adopting an opinion on the proposed mobile application “StopCovid”.
[12] Website of the PEPP-PT initiative.
[13] Website of Ouest France.
[14] Reuters, Germany flips to Apple-Google approach on smartphone contact tracing, news report dated April 26, 2020.
[15] Statement by Helge Braun, Minister of the Chancellery, and Jens Spahn, Federal Minister of Health, on the tracing app, press release dated April 26, 2020 (available in German).
[16] Supra note 13.
[17] Publication on the website of the Federal Commissioner for Data Protection and Freedom of Information dated April 22, 2020 (available in German).
[18] Publication on the website of El País dated April 14, 2020.
[19] Publication on the website of the Spanish Data Protection Agency dated March 26, 2020.
[20] ICO Opinion: Apple and Google joint initiative on COVID-19 contact tracing technology, dated April 17, 2020.
[21] ICO Blog: Combatting COVID-19 through data: some considerations for privacy, dated April 17, 2020.
[22] ICO Statement in response to details about an NHSX contact tracing app to help deal with the COVID-19 pandemic, dated April 24, 2020.
[23] NHS Blog: NHSX: Digital contact tracing: protecting the NHS and saving lives, dated April 24, 2020.
The following Gibson Dunn lawyers prepared this client update: Ahmed Baladi, Alexander Southwell, Patrick Doris, Michael Walther, Vera Lukic, Alejandro Guerrero, Clémence Pugnet, Selina Grün, Sarika Rabheru and Charlotte Fuscone. Gibson Dunn lawyers regularly counsel clients on the privacy and cybersecurity issues raised by this pandemic, and we are working with many of our clients on their response to COVID-19. Please also feel free to contact the Gibson Dunn lawyer with whom you usually work, the authors, or any member of the Privacy, Cybersecurity and Consumer Protection Group: United States Alexander H. Southwell - Co-Chair, PCCP Practice, New York (+1 212-351-3981, asouthwell@gibsondunn.com) Debra Wong Yang - Los Angeles (+1 213-229-7472, dwongyang@gibsondunn.com) Matthew Benjamin - New York (+1 212-351-4079, mbenjamin@gibsondunn.com) Ryan T. Bergsieker - Denver (+1 303-298-5774, rbergsieker@gibsondunn.com) Howard S. Hogan - Washington, D.C. (+1 202-887-3640, hhogan@gibsondunn.com) Joshua A. Jessen - Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com) Kristin A. Linsley - San Francisco (+1 415-393-8395, klinsley@gibsondunn.com) H. Mark Lyon - Palo Alto (+1 650-849-5307, mlyon@gibsondunn.com) Karl G. Nelson - Dallas (+1 214-698-3203, knelson@gibsondunn.com) Deborah L. Stein (+1 213-229-7164, dstein@gibsondunn.com) Eric D. Vandevelde - Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com) Benjamin B. Wagner - Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com) Michael Li-Ming Wong - San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, mwong@gibsondunn.com) Europe Ahmed Baladi - Co-Chair, PCCP Practice, Paris (+33 (0)1 56 43 13 00, abaladi@gibsondunn.com) James A. 
Cox - London (+44 (0)20 7071 4250, jacox@gibsondunn.com) Patrick Doris - London (+44 (0)20 7071 4276, pdoris@gibsondunn.com) Penny Madden - London (+44 (0)20 7071 4226, pmadden@gibsondunn.com) Michael Walther - Munich (+49 89 189 33-180, mwalther@gibsondunn.com) Kai Gesing - Munich (+49 89 189 33-180, kgesing@gibsondunn.com) Alejandro Guerrero - Brussels (+32 2 554 7218, aguerrero@gibsondunn.com) Vera Lukic - Paris (+33 (0)1 56 43 13 00, vlukic@gibsondunn.com) Sarah Wazen - London (+44 (0)20 7071 4203, swazen@gibsondunn.com) Asia Kelly Austin - Hong Kong (+852 2214 3788, kaustin@gibsondunn.com) Jai S. Pathak - Singapore (+65 6507 3683, jpathak@gibsondunn.com) © 2020 Gibson, Dunn & Crutcher LLP Attorney Advertising:  The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.

April 22, 2020 |
Supreme Court to Resolve Longstanding Circuit Split Over Scope of Federal Anti-Hacking Statute

On Monday, April 20, 2020, the U.S. Supreme Court granted certiorari in Van Buren v. United States, No. 19-783, to address a decade-long circuit split regarding the scope of the Computer Fraud and Abuse Act (“CFAA”), 18 U.S.C. § 1030, a statute the Supreme Court has never before interpreted and that is routinely invoked in both criminal and civil settings.  The case gives the Court an opportunity to decide whether a person or entity that is legitimately authorized to access a computer for one purpose, but accesses it for some other unauthorized purpose, violates the CFAA.  The case has far-reaching implications for how millions of Americans interact with websites and use the Internet, including shaping potential criminal and civil liability for individuals who violate commonplace terms of service or exceed the scope of authorized use of their employer-provided email, computers, and databases.  It also has implications for companies drafting or revising their terms of service, updating their employee Internet or email policies, or engaging in business operations that may be seen as data “scraping,” among other situations.

Statutory Background

Congress first enacted Section 1030 in 1984, long before worldwide access to the Internet existed and before personal computers became ubiquitous.  The purpose of the statute was to deter “the activities of so-called ‘hackers’ who” were accessing “both private and public computer systems.”  H.R. Rep. No. 98-894, at 10 (1984).  Two years later, Congress amended the statute, and it became known as the CFAA.  Over the years, Congress has further amended the statute to cover a broad range of “protected” computers, which include servers and other technologies connected to the Internet. The CFAA covers multiple types of unlawful computer access and, in relevant part, provides that “[w]hoever . . . intentionally accesses a computer without authorization or exceeds authorized access, and thereby obtains . . .
information from any protected computer,” commits a federal crime and may face civil liability.  18 U.S.C. § 1030(a)(2).  The phrase “exceeds authorized access,” which is an operative clause in a number of the provisions in the statute, is defined as: “to access a computer with authorization and to use such access to obtain or alter information in the computer that the accesser is not entitled so to obtain or alter.”  Id. § 1030(e)(6); see also id. § 1030(a)(1), (2), (4), (7).  A “protected computer” is any computer that “is used in or affect[s] interstate or foreign commerce or communication of the United States.”  Id. § 1030(e)(2)(B).

Violations of the CFAA can result in both criminal and civil liability.  A criminal conviction under the “exceeds authorized access” provision of the CFAA is typically a misdemeanor, but can be a felony punishable by fines and imprisonment of up to five years in certain situations, including where the offense was committed for “commercial advantage or private financial gain.”  Id. § 1030(c)(2)(A), (B). Importantly, the statute also authorizes civil suits for compensatory damages and injunctive or other equitable relief by parties who show, among other things, that a violation of the “exceeds authorized access” provision caused them to “suffer[ ] damage or loss.”  Id. § 1030(g).  That provision is often invoked in civil suits around the country.

Circuit Split

For years, the courts of appeals have split over whether a person “exceeds authorized access” under Section 1030(a)(2) by using authorized computer access for an unauthorized purpose. On the one hand, the Second, Fourth, and Ninth Circuits have taken a narrow view, holding that a person “exceeds authorized access” only if he accesses information on a computer that he is prohibited from accessing—activity analogous to “breaking and entering” in the digital space.  See United States v. Valle, 807 F.3d 508, 523–28 (2d Cir. 2015); WEC Carolina Energy Sols. LLC v.
Miller, 687 F.3d 199, 205–06 (4th Cir. 2012); United States v. Nosal (Nosal I), 676 F.3d 854, 857–63 (9th Cir. 2012) (en banc); see also hiQ Labs, Inc. v. LinkedIn Corp., 938 F.3d 985, 999–1002 (9th Cir. 2019).  Under this view, for example, an employee who downloads confidential information from a company database that he is authorized to access, but who does so for the improper purpose of disclosing the information to someone outside the company, has not violated the CFAA.  See Nosal I, 676 F.3d at 857–63.  Nor has a company that uses automated bots to scrape information from another company’s public webpage in violation of the website’s terms of use.  hiQ Labs, Inc., 938 F.3d at 999–1002. On the other hand, the First, Fifth, and Seventh Circuits have taken a broader view, holding that a person “exceeds authorized access” if, even using a computer to access information that he is legitimately authorized to access, he does so for an improper or unauthorized purpose.  See United States v. John, 597 F.3d 263, 271–72 (5th Cir. 2010); Int’l Airport Ctrs., L.L.C. v. Citrin, 440 F.3d 418, 420–21 (7th Cir. 2006); EF Cultural Travel BV v. Explorica, Inc., 274 F.3d 577, 581–84 (1st Cir. 2001).  Under this view, for example, an employee who downloads confidential information from an internal company system that he is authorized to access in the course of his official duties, but who does so for the improper purpose of using that information to perpetrate a fraud or for some other unauthorized purpose, has violated the “exceeds authorized access” prong of the CFAA.  See John, 597 F.3d at 272–73.  So too has a company that uses data‑scraping software to systematically glean a competitor’s prices from the competitor’s public website.  EF Cultural Travel BV, 274 F.3d at 583–84. 
In Van Buren, the Eleventh Circuit joined those circuit courts that have taken a broader view of the CFAA’s statutory sweep, affirming the conviction and eighteen-month sentence of a police officer who used a computer to look up an exotic dancer’s license plate number in exchange for a loan.  The Eleventh Circuit reasoned that Van Buren “exceed[ed] authorized access” to the law-enforcement computer system when he used his legitimate access for an improper purpose, even though he had permission to access the database for other purposes.  United States v. Van Buren, 940 F.3d 1192, 1205–07 (11th Cir. 2019).  The court explained that it was bound by a previous decision in United States v. Rodriguez, 628 F.3d 1258 (11th Cir. 2010), which established that “even a person with authority to access a computer can be guilty of computer fraud [under the CFAA] if that person subsequently misuses the computer,” Van Buren, 940 F.3d at 1207.  Under that interpretation of “exceeds authorized access,” there was “no question” that the record contained sufficient evidence for a jury to convict Van Buren of computer fraud.  Id. at 1208.

Policy Implications

Although the Supreme Court might decide Van Buren narrowly based on the unique facts and procedural posture of the case, it is possible that the Court will take this opportunity to resolve the circuit split and to provide guidance about the scope of the CFAA’s “exceeds authorized access” provision.  If the Court does so, it will need to balance many competing policy interests. For example, those in favor of a narrow interpretation of the CFAA assert that the statute was not intended to be an all-purpose computer and Internet policing statute, but instead was intended to prohibit more egregious unauthorized access to computer systems akin to hacking.
An expansive reading of the statute, they contend, would subject individuals to civil or criminal liability for innocuous computer or Internet use, as when an individual violates a website’s terms of service or a school’s or employer’s computer use policy. In support of the petition for a writ of certiorari, the Electronic Frontier Foundation and other amici curiae even hypothesized that thousands of employees of federal government agencies, such as the Department of the Interior and U.S. Postal Service, would risk criminal prosecution under a broad interpretation of the statute if they violate their respective agency’s prohibitions against personal video streaming from commercial or news organizations on government-issued devices while connected to a government network.[1]  The same rationale would apply in the context of potential civil liability, wherein a broad interpretation of the CFAA could subject countless individuals to substantial damages awards or onerous court-ordered injunctions for violations of computer or Internet policies. Those endorsing the narrow view also contend that a broad construction of the statute would give prosecutors too much discretion and lead to arbitrary or discriminatory enforcement.  They cite as an example Internet “hacktivist” and Harvard University student Aaron Swartz, who was indicted for unlawfully accessing MIT’s computer network (where he was in fact an authorized user) and downloading a large number of academic journal articles in violation of the network’s terms of use.  Swartz tragically took his own life before standing trial. Supporters of the narrow view further posit that a broad construction of the CFAA would put the statute on a collision course with the First Amendment by punishing online investigative techniques commonly used by journalists, academic researchers, private investigators, and others engaged in expressive conduct or speech that may also run afoul of computer or Internet terms of use. 
By contrast, those in favor of a broader interpretation of the CFAA contend that an expansive interpretation of the statute is more consistent with congressional intent—to stop bad actors from committing computer-facilitated fraud and theft, in addition to hacking.  These proponents argue that fears of over-zealous or arbitrary criminal enforcement are overblown, particularly in light of DOJ guidance setting forth a uniform charging policy for computer crimes.  They also contend that a more expansive interpretation of the CFAA promotes a safer Internet that benefits and protects companies and consumers alike, and can curb what some perceive to be unfair competitive intelligence practices, such as when a company scrapes data from the websites of competitors.

The Supreme Court’s decision in Van Buren may provide much-needed clarity on these and other issues, giving companies, consumers, and law enforcement a better understanding of what type of online and computer conduct is subject to civil and criminal liability under the CFAA.  Any such guidance, in turn, would establish new parameters that companies—and others potentially liable for the activities of their agents—should closely follow when revising both their online terms of use and their internal policies governing how employees may use email, computers, and other technologies when logged onto an internal (or external) network.

____________________

[1] Brief for Electronic Frontier Foundation et al. as Amici Curiae Supporting Petitioner, Van Buren v. United States, No. 19-783 (2020), at 18–19.


Gibson Dunn’s lawyers are available to assist with any questions you may have regarding these developments. For additional information, please contact the Gibson Dunn lawyer with whom you usually work, any member of the firm’s Appellate and Constitutional Law and Litigation practice groups, or the following authors: Authors:  Avi Weitzman, Matthew Benjamin, Joel M. Cohen, Alexander H. Southwell, Brandon Boxler, Erica Sollazzo Payne, Doran Satanove, and Samantha Weiss © 2020 Gibson, Dunn & Crutcher LLP Attorney Advertising:  The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.

April 9, 2020 |
Gibson Dunn Paris | Data Protection – April 2020

Click for PDF

Personal Data Watch

European Institutions

03/19/2020 – EDPB | Statement | COVID-19

The European Data Protection Board (EDPB) issued a statement on the processing of personal data in the context of the COVID-19 outbreak.

In its publication, the EDPB refers to the criteria for the lawfulness of the processing, outlines the main data protection principles and answers some questions relating to the processing of location data.

With respect to the lawfulness of the processing, the EDPB specifies the different legal bases that may apply in the context of an epidemic to allow employers and public health authorities to process personal data without the consent of individuals (i.e., processing in the public interest, processing to protect the vital interests of individuals, or processing to comply with a legal obligation). However, it remains necessary to comply with national laws, which may sometimes restrict these provisions.

Regarding the processing of mobile location data, the EDPB emphasizes in particular that public authorities should first seek to process these data in an anonymous manner; where processing only anonymous data is not possible, Member States must introduce legislative measures to safeguard public security. An example of adequate safeguards would be to provide users of electronic communication services with the right to a judicial remedy. Finally, the EDPB recalls that, in accordance with the principle of proportionality, the least intrusive solution should always be preferred.

For further information: Website EDPB

To be noted: Since March 6, 2020, several European Supervisory Authorities have also published recommendations on their websites with respect to the processing of personal data in the context of the fight against COVID-19, sometimes adopting different approaches.

For more information: Austria | Belgium | Bulgaria | Czech Republic | Denmark | Finland | France | Germany | Hungary | Iceland | Ireland | Italy | Lithuania | Luxembourg | Norway | Poland | United Kingdom | Slovakia | Slovenia | Spain | Sweden | Switzerland


03/18/2020 – EDPS | Annual report 2019

On March 18, 2020, the European Data Protection Supervisor (EDPS) published its 2019 Annual Report.

This report presents the various activities conducted by the EDPS in 2019, which focused on consolidating the achievements of previous years, assessing the progress made and defining future priorities.

For more information: Website EDPS


03/04/2020 – Court of Justice of the European Union | Opinion | Advocate General | Consent

The Advocate General of the Court of Justice of the European Union, Maciej Szpunar, issued his Opinion in the case (Case C-61/19) between the Romanian Data Protection Supervisory Authority and the telecommunications provider Orange Romania.

In March 2018, the Romanian Supervisory Authority had imposed an administrative sanction on Orange Romania for having collected and kept copies of its customers’ identity documents without their express consent. The authority noted that the company had concluded agreements for the provision of telecommunications services, with copies of the identity documents attached to them. According to these agreements, the customers had been informed of and had consented to the collection and retention of these copies, as evidenced by crosses inserted in boxes added to the contractual clauses. However, according to the authority’s findings, the company did not provide evidence that, at the time the agreements were concluded, the customers had an informed choice as to the collection and retention of these copies.

In that context, Orange Romania brought an action before the national court challenging the fine imposed on it. That court referred two questions to the Court of Justice for a preliminary ruling. In his Opinion, the Advocate General suggests that the Court should reply that an individual who wishes to conclude an agreement “for the provisions of telecommunication services with an undertaking does not give his or her ‘consent’, that is, does not indicate his or her ‘specific and informed’ and ‘freely given’ wishes, […] to that undertaking when he or she is required to state, in handwriting, on an otherwise standardized contract, that he or she refuses to consent to the photocopying and storage of his or her ID documents.”

For further information: Website IAPP | Curia


France

03/27/2020 – Council of State | Revocation of a decision from the French Supervisory Authority | Dereference | Google

On March 27, 2020, the Council of State annulled a decision of the French Supervisory Authority (the CNIL) concerning the geographical scope of the right to be forgotten.

On March 10, 2016, the CNIL had imposed a fine on Google for failing to comply with a formal notice issued by the CNIL requiring it to make dereferencing effective on all national versions of its search engine, Google Search. Google appealed this decision before the Council of State which, in line with the European Court of Justice’s ruling of September 24, 2019, annulled the CNIL’s decision.

In its decision, the Council of State emphasized that the alleged breach by Google had to be assessed under the provisions of the French Data Protection Act No. 78-17 of January 6, 1978, as amended to implement the Directive of October 24, 1995, given that the CNIL’s sanction was issued in 2016. The Council of State also referred to the European principle of dereferencing and considered that, as the French legislator has not adopted any specific provisions allowing the CNIL to order dereferencing beyond the scope of EU law, the CNIL may only order dereferencing at the European level.

For further information: Council of State's Decision | CNIL Website


03/25/2020 – French Supervisory Authority | Recommendation on cookies & other trackers | Postponement

The adoption of the final version of the draft recommendation on “Cookies & other trackers”, initially scheduled for early April, is postponed.

On March 25, 2020, the French Supervisory Authority indicated in a publication on its website that the adoption of the final version of the draft recommendation on “Cookies and other trackers”, initially scheduled for early April, is postponed to a later date, which will be set depending on the evolution of the health situation.

For further information: CNIL Website


03/12/2020 – French Supervisory Authority | Control Strategy 2020

In a statement dated March 12, 2020, the French Supervisory Authority (the CNIL) presents the topics on which it will focus as a priority in 2020: health data, geolocation used in the context of local services, and cookies and other trackers.

With regard to the security of health data, the CNIL intends to focus on the security measures implemented by health professionals, or on their behalf, to protect these data, which are subject to specific protection under applicable regulations.

As to geolocation, the CNIL wants to increase its investigations on services whose purpose is to facilitate daily life by using location data (these investigations will focus in particular on the proportionality of the data collected in this context, the retention periods defined, the information provided to individuals and the security measures implemented).

Concerning cookies and other trackers, the CNIL specifies that the recommendation to guide operators will be published in the spring of 2020 (please note that this date has been postponed following the COVID-19 crisis). Organizations will then have a period of six months from the publication of the final recommendation to comply, and the CNIL will start its investigations in autumn 2020.

For further information: CNIL Website


03/06/2020 – French Supervisory Authority | COVID-19 | Recommendations

In the context of the COVID-19 health crisis, the French Supervisory Authority (the CNIL) recalled, on March 6, 2020, the principles relating to the collection of personal data, focusing in particular on the collection of health data by employers.

In its publication, the CNIL responds to inquiries from professionals and individuals on the question of collecting personal data in order to determine whether individuals have symptoms of COVID-19. On this occasion, it sets out a list of “dos and don’ts”.

Among the “Don’ts”, the CNIL states that employers should refrain from collecting - in a systematic and generalized manner, or through surveys or individual requests - information related to potential symptoms presented by an employee/agent or his/her relatives. As an example, the CNIL specifies that it is not possible to implement mandatory body temperature readings for each employee/agent/visitor to be sent daily to his/her superiors.

Nevertheless, the CNIL mentions the possibilities available to employers, particularly under the French Labor Code and their responsibility for the health and safety of their employees. For example, the CNIL specifies that employers can raise awareness and invite their employees to report individually any possible exposure to the virus. In the event of such a report, an employer is entitled to record the date and identity of the individual suspected of having been exposed and the organizational measures taken, and then to communicate to the health authorities, upon request, the elements related to the nature of the exposure. The CNIL also specifies that, under the French Labor Code, each employee/agent is obliged to inform his or her employer in the event of suspected contact with the virus.

For further information: CNIL Website


Ireland

03/25/2020 – Irish Supervisory Authority | COVID-19 | Data Subject Access Requests

On March 25, 2020, the Irish Supervisory Authority (DPC) issued a guidance on the handling of data subjects’ access requests in the context of COVID-19.

The authority states that it is aware of the difficulties created by the health crisis and recommends a methodology for responding to data subjects’ requests. Any organization encountering difficulties in responding to requests should therefore, to the extent possible, communicate with the individuals concerned about the processing of their request, including any extension of the response time. Furthermore, the DPC notes that the GDPR provides for a two-month extension to respond to a request when necessary given the complexity and number of requests.

While it is not possible to derogate from the statutory obligations, the DPC specifies that if a complaint is filed, the facts of each case, including any mitigating circumstances specific to each organization, will be fully taken into account by the authority.

For further information: DPC Website


Italy

03/04/2020 – Italian Supervisory Authority | COVID-19 | Recommendations

The Italian Supervisory Authority published recommendations on its website regarding the processing of personal data in the context of the COVID-19 health crisis.

The Italian authority states that employers should not collect, in advance and in a systematic and generalized manner - including through specific requests to an employee or through surveys - information about the presence of symptoms or about employees’ movements in a personal context. The authority notes in this regard that the collection of information on the symptoms of COVID-19 and on the recent movements of each individual is the responsibility of health professionals and the civil protection system, which must ensure compliance with public health rules.

However, the authority emphasizes that the employer may invite its employees to report whether they have traveled to a high-risk area, particularly since each employee has an obligation to inform his or her employer of any danger to health and safety in the workplace. Furthermore, the authority specifies that when an employee performing duties involving contact with the public encounters a suspected case of COVID-19 in the course of his/her work, he/she must ensure that the competent health services are informed - including through the employer - and must follow the preventive instructions provided by the health services consulted.

For further information: Website IAPP | Italian Supervisory Authority Website


Netherlands

03/03/2020 – Dutch Supervisory Authority | Fine | Sale of personal data

The Dutch Supervisory Authority imposed a fine of €525,000 on a tennis association for selling its members’ personal data.

In 2018, the association illegally sold the personal data of a few thousand of its members to two sponsors, providing them with data such as name, gender and address so that the sponsors could send them tennis-related offers. The association appealed the sanction, arguing that it had a legitimate interest in selling its members’ data.

For further information: EDPB Website | Website IAPP


Poland

03/05/2020 – Polish Supervisory Authority | Fine | Biometric Data | Children

The Polish Supervisory Authority imposed an administrative fine of PLN 20,000 (less than €5,000) on a school for processing biometric data in a school canteen.

It was established that the school used a biometric tool at the entrance of the school canteen to identify the children in order to verify payment for their meals. The authority pointed out that, in the context of this processing operation, the school processed special categories of personal data of 680 children without a legal basis, when it had the possibility of using alternative forms of identification (e.g., electronic cards, names, contract numbers). In its decision, the authority ordered the deletion of the personal data processed in the form of digital information relating to the children's fingerprints and the cessation of any further data collection.

For further information: EDPB Website | Polish Supervisory Authority Website


Spain

03/02/2020 – Spanish Supervisory Authority | Fines | Consent | Security Measures

On 27 and 28 February 2020, the Spanish Supervisory Authority imposed fines totaling €168,000 on two Vodafone subsidiaries for violations of the GDPR.

The subsidiary Vodafone España was fined for violating the provisions on consent of Articles 5 (1) and 6 (1) of the GDPR as it was unable to provide proof of its customers’ consent to the processing of their personal data.

The second subsidiary, Vodafone ONO, was fined for failing to comply with the provisions of Article 32 of the GDPR relating to the implementation of appropriate technical and organizational measures to ensure data security.

For further information: Decision of the Spanish Supervisory Authority | Decision of the Spanish Supervisory Authority


Sweden

03/11/2020 – Swedish Supervisory Authority | Fine | Google | Right to request delisting

On March 11, 2020, the Swedish Supervisory Authority imposed a sanction of approximately €7 million on Google LLC for failure to comply with its obligations relating to the right to request delisting.

In 2017, the authority conducted an audit on how Google handles the individuals’ right to request delisting on its search engine.

In its decision, the authority ordered Google to delete a number of search results. In 2018, the authority conducted a follow-up audit to verify that Google had complied with its first decision. On this occasion, the authority notably found that Google had not correctly deleted two of the search result lists that the authority had ordered to delete in 2017.

On the one hand, Google interpreted too narrowly which web addresses had to be removed from the list of search results. On the other hand, Google did not delete the search results without undue delay. The authority noted that when Google removes a link from its search results, it informs the website to which the link is directed. This then allows the website to republish the page in question at another address, which will in turn be displayed in a Google search - in other words, depriving the right to delisting of any practical effect.

The authority noted that Google does not have a legal basis for informing website owners when search result lists are removed and that, furthermore, the company gives misleading information to individuals about the effectiveness of their requests. Therefore, the authority ordered Google to cease this practice. Google may appeal against this decision within 3 weeks. If Google decides not to appeal, this decision will take effect at the end of that period.

For further information: EDPB Website


United Kingdom

03/04/2020 – UK Supervisory Authority | Fine | Personal Data Breach

The UK Information Commissioner’s Office (ICO) has fined Cathay Pacific Airways Limited £500,000 for failing to protect the security of its customers’ personal data.

Between October 2014 and May 2018, the company’s computer systems lacked appropriate security measures, which led to the exposure of customers’ personal details: 111,578 of the affected customers were from the UK, and approximately 9.4 million more were located worldwide. Due to the timing of these incidents, the ICO investigated this case under the Data Protection Act 1998.

Various errors were found during the ICO’s investigation (e.g., back-up files that were not password protected; insufficient anti-virus protection), which constituted a breach of Principle 7 of the Data Protection Act 1998.

For further information: Website IAPP |ICO Website


03/02/2020 – UK Supervisory Authority | Fine | Automated Nuisance Calls

The UK Information Commissioner’s Office (ICO) has fined CRDNN Limited £500,000 for making more than 193 million automated nuisance calls.

Following an examination of computer equipment and documents in March 2018, the ICO’s investigation revealed that CRDNN Limited was making nearly 1.6 million calls per day about window scrappage, debt management and window sales between 1 June and 1 October 2018. The calls were all made from fraudulent numbers, which meant that people who received the calls could not identify who was making them. The ICO considered that the company broke the law by not obtaining consent from the phone owners to make those calls and by not providing a valid opt-out.

For further information: ICO Website


Other News

03/31/2020 – Marriott | Personal Data Breach Notification

Marriott International announced that it had experienced another personal data breach.

The hotel chain stated that it discovered that guest information had been accessed using the login credentials of two employees. The incident was identified in February but was reported to have started in mid-January 2020. The compromised data included guest contact details, loyalty account information and personal details such as date of birth and gender. Marriott International has set up an online portal for guests to determine whether their personal data were involved in the incident.

For further information: Website IAPP | Website Marriott International


03/26/2020 – EDPS | European Commission | Location Data

According to an article published by Reuters, various telecom operators have agreed to share their data with the European Commission in the context of the fight against the virus.

The European Data Protection Supervisor (EDPS) has published on its website a letter addressed to the European Commission in this respect. In its letter, the EDPS clarified that, to the extent the data are effectively anonymized, the rules of the GDPR would not apply to the data shared by telecom operators. Having said that, the EDPS indicated that such effective anonymization requires more than simply removing obvious identifiers (such as phone numbers) and that it is necessary to ensure that indirect identification is not possible.

In addition, the EDPS stresses the importance of respecting the principle of transparency, ensuring the deletion of the data in the aftermath of the crisis, and ensuring a high level of data security, in particular by ensuring that these levels of security will be respected by the third parties on which the Commission will rely to process the information.

For further information: Website Reuters | Website EDPS


03/22/2020 – Cyber-attack | AP-HP (French hospitals)

On March 22, 2020, the Assistance Publique - Hôpitaux de Paris (AP-HP) was reportedly the target of a cyber-attack (a denial-of-service, or DDoS, attack).

For further information: Website Nextinpact


03/15/2020 – ENISA | Cybersecurity | Teleworking

In a publication dated March 15, 2020, the Director of the European Network and Information Security Agency (ENISA) shared his top tips for teleworking in times of COVID-19.

It is notably recommended to work with a secure Wi-Fi connection, to keep anti-virus and other security software up to date, and to make periodic backups. With respect to the actions that employers can take, it is recommended that they provide regular feedback to their employees on the procedure to follow in case of problems.

For further information: Website ENISA


03/10/2020 – Criteo | Privacy International | French Supervisory Authority Investigation

On March 10, 2020, various media outlets reported that Criteo is subject to an investigation by the French Supervisory Authority (the CNIL) following a complaint filed by the association Privacy International.

Criteo sent a press release to the media outlet TechCrunch confirming that, in January 2020, the CNIL opened an investigation in response to a complaint filed by Privacy International in November 2018. As the CNIL is the competent supervisory authority for Criteo, the company states that this procedure is normal and that it had already disclosed the investigation in its annual review. In particular, the company states in the press release that it will cooperate with the CNIL in its investigation and that it remains confident in its privacy practices.

For further information: Website Techcrunch


This newsletter has been prepared by the Technology & Innovation team of the Paris office. For further information, you may contact us by email:

© 2020 Gibson, Dunn & Crutcher LLP Attorney Advertising:  The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.

April 1, 2020 |
The Cybersecurity and Infrastructure Security Agency of the Department of Homeland Security Updates Essential Critical Infrastructure Workforce Guidance

Click for PDF On Saturday, March 28, 2020, the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (“CISA”) revised its list of “Essential Critical Infrastructure Workers,” which provides expressly non-binding guidance to state and local authorities on identifying their essential workforce during the COVID-19 pandemic.  As explained by CISA Director, Christopher C. Krebs, in his memorandum accompanying the agency’s initial March 19 guidance, the list of essential workers was intended to “inform critical infrastructure community decision-making to determine the sectors, sub-sectors, segments, or critical functions that should continue normal operations[.]”  CISA’s revised guidance further emphasizes its advisory nature, modifying references to the document as “guidance” in the initial March 19 list to “advisory guidance” in the new version throughout.  It is therefore critical that businesses do not rely solely on the guidance in making any determinations about continuing operations, and that they first consult any orders and guidance issued by the states and localities in which they operate. The original March 19 list identified essential workers in 14 industry sectors: (1) Chemical; (2) Communications and Information Technology; (3) Critical Manufacturing; (4) Public Works; (5) Defense Industrial Base; (6) Law Enforcement, Public Safety, First Responders; (7) Energy; (8) Financial Services; (9) Food and Agriculture; (10) Other Community-Based Government Operations and Essential Functions; (11) Healthcare/Public Health; (12) Hazardous Materials; (13) Transportation and Logistics; and (14) Water and Wastewater. Since CISA released the initial guidance, state and local governments have relied upon it to varying degrees in implementing shelter-in-place and business closure directives.  
Many states, including Hawaii, Indiana, Minnesota, North Carolina, and California, have expressly incorporated CISA’s guidance into their orders, often defining the types of workers and businesses that may continue physical operations based at least in part on the CISA list.  Other states, such as Washington, have not expressly incorporated CISA’s list into any definitions in their orders, but have mimicked language from the CISA list, or, in the case of Pennsylvania, represented on the state’s website that its governing business closure orders conform with CISA’s guidance.  On the other hand, states such as Virginia and New Jersey do not appear to have made any explicit references to the CISA guidance in implementing their stay-at-home orders. The revised guidance adds three new sectors to the list: (1) “Commercial Facilities,” encompassing workers supporting the supply chain of various commercial appliances relating to plumbing, ventilation, and refrigeration, among other things; (2) “Residential/Shelter Facilities and Services,” encompassing workers responsible for leasing residential properties, handling property management, and providing animal shelter and elderly care services, among others; and (3) “Hygiene Products and Services,” encompassing workers who provide laundry and dry cleaning services, produce hygiene products, and install, maintain, and manufacture water heating equipment, among other things.  The revised list also expands the types of workers enumerated in the original 14 sectors.
For example, the “Public Works” sector has been revised to the “Public Works and Infrastructure Support Services” sector, newly encompassing HVAC technicians, landscapers, and “any temporary construction required to support COVID-19 response.”  And the “Energy” sector has been broadened to specify workers supporting the energy sector through renewable energy infrastructure and nuclear re-fueling operations, as well as workers involved in manufacturing and distributing equipment necessary for production at energy sector facilities.  Accordingly, we anticipate that states relying more heavily on CISA’s guidance may update their own orders and guidance to reflect the broader scope of the March 28 list of “Essential Critical Infrastructure Workers.” Businesses in states that rely more heavily on CISA’s guidance should consult the revised list and any subsequent adjustments to their state and local orders in making assessments about their physical operations, and in applying for any waivers from such orders to the extent applicable. Prior client alerts focusing on New York State’s executive orders regarding in-person workforce restrictions and guidance on essential businesses exempt from those orders may be accessed here, here, and here. Gibson Dunn is continuing to monitor developments relating to the restriction of non-essential business activity in various states.  Additional developments can be expected to follow in the coming days and weeks.


Gibson Dunn lawyers regularly counsel clients on the issues raised by this pandemic, and we are working with many of our clients on their response to COVID-19. For additional information, please contact any member of the firm’s Coronavirus (COVID-19) Response Team. Please also feel free to contact the Gibson Dunn lawyer with whom you usually work, any member of the firm’s Public Policy Group, or the authors: Mylan L. Denerstein – Co-Chair, Public Policy Practice, New York (+1 212-351-3850, mdenerstein@gibsondunn.com) Lauren J. Elliot – New York (+1 212-351-3848, lelliot@gibsondunn.com) Lee R. Crain – New York (+1 212-351-2454, lcrain@gibsondunn.com) Stella Cernak – New York (+1 212-351-3898, scernak@gibsondunn.com) Doran Satanove – New York (+1 212-351-4098, dsatanove@gibsondunn.com) © 2020 Gibson, Dunn & Crutcher LLP Attorney Advertising:  The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.

March 20, 2020 |
Privacy and Cybersecurity Issues Related to COVID-19

Click for PDF Whatever industry you are in, you are undoubtedly concerned about the threat of the novel coronavirus (COVID-19).[1] Below, we summarize key privacy and cybersecurity implications of collecting and sharing personal information from employees, site visitors, and other individuals to manage COVID-19 risk, as well as cybersecurity risks of these and other management and mitigation efforts. Despite the need to take swift action in this rapidly evolving environment, we recommend that companies consider how to do so in a manner that minimizes privacy- and cybersecurity-related legal risks. Various regulatory agencies have issued guidance in the last several days indicating that privacy laws that limit the collection and disclosure of personal information remain in effect. And the implementation of work-from-home and other arrangements has increased exposure to various cybersecurity risks—risks that hackers have moved swiftly to exploit.

The United States

Though there is no federal data protection law in the United States, the Centers for Disease Control and Prevention (“CDC”) and the US Equal Employment Opportunity Commission (“EEOC”) have advised employers to keep certain personal health data confidential, and most companies have made commitments to their employees, customers, and/or users about keeping their personal health data confidential. In addition, some state laws, such as the California Consumer Privacy Act (“CCPA”), impose transparency requirements on covered businesses, and may result in additional liabilities in light of data breaches. Generally, when implementing COVID-19 risk mitigation measures in the United States, companies may wish to consider the following privacy and cybersecurity-focused steps:
  • Provide notice before collecting and limit use of personal information. If you decide to collect additional personal information at this time, particularly sensitive personal data such as health or medical information, whether through the use of surveillance technologies such as thermal cameras or otherwise, consider notifying employees, visitors, or any other individuals prior to such collection. For example, the CCPA requires that covered businesses provide notice to California residents (including employees) regarding the categories of information collected and the uses of that information at or before the time of collection; even where companies and employers have implemented CCPA-compliant privacy policies, the collection and use of personal information for the COVID-19 pandemic may be sufficiently novel that additional notice is required.[2] Companies should, however, be cautious about the language used to notify individuals of these data collection practices, taking particular care not to concede that the company is processing the data unless that is accurate (for example, information reviewed for real-time monitoring may be treated differently by regulatory authorities than health information the company chooses to store for potential further use). For companies collecting additional information, including sensitive health or medical data, consider limiting the company’s use and retention of such data to the monitoring of health and public safety conditions at work, and de-identifying the data to the extent possible. Before considering any further uses or retention of such data, consider contacting outside counsel to properly weigh the potential privacy risks.
  •  Implement reasonable security protocols and issue cybersecurity reminders to employees. Consider implementing reasonable security protocols and data minimization efforts appropriate to the sensitivity of the personal information collected and stored, such as encryption, data separation, and data access controls. For instance, the Genetic Information Nondiscrimination Act (“GINA”)[3] requires that information obtained pursuant to medical examinations of employees be collected and maintained on separate systems and treated confidentially.[4] And the CCPA threatens a private right of action and/or enforcement if a business suffers a data breach due to its “violation of the duty to implement and maintain reasonable security procedures and practices appropriate to the nature of the information.”[5] Consider also limiting the collection, use, and retention of such data to what is absolutely necessary. Additionally, cybersecurity threat actors are particularly likely to exploit vulnerabilities in companies’ IT systems to gain access to sensitive personal information during these times of turmoil, particularly as companies struggle to implement unprecedented work-from-home policies and information becomes less centralized.
Cybersecurity firms have reported an increase in malware attacks in which threat actors use the widespread panic related to COVID-19 to trick victims into running malware.[6] Hackers are actively targeting companies that have launched work-from-home policies in response to the COVID-19 outbreak by exploiting outdated virtual private networks, a lack of multi-factor authentication, and insecure at-home servers.[7] In light of this potential increase in the exploitation of cybersecurity vulnerabilities, consider re-evaluating your cybersecurity posture and issuing reminders to employees, as part of the company’s coordinated COVID-19 response efforts, to be on the lookout for phishing attempts, to use secure connections, and to remember that the company will not ask for passwords or personal information relating to their online accounts. Indeed, in discussing the CCPA during the COVID-19 crisis, an advisor to the California Attorney General has been quoted as “encourag[ing] businesses to be particularly mindful of data security in this time of emergency.”[8]
  •  Seek counsel before implementing any preventative or reactive data collecting or sharing measures. Details and circumstances related to COVID-19 are changing constantly and the impulse to collect and share data to help stop its spread can be strong. Privacy rules and requirements, however, must also be kept in mind and considered, particularly in light of potential additional liabilities under new laws, such as the CCPA.
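The data separation and de-identification measures described above can be sketched in code. The following is a minimal, hypothetical illustration (the field names, the keyed-hash pseudonymization scheme, and the secret value are assumptions of this sketch, not requirements drawn from GINA or the CCPA): identifying details are replaced with a keyed pseudonym so screening results can be stored apart from the identity roster, and only a minimal pass/fail flag is retained.

```python
import hashlib
import hmac

# Hypothetical secret kept in a secure store; destroying or rotating it
# renders previously stored pseudonymized records effectively unlinkable.
PSEUDONYM_KEY = b"replace-with-a-securely-stored-secret"

def pseudonymize(employee_id: str) -> str:
    """Derive a stable pseudonym so screening data can be stored
    apart from the identity roster (data separation)."""
    return hmac.new(PSEUDONYM_KEY, employee_id.encode(), hashlib.sha256).hexdigest()

def record_screening(employee_id: str, temperature_ok: bool) -> dict:
    """Store only what is necessary: a pseudonym and a pass/fail flag,
    with no free-text health details (data minimization)."""
    return {
        "subject": pseudonymize(employee_id),
        "temperature_ok": temperature_ok,
    }

record = record_screening("E12345", temperature_ok=True)
assert "E12345" not in str(record)  # the raw identifier is never stored
```

A design like this also eases retention limits: deleting the key table (or the key itself) removes the link between screening records and individuals without touching the aggregate data.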
In the event that you learn that an employee or visitor to your facilities tests positive for COVID-19, consider taking the following steps to lessen the company’s exposure to privacy-related liability:
  • Consider how you learn about an employee’s exposure. Companies should consider having prepared responses ready for communications with affected employees. Whether the company learns directly from the affected employee or indirectly, the prepared response may notify the employee that their exposure to COVID-19 will be communicated to other employees, but that their identity will not be revealed. If the company learns directly from the employee, the company’s response may also convey appreciation for the employee’s willingness to come forward.
  • Inform potentially exposed employees without disclosing the identity of affected individuals. Consistent with the advice above, the CDC has advised employers to inform fellow employees of their possible exposure to COVID-19, but warned against disclosing the identity of the individual who tested positive.[9] Companies may also wish to advise potentially impacted customers, vendors, and visitors of their exposure, while maintaining the confidentiality of the affected individual.
  • Maintain confidentiality and promises made in Employee Handbooks and Terms of Use. A number of federal laws, including the Americans with Disabilities Act (“ADA”)[10] and GINA,[11] as well as regulations promulgated pursuant to the Family and Medical Leave Act of 1993 (“FMLA”),[12] impose confidentiality requirements related to the medical and health data of employees. In general, the EEOC has advised that “[e]mployers must maintain all information about employee illness as a confidential medical record in compliance with the ADA.”[13] Furthermore, employers should avoid involuntary disclosure of confidential information to employees’ supervisors, although employers may share information about the specific accommodations needed by employees.[14]
Companies should also check their Employee Handbooks and Terms of Use to ensure that any promises made in those documents are carried out—or updated with clear notice to employees—before implementing data collection and sharing practices. These promises should be acknowledged in any communications with employees, visitors, vendors, customers, or others, where applicable.

Europe

On March 19, 2020, the European Data Protection Board (“EDPB”) adopted a statement on the processing of personal data in the context of COVID-19.[15] The statement emphasized that while data protection rules, including the European Union’s General Data Protection Regulation (“GDPR”), should not “hinder measures taken in the fight” against COVID-19, data controllers and processors must ensure, “even in these exceptional times,” the protection of individuals’ personal data. Specifically, the EDPB explained that any measure taken in this context should comply with general principles of law, adding that “emergency is a legal condition which may legitimize restrictions to freedom provided these restrictions are proportionate and limited to the emergency period.” Among the core data privacy principles to be observed, the EDPB highlighted that individuals should receive transparent information on processing activities, including the related purposes for processing and retention periods. Companies must adopt adequate security measures and confidentiality policies, and must document the measures implemented and the underlying decision-making processes used to manage the current emergency. With respect to legal bases for processing personal data, the EDPB explained that the GDPR provides legal grounds to enable employers and competent public health authorities to process data in the context of an epidemic, in accordance with national law and within the conditions set therein.
In the employment context, the processing may be necessary “for compliance with a [national] legal obligation to which the employer is subject (such as obligations relating to health and safety at the workplace) or in the public interest, such as the control of diseases and other threats to health.”[16] The EDPB also emphasized that the exceptions to the prohibition of processing of health data[17] may be available to companies “where it is necessary for reasons of substantial public interest in the area of public health”[18] or “where there is a need to protect the vital interests of the individual.”[19] However, though the EDPB provided answers to some questions about the processing of data in the employment context, it failed to offer any concrete recommendations and limited its answers primarily to restating the general data protection rules (such as proportionality and data minimization principles) and relevant national laws.

Member State Data Protection Authorities (“DPAs”) have also issued their own guidance in recent weeks with respect to the processing of personal data in this context.[20] These authorities have emphasized the general principles of lawfulness, necessity, transparency, and proportionality of the processing, as well as the principle of data minimization, set forth under the GDPR, and some have encouraged data controllers to refer to instructions and preventative measures issued by public health authorities for guidance. However, these DPAs have generally failed to adopt a unified approach. This legal context makes it challenging for companies to ensure compliance with applicable data privacy laws throughout Europe, let alone maintain consistency with a global approach, including the United States. Companies should consider carefully, in consultation with their legal department and outside counsel, the privacy implications in each European country of engaging in data collection and sharing in the context of the COVID-19 pandemic.
The following table summarizes the developments across Europe of several key DPAs with respect to the collection and processing of personal information in the context of the COVID-19 outbreak, with further detail following.
The table is organized by Data Protection Authority; for each authority, the bullets summarize the processing legal basis and exceptions, any emergency data collection measures, and the application of data privacy principles and protections.
Belgium
  • Broad application of Art. 6(1)(d) not justified for prevention measures.
  • Companies cannot rely on Art. 9(2)(i) except upon express mandatory instructions from health authorities.
  • Health risk assessment may only be carried out by the workplace doctor based on Art. 6(1)(c) and 9(2)(b).
  • GDPR Principles are still applicable.
France
  • Not addressed by DPA.
  • Companies cannot implement mandatory and systematic body temperature measurement.
  • Employers may invite employees to report their potential exposure to COVID-19.
  • In the event of such employee’s reporting, employers can record the identity of affected individuals and resulting remedial measures taken, and can report elements related to the nature of the exposure to health authorities, on request.
  • GDPR Principles are still applicable.
Germany
  • Art. 9(2)(b) and 6(1) constitute relevant legal bases and exceptions to process health and other personal data.
  • Employers can ask employees/visitors for appropriate health and other personal information for the purpose of reducing the spread of COVID-19.
  • GDPR Principles are still applicable.
Spain
  • Art. 9(2)(b) might constitute relevant legal basis and exception to process health and other personal data.
  • Under Spanish labor and risk prevention laws, employers have a duty to protect employees from and prevent work risks, in consultation with Works Council.
  • GDPR Principles are still applicable.
United Kingdom
  • Not addressed by DPA.
  • DPA considers asking people if they have visited a particular country or are experiencing COVID-19 symptoms to be reasonable.
  • On the collection of employee and visitor health data by companies, DPA stressed the general data protection principles.
  • DPA will not penalize companies that might not meet usual privacy standards or deadlines to respond to data subject requests to the extent they need to prioritize other areas or adapt their usual approach during this period.
  • GDPR Principles are still applicable.
  • Statutory timescales are not to be extended, but DPA will inform data subjects that they may experience delays when making information rights requests during the pandemic.
As the table reflects, the approach taken by European DPAs has varied significantly by jurisdiction:
  • Belgium
On March 13, 2020, the Belgian DPA stated that companies should not adopt a broad application of the legal basis that allows for processing necessary to safeguard the vital interests of individuals under Article 6(1)(d) of the GDPR[21] when implementing preventive measures.[22] The Belgian authority also explicitly noted that companies cannot rely on Article 9(2)(i), which allows for the processing of personal data when it is “necessary for reasons of public interest in the area of public health,” unless they are required to do so pursuant to explicit instructions from the Belgian health authorities. Rather, companies should rely on workplace doctors to inform employers and persons who have been in contact with the affected employee, in accordance with Articles 6(1)(c) and 9(2)(b) of the GDPR.
  • France
On March 6, 2020, the French DPA published guidance on the collection of data, and in particular employee data, in the context of COVID-19.[23] While the French authority provided that companies should refrain from implementing a mandatory body temperature measurement for employees/agents/visitors (similar to the position taken by the Belgian, Hungarian, and Luxembourgian DPAs), it indicated that employers may invite their employees to report their potential exposure to them or to the competent health authorities. In the event of such reporting, the employer is then entitled to record the date, identity of the allegedly affected individual, and the remedial measures taken (e.g., containment, remote working, contact with occupational health care resources), and to communicate information related to the nature of the exposure to health authorities, on request.
  • Germany
On March 13, 2020, and March 17, 2020, the German Conference of Federal and State Data Protection Authorities (“DSK”) and several state-level DPAs published COVID-19 guidance on the collection and processing of health data, respectively.[24] The guidance confirms that Articles 9(2)(b) and 6(1) may provide the appropriate legal bases for the processing of relevant health data and other personal data in the context of the COVID-19 pandemic. Though this guidance suggests that a company may ask both employees and visitors for health-related information relevant to reduce risks to other employees and the public, it also emphasized that companies must process this information in accordance with general GDPR principles. Permissible questions would likely include asking whether individuals have tested positive for COVID-19, have been in contact with someone who has, or have recently visited an area classified as a risk area by the German Center for Disease Control, the Robert Koch Institute. Companies will likely also be permitted to collect and process information about employees who test positive for the virus or who have been exposed to affected individuals for the purposes of informing co-workers on an anonymous basis. While doctors and other medical personnel are required by law to report COVID-19 cases, this does not seem to apply to employers. However, the guidance is ambiguous with respect to other data collection and processing practices, such as temperature testing.
  • Spain
On March 12, 2020, the Spanish DPA published a report regarding the processing of personal data in the context of the COVID-19 outbreak. Although the report is mainly applicable to the public administration, the authority stated that, in the context of employer-employee relationships, Articles 6(1)(c) and 9(2)(b) of the GDPR may constitute relevant legal bases and exceptions to process health and other personal data. Under Spanish labor and risk prevention laws, employers, in consultation with workers through Works Councils, have a duty to protect employees from and to prevent work risks, and to regularly monitor the health conditions of employees with respect to risks inherent to work, all the while respecting the right to privacy of employees and the confidentiality of the data.[25]
  • United Kingdom
On March 12, 2020, the United Kingdom DPA issued guidance that stated that it would not penalize companies that the DPA “knows need to prioritise other areas or adapt their usual approach during this extraordinary period.”[26] It described itself as a “reasonable and pragmatic regulator, one that does not operate in isolation from matters of serious public concern.”[27] The DPA stated: “Regarding compliance with data protection, we will take into account the compelling public interest in the current health emergency.”[28] Conversely, in its discussion of the collection of health data of employees and visitors, the UK authority only emphasized the GDPR principles. The UK DPA did note, however, that it was permissible to inform staff if a colleague contracted COVID-19, noting that the affected individual should not be named and no more information than is necessary should be shared. The UK authority also noted that it would be reasonable to ask employees if they had visited a particular country, or are experiencing COVID-19 symptoms.[29] We will continue to monitor privacy and cybersecurity developments related to COVID-19 in the United States, Europe, and around the world, and will provide further communications as developments warrant. _____________________    [1]   The lawyers on Gibson Dunn's cross-functional COVID-19 Response Team—who are linked with subject-matter experts throughout the firm—are available to assist with any questions you may have regarding developments related to the COVID-19 outbreak. See https://www.gibsondunn.com/coronavirus-covid-19-resource-center/.    
[2]   Though California Attorney General Xavier Becerra cannot bring enforcement actions until July 1, 2020, the California Chamber of Commerce, the Internet Coalition, the Association of National Advertisers, and approximately 30 other companies across a range of industries sent the California Attorney General a letter calling for this impending deadline to be delayed until January 2, 2021 in order to allow companies more time to respond to the unique challenges posed by the COVID-19 outbreak. See Allison Grande, COVID-19 Warrants CCPA Enforcement Delay, Calif. AG Told, Law360 (March 19, 2020), available at https://www.law360.com/articles/1255181/covid-19-warrants-ccpa-enforcement-delay-calif-ag-told.    [3]   Pub. L. No. 110-233, 122 Stat. 881 (codified as amended in scattered sections of 29 & 42 U.S.C.).    [4]   42 U.S.C. § 12112(d)(3).    [5]   Cal. Civ. Code § 1798.150.    [6]   Zack Whittaker, Hackers are jumping on the COVID-19 pandemic to spread malware, TechCrunch (March 12, 2020), available at https://techcrunch.com/2020/03/12/hackers-coronavirus-malware/.    [7]   Anthony Schoettle, Hackers pounce as coronavirus spread triggers work-at-home movement, IBJ (March 13, 2020), available at https://www.ibj.com/articles/hackers-pounce-as-coronavirus-spread-triggers-work-at-home-movement.    [8]   See Allison Grande, COVID-19 Warrants CCPA Enforcement Delay, Calif. AG Told, Law360 (March 19, 2020), available at https://www.law360.com/articles/1255181/covid-19-warrants-ccpa-enforcement-delay-calif-ag-told.    [9]   Centers for Disease Control and Prevention, Coronavirus Disease 2019 (COVID-19), Interim Guidance for Businesses and Employers, available at https://www.cdc.gov/coronavirus/2019-ncov/community/guidance-business-response.html. [10]   Pub. L. No. 101-336 (relevant provisions codified at 42 U.S.C. § 12112(d)(3)(B), (d)(4)(C)). [11]   Pub. L. No. 110-233, § 206(a), 122 Stat. 881 (codified as amended in scattered sections of 29 & 42 U.S.C.). [12]   Pub. L. No. 103-3, 107 Stat. 6; see 29 C.F.R. § 825.500(g). [13]   The US Equal Employment Opportunity Commission, Pandemic Preparedness in the Workplace and the Americans with Disabilities Act, available at https://www.eeoc.gov/facts/pandemic_flu.html. [14]   29 C.F.R. § 1630.14(c)(1)(i). [15]   The European Data Protection Board, Statement of the EDPB Chair on the processing of personal data in the context of the COVID-19 outbreak (March 19, 2020), available at https://edpb.europa.eu/our-work-tools/our-documents/other/statement-processing-personal-data-context-covid-19-outbreak_en. [16]   Id. [17]   Under the GDPR, data regarding an individual’s health, even body temperature, may be considered a “special category” of personal data. In principle, the processing of such personal data is prohibited unless one of the exceptions listed in Article 9(2) of the GDPR applies. Processing of health data should generally comply with the specific rules set forth under the GDPR, but also with the additional requirements of each Member State, where applicable (Article 9(4), GDPR). [18]   Article 9(2)(i), GDPR. [19]   Article 9(2)(c), GDPR and recital 46 of the GDPR, explicitly referring to the control of an epidemic. [20]   Including the following countries: Austria, Belgium, Bulgaria, Czech Republic, Denmark, Finland, France, Germany, Hungary, Iceland, Ireland, Italy, Lithuania, Luxembourg, Norway, Poland, Slovakia, Slovenia, Spain, Sweden, Switzerland, and the United Kingdom. [21]   Article 6(1)(d), GDPR. [22]   Belgian data protection authority (APD), “COVID-19 et traitement de données à caractère personnel sur le lieu de travail” (March 13, 2020), available at https://www.autoriteprotectiondonnees.be/covid-19-et-traitement-de-donn%C3%A9es-%C3%A0-caract%C3%A8re-personnel-sur-le-lieu-de-travail.
[23]   French data protection authority (CNIL), “Coronavirus (Covid-19) : les rappels de la CNIL sur la collecte de données personnelles” (March 6, 2020), available at https://www.cnil.fr/fr/coronavirus-covid-19-les-rappels-de-la-cnil-sur-la-collecte-de-donnees-personnelles. [24]   Datenschutzkonferenz (DSK), “Datenschutzrechtliche Informationen zur Verarbeitung von personenbezogenen Daten durch Arbeitgeber und Dienstherren im Zusammenhang mit der Corona-Pandemie” (March 13, 2020), available at https://www.bfdi.bund.de/DE/Datenschutz/Themen/Gesundheit_Soziales/GesundheitSozialesArtikel/Datenschutz-in-Corona-Pandemie.html?nn=5217154; cf. also Landesbeauftragter für den Datenschutz und die Informationsfreiheit Baden-Württemberg, “Hinweise zum datenschutzgerechten Umgang mit Corona-Fällen” (March 13, 2020), available at https://www.baden-wuerttemberg.datenschutz.de/wp-content/uploads/2020/03/FAQ-Corona.pdf and Landesbeauftragter für den Datenschutz und die Informationsfreiheit Rheinland-Pfalz, “Beschäftigtendatenschutz in Zeiten des Corona-Virus” (March 17, 2020), available at https://www.datenschutz.rlp.de/de/themenfelder-themen/beschaeftigtendatenschutz-corona/. [25]   Spanish data protection authority (AEPD), press release and report (March 12, 2020), available at https://www.aepd.es/es/documento/2020-0017.pdf. [26]   United Kingdom data protection authority (ICO), “Data protection and coronavirus” (March 12, 2020), available at https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2020/03/data-protection-and-coronavirus/ and https://ico.org.uk/for-organisations/data-protection-and-coronavirus/. [27]   Id. [28]   Id. [29]   Id.
Gibson Dunn's lawyers are available to assist with any questions you may have regarding developments related to the COVID-19 outbreak. For additional information, please contact any member of the firm's Coronavirus (COVID-19) Response Team. The following Gibson Dunn lawyers prepared this client update: In the US: Alexander H. Southwell, Ryan T. Bergsieker, Cassandra L. Gaedt-Sheckter, Daniel E. Rauch, and Lisa V. Zivkovic; in the EU: Ahmed Baladi, Patrick Doris, Michael Walther, Vera Lukic, Alejandro Guerrero, Kai Gesing, Selina Grun, and Clemence Pugnet. Gibson Dunn lawyers regularly counsel clients on the privacy and cybersecurity issues raised by this pandemic, and we are working with many of our clients on their response to COVID-19. Please also feel free to contact the Gibson Dunn lawyer with whom you usually work, the authors, or any member of the Privacy, Cybersecurity and Consumer Protection Group: United States Alexander H. Southwell - Co-Chair, PCCP Practice, New York (+1 212-351-3981, asouthwell@gibsondunn.com) Debra Wong Yang - Los Angeles (+1 213-229-7472, dwongyang@gibsondunn.com) Matthew Benjamin - New York (+1 212-351-4079, mbenjamin@gibsondunn.com) Ryan T. Bergsieker - Denver (+1 303-298-5774, rbergsieker@gibsondunn.com) Howard S. Hogan - Washington, D.C. (+1 202-887-3640, hhogan@gibsondunn.com) Joshua A. Jessen - Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com) Kristin A. Linsley - San Francisco (+1 415-393-8395, ) H. Mark Lyon - Palo Alto (+1 650-849-5307, mlyon@gibsondunn.com) Karl G. Nelson - Dallas (+1 214-698-3203, knelson@gibsondunn.com) Deborah L. Stein (+1 213-229-7164, dstein@gibsondunn.com) Eric D. Vandevelde - Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com) Benjamin B. Wagner - Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com) Michael Li-Ming Wong - San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, mwong@gibsondunn.com)

Europe Ahmed Baladi - Co-Chair, PCCP Practice, Paris (+33 (0)1 56 43 13 00, abaladi@gibsondunn.com) James A. Cox - London (+44 (0)20 7071 4250, jacox@gibsondunn.com) Patrick Doris - London (+44 (0)20 7071 4276, pdoris@gibsondunn.com) Bernard Grinspan - Paris (+33 (0)1 56 43 13 00, bgrinspan@gibsondunn.com) Penny Madden - London (+44 (0)20 7071 4226, pmadden@gibsondunn.com) Michael Walther - Munich (+49 89 189 33-180, mwalther@gibsondunn.com) Kai Gesing - Munich (+49 89 189 33-180, kgesing@gibsondunn.com) Alejandro Guerrero - Brussels (+32 2 554 7218, aguerrero@gibsondunn.com) Vera Lukic - Paris (+33 (0)1 56 43 13 00, vlukic@gibsondunn.com) Sarah Wazen - London (+44 (0)20 7071 4203, swazen@gibsondunn.com)

Asia Kelly Austin - Hong Kong (+852 2214 3788, kaustin@gibsondunn.com) Jai S. Pathak - Singapore (+65 6507 3683, jpathak@gibsondunn.com)


March 17, 2020 |
California Consumer Privacy Act Update: Attorney General Proposes Further Revisions to CCPA Regulations

Click for PDF While we recognize the COVID-19 coronavirus and its impact are top of mind for all of us,[1] we also want to keep you informed of time-sensitive developments that, as of this writing, are still moving forward: On March 11, 2020, California Attorney General Xavier Becerra released another set of revisions to the proposed regulations implementing the California Consumer Privacy Act of 2018 (“CCPA”).[2] As Gibson Dunn noted last month, Attorney General Becerra previously released an initial set of proposed CCPA regulations on October 10, 2019; a first revised set of proposed regulations on February 7, 2020; and an additional amendment on February 10, 2020.[3] This latest set of changes was promulgated in response to comments received on the February modifications. Under California’s regulatory process, the public must have at least 15 days to comment on these changes, meaning, in this case, comments must be submitted by March 27, 2020.[4] After that point, if no other changes are made to the regulations, the Attorney General’s office will prepare a summary and response for each comment submitted. California’s Office of Administrative Law will then have 30 working days to approve the regulations, at which point, they would be finalized. Note, however, that the Attorney General is empowered to enforce the CCPA as of July 1, 2020, whether or not final regulations are in place before then. Below, we briefly summarize the most impactful of the March changes.

Deletion of guidance on definition of “personal information”

Perhaps the most significant change in the March revisions is the removal of February’s guidance for interpreting the definition of “personal information” under the CCPA.[5] Last month, Attorney General Becerra proposed adding guidance that whether data constituted “personal information” depended on the manner in which a business maintained that data. Specifically, data such as IP addresses would constitute “personal information” only if it reasonably could be linked to an identifiable consumer or household. The March revisions, however, delete this guidance, raising concerns about the breadth of what may constitute “personal information” for CCPA purposes.

Change in the definition of “financial incentive”

Under the February revisions, language was added further confirming that offering a “financial incentive” to consumers related to the value of that consumer’s data does not run afoul of California’s statutory ban on discriminatory pricing.[6] The March revisions, in turn, redefine such a “financial incentive” as a benefit “related to the collection, retention, or sale” of personal information, as opposed to “compensation for the disclosure, deletion, or sale” of personal information (the February definition).[7] This change is cross-referenced throughout the March revisions.[8] Notably, however, while the CCPA’s statutory text refers to “compensation” for the collection, sale, or deletion of personal information,[9] the regulations, as noted, refer to the potentially broader concept of a benefit “related to” such activities, and no longer mention “deletion” or “disclosure,” creating a potential ambiguity.

Removal of the optional “opt-out” button

The March revisions have also removed draft provisions suggesting an “opt-out” button go alongside the “Do Not Sell My Personal Information” link on businesses’ websites.[10] This does not change February’s operative modification that allowed companies to obtain user consent to sell data that the business collected from that individual during a time in which it did not provide a notice of the right to opt-out, a change from the total ban of the sale of such data present in earlier versions of the proposed regulations.[11]

Relaxation of notice requirement for companies not selling consumer data

The March revisions state that businesses that do not collect personal information “directly” from a consumer do not need to provide notice at the time of collection to the consumer, so long as that company “does not sell the consumer’s personal information.”[12] This change should ease the burden on certain companies, although the term “directly” is not defined, creating some potential ambiguity.

Additional requirements for privacy policies

One significant change in the March revisions is the reintroduction of the requirement to list in privacy policies “the categories of sources from which the personal information is collected,”[13] and the “business or commercial purpose for collecting or selling personal information.”[14] Additionally, if a company has “actual knowledge” that it sells the personal information of minors under 16 years of age, its privacy policy must include a description of the special rules and processes for providing the right to opt in to the sale of minors’ personal information.[15] Recall that businesses must obtain affirmative authorization before selling the personal information of minors under 13 years of age, and consent from consumers at least 13 but less than 16 years of age before selling their personal information.[16]

Responding to requests to know and requests to delete

  • Consumers must be informed if sensitive data categories have been collected, even if such information itself is not to be disclosed to the consumer
Under the February revisions and earlier versions of the proposed regulations, businesses were forbidden, in response to “requests to know,” from disclosing certain sensitive categories of information, including biometric data, Social Security numbers and financial account numbers.[17] The March revisions clarify that businesses “shall, however, inform the consumer with sufficient particularity that it has collected the type of information.”[18] For instance, a business should disclose, in response to a request to know, that it “collects unique biometric data including a fingerprint scan,”[19] without disclosing the actual fingerprint scan itself.
  • Businesses must provide consumers denied deletion with option to opt-out of sale of their personal information
Adopting a provision that previously applied only to “unverified” requests for deletion, the March revisions make clear that whenever a company denies a request for deletion, it must inform the requestor of the alternative right to opt out of the sale of personal information (unless the consumer has already made such an opt-out request).[20]

Conclusion

The March revisions of the CCPA regulations provide additional clarification on certain ambiguities in the CCPA and previous iterations of the regulations. Moreover, the fact that this latest round of changes was less far-reaching than the February revisions suggests the regulations are nearing their final form. However, as the reversals from February’s amendments make clear, further change is still possible, and there remain important questions (such as, for instance, the meaning of a “sale” under the regulations) that have yet to be addressed with sufficient particularity for many impacted businesses. Businesses subject to the CCPA should continue to monitor the proposed regulations as they evolve. It is also important to provide comments and weigh in by March 27, 2020 on issues of interest to particular companies that remain unclear. We are available to assist with your inquiries as needed.

____________________

[1] Gibson Dunn will continue to prepare updates regarding the impact of the COVID-19 coronavirus on a wide range of issues, including data privacy and cybersecurity, during this unprecedented moment. We also remain available to assist with these issues as our clients continue to navigate various legal and business challenges posed by COVID-19. See https://www.gibsondunn.com/coronavirus-covid-19-resource-center/.
[2] The entire text of the draft regulations, including the most recent revisions, is available at https://www.oag.ca.gov/sites/all/files/agweb/pdfs/privacy/ccpa-text-of-second-set-mod-031120.pdf?.
[3] California Consumer Privacy Act Update: Attorney General Proposes Regulations Version 2.0, Gibson Dunn (Feb. 19, 2020), available at https://www.gibsondunn.com/california-consumer-privacy-act-update-attorney-general-proposes-regulations-version-2-0/.
[4] Department of Justice, Title 11, Division 1, Chapter 20. California Consumer Privacy Act Regulations (March 11, 2020), available at https://www.oag.ca.gov/sites/all/files/agweb/pdfs/privacy/ccpa-notice-of-second-mod-031120.pdf?.
[5] Draft Regulations § 999.302 [DELETED].
[6] Draft Regulations § 999.336(b).
[7] Draft Regulations § 999.301(j).
[8] See, e.g., Draft Regulations § 999.301(o); § 999.307(a)(1).
[9] Cal. Civil Code § 1798.125(b)(1).
[10] Draft Regulations § 999.306(f) [DELETED].
[11] Draft Regulations § 999.306(e).
[12] Draft Regulations § 999.305(d).
[13] Draft Regulations § 999.308(c)(1)(e).
[14] Draft Regulations § 999.308(c)(1)(f).
[15] Draft Regulations § 999.308(c)(9).
[16] See Draft Regulations §§ 999.330-32.
[17] Draft Regulations § 999.313(c)(4).
[18] Id.
[19] Id. (internal quotation omitted).
[20] Draft Regulations § 999.313(d)(7).
The following Gibson Dunn lawyers assisted in the preparation of this client update: Alexander Southwell, Ryan Bergsieker, Cassandra Gaedt-Sheckter, Dan Rauch, and Lisa Zivkovic. Gibson Dunn's lawyers are available to assist in addressing any questions you may have regarding these developments.  Please contact the Gibson Dunn lawyer with whom you usually work, or any member of the firm's California Consumer Privacy Act Task Force or its Privacy, Cybersecurity and Consumer Protection practice group: California Consumer Privacy Act Task Force: Ryan T. Bergsieker - Denver (+1 303-298-5774, rbergsieker@gibsondunn.com) Cassandra L. Gaedt-Sheckter - Palo Alto (+1 650-849-5203, cgaedt-sheckter@gibsondunn.com) Joshua A. Jessen - Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com) H. Mark Lyon - Palo Alto (+1 650-849-5307, mlyon@gibsondunn.com) Alexander H. Southwell - New York (+1 212-351-3981, asouthwell@gibsondunn.com) Deborah L. Stein (+1 213-229-7164, dstein@gibsondunn.com) Eric D. Vandevelde - Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com) Benjamin B. Wagner - Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com) Please also feel free to contact any member of the Privacy, Cybersecurity and Consumer Protection practice group: United States Alexander H. Southwell - Co-Chair, PCCP Practice, New York (+1 212-351-3981, asouthwell@gibsondunn.com) Debra Wong Yang - Los Angeles (+1 213-229-7472, dwongyang@gibsondunn.com) Matthew Benjamin - New York (+1 212-351-4079, mbenjamin@gibsondunn.com) Ryan T. Bergsieker - Denver (+1 303-298-5774, rbergsieker@gibsondunn.com) Howard S. Hogan - Washington, D.C. (+1 202-887-3640, hhogan@gibsondunn.com) Joshua A. Jessen - Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com) Kristin A. Linsley - San Francisco (+1 415-393-8395, ) H. Mark Lyon - Palo Alto (+1 650-849-5307, mlyon@gibsondunn.com) Karl G. Nelson - Dallas (+1 214-698-3203, knelson@gibsondunn.com) Deborah L. 
Stein (+1 213-229-7164, dstein@gibsondunn.com) Eric D. Vandevelde - Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com) Benjamin B. Wagner - Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com) Michael Li-Ming Wong - San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, mwong@gibsondunn.com)

Europe Ahmed Baladi - Co-Chair, PCCP Practice, Paris (+33 (0)1 56 43 13 00, abaladi@gibsondunn.com) James A. Cox - London (+44 (0)20 7071 4250, jacox@gibsondunn.com) Patrick Doris - London (+44 (0)20 7071 4276, pdoris@gibsondunn.com) Bernard Grinspan - Paris (+33 (0)1 56 43 13 00, bgrinspan@gibsondunn.com) Penny Madden - London (+44 (0)20 7071 4226, pmadden@gibsondunn.com) Michael Walther - Munich (+49 89 189 33-180, mwalther@gibsondunn.com) Kai Gesing - Munich (+49 89 189 33-180, kgesing@gibsondunn.com) Alejandro Guerrero - Brussels (+32 2 554 7218, aguerrero@gibsondunn.com) Vera Lukic - Paris (+33 (0)1 56 43 13 00, vlukic@gibsondunn.com) Sarah Wazen - London (+44 (0)20 7071 4203, swazen@gibsondunn.com)

Asia Kelly Austin - Hong Kong (+852 2214 3788, kaustin@gibsondunn.com) Jai S. Pathak - Singapore (+65 6507 3683, jpathak@gibsondunn.com)

© 2020 Gibson, Dunn & Crutcher LLP Attorney Advertising:  The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.

February 26, 2020 |
4 Questions That May Signal The End Of TCPA Class Actions

Click for PDF

The Telephone Consumer Protection Act has long been a favorite of the plaintiffs’ privacy bar, as the Act provides up to $1,500 in damages per unwanted call.[1] That adds up very quickly in even the most modestly sized class actions.

Since the TCPA was enacted in 1991, the Federal Communications Commission has essentially updated the law through its own orders. These orders have applied the TCPA to technology like text messages and app-to-text messaging, none of which was even a glimmer in the eyes of those who passed the law decades ago. Similarly, although the TCPA was passed at a time when an automatic telephone dialing system, or autodialer, was limited to a machine that would randomly generate 10 digits to be called, the FCC determined that an autodialer is not limited to those systems. Moving beyond the plain language of the statute, the FCC said that a predictive dialer would constitute an autodialer so long as the system stores numbers and automatically dials them.

Several circuit courts across the country then essentially inoculated the FCC’s interpretations from being questioned in TCPA litigation. These courts, adopting a narrow reading of the Hobbs Act’s mandate that courts of appeals have “exclusive jurisdiction to ... determine the validity of all final orders of the [FCC],” held that they could not upset the FCC’s determinations because they were prohibited from reviewing the FCC’s orders in private TCPA litigation.[2] But as Timothy Loose, Jeremy Smith, Wesley Sze and Danielle Hesse explain in a recent Law360 article, the times are changing, and the tide is now moving toward limiting the reach of the TCPA and the FCC’s expansive interpretations. Moreover, the viability of the entire statute will be questioned by the U.S. Supreme Court. The end of the TCPA class action frenzy may be near.

4 Questions That May Signal The End Of TCPA Class Actions (click on link)

___________________

[1] 47 U.S.C. § 227(b)(3).
[2] 28 U.S.C. § 2342(1).

© 2020, Law360, February 25, 2020, Portfolio Media, Inc. Reprinted with permission.
Gibson, Dunn & Crutcher’s lawyers are available to assist with any questions you may have regarding these issues. Please feel free to contact the Gibson Dunn lawyer with whom you usually work, or the authors: Timothy W. Loose - Los Angeles (+1 213-229-7746, tloose@gibsondunn.com) Jeremy S. Smith - Los Angeles (+1 213-229-7973, jssmith@gibsondunn.com) Wesley Sze - Palo Alto (+1 650-849-5347, wsze@gibsondunn.com) Danielle Hesse* - Los Angeles (+1 213-229-6827, dhesse@gibsondunn.com)

* Danielle Hesse is a litigation associate in Gibson Dunn's Los Angeles office.

© 2020 Gibson, Dunn & Crutcher LLP Attorney Advertising:  The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.

February 19, 2020 |
California Consumer Privacy Act Update: Attorney General Proposes Regulations Version 2.0

Click for PDF

On February 7, 2020, California Attorney General Xavier Becerra released a revised set of proposed regulations for the California Consumer Privacy Act of 2018 (“CCPA”), and released an additional amendment on February 10, 2020.[1] These proposed regulations provide further details and clarifications on the steps businesses must take to comply with the CCPA. This is not the end of the road for the development of the regulations, however, as the Attorney General will at least consider additional comments, which must be submitted by February 25, 2020, before the regulations are finalized.[2]

The CCPA[3] took effect January 1, 2020, and aims to give California consumers increased visibility into and control over how companies use and share their personal information. It applies to all entities doing business in California and collecting California consumers’ personal information if they meet certain thresholds. The Attorney General’s power to enforce the law is delayed until July 1, 2020. More information can be found in our prior client alerts on the topic, including a summary of the statute (here), amendments from October 2018 (here), additional proposed amendments (here), the Attorney General’s draft regulations (here), the final amendments signed in October 2019 (here), and a summary of CCPA developments heading into 2020 (here).

The revised version of the proposed regulations adjusts some of the requirements imposed on businesses by the initial proposed regulations, clarifies certain definitional ambiguities, and includes additional proposed provisions relating to service providers’ handling of personal information. Below, we briefly summarize a number of the key changes in the revised proposed regulations. The list is not exhaustive, and we encourage you to contact us with any questions.
As the public comment period is an important opportunity for companies to provide feedback to shape the proposed regulations, please feel free to contact any of the Gibson Dunn attorneys listed below if you are interested in submitting comments in advance of the February 25, 2020 deadline.

Key Definitions Clarified

  • “Personal information”: Version 2.0 of the proposed regulations (“Version 2.0”) adds guidance for interpreting the definition of “personal information” under the CCPA, alleviating some concern regarding exactly how broadly “personal information” would be applied. Specifically, whether information constitutes “personal information” depends on “whether the business maintains [the] information in a manner that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household.”[4] The revised regulations clarify that IP addresses—which have been a particular focus for companies collecting statistical and analytical information on the usage of their websites—that are not tied to any identifiable consumers or households, and that cannot be reasonably linked to any identifiable consumer or household, do not constitute “personal information” under the CCPA in those instances.[5]
  • “Categories”: Version 2.0 clarifies that businesses must describe “categories of sources”[6] and “categories of third parties”[7] to consumers in notices at collection, privacy policies, and in response to verified requests to know with “enough particularity to provide consumers with a meaningful understanding of the type” of source or third party.

Notices At Collection Must Be Readily Accessible

The revised regulations update obligations related to notices at collection, particularly with respect to mobile applications. For instance, the regulations state notices must be posted wherever personal information is collected, including on webpages, mobile application download pages (and within mobile applications, such as the settings menu), and printed forms.[8] Furthermore, businesses must provide consumers with a “just-in-time” notice, describing the categories of personal information being collected and a link to the full notice at collection, when collecting personal information that consumers would “not reasonably expect” to be collected from mobile devices.[9] If personal information is collected orally, oral notice may be given.[10] In addition, the draft provides the following regarding notices:
  • No Additional Consent Required For Use Not “Materially” Different

Version 2.0 makes clear that additional consent is not required for the use of previously collected personal information that is not “materially” different from the uses disclosed in the original notice at collection. Under the previous iteration of the proposed regulations, any additional use of personal information that did not fall strictly into the uses described in the notice at collection would have required the business to seek additional consent.[11]

  • Specific Business Or Commercial Purpose Need Not Be Explicitly Tied To The Category Of Personal Information

The revised regulations no longer require a business to identify the specific business or commercial purpose for the collection of each category of personal information collected.[12] In other words, Version 2.0 suggests it is sufficient simply to list the business or commercial purposes for collecting all the categories of personal information collected, and a breakdown by category is no longer necessary. This revision is similarly explained in the section of Version 2.0 on Privacy Policies.[13]

  • Data Broker Obligations Simplified

Data brokers registered with the Attorney General under California’s data broker registration law (Civil Code § 1798.99.80) that post links to their privacy policies containing instructions on how to opt-out need not provide notices at collection.[14] This provision replaces the previously proposed mandates requiring businesses not collecting personal information directly from consumers to either (1) contact consumers directly to provide notice, or (2) contact the source of the information for an attestation describing how the source provided the required notice at collection (this provision was discussed in more depth in our previous client alert regarding the first draft of the regulations, available here). However, the new provision leaves unclear how “data scrapers” that do not sell personal information—or simply companies that obtain non-exempted personal information from sources other than the consumer, such as publicly available sources other than government records—should provide notice to the consumer.

  • Employee Notice Explained

The notice at collection provided to employees does not need to include a “Do Not Sell My Personal Information” link, and may include a link to (or a paper copy of) the employee privacy policy, instead of the general consumer privacy policy.[15]

“Do Not Sell” Provisions And Optional Opt-Out Button

Version 2.0 provides an option to obtain user consent to sell data that the business collected during the time it did not have a notice of the right to opt-out, as opposed to the total ban of that sale under the previous version of the proposed regulations.[16] The new draft regulations also give the option of providing an “opt-out” button alongside the “Do Not Sell My Personal Information” link, and provide a visual depiction of how such a button may appear (see below), but require the link nonetheless.[17]

Do Not Sell Provision Buttons

Responding To Requests To Know And Requests To Delete

  • Required Response Time Revised

Under the new proposed regulations, businesses are required to confirm receipt of requests to know and requests to delete within 10 business days—instead of “10 days”—though the allowed period to respond to the requests remains 45 calendar days.[18]

  • Designated Methods For Submitting Requests Simplified

The updated proposed regulations eliminate the requirement to maintain a webform. Businesses that operate solely through a website must provide an email address for submitting requests to know, but no longer need to maintain a webform as well. All other businesses must provide at least two designated methods for submitting requests to know, including at a minimum a toll-free number.[19] The requirement to provide two designated methods for submitting requests to delete (which need not include a webform) remains unchanged.[20] However, the proposed regulations indicate that businesses should still consider the ways in which they primarily interact with consumers when providing additional methods for submitting requests.

  • Exemption For Businesses That Do Not Sell And Only Maintain Personal Information For Legal Compliance

In response to consumers who make requests to know the personal information a business has collected about them, Version 2.0 relieves businesses of the requirement to search for personal information if the following conditions are met: 1) the business does not maintain personal information in a searchable or reasonably accessible format; 2) the business maintains the personal information solely for legal or compliance purposes; 3) the business does not sell personal information and does not use it for any commercial purpose; and 4) the business describes to the consumer the categories of records that may contain personal information, despite not having searched because it meets the above conditions.[21]

  • Biometric Information Should Not Be Provided In Response To Requests To Know

In response to requests to know, biometric information joins other categories of sensitive information that businesses cannot disclose, such as Social Security numbers and financial account numbers. Biometric data includes information generated from measurements or technical analysis of human characteristics. This is consistent with the legislature’s recent revision of the categories of information that trigger breach notice requirements and, consequently, the relevant categories of personal information subject to the private right of action under the CCPA.

  • Unverified Requests To Delete Do Not Need To Be Treated As Opt-Out Of Sales

Unlike under the previous draft of the proposed regulations, businesses no longer need to treat unverified requests to delete as opt-outs of sales of personal information. Instead, businesses may ask consumers whether they would instead like to opt out of the sale of their personal information, provided they have not already made such a request.[22]

  • Businesses Can Retain Record Of Requests To Delete

Version 2.0 explicitly allows businesses to retain a record of the request for the purpose of ensuring the consumer’s personal information remains deleted from the business’s records.[23]

  • Verification For Requests From Households With Minors

If a member of the household is a minor under the age of 13, the business must obtain verifiable parental consent before complying with requests to access or delete specific pieces of information for the household.[24]
  • Verification For Non-Account Holders

Version 2.0 provides additional examples of acceptable methods for verifying consumers who do not have password-protected accounts, including asking the consumer to respond to in-app questions, or provide additional information about a transaction amount or an item purchased.[25]

  • Authorized Agent Requests

The new draft regulations allow businesses to require agents to submit a signed permission and consumers to directly confirm the authorization of the agent with the business, and impose additional requirements on authorized agents, such as implementing and maintaining reasonable security to protect consumers and restricting their use of the information.[26]

Service Providers Are Granted Greater Leeway

Version 2.0 no longer bars service providers from using the personal information they collect for their own purposes, so long as the personal information is not used to build household or consumer profiles or “clean or augment the data with data acquired from another source.”[27] Furthermore, in response to requests to know or delete, service providers can either act on behalf of the business or inform the consumer that the request cannot be processed because they are a service provider.[28]

Clarifications Regarding Requests To Opt-Out

The updated regulations now allow businesses to propose an opt-in, after consumers have already opted-out of the sale of their personal information, if those consumers have attempted to use a product or service that requires the sale of their personal information.[29]

Recordkeeping Requirements Lessened

Version 2.0 raises the threshold that triggers the recordkeeping requirement for businesses that collect, use, or disclose the personal information of large numbers of consumers within one year: from information of 4 million consumers to 10 million consumers.[30] Businesses that meet the 10 million threshold must compile and disclose within their privacy policy the numbers of all requests to know, delete, and opt-out received. Though the initial draft regulations required those businesses to report the number of requests received from California residents (consumers) specifically, Version 2.0 grants the option of disclosing the number of requests received from all individuals, eliminating the added effort that may be required to parse out California residents.[31]

Conclusion

Version 2.0 of the proposed CCPA regulations provides some much needed clarification on certain of the ambiguities in the CCPA. However, not all ambiguities have been resolved. For example, the new draft regulations do not provide any practical clarity on the prohibition of the use of personal information for the purpose of building a “consumer” or “household profile” by service providers. Companies subject to the CCPA should continue to monitor the proposed regulations as they evolve. It is also important to provide comments and weigh in by February 25, 2020 on issues of interest to particular companies that remain unclear. We are available to assist with your inquiries as needed.

_____________________

[1] The entire text of the draft regulations is available at https://www.oag.ca.gov/sites/all/files/agweb/pdfs/privacy/ccpa-text-of-mod-redline-020720.pdf?.
[2] Department of Justice, Title 11, Division 1, Chapter 20. California Consumer Privacy Act Regulations (February 10, 2020), available at https://oag.ca.gov/sites/all/files/agweb/pdfs/privacy/ccpa-notice-of-mod-020720.pdf.
[3] The CCPA is encoded in California Civil Code Sections 1798.100 to 1798.198.
[4] Draft Regulations § 999.302(a) (emphasis added).
[5] Id.
[6] Draft Regulations § 999.301(d).
[7] Draft Regulations § 999.301(e).
[8] Draft Regulations § 999.305(a)(3).
[9] Draft Regulations § 999.305(a)(4).
[10] Draft Regulations § 999.305(a)(3)(d).
[11] Draft Regulations § 999.305(a)(5).
[12] Draft Regulations § 999.305(b)(2).
[13] Draft Regulations § 999.308(c)(1)(d).
[14] Draft Regulations § 999.305(d).
[15] Draft Regulations § 999.305(e).
[16] Draft Regulations § 999.306(e).
[17] Draft Regulations § 999.306(f).
[18] Draft Regulations § 999.313(a).
[19] Draft Regulations § 999.312(a).
[20] Draft Regulations § 999.312(b).
[21] Draft Regulations § 999.313(c)(3).
[22] Draft Regulations § 999.313(d)(1).
[23] Draft Regulations § 999.313(d)(3).
[24] Draft Regulations § 999.318(c).
[25] Draft Regulations § 999.325(e).
[26] Draft Regulations § 999.326(a), (e).
[27] Draft Regulations § 999.314(c). Please note that “household or consumer profiles” are not defined.
[28] Draft Regulations § 999.314(e).
[29] Draft Regulations § 999.316(b).
[30] Draft Regulations § 999.317(g).
[31] Id.
The following Gibson Dunn lawyers assisted in the preparation of this client update: Alexander Southwell, Ryan Bergsieker, Cassandra Gaedt-Sheckter, Abbey Barrera, and Lisa Zivkovic. Gibson Dunn's lawyers are available to assist in addressing any questions you may have regarding these developments. Please contact the Gibson Dunn lawyer with whom you usually work, or any member of the firm's California Consumer Privacy Act Task Force or its Privacy, Cybersecurity and Consumer Protection practice group (contact details listed above).

© 2020 Gibson, Dunn & Crutcher LLP Attorney Advertising:  The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.

January 29, 2020 |
International Cybersecurity and Data Privacy Outlook and Review – 2020

Click for PDF For the second consecutive year, following the publication of Gibson Dunn’s eighth annual U.S. Cybersecurity and Data Privacy Outlook and Review on Data Privacy Day, we offer this separate International Outlook and Review. As in many recent years, 2019 saw significant developments in the evolution of the data protection and cybersecurity landscape in the European Union (“EU”):

  • Several EU Member States continued to adapt their national legal frameworks, and data protection authorities started to apply and enforce these laws and the GDPR.
  • The Court of Justice of the EU (“CJEU”) has started to hear cases and deliver rulings concerning the application of the General Data Protection Regulation (“GDPR”)[1] and EU data privacy legislation. The European Data Protection Board (“EDPB”), the EU’s regulatory body that took office in 2018 and is composed of representatives of all EU data protection authorities, continued to adopt relevant opinions and guidance documents regarding the interpretation of the GDPR.
  • The Council of the EU, which represents the governments and administrations of the EU Member States, pursued its internal discussions regarding the adoption of an EU regulation with respect to private life and the protection of personal data in electronic communications, intended to repeal the currently applicable legal framework (“ePrivacy Regulation”).
  • EU Member States continued to work on the transposition and application of the EU Directive on the security of network and information systems (“NIS Directive”).
In addition to the EU, legal developments occurred in other jurisdictions around the globe, including other European jurisdictions, the Asia-Pacific region, Africa and Latin America. We cover these topics and many more in this year’s International Cybersecurity and Data Privacy Outlook and Review. __________________________

Table of Contents

I. European Union
   A. EU GDPR: Implementation, Application and Enforcement
      1. National Data Protection Initiatives Implementing and Applying the GDPR
      2. GDPR Cases, Investigations and Enforcement
      3. CJEU Case Law
         a) Territorial Scope of the “Right To Be Forgotten” under the GDPR
         b) Cookie Consent under the ePrivacy Directive
         c) Obligations of Website Providers and Social Network Services Offering Social Plug-ins
         d) Validity of Data Transfer Mechanisms: Standard Contract Clauses and the EU-U.S. Privacy Shield
      4. Guidance Adopted by the EDPB
      5. International Transfers: Adequacy Declarations and Challenges
   B. EU Cybersecurity Directive (“NIS Directive”)
   C. Reform of the ePrivacy Directive – the Draft EU ePrivacy Regulation Bill
II. Developments in Other European Jurisdictions: Switzerland, Turkey and Russia
   A. Russia
   B. Switzerland
   C. Turkey
III. Developments in Asia-Pacific and Africa
   A. China
   B. Singapore
   C. India
   D. Other Developments in Africa & Asia
IV. Developments in Latin America and in the Caribbean Area
   A. Brazil
   B. Other Developments in the Caribbean Area
__________________________  

I.  European Union

A.  EU GDPR: Implementation, Application and Enforcement

As is widely known, in 2018 the GDPR became the main legislative act for the protection of personal data and privacy in the EU.  Its numerous and lengthy provisions have been the object of interpretation, as regards their application and enforcement, by the CJEU and by the EU data protection authorities gathered in the EDPB.[2]

1.  National Data Protection Initiatives Implementing and Applying the GDPR

Since the adoption of the GDPR, some Member States have adapted their legal frameworks in order to transpose and implement some of the GDPR provisions into their respective national legislation. In the 2019 International Outlook and Review, we provided an overview of the national laws and regulations adopted by the Member States in 2018 in order to adapt their legislation to the GDPR. Below is an overview of the national data protection reforms implemented throughout the EU during 2019:
Member State | National Data Protection Law Adopted
Bulgaria | Act amending the Personal Data Protection Act of 4 January 2002 to implement the GDPR, published in the State Gazette on 26 February 2019.
Czech Republic | Act No. 110/2019 Coll. on the Processing of Personal Data (Data Protection Act), applicable as of its publication in the Official Gazette on 24 April 2019.
Finland | Data Protection Act (1050/2018), approved on 13 November 2018 and applicable as of 1 January 2019.
France | Decree No. 2019-536 of 29 May 2019.
Germany | Second Law on the Adaptation of Data Protection Legislation to the GDPR, published in the Federal Gazette on 25 November 2019.
Greece | Law 4624/2019 on the protection of personal data of 29 August 2019.
Poland | Act of 21 February 2019 amending other legal acts in relation to the implementation of the GDPR.
Portugal | Law No. 58/2019 of 8 August 2019, which repealed the previous data protection law, Law No. 67/98 of 26 October 1998.
Romania | Law No. 129 of 15 June 2018 amending Law No. 102 of 2005.
Slovenia | The new Slovenian Data Protection Act (the "ZVOP-2"), which will repeal the current Data Protection Act (the "ZVOP-1"), is still in the legislative pipeline; on 6 March 2019, the Ministry of Justice released a draft Personal Data Protection Act.

2.  GDPR Cases, Investigations and Enforcement

2019 saw the end of the transition period that supervisory authorities had granted to companies to implement the GDPR, and investigations and infringement proceedings sky-rocketed in the Member States. The most significant cases in important EU jurisdictions are set out below.

In France, the French National Data Protection Commission ("CNIL") received group complaints from the associations None Of Your Business and La Quadrature du Net in May 2018, shortly after the GDPR became applicable. The associations alleged that Google LLC lacked a valid legal basis to process the personal data of the users of its services, particularly for the purposes of customizing and delivering targeted ads. The CNIL concluded that Google had breached its transparency and information obligations, as well as its obligation to rely on a valid legal basis to customize and deliver personalized ads. On these grounds, the CNIL imposed a financial penalty of EUR 50 million on Google LLC on 21 January 2019.[3]

The CNIL also imposed a EUR 500,000 fine on Futura Internationale, a company specializing in the insulation of private homes, for violations of the GDPR. Further to a complaint, the CNIL investigated and found Futura Internationale to have committed the following GDPR violations: (i) the absence of a procedure to ensure the right of data subjects to object to the processing of their personal data; (ii) the presence of irrelevant data in the company's client database (e.g., offensive comments and comments related to health); (iii) insufficient information provided to individuals regarding the processing of their personal data and their rights; (iv) lack of cooperation with the CNIL; and (v) lack of mechanisms to supervise and ensure the compliance of data transfers outside the EU.
In Ireland, a social network service is currently being investigated by the Irish privacy authorities over its refusal to give a user information about how it tracks users when they click on links in public messages. The company refused to disclose the data it recorded when a user clicked on links in other people's messages, claiming that it benefitted from a GDPR exemption, as providing the requested data would involve a "disproportionate effort" for the company. In December 2018, the Irish Data Protection Commission opened a statutory inquiry into the company's compliance with the relevant provisions of the GDPR, following receipt of a number of breach notifications from the company since the introduction of the GDPR.[4] In 2019, the Irish Data Protection Commission concluded its investigation into the social network service over potential violations of the GDPR and moved into the decision-making phase, in which it will issue a draft decision, expected in early 2020.

In Germany, the Berlin Commissioner for Data Protection and Freedom of Information imposed a fine of approximately EUR 14.5 million on a German real estate company for violations of the privacy-by-design and storage-limitation principles. In particular, the Berlin authority found that the company's archive system did not enable personal data that were no longer required to be removed, and that personal data were retained for longer than necessary.[5] This is the highest data protection fine imposed on a German company so far. The German Federal Data Protection Supervisory Authority also imposed a EUR 9.55 million fine on a telecommunications service provider for violations of the GDPR. The authority concluded that individuals calling the provider's customer service hotline could obtain, merely by providing a customer's name and date of birth, extensive information about other customers.
The authority considered that this constituted a breach of Article 32 of the GDPR, which requires data controllers to implement technical and organizational measures to ensure a level of security appropriate to the risks.[6] The company announced that it would challenge the order, arguing that the amount of the fine is disproportionate.

In the UK, on 9 July 2019, the Information Commissioner's Office ("ICO") issued a notice of its intention to fine a hospitality company approximately GBP 99 million for infringements of the GDPR. The proposed fine relates to an incident that affected personal data contained in approximately 30 million guest records of residents of the European Economic Area.[7] The cyber-incident and possible data breach occurred while the company was under one ownership, but was exposed and investigated after the company was transferred to another owner. On 8 July 2019, the ICO also issued a notice of its intention to fine British Airways GBP 183.39 million for infringements of the GDPR. The proposed fine relates to a cyber-incident reported to the ICO by British Airways in September 2018, in which personal data of approximately 500,000 customers were compromised.[8]

On 17 December 2019, the ICO imposed a fine of GBP 275,000, the first fine it has issued under the GDPR, on a pharmacy for failing to comply with the security requirements for special categories of data. The pharmacy allegedly left approximately 500,000 documents (containing clients' personal data, including names, addresses, dates of birth, National Health Service numbers and other medical information) in unlocked containers at the back of its premises. The ICO was alerted to this incident by the Medicines and Healthcare Products Regulatory Agency, which was carrying out its own separate investigation into the pharmacy.
After completing its investigation, the ICO concluded that the pharmacy had failed to process data in a manner ensuring appropriate security against unauthorized or unlawful processing and accidental loss, destruction or damage, in violation of the GDPR.[9]

In Austria, the Austrian data protection authority imposed a fine of EUR 18 million on the Austrian Postal Service, due to its processing of personal data on the political opinions of data subjects for direct marketing purposes. The authority specified that the high amount of the fine was intended to deter further violations.[10]

Finally, in Italy, the Italian data protection authority recently imposed a fine of EUR 11.5 million on the energy company Eni Gas e Luce for its unlawful processing of personal data in the context of promotional activities (telemarketing) and the activation of unsolicited contracts. The fines were determined in line with the GDPR requirements, taking into account the wide range of stakeholders involved, the pervasiveness of the conduct, the duration of the infringement, and the economic conditions of the company.[11]

3.  CJEU Case Law

Building on the body of case law developed over recent years, as indicated in the 2019 International Outlook and Review, 2019 continued to witness numerous cases before the CJEU on the application of the EU Data Protection Directive, the GDPR and the ePrivacy Directive. Set forth below are the most relevant cases and updates concerning the interpretation and application of EU privacy legislation.

a)  Territorial Scope of the “Right To Be Forgotten” under the GDPR

On 24 September 2019, the CJEU delivered a judgment in a case opposing Google LLC and the French supervisory authority, the CNIL. In the underlying proceedings under French law, a fine had been imposed on Google LLC for its failure to implement, on all of its domain extensions worldwide, requests from data subjects to remove search results that referenced their personal data. The CNIL considered it insufficient that "right to be forgotten" requests from French data subjects would be executed only in results on the ".fr" domain of Google Search (i.e., www.google.fr), and only with regard to users located within French territory.[12] In its judgment, the CJEU concluded that a search engine operator is not required to carry out such de-referencing on all versions of its search engine worldwide, but only on the versions corresponding to the EU Member States.

b)  Cookie Consent under the ePrivacy Directive

On 1 October 2019, the CJEU issued a ruling on cookie information and consent obligations under the ePrivacy Directive and the GDPR. The judgment was delivered in the context of proceedings in Germany against Planet49 GmbH, a company that organized an online promotional lottery and required users to enter certain personal data in order to participate, presenting pre-selected checkboxes that authorized Planet49 GmbH to share the personal data with analytics providers, sponsors and cooperation partners for commercial purposes.[13]

In the judgment, the CJEU considered that the "consent" referred to in the ePrivacy Directive, which is based on the definition provided in the GDPR, is not valid if it is collected by way of pre-selected checkboxes that the user must deselect in order to refuse consent. Accordingly, where checkboxes are used, valid "consent" may only be expressed through blank boxes that users must actively select. The ruling applies in principle to the processing of data contained in cookies, stored and accessed on users' devices, regardless of whether those data qualify as personal data. However, given that the CJEU expressly referred to and based its decision on the definition of "consent" under the GDPR, it is possible that this ruling will set a new trend in the definition of "consent" applicable to the processing of personal data in general. Furthermore, the CJEU ruled that online service providers must make available to website users information on the operation of cookies, including the duration of their operation and whether or not third parties may have access to any cookie data received.

c)  Obligations of Website Providers and Social Network Services Offering Social Plug-ins

On 29 July 2019, the CJEU delivered a judgment on the identification of controllers and the scope of the information obligations imposed on online service providers. The ruling was issued in proceedings against Fashion ID, an online clothing retailer that had embedded in its website a "Like" social plug-in from a third-party social network service. Because of the manner in which the Internet works, when a visitor consulted the Fashion ID website, that visitor's personal data (e.g., IP address, cookie data and other browser technical data) were transmitted to the social network service through the social plug-in. This transmission occurred without the knowledge or awareness of the visitor, and regardless of whether the visitor was a member of the social network.[14]

In the judgment, the CJEU concluded that the operator of a website, such as Fashion ID, that embeds in its website a social plug-in transmitting personal data to a third-party provider can be considered a "controller." However, the CJEU limited Fashion ID's role as a "controller" to those data processing operations in respect of which it actually determined the purposes and means. Furthermore, the CJEU found that both the provider of the website (Fashion ID) and the provider of the social plug-in (the social network service) must each pursue a legitimate interest in order to benefit from the legal basis provided for in Article 7(f) of Directive 95/46/EC (Article 6(1)(f) of the GDPR). Finally, the CJEU concluded that the website provider (Fashion ID) needed to obtain any valid consent required, and to provide users with the information necessary to comply with Directive 95/46/EC (now replaced by the GDPR), but only with regard to the data processing operations in respect of which it determined the purposes and means as a "controller."

d)  Validity of Data Transfer Mechanisms: Standard Contractual Clauses and the EU-U.S. Privacy Shield

As indicated in the 2018 and 2019 International Outlook and Reviews, on 3 October 2017, the Irish High Court decided to refer the issue of the validity of the standard contractual clauses decisions to the CJEU for a preliminary ruling.[15] Several questions were referred to the Court in May 2018, relating in particular to the validity of Decision 2010/87 on standard contractual clauses ("SCCs") for the transfer of personal data to processors established in third countries.  On 19 December 2019, the EU Advocate General issued a favorable opinion on the validity of the EU's SCCs.[16] According to the Advocate General, Decision 2010/87 is compatible with the Charter of Fundamental Rights of the EU, since there are sufficiently sound mechanisms to ensure that transfers based on the SCCs are suspended or prohibited where the clauses are breached or impossible to honor. Decision 2010/87 places obligations on data controllers and, where the latter fail to act, on EU data protection authorities, to suspend or prohibit a transfer when, because of a conflict between the obligations arising under the standard clauses and those imposed by the law of the third country of destination, those clauses cannot be complied with.[17] The final judgment of the CJEU is expected in the coming months.

As also indicated in the 2018 and 2019 International Outlook and Reviews, on 12 July 2016, the European Commission formally approved the EU-U.S. Privacy Shield. The Privacy Shield replaced the EU-U.S. Safe Harbor framework for the transatlantic transfer of personal data, which the CJEU invalidated on 6 October 2015 in the case Maximilian Schrems v. Data Protection Commissioner.[18] Since the adoption of the Privacy Shield program in 2016, more than 5,000 companies have adhered to the framework. On 22 November 2017, the CJEU declared inadmissible an action brought against the Privacy Shield by Digital Rights Ireland Ltd.
However, the EU’s General Court admitted a similar challenge of the Privacy Shield brought by French NGO La Quadrature du Net.[19] These proceedings are currently ongoing, and an opinion of the EU’s Advocate General and a Judgment are expected in the course of 2020. In October 2019 the European Commission published its third annual review of the EU-U.S. Privacy Shield, which concluded that the Privacy Shield continues to ensure an adequate level of protection of personal data transferred to participating companies in the U.S.[20] The European Commission noted the adoption of several improvements to the Privacy Shield, such as a more systematic oversight performed by the U.S. Department of Commerce, an improvement of the enforcement action by the Federal Trade Commission, the use of Privacy Shield rights by an increasing number of European individuals, or the appointment of the permanent Ombudsperson. Nevertheless, the European Commission recommended the adoption of additional measures to ensure the effective functioning of the Privacy Shield, including the strengthening of the certification/recertification process, the development of additional guidance related to human resources data, and the expansion of compliance checks. On 12 November 2019, the EDPB published its own report relating to this third annual review, which contains its main findings regarding the commercial aspects of the Privacy Shield and the access by public authorities to data transferred from the EU to the U.S. under the Privacy Shield.[21]

4.  Guidance Adopted by the EDPB

The EDPB, which took office on 25 May 2018, has continued to hold public consultations and adopt Guidelines on the interpretation and application of certain key provisions and aspects of the GDPR. The Guidelines adopted in the course of 2019 include the following:[22]

  • Guidelines on the territorial scope of the GDPR: These Guidelines analyze the different elements that determine whether an entity is subject to the GDPR, depending on whether or not it has an establishment in the EU. Notably, the EDPB clarified that controllers or processors not established in the EU could be subject to the GDPR if they intentionally target EU data subjects to offer goods or services, or if they monitor their behavior. Furthermore, these foreign entities would not benefit from the "one-stop shop" rule if they do not have one or more establishments in the EU.

  • Guidelines on the contractual-necessity legal basis (Article 6(1)(b) of the GDPR): These Guidelines assess the application of the legal basis contained in Article 6(1)(b) of the GDPR, which may be relied upon when personal data are processed for the performance of a contract with a data subject, or in order to take steps at the request of the data subject prior to entering into a contract. In particular, the EDPB found that Article 6(1)(b) of the GDPR may not cover processing activities that are not necessary for the provision of the individual services requested by a data subject, but rather serve the controller's wider business model.

  • Guidelines on the right to be forgotten: In these Guidelines, the EDPB dissected each of the grounds that data subjects may rely on to exercise their right to be forgotten, as well as the exceptions on which data controllers may rely to dismiss such requests, including the need to safeguard the right to freedom of expression and information.

  • Guidelines on data protection by design and by default: The EDPB issued these Guidelines in order to shed some light on one of the least clear obligations imposed by the GDPR. The EDPB clarified that privacy "by design and by default" requires companies to implement necessary and effective safeguards in the form of technical and organizational measures. These should include state-of-the-art technology considered appropriate in view of the costs of implementation, the nature, scope, context and purpose of the processing, and the risks identified at the time of the processing.

  • Guidelines on the processing of personal data through video devices: In these long-awaited Guidelines, the EDPB expressed a common EU approach to the use of video devices (e.g., CCTV cameras) and the related processing of personal data. The EDPB analyzed the possible application of a number of legal bases (e.g., consent, legitimate interests, or performance of a task in the public interest) and assessed the application of the data protection principles to video footage recording (e.g., technical and organizational measures, storage periods). It also addressed the conditions for the disclosure of video footage to third parties, and the exercise of rights by data subjects.

  • Guidelines on the accreditation of certification bodies: The GDPR foresees the appointment of accredited bodies that can certify the compliance of companies and organizations with data protection rules. In these Guidelines, the EDPB outlined the procedure for the accreditation of these certification bodies and set out the requirements for certifying entities' compliance with the GDPR.

  • Guidelines on codes of conduct and monitoring bodies: Under the GDPR, trade associations and other institutional bodies representing controllers or processors may prepare codes of conduct for the purposes of specifying the application of the GDPR in specific fields. These Guidelines provided the criteria for the admissibility and approval of codes, including at the national and EU level, and set out procedures for their approval and revocation, and for their monitoring by accredited bodies.

5.  International Transfers: Adequacy Declarations and Challenges

Both under the former EU Data Protection Directive and under the current GDPR, transfers of personal data outside of the EU are generally prohibited unless, inter alia, the European Commission formally concludes that the legislation of the country to which personal data are transferred protects them adequately. Thus far, the adequacy decisions adopted by the European Commission, including those adopted under the previous legal framework (the Data Protection Directive 95/46/EC), remain in force and cover data transfers to the following jurisdictions: Andorra, Argentina, Canada (commercial organizations), Faroe Islands, Guernsey, Israel, Isle of Man, Japan, Jersey, New Zealand, Switzerland, Uruguay and the U.S. (limited to the EU-U.S. Privacy Shield framework).[23]

As indicated in the 2019 International Outlook and Review, the European Commission has engaged with a number of jurisdictions with a view to recognizing the validity of data transfers to more countries worldwide. During 2019, adequacy talks continued with South Korea, with a view to adopting an adequacy decision in 2020. Although the negotiations have remained confidential so far, it has been reported that the main concerns of the EU authorities related to the independence and powers of the South Korean data protection authority.[24] Amendments to the Personal Information Protection Act have been submitted to the South Korean National Assembly in order to grant enforcement powers and functions to the Personal Information Protection Commission. India, which is preparing a personal data protection bill, also reportedly plans to seek an adequacy decision following the finalization and adoption of that bill.[25] In addition, developments in Indonesia and Taiwan could also lead to future adequacy decisions. Finally, preparatory work has started to initiate discussions regarding the adequacy of several Latin American countries (such as Chile and Brazil).[26]

B.  EU Cybersecurity Directive (“NIS Directive”)

In the EU, cybersecurity legislation addressing incidents affecting essential service and digital service providers is primarily covered by the NIS Directive,[27] adopted on 6 July 2016. As explained in the 2019 International Outlook and Review, the NIS Directive is the first set of cybersecurity rules adopted at the EU level, aiming to set a minimum level of cybersecurity standards and to streamline cooperation between EU Member States at a time of growing cybersecurity threats. In the course of 2019, the European Union Agency for Cybersecurity ("ENISA") was particularly active in issuing guidance and evaluating the responsiveness of EU authorities, stakeholders and systems in responding to cyberattacks. In particular:
  • ENISA published a number of guidance documents aimed at assisting private parties in evaluating the security measures adopted in application of EU instruments such as the GDPR[28] and the NIS Directive.[29]
  • Following the trend toward increased use of consumer products and services relying on cloud services and the Internet of Things, ENISA continued to issue guidance documents providing companies with an overview of the potential risks and redress measures in this context. For example, in January 2019, ENISA published its gap analysis of the security standards observed in the field of the "Internet of Things."[30]
  • ENISA has also strived to adopt guidance documents assisting companies in their day-to-day business practices, such as good practices on the implementation of regulatory technical standards,[31] or measures to reinforce trust and security in electronic communications and services.[32]

C.  Reform of the ePrivacy Directive – the Draft EU ePrivacy Regulation Bill

As explained in the 2019 International Outlook and Review, 2016 saw the initiation of the procedure to reform the EU's main set of rules on ePrivacy, the ePrivacy Directive.  In this context, further to a public consultation held by the European Commission, the first proposal for the future EU ePrivacy Regulation (the "draft ePrivacy Regulation") was released on 10 January 2017.[33] In 2017, the draft ePrivacy Regulation was the subject of an opinion of the Article 29 Working Party ("WP29") of 4 April 2017,[34] and an amended version was issued by the European Parliament on 20 October 2017.[35] Since then, internal discussions have been ongoing at the level of the Council of the EU throughout 2018 and 2019. Despite the progress made on this front, in November 2019 it was made public that the Council could still not find a common position on a variety of topics concerning the ePrivacy Regulation. Press reports have identified the following outstanding aspects as being at the origin of the disagreement among Member States:[36]
  • The processing of electronic communications data for the purposes of prevention of child abuse imagery: Member States have diverging views on whether and how to achieve this objective.
  • The protection of terminal equipment information: Member States have been reported to discuss extensively regarding conditional access to website content (so-called “cookie walls”), which underlies numerous existing business models. The positions of the Council and of the European Parliament differ vastly in this area.
  • Processing of electronic communications data by third parties: While the latest draft proposal included a recital clarifying the concept of third parties, there are other ongoing discussions regarding whether the scope of these obligations should be extended to electronic communications providers in general, or to services covered by current sectoral legislation.
  • Cooperation among data protection and telecommunications regulatory authorities: A number of Member States have raised concerns regarding the cooperation among various enforcement authorities.
Given the disagreement among Member States within the Council, it has been reported that the European Commission has recently withdrawn the ePrivacy Regulation bill in order to update it in light of the various positions expressed by the Member States to date. The European Commission reportedly aims to resubmit a new ePrivacy Regulation bill for discussion during the Croatian Presidency of the Council (January to June 2020).[37]

II.  Developments in Other European Jurisdictions: Russia, Switzerland and Turkey

As we indicated in the 2019 International Outlook and Review, the increasing impact of digital services in Europe and the overhaul brought about by the GDPR in the EU have led certain jurisdictions in the vicinity of the EU to improve and enforce more vigorously their data protection regulations.

A.  Russia

Local data privacy laws continued to be heavily enforced, reflecting the activity of the Russian Federal Service for the Supervision of Communications, Information Technology and Mass Communications ("Roskomnadzor") in monitoring and enforcing data protection compliance. For example, in January 2019, it became public that Roskomnadzor had sent letters to two social network services regarding their compliance with Russian data localization laws. In February and March 2019, Roskomnadzor announced administrative proceedings against these companies for alleged violations of Russian data protection laws. In July 2019, Roskomnadzor imposed a RUB 700,000 fine (approx. EUR 10,000) on Google for its alleged failure to remove prohibited search engine results. According to Roskomnadzor, more than a third of the links from a single Google search registry contained information prohibited under Russian law.

On 2 December 2019, the fines for violations of data localization and data processing requirements were increased. In particular, the failure by operators to collect, systematize and store personal data in Russian databases is now punishable by fines of RUB 1 million to 6 million (approx. EUR 14,000 to 84,500) for legal entities. In addition, the amending law provides that repeat offences will lead to fines of up to RUB 18 million (approx. EUR 250,000) for legal entities.

B.  Switzerland

As indicated in the 2019 International Outlook and Review, to prepare for the entry into force of the GDPR, the Swiss government had issued a draft of a new Data Protection Act (the “Draft FDPA”)[38] that aims to:
  • Modernize Swiss data protection law and, to a certain extent, align it to the requirements of the GDPR; and,
  • Maintain its adequacy status granted by the European Commission, to ensure the free flow of personal data between the EU and Switzerland.
The Draft FDPA was published by the Swiss Federal Council on 15 September 2017 and is intended to replace the Federal Act on Data Protection of 19 June 1992 (the "FADP"). In November 2019, the Swiss Federal Assembly announced that the State Political Commission of the Council of States ("PCI-S") had completed its detailed consultation on the Draft FDPA, which had been unanimously accepted after consultation with the representatives of the cantonal data protection officers. In order to align the Draft FDPA more closely with the GDPR, the PCI-S departed from the decisions of the National Council in certain respects, for example by including trade union membership as a category of sensitive personal data. It is therefore expected that the Draft FDPA will be adopted in the course of 2020.

C.  Turkey

Throughout 2019, the Turkish data protection authority (the "KVKK") issued a number of regulations and guidance documents on issues related to the application and enforcement of the Turkish Data Protection Act No. 6698 of 2016. These include the following:
  • Data protection obligations: On 18 March 2019, the KVKK issued guidelines on data protection in Turkey, addressing data processing requirements such as consent, transfers of data within and outside of Turkey, and data controller obligations, among other topics.
  • Subject access requests: On 13 February 2019, the KVKK issued a decision on the time frames for lodging a complaint with the KVKK following a subject access request. The decision focuses on cases where a request made under the Turkish Data Protection Act was rejected, answered insufficiently, or not answered in due time.
  • Data processing registry: On 28 April 2019, the KVKK published a guide on the preparation of the data processing registry. The guide specifies the content of the registry and the preparation process, such as determining the purpose of the data processing and the data retention period.
  • Data processing guide: On 6 August 2019, the KVKK published a guide that aims to make it easier for companies to understand data protection requirements under the Turkish Data Protection Act, such as obligations regarding data disclosure, deletion, and anonymization, the obligation to register as a data controller, and exceptions to the obligation to maintain a registry of operations, among other things.
  • Transparency requirements: On 8 November 2019, the KVKK issued a statement on transparency requirements, in order to bring companies’ practices into further compliance with the Turkish Data Protection Act. In January 2020, the KVKK announced the launch of its online portal on data violations, which is expected to increase the KVKK’s supervisory activity and enforcement actions.
Furthermore, the KVKK continued its enforcement of the Turkish Data Protection Act. For example, in May 2019, the KVKK imposed fines of up to 4.65 million TRY (approx. 250,000 EUR) on a social network service for its alleged failure to notify data breaches. In July 2019, the KVKK imposed a fine of 1.45 million TRY (approx. 220,000 EUR) on a hospitality company for an alleged data breach that affected Turkish citizens. Overall, the KVKK imposed fines totaling over 1 million EUR on several companies for data breaches occurring across several sectors.

III.  Developments in Asia-Pacific and Africa

As we indicated in the 2019 International Outlook and Review, in an increasingly connected world, 2019 also saw many other countries trying to get ahead of the challenges of the evolving cybersecurity and data protection landscape.  Several international developments bear brief mention here:

A.  China

As indicated in the 2019 International Outlook and Review, China’s Cybersecurity Law was adopted on 1 June 2017, becoming the first comprehensive Chinese law to regulate the management and protection of digital information by companies.  The law also imposes significant restrictions on the transfer of certain data outside of the mainland (data localization), enabling government access to such data before it is exported.[39] On 10 September 2018, the National People’s Congress of China announced, as part of its legislative agenda, that its Standing Committee would consider draft laws with relatively mature conditions, including a draft personal information protection law and a draft data security law.[40] On 25 January 2019, the Ministry of Industry and Information Technology of the People’s Republic of China (“MIIT”), the Cyberspace Administration of China (“CAC”), the Ministry of Public Security, and the State Administration for Market Regulation released a statement on privacy practices for applications. In particular, the announcement outlined the consent requirements under the Chinese Cybersecurity Law: it requires controllers to provide privacy notices in clear, concise wording and to obtain freely given consent, discourages “bundled” forms of consent, and encourages app operators to provide an opt-out mechanism for personalized advertisements. In March 2019, the MIIT identified a number of organizations that had been involved in nuisance calls and the use of illegal apps to collect personal information. The MIIT noted that it had arranged for the companies involved to immediately shut down phone lines used to facilitate the illegal calls. It also highlighted that it would cooperate with the Central Network Information Office, the Ministry of Public Security and the General Administration of Market Supervision to strengthen the protection of personal information collected by mobile apps.
The CAC has also been involved in the adoption of bills and rules regarding the protection of personal data in China, including the following:
  • On 24 May 2019, the CAC published draft measures to enhance the security and management of critical information infrastructure, and launched a public consultation on the same topic.
  • On 28 May 2019, the CAC published draft measures on data security management including, among others, provisions for privacy, data processing, notifications, and consent.
  • On 31 May 2019, the CAC issued draft measures on the collection, storage, use, transfer and disclosure of children’s personal information. The draft measures apply to children under 14 years of age and, among other things, specify that network operators should set up dedicated children’s personal information protection user agreements as well as designate personnel to be responsible for protecting children’s personal information.
  • On 13 June 2019, the CAC issued draft measures on cross-border data transfers. In particular, the draft measures require network operators to provide a declaration form, a signed contract between network operators and recipients, and a security risk assessment, among other things, before personal information may be transferred out of China.
  • In July 2019, the CAC announced the release of an Internet information service complaint platform in order to facilitate and encourage data subjects to defend their rights.

B.  Singapore

As indicated in the 2019 International Outlook and Review, the Personal Data Protection Commission of Singapore issued on 7 November 2017 the proposed advisory guidelines for the collection and use of national registration identification numbers.  The Commission gave businesses and organizations 12 months from the date of publication to review their processes and implement necessary changes to ensure compliance.[41] Following the expiration of this grace period, the Singapore Personal Data Protection Commission (“PDPC”) has initiated enforcement action and issued fines against numerous companies across all sectors for violations of Singapore data protection laws. For example, in January 2019, the PDPC imposed a fine of 750,000 SGD (approx. 500,000 EUR) on Integrated Health for data security failures.

C.  India

As we indicated in the 2019 International Outlook and Review, the Indian Ministry of Electronics and Information Technology published, on 27 July 2018, the Personal Data Protection Bill (the “Bill”) and the Data Protection Committee Report (the “Report”).[42] In December 2019, after further deliberations, the Bill was approved by the Cabinet Ministry of India and was tabled in the Indian Parliament by the Minister of Electronics and Information Technology. At the end of December 2019, the Bill was referred to a Joint Parliamentary Committee for analysis in consultation with various groups.

D.  Other Developments in Africa & Asia

Throughout 2019, a number of jurisdictions in Asia and Africa adopted data protection legislation, including the following:
  • Kenya: On 8 November 2019, the Kenya Data Protection Bill 2019 was signed into law.
  • Nigeria: The National Information Technology Development Agency issued the Nigeria Data Protection Regulation 2019.
  • Togo: In October 2019, Law No. 2019-014 Relating to the Protection of Personal Data was published in the Official Gazette.
  • Uganda: In 2019, the Data Protection and Privacy Act entered into force.
  • Indonesia: In October 2019, Government Regulation No. 71 of 2019 on the Implementation of Electronic Systems and Transactions became effective.
  • New Zealand: In 2019, the Parliament discussed the Privacy Amendment Bill, which is expected to become law in the course of 2020.
  • Thailand: In May 2019, the Personal Data Protection Act and the Cybersecurity Act entered into force.

IV.  Developments in Latin America and in the Caribbean Area

The overhaul of data protection rules in important jurisdictions around the globe has also impacted Latin America and the Caribbean countries, where some local administrations have bolstered their respective legislation and undertaken initiatives to bring their framework closer to that of the EU.

A.  Brazil

As we indicated in the 2019 International Outlook and Review, a new General Data Protection Law was adopted in Brazil on 14 August 2018, after several years of discussions among decision-makers.[43] In July 2019, the President of Brazil promulgated Law No. 13.853 amending the General Data Protection Law. In its final form, the General Data Protection Law introduced some important revisions, such as the creation of an enforcement authority (the National Data Protection Authority), the extension of its application to public bodies as well as to private entities, the extensive appointment of Data Protection Officers, and the postponement of its application until 2022. In the meantime, Brazilian authorities have stepped up enforcement action to protect the privacy of their citizens. For example, on 30 December 2019, it was announced that the Ministry of Justice and Public Security had fined a social network service 6.6 million BRL (approx. 1.4 million EUR) for the alleged transfer to and misuse of personal data of Brazilian users by a political marketing consultancy firm.

B.  Other Developments in the Caribbean Area

Throughout 2019, a number of jurisdictions in the Caribbean area adopted data protection legislation, including the following:
  • Barbados: In August 2019, the bill for the Data Protection Act 2019 was passed by the House of Assembly after its approval by the Senate.
  • Cayman Islands: On 30 September 2019, the Data Protection Law, 2017 (Law 33 of 2017) entered into force.
  • Jamaica: In July 2019, the Minister of Science, Energy and Technology submitted to the Parliament a bill to reform the Data Protection Act 2017.
______________________
[1]   See Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC, OJ L 119, 4.5.2016, p. 1.
[2]   The EDPB is an EU body formed by the representatives of the data protection authorities of the EU Member States, the EEA States (Iceland, Liechtenstein and Norway), and the European Data Protection Supervisor (the data protection agency that supervises the compliance of the EU institutions with EU data protection legislation). Under the GDPR, the EDPB has certain advisory, enforcement and decision-making powers.
[3]   See: https://www.cnil.fr/en/cnils-restricted-committee-imposes-financial-penalty-50-million-euros-against-google-llc.
[4]   See: https://www.dataprotection.ie/en/news-media/press-releases/data-protection-commission-opens-statutory-inquiry-twitter.
[5]   See: https://www.datenschutz-berlin.de/fileadmin/user_upload/pdf/pressemitteilungen/2019/20191105-PM-Bussgeld_DW.pdf.
[6]   See: here.
[7]   See: here.
[8]   See: https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2019/07/ico-announces-intention-to-fine-british-airways/.
[9]   See: https://ico.org.uk/media/action-weve-taken/mpns/2616742/doorstop-mpn-20191217.pdf.
[10]   See: https://edpb.europa.eu/news/national-news/2019/administrative-criminal-proceedings-austrian-data-protection-authority_en.
[11]   See: https://edpb.europa.eu/news/national-news/2020/italian-supervisory-authority-fines-eni-gas-e-luce-eur-115-million-account_en.
[12]   See CJEU, Case C-507/17, Google LLC v. CNIL (24 September 2019).
[13]   See CJEU, Case C-673/17, Verbraucherzentrale Bundesverband e.V. v. Planet49 GmbH (1 October 2019).
[14]   See CJEU, Case C-40/17, Fashion ID GmbH & Co. KG v. Verbraucherzentrale NRW eV (29 July 2019).
[15]   See Irish High Court Commercial, The Data Protection Commissioner v. Facebook Ireland Limited and Maximilian Schrems, 2016 No. 4809 P.
[16]   See Opinion of Advocate General Saugmandsgaard Øe on Case C-311/18, Data Protection Commissioner v. Facebook Ireland Limited, available here.
[17]   See Opinion of the Advocate General in Case C-311/18, Facebook Ireland and Schrems, available here.
[18]   See CJEU, Case C-362/14, Maximillian Schrems v. Data Protection Commissioner (6 October 2015).
[19]   See General Court, Case T-738/16, La Quadrature du Net and Others v. Commission.
[20]   See Report from the Commission to the European Parliament and the Council on the third annual review of the functioning of the EU-U.S. Privacy Shield, available here.
[21]   See “EU – U.S. Privacy Shield - Third Annual Joint Review,” available here.
[22]   See: https://edpb.europa.eu/our-work-tools/general-guidance/gdpr-guidelines-recommendations-best-practices_en.
[23]   See: https://ec.europa.eu/info/law/law-topic/data-protection/data-transfers-outside-eu/adequacy-protection-personal-data-non-eu-countries_en.
[24]   See IAPP, “South Korea’s EU adequacy decision rests on new legislative proposals” (27 November 2018), available at https://iapp.org/news/a/south-koreas-eu-adequacy-decision-rests-on-new-legislative-proposals/.
[25]   See IAPP, “India to seek adequacy status with EU” (31 July 2019), available at https://iapp.org/news/a/india-to-seek-adequacy-status-with-eu/.
[26]   See “Communication from the Commission to the European Parliament and the Council - Data protection rules as a trust-enabler in the EU and beyond – taking stock,” available here.
[27]   See Directive (EU) 2016/1148 of the European Parliament and of the Council of 6 July 2016 concerning measures for a high common level of security of network and information systems across the Union, OJ L 194, 19.7.2016, pp. 1-30, available here.
[28]   See: https://www.enisa.europa.eu/publications/recommendations-on-shaping-technology-according-to-gdpr-provisions, https://www.enisa.europa.eu/publications/recommendations-on-shaping-technology-according-to-gdpr-provisions-part-2, and https://www.enisa.europa.eu/publications/pseudonymisation-techniques-and-best-practices.
[29]   See: https://www.enisa.europa.eu/publications/eu-ms-incident-response-development-status-report.
[30]   See: https://www.enisa.europa.eu/publications/iot-security-standards-gap-analysis.
[31]   See: https://www.enisa.europa.eu/publications/good-practices-on-the-implementation-of-regulatory-technical-standards.
[32]   See: https://www.enisa.europa.eu/publications/reinforcing-trust-and-security-in-the-area-of-electronic-communications-and-online-services.
[33]   See: https://ec.europa.eu/digital-single-market/en/proposal-eprivacy-regulation.
[34]   See: http://ec.europa.eu/newsroom/document.cfm?doc_id=44103.
[35]   See: here.
[36]   See: https://iapp.org/news/a/how-the-eprivacy-regulation-failed-again/.
[37]   See: https://www.euractiv.com/section/data-protection/news/commission-to-present-revamped-eprivacy-proposal/.
[38]   The Draft FDPA is available in the official languages of Switzerland; an unofficial English version of the Draft FDPA is also available here.
[39]   See FT Cyber Security, “China’s cyber security law rattles multinationals,” Financial Times (30 May 2017), available at https://www.ft.com/content/b302269c-44ff-11e7-8519-9f94ee97d996.
[40]   See: http://www.npc.gov.cn/npc/xinwen/2018-09/10/content_2061041.htm (press release in Chinese).
[41]   See Singapore Personal Data Protection Commission, Proposed Advisory Guidelines on the Personal Data Protection Act for NRIC Numbers, published 7 November 2017, available here.
[42]   See: http://meity.gov.in/writereaddata/files/Personal_Data_Protection_Bill%2C2018_0.pdf.
[43]   See IAPP, “GDPR matchup: Brazil’s General Data Protection Law” (4 October 2018), available at https://iapp.org/news/a/gdpr-matchup-brazils-general-data-protection-law/.

The following Gibson Dunn lawyers assisted in the preparation of this client alert: Ahmed Baladi, Alexander Southwell, Alejandro Guerrero, Guillaume Buhagiar and Clémence Pugnet.

Gibson Dunn's lawyers are available to assist with any questions you may have regarding these issues. For further information, please contact the Gibson Dunn lawyer with whom you usually work, the authors, or the following leaders and members of the firm's Privacy, Cybersecurity and Consumer Protection practice group:

Europe Ahmed Baladi - Co-Chair, PCCP Practice, Paris (+33 (0)1 56 43 13 00, abaladi@gibsondunn.com) James A. Cox - London (+44 (0)20 7071 4250, jacox@gibsondunn.com) Patrick Doris - London (+44 (0)20 7071 4276, pdoris@gibsondunn.com) Bernard Grinspan - Paris (+33 (0)1 56 43 13 00, bgrinspan@gibsondunn.com) Penny Madden - London (+44 (0)20 7071 4226, pmadden@gibsondunn.com) Michael Walther - Munich (+49 89 189 33-180, mwalther@gibsondunn.com) Kai Gesing - Munich (+49 89 189 33-180, kgesing@gibsondunn.com) Alejandro Guerrero Perez - Brussels (+32 2 554 7218, aguerrero@gibsondunn.com) Vera Lukic - Paris (+33 (0)1 56 43 13 00, vlukic@gibsondunn.com) Sarah Wazen - London (+44 (0)20 7071 4203, swazen@gibsondunn.com) Guillaume Buhagiar - Paris (+33 (0)1 56 43 13 00, gbuhagiar@gibsondunn.com) Clémence Pugnet - Paris (+33 (0)1 56 43 13 00, cpugnet@gibsondunn.com)

Asia Kelly Austin - Hong Kong (+852 2214 3788, kaustin@gibsondunn.com) Jai S. Pathak - Singapore (+65 6507 3683, jpathak@gibsondunn.com)

United States Alexander H. Southwell - Co-Chair, PCCP Practice, New York (+1 212-351-3981, asouthwell@gibsondunn.com) Debra Wong Yang - Los Angeles (+1 213-229-7472, dwongyang@gibsondunn.com) Matthew Benjamin - New York (+1 212-351-4079, mbenjamin@gibsondunn.com) Ryan T. Bergsieker - Denver (+1 303-298-5774, rbergsieker@gibsondunn.com) Howard S. Hogan - Washington, D.C. (+1 202-887-3640, hhogan@gibsondunn.com) Joshua A. Jessen - Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com) Kristin A. Linsley - San Francisco (+1 415-393-8395, ) H. Mark Lyon - Palo Alto (+1 650-849-5307, mlyon@gibsondunn.com) Karl G. Nelson - Dallas (+1 214-698-3203, knelson@gibsondunn.com) Deborah L. Stein (+1 213-229-7164, dstein@gibsondunn.com) Eric D. Vandevelde - Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com) Benjamin B. Wagner - Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com) Michael Li-Ming Wong - San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, mwong@gibsondunn.com) © 2020 Gibson, Dunn & Crutcher LLP Attorney Advertising:  The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.

January 30, 2020 |
Ashley Rogers Named to Dallas Business Journal 40 Under 40

The Dallas Business Journal named Dallas of counsel Ashley Rogers to its 2020 40 Under 40 list, which “honors rising stars and people you should know in the North Texas business community.”  The list was published on January 22, 2020. Ashley Rogers’ practice focuses on a wide range of data privacy and consumer protection matters, with particular expertise in representing clients in the technology and internet industries in putative data privacy class actions and in government investigations.  She also counsels clients on a range of issues relating to data privacy and consumer protection laws.

January 28, 2020 |
Law360 Names Gibson Dunn Among Its 2019 Cybersecurity & Privacy Practice Groups of the Year

Law360 named Gibson Dunn one of its five Cybersecurity & Privacy Groups of the Year for 2019.  The firm’s Cybersecurity & Privacy practice was profiled on January 27, 2020. The firm’s Privacy, Cybersecurity and Consumer Protection Practice Group has a demonstrated history of helping companies successfully navigate the complex and rapidly evolving laws, regulations, and industry best practices relating to privacy, cybersecurity and consumer protection.  Our global and interdisciplinary team advises clients across a broad range of industries in high-stakes matters on the full spectrum of issues in these areas.

January 27, 2020 |
U.S. Cybersecurity and Data Privacy Outlook and Review – 2020

Click for PDF In honor of Data Privacy Day—a worldwide effort to raise awareness and promote best practices in privacy and data protection—we offer this eighth edition of Gibson Dunn’s United States Cybersecurity and Data Privacy Outlook and Review. In 2019, companies, courts, and regulators faced unprecedented challenges as they navigated a rapidly evolving set of cybersecurity and privacy issues.  Congress and state legislatures proposed (and, in the case of some states, enacted) measures ranging from limits on the use of consumer data to protecting children’s internet privacy.  Increasingly active federal and state regulators enforced data privacy, cybersecurity, and consumer protection standards in the face of novel cybersecurity threats.  Private parties stepped up the pace of civil litigation in a year that saw numerous high-profile data breaches and continued questions over who can sue for damages.  And questions regarding the government’s ability to access data, from biometric information to files stored overseas, came into sharper legislative and judicial focus. This Review places these, and other, 2019 developments in broader context, addressing: (1) the regulation of privacy and data security, including key legislative developments, enforcement actions by federal and state authorities, and new regulatory guidance; (2) trends in civil litigation around data privacy issues in areas including privacy class actions, digital communications, and biometric information privacy laws; and (3) the collection of electronically stored information by government actors, including the extraterritoriality of subpoenas and warrants and the collection of data from electronic devices.  While we do not attempt to address every development that occurred in 2019, this Review examines a number of the most significant developments affecting companies as they navigate the evolving cybersecurity and privacy landscape. 
This Review focuses on cybersecurity and privacy developments within the United States.  For information on developments outside the United States, please see Gibson Dunn’s International Cybersecurity and Data Privacy Outlook and Review, which addresses developments in 2019 outside the United States that are of relevance to domestic and international companies alike.  We have adopted the practice of referring to companies by generic descriptors in the body of the alert; for further details, please see the endnotes. ________________________

Table of Contents

I.  REGULATION OF PRIVACY AND DATA SECURITY
A.  Legislative Developments
1.  State
2.  Federal
B.  Enforcement and Guidance
1.  Federal Trade Commission
2.  Department of Health and Human Services and HIPAA
3.  Securities and Exchange Commission
4.  Other Federal Agencies
5.  State Attorneys General and Other State Agencies
II.  CIVIL LITIGATION
A.  Data Breach Litigation
B.  Telephone Consumer Protection Act Litigation
C.  Biometric Information Privacy Act Litigation
D.  Other Notable Cases
III.  GOVERNMENT DATA COLLECTION
A.  Collection of Data from Computers, Cellphones, and Other Devices
B.  Other Notable Developments
IV.  CONCLUSION
________________________

I.  Regulation of Privacy and Data Security

A.  Legislative Developments

1.  State

a)  California Consumer Privacy Act of 2018

As the first comprehensive consumer privacy law in the United States, the California Consumer Privacy Act of 2018 (“CCPA”) has changed the legal landscape.  According to one observer, initial compliance with the CCPA will cost businesses around $55 billion.[1]  As reported in detail in Gibson Dunn’s prior CCPA updates,[2] the law requires businesses to disclose what personal information they collect from California consumers (defined broadly as California residents), for what purpose, and to what third parties the information is shared or sold.  The law also affords consumers the right to request deletion of their personal information and to opt out of the sale of such information, among other provisions. Although passed in 2018 and effective in January 2020, the law continued to evolve throughout 2019, and is still evolving.  California’s Attorney General is set to release final regulations in the first part of 2020 (at the time of publishing this Review, only a draft version of the regulations had been released, in October 2019).[3]  Further, the California legislature passed multiple amendments just two months before the law became effective,[4] and continued attempts at amending the law are expected, along with another ballot initiative in November 2020 that would expand the CCPA’s reach.[5]  And despite clarifying amendments and draft regulations aimed at implementing the CCPA, there are still a number of open issues for businesses to analyze. As an example, the scope of “sale” continues to be the subject of extensive debate.  The CCPA regulates the “sale” of personal information, which it defines as the exchange of personal information “for monetary or other valuable consideration.”[6]  This definition creates some uncertainty for businesses that do not expressly sell user data in a traditional sense, but may receive some tangible benefit from sharing the data with a third party.
In addition, where data is automatically collected and analyzed by a third party using web-browser cookies, it can be technologically difficult or impossible to identify what information is associated with the particular consumer and to wholly comply with the consumer’s request.  Separately, the law’s private right of action for subjects of certain data breaches caused by a lack of “reasonable” security protections has caused concern regarding what constitutes “reasonable,” particularly in light of the statute’s potentially steep statutory damages.[7] While California’s Attorney General will not bring enforcement actions under the CCPA until July 1, 2020, the law went into effect January 1, 2020, and the Attorney General indicated in late 2019 that he may consider prosecuting businesses not in compliance with the law as of the effective date.[8]  That said, the Attorney General also has reported that enforcement initially will focus on companies that deal in large amounts of sensitive personal data—such as health data and Social Security numbers[9]—and on companies that collect the personal data of children.[10]  Meanwhile, the CCPA’s narrow private right of action for data breaches is already in full effect.[11]  While as of the time of this writing no such actions have been widely reported as filed, Gibson Dunn will continue to monitor CCPA-related developments.

b)  Other State Laws

Aside from the CCPA, several other states also considered, passed, or began enforcement on their own data privacy and consumer protection laws in 2019.
i.  Nevada
On October 1, 2019, Nevada’s “Act relating to Internet privacy” went into effect.[12]  Compared to the CCPA, Nevada’s privacy law has a narrower definition of a “sale” of personal information:  “the exchange of covered information for monetary consideration by the operator to a person for the person to license or sell the covered information to additional persons.”[13]  This definition does not include the CCPA’s broader concept of an exchange of covered information for “other valuable consideration.”  The Nevada law also has a narrower definition of “consumer”—a “consumer” is a “person who seeks or acquires, by purchase or lease, any good, service, money or credit for personal, family or household purposes from the Internet website or online service of an operator.”  The law excludes from the definition of “operator”: (1) financial institutions and affiliates subject to the GLBA; (2) HIPAA-covered entities; and (3) certain manufacturers of motor vehicles and persons who repair or service motor vehicles.[14] The law requires website operators to provide an online notice disclosing what covered information they maintain, and to permit consumers to opt out of any sale of such information by the website to third parties.[15]  Nevada’s privacy law contains no private right of action, and caps penalties at $5,000 per violation.[16]
ii.  Maine
Like Nevada, Maine’s new data privacy law, “An Act to Protect the Privacy of Online Customer Information,” which will go into effect July 1, 2020, is narrower than the CCPA in many ways.[17]  For example, it applies only to broadband providers in Maine and affects only those customers who are physically located and billed for broadband services in Maine.[18]  The Act generally prohibits broadband providers from using, disclosing, selling, or permitting nonconsensual access to their customers’ personal information.  The law imposes a transparency requirement on broadband providers to publish privacy notices informing customers of their rights and of the provider’s obligations at the point of sale.  Similar to the CCPA, the law prohibits broadband providers from refusing service to customers who withhold consent, and from charging customers a penalty or offering customers a discount based on the customer’s decision whether to provide consent.[19]
iii.  New York
The Stop Hacks and Improve Electronic Data Security (“SHIELD”) Act modifies New York’s data breach law by changing the definition of a data breach to include any unauthorized person gaining access to or acquisition of the protected information.[20]  The Act also expands upon the definition of “private information” to include, in conjunction with a New York resident’s name, number, personal mark or other identifier, the following data: (1) bank account, credit, or debit card number, provided that the numbers could be used to access an individual’s account without more; and (2) biometric information.[21]  The Act also adds to the definition of “private information” usernames or email addresses accompanied with passwords or security questions and answers that would grant access to an online account.[22]  The Act requires covered entities to establish data security programs to safeguard personal user data, safeguards that are tailored to the size of the business.[23]  The Act relieves covered entities, however, of their notification obligations if a breach was the result of an inadvertent disclosure by persons authorized to access private information and the entity determines that the exposure is unlikely to result in harm to the affected individuals.  However, if the breach affects over 500 New York residents, the covered entity must provide a written determination as to the risk of harm to these individuals to the New York Attorney General.[24]  Notably, 2019 also saw legislative attempts, ultimately unsuccessful, for New York to pass the New York Privacy Act,[25] a proposed law offering protections as broad, or broader, than those provided by the CCPA, as discussed more fully below.[26]

c)  State Laws Under Consideration

Numerous states considered privacy legislation in 2019, and many of those states are expected to revive their failed 2019 bills in 2020.[27]  For example, Washington is expected to adopt a version of the “Washington Privacy Act”—previously stalled in 2019—which, in addition to adopting many of the CCPA’s provisions, would set limits on the commercial use of facial recognition technology and would grant consumers the right to confirm whether a controller is processing personal data about the consumer and to access that data, to correct inaccurate data, to delete personal data, and to clearly opt out of the use of personal information for targeted advertising.[28]  The draft legislation provides for no private right of action and caps penalties at $7,500 per violation.[29] In addition, New York, Florida, Texas, Massachusetts, New Jersey, Virginia, and New Hampshire are a few of the many states considering adopting comprehensive privacy laws similar to CCPA (in the absence of preemptive federal legislation).  In particular, New York’s proposed law contains more stringent requirements than the CCPA.[30]  It would require consumer opt-in before a company could use, process, sell, share, or transfer that consumer’s data, and would impose upon controllers and data brokers who collect, sell, or license personal data a fiduciary duty of care, loyalty, and confidentiality.[31]  The New York proposed law would also allow for a private right of action.[32]  With so many diverging state privacy bills passed or gaining traction, many businesses are rightfully concerned that 2020 signals the beginning of a patchwork of comprehensive state privacy laws, resulting in an even more complex compliance environment.[33]

2.  Federal

a)  Comprehensive Privacy Legislation

Three comprehensive privacy bills are currently being considered in Congress, each discussed below.  Democrats have published a “Senate Democratic Privacy Principles” list of minimum provisions required in any Democratic-backed privacy legislation,[34]  and favor a federal privacy law that includes a private right of action.[35]  Republicans favor a law that explicitly preempts state privacy laws like those in California, Nevada, and Maine.[36]  Many commentators have suggested that enacting federal privacy legislation will be difficult in 2020 given the federal elections, and expect states to be more successful in enacting privacy legislation.[37]  Indeed, comprehensive federal privacy legislation has been a topic of discussion for many years, but such legislation has not yet been enacted.
i.  House Energy and Commerce Committee Staff Bipartisan Draft Privacy Bill
One bill that is likely to see action in 2020 is a bipartisan staff draft out of the House Energy and Commerce Committee.  The House Energy and Commerce Committee draft bill is more comprehensive than the CCPA because it establishes within the FTC a specialized enforcement arm, the Bureau of Privacy, and an Office of Business Mentorship to assist with compliance.[38]  Many parts of the bill, however, are still in flux.[39]  Notably, it does not, in its current form, contain a private right of action or address state law preemption, despite advocates both proposing and opposing such measures.[40]  For consumers, the proposal would include, among other protections, the right to know what information is collected and the purpose of collection; the right to correct personal information; the right to request deletion of information; and the ability to port that information to another service provider.[41]  The draft bill also places requirements on businesses similar to those of the European Union’s GDPR: maintaining privacy policies; implementing a privacy program and establishing reasonable policies, practices, and procedures for the processing of covered data; designating a privacy protection officer; and seeking affirmative consent for the processing of covered data unless the processing is “consistent with the reasonable consumer expectations within the context of the interaction between the covered entity and the individual.”[42]  Additionally, large companies would be required to provide annual filings to the FTC, including the results of an internal risk assessment and measures taken to address those risks.
The bill also mandates express affirmative consent for all processing of sensitive information, with consent given separately for each type of personal information processed.[43]  While the draft bill is a step toward a bipartisan, comprehensive privacy law, at the time of publishing this Review the two major political parties have not reached an agreement regarding several sections of the bill, including exceptions to the consent requirement; the categorization of sensitive and de-identified data; the revenue and data-processing thresholds that would trigger heightened compliance obligations; opt-out requirements for first-party marketing; discriminatory use of data; and the size of the Bureau of Privacy, along with the issues of preemption and a private right of action.[44]
ii.  Consumer Online Privacy Rights Act and United States Consumer Data Privacy Act of 2019
The proposed Consumer Online Privacy Rights Act (“COPRA”),[45] introduced by Senator Maria Cantwell (D-WA), and the draft United States Consumer Data Privacy Act of 2019 (“CDPA”),[46] circulated by Senator Roger Wicker (R-MS), Chairman of the Senate Commerce Committee, share many of the features included in the House Energy and Commerce Committee staff bipartisan draft privacy bill, requiring companies to adopt privacy policies and risk-based data security practices and assessments, and to provide consumers the right to access, correct, delete, and port their data.[47]  The Democrat-backed COPRA contains a private right of action while the CDPA does not; conversely, the CDPA contains broad state-law preemption while COPRA generally does not.[48]

b)  Other Federal Legislation

There were several other privacy-related bills introduced in 2019 and 2020 prior to the publication of this Review, including: Online Privacy Act of 2019,[49] Designing Accounting Safeguards to Help Broaden Oversight and Regulations on Data Act,[50] Do Not Track Act,[51] Social Media Privacy Protection and Consumer Rights Act of 2019,[52] Algorithmic Accountability Act of 2019,[53] Balancing the Rights of Web Surfers Equally and Responsibly Act of 2019,[54] Privacy Bill of Rights Act,[55] Information Transparency & Personal Data Control Act,[56] the DATA Privacy Act,[57] and the Preventing Real Online Threats Endangering Children Today (“PROTECT”) Kids Act.[58]  None, as of this writing, has gained significant traction. Most of these bills substantially overlap with the comprehensive federal privacy bills discussed above—except for the following legislation:
  • The Do Not Track Act, introduced by Senator Josh Hawley (R-MO), would require the FTC to develop an online Do Not Track (“DNT”) system.[59] Opting in would prevent sites and apps from tracking a user without consent, but a user could still consent to tracking by certain apps or sites.[60]  If a user opted in to DNT, then that user would transmit a signal indicating that a company would be disallowed from targeted advertising or information sharing without prior permission.[61]  And in the event a user does not transmit such a signal, the site or app would still have to notify the user that the DNT system is available for them to opt into.[62]
  • The Algorithmic Accountability Act of 2019, introduced by Senators Cory Booker (D-NJ) and Ron Wyden (D-OR), would require companies to conduct impact assessments to explain how their algorithms work and evaluate their algorithms’ use of personal information against the following metrics: “accuracy, fairness, bias, discrimination, privacy and security.”[63] Then, the Act would allow the FTC to promulgate compliance regulations based on the algorithm(s) used.[64]
  • Two similar bipartisan bills, the Preventing Real Online Threats Endangering Children Today (“PROTECT”) Kids Act,[65] introduced in the House by Representatives Tim Walberg (R-MI) and Bobby Rush (D-IL), and a set of amendments to the 1998 Children’s Online Privacy Protection Act (“COPPA 2.0”),[66] introduced by Senators Ed Markey (D-MA) and Josh Hawley (R-MO), would update the original COPPA with additional protections.  Both bills would raise from 13 to 16 the age under which parental consent must be obtained before a company can collect a child’s personal data and location information.[67]  The PROTECT Kids Act would clarify that COPPA applies to mobile applications as well as other types of online activity, and would expand the types of personal information protected under COPPA to include geolocation and biometric information.[68]  COPPA 2.0, meanwhile, would provide parents with the ability to “erase” their children’s data from particular services.[69]

B.  Enforcement and Guidance

1.  Federal Trade Commission

a)  Priorities

In 2019, the Federal Trade Commission (“FTC”) remained one of the most active and aggressive regulators of privacy and data security.  The Commission continued to conduct policy reviews on a wide range of issues as part of its “Hearings Initiative” announced in 2018, which involved public hearings that took place through the spring of 2019.[70]  The FTC also announced plans to study the privacy practices of internet service providers and has issued orders to seven companies to obtain information about their policies and practices with regard to collecting, using, and sharing personal information of consumers.[71]  Relatedly, the FTC has also emphasized changes it has made to strengthen and improve “data security orders” issued to companies, making such orders more specific, increasing accountability for third-party assessors of compliance, and requiring that companies elevate data security concerns to their boards or similar governing bodies.[72] The Commissioners emphasized their commitment to pursuing enforcement actions against companies that engage in unfair or unreasonable privacy and data security practices with all of the tools available to the FTC.[73]  Recognizing potential limits to the FTC’s authority, however, the majority of the Commissioners have called on Congress to enact legislation that would: (1) authorize the FTC to obtain civil penalties for initial privacy and data security violations; (2) provide the FTC with narrow Administrative Procedure Act (“APA”) rulemaking authority to allow it to keep up with technological developments; and (3) give the FTC jurisdiction over nonprofits and common carriers.[74]  The Commissioners also urged Congress to enact a national privacy law that would be enforceable by the FTC.[75]  With growing public demand for additional consumer privacy protections, pressure on Congress to enhance the FTC’s authority to protect consumer privacy will likely continue.

b)  Data Security and Privacy Enforcement

Demonstrating the Commissioners’ commitment to their cited priorities, the FTC continued to pursue enforcement actions related to privacy and data security in 2019, a number of which included significant monetary remedies and new prescriptive standards for information security and privacy programs in the technology industry.

Political Consulting Firm, Former CEO, and App Developer.  In December 2019, the FTC entered into a settlement to resolve allegations that the former CEO of Cambridge Analytica and a developer of apps for the firm used deceptive tactics to collect personal information from social media users that it then used to target and profile voters.[76]  Under the settlement agreement, the former CEO and app developer are prohibited from making false or deceptive statements about the extent to which they collect, use, share, or sell personal information and the purposes for which such data is acquired and distributed.[77]  The former CEO and app developer are also required to destroy any personal information collected from consumers via the app that was used in violation of the FTC Act and any work product that originated from that data.[78]  Notably, in its home country the firm has also been subject to discipline by the United Kingdom’s Information Commissioner’s Office for its data collection and utilization practices.[79]  The FTC also issued an opinion that found that the firm, which filed for bankruptcy last year, engaged in similar deceptive tactics in violation of the FTC Act and misrepresented its participation in the EU-U.S. Privacy Shield framework.[80]  The final order prohibits the firm from misrepresenting the extent to which it protects personal information and its participation in the EU-U.S. Privacy Shield framework or other regulatory organizations.[81]  The order also requires the firm to continue to apply Privacy Shield protections to personal information it collected while participating in the Privacy Shield program or to return and delete the information.[82]

Email Management Company.  The FTC announced a final settlement with an email management company in December 2019, resolving allegations that the company deceived consumers about how it accessed and used their email.[83]  Specifically, the FTC alleged that despite telling consumers that it would not “touch” their personal emails while helping users consolidate emails or unsubscribe from unwanted communications, the company shared users’ email receipts with its parent company, which in turn used the personal contact and purchasing information in the market research analytics products it sells.[84]  Under the settlement agreement, the company is prohibited from misrepresenting the extent to which it collects, uses, stores, and shares consumer data.[85]  Additionally, the company and its parent company must delete email receipts previously collected unless they obtain express consent to maintain the receipts.[86]

Operation Services Company.  In November 2019, the FTC entered into a settlement with a Utah-based technology company that provides back-end operation services to multilevel marketers over allegations that the company failed to enact reasonable security safeguards and, as a result, allowed a hacker to access personal information of approximately one million consumers over a two-year period.[87]  Specifically, the FTC alleged that the company failed to delete personal information it no longer needed, neglected to implement cybersecurity safeguards to detect unusual activity on its network, and failed to adequately segment and test its network and conduct code review of its software.[88]  Additionally, the FTC alleged that the company stored personal consumer information, including Social Security numbers, payment card information, and passwords, in clear, readable text on its network.[89]  The proposed settlement prohibits the company from collecting, selling, sharing, or storing personal information unless it implements an adequate information security program that includes cybersecurity risk assessment, safeguards to protect personal information, and testing and monitoring of safeguards.[90]  The settlement also requires a third-party assessment of the company’s information security program every two years for the next 20 years.[91]

App Developer.  In October 2019, the FTC pursued its first case against the developer of a “stalking” app (an app that can allow purchasers to monitor a mobile device’s activity without the knowledge or consent of the device’s users).  The FTC alleged such apps compromised the privacy and security of the mobile devices on which they were installed.[92]  The developer allegedly failed to adequately secure the information collected from the mobile devices, which resulted in a hacker accessing usernames, passwords, text messages, GPS locations, photos, and other data.[93]  The FTC alleged that the company and its owner violated the FTC Act and COPPA, which requires operators to secure information collected from children under the age of 13.[94]  The settlement agreement requires the app developer and its owner to delete data collected from the apps and prohibits them from promoting, selling, or distributing any monitoring app that requires users to bypass a mobile device’s security protections absent assurances that the app is being used for legitimate purposes.[95]  It also requires the app developer and owner to implement and maintain a comprehensive security program and obtain third-party assessments of the program every two years for the next 20 years.[96]  Under the settlement, the app developer and owner are also prohibited from violating COPPA and from misrepresenting the extent to which they protect the personal information they collect.[97]

Auto Dealer Software Company.  Establishing a prescriptive standard for what constitutes reasonable security under the FTC Act, in September 2019 the FTC approved a final order settling charges against an Iowa-based auto dealer software provider that allegedly failed to take basic, low-cost measures to secure consumer data.[98]  The FTC alleged that the security failures resulted in a data breach that exposed personal information of over 12 million consumers stored by 130 of the company’s auto dealer clients.[99]  Under the final order, the software company is prohibited from sharing, collecting, or maintaining personal information unless it implements and maintains a comprehensive information security program designed to protect consumers’ personal information.[100]  The order also requires the company to obtain third-party assessments of its information security program every two years for 20 years, and requires a senior corporate manager responsible for overseeing the information security program to certify the company’s compliance with the order on an annual basis.[101]  Such a standard can be instructive for interpreting other privacy laws that do not define “reasonable security,” including the CCPA (discussed further above).

Internet Search Engine and Video Sharing Platform.  A web search engine and its subsidiary video sharing platform agreed to a settlement with the FTC and the New York Attorney General in September 2019 to resolve allegations that the video sharing platform collected personal information from children without parental consent, in violation of COPPA.[102]  The video sharing service allegedly knew that a number of its channels were directed at children but did not comply with COPPA’s requirements to obtain parental consent prior to collecting personal information about children.[103]  As part of the settlement, the companies agreed to pay $34 million to New York and $136 million to the FTC, the largest monetary penalty the FTC has ever obtained in a COPPA case.[104]  The proposed settlement also requires the companies to develop, implement, and maintain a system on the video sharing platform that allows channel owners to designate child-directed content so the companies can ensure compliance with COPPA.[105]  Additionally, the settlement requires the companies to notify channel owners that child-directed content may be subject to COPPA and to provide COPPA training to employees who interact with channel owners.[106]  Finally, the settlement requires the companies to provide notice about their data collection practices and obtain parental consent prior to collecting personal information from children under the age of 13, and prohibits future violations of COPPA.[107]

Social Media Company.  In July 2019, the FTC and DOJ filed a proposed consent order to resolve allegations that a social media company violated an earlier consent order with the FTC, entered in 2012, by misrepresenting to consumers the extent of data sharing with third-party applications and the control consumers had over such sharing, and by failing to maintain a reasonable privacy program.[108]  The FTC also alleged that the social media company engaged in deceptive practices related to the collection and use of consumer phone numbers to enable security features.[109]  As part of the settlement, the company agreed to pay a $5 billion civil penalty, without admitting or denying the FTC’s allegations except as specifically stated in the proposed order.[110]  In addition to the monetary penalty, the settlement order expands on the privacy program requirements embodied in the 2012 order and enhances oversight and accountability of the company’s data privacy practices.[111]  In addition to requiring the company to implement early detection measures, the order also requires reporting of covered incidents to the FTC and regular status updates to the FTC regarding such incidents until their resolution.[112]  The order further requires the company’s chief executive to periodically certify that the company is in compliance with its obligations under the order.[113]

Consumer Credit Reporting Agency.  In July 2019, a consumer credit reporting agency agreed to pay at least $575 million, and up to $700 million total, as part of a global settlement with consumers, the FTC, the Consumer Financial Protection Bureau, and attorneys general representing 50 U.S. states and territories based on allegations that the credit reporting agency’s failure to implement basic measures to secure personal information on its network resulted in a data breach in 2017 that impacted 147 million people.[114]  To address identity theft risks caused by the data breach, a portion of the settlement announced in July was to be dedicated to a fund that will provide affected consumers with credit monitoring services, a remedy discussed further below.[115]  In addition to providing such monetary relief to consumers, the settlement also requires the credit reporting agency to implement a comprehensive data security program.[116]  Under the settlement, the credit reporting agency must obtain third-party assessments of its information security program every two years for the next 20 years and must provide an annual update to the FTC regarding the status of the consumer claims process.[117]

Smart Home Products Manufacturer.  The FTC entered into a settlement with a manufacturer of smart home products in July 2019 over allegations that the company misrepresented the measures it took to secure its wireless routers and internet-connected cameras, leaving sensitive consumer information, including live video and audio feeds, exposed to third parties.[118]  The manufacturer allegedly told consumers that its products offered “advanced network security,” but failed to perform basic testing and remediation to address well-known security flaws and stored mobile app login credentials in clear, readable text on a user’s mobile device.[119]  Under the proposed settlement, the manufacturer is required to implement a comprehensive security program that includes specific planning, testing, and monitoring standards.[120]  The settlement also requires the manufacturer to obtain biennial, third-party assessments of its software security program for ten years.[121]

Video Social Networking App.  In February 2019, the operators of a video social networking app agreed to pay $5.7 million to settle FTC allegations that the company violated COPPA by collecting personal information from children without obtaining parental consent.[122]  Profile information of users, including children, was public on the app and could be seen by other users,[123] and the FTC alleged that the company was aware that a significant portion of its users were under the age of 13 and had received thousands of complaints from parents of young children.[124]  In addition to the monetary payment, the settlement requires the app’s operators to take offline all videos made by children under the age of 13.[125]

Privacy Shield Enforcement.  As discussed above, the FTC also brought actions against a number of companies regarding false claims of certification under the EU-U.S. Privacy Shield and Swiss-U.S. Privacy Shield frameworks, which allow companies to transfer personal data lawfully from the European Union and Switzerland, respectively, to the United States.[126]  Each company held itself out as being certified and compliant with the Privacy Shield(s), despite failing to complete the certification process or allowing its certification to lapse.[127]  The FTC also sent warning letters to a number of other companies that falsely represented participation in these Privacy Shield frameworks, calling for them to remove statements regarding their participation from their websites and other company documents within 30 days.[128]  The FTC has emphasized that enforcement of the Privacy Shield frameworks is a “high priority,”[129] and Gibson Dunn will continue to monitor developments in this area.

c)  Circuit Split Over FTC Monetary Relief Authority

The FTC has long viewed its authority to recover monetary relief under Section 13(b) of the FTC Act as well settled, despite the provision’s lack of any express reference to monetary remedies; it refers only to “injunctions.”[130]  The United States Supreme Court has not addressed whether Section 13(b) authorizes monetary relief, but prior to this year, the nine federal courts of appeals that had addressed the issue had construed Section 13(b) to allow the FTC to obtain monetary relief, including restitution, rescission, and disgorgement.[131]  However, in August 2019, the Court of Appeals for the Seventh Circuit issued a decision in FTC v. Credit Bureau Center, LLC, expressly overturning its own precedent and breaking with eight other circuits by holding that Section 13(b) does not authorize the FTC to seek monetary awards.[132]

The implications of Credit Bureau are potentially far-reaching.  Other circuit courts may decide to reconsider their own opinions on this issue, many of which rely on a now-overturned Seventh Circuit decision.  Additionally, in December, the FTC filed a petition for a writ of certiorari asking the Supreme Court to review the decision,[133] and the likelihood of the Supreme Court granting certiorari is heightened because the prior Seventh Circuit decision that Credit Bureau overruled was relied upon by many other circuits in decisions upholding the FTC’s authority to obtain monetary relief under Section 13(b).  If the Supreme Court affirms the decision, the FTC’s ability to obtain monetary relief under Section 13(b) will be eliminated or significantly restricted.  In that case, the Commission, absent new statutory authority, would be limited to pursuing monetary remedies through other existing means, including the process set forth in Section 19 of the FTC Act, which requires, as a condition to such relief, that the agency invoke a previously promulgated rule or prevail in a prior administrative proceeding.
Unsurprisingly, while the Supreme Court decides whether to grant certiorari, the Commissioners continue to urge Congress to pass legislation that will grant the FTC authority to obtain monetary relief for initial privacy and data security violations.[134]  Congress’s decision to pursue the legislation requested by the Commissioners may be influenced by the ultimate resolution of Credit Bureau.

2.  Department of Health and Human Services and HIPAA

The Department of Health and Human Services (“HHS”) continued its efforts to enforce patient privacy protections in 2019, both through investigations and through civil penalties for violations of the Health Insurance Portability and Accountability Act (“HIPAA”) regulations.  HHS also continued to consider major overhauls to the HIPAA regulations.  HHS was not the only entity to enforce healthcare privacy violations in the last year, as 2019 saw the resolution of the first multistate data breach lawsuit brought by the Attorneys General of several states alleging violations of HIPAA.  These developments are addressed below.

a)  HHS OCR Enforcement

In February 2019, HHS’s Office for Civil Rights (“OCR”), the office that enforces HIPAA privacy, security, and breach notification rules, reported it had amassed a record $28.6 million in civil penalties from HIPAA violators in 2018.[135]  In April 2019, OCR announced that it would reduce the penalties it seeks for lower-level HIPAA violations in the future,[136] and some observers have suggested that the total for 2019 was only around $12 million.[137]  Nonetheless, there were several notable HIPAA-related settlements, judgments, and proceedings during 2019:

Medical Imaging Services Company.  In May 2019, OCR announced a $3 million settlement with a medical imaging services company based on violations of HIPAA data privacy rules.[138]  OCR found the imaging company had posted the PHI of more than 300,000 patients on an unsecured server, permitting search engines to index this PHI and make it publicly available.[139]

Hospital System.  In October 2019, OCR reached a settlement imposing a civil penalty of more than $2.1 million on a hospital system after two hospital employees stole the PHI of more than 24,000 patients.  An OCR investigation found the hospital system’s compliance regime had failed to regularly review system access records, did not restrict employee authorization to appropriate levels, and did not timely report the breach to HHS.

State Government Health Agency.  OCR announced in November 2019 that it would impose a $1.6 million civil penalty against a state agency that provides assisted living centers, drug and substance use services, and supplemental nutrition benefit programs.  OCR found that a data breach led to the posting of roughly 6,500 patients’ PHI on a publicly viewable internet site.[140]  OCR also found that, because the agency did not deploy adequate activity audit controls, it was unable to determine how many unauthorized persons may have accessed the data at issue.

University Medical Center.  Also in November 2019, OCR announced a settlement in which a university medical center agreed to pay penalties of $3 million and to take corrective action after PHI was impermissibly disclosed through the loss of two unencrypted mobile devices: a flash drive and a laptop.[141]  OCR specifically noted that it had investigated the medical center for a very similar violation in 2010, and that the medical center continued to permit the use of unencrypted mobile devices even after this investigation.[142]

HIPAA Right of Access Initiative and Settlements.  In spring 2019, OCR announced a new “HIPAA Right of Access Initiative” to enforce compliance with HIPAA requirements that guarantee patients’ right to prompt and economical access to their health records.[143]  Late in the year, OCR announced the first- and second-ever enforcement actions and settlements under this initiative.  The first, announced in September 2019, implicated a hospital operator that failed to timely provide a patient with access to her fetal heart monitor data.[144]  The second, announced in December 2019, implicated a primary care provider that failed to timely provide a patient’s electronic medical records to a third party.[145]  In each case, the provider agreed to pay OCR $85,000 and to adopt a corrective action plan.[146]

Cancer Center Challenges OCR Authority.  Finally, 2019 also saw litigation that might ultimately reduce OCR’s regulatory capability going forward.  In a 2018 ruling, OCR won a $4.3 million civil penalty against a hospital-based cancer center for violations of HIPAA.  There, an administrative law judge for HHS found on summary judgment that the cancer center had violated HIPAA following the theft or loss of a laptop and two USB thumb drives containing unencrypted ePHI in 2012 and 2013, and assessed the penalty at issue.[147]  In April 2019, however, the cancer center appealed this decision to a federal district court in Texas, requesting that the penalty be reduced or overturned.  The cancer center’s petition argues that the $4.3 million penalty was unconstitutionally excessive, and that OCR lacked statutory authority to impose it.[148]  Gibson Dunn will continue to monitor developments on this matter.

b)  Request for Public Comments on Reforming HIPAA

In addition to bringing enforcement actions, HHS also concluded a far-ranging review of HIPAA regulations, which sought to “remove regulatory obstacles and decrease regulatory burdens in order to facilitate efficient care coordination and/or case management and to promote the transformation to value-based healthcare, while preserving the privacy and security of PHI.”[149]  The request for public comments closed in February 2019 after receiving over 1,300 submissions,[150] with commenters ranging from state health agencies[151] and disability health advocates[152] to professional associations representing healthcare providers.[153]  HHS has not yet announced further action on the proposed rulemaking, and Gibson Dunn will continue to monitor developments.

c)  State Attorneys General Settle Multistate Action Premised on HIPAA

In a multistate data breach lawsuit alleging violations of HIPAA, a bipartisan group of 16 state Attorneys General, led by Indiana Attorney General Curtis T. Hill Jr., settled a lawsuit in Indiana federal court against a healthcare information technology company and its subsidiary related to a breach discovered in 2015 that compromised personal data of 3.9 million people.[154]  The initial lawsuit, filed in December 2018, had alleged that the company failed to protect ePHI in the hands of its business associate after a breach related to a third-party web application.[155]  Under the terms of the judgment and consent decree, the company agreed to pay a $900,000 settlement and to deploy more rigorous data security protections in the future.[156]

3.  Securities and Exchange Commission

The Securities and Exchange Commission (“SEC”) continued to devote increased attention to cybersecurity and data-protection issues in 2019, as evidenced by its updated privacy and cybersecurity guidance for private firms.  One area of focus for the Commission has been cryptocurrency and initial coin offerings.  While the SEC has continued to bring enforcement actions related to cryptocurrency, it has also suggested that it may refrain from taking action against virtual currency companies provided that certain conditions are met.

a)  Data Privacy Guidance and Examination Priorities

In April 2019, the SEC issued guidance addressing the privacy notices and safeguard policies with which SEC-registered investment advisers and broker-dealers must comply.[157]  This guidance noted that the SEC’s Office of Compliance Inspection and Examination (“OCIE”) had identified common deficiencies, such as failure to provide customers with sufficient data privacy notices or to inform them of their right to opt out of certain disclosures.[158]  The guidance also noted that common areas of deficiency include use of personal devices to store customer information, use of unsecured networks, and failures to ensure that outside vendors adhere to confidentiality standards.[159] Separately, OCIE released its 2020 Examination Priorities for registered firms in early January 2020.[160]  The Priorities make clear that registrants’ use of non-traditional sources of data from inputs like mobile device geolocations, consumer credit card records, and other internet-based information, sometimes known as “alternative data,” will be a focus of examination review.[161] The Priorities also make clear that OCIE will prioritize cyber and other information security risks throughout its examinations.[162]

b)  Cybersecurity and Data Breaches

Attempted Hacking of EDGAR Database.  In early 2019, the SEC brought charges against a Ukrainian-led group of nine defendants for attempting to hack the SEC’s EDGAR[163] data system, the primary system through which companies submit filings required by law to the SEC.  The defendants had hacked into the database to extract nonpublic information to use for illegal trading,[164] reaping an alleged $4.1 million in profits from the scheme.[165] Data Misuse Risk Disclosure.  The SEC also brought charges against a social network company alleging it had made misleading disclosures regarding the risk that the company might misuse consumer data.  Specifically, the SEC alleged that the company failed to disclose that customer data had been misused for several years after the company became aware of the misuse.  The SEC and the company agreed to settle the matter for a civil penalty of $100 million without the company admitting or denying the allegations.[166] Enforcing Regulation Systems Compliance and Integrity.  In September 2019, the SEC brought an enforcement action against a securities clearing agency for violation of the Regulation Systems Compliance and Integrity (“Reg SCI”) rules, including failing to establish and enforce procedures around financial risk management and information system security.  The clearing agency ultimately settled by agreeing to pay $20 million in penalties and to comply with extensive remedial measures.[167]  The SEC noted that this action was particularly important in light of the risks that the clearing agency’s practices posed to “the broader financial system.”[168]

c)  Cryptocurrency

Unregistered and/or Fraudulent Initial Coin Offerings.  In 2019, the SEC focused substantial enforcement resources on combatting unregistered or fraudulent Initial Coin Offerings (“ICOs”) to the public.  In February, the SEC halted the unregistered sale of over $12.5 million in digital assets as part of an unregistered ICO.  The SEC required the issuer to return funds to all investors who purchased the tokens and to register the tokens pursuant to the Securities Exchange Act of 1934.  It did not, however, impose any monetary penalties, citing the issuer’s cooperation and interest in taking prompt remedial steps.[169]  In October, the SEC filed an emergency action and obtained a temporary restraining order against several offshore entities suspected of conducting an unregistered ICO that raised more than $1.7 billion of investor funds.[170]  Finally, in December 2019, the SEC filed a complaint alleging a digital-asset entrepreneur had conducted a fraudulent ICO raising more than $42 million.[171] First “No Action” Letter for Cryptocurrency.  While continuing to target cryptocurrency operators who run afoul of federal regulations, the SEC also published its first ever “no action” letter for the use of a virtual token currency.[172] Specifically, the SEC stated that a business-travel startup’s sale of cryptocurrency travel tokens to the public would not trigger enforcement action, provided the token’s price stays fixed at one U.S. dollar each, that they are used only for air charter services, and that the startup will not represent the tokens as having potential profit value.[173]

4.  Other Federal Agencies

In addition to the FTC, HHS and SEC, other federal government entities continue to make headlines in the data security and privacy space.  This past year, there were notable developments at the Federal Communications Commission (“FCC”), the Consumer Financial Protection Bureau (“CFPB”), the Department of Defense (“DoD”), and other federal agencies.

a)  Federal Communications Commission

i.  Illegal Robocall Mitigation
Mitigating and preventing illegal robocalls remained a core focus for the FCC in 2019.  In June, the FCC issued rules clarifying that voice service providers could offer tools that blocked calls reasonably suspected to be illegal spam robocalls.[174]  And in August, the FCC issued an order banning caller ID “spoofing” of phone numbers on text messages and on incoming international calls.[175] Alongside these measures, the FCC continues to encourage telecommunications companies to roll out the STIR/SHAKEN[176] framework of call authentication for consumer use.[177]  STIR/SHAKEN provides legitimate calls with digital authentication tokens, making it easier for carriers to identify and filter out spam robocalls.  Several carriers have already adopted STIR/SHAKEN-based tools for users.[178]  And under the newly passed federal TRACED Act,[179] the FCC has increased authority to require other carriers to deploy such authentication.[180]
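The token-based authentication at the heart of STIR/SHAKEN can be sketched roughly as follows.  Under the STIR standards, the originating carrier signs a “PASSporT” token (a JWT-like structure) attesting to the calling number and an attestation level, and the terminating carrier verifies the signature before completing the call.  The sketch below is a simplification: real deployments sign with ES256 and a carrier certificate referenced by an `x5u` URL, whereas this illustration substitutes an HMAC shared key so it runs with only the Python standard library; the phone numbers, key, and `origid` value are all hypothetical.

```python
import base64
import hashlib
import hmac
import json
import time

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWT-style tokens do."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_passport(orig: str, dest: str, attest: str, key: bytes) -> str:
    """Build a PASSporT-style token vouching for the calling number.

    NOTE: real STIR/SHAKEN uses ES256 with carrier certificates;
    HMAC here is an illustrative stand-in only.
    """
    header = {"alg": "HS256", "typ": "passport", "ppt": "shaken"}
    claims = {
        "attest": attest,            # "A", "B", or "C" attestation level
        "dest": {"tn": [dest]},      # called number
        "iat": int(time.time()),     # issued-at, checked for freshness
        "orig": {"tn": orig},        # calling number being vouched for
        "origid": "demo-origid",     # opaque origination identifier (hypothetical)
    }
    signing_input = (
        b64url(json.dumps(header).encode()) + "." + b64url(json.dumps(claims).encode())
    )
    sig = hmac.new(key, signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)

def verify_passport(token: str, key: bytes) -> bool:
    """Terminating carrier recomputes the signature to validate caller ID."""
    signing_input, _, sig = token.rpartition(".")
    expected = hmac.new(key, signing_input.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(b64url(expected), sig)

key = b"shared-demo-key"
token = sign_passport("+12025550100", "+12025550123", "A", key)
assert verify_passport(token, key)           # legitimate, signed call passes
assert not verify_passport(token, b"other")  # forged/spoofed token fails
```

Because a spoofed caller ID cannot carry a valid signature from the originating carrier, downstream carriers can flag or filter such calls, which is the property the FCC's framework relies on.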
ii.  National Security Purchasing Order and Proposed Rulemaking
In November, in response to purported concerns that Chinese telecommunications firms might be using technological assets to spy on the United States,[181] the FCC took two interrelated steps to bar recipients of FCC Universal Service Funds (“USF”) from purchasing from foreign companies deemed to pose national cybersecurity threats.  First, the FCC adopted an Order barring companies from spending any USF funds on such purchases.[182]  At least one Chinese company alleged to present such a threat has sued the FCC to challenge this policy, and its petition is currently pending in the U.S. Court of Appeals for the Fifth Circuit.[183] At the same time, the FCC issued a Further Notice of Proposed Rulemaking (“FNPR”) seeking comment on rules that would condition the receipt of any USF funds on a company’s certification that it does not use or purchase any such services or equipment.[184]  The FNPR comment period closes on February 3, 2020, while the window for reply comments closes March 3, 2020.

b)  Consumer Financial Protection Bureau

The CFPB has continued to operate under uncertainty regarding its continued existence and, by extension, its role in consumer data protection.  Late in 2018, Kathy Kraninger was confirmed by the Senate as the new CFPB director.[185]  Initially, she asserted the agency would engage in vigorous enforcement action and make consumer data security an important priority.[186]  But in September 2019, Kraninger filed a Supreme Court brief stating that she now believed the for-cause removal protection insulating the CFPB’s director rendered the agency’s structure unconstitutional.[187]  The Court is set to decide that question in 2020.[188] Despite this uncertainty, in July, the CFPB, in conjunction with the FTC and various state regulators, announced a settlement with a national provider of consumer credit information over a data breach that impacted approximately 147 million consumers.[189]  Under this agreement, the provider would pay up to $700 million, including up to $425 million in monetary relief to consumers.[190]

c)  Department of Defense

The DoD made new efforts to address and defeat cybersecurity threats in 2019, particularly with respect to the national security supply chain.  To this end, the DoD’s Guidebook for Contractor Purchasing[191] highlighted that safeguarding DoD-covered defense information would be critical to supply chain management[192] and proposed various measures to check for vendor compliance with the Department’s cybersecurity standards.[193] Perhaps the most significant procurement-related developments came in the rollout of the DoD’s Cybersecurity Maturity Model Certification (“CMMC”) program for vendors on the DoD’s supply chain.  The CMMC will set out a proposed five-level hierarchy of “cyber hygiene” standards that suppliers of DoD equipment must meet to contract with the Department, with each ascending level corresponding to a higher level of required protection depending on the sensitivity of the product involved.[194] The CMMC’s goal is to review and combine cybersecurity standards and best practices from across the information technology industry, to certify independent third-party organizations to conduct audits and inform the development of the standards, and to build upon existing vendor regulations by adding a verification component.[195]  Throughout 2019, the DoD released draft versions of the CMMC for comment and review, with the most recent released in December.[196] As the CMMC program takes effect, vendors may face challenges implementing it and meeting the new standards as they upgrade their protective measures.[197] DoD itself may also have some work to do: in 2019, various audits revealed areas of potential vulnerability that the DoD must work to address.
In July, the DoD’s Inspector General issued reports warning the Department had taken insufficient steps to verify the cybersecurity risk posed by off-the-rack technology systems purchased by DoD personnel,[198] and that DoD contractors failed to take cybersecurity precautions such as requiring multifactor authentication and systematically identifying network vulnerabilities.[199] As it increases its focus on cybersecurity, the DoD will be guided by this year’s iteration of the National Defense Authorization Act,[200] which establishes a Principal Cyber Advisor for each of the military services, directs the Department to produce an annual report on military cyberspace operations, and endorses the CMMC program.[201]

d)  Other Agencies

Apart from these examples, other federal agencies also made news in the data and cybersecurity space throughout 2019.  In June, the DOJ announced an antitrust investigation into some of the nation’s largest technology companies,[202] with these companies’ practices of amassing substantial amounts of consumer data flagged as a potential antitrust concern.[203] Gibson Dunn will continue to monitor developments as this effort proceeds. In September, the Commodity Futures Trading Commission imposed a $1.5 million fine on a commissions merchant for allowing an email phishing attack to steal $1 million in customer funds via the company’s computer systems.[204]  In December, the Department of Commerce initiated a notice of proposed rulemaking on regulations to block transactions that might endanger the nation’s information and communications technology supply chain.[205]  2019 also saw the Department of Energy continue efforts to improve the cybersecurity of America’s critical infrastructure systems,[206] albeit with warnings from watchdogs like the Government Accountability Office that key vulnerabilities remained.[207]  And the Department of Homeland Security (“DHS”) itself came under scrutiny after a data breach at the Federal Emergency Management Agency (“FEMA”) exposed the sensitive data of over 2.3 million disaster survivors.[208] Notably, the National Institute of Standards and Technology (“NIST”) also released two updated standards for other federal agencies to use in procurement when contracting with vendors.  The first, NIST SP 800-171, Revision 2,[209] addresses contractual protections vendors should have when protecting Controlled Unclassified Information (“CUI”).
This draft made comparatively minor changes from previous versions, but emphasized that its next revision, Revision 3, will likely provide a comprehensive update.[210]  NIST also released a draft of NIST SP 800-171B,[211] a heightened set of contracting standards intended for vendors engaged in “Critical Programs and High Value Assets,” and specifically focused on “(1) penetration resistant architecture; (2) damage-limiting operations; and (3) designing for cyber resiliency and survivability.”[212]  And in January of 2020, NIST also released Privacy Framework Version 1.0, aimed at providing voluntary strategies and tools for organizations that want to “improve their approach to using and protecting personal data.”[213] As data and privacy concerns become more salient, the depth and degree of federal agency involvement will surely continue to grow.

5.  State Attorneys General and Other State Agencies

State-level regulators also continued to play a key role in data privacy and security matters in 2019, collaborating to bring enforcement actions yielding recoveries in the hundreds of millions of dollars and actively protecting consumers from the danger of data breaches.

a)  State Attorneys General

As noted above, in July 2019, Attorneys General from 48 states, Puerto Rico, and the District of Columbia, along with the FTC and CFPB, resolved a long-running dispute with a major credit reporting agency.  This action stemmed from a 2017 data breach in which unauthorized persons gained access to portions of the reporting agency’s network, affecting more than 147 million consumers.  Under the settlement, the reporting agency is required to implement various consumer protection safeguards and controls and to offer no-cost credit monitoring to consumers.  In addition to the other remedies described above, the agency had to pay the Attorneys General $175 million for purposes including consumer education and litigation costs.[214] On July 31, a manufacturer of security camera software agreed to pay $8.6 million to settle multistate litigation alleging that the company violated the False Claims Act (“FCA”) and state whistleblower acts because it knowingly failed to report or remedy flaws in the security surveillance systems it sold to the federal government and to multiple state governments.  These flaws made the system vulnerable to hackers.  The settlement provided refunds to the federal government and 16 states that had purchased the allegedly defective software.  This was the first cybersecurity-related settlement under the FCA or comparable state statutes.[215] In October 2019, the Attorneys General of 47 states and territories announced a multistate antitrust investigation into a social networking platform.  This investigation is being led by the New York Attorney General and will focus on whether the platform has stifled competition and put consumers’ data at risk.  Many of the Attorneys General who have joined this investigation have issued statements emphasizing the need to combat anticompetitive business practices and protect consumer data.[216] Individual states also took action apart from litigation.
In October 2019, New Jersey’s Attorney General announced a new “Cyber Savvy Youth” initiative.  This initiative will educate and test the cybersecurity knowledge of students from kindergarten through high school.  At the same time, the state’s Division of Consumer Affairs announced the 2018 statistics regarding data breaches affecting New Jersey residents: 906 data breaches were reported to the New Jersey State Police last year, a roughly 5 percent decrease from the 958 breaches reported in 2017. In addition, civil settlements reached by the Attorney General’s Office following data breach incidents had resulted in more than $6.4 million in recoveries for the state on a year-to-date basis.[217]

b)  New York Department of Financial Services

Apart from Attorneys General, other state regulators continued to engage in the data privacy space.  In May 2019, for example, New York’s Department of Financial Services (“DFS”) announced the creation of a new Cybersecurity Division.  The Division will focus on protecting consumers and industries from cyber threats by conducting cyber-related investigations, issuing regulatory guidance, offering counsel, and enforcing DFS’s cybersecurity regulations.[218]

II.  Civil Litigation

A.  Data Breach Litigation

1.  High-Profile Incidents and Related Litigation in 2019

Just nine months into the year, the number and sheer scale of cyberattacks occurring in 2019 had already surpassed those of prior years, earning 2019 the label of “the worst year on record” for data security breaches.[219]  Not surprisingly, several high-profile attacks in 2019, including the following, culminated in consumer class action and shareholder litigation. Clinical Laboratories.  On June 3, 2019, a medical diagnostics provider announced that its medical billing contractor suffered a data breach between August 1, 2018 and March 30, 2019, in which hackers accessed the personal data of nearly 12 million of the laboratory’s customers.[220]  Another leading clinical laboratory that contracted with the same billing contractor was also impacted by the breach, which affected up to 7.7 million of its patients.[221]  Class action lawsuits were subsequently filed in federal and state courts, including in California and New Jersey.[222]  On June 18, 2019, the billing contractor filed for bankruptcy, citing the fallout from the breach.[223] Convenience Store Chain.  On December 19, 2019, a convenience store chain announced that it had discovered malware capable of exposing credit card numbers, expiration dates, and cardholder names at all of the chain’s more than 850 stores.[224]  In the weeks following the announcement, nearly a dozen proposed class action lawsuits were filed in the Eastern District of Pennsylvania.[225]

2.  Updates in High-Profile Data Breach Cases from Prior Years

a)  Key Settlements

Consumer Credit Reporting Agency.  As outlined above, in July 2019 a consumer credit reporting agency agreed to pay at least $575 million, and up to $700 million, as part of a global settlement with consumers, the FTC, the Consumer Financial Protection Bureau, and 50 U.S. states and territories based on allegations that the reporting agency’s failure to implement basic measures to secure personal information on its network resulted in a data breach in 2017 that impacted 147 million people.  On December 19, 2019, a federal district judge in Georgia granted final approval to that portion of the global settlement defining monetary relief for consumers impacted by the breach.  Under the approved settlement, the reporting agency will pay up to $425 million in restitution to consumers, $77.5 million in attorney’s fees to class counsel, and up to $3 million in class counsel litigation expenses.[226]  The company also agreed to spend $1 billion to improve its own cybersecurity, pay in full all valid consumer claims for out-of-pocket expenses, and cover credit monitoring services for affected consumers.[227]  The federal judge approving the consumer settlement concluded that the deal, which encompasses more than $7 billion in aggregate benefits to consumers, represents “the largest and most comprehensive recovery in a data breach case in U.S. history by several orders of magnitude.”[228] Internet Service Company.
On July 20, 2019, a federal district judge in the Northern District of California preliminarily approved a $117.5 million settlement to resolve litigation arising out of a trio of data breaches of an internet service provider’s user account data between 2012 and 2016.[229]  The deal covers an estimated 194 million class members.[230]  The preliminary approval came after the judge had rejected prior versions of the settlement, citing a lack of sufficient specificity as to the class size, monetary and non-monetary relief, and details of the nature of the data breaches.[231] Earlier in the year, in January 2019, a California Superior Court judge approved a $29 million deal to resolve three shareholder derivative lawsuits against the company’s former officers and directors in California and Delaware, which arose out of the same series of data breaches.[232]

b)  Litigation

Social Media Company.  Following reports that Cambridge Analytica obtained information on a social media company’s users, the social media company faced several shareholder derivative lawsuits and consumer class actions, the latter of which were ultimately consolidated in the Northern District of California.  On September 9, 2019, the federal district judge presiding over the consumer class actions permitted certain of the plaintiffs’ claims to proceed, while granting the social media company’s motion to dismiss other claims.[233]  The court held, with respect to the surviving claims, that plaintiffs maintained a privacy interest in information they disclosed to a limited audience and that they had alleged an injury sufficient to confer standing based on that privacy interest alone, even in the absence of a secondary economic injury such as identity theft.[234]  On October 31, 2019, the court issued a single-sentence order denying the social media company’s motion to certify the court’s Article III standing analysis for interlocutory review.[235]  A hearing on class certification is scheduled for late 2021.[236] Sports Apparel Company.  Last year we also reported on class action litigation filed against a fitness apparel company following its announcement that hackers obtained access to the data of 150 million users of its fitness-tracking app.[237]  On February 11, 2019, a federal district judge in the Central District of California granted the company’s motion to compel arbitration, holding that by clicking “accept” in response to the app’s terms and conditions, which incorporated the American Arbitration Association Rules, the plaintiff had “clearly and unmistakably delegated the arbitrability issue to the arbitrator.”[238]

3.  The Deepening Circuit Split on Standing Post-Spokeo

In 2019, the divide among circuit courts over the requirements for Article III standing in data breach cases continued to deepen in the wake of the Supreme Court’s 2016 ruling in Spokeo, Inc. v. Robins.[239] In Spokeo, the Supreme Court held that a statutory violation alone cannot establish injury-in-fact standing; a plaintiff must allege a “concrete” injury stemming from the violation.[240]  Following that decision, lower courts have diverged over what facts a plaintiff must allege to establish a “concrete” injury sufficient to confer Article III standing in data breach cases.  While some courts of appeals, including the Ninth and D.C. Circuits, have held that the theft of consumers’ private information in and of itself establishes a “substantial risk” of future harm sufficient to confer standing,[241] other courts, including the Fourth and Eighth Circuits, have held that such allegations are too speculative.[242] In June 2019, a divided panel of the D.C. Circuit reaffirmed the split, holding that government employees “cleared the low bar to establish standing” by alleging that they faced an increased risk of identity theft following a 2015 hack of the Office of Personnel Management (“OPM”).[243]  The majority’s decision expanded on the court’s prior holding in Attias v. CareFirst, which had pointed to the circumstances of the breach at issue to conclude that the hackers had “the intent and the ability to use” the stolen data “for ill.”[244]  Here, the majority reasoned, the sensitivity of the stolen data and the fact that some class members had already suffered identity theft or fraud rendered the question of the hackers’ intent “markedly less important.”[245]  The majority further rejected the dissent’s conclusion that the passage of two years between the cyberattacks and the filing of the complaint “was enough to render the threat of future harm insubstantial.”[246] Thus far, the Supreme Court has not signaled an interest in resolving the divide.
As we reported last year, the Supreme Court denied a petition to review the D.C. Circuit’s Attias decision.[247]  In March 2019, the Supreme Court again passed on the opportunity, declining to review the Ninth Circuit’s decision in In re Zappos.com, Inc., which held that plaintiffs had established standing based on the allegation that the information exposed in a data breach could be used to cause future harm.[248]

B.  Telephone Consumer Protection Act Litigation

The past year brought several significant actions and noteworthy developments related to the Telephone Consumer Protection Act (“TCPA”). First, at the start of the year, the FCC’s Consumer and Government Affairs Bureau solicited comments on a motor vehicle servicer’s petition for a declaratory ruling regarding the FCC’s interpretation of “dual purpose” communications (communications that both provide a service and simultaneously act as commercial messages for TCPA purposes).[249]  The servicer argued its prerecorded messages to customers, recommending that they take their cars for inspections at certain times, were not “dual purpose,” since the communications allegedly were entirely service-based rather than commercial.[250]  Accordingly, the servicer argued, the communications should not be subject to the heightened written consent standards the TCPA imposes on commercial messages.[251]  The FCC has yet to issue guidance in response to the petition, but Gibson Dunn will continue to monitor developments in this area, and the Commission’s interest in such questions suggests clarifications of the “dual purpose” concept might be made in 2020. Turning to another aspect of the TCPA, as discussed above, on June 6, the FCC adopted a Declaratory Ruling and Third Further Notice of Proposed Rulemaking to allow phone carriers to block both illegal and unwanted robocalls by default without waiting for customers to opt in to the service.[252]  The FCC’s ruling requires carriers to use “reasonable analytics”—such as those used by call-management apps—to determine which calls to block.[253] On June 20, the Supreme Court issued an opinion in PDR Network, LLC v. 
Carlton & Harris Chiropractic, Inc., although the Court did not definitively decide the issue presented.[254]  Acknowledging that it is “difficult to answer [the] question” of whether the Hobbs Act requires the district court to accept the FCC’s legal interpretation of the term “unsolicited advertisement” in the TCPA, the Court remanded to the Fourth Circuit to answer two preliminary questions:  first, whether the FCC’s 2006 order is a “legislative” or “interpretive” rule under the APA, as the former has the “force and effect of law” while the latter does not;[255] and second, whether PDR Network had a “prior” and “adequate” opportunity to seek judicial review of the FCC’s 2006 order, as required by Section 703 of the APA.[256]  If not, the Court noted that PDR Network “may” be permitted to challenge the validity of the order under the APA, even if the order is deemed a legislative rather than an interpretive rule.[257]  In a four-Justice concurrence, Justice Kavanaugh deemed the question “straightforward,” stating that the relevant statute does not “expressly preclude judicial review of an agency’s statutory interpretation in an enforcement action” and PDR Network therefore “may argue to the District Court that the FCC’s interpretation of the TCPA is wrong,” and he concluded that, on remand, “the District Court should interpret the TCPA under usual principles of statutory interpretation, affording appropriate respect to the [FCC’s] interpretation.”[258]  He went on to provide an extensive analysis that will “remain[] available to the court on remand . . . and . . . 
to other courts in the future.”[259] In August, the Eleventh Circuit created a circuit split when it concluded that the receipt of a single unsolicited text message—which is “more akin to walking down a busy sidewalk and having a flyer briefly waved in one’s face”—does not generate the harm necessary to give rise to claims under the TCPA.[260]  That ruling is at odds with the Ninth Circuit’s January 2017 decision in Van Patten v. Vertical Fitness Group, LLC, which held that the receipt of just two unsolicited text messages constituted concrete harm under Article III.[261]  Though no parties have filed petitions for certiorari to date, it is likely that the Supreme Court will be presented with the question of what constitutes standing under the TCPA. Later in the year, within a 15-day span a social media company and a communications company filed separate petitions for certiorari with the Supreme Court regarding the constitutionality of the TCPA.  Specifically, the companies are asking the Court to opine on whether the TCPA’s prohibition on calls made using an automated telephone dialing system (“ATDS”) or an artificial or prerecorded voice is an unconstitutional restriction on speech.[262]  The social media company’s petition also asks the Court whether the Ninth Circuit’s statutory interpretation of the TCPA’s definition of an ATDS in Marks v. Crunch San Diego[263] is overly broad.[264]  Although the FCC sought public comments on this question following both Marks and the D.C. Circuit’s decision in ACA International v. FCC,[265] the agency has yet to issue any guidance.  Thus, the Supreme Court’s consideration of this question would be significant.  And in a further constitutional challenge to the TCPA, this January the Supreme Court granted certiorari in Barr v. 
American Association of Political Consultants Inc.,[266] in which it will consider whether the TCPA’s “government-debt exception” violates the First Amendment and, if so, whether the appropriate remedy would be to sever the exception from the statute. Finally, on December 30, President Trump signed into law the Telephone Robocall Abuse Criminal Enforcement and Deterrence (“TRACED”) Act, which is intended to combat illegal robocalls under the TCPA.[267]  Specifically, the legislation: (1) increases civil penalties for TCPA violations to up to $10,000 per call; (2) provides the FCC with additional time to bring actions based on violations related to knowingly providing misleading or inaccurate caller ID information; and (3) requires telecommunications carriers to implement, at no additional charge, the FCC’s STIR/SHAKEN call authentication procedures to prevent scammers from spoofing numbers.[268]  The House in July passed a similar bill aimed at cracking down on unwanted automated phone calls, the Stopping Bad Robocalls Act, on which the Senate has yet to vote.[269]

C.  Biometric Information Privacy Act Litigation

As we foreshadowed in last year’s Review, 2019 was an active year for biometric privacy litigation.  In particular, litigation continued around Illinois’ Biometric Information Privacy Act (“BIPA”), which confers a private right of action to individuals “aggrieved” under the statute,[270] unlike similar statutes in states such as California, Texas, and Washington.  The Illinois Supreme Court seemed to invite such litigation with its decision in Rosenbach v. Six Flags,[271] in which the court held that individuals aggrieved under the BIPA have standing to sue without alleging an actual injury, because the BIPA provides individuals with a substantive right to control their biometric information and no-injury BIPA violations are not merely “technicalit[ies]” but instead are “real and significant” harms to important rights.[272] As a result of Rosenbach, to withstand a motion to dismiss, plaintiffs need merely allege that they are aggrieved persons under the BIPA.  Illinois courts and federal courts applying Illinois law have applied Rosenbach in precisely this manner.  For example, in Rottner v. Palm Beach Tan, Inc., an Illinois appellate court reversed the lower court’s dismissal of a BIPA action for failure to sufficiently plead damages, issued prior to Rosenbach, because “Rottner, like Rosenbach, has standing to sue and has adequately stated a claim for liquidated damages under section 20 of the Act, even if she has alleged only a violation of the Act and not any actual damages beyond violation of law.”[273]  Similarly, in Rogers v. CSX Intermodal Terminals, Inc., the U.S. 
District Court for the Northern District of Illinois granted in part the defendant’s motion to dismiss putative class action claims for intentional and reckless violations of the BIPA, which the court deemed insufficiently pled, but it denied the motion as to claims of statutory violations of the BIPA, which the court noted required only that a plaintiff allege he or she was an aggrieved person under the BIPA.[274]  Likewise, in Namuwonge v. Kronos, Inc.,[275] the court determined that the plaintiff failed to plead any facts that would support a finding of intentionality or recklessness, and instead merely alleged that the putative class was composed of aggrieved persons under the BIPA.[276]  The court thus struck the intentional and reckless claims from the complaint, but it left untouched the remaining BIPA claims.[277] In addition to using Rosenbach to defeat motions to dismiss, plaintiffs also have used it to avoid being compelled into arbitration.  In Liu v. Four Seasons Hotel, Ltd.,[278] an Illinois appellate court rejected the defendant’s attempt to compel arbitration of its employees’ BIPA claims on the ground that the claims merely sought “wages and hours” relief, clarifying that: “[s]imply because an employer opts to use biometric data, like fingerprints, for timekeeping purposes does not transform a complaint into a wages or hours claim.”[279]  Although this holding applies narrowly to circumstances in which employers attempt to construe privacy claims as wage and hour claims, it nevertheless highlights Rosenbach’s impact in facilitating the survival of such claims.  
Indeed, some companies are choosing to settle BIPA claims for sizeable sums rather than litigate them, as Smith Senior Living and its timekeeping company Kronos (which lost a motion to dismiss in a separate BIPA action last year) did to the tune of $1.55 million for a class of just under 1,700 members.[280]

Perhaps the biggest impact of Rosenbach, though, has been the flood of class actions filed against large corporations as a result of the BIPA’s relatively simple pleading requirements.[281]  As this Review went to press, the Supreme Court declined to grant certiorari in one closely watched case in this area.[282]  The case involves the Ninth Circuit’s affirmance of the certification of a class of a social media company’s users for alleged violations of the Illinois BIPA predicated on the company’s use of facial recognition technology.[283]

D.  Other Notable Cases

In addition to the cases described above, 2019 brought developments in a number of matters discussed in last year’s Review, as well as a host of new matters concerning shareholders’ derivative rights, companies’ recordation and storage of data through connected devices and otherwise, the Internet of Things, medical records, the scope of the Wiretap Act, and privacy‑related insurance coverage.  We describe some of the key updates and cases on these issues in greater detail below.

Social Media Company.  As highlighted in last year’s Review, at the end of 2018, the media reported that two bugs had exposed profile data of millions of users of a social media service.[284]  Upon release of the news, plaintiffs filed complaints, which were consolidated in a single class action complaint in the Northern District of California.[285]  The company filed a motion to dismiss the complaint on April 10, 2019, but later agreed to a settlement in principle after mediation on August 14, 2019.[286]  Under the proposed settlement, the company must pay $7.5 million; individual claimants will each receive up to $5.00, with the potential to receive up to $12.00 depending on the number of claimants.[287]

Derivative shareholder litigation against the social media company, also discussed in last year’s Review, was also consolidated in the Northern District of California.  In May 2019, the company moved to dismiss the shareholders’ amended complaint, arguing, among other things, that it fixed the bug before it made any statements shareholders claimed were “misleading,” and that shareholders had failed to adequately plead scienter or material harm to the business.[288]  The court has yet to rule on the motion to dismiss.

Social Media Company.
After the media reported in March 2018 that Cambridge Analytica had obtained information on some of a different social media company’s users, the social media company’s shareholders brought a number of derivative lawsuits that were consolidated in the U.S. District Court for the Northern District of California.  On March 22, 2019, the court granted in part the company’s motion to dismiss the shareholders’ state claims on forum non conveniens grounds, finding the forum selection clause in the company’s Restated Certificate of Incorporation valid and applicable.[289]  The court granted the social media company’s motion to dismiss the federal claims with leave to amend, holding that the shareholders failed to adequately plead demand futility.[290]  The shareholders filed an amended complaint on December 17, 2019.[291]

In May 2019, the Washington, D.C. Superior Court denied the company’s motion to dismiss claims brought by the D.C. Attorney General alleging violations of the D.C. Consumer Protection Procedures Act for failing to take reasonable steps to protect the “trove” of personal consumer data that the company “collects and maintains.”[292]  The court concluded that the Attorney General had adequately pleaded the merits of its case at the motion-to-dismiss stage and that any existing factual questions should be decided by a jury.[293]

Banking Institutions.  In last year’s Review, we reported on litigation against banking institutions claiming that the institutions impermissibly recorded consumer calls.  In February 2019, the U.S.
District Court for the Western District of Pennsylvania approved a stipulated dismissal of one such action following a settlement between the bank and the plaintiff.[294]  It does not appear that the institution involved in the California-based case has appealed from the California Court of Appeals’ decision, which reversed summary judgment for the institution and held that the institution had failed to show it lacked intent to record the relevant conversations, defining “intent” as acting with “the purpose or desire of recording a confidential conversation, or with the knowledge to a substantial certainty” that a confidential conversation will be recorded.[295]

Technology Company - Location History.  On December 19, 2019, the Northern District of California granted a technology company’s motion to dismiss class-action claims that it had stored users’ locations even where those users had turned off location history settings in apps.[296]  The plaintiffs had asserted claims under the California Invasion of Privacy Act (“CIPA”) and California’s state-constitutional right to privacy.[297]  In its motion filed in May 2019, the company argued that the plaintiffs had consented to the collection and storage of location data by agreeing to its Privacy Policy, and that the laws plaintiffs cited were inapplicable because the company did not deploy an “electronic tracking device” “attached to a . . .
movable thing” under the CIPA or egregiously breach social norms under the state constitution.[298]  The court found the statements within the company’s Privacy Policy and Terms of Service irrelevant, but it concluded, among other things, that the CIPA applies only to “unconsented geolocation tracking,” not the storage and collection of geolocation data, and that the plain terms of the statute did not encompass the circumstances presented.[299]  The court therefore dismissed the plaintiffs’ CIPA claims with prejudice.[300]  The court also found that the plaintiffs had failed to plead facts to establish a legally protected privacy interest under the state constitution, but granted plaintiffs leave to amend the complaint on this issue.[301]

Technology Company - Medical Records.  On June 26, 2019, plaintiffs filed a class action complaint and demand for jury trial against a technology company and a private university, claiming that the university turned over to the company “the confidential, highly sensitive and HIPAA-protected records of every patient who walked through its doors between 2009 and 2016” without notifying patients or obtaining their express consent, thereby violating state consumer fraud, contract, intrusion upon seclusion, and unjust enrichment laws.[302]  The plaintiffs labeled the company’s and the university’s assertions that the medical records were de-identified “incredibly misleading,” alleging that the records contained detailed date stamps and free-text notes, and that because the company is a “prolific data mining” company, it could determine individuals’ identities from the records.[303]  The plaintiffs further claimed that the company collected the records in order to build and patent its own commercial electronic health record system and develop software that could be sold at premium prices, and that, in exchange for providing the records, the university received a perpetual license to use the software that the company developed.[304]  The
university and company filed separate motions to dismiss, arguing, among other things, that the plaintiffs had failed to allege an actual injury and thus lacked Article III standing.[305]  The motions are currently pending.

Connected Vehicles and Devices, and the Internet of Things.  On November 11, 2019, an automobile manufacturer we discussed in last year’s Review moved to decertify state-based classes of drivers in Michigan, Illinois, and Missouri,[306] following the U.S. Supreme Court’s refusal in January to hear the manufacturer’s challenge to the certifications.[307]  Also on November 11, the manufacturer moved both for summary judgment on the drivers’ claims that defects in certain vehicles’ infotainment systems made the vehicles vulnerable to hackers and to dismiss the claims for lack of subject matter jurisdiction.[308]  The manufacturer asserted that none of the plaintiffs had alleged that his or her vehicle’s system had malfunctioned or been hacked; thus, the plaintiffs had suffered no legally cognizable injury.[309]  It also argued that there is a growing consensus among courts that consumers’ claims that they “overpaid” for a product because it theoretically could have been made safer are insufficient to establish subject matter jurisdiction.[310]  The court has yet to rule on these dispositive motions.

Additional connected-device cases continue to emerge, and such cases continued to test the scope of the Wiretap Act in 2019.  In May, for example, the Northern District of California held that the vibration intensity settings a user chooses on an adult product constitute “content” under the Wiretap Act, and that the harvesting of such data could constitute intrusion upon seclusion under California state law.[311]  In August, the U.S.
District Court for the District of New Jersey partially granted two electronics companies’ motion to dismiss claims that their Smart TVs collected data on consumers, including which programs consumers watched, IP and MAC addresses, and ZIP codes, and that the companies sold the data to third parties who used it to conduct targeted advertising.[312]  The court dismissed the plaintiffs’ state law claims and claims under the Video Privacy Protection Act (“VPPA”), finding the latter “squarely foreclosed” by controlling precedent establishing that such “static” identifying information does not constitute personally identifiable information under the VPPA.[313]  The court allowed the plaintiffs’ Wiretap Act claim to go forward, finding that the companies were not parties to any allegedly intercepted “communication” between the content provider and the Smart TV, and that information about what consumers are watching constitutes “content” under the Act.[314]  The companies have asked the court to reconsider its ruling on the Wiretap Act claim, or, in the alternative, to certify the court’s order for interlocutory appeal.[315]

In June, plaintiffs filed class action complaints in California state court (subsequently removed to federal court) and Washington federal court against a large retailer and technology company, alleging that the company used voice-enabled devices to build a “massive database of billions of voice recordings” containing private personal details of children, among others, without the consent of the children or their parents.[316]  The plaintiffs claimed that the company does not have to store these voice recordings but does so for its own commercial gain, and they asserted that the company’s alleged actions violate multiple states’ wiretap laws.[317]  Since filing, some plaintiffs have voluntarily dismissed their complaints without prejudice.[318]  The company moved to dismiss the remaining plaintiffs’ claims in early January 2020, arguing that the plaintiffs
had failed to state a claim because, among other things, the “mere creation of recordings within a communication service” intended to provide instructions over the internet does not constitute illicit interception, eavesdropping, or recording.[319]

Minors’ privacy rights also were in the news in 2019 as a result of class actions filed against app developers and major media companies alleging that the defendants used gaming apps for children to track online behavior and leveraged the collected data to target advertising to the children playing the games.[320]  On May 22, 2019, the Northern District of California allowed the majority of the plaintiffs’ privacy claims to move forward, finding, among other things, that the plaintiffs’ allegations that defendants gathered user-specific information, worked with third-party companies to buy and sell the information, targeted ads to users, and tracked users’ responses to those ads met the standard required to survive a motion to dismiss.[321]  Trial is currently scheduled for October 2020.[322]

Computer Fraud and Abuse Act Litigation.  In July, an online ticket vendor reached a favorable settlement on its allegations that individuals had used bots to purchase large quantities of tickets in violation of the company’s Terms of Use, as we described in last year’s Review.  Under the settlement, the defendants are permanently enjoined from using the ticket vendor to search for or purchase tickets, from violating the vendor’s Terms of Use, and from conspiring with others to engage in such activities, among other things.[323]

Cybersecurity Insurance and Acts of War.
In December 2019, an insurer, about which we wrote in last year’s Review, settled with its insured after the latter filed an appeal in the Eleventh Circuit challenging a district court decision that the insured’s personal injury policy did not cover data breach litigation costs.[324]  Similarly, the bank and insurer we discussed in last year’s Review in the context of financial institution bonds also settled in March 2019.[325]

In a case that could have broad implications for companies seeking to insure themselves against cybersecurity attacks, a suit between a food and beverage company and its insurer, filed after the insurer denied coverage for a ransomware attack, was one of the most salient of 2019.[326]  The food and beverage company was one of hundreds of companies impacted by the “NotPetya” cyberstrike in 2017, for which the U.S. government ultimately assigned responsibility to Russia.[327]  When the company made a claim to its insurance company to cover costs resulting from the attack, pointing to provisions of its insurance policy that provided coverage for damage to electronic data or damages resulting from the failure of electronic data processing equipment or media, the insurance company invoked an exception to coverage for “hostile or warlike action in time of peace or war.”[328]  The food and beverage company has asserted breach of contract, promissory estoppel, and unreasonable conduct claims under Illinois law and has requested at least $100 million in damages.[329]  A pharmaceutical company filed a similar suit against its insurer in New Jersey related to the NotPetya strike, seeking $1.3 billion in damages.[330]  Neither case has yet been resolved, but as the risks and prevalence of cybersecurity attacks increase, in particular attacks with suspected connections to foreign governments, the interpretation of “act of war” exclusions in security-related insurance policies likely will become increasingly important.

Cy Pres Settlements.
An open question going into 2020 is the legality of cy pres-only settlements (settlements from which the proceeds go to public interest organizations rather than class members), which we discussed in last year’s Review.  Although the Supreme Court seemed poised to address this question in Frank v. Gaos,[331] a case concerning a technology company’s alleged transmission of users’ search terms to third parties through referrer headers, the Court instead remanded the case to the district court to evaluate the plaintiffs’ standing in light of Spokeo, Inc. v. Robins,[332] discussed above.

III.  Government Data Collection

A.  Collection of Data from Computers, Cellphones, and Other Devices

This year, a number of court decisions addressed individuals’ privacy rights with respect to data stored on cell phones and other personal electronic devices.  Although one of the more prominent decisions bolstered such rights by narrowing the Government’s ability to collect and search data without warrants, courts have reached divergent conclusions regarding the Government’s authority to demand that an individual provide biometric input (such as a fingerprint) to unlock digital devices.

In November 2019, a federal district court in Massachusetts held that the Fourth Amendment prevents warrantless searches of data on electronic devices at border crossings unless there is reasonable suspicion that the devices contain contraband.[333]  In doing so, the court cabined the Fourth Amendment’s traditional “border search exception,” under which a variety of suspicionless searches are permitted.[334]  The court found that while that exception might allow for cursory searches—such as taking a brief look to determine whether a device is in fact owned by the person carrying it—it did not extend to a full search of one’s personal photographs, phone contacts, or sensitive personal or professional data.[335]  Both parties have appealed the decision to the U.S. Court of Appeals for the First Circuit, where the matter is pending.[336]

As digital devices increasingly require thumbprint or facial recognition credentials upon startup, a split has emerged over whether the Government can compel arrestees to provide their biometric inputs to unlock their devices.
In the Matter of the Search of a Residence in Oakland, prosecutors applied for a warrant to search electronic devices in an extortion investigation.[337]  The warrant application sought the authority to compel any person present during the search to provide biometric inputs (such as pressing a finger or displaying their face) to unlock the devices.[338]  The court rejected the application, reasoning that providing biometric data is akin to compelling a witness to provide testimony, and thus a violation of the Fifth Amendment.  In reaching this conclusion, the court analogized forced biometric authorization to forcing a witness to produce a passcode to a digital device, which courts have regularly found to implicate the Fifth Amendment privilege.[339]

In United States v. Barrera, however, a federal court in Illinois reached the opposite result.[340]  The court in Barrera found that the Fifth Amendment is implicated only when “the compelled act forces an individual to disclose the contents of the subject’s own mind,” which is distinct from the disclosure of one’s physical characteristics.[341]  In this respect, the Barrera court compared compelled biometric use to physical acts, such as providing blood samples or handwriting exemplars, which courts have routinely held to be non-testimonial in nature.[342]

B.  Other Notable Developments

1.  Extraterritoriality and Warrants

In 2018, Congress passed the Clarifying Lawful Overseas Use of Data Act (“CLOUD Act”).[343]  The Act’s two main prongs were to: (1) empower the government to make agreements with foreign countries that mutually remove any barriers to compliance with each nation’s court orders to produce data; and (2) clarify that any communication provider subject to U.S. jurisdiction must, upon appropriate legal request, produce any data in its possession, regardless of where the data is stored.[344]

This year saw the United States and the United Kingdom sign the first-ever CLOUD Act bilateral pact: the US-UK Bilateral Data Access Agreement.[345]  Under the agreement, the U.S. can now access electronic data stored in the United Kingdom using American legal processes (and vice versa).[346]  However, the agreement has drawn protest from groups who believe that standards for search and seizure in the United Kingdom are weaker than those required by the Fourth Amendment, putting civil liberties at risk.[347]  Apart from its United Kingdom agreement, the federal government has also begun talks with both the European Union[348] and Australia,[349] suggesting 2020 may well bring new CLOUD Act pacts.

This year the government also sought to clarify the scope of the CLOUD Act via formal Department of Justice guidance.  The DOJ’s white paper asserted that the second, location-based prong of the CLOUD Act did not create a substantive change, but rather “simply clarified existing U.S. law on this issue; it did not change the existing high standards under the U.S. law that must be met before law enforcement agencies can require disclosure of electronic data.”[350]  Nonetheless, privacy rights groups remain skeptical of the Act,[351] and Gibson Dunn will continue to monitor developments in this area.

2.  Foreign Intelligence Surveillance Court Approves FBI’s Proposed Electronic Surveillance Procedures

This fall, the Foreign Intelligence Surveillance Court (“FISC”) considered whether the FBI’s protocols for identifying targets for electronic surveillance and collecting their data complied with the Foreign Intelligence Surveillance Act (“FISA”) and with the Fourth Amendment.[352]  On September 4, the FISC upheld the certifications, approving a procedure under which: (1) the FBI differentiates between queries of U.S. persons and all other queries; (2) prior to reviewing the contents of any U.S.-person query, the FBI provides a written statement as to why such a query is reasonably likely to return foreign intelligence information or evidence of a crime; and (3) the FBI provides records of such queries to the Department of Justice and the Office of the Director of National Intelligence for oversight.[353]  Additionally, the FISC affirmed that the NSA’s 2018 Targeting Procedures prohibit collection of communications solely containing reference to, but not to or from, a foreign intelligence target (also known as “abouts” collection).[354]

3.  Increased Government Use of Biometric Identification Technologies Draws Scrutiny

On October 31, 2019, the American Civil Liberties Union (“ACLU”) filed a complaint against the FBI, the DOJ, and the Drug Enforcement Administration to compel the release of those agencies’ policies, contracts, and other records relating to the use of facial recognition programs and other biometric identification and tracking technology.  The complaint argues that such “highly invasive” technologies permit the U.S. government to track people and their associations in potentially unconstitutional ways.[355]  For example, according to an FBI witness, the FBI has the ability to run facial recognition searches against over 640 million photographs.[356]  The FBI’s guidelines permit the use of such technology without a warrant, demonstration of probable cause, or other fact-based suspicion.[357]

Similarly, Immigration and Customs Enforcement (“ICE”) has recently been scrutinized for its use of “Rapid DNA” testing on families at the U.S.-Mexico border to identify biological parent-child relationships within 90 minutes.[358]  The Electronic Frontier Foundation filed suit this fall seeking records of ICE’s testing procedures and accuracy, arguing that Rapid DNA testing is error-prone and expressing concern over the technology’s use on lawful residents in non-border circumstances.[359]

Also in light of concerns regarding invasiveness and accuracy, three municipalities in California and one in Massachusetts have banned their municipal governments from using facial recognition systems altogether.[360]  At the state level, California and Massachusetts are considering laws to place a moratorium on government use of facial recognition and other biometric identification technologies until regulations are established to protect the public’s interest.[361]  At the federal level, the U.S. Congress has held multiple hearings throughout 2019 on the government’s use of facial recognition, and several bills have been introduced to prohibit or limit such use.[362]

IV.  Conclusion

2019 has proven to be another significant year in the development and application of data privacy and cybersecurity law and for 2020 the fast pace of change will continue.  As technology and data collection become more sophisticated, companies, governments and the public at large will continue to explore the opportunities, and perils, that these changes present.  We will be tracking these important issues in the year ahead.  ______________________      [1]    Eric Goldman, What we’ve learned from California’s Consumer Privacy Act so far, The Hill (Jan. 11, 2020), available at https://thehill.com/opinion/cybersecurity/477821-what-weve-learned-from-the-california-consumer-privacy-act-so-far.     [2]    See, e.g., California Consumer Privacy Act: Compliance Heading into the New Year, Gibson Dunn (Dec. 12, 2019), available at https://www.gibsondunn.com/california-consumer-privacy-act-compliance-heading-into-the-new-year/; California Consumer Privacy Act Final Amendments Signed, Gibson Dunn (Oct. 16, 2019), available at https://www.gibsondunn.com/california-consumer-privacy-act-2019-final-amendments-signed/; California Consumer Privacy Act Update: Regulatory Update, Gibson Dunn (Oct. 11, 2019), available at https://www.gibsondunn.com/california-consumer-privacy-act-update-regulatory-update/; California Consumer Privacy Act Update — California State Committees Vote on Amendments, Gibson Dunn (Apr. 30, 2019), available at https://www.gibsondunn.com/california-consumer-privacy-act-update-california-state-committees-vote-on-amendments/.     [3]    California SB-1121 requires that the final regulations be published on or before July 1, 2020.     [4]    Laura Mahoney, California Governor Signs Bills to Refine Sweeping Privacy Law, Bloomberg Law (Oct. 12, 2019), available at https://news.bloomberglaw.com/privacy-and-data-security/california-governor-signs-bills-to-refine-sweeping-privacy-law.     [5]    Allison Grande, Calif. 
Voters May Get Chance To Tighten Privacy Law, Law360 (Sept. 25, 2019), available at https://www.law360.com/articles/1202779/calif-voters-may-get-chance-to-tighten-privacy-law.     [6]    Cal. Civ. Code §§ 1798.100, 1798.140.     [7]    Id.     [8]    Mark Anderson, California privacy law to take effect immediately in 2020, AG says, Sacramento Business Journal (last updated Dec. 17, 2019), available at https://www.bizjournals.com/sacramento/news/2019/12/16/california-to-start-enforcing-privacy-law.html.     [9]    Alexei Koseff, California promises aggressive enforcement of new privacy law, S.F. Chronicle (Dec. 16, 2019), available at https://www.sfchronicle.com/politics/article/California-promises-aggressive-enforcement-of-new-14911017.php.     [10]    Id.     [11]    Cal. Civ. Code §§ 1798.100, 1798.150.     [12]    Act relating to Internet privacy, S.B. 220 (Nev. 2019), available at https://www.leg.state.nv.us/App/NELIS/REL/80th2019/Bill/6365/Text.     [13]    Id.     [14]    Id.     [15]    Id.     [16]    Id.     [17]    Act to Protect the Privacy of Online Customer Information, S. P. 275 (Me. 2019), available at http://www.mainelegislature.org/legis/bills/getPDF.asp?paper=SP0275&item=1&snum=129.     [18]    Id.     [19]    Id.     [20]    Stop Hacks and Improve Electronic Data Security Act (SHIELD Act), S5575B (N.Y. 2019), available at https://www.nysenate.gov/legislation/bills/2019/s5575.     [21]    Id.     [22]    Id.     [23]    Id.     [24]    Id.     [25]    An to amend the general business law, in relation to the management and oversight of personal data (New York Privacy Act), S.5842 (N.Y. 2019), available at https://legislation.nysenate.gov/pdf/bills/2019/S5642/.     [26]    Lucas Ropek, NY’s Data Privacy Bill Failed; Is There Hope Next Session?, Government Technology (July 15, 2019), available at https://www.govtech.com/policy/NYs-Data-Privacy-Bill-Failed-Is-There-Hope-Next-Session.html.     
[27]    Allison Schiff, State Legislatures Are Back In Session, So Expect New Privacy Bills. Next Up: Washington State, AdExchanger (Jan. 14, 2020), available at https://adexchanger.com/privacy/state-legislatures-are-back-in-session-so-expect-new-privacy-bills-next-up-washington-state/.     [28]    Act Relating to the management and oversight of personal data, S.B. 5376 (Wash. 2019), available at https://app.leg.wa.gov/billsummary?BillNumber=5376&Year=2019&Initiative=false.     [29]    Id.     [30]    Act to amend the general business law, in relation to the management and oversight of personal data, S.5642 (N.Y. 2019), available at https://www.nysenate.gov/legislation/bills/2019/s5642.     [31]    Id.     [32]    Id.     [33]    See, e.g., Allison Schiff, State Legislatures Are Back In Session, So Expect New Privacy Bills. Next Up: Washington State, AdExchanger (Jan. 14, 2020), available at https://adexchanger.com/privacy/state-legislatures-are-back-in-session-so-expect-new-privacy-bills-next-up-washington-state/.     [34]    Senate Democrat Privacy Principles, Senate Democrats (Nov. 14, 2019), available at https://www.democrats.senate.gov/imo/media/doc/Final_CMTE%20Privacy%20Principles_11.14.19.pdf.     [35]    See, e.g., Lauren Feiner, A federal privacy law is starting to crystallize, but Democrats and Republicans can’t agree on how to do it, CNBC (last updated Dec. 4, 2019), available at https://www.cnbc.com/2019/12/04/a-federal-privacy-law-is-starting-to-crystallize-senators-remain-divided-over-details.html.     [36]    See, e.g., Abbie Gruwell, Preemption Takes Center Stage Amid Federal Data Privacy Action, The National Conference of State Legislatures Blog (Apr. 8, 2019), available at https://www.ncsl.org/blog/2019/04/08/preemption-takes-center-stage-amid-federal-data-privacy-action.aspx.     [37]    See, e.g., Cameron F. Kerry, Will this new Congress be the one to pass data privacy legislation?, Brookings (Jan. 
7, 2019), available at https://www.brookings.edu/blog/techtank/2019/01/07/will-this-new-congress-be-the-one-to-pass-data-privacy-legislation/.     [38]    House Energy and Commerce Committee Staff Bipartisan Draft Privacy Bill (2019), available at https://privacyblogfullservice.huntonwilliamsblogs.com/wp-content/uploads/sites/28/2019/12/2019.12.18-Privacy-Bipartsian-Staff-Discussion-Draft.pdf.     [39]    Emily Birnbaum, Key House committee offers online privacy bill draft, The Hill (Dec. 18, 2019), available at https://thehill.com/policy/technology/475191-key-house-committee-offers-online-privacy-bill-draft.     [40]    House Energy and Commerce Committee Staff Bipartisan Draft Privacy Bill (2019), available at https://privacyblogfullservice.huntonwilliamsblogs.com/wp-content/uploads/sites/28/2019/12/2019.12.18-Privacy-Bipartsian-Staff-Discussion-Draft.pdf.     [41]    Id.     [42]    Id.     [43]    Id.     [44]    Id.     [45]    S. 2968, 116th Cong. (2019), available at https://www.congress.gov/bill/116th-congress/senate-bill/2968?q=%7B%22search%22%3A%5B%22cantwell%22%5D%7D&s=3&r=3.     [46]    United States Consumer Data Privacy Act of 2019 Staff Discussion Draft (2019), available at https://privacyblogfullservice.huntonwilliamsblogs.com/wp-content/uploads/sites/28/2019/12/Nc7.pdf.     [47]    Id.     [48]    Id; S. 2968, 116th Cong. (2019), available at https://www.congress.gov/bill/116th-congress/senate-bill/2968?q=%7B%22search%22%3A%5B%22cantwell%22%5D%7D&s=3&r=3.     [49]    H.R. 4978, 116th Cong. (2019), available at https://www.congress.gov/bill/116th-congress/house-bill/4978/text.     [50]    S. 1951, 116th Cong. (2019), available at https://www.congress.gov/bill/116th-congress/senate-bill/1951/text.     [51]    S. 1578, 116th Cong. (2019), available at https://www.congress.gov/bill/116th-congress/senate-bill/1578/text.     [52]    S. 189, 116th Cong. (2019), available at https://www.congress.gov/bill/116th-congress/senate-bill/189/text.     
[53]    H.R. 2231, 116th Cong. (2019), available at https://www.congress.gov/bill/116th-congress/house-bill/2231/text.     [54]    S. 1116, 116th Cong. (2019), available at https://www.congress.gov/bill/116th-congress/senate-bill/1116/text.     [55]    S. 1214, 116th Cong. (2019), available at https://www.congress.gov/bill/116th-congress/senate-bill/1214/text.     [56]    H.R. 2013, 116th Cong. (2019), available at https://www.congress.gov/bill/116th-congress/house-bill/2013/text.     [57]    S. 583, 116th Cong. (2019), available at https://www.congress.gov/bill/116th-congress/senate-bill/583/text.     [58]    H.R. 5573, 116th Cong. (2019), available at https://www.congress.gov/bill/116th-congress/house-bill/5573/text.     [59]    S. 1578, 116th Cong. (2019), available at https://www.congress.gov/bill/116th-congress/senate-bill/1578/text.     [60]    Id.     [61]    Id.     [62]    Id.     [63]    H.R. 2231, 116th Cong. (2019), available at https://www.congress.gov/bill/116th-congress/house-bill/2231/text.     [64]    Id.     [65]    H.R. 5573, 116th Cong. (2019), available at https://www.congress.gov/bill/116th-congress/house-bill/5573/text.     [66]    S. 748, 116th Cong. (2019), available at https://www.congress.gov/bill/116th-congress/senate-bill/748/text.     [67]    H.R. 5573, 116th Cong. (2019), available at https://www.congress.gov/bill/116th-congress/house-bill/5573/text; S. 748, 116th Cong. (2019), available at https://www.congress.gov/bill/116th-congress/senate-bill/748/text.     [68]    H.R. 5573, 116th Cong. (2019), available at https://www.congress.gov/bill/116th-congress/house-bill/5573/text.     [69]    S. 748, 116th Cong. (2019), available at https://www.congress.gov/bill/116th-congress/senate-bill/748/text.     [70]    See Prepared Opening Remarks of Chairman Joseph Simons, Hearings on Competition and Consumer Protection in the 21st Century, The FTC’s Approach to Consumer Privacy (Apr. 
9, 2019), available at https://www.ftc.gov/system/files/documents/public_statements/1512673/chmn-simons-opening_remarks_ftc_hearing_12.pdf; Remarks of Chairman Joseph Simons, Hearings on Competition and Consumer Protection in the 21st Century, Session on FTC’s Role in a Changing World (Mar. 25, 2019), available at https://www.ftc.gov/system/files/documents/public_statements/1508536/oia_hearing_march_25_remarks_chmn_simons.pdf.     [71]    Press Release, Federal Trade Commission, FTC Seeks to Examine the Privacy Practices of Broadband Providers (Mar. 26, 2019), available at https://www.ftc.gov/news-events/press-releases/2019/03/ftc-seeks-examine-privacy-practices-broadband-providers.     [72]    See, e.g., Andrew Smith, New and improved FTC data security orders: Better guidance for companies, better protection for consumers, Federal Trade Commission Blog (Jan. 6, 2020), available at https://www.ftc.gov/news-events/blogs/business-blog/2020/01/new-improved-ftc-data-security-orders-better-guidance.     [73]    See, e.g., Remarks of Commissioner Rebecca Kelly Slaughter, The Near Future of U.S. Privacy Law (Sept. 6, 2019), available at https://www.ftc.gov/system/files/documents/public_statements/1543396/slaughter_silicon_flatirons_remarks_9-6-19.pdf.     [74]    See Remarks of Commissioner Rebecca Kelly Slaughter, The Near Future of U.S. Privacy Law (Sept. 6, 2019), available at https://www.ftc.gov/system/files/documents/public_statements/1543396/slaughter_silicon_flatirons_remarks_9-6-19.pdf; Prepared Remarks of Chairman Joseph Simons, Introductory Keynote: American Bar Association Consumer Protection Conference (Feb. 5, 2019), available at https://www.ftc.gov/system/files/documents/public_statements/1451379/simons-_nashville-aba-remarks.pdf.     [75]    See Prepared Remarks of Chairman Joseph Simons, Introductory Keynote: American Bar Association Consumer Protection Conference (Feb. 
5, 2019), available at https://www.ftc.gov/system/files/documents/public_statements/1451379/simons-_nashville-aba-remarks.pdf.     [76]    Press Release, Federal Trade Commission, FTC Grants Final Approval to Settlement with Former Cambridge Analytica CEO, App Developer over Allegations they Deceived Consumers over Collection of Facebook Data (Dec. 18, 2019), available at https://www.ftc.gov/news-events/press-releases/2019/12/ftc-grants-final-approval-settlement-former-cambridge-analytica.     [77]    Id.     [78]    Id.     [79]    See, e.g., Press Release, United Kingdom Information Commissioner’s Office, SCL Elections prosecuted for failing to comply with enforcement notice (Jan. 9, 2019), available at https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2019/01/scl-elections-prosecuted-for-failing-to-comply-with-enforcement-notice/.     [80]    Press Release, Federal Trade Commission, FTC Issues Opinion and Order Against Cambridge Analytica For Deceiving Consumers About the Collection of Facebook Data, Compliance with EU-U.S. Privacy Shield (Dec. 6, 2019), available at https://www.ftc.gov/news-events/press-releases/2019/12/ftc-issues-opinion-order-against-cambridge-analytica-deceiving.     [81]    Id.     [82]    Id.     [83]    Press Release, Federal Trade Commission, FTC Finalizes Settlement with Company that Misled Consumers about how it Accesses and Uses their Email (Dec. 17, 2019), available at https://www.ftc.gov/news-events/press-releases/2019/12/ftc-finalizes-settlement-company-misled-consumers-about-how-it.     [84]    Id.     [85]    Id.     [86]    Id.     [87]    Press Release, Federal Trade Commission, Utah Company Settles FTC Allegations it Failed to Safeguard Consumer Data (Nov. 12, 2019), available at https://www.ftc.gov/news-events/press-releases/2019/11/utah-company-settles-ftc-allegations-it-failed-safeguard-consumer.     [88]    Id.     [89]    Id.     [90]    Id.     [91]    Id.     
[92]    Press Release, Federal Trade Commission, FTC Brings First Case Against Developers of “Stalking” Apps (Oct. 22, 2019), available at https://www.ftc.gov/news-events/press-releases/2019/10/ftc-brings-first-case-against-developers-stalking-apps; see also FTC Brings First Case Against Tracking Apps, Gibson Dunn (Nov. 1, 2019), available at https://www.gibsondunn.com/california-consumer-privacy-act-2019-final-amendments-signed/.     [93]    Id.     [94]    Id.     [95]    Id.     [96]    Id.     [97]    Id.     [98]    Press Release, Federal Trade Commission, FTC Gives Final Approval to Settlement with Auto Dealer Software Company That Allegedly Failed to Protect Consumers’ Data (Sept. 6, 2019), available at https://www.ftc.gov/news-events/press-releases/2019/09/ftc-gives-final-approval-settlement-auto-dealer-software-company.     [99]    Id.     [100]    Federal Trade Commission (F.T.C.), In re LightYear Dealer Technologies, LLC, Docket No. C-4687 (F.T.C. Sept. 6, 2019), available at https://www.ftc.gov/system/files/documents/cases/172_3051_c-4687_dealerbuilt_decision_order.pdf.     [101]    Id.     [102]    Press Release, Federal Trade Commission, Google and YouTube Will Pay Record $170 Million for Alleged Violations of Children’s Privacy Law (Sept. 4, 2019), available at https://www.ftc.gov/news-events/press-releases/2019/09/google-youtube-will-pay-record-170-million-alleged-violations.     [103]    Id.     [104]    Id.     [105]    Id.     [106]    Id.     [107]    Id.     [108]    Press Release, Federal Trade Commission, FTC Imposes $5 Billion Penalty and Sweeping New Privacy Restrictions on Facebook (July 24, 2019), available at https://www.ftc.gov/news-events/press-releases/2019/07/ftc-imposes-5-billion-penalty-sweeping-new-privacy-restrictions.     [109]    Id.     [110]    Stipulated Order for Civil Penalty, Monetary Judgment, and Injunctive Relief, United States v. Facebook, Inc., No. 19-cv-2184 (D.D.C. July 24, 2019), ECF No. 
2-1, available at https://www.ftc.gov/system/files/documents/cases/182_3109_facebook_order_filed_7-24-19.pdf.     [111]    Id.     [112]    Id.     [113]    Id.     [114]    Press Release, Federal Trade Commission, Equifax to Pay $575 Million as Part of Settlement with FTC, CFPB, and States Related to 2017 Data Breach (July 22, 2019), available at https://www.ftc.gov/news-events/press-releases/2019/07/equifax-pay-575-million-part-settlement-ftc-cfpb-states-related.     [115]    Id.     [116]    Id.     [117]    Id.     [118]    Press Release, Federal Trade Commission, D-Link Agrees to Make Security Enhancements to Settle FTC Litigation (July 2, 2019), available at https://www.ftc.gov/news-events/press-releases/2019/07/d-link-agrees-make-security-enhancements-settle-ftc-litigation.     [119]    Id.     [120]    Id.     [121]    Id.     [122]    Press Release, Federal Trade Commission, Video Social Networking App Musical.ly Agrees to Settle FTC Allegations That it Violated Children’s Privacy Law (Feb. 27, 2019), available at https://www.ftc.gov/news-events/press-releases/2019/02/video-social-networking-app-musically-agrees-settle-ftc.     [123]    Id.     [124]    Id.     [125]    Id.     [126]    Press Release, Federal Trade Commission, FTC Issues Opinion and Order Against Cambridge Analytica For Deceiving Consumers About the Collection of Facebook Data, Compliance with EU-U.S. Privacy Shield (Dec. 6, 2019), available at https://www.ftc.gov/news-events/press-releases/2019/12/ftc-issues-opinion-order-against-cambridge-analytica-deceiving; Press Release, Federal Trade Commission, California Company Settles FTC Allegations that it Falsely Claimed Participation in EU-U.S. Privacy Shield (Nov. 19, 2019), available at https://www.ftc.gov/news-events/press-releases/2019/11/california-company-settles-ftc-allegations-it-falsely-claimed; Press Release, Federal Trade Commission, FTC Charges Nevada Company with Falsely Claiming Participation in the EU-U.S. Privacy Shield (Nov. 
7, 2019), available at https://www.ftc.gov/news-events/press-releases/2019/11/ftc-charges-nevada-company-falsely-claiming-participation-eu-us; Press Release, Federal Trade Commission, FTC Approves Final Consent Order Settling Charges That Background Screening Company Falsely Claimed Compliance with EU-U.S. Privacy Shield Framework (Aug. 21, 2019), available at https://www.ftc.gov/news-events/press-releases/2019/08/ftc-approves-final-consent-order-settling-charges-background.     [127]    Id.     [128]    Press Release, Federal Trade Commission, FTC Takes Action against Companies Falsely Claiming Compliance with the EU-U.S. Privacy Shield, Other International Privacy Agreements (June 14, 2019), available at https://www.ftc.gov/news-events/press-releases/2019/06/ftc-takes-action-against-companies-falsely-claiming-compliance-eu.     [129]    Federal Trade Commission, Privacy Shield, available at https://www.ftc.gov/tips-advice/business-center/privacy-and-security/privacy-shield.     [130]    See 15 U.S.C. § 53(b).     [131]    See FTC v. Commerce Planet, Inc., 815 F.3d 593, 598–99 (9th Cir. 2016); FTC v. Ross, 743 F.3d 886, 890–92 (4th Cir. 2014); FTC v. Bronson Partners, LLC, 654 F.3d 359, 365–66 (2d Cir. 2011); FTC v. Magazine Sols., LLC, 432 F. App’x 155, 158 n.2 (3d Cir. 2011) (unpublished); FTC v. Direct Mktg. Concepts, Inc., 624 F.3d 1, 15 (1st Cir. 2010); FTC v. Freecom Commc’ns, Inc., 401 F.3d 1192, 1202 n.6 (10th Cir. 2005); FTC v. Gem Merch. Corp., 87 F.3d 466, 468–70 (11th Cir. 1996); FTC v. Security Rare Coin & Bullion Corp., 931 F.2d 1312, 1314–15 (8th Cir. 1991); FTC v. Amy Travel Serv., Inc., 875 F.2d 564, 571–72 (7th Cir. 1989).     [132]    FTC v. Credit Bureau Ctr., LLC, 937 F.3d 764 (7th Cir. 2019) (vacating a $5.26 million judgment in favor of the FTC).     [133]    Petition for a Writ of Certiorari, FTC v. Credit Bureau Ctr., LLC, No. 19-____ (U.S. Dec. 
19, 2019), available at https://www.ftc.gov/system/files/documents/cases/petitionforawritofcertiorari_no._19.pdf.     [134]    See, e.g., Remarks of Commissioner Rebecca Kelly Slaughter, The Near Future of U.S. Privacy Law (Sept. 6, 2019), available at https://www.ftc.gov/system/files/documents/public_statements/1543396/slaughter_silicon_flatirons_remarks_9-6-19.pdf; Prepared Remarks of Chairman Joseph Simons, Introductory Keynote: American Bar Association Consumer Protection Conference (Feb. 5, 2019), available at https://www.ftc.gov/system/files/documents/public_statements/1451379/simons-_nashville-aba-remarks.pdf.     [135]    Press Release, Department of Health and Human Services, OCR Concludes All-Time Record Year for HIPAA Enforcement with $3 Million Cottage Health Settlement (Feb. 7, 2019), available at https://www.hhs.gov/about/news/2019/02/07/ocr-concludes-all-time-record-year-for-hipaa-enforcement-with-3-million-cottage-health-settlement.html.     [136]    Ben Kochman, HIPAA Enforcers Lower Fines For Less Serious Violations, Law360 (Apr. 26, 2019), available at https://www.law360.com/articles/1154042/hipaa-enforcers-lower-fines-for-less-serious-violations.     [137]    See, e.g., Dena Castricone, HIPAA Compliance Lessons From 2019 Enforcement Trends, Law360 (Jan. 22, 2020), available at https://www.law360.com/articles/1236238/hipaa-compliance-lessons-from-2019-enforcement-trends.     [138]    Press Release, Department of Health and Human Services, Tennessee Diagnostic Medical Imaging Services Company Pays $3,000,000 to Settle Breach Exposing over 300,000 Patients’ Protected Health Information (May 6, 2019), available at https://www.hhs.gov/about/news/2019/05/06/tennessee-diagnostic-medical-imaging-services-company-pays-3000000-settle-breach.html.     [139]    Id.     [140]    Press Release, Department of Health and Human Services, OCR Imposes a $1.6 Million Civil Money Penalty against Texas Health and Human Services Commission for HIPAA Violations (Nov. 
7, 2019), available at https://www.hhs.gov/about/news/2019/11/07/ocr-imposes-a-1.6-million-dollar-civil-money-penalty-against-tx-hhsc-for-hipaa-violations.html.     [141]    Press Release, Department of Health and Human Services, Failure to Encrypt Mobile Devices Leads to $3 Million HIPAA Settlement (Nov. 5, 2019), available at https://www.hhs.gov/about/news/2019/11/05/failure-to-encrypt-mobile-devices-leads-to-3-million-dollar-hipaa-settlement.html.     [142]    Id.     [143]    Press Release, Department of Health and Human Services, OCR Settles First Case in HIPAA Right of Access Initiative (Sept. 9, 2019), available at https://www.hhs.gov/about/news/2019/09/09/ocr-settles-first-case-hipaa-right-access-initiative.html.     [144]    Id.     [145]    Press Release, Department of Health and Human Services, OCR Settles Second Case in HIPAA Right of Access Initiative (Dec. 12, 2019), available at https://www.hhs.gov/about/news/2019/12/12/ocr-settles-second-case-in-hipaa-right-of-access-initiative.html.     [146]    Press Release, Department of Health and Human Services, OCR Settles First Case in HIPAA Right of Access Initiative (Sept. 9, 2019), available at https://www.hhs.gov/about/news/2019/09/09/ocr-settles-first-case-hipaa-right-access-initiative.html; Press Release, Department of Health and Human Services, OCR Settles Second Case in HIPAA Right of Access Initiative (Dec. 12, 2019), available at https://www.hhs.gov/about/news/2019/12/12/ocr-settles-second-case-in-hipaa-right-of-access-initiative.html.     [147]    Press Release, Department of Health and Human Services, Judge Rules in Favor of OCR and Requires a Texas Cancer Center to Pay $4.3 Million in Penalties for HIPAA Violations (June 18, 2018), available at https://www.hhs.gov/about/news/2018/06/18/judge-rules-in-favor-of-ocr-and-requires-texas-cancer-center-to-pay-4.3-million-in-penalties-for-hipaa-violations.html.     [148]    See Complaint, Univ. of Tex. MD Anderson Cancer Ctr. v. Azar, Docket No. 
4:19-cv-01298 (S.D. Tex. Apr. 9, 2019), ECF No. 1.     [149]    Request for Information on Modifying HIPAA Rules To Improve Coordinated Care, 83 Fed. Reg. 64302 (proposed Dec. 14, 2018) (to be codified at 45 C.F.R. pts. 160, 164), available at https://www.federalregister.gov/documents/2018/12/14/2018-27162/request-for-information-on-modifying-hipaa-rules-to-improve-coordinated-care.     [150]    See Request for Information on Modifying HIPAA Rules to Improve Coordinated Care, regulations.gov, available at https://www.regulations.gov/docket?D=HHS-OCR-2018-0028.     [151]    See, e.g., Comment of Wash. State Dep’t of Soc. and Health Servs., Request for Information on Modifying HIPAA Rules To Improve Coordinated Care, FR Docket No. 2018-27162 (Feb. 12, 2019), available at https://www.regulations.gov/document?D=HHS-OCR-2018-0028-1095.     [152]    See, e.g., Comment of Nat’l Disability Rights Network, Request for Information on Modifying HIPAA Rules To Improve Coordinated Care, FR Docket No. 2018-27162 (Feb. 12, 2019), available at https://www.regulations.gov/document?D=HHS-OCR-2018-0028-1294.     [153]    See, e.g., Comment of Nat’l Ass’n of Chain Drug Stores, Request for Information on Modifying HIPAA Rules To Improve Coordinated Care, FR Docket No. 2018-27162 (Feb. 11, 2019), available at https://www.regulations.gov/document?D=HHS-OCR-2018-0028-0874.     [154]    See Complaint, State of Arizona v. Med. Informatics Eng’g, Inc., No. 3:18-cv-00969 (N.D. Ind. Dec. 04, 2018), ECF No. 1.     [155]    Id.     [156]    Consent Judgment and Order, State of Arizona v. Med. Informatics Eng’g, Inc., No. 3:18-cv-00969 (N.D. Ind. May 28, 2019), ECF No. 66.     [157]    SEC Office of Compliance Inspection and Examinations, Risk Alert - Investment Adviser and Broker-Dealer Compliance Issues Related to Regulation S-P – Privacy Notices and Safeguard Policies (Apr. 16, 2019), available at https://www.sec.gov/files/OCIE%20Risk%20Alert%20-%20Regulation%20S-P.pdf.     [158]    Id. 
at 2–3.     [159]    Id. at 3–4.     [160]    Press Release, SEC Office of Compliance Inspections and Examinations Announces 2020 Examination Priorities (Jan. 7, 2020), available at https://www.sec.gov/news/press-release/2020-4.     [161]    Id.     [162]    Id.     [163]    U.S. Securities and Exchange Commission, Electronic Data Gathering, Analysis, and Retrieval, available at https://www.sec.gov/edgar.shtml (last visited Jan. 23, 2020).     [164]    Complaint, SEC v. Ieremenko et al., No. 2:19-cv-00505 (D.N.J. Jan. 15, 2019), ECF No. 1.     [165]    Press Release, U.S. Securities and Exchange Commission, SEC Brings Charges in EDGAR Hacking Case (Jan. 15, 2019), available at https://www.sec.gov/news/press-release/2019-1.     [166]    Press Release, U.S. Securities and Exchange Commission, Facebook to Pay $100 Million for Misleading Investors About the Risks It Faced From Misuse of User Data (July 24, 2019), available at https://www.sec.gov/news/press-release/2019-140.     [167]    Press Release, U.S. Securities and Exchange Commission, SEC and CFTC Charge Options Clearing Corp. with Failing to Establish and Maintain Adequate Risk Management Policies (Sept. 4, 2019), available at https://www.sec.gov/news/press-release/2019-171.     [168]    SEC Division of Enforcement, 2019 Annual Report at 13, available at https://www.sec.gov/files/enforcement-annual-report-2019.pdf.     [169]    Press Release, U.S. Securities and Exchange Commission, Company Settles Unregistered ICO Charges After Self-Reporting to SEC (Feb. 20, 2019), available at https://www.sec.gov/news/press-release/2019-15.     [170]    Complaint, SEC v. Telegram Group Inc. et al., No. 1:19-cv-9439 (S.D.N.Y. Oct. 11, 2019), ECF No. 1; see also Press Release, U.S. Securities and Exchange Commission, SEC Halts Alleged $1.7 Billion Unregistered Digital Token Offering (Oct. 11, 2019), available at https://www.sec.gov/news/press-release/2019-212.     [171]    Complaint, SEC v. Eyal, No. 1:19-cv-11325 (S.D.N.Y. 
Dec. 11, 2019), ECF No. 1.     [172]    TurnKey Jet, Inc., SEC No-Action Letter (Apr. 3, 2019), available at https://www.sec.gov/divisions/corpfin/cf-noaction/2019/turnkey-jet-040219-2a1.htm.     [173]    Id.     [174]    See Advanced Methods to Target and Eliminate Unlawful Robocalls, Declaratory Ruling and Third Further Notice of Proposed Rulemaking, FCC 19-51, 34 FCC Rcd. 4876 (June 6, 2019).     [175]    In the Matters of Implementing Section 503 of RAY BAUM’S Act, Second Report and Order, FCC Rcd. 19-73 (Aug. 1, 2019), available at https://docs.fcc.gov/public/attachments/FCC-19-73A1.pdf.     [176]    STIR/SHAKEN stands for “Secure Telephone Identity Revisited/Signature-based Handling of Asserted information using toKENs.”     [177]    Ajit Pai, Comm’r, FCC, Remarks at the Robocall Symposium of New England States (Nov. 21, 2019), available at https://docs.fcc.gov/public/attachments/DOC-360946A1.pdf.     [178]    Press Release, FCC, Chairman Pai Statement on Progress by Major Phone Companies in Implementing Caller ID Authentication (Aug. 14, 2019), available at https://docs.fcc.gov/public/attachments/DOC-359087A1.pdf.     [179]    Pallone-Thune Telephone Robocall Abuse Criminal Enforcement and Deterrence (TRACED) Act, Pub. L. No. 116-105, 133 Stat. 3274 (2019).     [180]    Id.     [181]    See, e.g., Cassell Bryan-Low et al., Special report - Hobbling Huawei: Inside the U.S. war on China’s tech giant, Reuters (May 21, 2019), available at https://www.reuters.com/article/us-huawei-usa-5g-specialreport/special-report-hobbling-huawei-inside-the-u-s-war-on-chinas-tech-giant-idUSKCN1SR1EU; Diane Bartz & Christian Shepherd, U.S. legislation steps up pressure on Huawei and ZTE, China calls it ‘hysteria’, Reuters (Jan. 16, 2019), available at https://www.reuters.com/article/us-usa-china-huawei-tech/u-s-legislation-steps-up-pressure-on-huawei-and-zte-china-calls-it-hysteria-idUSKCN1PA2LU.     
[182]    In the Matter of Protecting Against Nat’l Security Threats to the Comm’ns Supply Chain Through FCC Programs, Report and Order, Further Notice of Proposed Rulemaking, and Order, FCC 19-121 (Nov. 22, 2019), available at https://docs.fcc.gov/public/attachments/FCC-19-121A1.pdf.     [183]    See Petition for Review, Huawei Techs. v. FCC, No. 19-60896 (5th Cir. Dec. 5, 2019), available at https://prodnet.www.neca.org/publicationsdocs/wwpdf/12519huawei.pdf; see also Petition for Review, Huawei Technologies USA, Inc. et al. v. Federal Communications Commission et al., No. 19-60896 (5th Cir. Jan. 7, 2020).     [184]    In the Matter of Protecting Against Nat’l Security Threats to the Comm’ns Supply Chain Through FCC Programs, Report and Order, Further Notice of Proposed Rulemaking, and Order, FCC 19-121, ¶¶ 122–60 (Nov. 22, 2019), available at https://docs.fcc.gov/public/attachments/FCC-19-121A1.pdf.     [185]    Jim Puzzanghera, New CFPB Director Kathy Kraninger says she won’t be puppet of Mick Mulvaney, L.A. Times (Dec. 11, 2018), available at http://www.latimes.com/business/la-fi-kathy-kraninger-cfpb-20181211-story.html.     [186]    Id.     [187]    Brief for the Respondent, Seila Law LLC v. CFPB, No. 19-7 (U.S. Sept. 17, 2019).     [188]    Seila Law LLC v. CFPB, 140 S. Ct. 427 (2019) (granting certiorari).     [189]    Press Release, CFPB, CFPB, FTC and States Announce Settlement with Equifax Over 2017 Data Breach (July 22, 2019), available at https://www.consumerfinance.gov/about-us/newsroom/cfpb-ftc-states-announce-settlement-with-equifax-over-2017-data-breach/.     [190]    Id.     [191]    Dep’t of Defense, Defense Contract Mgmt. Agency, Contractor Purchasing System Review (CPSR) Guidebook (June 14, 2019), available at https://www.dcma.mil/Portals/31/Documents/CPSR/CPSR_Guidebook_062719.pdf.     [192]    Id. at 97.     [193]    See, e.g., id. at 103–05.     [194]    U.S. 
Dep’t of Defense Office of the Under Secretary of Defense for Acquisition & Sustainment, Cybersecurity Maturity Model Certification (CMMC) Version 0.7 (Dec. 6, 2019), available at https://www.acq.osd.mil/cmmc/docs/CMMC_Version0.7_UpdatedCompiledDeliverable_20191209.pdf.     [195]    U.S. Dep’t of Defense Office of the Under Secretary of Defense for Acquisition & Sustainment, Welcome Page, available at https://www.acq.osd.mil/cmmc/index.html.     [196]    U.S. Dep’t of Defense Office of the Under Secretary of Defense for Acquisition & Sustainment, Cybersecurity Maturity Model Certification (CMMC) Version 0.7 (Dec. 6, 2019), available at https://www.acq.osd.mil/cmmc/docs/CMMC_Version0.7_UpdatedCompiledDeliverable_20191209.pdf.     [197]    Travis J. Tritten, Defense Contractors to Face Added Costs With Cybersecurity Audit, Bloomberg Gov’t (Jan. 15, 2020), available at https://about.bgov.com/news/defense-contractors-to-face-added-costs-with-cybersecurity-audit/.     [198]    U.S. Dep’t of Defense Inspector General, Audit of the DoD’s Management of the Cybersecurity Risks for Government Purchase Card Purchases of Commercial Off-the-Shelf Items (July 26, 2019), available at https://media.defense.gov/2019/Jul/30/2002164272/-1/-1/1/DODIG-2019-106.PDF.     [199]    U.S. Dep’t of Defense Inspector General, Audit of Protection of DoD Controlled Unclassified Information on Contractor-Owned Networks and Systems (July 23, 2019), available at https://media.defense.gov/2019/Jul/25/2002162331/-1/-1/1/DODIG-2019-105.PDF.     [200]    National Defense Authorization Act (NDAA) for Fiscal Year 2020, Pub. L. No. 116-92, 133 Stat. 1198 (2019).     [201]    Id.     [202]    Press Release, DOJ, Justice Department Reviewing the Practices of Market-Leading Online Platforms (July 23, 2019), available at https://www.justice.gov/opa/pr/justice-department-reviewing-practices-market-leading-online-platforms.     
[203]    Tony Romm, DOJ issues new warning to big tech: Data and privacy could be competition concerns, Wash. Post (Nov. 8, 2019), available at https://www.washingtonpost.com/technology/2019/11/08/doj-issues-latest-warning-big-tech-data-privacy-could-be-competition-concerns/.     [204]    Press Release, CFTC, CFTC Orders Registrant to Pay $1.5 Million for Violations Related to Cyber Breach (Sept. 12, 2019), available at https://www.cftc.gov/PressRoom/PressReleases/8008-19.     [205]    Press Release, Dep’t of Commerce, U.S. Department of Commerce Proposes Rule for Securing the Nation’s Information and Communications Technology and Services Supply Chain (Nov. 26, 2019), available at https://www.commerce.gov/news/press-releases/2019/11/us-department-commerce-proposes-rule-securing-nations-information-and.     [206]    Brandi Vincent, How the Energy Department Is Prioritizing Secure Infrastructure, Nextgov (Mar. 21, 2019), available at https://www.nextgov.com/cybersecurity/2019/03/how-energy-department-prioritizing-secure-infrastructure/155734/.     [207]    See, e.g., GAO-19-332, Critical Infrastructure Protection, Actions Needed to Address Significant Cybersecurity Risks Facing the Electric Grid (Aug. 2019), available at https://www.gao.gov/assets/710/701079.pdf.     [208]    DHS Office of Inspector General, Management Alert – FEMA Did Not Safeguard Disaster Survivors’ Sensitive Personally Identifiable Information (REDACTED) (Mar. 15, 2019), available at https://www.oig.dhs.gov/sites/default/files/assets/2019-03/OIG-19-32-Mar19.pdf.     [209]    Nat’l Institute of Standards and Tech., Protecting Controlled Unclassified Information in Nonfederal Systems and Organizations (June 2019), available at https://csrc.nist.gov/publications/detail/sp/800-171/rev-2/draft.     [210]    Id. at iv.     
[211]    Nat’l Institute of Standards and Tech., Protecting Controlled Unclassified Information in Nonfederal Systems and Organizations – Enhanced Security Requirements for Critical Programs and High Value Assets (June 2019), available at https://csrc.nist.gov/CSRC/media/Publications/sp/800-171b/draft/documents/sp800-171B-draft-ipd.pdf.     [212]    Id. at iv (emphases omitted).     [213]    Nat’l Institute of Standards and Tech., NIST Releases Version 1.0 of Privacy Framework (Jan. 16, 2020), available at https://www.nist.gov/news-events/news/2020/01/nist-releases-version-10-privacy-framework.     [214]    Final Judgment and Consent Decree, The State of Alabama v. Equifax, Inc. (July 19, 2019), available at https://www.sec.gov/Archives/edgar/data/33185/000119312519198584/d734596dex104.htm.     [215]    See Press Release, NY State Office of the Attorney General, Attorney General James Secures $6 Million From Cisco Systems In Multistate Settlement (Aug. 1, 2019), available at https://ag.ny.gov/press-release/2019/attorney-general-james-secures-6-million-cisco-systems-multistate-settlement; Mark Chandler, Executive Platform: A Changed Environment Requires a Changed Approach, Cisco Blogs (July 31, 2019), available at https://blogs.cisco.com/news/a-changed-environment-requires-a-changed-approach.     [216]    Press Release, N.Y. Dep’t Fin. Serv., Attorney General James Gives Update on Facebook Antitrust Investigation (Oct. 22, 2019), available at https://ag.ny.gov/press-release/2019/attorney-general-james-gives-update-facebook-antitrust-investigation.     [217]    Press Release, Office of the Attorney Gen., NJ Announces New “Cyber Savvy Youth” Initiative to Keep Kids Safe Online and Releases Annual Statistics on Cyber Breaches (Oct. 31, 2019), available at https://www.nj.gov/oag/newsreleases19/pr20191031a.html.     [218]    Press Release, N.Y. Dep’t Fin. Serv., Acting Superintendent Linda A. 

The following Gibson Dunn lawyers assisted in the preparation of this client update: Ryan Bergsieker, Alexander Southwell, Timothy Loose, Roscoe Jones Jr., Ashley Rogers, Daniel Rauch, Reuben Aguirre, Jennifer Bracht, Chris Connelly, Meghan Dunn, Sarah Erickson-Muschko, Cassandra Gaedt-Sheckter, Julie Hamilton, Doriel Jacov, Nicole Lee, Reid Rector, Jacob Rierson, Isabella Sayyah, Jeremy Smith, Danny Weiner, and Lisa Victoria Zivkovic.

Gibson Dunn's lawyers are available to address any privacy or cybersecurity concerns your business may face.  Please contact the Gibson Dunn lawyer with whom you usually work, the authors, or any member of the firm's Privacy, Cybersecurity and Consumer Protection practice group:

Privacy, Cybersecurity and Consumer Protection Group:

United States Alexander H. Southwell - Co-Chair, PCCP Practice, New York (+1 212-351-3981, asouthwell@gibsondunn.com) Debra Wong Yang - Los Angeles (+1 213-229-7472, dwongyang@gibsondunn.com) Matthew Benjamin - New York (+1 212-351-4079, mbenjamin@gibsondunn.com) Ryan T. Bergsieker - Denver (+1 303-298-5774, rbergsieker@gibsondunn.com) Howard S. Hogan - Washington, D.C. (+1 202-887-3640, hhogan@gibsondunn.com) Joshua A. Jessen - Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com) Kristin A. Linsley - San Francisco (+1 415-393-8395) H. Mark Lyon - Palo Alto (+1 650-849-5307, mlyon@gibsondunn.com) Karl G. Nelson - Dallas (+1 214-698-3203, knelson@gibsondunn.com) Deborah L. Stein (+1 213-229-7164, dstein@gibsondunn.com) Eric D. Vandevelde - Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com) Benjamin B. Wagner - Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com) Michael Li-Ming Wong - San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, mwong@gibsondunn.com)

Europe Ahmed Baladi - Co-Chair, PCCP Practice, Paris (+33 (0)1 56 43 13 00, abaladi@gibsondunn.com) James A. Cox - London (+44 (0)20 7071 4250, jacox@gibsondunn.com) Patrick Doris - London (+44 (0)20 7071 4276, pdoris@gibsondunn.com) Bernard Grinspan - Paris (+33 (0)1 56 43 13 00, bgrinspan@gibsondunn.com) Penny Madden - London (+44 (0)20 7071 4226, pmadden@gibsondunn.com) Michael Walther - Munich (+49 89 189 33-180, mwalther@gibsondunn.com) Kai Gesing - Munich (+49 89 189 33-180, kgesing@gibsondunn.com) Alejandro Guerrero - Brussels (+32 2 554 7218, aguerrero@gibsondunn.com) Vera Lukic - Paris (+33 (0)1 56 43 13 00, vlukic@gibsondunn.com) Sarah Wazen - London (+44 (0)20 7071 4203, swazen@gibsondunn.com)

Asia Kelly Austin - Hong Kong (+852 2214 3788, kaustin@gibsondunn.com) Jai S. Pathak - Singapore (+65 6507 3683, jpathak@gibsondunn.com)

Questions about SEC disclosure issues concerning data privacy and cybersecurity also may be addressed to the following practice leaders: Securities Regulation and Corporate Governance Group: Elizabeth Ising - Washington, D.C. (+1 202-955-8287, eising@gibsondunn.com) James J. Moloney - Orange County, CA (+1 949-451-4343, jmoloney@gibsondunn.com) Lori Zyskowski - New York (+1 212-351-2309, lzyskowski@gibsondunn.com) © 2020 Gibson, Dunn & Crutcher LLP Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.

January 13, 2020 |
Gibson Dunn Named a 2019 Law Firm of the Year

Law360 named Gibson Dunn a Firm of the Year for 2019 in its article, “The Firms That Dominated in 2019,” featuring the seven firms that received the most Practice Group of the Year awards.  Of the seven, Gibson Dunn is one of the two firms with the most winning Practice Groups of the Year; Law360 noted that the firm “scored big wins across several practice groups for major technology companies such as Facebook and Uber in lawsuits that tackled hot-button issues like internet privacy and the gig economy.” In announcing its Practice Groups of the Year, which “honor the law firms behind the litigation wins and major deals that resonated throughout the legal industry in the past year,” Law360 also noted that Gibson Dunn “dominated the competition again this year.” The firm was named a Practice Group of the Year in the following categories:

  • Appellate [PDF] – Gibson Dunn’s Appellate and Constitutional Law Practice Group’s lawyers participate in appeals in all 13 federal courts of appeals and state appellate courts throughout the United States and have presented arguments in front of the Supreme Court of the United States more than 100 times.
  • Class Action [PDF] – The Class Actions Practice Group has unrivaled experience in defeating enterprise-threatening class action lawsuits throughout the United States.  The group has an unparalleled record in securing early dismissal in cases where other defendants facing similar lawsuits have been forced to litigate through costly, burdensome discovery and other pretrial proceedings.
  • Cybersecurity & Privacy [PDF] – The firm’s Privacy, Cybersecurity and Consumer Protection Practice Group has a demonstrated history of helping companies successfully navigate the complex and rapidly evolving laws, regulations, and industry best practices relating to privacy, cybersecurity and consumer protection.  Our global and interdisciplinary team advises clients across a broad range of industries in high-stakes matters on the full spectrum of issues in these areas.
  • International Arbitration [PDF] – The International Arbitration Practice Group advises leading multinational corporations in arbitration proceedings around the world.  The International Arbitration group’s lawyers have appeared before many of the world’s leading arbitrators and work with all major arbitral institutions and rules.
  • Real Estate [PDF] – The Real Estate Practice Group handles the most sophisticated real estate transactions worldwide.  Our team of lawyers handles complex and challenging matters for a wide array of clients, such as the owners, developers and financiers of the largest real estate projects in the United States and Europe, both in the private and public sectors.
  • Sports & Betting [PDF] – The Sports Law Practice Group advises clients on the most complex sports industry matters, from the purchase and sale of U.S. and non-U.S. professional teams to precedent-setting litigation.  Gibson Dunn’s global sports practice represents a wide range of clients in matters relating to professional and amateur sports, including individual teams, sports facilities, athletic associations, athletes, financial institutions, television networks, sponsors and municipalities.  The Betting and Gaming Practice Group is one of the preeminent betting and gaming legal practices worldwide, representing the most prestigious and influential clients in the industry across Europe, Asia and the Americas.
  • Technology [PDF] – The Media, Entertainment and Technology Group represents both established and emerging media, entertainment and technology companies and handles our clients’ most important and complex corporate and intellectual property transactions, litigation, antitrust matters, internal investigations and other legal challenges.
  • Trials [PDF] – Acclaimed as a litigation powerhouse, Gibson Dunn has a long record of outstanding successes.  The members of our litigation practice group are not just litigators; they are first-rate trial lawyers who have tried cases and argued appeals before the U.S. Supreme Court, state supreme courts, and federal and state courts across the United States, in matters involving almost every foreseeable area of controversy.

January 10, 2020 |
2019 Year-End German Law Update

Since the end of World War II, Germany’s foreign policy and economic well-being have rested on three core pillars: (i) a strong transatlantic alliance and friendship, (ii) stable and influential international institutions and organizations (first and foremost the EU, but also others such as the UN and the WTO), and (iii) the rule of law. Each of these pillars has suffered significant cracks in recent years, requiring a fundamental re-assessment of Germany’s place in the world and of the way the world’s fourth-largest economy should deal with its friends, partners, contenders and challengers. A few recent observations highlight the urgency of the issue:

  • The transatlantic alliance and friendship has been eroding for many years. A recent Civey study conducted for the think tank Atlantik-Brücke showed that 57.6% of Germans prefer a “greater distance” from the U.S.; 84.6% of the 5,000 persons polled by Civey described the German-American relationship as negative or very negative, while only 10.4% considered the relationship positive.
  • The current state of many international institutions and organizations also requires substantial overhaul, to put it mildly: Now that Brexit has occurred, the EU will have to re-define its role for its remaining 27 member states and its (new) relationship with the UK, which is still the world’s fifth-largest economy on a stand-alone basis. The WTO’s dispute-settlement system was rendered de facto dysfunctional on December 10, 2019, when the WTO Appellate Body lost its quorum to hear new appeals; new appellate judges cannot be appointed because of the United States’ veto against their appointment. The UN is also suffering from a vacuum created by an attitude of disengagement shown by the U.S., which is now being filled by its contenders on the international stage, mainly China and Russia.
  • Finally, the concept of the rule of law has come under pressure for some years through a combination of several trends: (i) the ever-expanding body of national laws with extra-territorial effect (such as the FCPA or international sanctions regulations), a rule-making trend favored not only by the U.S. but also by China, Russia, and the EU and its member states alike; (ii) the trend, recently observed in some EU member states such as Poland and Hungary, for the political party in charge of the legislative and executive branches to initiate legislative changes designed to curtail the independence of the courts; and (iii) the rise of populist parties that have enjoyed landslide gains in many countries (including some German federal states) and promulgate simple solutions, not least by cutting corners and curtailing legal procedures and legal traditions.
These fundamental challenges arrive at the end of a period of unprecedented growth in wealth and economic success for the German economy: Germany has reaped the benefits of decades of peace and of the end of the Cold War following the collapse of the Soviet Union. It regained efficiencies through ambitious structural changes to its welfare state in the early years of the millennium, and it re-emerged as a winner from the 2008 financial crisis, benefiting (among other things) from the short-term effects of the European Central Bank’s policy of a cheap Euro, which mainly benefits the powerful German export machine (at a mid- and long-term cost to individual German savers). The robust economy that Germany enjoyed over the last decade resulted in record budgets, a reduction of public debt, a significant reduction in unemployment, and individual consumption at record levels. The prospects of successfully addressing the above challenges are therefore positive. However, unless straightforward and significant steps are identified and implemented to address the challenges ahead, the devil will be in the details. The legislative changes across the practice areas covered in this year-end update are partly encouraging and partly disappointing in this respect. It is impossible to know whether the new laws and regulations will, on balance, make Germany a stronger and more competitive economy in 2020 and beyond. Healthy professional skepticism is warranted when assessing many of the changes suggested and introduced. However, we at Gibson Dunn are determined and committed to utilizing the opportunities created by the new laws to the best benefit of our clients while, at the same time, helping them limit any resulting threats to the absolute minimum.
As in prior years, to succeed in that we will need your trust and confidence in our ability to support you in your most complicated and important business decisions and to help you form your views and strategies for dealing with sophisticated German legal issues in times of fundamental change. Your real-world questions, and the tasks related to these developments that you entrust to us, help us build our expertise and sharpen our focus. This adds the color necessary to paint an accurate picture of the multifaceted world we live in, and on that basis it will allow you to make sound business decisions in the interesting times to come. In this context, we are excited about every opportunity you give us to help shape our joint future in the years ahead. _______________________

Table of Contents      

  1. Corporate, M&A
  2. Tax
  3. Financing and Restructuring
  4. Labor and Employment
  5. Real Estate
  6. Compliance and Litigation
  7. Antitrust and Merger Control
  8. Data Protection
  9. IP & Technology
 _______________________

1.   Corporate, M&A

1.1   ARUG II – New Transparency Rules for Listed German Corporations, Institutional Investors, Asset Managers, and Proxy Advisors In November 2019, the German parliament passed ARUG II, a long awaited piece of legislation implementing the revised European Shareholders’ Rights Directive (Directive (EU) 2017/828). ARUG II is primarily aimed at listed German companies and provides changes with respect to “say on pay” provisions, as well as additional approval and disclosure requirements for related party transactions, the transmission of information between a corporation and its shareholders and additional transparency and reporting requirements for institutional investors, asset managers and proxy advisors. “Say on pay” on remuneration of board members; remuneration policy and remuneration report In a German stock corporation, shareholders determine the remuneration of the supervisory board members at a shareholder meeting, whereas the remuneration of the management board members is decided by the supervisory board. Under ARUG II, shareholders of German listed companies must be asked to vote on the remuneration of the board members pursuant to a prescribed procedure. First, the supervisory board will have to prepare a detailed remuneration policy (including maximum remuneration amounts) for the management board, which must be submitted to the shareholders if there are major changes to the remuneration, and in any event at least once every four years. The result of the vote on the policy will only be advisory except that the shareholders’ vote to reduce the maximum remuneration amount will be binding. With respect to the remuneration of supervisory board members, the new rules require a shareholder vote at least once every four years. 
Second, at the annual shareholders’ meeting, the shareholders will vote ex post on the remuneration report, which sets out the remuneration granted to the present and former members of the management board and the supervisory board in the previous financial year. Again, however, the shareholders’ vote will only be advisory. Both the remuneration report and the remuneration policy have to be made public on the company’s website for at least ten years. The changes introduced by ARUG II will not apply retroactively and will therefore not affect management board members’ existing service agreements, i.e. such agreements will not have to be amended if they do not comply with the new remuneration policy.

Related party transactions

German stock corporation law already provides for various safeguards to protect minority shareholders in transactions with major shareholders or other related parties (e.g. the capital maintenance rules and the laws relating to groups of companies). In the future, for listed companies, these mechanisms will be supplemented by a detailed set of approval and transparency requirements for related party transactions. In particular, transactions exceeding certain thresholds will require prior supervisory board approval, although a rejection by the supervisory board can be overruled by shareholder vote, and a listed company must publicly disclose any such material related party transaction without undue delay via media channels providing for Europe-wide distribution.

Communication / Know-your-Shareholder

Listed corporations will have the right to request information on the identity of their shareholders, including the name and both a postal and an electronic address, from depositary banks, thus allowing for a direct line of communication, also with respect to bearer shares (“know-your-shareholder”).
Furthermore, depositary banks and other intermediaries will be required to pass on important information from the corporation to the shareholders and vice versa, e.g. with respect to voting in shareholders’ meetings and the exercise of subscription rights. Where there is more than one intermediary in a chain, the intermediaries are required to pass on the respective information within the chain.

Increased transparency requirements for institutional investors, asset managers and proxy advisors

Institutional investors and asset managers will be required to disclose their engagement policy (including how they monitor, influence and communicate with investee companies, exercise shareholders’ rights and manage actual and potential conflicts of interest). They will also have to report annually on the implementation of their engagement policy and on their voting decisions. Institutional investors will also have to disclose to what extent the key elements of their investment strategy match the profile and duration of their liabilities towards their ultimate beneficiaries. If they engage asset managers, institutional investors also have to disclose the main aspects of their arrangements with them. The new disclosure and reporting requirements, however, only apply on a “comply or explain” basis, i.e. investors and asset managers may choose not to comply with the transparency requirements provided that they explain why this is the case. Proxy advisors will have to disclose publicly, on an annual basis, whether and how they have applied their code of conduct, again based on the “comply or explain” principle.
They also have to provide information on the essential features, methodologies and models they apply, their main information sources, the qualification of their staff, their voting policies for the different markets they operate in, their interaction with companies and stakeholders, and how they manage conflicts of interest. These rules, however, do not apply to proxy advisors operating from a non-EEA state with no establishment in Germany.

Entry into force and transitional provisions

The provisions concerning related party transactions already apply. The rules relating to communications via intermediaries and know-your-shareholder information will apply from September 3, 2020. The mandatory “say on pay” resolutions will only have to be passed in shareholder meetings starting in 2021, and the remuneration report will have to be prepared for the first time for the financial year 2021. It remains to be seen whether companies will adhere to the new rules before those dates on a voluntary basis, following requests from their shareholders or pressure from proxy advisors. In any event, both listed companies and the other addressees of the new transparency rules should make sure that they are prepared for the new reporting and disclosure requirements.

Back to Top

1.2   Restatement of the German Corporate Governance Code – New Stipulations for the Members of the Supervisory Board and the Remuneration of the Members of the Board of Management

A restatement of the German Corporate Governance Code (Deutscher Corporate Governance Kodex, “DCGK” or the “Code”) is expected for the beginning of 2020, after the provisions of the EU Shareholder Rights Directive II (Directive (EU) 2017/828 of the European Parliament and of the Council of May 17, 2017 amending Directive 2007/36/EC as regards the encouragement of long-term shareholder engagement) were implemented into German domestic law as part of the “ARUG II” reform as of January 1, 2020. This timeline seeks to avoid overlaps and potentially conflicting provisions between ARUG II and the Code. In addition to structural changes, which are designed to improve legal clarity compared to the previous 2017 version, the new Code contains a number of substantial changes which affect boards of management and supervisory boards in an effort to provide more transparency to investors and other stakeholders. Some of the key modifications can be summed up as follows:
(a)   Firstly, restrictions on holding multiple corporate positions are tightened considerably. The new DCGK will recommend that (i) supervisory board members should hold no more than five supervisory board mandates at listed companies outside their own group, with the position of supervisory board chairman being counted double, and (ii) members of the board of management of a listed company should not hold more than two supervisory board mandates or comparable functions, nor chair the supervisory board of a listed company outside their own group.

(b)   A second focal point is the independence of shareholder representatives on the supervisory board. In this context, the amended DCGK for the first time introduces certain criteria which can indicate a lack of independence of supervisory board members, such as long office tenure, prior management board membership, family or close business relationships with board members and the like. However, the Government Commission DCGK (Regierungskommission Deutscher Corporate Governance Kodex) (the “Commission”) has pointed out that these criteria should not replace the need to assess each case individually. Furthermore, at least 50% of all shareholder representatives (including the chairperson) shall be independent. If there is a controlling shareholder, at least two members of the supervisory board shall be independent of such controlling shareholder (assuming a supervisory board of six members).

(c)   A third key area of reform focuses on the remuneration of members of the board of management. Going forward, it is recommended that companies determine a so-called “target total remuneration”, i.e. the amount of remuneration that is paid out in total if 100 percent of all previously determined targets have been achieved, as well as a “maximum compensation cap”, which should not be exceeded even if the previously determined targets are exceeded. Under the new Code, the total remuneration of the management board should be “explainable to the public”.

(d)   Finally, the Commission has decided to simplify corporate governance reporting and put an end to the parallel existence of (i) the corporate governance report under the Code and (ii) a separate corporate governance statement contained in the management report of the annual accounts. Going forward, the corporate governance statement in the annual financial statements will be the core instrument of corporate governance reporting.
In recent years, governance topics have assumed ever increasing importance for both domestic and foreign investors and are typically a matter of great interest at annual shareholders’ meetings. Hence, we recommend that (listed) stock corporations, as a first step, familiarize themselves with the content of the new recommendations in the Code and, thereafter, take the necessary measures to comply with the rules of the revised DCGK once it takes effect. In particular, stock corporations should evaluate and disclose the different mandates of their current supervisory board members to comply with the new rules.

Back to Top

1.3   Cross-Border Mobility of European Corporations Facilitated

On January 1, 2020, the European Union Directive on cross-border conversions, mergers and divisions (Directive (EU) 2019/2121 of the European Parliament and of the Council of November 27, 2019) (the “Directive”) entered into force. While a legal framework for cross-border mergers had already been implemented by the European Union in 2005, the lack of a comparable set of rules for cross-border conversions and divisions had led to fragmentation and considerable legal uncertainty. Whenever companies, for example, attempted to move from one member state to another without undergoing national formation procedures in the new member state and liquidation procedures in the old one, they could only rely on certain individual rulings of the European Court of Justice (ECJ). Cross-border asset transfers by (partial) universal legal succession ((partielle) Gesamtrechtsnachfolge) were virtually impossible due to the lack of an appropriate legal regime. The Directive now seeks to create an EU-wide legal framework which ultimately enhances the fundamental principle of freedom of establishment (Niederlassungsfreiheit). The Directive in particular covers the following cross-border measures:
  • The conversion of the legal structure of a corporation under the regime of one member state into a legal structure of the destination member state (grenzüberschreitende Umwandlung), as well as the transfer of the registered office from one member state to another member state (isolierte Satzungssitzverlegung);
  • Cross-border division whereby certain assets and liabilities of a company are transferred by universal legal succession to one or more entities in another member state which are to be newly established in the course of the division. If all assets and liabilities are transferred, at least two new transferee companies are required and the transferor company ceases to exist upon effectiveness of the division. In all cases, the division is made in exchange for shares or other interests in the transferor company, the transferee company or their respective shareholders, depending on the circumstances.
  • The Directive further amends the existing legal framework for cross-border merger procedures by introducing common rules for the protection of creditors, dissenting minority shareholders and employees.
  • Finally, the Directive provides for an anti-abuse control procedure enabling national authorities to check and ultimately block a cross-border measure when it is carried out for abusive or fraudulent reasons or in circumvention of national or EU legislation.
Surprisingly, however, the Directive does not cover the cross-border transfer of assets and liabilities to one or more companies already existing in another member state (Spaltung durch Aufnahme). In addition, the Directive only applies to corporations (Kapitalgesellschaften), not to partnerships (Personengesellschaften). Member states have until January 2023 to implement the Directive into domestic law. By providing a common legal framework for corporate restructuring measures, the Directive is expected to harmonize the interaction between national procedures. If the member states do not use the contemplated national anti-abuse control procedure excessively, the Directive can considerably facilitate cross-border activities. Forward-looking member states may even consider implementing comparable regimes for divisions into existing legal entities, which are currently beyond the scope of the Directive.

Back to Top

1.4   Transparency Register: Reporting Obligations Tightened and Extended to Certain Foreign Entities

The Act implementing the 5th EU Anti-Money Laundering Directive (Directive (EU) 2018/843), which amended the German Anti-Money Laundering Act (Geldwäschegesetz, GwG) with effect as of January 1, 2020 (see below under section 6.2), also introduced considerable new reporting obligations to the transparency register (Transparenzregister), which seeks to identify the “ultimate beneficial owner”. Starting on January 1, 2020, not only associations incorporated under German private law, but also foreign associations and trustees that have a special link to Germany must report certain information on their “beneficial owners” to the German transparency register. Such a link exists if a foreign association acquires real property in Germany. Non-compliance is not only an administrative offense (with potential fines of up to EUR 150,000); the German notary recording a real estate transaction must now actively check that the reporting obligation has been fulfilled before notarizing the transaction and must refuse notarization if it has not. Foreign trustees must in addition report the beneficial owners of the trust if the trust acquires domestic real property or if a contractual partner of the trust is domiciled in Germany. Reporting by a foreign association or trustee to the German transparency register is, however, not required if the relevant information on the beneficial owners has already been filed with a register of another EU member state; additional requirements apply to foreign trustees. In addition, the reporting obligations of beneficial owners, irrespective of their place of residence, towards a German or, as the case may be, foreign association regarding their interest have been clarified and extended. Associations concerned must now also actively make inquiries with their direct shareholders regarding any beneficial owners and must keep adequate records of these inquiries.
Shareholders must respond to such inquiries within a reasonable time period and must also pro-actively notify the association if they become aware that the beneficial owner has changed; any such notification must be duly recorded. Furthermore, persons or entities subject to the GwG obligations (“Obliged Persons”) that inspect the transparency register to fulfil their customer due diligence requirements (e.g. financial institutions and estate agents) must now notify the transparency register without undue delay of any discrepancies between the register’s entries on beneficial ownership and other information and findings available to them. Finally, the transparency register is now also accessible to the general public, without proof of a legitimate interest, with regard to certain information about the beneficial owner (the full legal name, the month and year of birth, nationality and country of residence, as well as the type and extent of the economic interest of the beneficial owner). As in the past, however, the registry may restrict inspection of the transparency register, upon request of the beneficial owner, if there are overriding interests worthy of protection. In return, starting on July 1, 2020, beneficial owners may request information on inspections made by the general public (in contrast to inspections made by public authorities or Obliged Persons such as, e.g., financial institutions, auditing firms, or tax consultants and lawyers). Although the reporting obligations to the transparency register were initially introduced more than 2.5 years ago, compliance with these obligations still seems to be lacking in practice.
Therefore, any group with entities incorporated in Germany, any foreign association intending to acquire German real estate and any individual qualifying as a beneficial owner of a domestic or foreign association should check whether new or outstanding inquiry, record-keeping or reporting obligations arise for them and take the required steps to ensure compliance. In this context, we note that for some time now the competent administrative enforcement authority (Bundesverwaltungsamt) has increased its efforts to enforce the transparency obligations, including by imposing fines on associations that have failed to make required filings. It is to be expected that the authority will further tighten the reins on the basis of this reform.

Back to Top

1.5   UK LLPs with Management Seat in Germany – Status after Brexit?

As things stand at present, the British government is pushing to enact its Withdrawal Agreement Bill (the “WAB”) to ensure that it can take the UK out of the EU on January 31, 2020. Pursuant to the WAB, such withdrawal from the EU is not intended to result in a so-called “Hard Brexit”, as the WAB introduces a transition period until December 31, 2020 during which the European fundamental freedoms, including the freedom of establishment, would continue to apply. Freedom of establishment has, over the last decade in particular, resulted in German law recognizing that UK (and other EU) companies can have their effective seat of management (Verwaltungssitz) in Germany rather than in their home jurisdiction. Until the end of the transition period, UK company structures such as the UK Plc, Ltd. or LLP will continue to benefit from such recognition. But what happens thereafter if the EU and the UK (or, alternatively, Germany and the UK) do not succeed in negotiating specific provisions for the continued recognition of UK companies in the EU or Germany, respectively? From a traditional German legal perspective, such companies will lose their legal capacity as UK companies in Germany after the transition period, because German courts traditionally follow the real or effective seat theory (Sitztheorie) and thus apply German corporate law to the companies in question, rather than the incorporation theory (Gründungstheorie), which would lead to the application of English law. There would be a real risk that UK companies that have their effective management seat in Germany would have to be reclassified as one of the closed list (numerus clausus) of German company structures. For some structures, such as the LLP, German law has no equivalent, and reclassifying an LLP as a German limited partnership would not work either in most cases due to the lack of registration in the German commercial register.
In short, the only available classification for a UK multi-member LLP under German law may be a German civil law partnership (GbR) or, in certain cases, a German commercial partnership (OHG), with all the legal consequences that flow from such structures, including, in particular, unlimited member liability. The discussion on how to resolve this issue in Germany has focused on a type of German partnership with limited liability (Partnerschaftsgesellschaft mit beschränkter Haftung, PartGmbB), which, however, has only limited scope: a PartGmbB is only open to members of the so-called liberal or free professions, such as attorneys or architects. In addition, the limitation of liability in a PartGmbB applies only to liability for professional negligence and risks associated with the profession, and would thus not benefit members generally. Unless UK companies with an effective seat of management in Germany are prepared to risk relying on the status quo, affected companies should, in the event there is no new framework for recognition after the transition period, either change their seat of management to the UK (or any other EU jurisdiction that applies the incorporation theory) and establish a German branch office or, alternatively, consider forming a suitable German corporate structure before the end of the transition period at the end of December 2020.

Back to Top

1.6   The ECJ on Corporate Agreements and the Rome I Regulation

In its decision C-272/18 of October 3, 2019, the European Court of Justice (ECJ) further clarified the boundary between the scope of the Rome I Regulation (Regulation (EC) No 593/2008 of the European Parliament and of the Council of June 17, 2008 on the law applicable to contractual obligations) (the “Rome I Regulation”) on the one hand, and international company law, which is excluded from the scope of the Rome I Regulation, on the other hand. The need for clarification resulted from Art. 1 para. 2 lit. f of the Rome I Regulation, pursuant to which “questions governed by the law of companies and other bodies, corporate or unincorporated, such as the creation, by registration or otherwise, legal capacity, internal organization or winding-up of companies and other bodies […]” are excluded from the scope of the Rome I Regulation. The ECJ, as the highest authority on the interpretation of the Regulation, held that the “corporate law exception” does not apply to contracts whose sole object is shares. According to the explicit statement of Advocate General Saugmandsgaard Øe, this also includes share purchase agreements, which are thus within the scope of the Rome I Regulation. The exception from the scope of the Rome I Regulation is therefore much narrower than some legal commentators had interpreted it in the past. The case concerned a lawsuit brought by an Austrian consumer protection organization (“VKI”) against a German trustee company for public funds (“TVP”), and more particularly, trust arrangements for limited (partnership) interests in funds designed as public limited partnerships. The referring Austrian High Court had to rule on the validity of a choice of law clause in trust agreements concerning German limited partnership interests, concluded between TVP, as trustee over the investors’ partnership interests, and Austrian investors qualifying as consumers, as trustors. This clause provided for the application of German substantive law only.
VKI claimed that this clause was not legally effective and binding under Austrian substantive law because, pursuant to the Rome I Regulation, a contract concluded by a consumer with another person acting in the exercise of his or her trade or profession is governed by the law of the country of the consumer’s habitual residence (in this case Austria) or, where the parties have chosen the applicable law, such choice may at least not deprive the consumer of the protection afforded by the law of his or her country of residence. In VKI’s view, the contractual choice of German law could not therefore deprive Austrian investors of rights guaranteed by Austrian consumer protection laws. TVP, on the other hand, argued that the Rome I Regulation was not applicable at all, as the contract in question was an agreement related to partnership interests and thus to corporate law, which is excluded from the scope of the Rome I Regulation. The ECJ ruled that the corporate law exclusion from the scope of the Rome I Regulation is limited to the organizational aspects of companies, such as their incorporation or internal statutes. A mere connection to corporate law, in turn, was ruled insufficient to fall within the exclusion. Sale and purchase agreements in M&A transactions, or, as in the matter at hand, trust arrangements, are therefore covered by the Rome I Regulation. The decision establishes that the choice-of-law principle of the Rome I Regulation, subject to the restrictions imposed by the Regulation itself for particular groups such as consumers and employees, applies to corporate-law-related contracts in more cases than previously assumed.

Back to Top

1.7   German Foreign Direct Investment – Further Rule-Tightening Announced for 2020

Restrictions on foreign investment are increasingly becoming a perennial topic. After the tightening of the rules on foreign direct investment in 2017 (see 2017 Year-End German Law Update under 1.5) and the expansion of the scope for scrutiny of foreign direct investments in 2018 (see 2018 Year-End German Law Update under 1.3), the German Ministry of Economy and Energy (Bundesministerium für Wirtschaft und Energie) in November 2019 announced further plans to tighten the rules for foreign direct investments in Germany in its policy guideline on Germany’s industrial strategy 2030 (Industriestrategie 2030 – Leitlinien für eine deutsche und europäische Industriepolitik). The envisaged amendments to the German Foreign Trade and Payments Ordinance (Außenwirtschaftsverordnung, AWV) relate to the following three key pillars: Firstly, by October 2020, the German rules shall be adapted to reflect the new EU framework (the so-called EU Screening Regulation of March 19, 2019). This would be achieved, inter alia, by implementing a cooperation mechanism integrating the other EU member states as well as the EU Commission into the review process. Further, the criteria for public order or security (öffentliche Ordnung oder Sicherheit) relevant to the application of foreign trade law are expected to be revised and likely expanded to cover further industry sectors such as artificial intelligence, robotics, semiconductors, biotechnology and quantum technology. The threshold for prohibiting a takeover may be lowered to cover not only a “threat” to but also a “foreseeable impairment” of public order or security (as contemplated in the EU regulation).
Secondly, if the rules on foreign direct investments cannot be relied on to block an intended acquisition, but the acquisition nonetheless affects sensitive or security-related technology, another company from the German private sector may acquire a stake in the relevant target as a so-called “White Knight” in a process moderated by the government. Thirdly, as a last resort, the strategy paper proposes a “national fallback option” (Nationale Rückgriffsoption) under which the German state-owned Kreditanstalt für Wiederaufbau could acquire a stake in enterprises active in sensitive or security-related technology sectors for a limited period of time. Even though the details of the implementation of these proposals are not yet clear, the trend towards more protectionism continues. For non-EU investors, a potential review pursuant to the rules on foreign direct investment will increasingly become the rule rather than the exception and should thus be taken into account when planning and structuring M&A transactions.

Back to Top

2.   Tax - German Federal Government Implements EU Mandatory Disclosure Rules

On December 12, 2019 and December 20, 2019, respectively, the two chambers of the German Federal Parliament passed the Law for the Introduction of an Obligation to Report Cross-Border Tax Arrangements (the “Law”), which implements Council Directive (EU) 2018/822 (referred to as “DAC 6”) into Germany’s domestic law with effect as of July 1, 2020. DAC 6 entered into force on June 25, 2018 and requires so-called intermediaries, and in some cases taxpayers, to report cross-border arrangements that contain defined characteristics to their national tax authorities within specified time limits. The stated aim of DAC 6 is to provide tax authorities with an early-warning mechanism for new risks of tax avoidance. The Law follows the same approach as DAC 6. The reporting obligation applies to “cross-border tax arrangements” in the field of direct taxes (e.g. income taxes, but not VAT), i.e. arrangements concerning at least two member states or a member state and a non-EU country. Purely national German arrangements are, contrary to previous drafts of the Law, not subject to reporting.
(a)   Reportable cross-border arrangements must have one or more specified characteristics (“hallmarks”). The hallmarks are broadly scoped and represent certain typical features of tax planning arrangements which potentially indicate tax avoidance or tax abuse.

(i)    Some of these hallmarks result in reportable transactions only if the “main benefit test” is satisfied. The test is satisfied if it can be established that the main benefit a person may reasonably expect to derive from an arrangement is obtaining a tax advantage in Germany or in another member state. Hallmarks in this category are, inter alia, the use of substantially standardized documentation or structures, the conversion of income into lower-taxed categories of revenue, or payments to an associated enterprise that are tax exempt or benefit from a preferential tax regime or arrangement.

(ii)   In addition, there are hallmarks that result in reportable transactions regardless of whether the main benefit test is satisfied. Hallmarks in this category are, for example, assets that are subject to depreciation in more than one jurisdiction, relief from double taxation that is claimed more than once, arrangements that involve hard-to-value intangibles, or specific transfer pricing arrangements.

(b)   The primary obligation to disclose information to the tax authorities rests with the intermediary. An intermediary is defined as “any person that promotes, designs for a third party, organizes, makes available for implementation or manages the implementation of a reportable cross-border arrangement.” Such intermediary must be resident in the EU or provide its services through a branch in the EU. Typical intermediaries are tax advisors, accountants, lawyers, financial advisors, banks and consultants. When multiple intermediaries are engaged in a cross-border arrangement, the reporting obligation lies with all intermediaries involved in the same arrangement. However, an intermediary can be exempt from reporting if it can prove that a report of the arrangement has been filed by another intermediary. If an intermediary is barred by legal professional privilege from reporting information, the intermediary must inform the relevant taxpayer of the possibility of waiving the privilege. If the relevant taxpayer does not grant the waiver, the responsibility for reporting shifts to the taxpayer. Other scenarios in which the reporting obligation shifts to the taxpayer are in-house schemes without the involvement of intermediaries or the use of intermediaries from countries outside the EU.

(c)   Reporting to the tax office is required within 30 days after the arrangement is made available for implementation or after the first step has been implemented. The report must contain the applicable hallmark, a summary of the cross-border arrangement including its value, the applicable tax provisions and certain information regarding the intermediary and the taxpayer. The information will be automatically exchanged by the competent authorities of the EU member states through a central directory on administrative cooperation in the field of direct taxation.

(d)   The reporting obligations commence on July 1, 2020. However, the Law also has retroactive effect: for all reportable arrangements implemented in the interim period between June 25, 2018 and June 30, 2020, the report has to be filed by August 31, 2020. Penalties for noncompliance with the reporting obligations are up to EUR 25,000, although no penalties apply to noncompliance with respect to reportable arrangements for the interim period between June 25, 2018 and June 30, 2020.
Since, as noted above, the reporting obligation can shift to the client as the taxpayer, who is then responsible for complying with it, taxpayers should consider establishing a suitable reporting compliance process. Such a process may encompass raising awareness of and identifying reportable transactions, allocating responsibilities, developing a DAC 6 governance framework and a corresponding IT system, recording arrangements from the transitional period after June 24, 2018, robust testing and training, and live operations including the analysis and reporting of potentially reportable arrangements.


3.   Financing and Restructuring

3.1   EU Directive on Preventive Restructuring Framework – Minimum Standards Across Europe?

On June 26, 2019, the European Union published Directive 2019/1023 on a preventive restructuring framework (Directive (EU) 2019/1023 of the European Parliament and of the Council of June 20, 2019) (the “Directive”). The Directive aims to introduce standards for “honest entrepreneurs” in financial difficulties, providing businesses with a “second chance” in all EU member states. While some member states had already introduced preventive restructuring schemes in the past (e.g. the UK scheme of arrangement), others, like Germany, remained inactive, leaving debtors with the largely creditor-focused and more traditional tools set forth in the German Insolvency Code (Insolvenzordnung, InsO). By contrast, the Directive seeks to protect workers and creditors alike in “a balanced manner”. In addition, a particular focus of the Directive is on small and medium-sized enterprises, which often do not have the resources to make use of already existing restructuring alternatives abroad. The key features of the Directive include, in particular:
  • The preventive restructuring regime shall be available upon application of the debtor. Creditors and employee representatives may file an application, but generally the consent of the debtor shall be required in addition;
  • Member states are required to implement early warning tools and to facilitate access to information enabling debtors to properly assess their financial situation early on and detect circumstances which may ultimately lead to insolvency;
  • Preventive restructuring mechanisms must be set forth in domestic law for cases of “likely insolvency”. Debtors must be given the possibility to remain in control of their business operations while restructuring measures are implemented to avoid formal insolvency proceedings. In Germany, it will be a challenge to properly distinguish between the newly introduced European concept of “likely insolvency”, which opens the door to preventive restructuring under the Directive, and the existing German legal concept of “imminent illiquidity” (drohende Zahlungsunfähigkeit), which under current insolvency law enables German debtors to proceed with a voluntary insolvency filing;
  • A stay of individual enforcement measures for an initial period of four months (with an extension option of up to a maximum of 12 months) must be provided for, thus putting debtors in a position to negotiate a restructuring plan. During this time period, the performance of executory contracts cannot be withheld solely due to non-payment;
  • Minimum requirements for a restructuring plan include an outline of the contemplated restructuring measures, effects on the workforce, as well as the prospects that insolvency can be prevented on the basis of such measures;
  • Restructuring measures contemplated by the Directive are wide ranging and include a change in the composition of a debtor’s assets and liabilities, a sale of assets or of the business as a going concern, as well as necessary operational changes;
  • Voting on the restructuring plan is generally effected by separate classes of creditors, with the required majority in each class not exceeding 75%;
  • Cross-class cram down will be available subject to certain conditions, including that (i) a majority of creditor classes (including secured creditors) voted in favor and (ii) dissenting creditors are treated at least as favorably as creditors of equal rank (and better than creditors ranking junior). In addition, the restructuring plan must be approved by a judicial or administrative authority in order to be binding on dissenting voting classes. Such approval is also required in the event of new financing or when the workforce is reduced by more than 25%.
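The voting and cram-down conditions listed above can be sketched as a simple check. This is a deliberately simplified illustration of the Directive's floor: member states may set lower in-class majorities, and the actual cram-down test (e.g. the role of secured or senior classes and best-interests protections) is more nuanced than the boolean inputs assumed here:

```python
# Simplified sketch of the plan-approval logic outlined above.
# Member states implement the details differently; illustration only.

def class_accepts(votes_for, votes_total, majority=0.75):
    """A creditor class accepts the plan if the required majority of
    voting claims is in favor (the Directive caps the requirement
    member states may impose at 75%)."""
    return votes_for / votes_total >= majority

def cram_down_available(class_results, dissenters_treated_fairly,
                        authority_approved):
    """Simplified cross-class cram-down check: a majority of creditor
    classes accepted the plan, dissenting classes are treated at least
    as favorably as equal-ranking creditors (passed in here as a flag),
    and a judicial or administrative authority approved the plan."""
    accepted = sum(class_results)
    return (accepted > len(class_results) / 2
            and dissenters_treated_fairly
            and authority_approved)

# Example: two of three classes accept; fair treatment and court
# approval are assumed, so the plan can bind the dissenting class.
plan_binding = cram_down_available([True, True, False], True, True)
```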
Member states have until July 17, 2021 to implement the Directive into domestic law (subject to a possible extension of up to one year), but considering the multiple alternative options the Directive leaves to member states, discussions on how to best align existing domestic laws with the requirements of the Directive have already started. Ultimately, the success of the Directive depends on the willingness of the member states to implement a truly effective pre-insolvency framework. The inbuilt flexibility and variety of structuring alternatives left to the member states can be an opportunity for Germany to finally enact an out-of-court restructuring scheme beyond the existing debtor in possession (Eigenverwaltung) or protective shield (Schutzschirm) proceedings which, however, currently kick in only at a later stage of financial distress after an insolvency filing has already been made.


3.2   Insolvency Contestation in Cash Pool Scenarios

One of the noticeable developments in 2019 was that inter-company cash-pool systems increasingly came under close scrutiny in insolvency scenarios. The German Federal Supreme Court (Bundesgerichtshof, BGH) handed down several decisions, the most notable probably a judgment of June 27, 2019 (case IX ZR 167/18). In that double insolvency case, the respective insolvency administrators of an insolvent group company and its insolvent parent and cash pool leader were fighting over the treatment of mutually granted upstream and downstream loans during the operation of a group-wide cash management system, which saw multiple loan movements between the two insolvent debtors during the relevant pre-insolvency period. Under applicable German insolvency contestation law (Insolvenzanfechtung), the insolvency administrator of the insolvent subsidiary may contest any shareholder loan repayments or equivalent payments made to the parent as shareholder and pool leader within one year prior to the filing of the insolvency petition. The rationale of this rule is to protect the insolvent estate and regular unsecured trade creditors from pre-insolvency payments to shareholders, who in an insolvency would rank only as subordinated creditors. The contestation right – if successful – allows the insolvency administrator to claw back such earlier repayments from shareholders to boost the funds available for distribution in the insolvency proceedings.
In cases such as the one at hand, where the cash pool was operated as a current account system resulting in multiple cash payments to and from the pool leader, the parent’s potential exposure could have grown exponentially if the insolvency administrator of the subsidiary had simply been able to add up all loan repayments made within the last year, irrespective of the fact that the pool leader, in turn, regularly granted new downstream loans to the subsidiary as and when liquidity was needed. In one of the main conclusions of the judgment, the BGH confirmed the calculation mechanism for the maximum amount that can be contested and clawed back in such scenarios: the court does not simply add up all loan repayments made in the last year. Instead, it uses the historic maximum amount of the loans permanently repaid within the one-year contestation period as the initial benchmark and then deducts the amounts still owed by the insolvent subsidiary at the end of the contestation period. Interim fluctuations, where further repayments to the pool leader occurred, are deemed immaterial if they were re-validated by new subsequent downstream loans. Consequently, the court limits the exposure of the pool leader in current account situations to the balance of the loans, not the simple sum of all repayments. In a second clarification, the BGH held that customary, arm’s-length interest charged by the pool leader for its downstream loans and paid by the insolvent subsidiary to the shareholder as pool leader does not qualify as a “payment equivalent to a loan repayment”, because interest is independent compensation for the downstream loan, not capital transferred to the lender for temporary use.
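The difference between the rejected addition of all repayments and the BGH's balance-based cap can be illustrated numerically. The sketch below simplifies the benchmark to the difference between the peak loan balance during the contestation period and the closing balance; the balance history is hypothetical, and the actual assessment of which repayments were "permanently" made and re-validated turns on the facts of each case:

```python
# Illustration of the BGH's balance approach (case IX ZR 167/18) versus
# a naive summation of all repayments in a current-account cash pool.
# Balances are the subsidiary's hypothetical outstanding loan owed to the
# pool leader at successive points of the one-year contestation period.

def max_contestable(balances):
    """Simplified upper bound of the claw-back under the balance
    approach: peak outstanding balance during the period minus the
    balance still outstanding at the end. Interim repayments that
    were re-validated by new downstream loans net out in this view."""
    return max(balances) - balances[-1]

def naive_sum_of_repayments(balances):
    """The rejected approach: add up every individual repayment
    (i.e. every reduction of the balance), ignoring new loans."""
    return sum(max(prev - cur, 0)
               for prev, cur in zip(balances, balances[1:]))

# Hypothetical balance history in EUR million over the period:
history = [10, 6, 9, 4, 8, 5]
capped = max_contestable(history)        # 10 - 5 = 5
naive = naive_sum_of_repayments(history) # 4 + 5 + 3 = 12
```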
Beyond the specifics of the decision, the increased focus of the courts on cash pools in crisis situations should cause larger groups of companies that operate such group-wide cash management systems to revisit the underlying contractual arrangements to ensure that participating companies and the pool leader have adequate mutual early warning systems in place, as well as robust remedies and/or withdrawal rights to react as early as possible to the deterioration of the financial position of one or several cash pool participants. Even though the duration of the one-year contestation period will often mean that even carefully and appropriately drafted cash pooling documentation cannot always preempt or avoid all risk in a later financial crisis, at least, the potential personal liability risks for management which go beyond the mere contestation risk can be mitigated and addressed this way.


4.   Labor and Employment

4.1   De-Facto Employment – A Rising Risk for Companies

A widely noticed decision by the Federal Social Court (Bundessozialgericht) (judgment of June 4, 2019 – B 12 R 11/18 R) on the requalification of freelancers as de-facto employees has potentially increased the risks for companies that engage freelancers. In this decision, the court requalified physicians officially working as “fee doctors” in hospitals as de-facto employees because they were considered integrated into the hospital hierarchy, especially due to receiving instructions from other doctors and the hospital management. While the decision concerned physicians, it attracted wide interest in the general HR community, as it tightened the leeway for engaging freelancers. This aspect is particularly important for companies in Germany given the war for talent, particularly for engineers and IT personnel. These urgently sought-after experts are in high demand and therefore often able to dictate the terms of their contractual relationships. They frequently prefer a freelancer relationship, as it is more profitable for them and allows them to work for other (even competing) companies as well. Against the background of this decision, every company would be well advised to review very thoroughly whether a “freelancer” is really free of instructions regarding the place of work, the working hours and the details of the work to be done. Otherwise, the potential liability for the company – both civil and criminal – is considerable if freelancers are deemed de-facto employees.


4.2   New Constraints for Post-Contractual Non-Compete Covenants

A recently published decision by the Higher Regional Court (Oberlandesgericht) of Munich has restricted the permissible scope of post-contractual non-compete covenants for managing directors (decision of August 2, 2018 – 7 U 2107/18). The court held that such restrictions are only valid if and to the extent they are based upon a legitimate interest of the company. In addition, their scope must be explicitly limited, with wording tailored to the individual case. This decision is important because, unlike for “regular” employees, post-contractual non-compete agreements for managing directors are not regulated by statutory law. Therefore, every company should, as a first step, carefully review whether a post-contractual non-compete is really necessary for the relevant managing director. If it is deemed indispensable, the wording should be carefully drafted in accordance with the above-mentioned principles.


4.3   ECJ Judgments on Vacation and Working Hours

The European Court of Justice (ECJ) has handed down two employee-friendly decisions regarding (a) the forfeiture of vacation entitlement and (b) the recording of working hours (case C-684/16, judgment of November 6, 2018, and case C-55/18, judgment of May 14, 2019). According to the first decision, an employee's vacation entitlement cannot simply be forfeited through the lapse of time, even if such forfeiture is stipulated by national statutory law. Rather, the employer has an obligation to actively notify employees of their outstanding vacation entitlement and encourage them to take their remaining vacation. In the second decision, the ECJ required companies to establish a system to record and document all working hours of their employees, not only those exceeding a certain threshold. In practice, not all companies in Germany currently have such seamless time-recording and documentation systems in place. However, until the ECJ judgment is implemented into German statutory law, companies cannot be fined solely on the basis of the judgment. A legislative response to this issue and the court decision therefore remains to be seen.


5.         Real Estate

5.1   Real Estate – Rent Price Cap concerning Residential Space in Berlin

On November 26, 2019, the Berlin Senate (the government of the federal state of Berlin) passed a draft bill for the “Act on Limiting Rents on Berlin’s Residential Market” (Gesetz zur Mietenbegrenzung im Wohnungswesen in Berlin), the so-called Berlin rent price cap (Mietendeckel). The bill is expected to be adopted by the Berlin House of Representatives (the legislative chamber of the federal state of Berlin) and to come into force in early 2020, with certain provisions having retroactive effect as of June 18, 2019. The bill applies to residential premises in Berlin (with a few exceptions) that were ready for occupancy for the first time before January 1, 2014. Its three key instruments are (a) a rent freeze, (b) the implementation of rent caps and (c) a limit on the modernization costs that can be passed on to tenants.
(a)   The rent freeze applies to all existing residential leases and freezes the rent at its level on June 18, 2019 (or, if the premises were vacant on that date, at the last rent before that date). The rent freeze also applies to indexed and stepped rents. As of 2022, landlords will be entitled to request an annual inflation-related rent adjustment, capped at 1.3% p.a. Prior to entering into a new residential lease agreement, the landlord must inform the future tenant of the relevant rent as of June 18, 2019 (or earlier, as applicable).

(b)   Depending on the construction year and fit-out standard (with / without collective heating / bathroom), initial monthly base rent caps between EUR 3.92 and EUR 9.80 per square meter (m²) apply. These caps are increased by 10% for buildings with up to two apartments. A further increase of EUR 1 per m² applies to apartments with “modern equipment”, i.e. apartments with at least three of the following five features: (i) barrier-free access to a lift, (ii) built-in kitchen, (iii) “high quality” sanitary fit-out, (iv) “high quality” flooring in the majority of the living space and (v) low energy performance (less than 120 kWh/(m²a)). The bill does not define what constitutes “high quality”. For new lettings after June 18, 2019 and re-lettings after the bill has come into force, the rent must not exceed the lower of the applicable rent cap and the rent level as of June 18, 2019 (or earlier, as applicable). If the agreed monthly rent as of June 18, 2019 (or earlier) was below EUR 5.02 per m², the re-letting rent may be increased by EUR 1 per m², up to a maximum monthly rent of EUR 5.02 per m². Once the act has been in effect for nine months, tenants may request the public authorities to reduce the rent of existing leases to the appropriate level if the rent is considered “extortionate”, i.e. if it exceeds the applicable rent cap (subject to certain surcharges / discounts for the location of the premises) by more than 20% and has not been approved by the public authorities. The surcharges / discounts amount to +74 cents per m² (good location), -9 cents per m² (medium location) and -28 cents per m² (simple location).

(c)   Modernization costs may only be passed on to tenants if they relate to (i) measures required under statutory law, (ii) thermal insulation of certain building parts, (iii) measures for the use of renewable energies, (iv) window replacements to save energy, (v) replacement of the heating system, (vi) new installation of elevators or (vii) certain measures to remove barriers. Such costs can also only be passed on to tenants to the extent that the monthly rent is not increased by more than EUR 1 per m² and the applicable rent cap is not exceeded by more than EUR 1 per m². To cover the remaining modernization costs, landlords may apply for subsidies under additional subsidy programs of the state of Berlin. Any rent increase due to modernization measures must be notified to the state-owned Investitionsbank Berlin.
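The arithmetic of the re-letting rule and the "extortionate rent" test can be shown in a minimal sketch. It assumes the applicable base cap for the apartment has already been determined from the bill's table (construction year, fit-out, surcharges); all figures are illustrative only and nothing here is legal advice:

```python
# Sketch of two arithmetic rules of the Berlin rent cap bill described
# above: the re-letting rent ceiling and the "extortionate rent" test.
# Figures in EUR per m² per month; the base cap is taken as an input.

LOCATION_ADJUSTMENT = {"good": 0.74, "medium": -0.09, "simple": -0.28}

def max_reletting_rent(cap_per_sqm, rent_june_2019_per_sqm):
    """Rent on re-letting: the lower of the applicable cap and the rent
    as of June 18, 2019; rents below EUR 5.02 may be raised by EUR 1,
    but only up to EUR 5.02 per m²."""
    base = min(cap_per_sqm, rent_june_2019_per_sqm)
    if base < 5.02:
        base = min(base + 1.00, 5.02)
    return base

def is_extortionate(rent_per_sqm, cap_per_sqm, location):
    """A rent is 'extortionate' if it exceeds the location-adjusted cap
    by more than 20% (absent approval by the public authorities)."""
    threshold = (cap_per_sqm + LOCATION_ADJUSTMENT[location]) * 1.2
    return rent_per_sqm > threshold

# Example: a low historic rent may only rise to the EUR 5.02 floor cap,
# while a rent above the cap is limited by the cap itself.
low_case = max_reletting_rent(6.45, 4.50)   # capped at 5.02
high_case = max_reletting_rent(6.45, 7.20)  # the 6.45 cap binds
```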
Breaches of the material provisions of the bill constitute an administrative offence and may be fined with up to EUR 500,000 in each individual case. Many legal scholars consider the Berlin rent price cap unconstitutional (at least in part) for infringing the constitutional property guarantee and the freedom of contract, and for procedural reasons. In particular, they question whether the state of Berlin is competent to pass such local legislation (as certain provisions deviate from the German Civil Code (BGB) as federal law) and whether the planned retroactive effect is permissible. The opposition in the Berlin House of Representatives and a parliamentary faction at the federal level have already announced that they intend to have the Berlin rent cap reviewed by Berlin’s Regional Constitutional Court (Verfassungsgerichtshof des Landes Berlin) and the Federal Constitutional Court (Bundesverfassungsgericht). In light of the severe potential fines, landlords should nonetheless consider complying with the provisions of the Berlin rent price cap until the doubts about its constitutionality have been finally resolved.


5.2   Changes to the Transparency Register affecting Real Property Transactions

Certain aspects of the act implementing the 5th EU Anti-Money Laundering Directive (Directive (EU) 2018/843), which amended the German Anti-Money Laundering Act (GwG), are of particular interest to the property sector. Interested readers are referred to the summary in section 1.4 above.


6.   Compliance and Litigation

6.1   German Corporate Sanctions Act

German criminal law does not currently provide for corporate criminal liability; corporations can only be fined under the law on administrative offenses. In August 2019, the German Federal Ministry of Justice and Consumer Protection (Bundesministerium der Justiz und für Verbraucherschutz) circulated a legislative draft of the Corporate Sanctions Act (Verbandssanktionengesetz, the “Draft Corporate Sanctions Act”) which would, if it became law, introduce a hybrid system. The main changes to the current legal situation would eliminate prosecutorial discretion in initiating proceedings, tighten the sentencing framework and formally incentivize the implementation of compliance measures and internal investigations. So far, German law grants the prosecution discretion on whether to pursue a case against a corporation (whereas there is a legal obligation to prosecute individuals suspected of criminal wrongdoing). This has resulted not only in an inconsistent application of the law, in particular among different federal states, but also in a perceived advantageous treatment of corporations over individuals. The Draft Corporate Sanctions Act intends to introduce mandatory prosecution of infringements by corporations, with an obligation to justify non-prosecution under the law. The law as currently proposed would also apply to criminal offenses committed abroad if the company is domiciled in Germany. Under the current legal regime, corporations can be fined up to a maximum of EUR 10 million (in addition to the disgorgement of profits from the legal violation), which the broader public often deems insufficient. The Draft Corporate Sanctions Act would increase potential fines to a maximum of 10% of the annual worldwide, group-wide turnover if the group has an average annual turnover of more than EUR 100 million. Additionally, profits could still be disgorged.
The Draft Corporate Sanctions Act would also introduce two new sanctions: a type of deferred prosecution agreement with the possibility of imposing certain conditions (e.g. compensation for damages and monitorship), and a “corporate death penalty”, namely the liquidation of the company to combat particularly persistent and serious criminal behavior. The Draft Corporate Sanctions Act would also allow the prosecutor either to refrain from prosecution or to take the existence of an adequate compliance system positively into account in the determination of fines. If internal investigations are carried out in accordance with the requirements set out in the Draft Corporate Sanctions Act (including in particular: (i) substantial contributions to the authorities’ investigation, (ii) a formal division of labor between those conducting the internal investigation, on the one hand, and those acting as criminal defense counsel, on the other, (iii) full cooperation, including full disclosure of the investigation and its results to the prosecution, and (iv) adherence to fair trial standards, in particular the interviewee’s right to remain silent in internal investigations), the maximum fine may be reduced by 50%, and the liquidation of the company or a public announcement may be precluded. It is unclear under the current legal regime whether work product created in the context of an internal investigation is protected against prosecutorial seizure. The Draft Corporate Sanctions Act seeks to clarify this point: only documents that form part of the relationship of trust between the company as defendant and its defense counsel would be protected against seizure. Documents used or created in the preparation of the criminal defense would therefore be protected.
Documents from interviews conducted in the context of an internal investigation, however, would only be protected if they stem from the aforementioned relationship between client and defense counsel. Interestingly, and as mentioned above, the draft law requires that counsel conducting the internal investigation be separate from defense counsel if the corporation wants to claim a cooperation bonus. How this can be achieved in practice is unclear, in particular in an international context where criminal defense counsel is often expected to conduct the internal investigation and where the protection of legal privilege may depend on this dual role. On this point in particular, the draft does not seem sufficiently thought through, and both the legal profession and the business community are voicing strong opposition. Overall, it is doubtful at the moment that the current government coalition, in its struggle for survival, will continue to pursue this legislative project as a priority. It therefore remains to be seen whether, when, and with what amendments the German Corporate Sanctions Act will be passed by the German Parliament.
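The fine framework described in this section can be summarized in a short sketch. It assumes, as the draft does not spell out in the summary above, that the existing EUR 10 million ceiling would continue to apply to groups at or below the EUR 100 million turnover threshold; disgorgement of profits is excluded, and the draft may of course change before enactment:

```python
# Sketch of the maximum-fine framework under the Draft Corporate
# Sanctions Act as described above. Illustrative only; the treatment
# of smaller groups is an assumption, and disgorgement is ignored.

CURRENT_MAX_FINE = 10_000_000  # EUR, current law on administrative offenses

def draft_max_fine(avg_annual_group_turnover, qualifying_investigation=False):
    """Maximum fine under the draft (excluding disgorgement of profits).
    For groups with average annual worldwide turnover above EUR 100m,
    the ceiling is 10% of that turnover; a qualifying internal
    investigation may halve the maximum fine."""
    if avg_annual_group_turnover > 100_000_000:
        ceiling = 0.10 * avg_annual_group_turnover
    else:
        ceiling = CURRENT_MAX_FINE  # assumed: existing ceiling still applies
    if qualifying_investigation:
        ceiling *= 0.5  # cooperation bonus
    return ceiling

# Example: a group with EUR 2bn turnover faces a ceiling of EUR 200m,
# reduced to EUR 100m with a qualifying internal investigation.
large_group = draft_max_fine(2_000_000_000)
with_bonus = draft_max_fine(2_000_000_000, qualifying_investigation=True)
```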


6.2   Amendments to the German Anti-Money Laundering Act: Further Compliance Obligations, including for the Non-Financial Sector

On January 1, 2020, the Act implementing the 5th EU Anti-Money Laundering Directive (Directive (EU) 2018/843) became effective. In addition to further extending the scope of businesses that are required to conduct anti-money laundering and anti-terrorist financing procedures in accordance with the German Anti-Money Laundering Act (Geldwäschegesetz, GwG), in particular in the area of virtual currencies, it introduced new obligations and stricter individual requirements for persons or entities subject to the GwG obligations (“Obliged Persons”). The new requirements must be taken into account especially in relation to customer onboarding and ongoing anti-money laundering and countering terrorist financing (“AML/CTF”) compliance. The following overview provides a summary of some key changes, in particular, concerning the private non-financial sector, which apply in addition to the specific reporting obligations to the transparency register already described above under section 1.4.
  • The customer due diligence obligations (“KYC”) were further extended and also made more specific. In particular, Obliged Persons are now required to collect proof of registration in the transparency register or an excerpt of the documents accessible via the transparency register (e.g. shareholder lists) when entering into a new business relationship with a relevant entity. In addition, the documentation obligations with regard to the undertaken KYC measures have been further increased and clarified. Further important changes concern the enhanced due diligence measures required in the case of a higher risk of money laundering or terrorist financing, in particular with regard to the involvement of “high-risk countries”.
  • Obliged Persons must now also notify the registrar of the transparency register without undue delay of any discrepancies on beneficial ownership between entries in the transparency register and other information and findings available to them.
  • Obliged Persons must register with the Financial Intelligence Unit (FIU), regardless of whether they intend to report a suspicious activity, as soon as the FIU’s new information network starts its operations, but no later than January 1, 2024.
  • In accordance with the findings of the First National Risk Assessment, the duties for the real estate sector were significantly extended and increased. Real estate agents are now also subject to the AML/CTF risk management requirements of the GwG and are required to conduct customer due diligence when they act as intermediaries in the letting of immovable property if the monthly rent amounts to EUR 10,000 or more. Furthermore, notaries are now explicitly required to check the conclusiveness of the identity of the beneficial owner before notarizing a real estate purchase transaction in accordance with section 1 of the German Federal Real Estate Transfer Tax Act (Grunderwerbsteuergesetz) and may even be required to refuse notarization, see also section 1.4 above on the transparency register.
  • In an effort towards a more uniform EU-wide approach with regard to politically exposed persons (“PEPs”), EU member states must submit to the EU Commission a catalogue of specific functions and offices which under the relevant domestic law justify the qualification as PEP by January 10, 2020. The EU Commission will thereafter publish a consolidated catalogue, which will be binding for Obliged Persons when determining whether a contractual partner or beneficial owner qualifies as PEP with the consequence that enhanced customer due diligence applies.
  • Furthermore, the new law brought some clarifications by changing or introducing definitions, including in particular a new self-contained definition for the term “financial company”. For example, the legislator made clear that industrial holdings are not subject to the duties of the GwG: Any holding companies which exclusively hold participations in companies outside of the credit institution, financial institution or insurance sector do not qualify as financial companies under the GwG, unless they engage in business activities beyond the tasks associated with the management of their participations. That said, funds are not explicitly excluded from the definition of financial companies – and since their activities generally also include the acquisition and sale of participations, it is often questionable whether the exemption for holding companies applies.
  • Another noteworthy amendment concerns the group-wide compliance obligations in section 9 of the GwG: the amended provision now distinguishes (more) clearly between obligations applicable to an Obliged Person that is the parent company of a group and the other members of the group.
The amendments to the GwG have further intensified the obligations not only for the classical financial sector but also the non-financial sector. Since the amendments entered into force on January 1, 2020, the relevant business circles are well advised to review whether their existing AML/CTF risk management system and KYC procedures need to be adjusted in order to comply with the new rules.


6.3   First National Risk Assessment on the Money Laundering and Terrorist Financing Risk for Germany – Implications for the Company-Specific Risk Analyses

The first national risk assessment for the purposes of combating money laundering and terrorist financing (“NRA”) was finally published on the website of the German Federal Ministry of Finance (Bundesministerium der Finanzen) on October 21, 2019 (currently in German only). When preparing their company-specific risk analyses under the GwG, Obliged Persons must now also take into consideration the country-, product- and sector-specific risks identified in the NRA. Germany as a financial center is considered a country with a medium-high risk (i.e. level 4 on a five-point scale from low to high) of being abused for money laundering and terrorist financing. The NRA identifies, in particular, the following key risk areas: anonymity in transactions, the real estate sector, the banking sector (in particular in the context of correspondent banking activities and international money laundering) and the money remittance business, the latter due to its high cash intensity and cross-border activities. With regard to specific cross-border concerns, the NRA identifies eleven regions and states that pose a high money laundering risk for Germany: Eastern Europe (particularly Russia), Turkey, China, Cyprus, Malta, the British Virgin Islands, the Cayman Islands, Bermuda, Guernsey, Jersey and the Isle of Man. Separately, a medium-high cross-border threat was identified for Lebanon, Panama, Latvia, Switzerland, Italy and Great Britain, and a further 17 countries were qualified as posing a medium, medium-low or low money laundering threat. The results of the NRA (including the assessment of cross-border threats in its annex 4) must be taken into consideration by Obliged Persons in both the financial and non-financial sectors when preparing or updating their company-specific risk analyses, in a way that allows a third party to assess how the findings of the NRA were accounted for.
Obliged Persons (in particular those supervised by the BaFin (Bundesanstalt für Finanzdienstleistungsaufsicht) or active in other non-financial key-risk sectors) that have not already done so should therefore conduct, and document, a timely review of whether the findings of the NRA require an immediate update of their risk assessment or whether an adjustment can be made in the context of their ongoing review.

Back to Top

7.   Antitrust and Merger Control

7.1   Antitrust and Merger Control Overview 2019

Germany’s antitrust watchdog, the German Federal Cartel Office (Bundeskartellamt), has had another very active year. On the cartel enforcement side, the Bundeskartellamt concluded several cartel investigations and imposed fines totaling EUR 848 million on 23 companies or associations and 12 individuals from various industries, including bicycle wholesale, building service providers, magazines, industrial batteries and steel. As in previous years, leniency applications continued to play an important role in the Bundeskartellamt’s antitrust enforcement activities, with a total of 16 leniency applications received in 2019. Together with dawn raids at 32 companies, these applications can be expected to give the agency significant ammunition for an active 2020 in terms of antitrust enforcement.

With respect to merger control, the Bundeskartellamt reviewed approximately 1,400 merger filings in 2019. 99% of these filings were concluded within the one-month phase 1 review; only 14 merger filings (i.e. 1% of all filings) required an in-depth phase 2 examination. Of those, four mergers were prohibited and five filings were withdrawn; only one was approved in phase 2 without conditions, and four phase 2 proceedings are still pending.

In addition, the Bundeskartellamt has been very active in the area of consumer protection and concluded its sector inquiry into comparison websites. The agency also issued a joint paper with the French competition authority on algorithms in the digital economy and their competitive effects. For 2020, the Bundeskartellamt is expected to conclude its sector inquiries into online user reviews and smart TVs and to continue to focus on the digital economy.
Furthermore, the Bundeskartellamt has announced that it hopes to launch the Federal Competition Register for Public Procurement by the end of 2020 – an electronic register that will list companies that have been involved in serious economic offenses.

Back to Top

7.2   Competition Law 4.0: Proposed Changes to German Competition Act

The German Federal Ministry for Economic Affairs and Energy (Bundesministerium für Wirtschaft und Energie) has prepared a draft bill for the tenth amendment to the German Act against Restraints of Competition (Gesetz gegen Wettbewerbsbeschränkungen, GWB). The bill aims to further develop the regulatory framework for digitalization and to implement the European requirements of Directive (EU) 2019/1 of December 11, 2018, which empowers the competition authorities of the member states to be more effective enforcers and to ensure the proper functioning of the internal market. While it is not yet clear when the draft bill will become effective, the most important changes are summarized below.

(Super) Market Dominance in the Digital Age

Various amendments are designed to help the Federal Cartel Office (Bundeskartellamt) deal with challenges created by restrictive practices in the digital and platform economy. One of the criteria to be taken into account when determining market dominance would in the future be “access to data relevant for competition”. For the first time, companies that depend on the data sets of market-dominating undertakings or platforms would have a legal claim to data access against such platforms. Access to data would also need to be granted in areas of relative market power: by giving up the reference to “small and medium-sized” enterprises as a precondition for an abuse of relative or superior market power, the draft bill takes into account the fact that data dependency may exist regardless of the size of the enterprise concerned. Last but not least, the draft bill introduces a completely new category of “super dominant” market players to be controlled by the Bundeskartellamt, i.e. undertakings with “paramount significance across markets”. Large digital groups may not hold significant market shares in all affected markets, but may nevertheless exert significant influence on those markets due to their key position for competition and their conglomerate structures. Before initiating prohibitive action against such a “super dominant” market player, the Bundeskartellamt will have to issue an order declaring that it considers the undertaking to have “paramount significance across markets”, based on the exemplary criteria set out in the draft bill.

Rebuttable Presumptions

Following an earlier decision of the German Federal Supreme Court (Bundesgerichtshof, BGH), the draft bill introduces a rebuttable presumption that direct suppliers and customers of a cartel are affected by it if they transacted with cartel participants during the cartel’s duration. The presumption is intended to make it easier for claimants to prove that they are affected by the cartel. A further rebuttable presumption would apply in favor of indirect customers in the event of a passing-on of the overcharge. There is, however, still no presumption for the quantification of damages. Another procedural simplification foreseen in the draft bill lowers the bar for proving an abuse of market dominance: it would suffice that a company’s market behavior resulted in an abuse of its dominant position, irrespective of whether the company deliberately utilized its dominance for abusive purposes.

Slight Increase of Merger Control Threshold

The draft bill provides for an increase of the second domestic turnover threshold from EUR 5 million to EUR 10 million.
Concentrations would consequently only be subject to filing requirements in the future if, in the last business year preceding the concentration, the combined aggregate worldwide turnover of all the undertakings concerned was more than EUR 500 million, the domestic turnover of at least one undertaking concerned was more than EUR 25 million, and that of another undertaking concerned was more than EUR 10 million. This change aims at reducing the burden for small and medium-sized enterprises. The alternative transaction-value threshold, under which transactions providing for an overall consideration of more than EUR 400 million may trigger a filing requirement, remains unchanged.
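The combined effect of these turnover thresholds can be illustrated with a short sketch. This is a hypothetical helper for illustration only: the function and variable names are our own, and the sketch models only the turnover-based test under the draft bill, not the alternative EUR 400 million transaction-value threshold.

```python
# Illustrative sketch of the turnover-based notification test as it would
# stand under the draft tenth GWB amendment. All names are hypothetical;
# amounts are in EUR.

def filing_required(worldwide_turnovers, domestic_turnovers):
    """Return True if the turnover thresholds for a German merger
    filing would be met under the proposed amendment."""
    # Combined aggregate worldwide turnover must exceed EUR 500 million.
    if sum(worldwide_turnovers) <= 500_000_000:
        return False
    # One undertaking must exceed EUR 25 million domestic turnover, and
    # a *different* undertaking must exceed EUR 10 million (raised from
    # EUR 5 million by the draft bill). Since any undertaking above the
    # EUR 25 million mark is also above EUR 10 million, requiring two
    # undertakings above EUR 10 million guarantees the second threshold
    # is met by a distinct company.
    above_25m = sum(t > 25_000_000 for t in domestic_turnovers)
    above_10m = sum(t > 10_000_000 for t in domestic_turnovers)
    return above_25m >= 1 and above_10m >= 2

# Example: under the current EUR 5 million threshold this deal would be
# notifiable; under the draft bill it would not, because the second
# undertaking's domestic turnover (EUR 8 million) no longer clears the
# raised threshold.
print(filing_required([450_000_000, 60_000_000], [30_000_000, 8_000_000]))
# → False
```

The sketch shows why the amendment is framed as relief for smaller businesses: a target with modest German revenues drops out of the filing requirement even when the acquirer is large.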

Back to Top

7.3   “Undertakings” Concept Revisited – Parents Liable for their Children?

Following the Skanska ruling of the European Court of Justice (ECJ) earlier this year (case C-724/17 of March 14, 2019), the first German court decisions (by the district courts (Landgerichte) of Munich and Mannheim) were issued in cases where litigants sought to extend liability for cartel conduct within corporate groups. As a reminder: In Skanska, the ECJ ruled on the interpretation of Article 101 of the Treaty on the Functioning of the European Union (TFEU) in the context of civil damages, specifically on the application of the “undertakings” concept where third parties claim civil damages from companies involved in cartel conduct. The “undertakings” concept, which the ECJ developed with regard to the determination of administrative fines for violations of Article 101 TFEU, establishes so-called parental liability: parent entities may be held liable for antitrust violations committed by their subsidiaries where the companies concerned form a “single economic unit”, i.e. where the parent has “decisive influence” over the offending company and actually exercises that influence. Skanska extends parental liability to civil damages cases. The decisions by the two German courts in Mannheim and Munich denied a subsidiary’s liability for its parent company, or for another subsidiary, respectively.

Back to Top

8.   Data Protection: GDPR Fining Concept Raises the Stakes

While some companies are still busy implementing the requirements of the General Data Protection Regulation (the “GDPR”), the German Conference of Federal and State Data Protection Authorities increased the pressure in October 2019 by publishing guidelines for the determination of fines in privacy violation proceedings against companies (the “Fining Concept”). Even though the Fining Concept may seem technical at first glance, it has far-reaching consequences for fine amounts, which have already manifested in practice. The Fining Concept applies to the imposition of fines by German Data Protection Authorities within the scope of the GDPR. Since the determination of fines focuses on a company’s global annual turnover in the preceding business year, fines can be expected to increase significantly. For further details, please see our client update from October 30, 2019 on this subject.

In the past few months, in particular after the Fining Concept was published, several German Data Protection Authorities already issued a number of higher fines. Most notably, in November 2019 the Berlin Data Protection Authority imposed a fine of EUR 14.5 million (approx. USD 16.2 million) against a German real estate company for non-compliance with general data processing principles. The company used an archive system for storing tenants’ personal data that did not include a function for deleting personal data. In December 2019, a further fine of EUR 9.5 million (approx. USD 10.6 million) was imposed by the Federal Commissioner for Data Protection and Freedom of Information against a major German telecommunications service provider for insufficient technical and organizational measures to prevent unauthorized persons from obtaining customer information.
Many German data protection authorities have announced further investigations into possible GDPR violations and recent fines indicate that the trend towards higher fine levels will continue. This development leaves no doubt that the German Data Protection Authorities are willing to use the sharp teeth that data protection enforcement has received under the GDPR – and leave behind the rather symbolic fine ranges that were predominant in the pre-GDPR era. This is particularly true in light of the foreseeable temptation to use the concept of “undertakings” as developed under EU antitrust laws, which may include parental liability for GDPR violations of subsidiaries in the context of administrative fines as well as civil damages. For further details on the concept of “undertakings” in light of recent antitrust case law, please see above under Section 7.3.

Back to Top

9.   IP & Technology

On April 26, 2019, the German Trade Secret Act (the “Act”) came into effect, implementing the EU Trade Secrets Directive (2016/943/EU) on the protection of undisclosed know-how and business information (trade secrets) against their unlawful acquisition, use and disclosure. The Act aims at consolidating what has hitherto been a potpourri of civil and criminal law provisions for the protection of trade secrets and secret know-how in German legislation.

Besides enhanced protection of trade secrets in litigation matters, one of the most important changes to the pre-existing rules in Germany is the creation of a new, EU-wide definition of trade secrets. Trade secrets are now defined as information that (i) is secret (not publicly known or easily available), (ii) has a commercial value because it is secret, (iii) is subject to reasonable steps to keep it secret, and (iv) is the subject of a legitimate interest in keeping it secret. This definition requires the holder of a trade secret to take reasonable measures to keep it confidential in order to benefit from its protection. To prove compliance with this requirement when challenged, trade secret holders will further have to document and track their protective measures. This goes beyond the previous standard, under which a manifest interest in keeping information secret was sufficient. There is no clear guidance yet on what constitutes “reasonable measures” in this respect. A good indication may be the comprehensive case law developed by U.S. courts when interpreting the requirement of “reasonable efforts” to maintain the secrecy of a trade secret under the U.S. Uniform Trade Secrets Act. Besides a requirement to advise recipients that the information is a confidential trade secret not to be disclosed (e.g. through non-disclosure agreements), U.S. courts consider efforts to limit access on a “need-to-know” basis (e.g. through password protection).
Another point of particular importance for corporate trade secret holders is that companies may be indirectly liable for negligent breaches of third-party trade secrets by their employees. Enhanced liability risks may therefore result from hiring employees who were formerly employed by a competitor and had access to the competitor’s trade secrets. Reverse engineering of lawfully acquired products is now explicitly considered a lawful means of acquiring information, except where otherwise contractually agreed. Previously, reverse engineering was only lawful if it did not require considerable expense. To avoid disclosing trade secrets that form part of a product or object when surrendering prototypes or samples, contracts should include provisions limiting the acquisition of the trade secret. In a nutshell, companies would be well advised to review their internal policies and procedures to determine whether reasonable and sufficiently trackable legal, technical and organizational measures are in place for the protection of trade secrets, to critically assess what know-how is brought into the organization by lateral hires, and to amend contracts for the surrender of prototypes and samples as appropriate.

Back to Top


The following Gibson Dunn lawyers assisted in preparing this client update: Birgit Friedl, Marcus Geiss, Silke Beiter, Stefan Buehrle, Lutz Englisch, Daniel Gebauer, Kai Gesing, Franziska Gruber, Selina Gruen, Dominick Koenig, Markus Nauheim, Mariam Pathan, Annekatrin Pelster, Wilhelm Reinhardt, Sonja Ruttmann, Martin Schmid, Sebastian Schoon, Benno Schwarz, Dennis Seifarth, Ralf van Ermingen-Marbach, Milena Volkmann, Michael Walther, Finn Zeidler, Mark Zimmer and Caroline Ziser Smith. Gibson Dunn's lawyers are available to assist in addressing any questions you may have regarding the issues discussed in this update. The two German offices of Gibson Dunn in Munich and Frankfurt bring together lawyers with extensive knowledge of corporate, financing and restructuring, tax, labor, real estate, antitrust, intellectual property law and extensive compliance / white collar crime and litigation experience. The German offices are comprised of seasoned lawyers with a breadth of experience who have assisted clients in various industries and in jurisdictions around the world. Our German lawyers work closely with the firm's practice groups in other jurisdictions to provide cutting-edge legal advice and guidance in the most complex transactions and legal matters. 
For further information, please contact the Gibson Dunn lawyer with whom you work or any of the following members of the German offices:

General Corporate, Corporate Transactions and Capital Markets
Lutz Englisch (+49 89 189 33 150, lenglisch@gibsondunn.com)
Markus Nauheim (+49 89 189 33 122, mnauheim@gibsondunn.com)
Ferdinand Fromholzer (+49 89 189 33 170, ffromholzer@gibsondunn.com)
Dirk Oberbracht (+49 69 247 411 503, doberbracht@gibsondunn.com)
Wilhelm Reinhardt (+49 69 247 411 502, wreinhardt@gibsondunn.com)
Birgit Friedl (+49 89 189 33 122, bfriedl@gibsondunn.com)
Silke Beiter (+49 89 189 33 170, sbeiter@gibsondunn.com)
Annekatrin Pelster (+49 69 247 411 502, apelster@gibsondunn.com)
Marcus Geiss (+49 89 189 33 122, mgeiss@gibsondunn.com)

Finance, Restructuring and Insolvency
Sebastian Schoon (+49 69 247 411 505, sschoon@gibsondunn.com)
Birgit Friedl (+49 89 189 33 122, bfriedl@gibsondunn.com)
Alexander Klein (+49 69 247 411 505, aklein@gibsondunn.com)
Marcus Geiss (+49 89 189 33 122, mgeiss@gibsondunn.com)

Tax
Hans Martin Schmid (+49 89 189 33 110, mschmid@gibsondunn.com)

Labor Law
Mark Zimmer (+49 89 189 33 130, mzimmer@gibsondunn.com)

Real Estate
Peter Decker (+49 89 189 33 115, pdecker@gibsondunn.com)
Daniel Gebauer (+49 89 189 33 115, dgebauer@gibsondunn.com)

Technology Transactions / Intellectual Property / Data Privacy
Michael Walther (+49 89 189 33 180, mwalther@gibsondunn.com)
Kai Gesing (+49 89 189 33 180, kgesing@gibsondunn.com)

Corporate Compliance / White Collar Matters
Benno Schwarz (+49 89 189 33 110, bschwarz@gibsondunn.com)
Michael Walther (+49 89 189 33 180, mwalther@gibsondunn.com)
Mark Zimmer (+49 89 189 33 130, mzimmer@gibsondunn.com)
Finn Zeidler (+49 69 247 411 504, fzeidler@gibsondunn.com)
Markus Rieder (+49 89 189 33 170, mrieder@gibsondunn.com)
Ralf van Ermingen-Marbach (+49 89 189 33 130, rvanermingenmarbach@gibsondunn.com)

Antitrust
Michael Walther (+49 89 189 33 180, mwalther@gibsondunn.com)
Jens-Olrik Murach (+32 2 554 7240, jmurach@gibsondunn.com)
Kai Gesing (+49 89 189 33 180, kgesing@gibsondunn.com)

Litigation
Michael Walther (+49 89 189 33 180, mwalther@gibsondunn.com)
Mark Zimmer (+49 89 189 33 130, mzimmer@gibsondunn.com)
Finn Zeidler (+49 69 247 411 504, fzeidler@gibsondunn.com)
Markus Rieder (+49 89 189 33 170, mrieder@gibsondunn.com)
Kai Gesing (+49 89 189 33 180, kgesing@gibsondunn.com)
Ralf van Ermingen-Marbach (+49 89 189 33 130, rvanermingenmarbach@gibsondunn.com)

International Trade, Sanctions and Export Control
Michael Walther (+49 89 189 33 180, mwalther@gibsondunn.com)
Richard Roeder (+49 89 189 33 122, rroeder@gibsondunn.com)

© 2020 Gibson, Dunn & Crutcher LLP

Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.