Paris partner Ahmed Baladi is the author of “Can GDPR Hinder AI Made in Europe?” [PDF] published by Cybersecurity Law Report on July 10, 2019.
Decided June 24, 2019

Iancu v. Brunetti, No. 18-302

Today, the Supreme Court held 6-3 that the Lanham Act’s prohibition on the registration of “immoral or scandalous” trademarks infringes the First Amendment.

Background: Two terms ago, in Matal v. Tam, 582 U.S. __ (2017), the Supreme Court declared unconstitutional the Lanham Act’s ban on registering trademarks that “disparage” any “person, living or dead.” 15 U.S.C. § 1052(a). The Court held that a viewpoint-based ban on trademark registration is unconstitutional, and that the Lanham Act’s disparagement bar was viewpoint-based (permitting registration of marks whose messages celebrate persons, but not marks whose messages are alleged to disparage them). Against that backdrop, Erik Brunetti, the owner of a streetwear brand whose name sounds like a form of the F-word, sought federal registration of the trademark FUCT. The U.S. Patent and Trademark Office denied Brunetti’s application under a provision of the Lanham Act that prohibits registration of trademarks that “[c]onsist of or comprise immoral or scandalous matter.” 15 U.S.C. § 1052(a). On Brunetti’s First Amendment challenge, the Federal Circuit invalidated the “immoral or scandalous” provision of the Lanham Act on the ground that it impermissibly discriminated on the basis of viewpoint.

Issue: Does the Lanham Act’s prohibition on the federal registration of “immoral or scandalous” trademarks infringe the First Amendment right to freedom of speech?

Court’s Holding: Yes. In an opinion authored by Justice Kagan on June 24, 2019, the Supreme Court held that the Lanham Act’s ban on registration of “immoral … or scandalous matter” violates the free speech rights guaranteed by the First Amendment because it discriminates on the basis of viewpoint.
“If the ‘immoral or scandalous’ bar similarly discriminates on the basis of viewpoint, it must also collide with our First Amendment doctrine.” Justice Kagan, writing for the majority

What It Means: The argument that the government advanced in this case—that speech is not restricted because a business remains free to call its brand or product anything it wants, even if it cannot obtain the benefit of federal trademark protection—will not save statutory bans on trademark registration that are viewpoint-based. The Court made clear that its decision was based on the broad reach of the Lanham Act’s ban: “[T]he ‘immoral or scandalous’ bar is substantially overbroad. There are a great many immoral and scandalous ideas in the world (even more than there are swearwords), and the Lanham Act covers them all.” In his concurring opinion, Justice Alito emphasized that the Court’s decision “does not prevent Congress from adopting a more carefully focused statute that precludes the registration of marks containing vulgar terms that play no real part in the expression of ideas,” thus leaving room for legislators to develop a more narrowly tailored alternative. Unless and until such a law is proposed and passed, however, the U.S. Patent and Trademark Office will have no statutory basis to refuse federal registration of potentially vulgar, profane, offensive, disreputable, or obscene words and images.

As always, Gibson Dunn’s lawyers are available to assist in addressing any questions you may have regarding developments at the Supreme Court. Please feel free to contact the following practice leaders:

Appellate and Constitutional Law Practice
Allyson N. Ho +1 214.698.3233 email@example.com
Mark A. Perry +1 202.887.3667 firstname.lastname@example.org

Related Practice: Intellectual Property
Wayne Barsky +1 310.552.8500 email@example.com
Josh Krevitt +1 212.351.4000 firstname.lastname@example.org
Mark Reiter +1 214.698.3100 email@example.com

Related Practice: Fashion, Retail and Consumer Products
Howard S. Hogan +1 202.887.3640 firstname.lastname@example.org

© 2019 Gibson, Dunn & Crutcher LLP Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.
In a previous client alert, we highlighted a recent U.S. sanctions regime aimed at deterring threats of election interference, which further expanded the U.S. menu of cyber-related sanctions. Across the Atlantic, in a step that demonstrates its stated determination to enhance the EU’s cyber defense capabilities, on May 17, 2019, the EU established a sanctions framework for targeted restrictive measures to deter and respond to cyber-attacks that constitute an external threat to the EU or its Member States. The new framework is set out in two documents, Council Decision (CFSP) 2019/797 and Council Regulation 2019/796.

The newly introduced framework is significant for two reasons. First, it enables the EU to implement unilateral cyber sanctions, expanding the EU’s sanctions toolkit beyond traditional areas of sanctions, such as those imposed on terrorism and other international relations-based grounds. Second, it represents a major, concrete measure arising out of the EU’s continued interest in developing an open and secure cyberspace, and out of concerns about the malicious use of information and communications technologies by both State and non-State actors. From the alleged Russian plot to hack the Organisation for the Prohibition of Chemical Weapons in The Hague in April of last year to the cyber-attack on the German Parliament early this year, European leaders have been deeply concerned about future cyber-attacks on EU Member States. In particular, in light of the European Parliament election that took place on May 23-26, 2019, the framework equips the EU with a potent economic instrument to punish cyber-attacks more directly and on a unified front.

Modality for Establishing the List of Sanctioned Parties

Under the new framework, persons, entities and bodies subject to sanctions will be listed in the Annex to Council Decision (CFSP) 2019/797 (“Annex I”).
With a view to ensuring greater consistency in the listing of sanctioned parties, the Council has the sole authority to establish and amend Annex I as needed, and is to review Annex I “at regular intervals and at least every 12 months.” The Council will review its decision in light of observations or substantial new evidence presented to it.

External Threats with a “Significant Effect”

The framework applies to “cyber-attacks with a significant effect, including attempted cyber-attacks with a potentially significant effect, which constitute an external threat to the Union or its Member States.” To qualify as external, it suffices, among other possibilities, that the attack originates from outside the Union, uses infrastructure outside the Union, or is carried out with the support of, at the direction of, or under the control of a person outside the Union. The kinds of conduct considered cyber-attacks include unauthorized access to and interference with IT systems, as well as data interference and interception. The Council’s approach to assessing “significant effect” is by and large result-oriented, focusing, inter alia, pursuant to Article 3 of the Council Regulation, on “(a) the scope, scale, impact or severity of disruption caused . . . (d) the amount of economic loss caused . . . (e) the economic benefit gained by the perpetrator [or]. . . (f) the amount or nature of data stolen or the scale of data breaches. . . .”

Expansive Reach of the Framework

Under the framework, sanctioned persons and entities are those who are responsible for a cyber-attack, as well as those who attempted a cyber-attack, provided “financial, technical or material support” for one, or were otherwise involved in one (e.g., by directing, encouraging, planning, or facilitating the attack).
It is also noteworthy that although the framework primarily targets attacks against Member States and the Union itself, sanctions measures under the framework can also be applied to cyber-attacks with a significant effect against “third States or international organisations,” if such measures are deemed “necessary to achieve common foreign and security policy (CFSP) objectives.” As befits an initiative to deter cyber-attacks in general, the targets of cyber-attacks covered under this framework are also expansive, ranging from critical infrastructure and the storage of classified information to services necessary for the maintenance and operation of essential social and economic activities, and to government functions, including elections.

Sanctions Measures under the Framework

The primary restrictive measures under the framework are asset freezes and travel bans. Generally, all funds and economic resources “belonging to, owned, held or controlled by” a sanctioned person or entity will be frozen. Furthermore, “no funds or economic resources shall be made available directly or indirectly to or for the benefit of” the sanctioned party. In broad terms, these EU financial sanctions are similar to the sanctions attendant to designation on the U.S. Specially Designated Nationals and Blocked Persons List.

Comparison with the U.S. Sanctions Regime for Cyber-Attacks

In the U.S., apart from country-specific programs, the major source of authority for cyber-related sanctions is Executive Order 13694, titled “Blocking the Property of Certain Persons Engaging in Significant Malicious Cyber-Enabled Activities” (“E.O. 13694”), signed by President Barack Obama on April 1, 2015. The more recent Executive Order 13848, “Imposing Certain Sanctions in the Event of Foreign Interference in a United States Election” (“E.O. 13848”), issued by President Donald Trump, adds further emphasis on threats of election interference via cyber means.
In comparison, the later-arriving EU sanctions framework is by and large similar in terms of both the conduct it seeks to deter and the parties potentially subject to sanctions. Like the U.S. sanctions program, the EU framework covers a wide range of significant interferences and expressly highlights interference with “public elections or the voting process” as one of the enumerated predicate cyber-attacks. Much like the U.S. program’s focus on “significant” “malicious cyber-enabled activities,” the EU framework’s focus on “willfully carried out” cyber-attacks “with significant effect” gives the Council substantial flexibility and discretion in determining what rises to the level of sanctionable conduct. In terms of parties covered, both E.O. 13694 (and the subsequent E.O. 13848) and the EU framework sanction persons and entities who are responsible for an attack, as well as those who are agents, or are complicit by providing material assistance, in the commission of the cyber-attack. It is important to note that the EU framework expressly permits the imposition of sanctions on parties whose conduct is directed against “third [non-Member] States or international organizations,” insofar as the EU satisfies itself that the sanctions are necessary to achieve CFSP objectives, namely the EU’s Union-level foreign policy objectives. In the U.S., the possibility of imposing sanctions for cyber-attacks against a third party seems to be alluded to in E.O. 13694 by the language “threat to the . . . foreign policy . . . of the United States.” Given how recently the framework was established, it is unclear to what extent the EU will exercise its authority under this provision, and no other countries have yet commented on it. Nonetheless, it is encouraging that both regimes leave open the possibility of sanctions based on cyber-attacks targeting third states.
Conclusion & Implications

The new framework established by the Council represents a significant effort by the EU to stiffen its response to cyber-attacks. The framework broadens EU sanctions both in substance and in scope. To the extent that the EU framework is comparable to the current U.S. cyber-related sanctions program, it reflects greater synchronization between the EU and the U.S. on the sanctions front. For the time being, no names have yet been added to Annex I. However, as the list grows, businesses should closely assess their existing business relationships with other companies and pay greater attention to their onboarding compliance due diligence. On the other hand, because the decision to list and delist a sanctioned party is reserved for the Council, there is likely to be greater transparency and legal predictability for compliance purposes. ______________________  See our client alert dated Sep. 25, 2018 entitled U.S. Authorizes Sanctions for Election Interference, https://www.gibsondunn.com/us-authorizes-sanctions-for-election-interference/, for an analysis of E.O. 13848.  See Judith Lee, Cybersecurity Sanctions: A Powerful New Tool, Law360 (Apr. 2, 2015), https://www.gibsondunn.com/wp-content/uploads/documents/publications/Lee-Cybersecurity-Sanctions-A-Powerful-New-Tool-Law360.pdf, for an analysis by our Washington D.C. partner Judith Lee of the Obama-era executive order that forms the bulk of the current U.S. cyber-related sanctions program.  See Council Press Release 301/19, Declaration by the High Representative on behalf of the EU on respect for the rules-based order in cyberspace (Apr. 12, 2019), https://www.consilium.europa.eu/en/press/press-releases/2019/04/12/declaration-by-the-high-representative-on-behalf-of-the-eu-on-respect-for-the-rules-based-order-in-cyberspace/.  
Council Press Release 367/19, Cyber-attacks: Council is now able to impose sanctions (May 17, 2019), https://www.consilium.europa.eu/en/press/press-releases/2019/05/17/cyber-attacks-council-is-now-able-to-impose-sanctions/.  See Erica Moret and Patryk Pawlak, European Union Institute for Security Studies, Brief, The EU Cyber Diplomacy Toolbox: towards a cyber sanctions regime?, p. 2 (Jul. 12, 2017), https://www.iss.europa.eu/content/eu-cyber-diplomacy-toolbox-towards-cyber-sanctions-regime.  Joe Barnes, UK Plays Pivotal Role In EU’s New Cyber-Attack Sections Regime – ‘This Is Decisive Action’, Express (May 17, 2019), https://www.express.co.uk/news/uk/1128512/UK-news-EU-cyber-attack-section-regime-European-Council-latest-update.  Thorsten Severin, Andrea Shalal, German Government under Cyber Attack, Shores Up Defenses, Reuters (Mar. 1, 2018), https://www.reuters.com/article/us-germany-cyber/german-government-under-cyber-attack-shores-up-defenses-idUSKCN1GD4C8.  See Natalia Drozdiak, EU Agrees Powers to Sanction, Freeze Assets Over Cyber-Attacks, Bloomberg (May 17, 2019), https://www.bloomberg.com/news/articles/2019-05-17/eu-agrees-powers-to-sanction-freeze-assets-over-cyber-attacks.  Council Regulation 2019/796 of May 17, 2019, concerning restrictive measures against cyber-attacks threatening the Union or its Member States, preamble, art. 13, O.J. L 129I , 17.5.2019, p. 1–12, http://data.europa.eu/eli/reg/2019/796/oj (hereinafter “Council Regulation 2019/796”).  Council Decision (CFSP) 2019/797 of 17 May 2019, concerning restrictive measures against cyber-attacks threatening the Union or its Member States, art. 1(1), O.J. L 129I , 17.5.2019, p. 13–19, http://data.europa.eu/eli/dec/2019/797/oj (hereinafter “Council Decision 2019/797”).  Id. art. 1(2).  Id. art. 3. The same language is also reflected in Council Regulation 2019/796, art. 2.  Council Decision 2019/797, supra note 10, art. 4.  Council Regulation 2019/796, supra note 9, art. 1(6).  
Council Decision 2019/797, supra note 10, art. 1(4).  Id. art. 5(1).  Id. art. 5(2).  See supra note 2 for an analysis of the Executive Order. See also Exec. Order No. 13694, 80 Fed. Reg. 18,077 (Apr. 2, 2015), https://www.treasury.gov/resource-center/sanctions/Programs/Documents/cyber_eo.pdf, subsequently amended by Executive Order 13757 of December 28, 2016.  See supra note 1. See also Exec. Order No. 13848, 83 Fed. Reg. 46,843 (Sep. 12, 2018), https://www.federalregister.gov/documents/2018/09/14/2018-20203/imposing-certain-sanctions-in-the-event-of-foreign-interference-in-a-united-states-election.  Council Decision 2019/797, supra note 10, art. 1(4)(c).  See Judith Lee, supra note 2.  CFSP objectives, as the Council Decision notes, can be found in relevant provisions of Article 21 of the Treaty on European Union. A relevant excerpt of article 21 of the Treaty on European Union: The Union shall define and pursue common policies and actions, and shall work for a high degree of cooperation in all fields of international relations, in order to: (a) safeguard its values, fundamental interests, security, independence and integrity; (b) consolidate and support democracy, the rule of law, human rights and the principles of international law; (c) preserve peace, prevent conflicts and strengthen international security, in accordance with the purposes and principles of the United Nations Charter, with the principles of the Helsinki Final Act and with the aims of the Charter of Paris, including those relating to external borders; (d) foster the sustainable economic, social and environmental development of developing countries, with the primary aim of eradicating poverty; (e) encourage the integration of all countries into the world economy, including through the progressive abolition of restrictions on international trade; (f) help develop international measures to preserve and improve the quality of the environment and the sustainable management of global natural 
resources, in order to ensure sustainable development; (g) assist populations, countries and regions confronting natural or man-made disasters; and (h) promote an international system based on stronger multilateral cooperation and good global governance. Consolidated Version Of The Treaty On European Union, art. 21, O.J. C 326, 26.10.2012, p. 13–390, available online at http://data.europa.eu/eli/treaty/teu_2012/oj.  Compare Exec. Order No. 13694, supra note 18, sec. 1(a)(ii)(A), with Council Decision 2019/797, supra note 10, art. 1(6). The following Gibson Dunn lawyers assisted in preparing this client update: Judith Alison Lee, Adam Smith, Patrick Doris, Michael Walther, Nicolas Autet and Richard Roeder. Gibson Dunn’s lawyers are available to assist in addressing any questions you may have regarding the above developments. Please contact the Gibson Dunn lawyer with whom you usually work, the authors, or any of the following leaders and members of the firm’s International Trade or Privacy, Cybersecurity and Consumer Protection practice groups: United States: Judith Alison Lee – Co-Chair, International Trade Practice, Washington, D.C. (+1 202-887-3591, email@example.com) Ronald Kirk – Co-Chair, International Trade Practice, Dallas (+1 214-698-3295, firstname.lastname@example.org) Alexander H. Southwell – Co-Chair, Privacy, Cybersecurity & Consumer Protection Practice, New York (+1 212-351-3981, email@example.com) Jose W. Fernandez – New York (+1 212-351-2376, firstname.lastname@example.org) Marcellus A. McRae – Los Angeles (+1 213-229-7675, email@example.com) Adam M. Smith – Washington, D.C. (+1 202-887-3547, firstname.lastname@example.org) Christopher T. Timura – Washington, D.C. (+1 202-887-3690, email@example.com) Ben K. Belair – Washington, D.C. (+1 202-887-3743, firstname.lastname@example.org) Courtney M. Brown – Washington, D.C. (+1 202-955-8685, email@example.com) Laura R. Cole – Washington, D.C. (+1 202-887-3787, firstname.lastname@example.org) Stephanie L. 
Connor – Washington, D.C. (+1 202-955-8586, email@example.com) Henry C. Phillips – Washington, D.C. (+1 202-955-8535, firstname.lastname@example.org) R.L. Pratt – Washington, D.C. (+1 202-887-3785, email@example.com) Audi K. Syarief – Washington, D.C. (+1 202-955-8266, firstname.lastname@example.org) Scott R. Toussaint – Washington, D.C. (+1 202-887-3588, email@example.com) Europe: Ahmed Baladi – Co-Chair, Privacy, Cybersecurity & Consumer Protection Practice, Paris (+33 (0)1 56 43 13 00, firstname.lastname@example.org) Peter Alexiadis – Brussels (+32 2 554 72 00, email@example.com) Nicolas Autet – Paris (+33 1 56 43 13 00, firstname.lastname@example.org) Attila Borsos – Brussels (+32 2 554 72 10, email@example.com) Patrick Doris – London (+44 (0)207 071 4276, firstname.lastname@example.org) Sacha Harber-Kelly – London (+44 20 7071 4205, email@example.com) Penny Madden – London (+44 (0)20 7071 4226, firstname.lastname@example.org) Steve Melrose – London (+44 (0)20 7071 4219, email@example.com) Benno Schwarz – Munich (+49 89 189 33 110, firstname.lastname@example.org) Michael Walther – Munich (+49 89 189 33-180, email@example.com) Richard W. Roeder – Munich (+49 89 189 33-160, firstname.lastname@example.org) © 2019 Gibson, Dunn & Crutcher LLP Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.
Palo Alto partner Mark Lyon and associates Cassandra Gaedt-Sheckter and Arjun Rangarajan are the authors of “Should Consumer Data Privacy Laws Apply To The Gov’t?” [PDF] published by Law360 on June 7, 2019.
In the last two weeks, California legislative committees voted on several amendments to the California Consumer Privacy Act (CCPA), which is due to go into effect on January 1, 2020. While each proposal requires additional approvals, including full Assembly and Senate votes, the committees’ determinations mark an important development in the ongoing roll-out of the CCPA, what it will ultimately require, and how to address compliance.

The California Assembly’s Privacy and Consumer Protection Committee approved amendments that include narrowing the scope of personal information and effectively exempting employee-related information from coverage under the Act. In addition, the Senate Appropriations Committee yesterday unanimously approved S.B. 561, which would expand the private right of action against entities that violate the CCPA and is supported by Attorney General Xavier Becerra. These amendments, and any other legislative amendments or clarifications, will be further supplemented by the Attorney General’s promulgation of regulations, still anticipated to be issued for public comment by Fall 2019. The following is a summary of each of the amendments voted on in the past week, along with a chart exhibiting the key changes to the existing language of the CCPA. As always, we will continue to monitor these important updates.

Senate

The Senate Judiciary Committee and the Senate Appropriations Committee both voted this month to augment the private right of action for violations of the CCPA with S.B. 561. Under the current version of the CCPA, consumers have a private right of action only for certain unauthorized disclosures of their data. S.B. 561 would permit a private right of action for any violation of the CCPA, broadly expanding the potential exposure businesses may face. The bill further removes the 30-day cure period for violations before claims can be brought by the Attorney General.
Finally, the amendment removes the provision permitting businesses and third parties to seek guidance directly from the Attorney General, replacing it with a statement that the Attorney General may publish materials to provide general guidance on compliance.

Assembly

Several bills in the Assembly also continued to gain traction with a positive vote from the California Assembly’s Privacy and Consumer Protection Committee:

A.B. 25 redefines “consumer” to exclude employees, contractors, agents, and job applicants, so long as their personal information is collected and used by the business only in that context;

A.B. 873 modifies the definition of “personal information” to narrow its scope—including by removing information relating to a household, and information “capable of being associated with” a consumer—and also redefines “deidentified” data;

A.B. 1564 would require businesses to make available to consumers a toll-free telephone number or an email address for submitting requests, and would require businesses with websites to make those websites available to consumers for submitting requests for information;

A.B. 846 would modify the way businesses can offer financial incentive plans to consumers in exchange for their data;

A.B. 1146 would exempt vehicle and ownership data collected by automotive dealers and shared with the manufacturers of the vehicles sold, if the vehicle information is shared pursuant to, or in anticipation of, a vehicle repair relating to a warranty or recall; and

A.B. 981 would exempt certain insurance institutions subject to the Insurance Information and Privacy Protection Act (IIPPA) from the CCPA, and would incorporate certain disclosure and other privacy requirements into the IIPPA to bring it in line with the CCPA.

Notably, a proposal to revoke and revamp the CCPA, A.B.
1760—which would have required obtaining opt-in consent from consumers before sharing (not just selling) personal information, and would have generally broadened consumers’ rights under the Act—was taken off hearing and will not move forward, at least at this time.

Potential Impact of the Amendments on Businesses

Arguably the most important changes to the CCPA for businesses interacting with California consumers are the proposed amendments set out in S.B. 561; expanding the private right of action to any violation of the Act has the potential to significantly increase the number of suits brought by individuals, including data privacy class actions, and to magnify the resulting financial impact of the Act on businesses interacting with state residents. In anticipation of this potential amendment, it is important for businesses to work now to analyze the steps necessary to ensure compliance with the various provisions likely to go into effect, including as discussed in our previous client alerts (California Consumer Privacy Act of 2018 (July 2018) and New California Security of Connected Devices Law and CCPA Amendments (October 2018)). In general, businesses should ensure that they understand the type, nature, and scope of the consumer data they have collected, including where it is stored; create processes to comply with the disclosure and other technically demanding rights (including a Do Not Sell opt-out link on their website, and a request verification and disclosure process); revise service provider agreements for compliance; and review their privacy policies, both internal and public, to ensure that they properly disclose how personal data is collected, used, and potentially shared with third parties.

Certain of the proposed Assembly bill amendments, on the other hand, may narrow the impact on businesses, particularly as to the scope of personal information at issue. The modifications in A.B.
25, clarifying that the CCPA is not intended to cover employees’ data, could minimize the impact on companies that generally do not collect California residents’ personal information other than as a result of employing Californians, and could also minimize the logistical issues that would otherwise arise if businesses had to allow employees to exercise the rights afforded by the Act. Instead, the impact of the CCPA would fall primarily on those businesses that rely on collecting data as part of their business model. The scope of personal information would be further narrowed if A.B. 873 passes, as it may eliminate some of the broader-reaching—and more confusing—applications of the CCPA, to household data and data that is “capable of being associated with” a consumer. The remaining language focuses on information that is linked, directly or indirectly, to a particular consumer. This would also address concerns expressed at multiple public forums on the CCPA regarding how verification of data requests should work when an individual requests household data. A.B. 873 also redefines “deidentified,” and while several of the same guardrails would remain, the new definition would specifically require (1) contractual prohibitions barring recipients of the data from reidentifying deidentified personal information, and (2) a public commitment not to reidentify the data. These requirements may necessitate revisions to internal and third-party contract provisions, as well as modifications to the language of consumer-facing privacy policies. As a result, it may be important for businesses to re-evaluate their contracts with suppliers, distributors, and contractors to ensure compliance for any use of deidentified data. Logistically, A.B. 1564 would offer businesses some relief from providing a toll-free telephone number for requests related to the Act, offering instead the option of an email address or a telephone number, together with a website address for consumers to use.
While many businesses may already have included an email address for compliance with related laws, instituting a telephone number for such requests may pose additional logistical issues for businesses under the current text of the law. Finally, for entities offering customer loyalty programs, the new provisions of A.B. 846—replacing the financial incentive provisions—will require particular attention, if passed. Primarily, businesses will need to ensure that the offerings and their value are “reasonably” related to the value of the data collected, though there may be latitude as to what incentives are possible.

Comparison of Proposed Language to Original

The following chart provides a comparison of what would be key changes to the language of the CCPA as a result of the more broadly applicable amendments currently moving through the California legislature. The language crossed out in the Original Language column indicates what has been deleted from the current language of the Act, while the bolded language in the Proposed Amendment column shows what language has been added. That column contains what would be the final text if these amendments are adopted. We will continue to monitor the progress of these amendments and will provide updates accordingly.

Concept: Introducing Private Right of Action for Any Violation of the Act (S.B. 561)

Original Language: (a) (1) Any consumer whose nonencrypted or nonredacted personal information, . . . is subject to an unauthorized access . . . may institute a civil action for any of the following . . .

Proposed Amendment: (a) (1) Any consumer whose rights under this title are violated, or whose nonencrypted or nonredacted personal information . . . is subject to an unauthorized access . . . may institute a civil action for any of the following

Concept: Excluding Employees from the Definition of Consumer (A.B. 25)

Original Language: (g) “Consumer” means a natural person who is a California resident . . .

Proposed Amendment: (g) (1) “Consumer” means a natural person who is a California resident . . . (g) (2) “Consumer” does not include a natural person whose personal information has been collected by a business in the course of a person acting as a job applicant to, an employee of, a contractor of, or an agent on behalf of the business, to the extent the person’s personal information is collected and used solely within the context of the person’s role as a job applicant to, an employee of, a contractor of, or an agent on behalf of the business.

Concept: Redefining Deidentified (A.B. 873)

Original Language: “Deidentified” means information that cannot reasonably identify, relate to, describe, be capable of being associated with, or be linked, directly or indirectly, to a particular consumer, provided that a business that uses deidentified information: (1) Has implemented technical safeguards that prohibit reidentification of the consumer to whom the information may pertain. (2) Has implemented business processes that specifically prohibit reidentification of the information. (3) Has implemented business processes to prevent inadvertent release of deidentified information. (4) Makes no attempt to reidentify the information.

Proposed Amendment: “Deidentified” means information that does not reasonably identify or link, directly or indirectly, to a particular consumer, provided that the business makes no attempt to reidentify the information, and takes reasonable technical and administrative measures designed to: (1) Ensure that the data is deidentified. (2) Publicly commit to maintain and use the data in a deidentified form. (3) Contractually prohibit recipients of the data from trying to reidentify the data.

Concept: Excluding Household and Information “capable of being associated with” from the Definition of “Personal Information” (A.B. 873)

Original Language: “Personal information” means information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household. Personal information includes, but is not limited to, the following if it identifies, relates to, describes, is capable of being associated with, or could be reasonably linked, directly or indirectly, with a particular consumer or household.

Proposed Amendment: “Personal information” means information that identifies, relates to, describes, or could reasonably be linked, directly or indirectly, with a particular consumer. Personal information may include, but is not limited to, the following if it identifies, relates to, describes, or could be reasonably linked, directly or indirectly, with a particular consumer.

Concept: Prescribing Methods of Contacting Businesses (A.B. 1564)

Original Language: (1) Make available to consumers two or more designated methods for submitting requests for information required to be disclosed pursuant to Sections 1798.110 and 1798.115, including, at a minimum, a toll-free telephone number, and if the business maintains an Internet Web site, a Web site address.

Proposed Amendment: (1) (A) Make available to consumers a toll-free telephone number or an email address for submitting requests for information required to be disclosed pursuant to Sections 1798.110 and 1798.115. (B) If the business maintains an internet website, make the internet website available to consumers to submit requests for information required to be disclosed pursuant to Sections 1798.110 and 1798.115.

Concept: Clarifying Non-discrimination Provision re Financial Incentives: Removing in Favor of Customer Loyalty Programs (A.B. 846)

Original Language: (a) (1) A business shall not discriminate against a consumer because the consumer exercised any of the consumer’s rights under this title, including, but not limited to, by: … (B) Charging different prices or rates for goods or services, including through the use of discounts or other benefits or imposing penalties. (C) Providing a different level or quality of goods or services to the consumer. (2) Nothing in this subdivision prohibits a business from charging a consumer a different price or rate, or from providing a different level or quality of goods or services to the consumer, if that difference is reasonably related to the value provided to the consumer by the consumer’s data. (b) (1) A business may offer financial incentives, including payments to consumers as compensation, for the collection of personal information, the sale of personal information, or the deletion of personal information. A business may also offer a different price, rate, level, or quality of goods or services to the consumer if that price or difference is directly related to the value provided to the consumer by the consumer’s data. (2) A business that offers any financial incentives pursuant to subdivision (a), shall notify consumers of the financial incentives pursuant to Section 1798.135. (3) A business may enter a consumer into a financial incentive program only if the consumer gives the business prior opt-in consent pursuant to Section 1798.135 which clearly describes the material terms of the financial incentive program, and which may be revoked by the consumer at any time. (4) A business shall not use financial incentive practices that are unjust, unreasonable, coercive, or usurious in nature.
(a) (1) A business shall not discriminate against a consumer because the consumer exercised any of the consumer’s rights under this title, including, but not limited to, by: … (B) Charging higher prices or rates for goods or services, including through the use of discounts or other benefits or imposing penalties. (C) Providing a lower level or quality of goods or services to the consumer. (2) Nothing in this subdivision prohibits a business from offering a different price, rate, level, or quality of goods or services to a consumer, including offering its goods or services for no fee, if any of the following are true: (A) The offering is in connection with a consumer’s voluntary participation in a loyalty, rewards, premium features, discount, or club card program. (B) That difference is reasonably related to the value provided by the consumer’s data. (C) The offering is for a specific good or service whose functionality is reasonably related to the collection, use, or sale of the consumer’s data. (b) As used in this section, “loyalty, rewards, premium features, discount, or club card program” includes an offering to one or more consumers of lower prices or rates for goods or services or a higher level or quality of goods or services, including through the use of discounts or other benefits, or a program through which consumers earn points, rewards, credits, incentives, gift cards, or certificates, coupons, or access to sales or discounts on a priority or exclusive basis.  Although approved unanimously, S.B. 561 was placed on Suspense File, where the committee sends bills with an annual cost of more than $150,000, to be considered following budget discussions. The bill will not move forward until the Appropriations Committee releases it for a vote.  The Senate Judiciary Committee had previously approved the bill 6-2 on April 9, 2019.  Please note that the following chart does not include language modifications to the IIPPA (A.B. 
981) or proposed amendments exempting information shared between automotive dealers and vehicle manufacturers (A.B. 1146), as they are of more limited application than the more general provisions that were included. If you have questions about those particular provisions, please reach out to discuss with us and we would be happy to provide further guidance. Gibson Dunn’s lawyers are available to assist in addressing any questions you may have regarding these developments. Please contact the Gibson Dunn lawyer with whom you usually work, any member of the firm’s Privacy, Cybersecurity and Consumer Protection practice group, or the authors: H. Mark Lyon – Palo Alto (+1 650-849-5307, email@example.com) Cassandra L. Gaedt-Sheckter – Palo Alto (+1 650-849-5203, firstname.lastname@example.org) Maya Ziv – Palo Alto (+1 650-849-5336, email@example.com) Privacy, Cybersecurity and Consumer Protection Group: United States Alexander H. Southwell – Co-Chair, New York (+1 212-351-3981, firstname.lastname@example.org) M. Sean Royall – Dallas (+1 214-698-3256, email@example.com) Debra Wong Yang – Los Angeles (+1 213-229-7472, firstname.lastname@example.org) Christopher Chorba – Los Angeles (+1 213-229-7396, email@example.com) Richard H. Cunningham – Denver (+1 303-298-5752, firstname.lastname@example.org) Howard S. Hogan – Washington, D.C. (+1 202-887-3640, email@example.com) Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, firstname.lastname@example.org) Kristin A. Linsley – San Francisco (+1 415-393-8395, email@example.com) H. Mark Lyon – Palo Alto (+1 650-849-5307, firstname.lastname@example.org) Shaalu Mehra – Palo Alto (+1 650-849-5282, email@example.com) Karl G. Nelson – Dallas (+1 214-698-3203, firstname.lastname@example.org) Eric D. Vandevelde – Los Angeles (+1 213-229-7186, email@example.com) Benjamin B. 
Wagner – Palo Alto (+1 650-849-5395, firstname.lastname@example.org) Michael Li-Ming Wong – San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, email@example.com) Ryan T. Bergsieker – Denver (+1 303-298-5774, firstname.lastname@example.org) Europe Ahmed Baladi – Co-Chair, Paris (+33 (0)1 56 43 13 00, email@example.com) James A. Cox – London (+44 (0)207071 4250, firstname.lastname@example.org) Patrick Doris – London (+44 (0)20 7071 4276, email@example.com) Bernard Grinspan – Paris (+33 (0)1 56 43 13 00, firstname.lastname@example.org) Penny Madden – London (+44 (0)20 7071 4226, email@example.com) Jean-Philippe Robé – Paris (+33 (0)1 56 43 13 00, firstname.lastname@example.org) Michael Walther – Munich (+49 89 189 33-180, email@example.com) Nicolas Autet – Paris (+33 (0)1 56 43 13 00, firstname.lastname@example.org) Kai Gesing – Munich (+49 89 189 33-180, email@example.com) Sarah Wazen – London (+44 (0)20 7071 4203, firstname.lastname@example.org) Alejandro Guerrero – Brussels (+32 2 554 7218, email@example.com) Asia Kelly Austin – Hong Kong (+852 2214 3788, firstname.lastname@example.org) Jai S. Pathak – Singapore (+65 6507 3683, email@example.com) © 2019 Gibson, Dunn & Crutcher LLP Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.
In its 2019 edition, Chambers USA: America’s Leading Lawyers for Business awarded Gibson Dunn 79 first-tier rankings, of which 27 were firm practice group rankings and 52 were individual lawyer rankings. Overall, the firm earned 276 rankings – 80 firm practice group rankings and 196 individual lawyer rankings. Gibson Dunn earned top-tier rankings in the following practice group categories: National – Antitrust National – Antitrust: Cartel National – Appellate Law National – Corporate Crime & Investigations National – FCPA National – Outsourcing National – Real Estate National – Retail National – Securities: Regulation CA – Antitrust CA – Environment CA – IT & Outsourcing CA – Litigation: Appellate CA – Litigation: General Commercial CA – Litigation: Securities CA – Litigation: White-Collar Crime & Government Investigations CA – Real Estate: Southern California CO – Litigation: White-Collar Crime & Government Investigations CO – Natural Resources & Energy DC – Corporate/M&A & Private Equity DC – Labor & Employment DC – Litigation: General Commercial DC – Litigation: White-Collar Crime & Government Investigations NY – Litigation: General Commercial: The Elite NY – Media & Entertainment: Litigation NY – Technology & Outsourcing TX – Antitrust This year, 155 Gibson Dunn attorneys were identified as leading lawyers in their respective practice areas, with some ranked in more than one category. The following lawyers achieved top-tier rankings: D. 
Jarrett Arp, Theodore Boutrous, Jessica Brown, Jeffrey Chapman, Linda Curtis, Michael Darden, William Dawson, Patrick Dennis, Mark Director, Scott Edelman, Miguel Estrada, Stephen Fackler, Sean Feller, Eric Feuerstein, Amy Forbes, Stephen Glover, Richard Grime, Daniel Kolkey, Brian Lane, Jonathan Layne, Karen Manos, Randy Mastro, Cromwell Montgomery, Daniel Mummery, Stephen Nordahl, Theodore Olson, Richard Parker, William Peters, Tomer Pinkusiewicz, Sean Royall, Eugene Scalia, Jesse Sharf, Orin Snyder, George Stamas, Beau Stark, Charles Stevens, Daniel Swanson, Steven Talley, Helgi Walker, Robert Walters, F. Joseph Warin and Debra Wong Yang.
Click for PDF On January 25, 2019, in Rosenbach v. Six Flags Entertainment Corporation, the Illinois Supreme Court unanimously held that a plaintiff may be “aggrieved” under Illinois’ Biometric Information Privacy Act (“BIPA”)—with statutory standing to sue for significant statutory damages—even without alleging an “actual injury” caused by the BIPA violation. In so holding, the Court reversed the appellate court’s contrary conclusion and—at least for now—appears to have put to rest one outstanding question in several federal and state court proceedings regarding the scope and availability of BIPA’s private right of action. The Court’s decision is likely to lead to an increase in BIPA litigation in Illinois. Other states, including Texas and Washington, have biometric privacy statutes, but the Illinois law is the only one that allows for a private right of action. BIPA Background Illinois enacted BIPA in 2008 in response to the increasing use of “biometric-facilitated financial transactions” in Illinois. BIPA regulates the “collection, use, safeguarding, handling, storage, retention, and destruction of biometric identifiers and information,” including retina or iris scans, fingerprints, voiceprints, and scans of hand or face geometry. Among other requirements, BIPA requires private entities to develop and follow a written, publicly-available policy for the retention and destruction of biometric identifiers, and to provide certain disclosures in writing and obtain a release before acquiring an individual’s biometric identifier or information. Persons “aggrieved by a violation” of BIPA have a private right of action under the statute and may sue for statutory remedies, including the greater of actual or liquidated damages of $1,000 (for negligent violations) or $5,000 (for intentional or reckless violations). 
BIPA’s private right of action has energized the Illinois plaintiffs’ bar, which in the last few years has filed dozens of proposed class action lawsuits against companies for their allegedly improper collection of alleged biometric information. Plaintiffs in these cases have generally fallen into two categories: (1) employees of companies that allegedly utilize biometric information, such as fingerprints, for time keeping purposes; and (2) customers of companies that use alleged biometric information to enhance the consumer experience. The Rosenbach plaintiff fell into this second group. Plaintiff Stacy Rosenbach—on behalf of her minor son, a customer of Six Flags Entertainment Corporation (“Six Flags”)—sued Six Flags after her son registered for a season pass at the amusement park. Six Flags allegedly captured the thumbprints of season pass holders to facilitate entry into the park and limit loss from the unauthorized use of passes by non-pass-holders. In her suit against Six Flags, Rosenbach alleged that Six Flags violated BIPA by capturing her son’s thumbprint without first providing written notice, obtaining written consent, and publishing a policy explaining how her son’s thumbprint would be used, retained, and destroyed. She alleged no actual harm beyond the violation of BIPA’s requirements. The Issue in Rosenbach v. Six Flags The question presented to the Illinois Supreme Court was whether a plaintiff is “aggrieved” under BIPA, and thus potentially eligible for statutory remedies including liquidated damages, when the only injury she alleges is that the defendant collected her biometric identifiers or biometric information without providing the required disclosures and obtaining written consent as required by the Act. The Second District Appellate Court held that a “technical violation” of the statute, without more, did not render a plaintiff “aggrieved” under BIPA. 
Specifically, the appellate court stated that “there must be an actual injury, adverse effect, or harm in order for the person to be ‘aggrieved,’” and a “technical violation” alone does not suffice. If a “violation” were “actionable” by itself, the appellate court concluded, that “would render the word ‘aggrieved’ superfluous.” The Court’s Holding Reversed. The Illinois Supreme Court held that a plaintiff is “aggrieved” under BIPA—and has statutory standing to sue—when the plaintiff alleges a violation of her BIPA rights, even if the violation caused no “actual injury or adverse effect.” In other words, “[t]he violation, in itself, is sufficient to support the individual’s or customer’s statutory cause of action.” The Court found that BIPA creates a substantive right to control one’s own biometric information. No-injury BIPA violations are not merely “technicalities,” the Court held, but “real and significant” harms to important rights created by the Illinois legislature. The Court also reasoned that the private right of action and remedies exist to prevent and deter violations of individuals’ BIPA rights. Requiring would-be plaintiffs to wait to sue until they have suffered “actual injury” would defeat these purposes of the statute. Because the Rosenbach plaintiff alleged violations of his BIPA rights—Six Flags allegedly collected his fingerprints for use in a season pass without providing the statutorily mandated notices or publishing a data retention policy—the Supreme Court reversed the appellate court’s contrary decision and remanded the case to the trial court. What to Expect Expect more class action litigation on BIPA claims from the Illinois plaintiffs’ bar. Companies that do business in Illinois and collect or use biometric identifiers or biometric information should examine their policies for BIPA compliance. “Biometric identifier” is defined to mean “a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry.”
Writing samples, signatures, photographs, demographic data, physical descriptions, and biological samples used for scientific testing are not biometric identifiers. Biometric information is any information “based on an individual’s biometric identifier used to identify an individual.” BIPA Basics: Private entities may not collect biometric information or identifiers (“biometrics”) without first: (1) providing written notice of the collection that describes the purpose and terms of the collection and storage, and (2) obtaining written consent. Private entities may not sell, rent, or disclose biometrics without prior written consent. Private entities also must develop and make publicly available a data retention policy that sets forth a “retention schedule and guidelines for permanently destroying [biometrics] when the initial purpose for collecting or obtaining [them] has been satisfied or within 3 years of the individual’s last interaction with the private entity, whichever occurs first.” Private entities must store and protect biometrics according to the reasonable standard of care of the entities’ industry and in a manner that is as protective or more protective than the manner in which the entity stores and protects other sensitive information. Expect additional developments in the federal courts regarding whether BIPA plaintiffs have Article III standing. Post-Rosenbach, BIPA plaintiffs need not allege an “actual injury” beyond the statutory violation to state a claim under the statute. But to satisfy the Article III standing requirements necessary to pursue a claim in federal court, plaintiffs may need to allege more than a statutory violation. To date, federal courts have been split on what type of injury, short of economic harm, may be sufficient to create Article III standing for BIPA plaintiffs. Expect additional litigation over the scope of Illinois’ standing doctrine. 
Amici for Six Flags urged the Illinois Supreme Court to consider an alternate ground for affirmance: that Rosenbach lacked standing to sue under the Illinois constitution. The Court did not address the issue. In lieu of a statutory standing argument, more BIPA defendants may press a state constitutional standing argument in an effort to defeat plaintiffs’ claims. Look for additional changes in BIPA’s terms. This year, the Illinois State Senate will consider a bill narrowing the impact of BIPA.  2019 IL 123186 (Ill. Jan. 25, 2019).  See Tex. Bus. & Com. Code § 503.001 et seq.; Wash. Rev. Code § 19.375.010 et seq.  740 Ill. Comp. Stat. 14/5(a), (b), (g).  740 Ill. Comp. Stat. 14/15(a), (b).  740 Ill. Comp. Stat. 14/20.  Rosenbach, 2019 IL 123186 at ¶¶ 4-9.  Id. ¶ 14.  Rosenbach v. Six Flags Entm’t Corp., 2017 IL App (2d) 170317, at ¶ 20 (Ill. App. Ct. 2017).  Id. at ¶ 23.  Rosenbach, 2019 IL 123186 at ¶ 33.  Id. ¶ 33.  Id. ¶ 34.  Id. ¶ 37.  740 Ill. Comp. Stat. 14/10.  Id.  Id.  740 Ill. Comp. Stat. 14/15(b).  740 Ill. Comp. Stat. 14/15(c).  740 Ill. Comp. Stat. 14/15(a).  740 Ill. Comp. Stat. 14/15(e).  Compare e.g., Monroy v. Shutterfly, 2017 WL 4099846, *8 n.5 (N.D. Ill. Sept. 15, 2017) (collection and violation of privacy interest create Article III standing for BIPA claimant) with Santana v. Take-Two Interactive Software, Inc., 717 F. App’x 12, 17 (2d Cir. 2017) (collection of biometrics without adequate notices creates no “risk of real harm” and therefore does not create Article III standing for BIPA claimant) and Rivera v. Google, Inc., No. 16-cv-02714, 2018 WL 6830332, at *6 (N.D. Ill. Dec. 29, 2018) (alleged privacy violation does not create Article III standing for BIPA claimant).  S.B. 3053, 2018 Reg. Sess. (Ill. 2018). Gibson Dunn’s lawyers are available to assist with any questions you may have regarding these issues. 
For further information, please contact the Gibson Dunn lawyer with whom you usually work, any member of the firm’s Privacy, Cybersecurity and Consumer Protection or Labor and Employment practice groups, or the authors: Jason C. Schwartz – Washington, D.C. (+1 202-955-8242, firstname.lastname@example.org) Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, email@example.com) Erin Morgan – Washington, D.C. (+1 202-887-3577, firstname.lastname@example.org) Please also feel free to contact any of the following practice group leaders and members: Privacy, Cybersecurity and Consumer Protection Group: Alexander H. Southwell – Co-Chair, New York (+1 212-351-3981, email@example.com) M. Sean Royall – Dallas (+1 214-698-3256, firstname.lastname@example.org) Debra Wong Yang – Los Angeles (+1 213-229-7472, email@example.com) Ryan T. Bergsieker – Denver (+1 303-298-5774, firstname.lastname@example.org) Christopher Chorba – Los Angeles (+1 213-229-7396, email@example.com) Richard H. Cunningham – Denver (+1 303-298-5752, firstname.lastname@example.org) Howard S. Hogan – Washington, D.C. (+1 202-887-3640, email@example.com) Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, firstname.lastname@example.org) Kristin A. Linsley – San Francisco (+1 415-393-8395, email@example.com) H. Mark Lyon – Palo Alto (+1 650-849-5307, firstname.lastname@example.org) Shaalu Mehra – Palo Alto (+1 650-849-5282, email@example.com) Karl G. Nelson – Dallas (+1 214-698-3203, firstname.lastname@example.org) Eric D. Vandevelde – Los Angeles (+1 213-229-7186, email@example.com) Benjamin B. Wagner – Palo Alto (+1 650-849-5395, firstname.lastname@example.org) Michael Li-Ming Wong – San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, email@example.com) Labor and Employment Group: Catherine A. Conway – Co-Chair, Los Angeles (+1 213-229-7822, firstname.lastname@example.org) Jason C. Schwartz – Co-Chair, Washington, D.C. 
(+1 202-955-8242, email@example.com) © 2019 Gibson, Dunn & Crutcher LLP Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.
Click for PDF As in past years, in honor of Data Privacy Day—an international effort to raise awareness and promote privacy and data protection best practices—we offered Gibson Dunn’s seventh annual Cybersecurity and Data Privacy Outlook and Review. In addition to that U.S.-focused report, we again this year offer this International Outlook and Review. As in recent years, 2018 saw significant developments in the evolution of the data protection and cybersecurity landscape in the European Union (“EU”):

Following the adoption in 2016, and the entry into application in 2018, of the General Data Protection Regulation (“GDPR”) governing the collection, processing and transfer of personal data, the EU’s main privacy body took office, in the form of the European Data Protection Board (“EDPB”). The EDPB and its predecessor, the Article 29 Working Party (“WP29”), issued a number of guidance documents throughout 2018 on the interpretation and application of the GDPR. Furthermore, several EU Member States continued to adapt their national legal frameworks, and began to apply these laws and the GDPR following the GDPR’s date of application on 25 May 2018.

The Council of the EU, which represents the governments and administrations of the EU Member States, pursued its internal discussions regarding the adoption of an EU regulation with respect to private life and the protection of personal data in electronic communications, intended to repeal the currently applicable legal framework (“ePrivacy Regulation”).

EU Member States continued to work on the transposition and application of the EU Directive on the security of network and information systems (“NIS Directive”).

Several objections were raised by EU institutions and before EU supervisory authorities and courts regarding different frameworks for international data transfers (e.g., the EU-U.S. Privacy Shield, the European Commission’s Standard Contractual Clauses). 
In addition to the EU, a number of bills were introduced and passed into law in other jurisdictions around the globe, including other European jurisdictions, the Asia-Pacific region, Canada and Latin America. We cover these topics and many more in this year’s International Cybersecurity and Data Privacy Outlook and Review. While we do not attempt to address every development that occurred in 2018, this Review focuses on a number of the most significant developments affecting companies as they navigate the evolving cybersecurity and privacy landscape.

__________________________

Table of Contents

I. European Union
A. EU GDPR: Its Main Elements, Implementation and Application
1. GDPR
2. Principal Elements of the GDPR
3. Guidance Adopted by the Former WP29 and the Current EDPB
4. National Data Protection Initiatives Implementing and Applying the GDPR
5. GDPR Cases, Investigations and Enforcement
a) Data breaches and investigations
b) GDPR investigations
B. International Transfers: Adequacy Declarations and Challenges
1. Adequacy Declarations
a) Japan
b) South Korea
2. Challenges to Data Transfer Systems
a) Challenges to Standard Contractual Clauses
b) Challenges to the EU-U.S. Privacy Shield
C. EU Cybersecurity Directive
1. Adoption and Implementation of the EU Cybersecurity Directive
2. Documents and Guidance Issued by ENISA
D. Other EU Developments
1. Reform of the ePrivacy Directive – the Draft EU ePrivacy Regulation
a) The European Commission’s ePrivacy Regulation Proposal
b) The WP29 Opinion on the European Commission Proposal
c) The European Parliament’s Amended Proposal
d) The Proposal of the Council of the EU
2. CJEU Case Law
a) The Determination of the Applicable Law and the Relevant Data Controller in the Context of Social Networks
b) Claims Assignment
II. Developments in Other European Jurisdictions: Russia, Switzerland, Turkey and Ukraine
A. Russia
B. Switzerland
C. Turkey
D. Ukraine
III. Developments in Asia-Pacific
A. China
B. Singapore
C. India
IV. Developments in Canada and in Latin America
A. Brazil
B. Canada
C. Other Jurisdictions: Argentina, Chile, Colombia, Mexico, Panamá and Uruguay

__________________________

I. European Union

A. EU GDPR: Its Main Elements, Implementation and Application

1. GDPR

On 25 May 2018, after a two-year “grace period”, the GDPR became the main legislative act for the protection of personal data and privacy in the EU. The GDPR replaces the EU Data Protection Directive and constitutes a set of data protection rules that are directly applicable to the processing of personal data across EU Member States.

2. Principal Elements of the GDPR

As explained in the 2018 International Outlook and Review, the GDPR brought about a significant change in all aspects of the EU’s data protection regime, revamping the substantive provisions regarding data protection law compliance and further developing and integrating its application and enforcement mechanisms. The core substantive elements of the GDPR include the following:

Extraterritorial Scope: The GDPR applies not only to data controllers established in the EU, but also to organizations that either offer goods or services to individuals located in the EU or monitor their behavior, even if these organizations are not established in the EU and do not process data using servers in the EU. On 23 November 2018, the EDPB published draft Guidelines on the territorial scope of the GDPR, which were subject to public consultation.

Transparency Principle: Under the GDPR, transparency is a general requirement applicable to three central areas: (i) the provision of information to data subjects; (ii) the way data controllers communicate with data subjects in relation to their rights under the GDPR; and (iii) how data controllers allow and facilitate the exercise of their rights by data subjects. 
In April 2018, the WP29 published its Guidelines on transparency, which emphasized the importance of providing data subjects with clear and full information, comprehensible to the average data subject, and made available in layers.

Consent of the Data Subjects: The GDPR places emphasis on the notion of consent of data subjects by providing further clarification and specification of the requirements for obtaining and demonstrating valid consent. In April 2018, the WP29 adopted Guidelines specifically dedicated to the concept of consent, focusing on the changes in this respect resulting from the GDPR. In these Guidelines, the WP29 emphasized the importance of consent being given freely, and questioned the relevance of “consent” as a legal basis for data processing where consumers are, in practice, obliged to provide their personal data in order to, for example, engage and receive a service.

Right to Be Forgotten: The GDPR further develops the “right to be forgotten” (formally known as the “right to erasure”), whereby personal data must be deleted when an individual no longer wants his or her data to be processed by a company and there are no legitimate reasons for retaining the data. This right was already introduced in the EU Data Protection Directive, and was the subject of litigation before the Court of Justice of the EU (“CJEU”) in Google Spain SL and Google Inc. v. AEPD and Mario Costeja González. Among other points, the GDPR clarifies that this right is not absolute and will always be subject to the legitimate interests of the public, including the freedom of expression and historical and scientific research. The GDPR also obliges controllers who have received a request for erasure to inform other controllers of such request in order to achieve the erasure of any links to or copies of the personal data involved. 
This part of the GDPR may impose significant burdens on affected companies, as the creation of selective data destruction procedures often entails substantial costs.

Data Breach Notification Obligation: The GDPR requires data controllers to provide notice of serious security breaches to the competent supervisory authorities, also known as Data Protection Authorities (“DPAs”), without undue delay and, in any event, within 72 hours after becoming aware of any such breach. The WP29 has issued Guidelines to explain the mandatory breach notification and communication requirements of the GDPR, as well as some of the steps data controllers and data processors can take to meet these new obligations.

Profiling Activities: The GDPR specifically addresses the use of profiling and other automated individual decision-making. In February 2018, the WP29 issued Guidelines clarifying the provisions of the GDPR regarding profiling, in particular by defining in more detail what profiling is.

Data Protection Impact Assessment (“DPIA”): Where processing activities are deemed likely to result in a high risk to the rights and freedoms of data subjects, the GDPR requires that data controllers carry out, prior to the contemplated processing, an assessment of its impact on the protection of personal data. However, the GDPR does not detail the specific criteria that need to be taken into account to determine whether any given processing activities present a “high risk”. Instead, the GDPR only provides a non-exhaustive list of examples falling within this scope. Similarly, no process for performing DPIAs is detailed in the GDPR. Considering the need for additional information in this respect, the WP29 issued Guidelines in October 2017 intended to clarify which processing operations must be subject to DPIAs and how they should be carried out. 
Privacy-Friendly Techniques and Practices: “Privacy by design” is the idea that a product or service should be conceived from the outset to ensure a certain level of privacy for an individual’s data. “Privacy by default” is the idea that a product or service’s default settings should help ensure the privacy of an individual’s data. The GDPR establishes privacy by design and privacy by default as essential principles. Accordingly, businesses should only process personal data to the extent necessary for their intended purposes and should not store it for longer than is necessary for those purposes. These principles require data controllers to design data protection safeguards into their products and services from the inception of the product development process.

Data Portability: The GDPR establishes a right to data portability, which is intended to make it easier for individuals to transfer personal data from one service provider to another. According to the WP29, as a matter of good practice, companies should develop means of answering data portability requests, such as download tools and Application Programming Interfaces. Companies should guarantee that personal data is transmitted in a structured, commonly used and machine-readable format, and they are encouraged to ensure the interoperability of the data format provided in response to a data portability request. In April 2017, the WP29 issued Guidelines on how to interpret and implement the right to data portability introduced by the GDPR.

Competent Supervisory Authority: To date, the monitoring of the application of EU data protection rules has fallen almost exclusively on the national DPAs. With the adoption of the GDPR, a complex set of rules has been established to govern the allocation of competence among DPAs with respect to data controllers that have cross-border processing practices. 
First, where a case relates only to an establishment of a data controller or processor in a Member State, or substantially affects data subjects only in that Member State, the DPA of that Member State will have jurisdiction to deal with the case.  Second, in other cases concerning cross-border data processing, the DPA of the main establishment of the controller or processor within the EU will have jurisdiction to act as lead DPA for the cross-border processing of that controller or processor. The lead DPA will need to follow the cooperation mechanism provided in Article 60 with the other DPAs “concerned”, and Articles 61 and 62 provide for mutual assistance and joint operations mechanisms, respectively, to ensure compliance with the GDPR. Ultimately, the EDPB (where all EU DPAs and the European Commission are represented) will have decision-making powers in case of disagreement among DPAs as to the outcome of specific investigations.  Third, the GDPR establishes an urgency procedure that any DPA can use to adopt provisional, time-limited measures regarding data processing in urgent cases. These measures will only be applicable in the DPA’s own territory, pending a final decision by the EDPB.  In 2017, the WP29 issued Guidelines that aim to assist controllers and processors in identifying their lead DPA.  Governance: Data controllers and processors may be required to designate a Data Protection Officer (“DPO”) in certain circumstances. Small and medium-sized enterprises are exempted from the obligation to appoint a DPO insofar as data processing is not their core business activity. In April 2017, the WP29 issued Guidelines clarifying the conditions for the designation, position and tasks of the DPO to ensure compliance with the GDPR.  These requirements will be supplemented by a much more rigid regime of fines for violations. 
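The fines regime just mentioned caps upper-tier administrative fines, under Article 83(5) GDPR, at the higher of EUR 20 million or 4% of total worldwide annual turnover (as the next paragraph notes). As a quick arithmetic sketch (the function name and sample turnovers are illustrative):

```python
# Article 83(5) GDPR: upper-tier fines are capped at the HIGHER of a
# fixed amount (EUR 20 million) or 4% of worldwide annual turnover.
FIXED_CAP_EUR = 20_000_000
TURNOVER_RATE = 0.04

def max_fine_eur(worldwide_annual_turnover_eur: float) -> float:
    """Maximum upper-tier GDPR fine for a given worldwide turnover."""
    return float(max(FIXED_CAP_EUR, TURNOVER_RATE * worldwide_annual_turnover_eur))

# For EUR 2 billion turnover, 4% (EUR 80 million) exceeds the fixed cap:
print(max_fine_eur(2_000_000_000))  # 80000000.0
# For EUR 100 million turnover, 4% is only EUR 4 million, so the
# fixed EUR 20 million cap governs:
print(max_fine_eur(100_000_000))    # 20000000.0
```

The “whichever is higher” rule means the percentage prong only bites for undertakings with worldwide turnover above EUR 500 million.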
DPAs will be able to fine companies that do not comply with EU rules up to EUR 20 million or up to 4% of their global annual turnover, whichever is higher. 3. Guidance Adopted by the Former WP29 and the Current EDPB As indicated above, the main EU data protection body under the now repealed EU Data Protection Directive—the WP29—has been replaced by the current EDPB, which took office on 25 May 2018. Both the WP29 (until 25 May 2018) and the EDPB (from that date onwards) have put out for public consultation and adopted Guidelines on the interpretation and application of certain key provisions and aspects of the GDPR. These Guidelines, some of which have been discussed in sub-section I.A.2 above, include the following:  GDPR applicability: EDPB Guidelines 3/2018 on the territorial scope of the GDPR (Article 3) – version for public consultation. Requirements to obtain valid consent: Guidelines on consent under Regulation 2016/679, WP259 rev.01. Information and transparency obligations: Guidelines on transparency under Regulation 2016/679, WP260 rev.01. Automated decision-making and profiling: Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679, WP251 rev.01. Right to data portability: Guidelines on the right to data portability under Regulation 2016/679, WP242 rev.01. Data breach notification obligations: Guidelines on Personal data breach notification under Regulation 2016/679, WP250 rev.01. Data protection impact assessment: Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is “likely to result in a high risk” for the purposes of Regulation 2016/679, WP248 rev.01. Data Protection Officers: Guidelines on Data Protection Officers (“DPO”), WP243 rev.01. Derogations to maintain records of processing activities: Position Paper on the derogations from the obligation to maintain records of processing activities pursuant to Article 30(5) GDPR. 
Certification bodies and criteria: EDPB Guidelines 4/2018 on the accreditation of certification bodies under Article 43 of the General Data Protection Regulation (2016/679). EDPB Guidelines 1/2018 on certification and identifying certification criteria in accordance with Articles 42 and 43 of the Regulation 2016/679 – version for public consultation. Transfers of personal data outside the EU: EDPB Guidelines 2/2018 on derogations of Article 49 under Regulation 2016/679. Working Document Setting Forth a Co-Operation Procedure for the approval of “Binding Corporate Rules” for controllers and processors under the GDPR, WP 263 rev.01. Recommendation on the Standard Application for Approval of Controller Binding Corporate Rules for the Transfer of Personal Data, WP 264. Recommendation on the Standard Application form for Approval of Processor Binding Corporate Rules for the Transfer of Personal Data, WP 265. Working Document setting up a table with the elements and principles to be found in Binding Corporate Rules, WP 256 rev.01. Working Document setting up a table with the elements and principles to be found in Processor Binding Corporate Rules, WP 257 rev.01. Adequacy Referential, WP 254 rev.01. Identification of the lead DPA: Guidelines for identifying a controller or processor’s lead supervisory authority, WP244 rev.01. Fines and penalties imposed by DPAs: Guidelines on the application and setting of administrative fines for the purposes of the Regulation 2016/679, WP 253. 4. National Data Protection Initiatives Implementing and Applying the GDPR Because the GDPR is a regulation, there is no need for EU Member States to transpose its provisions in order to render them applicable within their national legal systems. However, some Member States nonetheless have adapted their legal frameworks regarding data protection in light of the GDPR. The GDPR contains provisions granting flexibility to the Member States to implement such adaptations. 
For example, Article 8 of the GDPR provides specific rules regarding the processing of personal data of children below the age of 16. Nevertheless, Member States may provide by law for a lower age, provided it is not below 13 years. Article 88 of the GDPR also enables Member States to set out more specific rules to ensure the protection of rights and freedoms in respect of the processing of employees’ personal data in the employment context. Below is an overview of the national data protection reforms implemented throughout the EU during 2018 (Member State – national data protection law adopted):
Austria: Federal Act on the Protection of Individuals with regard to the Processing of Personal Data (the “Data Protection Act” (DSG), BGBl. I No. 165/1999), of 17 August 1999.
Belgium: – Law on the creation of the Data Protection Authority, of 3 December 2017 (the “Institutional Law”). – Law on the protection of natural persons with regard to the processing of personal data, of 30 July 2018 (the “Substantive Law”). – Law on economic matters which introduces collective redress action, of 30 July 2018 (the “Collective Redress Law”). – Law on the installation and use of cameras, of 21 March 2018 (the “Camera Law”) [modifying the Law of 21 March 2007]. – Law on the creation of an Information Security Committee, of 5 September 2018 (the “Information Security Law”).
Bulgaria: On 30 April 2018, a draft law amending and supplementing the Personal Data Protection Act of 4 January 2002 was introduced for public consultation. Public consultations ended on 30 May 2018, and the draft law was submitted to the Parliament, where it is subject to further amendments.
Croatia: Act on the Implementation of the General Data Protection Regulation, of 27 April 2018.
Cyprus: Law on the Protection of Physical Persons Against the Processing of Personal Data and Free Movement of such Data, Law 125(I)/2018. 
Czech Republic: Draft of the new Data Protection Act (the “DPA”), intended to adapt the current national legal framework to the GDPR. The DPA is in the legislative process, currently in its second reading in the Chamber of Deputies (the lower chamber of the Czech Parliament), and is expected to replace the current act on data protection.
Denmark: Danish Act on Data Protection, of 17 May 2018.
Estonia: – Personal Data Protection Act (the “PDPA”), of 12 December 2018. – Personal Data Protection Implementation Act.
Finland: Data Protection Act of Finland, which entered into force on 1 January 2018. Some minor amendments will be made to the Working Life Act (which aims to promote the protection of privacy and other rights safeguarding privacy in working life); a Government Proposal regarding these amendments was submitted in July 2018. The amendments have not yet been passed, but the objective is for the amended act to enter into force as soon as possible.
France: – Ordinance No. 2018-1125, of 12 December 2018. – Law No. 2018-493 on the protection of personal data, of 20 June 2018. – Decree No. 2018-687, of 1 August 2018.
Germany: German Federal Data Protection Act, of 5 July 2017.
Greece: Greece has not yet issued a national law implementing the GDPR. On 5 March 2018, a public consultation on the new law was completed; however, the draft has not yet been submitted to the Greek Parliament.
Hungary: Amendment to Act CXII of 2011 on the Right of Informational Self-Determination and on Freedom of Information.
Ireland: Data Protection Act 2018, of 24 May 2018.
Italy: – Law No. 163, of 6 November 2017, adopting specific provisions with respect to the GDPR. – Legislative Decree 101/2018, of 10 August 2018.
Latvia: Personal Data Processing Law, of 21 June 2018.
Lithuania: Law on Legal Protection of Personal Data, of 16 July 2018.
Luxembourg: Law on the organization of the National Data Protection Commission (“CNPD”), of 1 August 2018. 
Malta: Data Protection Act 2018 (Chapter 586 of the Laws of Malta), of 28 May 2018, and the Regulations issued under it.
Netherlands: Dutch GDPR Implementation Act, of 16 May 2018.
Poland: Personal Data Protection Act, of 24 May 2018.
Portugal: On 26 March 2018, the Portuguese government published a Draft Law (the “Draft”) for the implementation of the GDPR and associated national derogations. On 3 May 2018, the Draft was submitted to the Portuguese Parliament for discussion and is currently being studied by a special group of the Portuguese Parliament. The applicable law is still Law no. 67/98, of 26 October (as amended by Law 103/2015, of 24 August) on personal data protection.
Romania: Law no. 190/2018 on the measures for the application of the GDPR.
Slovakia: – Act No. 18/2018 Coll. on the Protection of Personal Data, which implements the GDPR, was adopted by the Slovak Parliament on 29 November 2017, published in the Collection of Laws on 30 January 2018, and entered into force on 25 May 2018. – The Decree of the Office for Personal Data Protection No. 158/2018 Coll. on the Data Protection Impact Assessment Procedure.
Slovenia: The new Slovenian Data Protection Act (the “ZVOP-2”) is currently in the legislative pipeline and will repeal the current Data Protection Act (the “ZVOP-1”).
Spain: Organic Law 3/2018 on the protection of personal data and guarantee of digital rights, of 5 December 2018.
Sweden: Data Protection Act (2018:218) with its complementary provisions (2018:19), of 19 April 2018.
United Kingdom: Data Protection Act 2018, of 23 May 2018.
5. GDPR Cases, Investigations and Enforcement In the course of 2018, EU data protection authorities continued their enforcement action against companies and organizations for violations of their pre-GDPR legal regimes (i.e., under the EU Data Protection Directive). 
Furthermore, soon after the GDPR became applicable and Member States adapted their legal frameworks accordingly, investigations into data breaches and potential infringements of the GDPR began. The most significant cases are set out below. a) Data breaches and investigations In the UK, the Information Commissioner’s Office (“ICO”) has been particularly active in investigating unauthorized or illegal access to, and loss of, personal data. In early 2017, a number of media reports in The Observer newspaper claimed that a data analytics service had worked for the Leave.EU campaign during the EU referendum, providing data services that supported micro-targeting of voters. In March 2017, the ICO announced that it would begin a review of evidence as to the potential risks arising from the use of data analytics in the political process. Following that review of the available evidence, the ICO announced in May 2017 that a broader formal investigation into the use of data analytics in political campaigns would be launched, in order to ascertain whether there had been any misuse of personal data or breaches of data protection law by the campaigns, on both sides, during the referendum. In addition to the potential links between this data analytics organization and Leave.EU, which gave rise to the investigation, the ICO later identified further lines of enquiry covering 30 organizations. According to an official investigation update, the investigation is considering both regulatory and criminal issues, namely failure to properly comply with the Data Protection Principles, failure to properly comply with the Privacy and Electronic Communications Regulations, and potential offences under the Data Protection Act 1998.  
So far, although the investigation is still ongoing, the ICO has issued one of the organizations involved with a monetary penalty in the sum of GBP 500,000 for lack of transparency and security issues relating to the collection, processing and storage of data, constituting breaches of the first and seventh data protection principles under the Data Protection Act 1998.  In November 2018, the ICO also announced it was investigating an international hotel management company after a data breach had been brought to its attention. According to public sources, personal data including credit card details, passport numbers and dates of birth of up to 300 million people had been stolen in a cyber-attack on the parent company of the international hotel management company.  In France, the company “Optical Center” was fined EUR 250,000 by the French National Data Protection Commission (“CNIL”) for failing to secure its website. Through the website it was possible to access hundreds of customer invoices containing health data and, in some cases, the social security numbers of the data subjects concerned. This is one of the highest sanctions ever pronounced by the CNIL before the GDPR came into force, and it illustrates the seriousness with which the CNIL approaches data protection and data breach violations. In another matter pre-dating the application of the GDPR, the Hungarian regulator imposed a fine of HUF 20 million (approx. EUR 62,000, the maximum fine under the Hungarian Act implementing the EU Data Protection Directive) on the Hungarian Church of Scientology for serious breaches of the local Data Protection Act. b) GDPR investigations In addition to the cases mentioned above, GDPR investigations have also proliferated across the Member States, based on facts occurring and brought to the attention of supervisory authorities after 25 May 2018. 
On 25 and 28 May 2018, the CNIL in France received group complaints from the associations None Of Your Business and La Quadrature du Net. In these complaints, the associations alleged that Google LLC did not have a valid legal basis to process the personal data of the users of its services, particularly for the purposes of customizing and delivering targeted ads. After an investigation period and on the basis of online inspections, the CNIL found that two types of GDPR breaches had occurred in this context: a breach of transparency and information obligations, and a violation of the obligation to have a legal basis for customizing and delivering targeted ads. On these grounds, the CNIL imposed a financial penalty of EUR 50 million on Google LLC on 21 January 2019.  In particular, the CNIL considered that Google users were not able to fully understand the scope of the processing operations carried out by Google LLC and that the purposes of these processing operations were described in an overly generic and vague manner. Similarly, the information communicated was not considered clear enough for users to understand that the legal basis of the processing operations for ad targeting is consent, and not the legitimate interest of the company. Finally, the CNIL noted that information on data retention periods was not provided for some categories of data.  In Ireland, an online news and social networking service is currently being investigated by the Irish privacy authorities over its refusal to give a user information about how it tracks users when they click on links posted on the service. The company refused to disclose the data it recorded when the user clicked on links posted by others, claiming that providing this information would take a disproportionate effort. 
In December 2018, the Irish Data Protection Commission opened a statutory inquiry into the company’s compliance with the relevant provisions of the GDPR, following receipt of a number of breach notifications from the company since the introduction of the GDPR.  B. International Transfers: Adequacy Declarations and Challenges 1. Adequacy Declarations Both under the former EU Data Protection Directive and the current GDPR, transfers of personal data outside of the EU are generally prohibited unless, inter alia, the European Commission formally concludes that the legislation of the country of destination protects the data adequately. Thus far, the European Commission has recognized only the following countries as providing adequate protection to personal data: Andorra, Argentina, Canada (commercial organizations), Faroe Islands, Guernsey, Israel, Isle of Man, Jersey, New Zealand, Switzerland, Uruguay and the U.S. (limited to the EU-U.S. Privacy Shield framework).  In the course of 2018, adequacy talks proceeded with regard to two major Asian economies: Japan and South Korea. a) Japan Negotiations with the EU on a finding of reciprocal adequacy with Japan took place over the last few years and concluded on 17 July 2018. Upon the conclusion of these negotiations, the EU and Japan agreed to recognize each other’s regimes for the protection of personal data as adequate, thereby enabling safe transfers of personal data between the EU and Japan. This arrangement is meant to complement the EU-Japan Economic Partnership Agreement,  enabling European and Japanese companies to benefit from free data flows, as well as from privileged access to approximately 650 million European and Japanese consumers. On 5 September 2018, the European Commission formally launched the procedure for the finding of adequacy of the data protection regime in Japan.  
In issuing its draft adequacy decision to cover transfers of personal data to Japan, the European Commission highlighted the following commitments that Japan made to improve the protection of EU personal data: Japan committed to adopt a set of rules providing individuals in the EU whose personal data are transferred to Japan with additional safeguards that will bridge several differences between the data protection systems of the two jurisdictions. These additional safeguards will strengthen, for example, the protection of sensitive data, the conditions under which EU data can be further transferred from Japan to another third country, and the exercise of individual rights of access and rectification. These rules will be binding on Japanese companies importing data from the EU, and they will be enforceable by the Japanese independent data protection authority and by Japanese courts. The Japanese government also gave assurances to the EU regarding safeguards concerning access to data by Japanese public authorities for criminal law enforcement and national security purposes, ensuring that any use of personal data would be limited to what is necessary and proportionate, and subject to independent oversight and effective redress mechanisms. Japan committed to implement a complaint-handling mechanism to investigate and resolve complaints from Europeans regarding access to their data by Japanese public authorities. This new mechanism will be administered and supervised by the Japanese independent data protection authority. On 5 December 2018, the EDPB issued its opinion on the draft adequacy decision prepared by the European Commission with regard to Japan.  Although the EDPB praised the efforts of the European Commission to reach an understanding with the Japanese government, a number of outstanding points were identified as being crucial for the finding of adequacy of the Japanese data protection regime. 
In particular: The EDPB remarked that the system for monitoring the new adequacy architecture, which combines the existing Japanese legal framework with specific Supplementary Rules applicable to EU personal data, will pose certain challenges to ensuring compliance by Japanese entities and enforcement by the Personal Information Protection Commission (“PPC”). The EDPB raised some concerns regarding the possibility of onward transfers of EU data from Japan to third countries that are only subject to a Japanese adequacy decision, but not to an adequacy decision from the European Commission. The EDPB also expressed some concerns in relation to the consent and transparency obligations of data controllers. In contrast to EU data protection law, the use of consent as a basis for the processing and transfer of personal data has a central role in the Japanese legal system. Some inconsistencies between the definitions of consent under EU and Japanese law, such as the requirement of “free” consent or the existence of a right to withdraw consent, could be interpreted to cast doubt on data subjects’ ability to exercise genuine control over their personal data. The EDPB raised some questions regarding the availability and accessibility of the “helpline” of the Japanese data protection authority for EU data subjects. Certain important documentation is only available in the Japanese-language version of official websites, if at all, which will raise challenges for EU data subjects seeking to rely on Japanese data protection regulations. In addition to the opinion issued by the EDPB, the draft adequacy decision will be subject to the following procedure: consultation of a committee composed of representatives of the Member States (comitology procedure); update of the European Parliament Committee on Civil Liberties, Justice and Home Affairs; and adoption of the adequacy decision by the College of Commissioners. 
b) South Korea Negotiations between the EU and South Korean authorities took place in the course of 2018 with a view to adopting an adequacy decision. Although the negotiations have remained confidential so far, it has been reported that the main concerns of the EU authorities relate to the independence and powers of the South Korean data protection authority.  While the Personal Information Protection Act of 2011 created a Personal Information Protection Commission, the independence of this body, which lacks enforcement powers, has been questioned. The South Korean Homeland and Security Ministry is tasked with the enforcement of the Personal Information Protection Act. On 15 November 2018, amendments to the Personal Information Protection Act were submitted to the South Korean National Assembly in order to grant enforcement powers and functions to the Personal Information Protection Commission. 2. Challenges to Data Transfer Systems a) Challenges to Standard Contractual Clauses As noted in the 2018 International Outlook and Review, on 3 October 2017, the Irish High Court referred the issue of the validity of the standard contractual clauses decisions to the CJEU for a preliminary ruling.  The proceedings before the CJEU are still ongoing, and a ruling is expected in 2019 or 2020. If the CJEU decides to invalidate the standard contractual clauses, the ruling would, in all likelihood, have a tremendous impact on businesses around the world, many of which rely on these legal guarantees to ensure an adequate level of data protection for data transfers outside the EU. b) Challenges to the EU-U.S. Privacy Shield On 12 July 2016, the European Commission formally approved the EU-U.S. Privacy Shield, a framework for navigating the transatlantic transfer of data from the EU to the United States. The Privacy Shield replaced the EU-U.S. Safe Harbor framework, which was invalidated by the CJEU on 6 October 2015 in Maximilian Schrems v. Data Protection Commissioner.  
We provided an in-depth explanation of the Privacy Shield and a discussion of the Schrems decision in the 2018 International Outlook and Review. Since the adoption of the Privacy Shield program in 2016, approximately 4,000 companies have adhered to the Privacy Shield framework, making legally enforceable commitments to comply with the Privacy Shield rules and principles. However, the success of the Privacy Shield has not sheltered it from challenges by politicians, DPAs and individuals across Europe. On 16 September 2016, Digital Rights Ireland Ltd., an organization that had previously been successful in obtaining the repeal of other EU legislation concerning personal data,  brought an action against the European Commission decision approving the EU-U.S. Privacy Shield. On 22 November 2017, the CJEU declared the action inadmissible, thereby giving some relief to the companies relying on this framework to transfer personal data to the U.S. Notwithstanding this, on 5 July 2018 the European Parliament adopted a non-binding resolution recommending the suspension of the EU-U.S. Privacy Shield unless certain corrective actions were adopted by the U.S. administration, including: fully aligning the Privacy Shield with the GDPR, and making the Privacy Shield fully compliant with the recommendations issued by the WP29 on 28 November 2017.  In October 2018, EU Commissioner Věra Jourová, U.S. Secretary of Commerce Wilbur Ross, and members of the respective EU and U.S. administrations and authorities met on the occasion of the second annual review of the Privacy Shield.  During these meetings, the governments of both jurisdictions discussed the nomination and functioning of the Privacy and Civil Liberties Oversight Board and of the Privacy Shield Ombudsperson Mechanism, which are important elements to guarantee the application and enforcement of the Privacy Shield. 
Finally, it is notable that although the case pending before the CJEU on the referral from the Irish High Court concerns primarily the standard contractual clauses, a number of the questions posed by the Court refer to the adoption of the Privacy Shield and its influence on the overall assessment of the standard contractual clauses. C. EU Cybersecurity Directive 1. Adoption and Implementation of the EU Cybersecurity Directive In the EU, cybersecurity legislation addressing incidents affecting essential service and digital service providers is primarily covered by the NIS Directive, adopted on 6 July 2016. As explained in the 2018 International Outlook and Review, the NIS Directive is the first set of cybersecurity rules to be adopted at the EU level, adding to an already complex array of laws with which companies must comply when implementing security and breach response processes. It aims to set a minimum level of cybersecurity standards and to streamline cooperation between EU Member States at a time of growing numbers of cybersecurity breaches. The NIS Directive is not directly applicable by authorities and courts, and set a deadline for Member States to transpose it into national law by May 2018. Thus, in the course of the last year, Member States have endeavored to adopt the necessary regulations and empower the appropriate authorities to transpose, apply and enforce the NIS Directive. The final text of the NIS Directive sets out separate cybersecurity obligations for (i) essential service providers and (ii) digital service providers: Essential service providers include actors in the energy, transport, banking and financial markets sectors, as well as the health, water and digital infrastructure sectors. Digital service providers include online marketplaces, search engines and cloud services (with an exemption for companies with fewer than 50 employees), but not social networks, app stores or payment service providers. 
The clear aim of the NIS Directive is to harmonize the EU Member State rules applicable to the security levels of network and information systems across the EU. However, given the strategic character of certain services covered by the NIS Directive, it confers some powers and a margin of discretion on Member States. For example, the NIS Directive mandates each EU Member State to adopt a national strategy on the security of network and information systems, defining the objectives, policies and measures envisaged with a view to achieving the aims of the NIS Directive.  Thus, despite the ability of Member States to seek the assistance of the European Union Agency for Network and Information Security (“ENISA”), the development of a strategy will remain a national competence. Furthermore, as far as operators of essential services are concerned, EU Member States will identify the relevant operators subject to the NIS Directive and may impose stricter requirements than those laid down in the NIS Directive (in particular with regard to matters affecting national security).  In contrast, Member States should not identify digital service providers (as the NIS Directive applies to all digital service providers within its scope) and, in principle, may not impose any further obligations on such entities.  The European Commission retains powers to adopt implementing rules regarding the application of the security and notification requirements applicable to digital service providers.  It is expected that these rules will be developed in cooperation with ENISA and stakeholders, and will enable a uniform treatment of digital service providers across the EU. In addition, the competent authorities will only be able to carry out supervisory activities when there is evidence that a digital service provider is not complying with its obligations under the NIS Directive. 
Another tool for coordination among authorities will be the envisaged “Cooperation Group”, similar to the WP29 that previously operated under the 1995 Data Protection Directive. The Cooperation Group will bring together the regulators of all EU Member States, who have different legal cultures and take different approaches to IT and security matters (e.g., those affecting national security). It is therefore expected that the European Commission will play an active role in building trust and consensus among the Cooperation Group’s members, with a view to providing meaningful and clear guidance to businesses. 2. Documents and Guidance Issued by ENISA In the course of 2018, ENISA has been particularly active in issuing guidance and evaluating the responsiveness of EU authorities, stakeholders and systems in responding to cyberattacks. In particular: ENISA has published a number of guidance documents intended to assist private parties in their evaluation of security measures adopted in application of EU instruments, such as the NIS Directive  and the Open Internet Regulation.  Following the trend toward increased use of consumer products and services relying on cloud services and the Internet of Things, ENISA has issued a number of guidance documents providing companies with an overview of the potential risks and redress measures in this context. These include the “Good practices for Security of Internet of Things in the context of Smart Manufacturing”, of November 2018,  and the working document “Towards secure convergence of Cloud and IoT”, of September 2018.  On 6-7 June 2018, ENISA held Cyber Europe 2018, a yearly exercise that simulates an intense, realistic crisis caused by a large number of cybersecurity incidents. During the exercise, cooperation among the EU Member States was found to have improved at the technical level and to be efficient. However, ENISA also noted that the private sector had to prioritize investment in IT security, particularly with regard to essential service operators.  D. 
Other EU Developments 1. Reform of the ePrivacy Directive – the Draft EU ePrivacy Regulation As explained in the 2018 International Outlook and Review, 2016 saw the initiation of the procedures for the reform of the EU’s main set of rules on ePrivacy, the ePrivacy Directive. In this context, further to a public consultation held by the European Commission, the first proposal of the future EU ePrivacy Regulation (the “draft ePrivacy Regulation”) was released on 10 January 2017. In 2017, the draft ePrivacy Regulation was the subject of an opinion of the WP29 (4 April 2017) and an amended version issued by the European Parliament (20 October 2017). Since then, in the course of 2018, internal discussions have been ongoing at the level of the Council of the EU, which concluded with the issuance of two revised versions of the draft ePrivacy Regulation, dated 10 July and 19 October 2018. Given the progress made, the ePrivacy Regulation is expected to be adopted in 2019. a) The European Commission’s ePrivacy Regulation Proposal The Commission’s ePrivacy Regulation proposal released in January 2017 sought to adapt the reform of the ePrivacy regime to the feedback received from stakeholders and the WP29. In summary, the draft ePrivacy Regulation prepared by the European Commission constituted a more comprehensive piece of legislation that aimed to resolve certain open issues identified in the application of the ePrivacy Directive: Regulation versus Directive: The European Commission’s proposal to replace the ePrivacy Directive with a Regulation means that its terms will in principle apply directly across all EU Member States, and will not require transposition at the national level (e.g., via the adoption of laws by the parliaments of the different Member States). This decision is consistent with the approach adopted with regard to the GDPR. 
Although Member States will still be given some freedom to deviate from the ePrivacy Regulation (particularly in the area of national security), the choice to adopt a Regulation will increase the homogeneous application of the ePrivacy Regulation across all EU Member States. Alignment with the GDPR: A number of provisions in the European Commission’s draft ePrivacy Regulation demonstrated alignment with the GDPR. For example, like the GDPR, the draft ePrivacy Regulation had a broad territorial scope and applied to the provision of electronic communication services (e.g., voice telephony, SMS services) from outside the EU to residents in the EU. As indicated below, the draft ePrivacy Regulation also aimed to close the gap with the GDPR from an enforcement perspective, by empowering DPAs to monitor the application of the privacy-related provisions of the draft ePrivacy Regulation under the conditions established in the GDPR. From a substantive perspective, the definitions of a number of legal concepts used in both the GDPR and the draft ePrivacy Regulation were also aligned (e.g., the conditions for “consent”, the “appropriate technical and organizational measures to ensure a level of security appropriate to the risks”). Inclusion of OTT Service Providers: In response to the feedback of stakeholders, the draft ePrivacy Regulation indicates that the new Regulation will apply to providers of services that run over the Internet (referred to as “over-the-top” or “OTT” service providers), such as instant messaging services, video call service providers and other interpersonal communications services. Cookies and Other Connection Data: Like the ePrivacy Directive, the draft ePrivacy Regulation contained a provision that addressed the circumstances under which the storage and collection of data on users’ devices is lawful. These practices may still be based on the prior consent obtained from users. 
In the absence of users’ consent, according to the draft ePrivacy Regulation, it would still be possible to carry out these practices provided that: they serve the purpose of carrying out (not facilitating) the transmission of a communication over an electronic communications network; or they are necessary (albeit not strictly necessary) for providing: (i) a service requested by the end user; or (ii) first-party web audience measuring. The recitals of the draft ePrivacy Regulation suggested that the circumstances under which consent would not be required could be interpreted more broadly than under the current ePrivacy Directive. By contrast, the draft ePrivacy Regulation contains a new set of seemingly more stringent rules applicable to the “collection of information emitted by terminal equipment to enable it to connect to another device and/or to network equipment”. Supervisory Authorities and EDPB: One of the novelties introduced by the draft ePrivacy Regulation was a section devoted to the appointment and powers of national supervisory authorities. The relevant provisions clarify that the DPAs responsible for monitoring the application of the GDPR shall also be responsible for monitoring the application of the provisions of the draft ePrivacy Regulation related to privacy in electronic communications, and that the rules on competence, cooperation and powers of action of DPAs foreseen in the GDPR also apply to the draft ePrivacy Regulation. b) The WP29 Opinion on the European Commission Proposal Following the release of the European Commission’s proposal, the WP29 issued its opinion on the proposed draft ePrivacy Regulation in April 2017. 
While the WP29 welcomed the proposal and the choice of a regulation as the regulatory instrument, it highlighted four points of “grave concern” that would “lower the level of protection enjoyed under the GDPR” if adopted, and made recommendations in this respect concerning: The rules concerning the tracking of the location of terminal equipment, for instance WiFi tracking, which are inconsistent with the rules of the GDPR. The WP29 advised the European Commission to “promote a technical standard for mobile devices to automatically signal an objection against such tracking”. The conditions under which content and metadata can be analyzed should be limited: consent of all end-users (senders and recipients) should be the principle, with limited exceptions for “purely personal purposes”. Barriers used by some websites to completely block access to the service unless visitors agree to third-party tracking, known as “tracking walls”, should be explicitly prohibited to give individuals the choice to refuse such tracking while still being able to access the website. Terminal equipment and software should offer “privacy protective settings” by default, in addition to allowing the user to adjust these settings. The WP29 indicated that it expected its concerns to be addressed during the ongoing legislative process. c) The European Parliament’s Amended Proposal In October 2017, the European Parliament proposed an amended version of the European Commission’s draft ePrivacy Regulation, which introduced more stringent rules on the use of personal data and on the respect of users’ privacy. Some of the notable changes include: The prohibition on blocking access to a service solely because the user has refused the processing of personal data which is not necessary for the functioning of the service. 
The requirement for providers of electronic communications services to ensure the confidentiality of the data, for instance with end-to-end encryption and the prohibition of backdoors. The requirement for browsers to block third-party cookies by default until the user has adjusted his/her cookie settings. The prohibition of “cookie walls” and cookie banners that prevent the use of the service unless users agree to all cookies. d) The Proposal of the Council of the EU In addition to the Parliament’s version of the draft ePrivacy Regulation, the Council of the EU has also published a number of working proposals and amendments. The two latest documents related to the draft ePrivacy Regulation were published on 10 July and 19 October 2018, and they introduced some important changes to the proposals of the European Commission and of the European Parliament. On 10 July 2018, the EU Council published some revisions to the draft ePrivacy Regulation, which focused primarily on the following key points:  The draft introduced the possibility for “further compatible processing of electronic communications metadata”. This amendment suggests the broadening of the scope of permissible processing for research purposes, which would enable private parties to pursue research and innovation. The Council of the EU also called for the draft ePrivacy Regulation to be “more future-proof”, providing flexibility to enable developments in a rapidly changing digital environment. Other amendments made by the EU Council sought to clarify the lawfulness of processing operations carried out in the course of operators’ daily business. For example, new language introduced in Article 6(2)(b) clarified that the processing of metadata for the purposes of calculating and billing interconnection payments is permitted. The EU Council also sought to clarify the rules applicable to the storage and processing of data on end-users’ equipment. 
Pursuant to the Council’s revisions, the responsibility for obtaining consent for the storage of a cookie or similar identifier lies with the entity that collects information from end-users’ terminal equipment, such as an information society service provider or an ad network provider. However, these entities may request another party to obtain consent on their behalf. The Council’s amendments also clarify that the end-user’s consent to the storage of a cookie or similar identifier may also entail consent for subsequent readings of the cookie in the context of a revisit to the same website domain initially visited by the end-user. The EU Council suggests the deletion of the entire Article 10 of the draft ePrivacy Regulation, and the corresponding recitals, which obliged software providers to inform the end user whenever privacy settings are updated. On 19 October 2018, the EU Council issued a new revised version of the draft ePrivacy Regulation, which included further edits and amendments in addition to those published in July. One of the most significant changes introduced to the draft ePrivacy Regulation is the recognition of the ability of information society services to use tracking technologies on the computers of individuals, without consent, for websites that partly or wholly finance themselves through advertisement, provided that the information obligations have been complied with and that the user “has accepted this use” of the data (as opposed to requiring full-blown consent). The EU Council also added to the draft ePrivacy Regulation a new Article 6(1)(c), which allows the processing of electronic communications data when necessary to ensure the security and protection of terminal equipment. 
This and other similar changes introduced by the Council aim to achieve a certain coherence between these provisions and the security obligations to which information society services are subject, enabling the latter to use security tools that require the processing of data contained in the terminal equipment without obtaining prior consent. 2. CJEU Case Law 2018 also witnessed important cases before the CJEU on the application of the EU Data Protection Directive, the GDPR and the ePrivacy Directive. a) The Determination of the Applicable Law and the Relevant Data Controller in the Context of Social Networks On 5 June 2018, the CJEU delivered a ruling in Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v. Wirtschaftsakademie Schleswig-Holstein GmbH which clarified the definition of data controller, the determination of the applicability of national data protection legislation, and the powers of DPAs in cases concerning controllers established in multiple Member States. First, the CJEU indicated that administrators of webpages hosted by third parties (e.g., fan pages hosted by social networks) that knowingly make use of the host’s services (e.g., audience statistics) may be considered (co)controllers of the data processed in the context of visitors’ traffic to the webpage. In doing so, the CJEU recognized the joint responsibility of the operator of the third-party website (e.g., the social network) and the administrator of the webpage (e.g., a fan page) in relation to the processing of the personal data of visitors to that page, which is deemed to contribute to ensuring more complete protection of the rights of persons visiting a fan page. 
Second, the CJEU found that, while an establishment of a controller focused on the sale of advertising space and other marketing activities may be subject to the laws and the powers of the DPA of the Member State where it is established, such laws and powers may not extend to an establishment of the same controller located in another Member State. The judgment in Wirtschaftsakademie was followed by an Opinion of the EU’s Advocate General Michal Bobek in Fashion ID GmbH & Co. KG v. Verbraucherzentrale NRW e.V., which also addressed the question of determining who is the data controller in the context of the use of tools to collect and transmit cookie data (e.g., social plug-ins). The Advocate General found that an entity or organization which has embedded a third-party plug-in in its website, causing the collection and transmission of the user’s personal data, must be considered a controller, even if it is unable to influence the data processing operation resulting from the functioning of the plug-in. However, the Advocate General observed that a controller’s joint responsibility should be limited to those operations for which it effectively co-determines the means and purposes of the processing of the personal data. The Advocate General proceeded to note that, where the processing of cookie data resulting from the use of plug-ins is based on the legitimate interests of controllers or third parties, the legitimate interests of both the website operator and the plug-in provider should be taken into account as joint controllers, and an assessment should be made balancing those interests against the rights of the data subjects. 
Finally, the Advocate General concluded that the data subject’s consent has to be given to a website operator which has embedded third-party content, that the EU Data Protection Directive must be interpreted as meaning that the obligation to inform also applies to that website operator, and that both consent and information must be given before the data are collected and transferred. However, he noted that the extent of those obligations corresponds to that operator’s joint responsibility for the collection and transmission of the personal data. b) Claims Assignment As indicated in the 2018 International Outlook and Review, Mr. Schrems started legal proceedings against Facebook Ireland Limited before a court in Austria, which raised the question of whether jurisdiction was established in the domicile of a consumer claimant who was assigned claims by other consumers, thus opening up the possibility of collecting consumer claims from around the world. On 14 November 2017, Advocate General Bobek delivered his opinion in the Maximilian Schrems v. Facebook Ireland Limited case pending before the CJEU. Advocate General Bobek held that a consumer cannot invoke, at the same time as his own claims, claims on the same subject assigned by other consumers domiciled in other places in the same Member State, in other Member States, or in non-Member States. On 25 January 2018, the CJEU concurred with the Advocate General’s opinion, finding that a consumer may assert his own claims in the courts of the place where he is domiciled, but not claims assigned by other consumers domiciled in the same Member State, in other Member States or in non-Member State countries. II. Developments in Other European Jurisdictions: Switzerland, Turkey and Russia The increasing impact of digital services in Europe, as well as the overhaul brought about by the GDPR in the EU, have led certain jurisdictions in the vicinity of the EU to improve their data protection regulations. A. 
Russia Local data privacy laws have been heavily enforced, reflecting the activity of the Russian Data Protection Authority in monitoring and enforcing data protection compliance. As of 1 July 2017, the administrative sanctions in Russia for certain privacy violations were significantly increased. For example, data processing operations exceeding the consent provided by a data subject may result in a fine of RUR 75,000 (approx. USD 1,200; approx. EUR 1,000). Criminal prosecution and prison sentences are also possible for certain types of privacy violations. Another type of enforcement action under Russian law is the blocking of online resources. Thus, if the processing of personal data on a website or in an app violates data protection laws, access to such website or app may be restricted for Russian users pursuant to a court decision. The most well-known and widely debated blocking case related to LinkedIn, which has been blocked since 2016 and remains unavailable to Russian users. This is not the only example – other websites, with smaller user bases, have been blocked in recent years. The Russian Data Protection Authority has been targeting large digital multinationals in the last few years. For example, in 2017, Telegram was fined RUR 800,000 (approx. USD 14,000; approx. EUR 10,500) by Russian courts for failing to provide the Russian Federal Security Service with the decoding keys for access to personal data, as required by the Russian Data Protection Act. In doing so, the Russian courts disregarded Telegram’s arguments based on its lack of control over the encoding and decoding processes of its instant messaging service. On 22 October 2018, Russian courts rejected Telegram’s appeal against the fine. 
The Telegram case shows that, if the technology used by a service provider (insofar as the services relate to communications over the Internet) does not allow state authorities to access unencrypted information, this may be deemed a breach of Russian data protection and cybersecurity laws. B. Switzerland To prepare for the entry into force of the GDPR, the Swiss government has issued a draft of a new Data Protection Act (the “Draft FDPA”) that aims to: – Modernize Swiss data protection law and, to a certain extent, align it with the requirements of the GDPR; and – Maintain Switzerland’s adequacy status granted by the European Commission, to ensure the free flow of personal data between the EU and Switzerland. The Draft FDPA was published by the Swiss Federal Council on 15 September 2017. The Draft FDPA, which will replace the Federal Act on Data Protection of 19 June 1992 (the “FADP”), has the following characteristics: The concept of “sensitive” or “special categories” of personal data under the Draft FDPA covers a wide range of categories of data, including personal data in the “intimate sphere” (e.g., fears, dreams, therapies), biometric data which clearly identifies an individual (e.g., pictures), data on administrative or criminal proceedings and sanctions, and data on social security measures. The Draft FDPA contains a list of basic principles for the processing of personal data which are broadly equivalent to those contained in the GDPR. By contrast, as opposed to the GDPR, the processing of personal data will not require any legal basis under the Draft FDPA (such as consent), unless such processing leads to an unlawful violation of privacy (i.e., the processing of personal data does not comply with the basic data processing principles). Similarly to the GDPR, under the Draft FDPA data subjects have the right to request access to, rectification or erasure of their personal data and not to be subject to automated decision-making. 
However, in contrast to the GDPR, the Draft FDPA does not provide for a right to data portability. The Draft FDPA contains a duty for companies to carry out a DPIA in specific situations, which closely mimic the scenarios envisaged by the GDPR. The Draft FDPA also contains an obligation of privacy by design and by default broadly equivalent to that of the GDPR, which compels companies and organizations to set up technical and organizational measures in order for the data processing to meet the data protection rules. However, the Draft FDPA does not foresee any sanctions or penalties for a violation of these obligations (as opposed to the GDPR). The Draft FDPA includes a general obligation for companies to report to the Federal Data Protection and Information Commissioner (“FDPIC”), as soon as possible, data breaches which are likely to result in a high risk to the privacy or the fundamental rights of data subjects. A notification of the data breach to data subjects may also be required if it is necessary for the protection of the data subjects or if such notification is ordered by the FDPIC. The Draft FDPA does not foresee criminal sanctions for a violation of the obligation to notify data breaches, unless notification of data subjects is to be made based on an order from the FDPIC. The refusal to comply with the FDPIC’s order may be criminally sanctioned with a fine of up to CHF 250,000. Under the Draft FDPA, it will no longer be the FDPIC who provides guidance on the adequacy level of third countries. The Draft FDPA delegates the qualification of adequacy to the Federal Council, which will determine the countries providing an adequate level of data protection. One may expect, however, that the Federal Council will follow closely the adoption of adequacy decisions by the European Commission. 
With regard to investigations and fines, the FDPIC has the right to investigate on its own initiative or upon request, may take investigative measures and is entitled to order certain administrative measures. These investigation proceedings are governed by administrative procedural law and are subject to review by the Federal Administrative Court. However, the FDPIC does not have the power to impose any fines or penalties. Instead, data protection violations lead to personal criminal liability of individuals, subject to fines of up to CHF 250,000 imposed by the ordinary courts in Switzerland. Until the Draft FDPA is finally enacted, the current FADP of 19 June 1992 remains applicable. Initially, the Swiss Federal Council tentatively aimed to enact the Draft FDPA in August 2018. However, in January 2018, the relevant parliamentary commission required that the Draft FDPA be split into two parts to allow more time for deliberation. For companies expecting to be affected by both the Draft FDPA and the GDPR, it may be advisable to align all their processing of personal data with the standards provided under the GDPR. If the implementation and application of the Draft FDPA results in certain obligations being leaner than those contained in the GDPR, these adjustments may be made in the course of the data processing activities (e.g., not applying the exercise of certain rights where these rights are not covered by the Draft FDPA and provided that the GDPR does not apply). To the extent that the Draft FDPA goes beyond the GDPR, the additional requirements should be implemented for any processing subject to the current FADP or the Draft FDPA, respectively. C. Turkey Throughout 2018, the Turkish data protection authority (the “KVKK”) issued a number of regulations and guidance documents regarding issues related to the application and enforcement of the Turkish Data Protection Act No. 6698 of 2016. 
These regulations and guidance documents include the following: Processing of sensitive personal data: On 7 March 2018, the KVKK published a decision regarding the processing of special categories of personal data. Pursuant to this decision, data controllers must establish a separate policy and procedure for the protection of special categories of personal data. The decision further determined special conditions and requirements applicable to the media where such data are stored, the persons who have access to such data and the transfer of such data. Transparency and information obligations: On 10 March 2018, the KVKK published the Communique on Procedures and Principles regarding the Obligation of Data Controllers to Inform, which lays out the content and methodology to be followed by entities and organizations when providing information to data subjects, for example within the scope of their privacy notices. Security measures: On 19 January 2018, the KVKK published a guidance document on the security of personal data in order to assist entities and organizations in their compliance with data protection and security obligations, specifically focusing on technical and administrative measures. The KVKK provided further detailed guidance on the matter in its decision of 31 January 2018 (2018/10). Registration of data controllers: Pursuant to the KVKK Regulation on the Data Controller Registry, published on 30 December 2017, data controllers not exempted from registration by the KVKK must include their details in the KVKK Registry before proceeding to process personal data. Controllers may register online by uploading the required information to the KVKK Registry system. The KVKK also set out the grace periods for different entities in its decision of 19 July 2018 (2018/88). Registration of e-marketing approvals and rejections: In 2018, Turkey adopted Law No. 
7061 Amending Certain Tax Laws and Other Laws, which empowers the Ministry of Customs and Commerce to put in place a system to record the approvals and rejections received by companies for the purposes of e-marketing. This measure was later followed by a decision adopted by the KVKK, mandating all entities and organizations to cease their marketing operations unless they were covered by one of the exceptions provided for by the Turkish Data Protection Act or by consent. Data subject requests: On 10 March 2018, the KVKK also published the Communique on Procedures and Principles of Applications to Data Controllers, which lays out the procedure for data subjects to exercise their rights against data controllers and data controllers’ obligations with regard to such requests. D. Ukraine In Ukraine, on 23 October 2018, the Parliamentary Commissioner for Human Rights issued a draft law aiming to align the Law on Personal Data with the GDPR. The draft law was further updated on 30 October 2018, and is subject to additional revisions until it is finally filed by the Cabinet of Ministers with the Ukrainian Parliament. As it currently stands, the draft law contains the following main amendments: The draft sets out the legal bases upon which an entity may process personal data, including consent, the performance of a contract to which the data subject is a party and the fulfilment of a legal obligation. The draft law borrows from the GDPR a number of principles and definitions, including the concepts of personal data, data processing, profiling and pseudonymisation. Like the GDPR, the draft law also regulates aspects such as the rights of data subjects, the appointment of DPOs, the notification of data breaches and the transfer of personal data to third countries and organizations. In addition to the draft data protection law, on 9 May 2018, the Law on Basic Principles of Ukraine’s Cyber Security came into force. 
The Cyber Security Law mainly applies to “critical infrastructure”, and lays down the regulatory framework for a number of measures to be adopted in implementation of the Law. III. Developments in Asia-Pacific In an increasingly connected world, 2018 also saw many other countries trying to get ahead of the challenges in the cybersecurity and data protection landscape. Several international developments bear brief mention here: A. China As noted in the 2018 International Outlook and Review, China’s Cybersecurity Law came into effect on 1 June 2017, becoming the first comprehensive Chinese law to regulate the management and protection of digital information by companies. The law also imposes significant restrictions on the transfer of certain data outside of the mainland (data localization), enabling government access to such data before it is exported. Despite protests and petitions by governments and multinational companies, the implementation of the Cybersecurity Law continues to progress with the aim of regulating how companies protect digital information. While the stated objective is to protect personal information and individual privacy, and, according to a government statement in China Daily, a state media outlet, to “effectively safeguard national cyberspace sovereignty and security,” the law in effect gives the Chinese government unprecedented access to network data for essentially all companies in the business of information technology. Notably, key components of the law disproportionately affect multinationals because the data localization requirement obligates international companies to store data domestically and to undergo a security assessment by supervisory authorities for important data that needs to be exported out of China. 
Though the law imposes more stringent rules on critical information infrastructure operators (whose information could compromise national security or public welfare) than on network operators (a category broad enough to include virtually all businesses using modern technology), the law effectively subjects a majority of companies to government oversight. As a consequence, the reality for many foreign companies is that these requirements are likely to be onerous, will increase the costs of doing business in China, and will heighten the risk of exposure to industrial espionage. Despite the release of additional draft guidelines meant to clarify certain provisions of the law, the general outlook is that the law is still a work in progress, with its scope and definitions still vague and uncertain. Nonetheless, companies should endeavor to assess their data and information management operations to evaluate the risks arising from the expanding scope of the data protection law, as well as their risk appetite for compliance with the Chinese government’s access to their network data. More recently, on 10 September 2018, the National People’s Congress of China announced, as part of its legislative agenda, that its Standing Committee would consider draft laws with relatively mature conditions, including a draft personal information protection law and a draft data security law. B. Singapore As indicated in the 2018 International Outlook and Review, the Personal Data Protection Commission of Singapore issued, on 7 November 2017, proposed advisory guidelines for the collection and use of national registration identification numbers. The guidance, which covers a great deal of personal and biometric data, emphasized the obligations of companies to ensure that policies and practices are in place to meet the data protection obligations under the Personal Data Protection Act of 2012. 
The Commission gave businesses and organizations 12 months from the date of publication to review their processes and implement the necessary changes to ensure compliance. C. India As noted in the 2018 International Outlook and Review, India issued a white paper in 2017 with the aim of drafting a data protection bill to “ensure growth of the digital economy while keeping personal data of citizens secure and protected”. Further to the publication of this white paper, the Ministry of Electronics and Information Technology published, on 27 July 2018, the Personal Data Protection Bill (the “Bill”) and the Data Protection Committee Report (the “Report”). The Bill comprises 15 chapters and addresses data protection obligations, including grounds for processing personal data and sensitive personal data, personal and sensitive data of children, data principal rights, transparency, accountability measures and the transfer of personal data outside India. In particular, according to its Article 1, the Bill shall apply to the processing of personal data where such data has been collected, disclosed, shared or otherwise processed within the territory of India, and to the processing of personal data by the State, any Indian company, any Indian citizen or any person or body of persons incorporated or created under Indian law. Notwithstanding the above, the Bill also applies to the processing of personal data by data fiduciaries or data processors not present in the territory of India, if they carry out processing of personal data in connection with (i) any business carried on in India, (ii) any systematic activity of offering goods or services to data principals within the territory of India, or (iii) any activity which involves the profiling of data principals within the territory of India. Moreover, the Bill outlines that a data protection authority would be established and penalties would be imposed for violations of its obligations. 
In particular, Article 69(1) of the Bill establishes penalties of up to five crore rupees (approx. USD 700,000; approx. EUR 620,000) or 2% of the data fiduciary’s total worldwide turnover in the preceding financial year, whichever is higher, if the data fiduciary contravenes its obligations to take prompt and appropriate action in response to a data security breach, undertake a DPIA, conduct a data audit or appoint a DPO, or if it fails to register with the relevant authority. Where the data fiduciary contravenes any of its obligations regarding the processing of personal and/or sensitive data, the requirement to adhere to security safeguards or the applicable provisions on the transfer of personal data outside India, the Bill establishes a penalty of up to 15 crore rupees or 4% of the data fiduciary’s total worldwide turnover in the preceding financial year, whichever is higher. In addition, the Report addresses, among other things, existing approaches to data protection, key definitions in the Bill and recommendations received during the white paper consultation.

IV. Developments in Canada and in Latin America

The overhaul of data protection rules in important jurisdictions around the globe has also impacted Canada and Latin America, where some local administrations have bolstered their respective legislation and undertaken initiatives to bring their frameworks closer to that of the EU.

A. Brazil

In Brazil, a new General Data Protection Law was adopted on 14 August 2018 after several years of discussions among decision-makers.
Although the Brazilian Law is more lenient and contains fewer explanations regarding the interpretation and application of its provisions, a number of commonalities can be found between the Law and the GDPR, including the following: Like the GDPR, the Brazilian General Data Protection Law generally excludes anonymized data from its scope of application, except where the anonymization process has been reversed, whether using solely the entity’s own resources or through reasonable efforts. For this purpose, anonymized data is understood to be data that cannot be attributed to an identifiable person using reasonable means. In setting out the obligations of entities processing personal data, the Brazilian General Data Protection Law also considers the conditions under which such processing takes place. For example, while (as indicated above) anonymized data is generally excluded from the scope of the Law, a specific provision brings anonymized data back within scope where it is used to evaluate certain aspects of a natural person (e.g., the behavioral profile of a person, if he or she is identifiable). The Brazilian Law also rests on the basic principle that data processing operations are forbidden unless they rely on one of its previously established legal bases. The Law contains 10 legal bases, consisting of the five legal bases contained in the GDPR plus five additional ones: data processing for the exercise of rights in legal proceedings; data processing for research by study entities (provided that, whenever possible, the data is anonymized); data processing for the protection of an individual’s health; data processing for the protection of credit; and data processing and sharing by the public administration as required for public policy enforcement under law or contract.
In Brazil, consent is likewise defined as a freely given, informed and unambiguous indication of the data subject’s agreement to the processing of personal data. Furthermore, the Law focuses on giving data subjects meaningful control and choice regarding their personal data. As regards the rights of data subjects, the Brazilian Law also includes a general right to data portability, first envisaged by the GDPR. This right obliges controllers to transfer personal data to another controller at the data subject’s request. The Law also contains a general obligation to report incidents involving the processing of personal data to the national authority and to the data subject within a reasonable timeframe. The notification must include information such as a description of the personal data affected and the data subjects and entities involved, a description of the technical and security measures used to protect personal data, the reasons for any delay in the case of late notifications, and a description of the measures adopted to mitigate or redress the effects of the incident. The Brazilian General Data Protection Law contains a general obligation to appoint a DPO, applicable to data controllers only. However, the Brazilian data protection authority may issue further guidance identifying situations in which this obligation no longer applies. Finally, like the GDPR, the Law prescribes an obligation to carry out a “Report on the Impact on Personal Data Protection” in certain situations where a data processing operation may pose risks to civil liberties and fundamental rights. Like the GDPR, the Brazilian General Data Protection Law provides that personal data can be transferred to third countries that ensure an adequate level of protection, or on the basis of appropriate safeguards.
The safeguards under both laws are essentially the same, except for legally binding instruments between public authorities/bodies, which is a safeguard under the GDPR but, under the Brazilian Law, is limited to international legal cooperation among intelligence, investigation and prosecution bodies (at least until the Brazilian data protection authority regulates the international transfer mechanisms). Fines under the Brazilian General Data Protection Law are capped at 2% of turnover in Brazil in the preceding year or BRL 50 million (approximately USD 13 million), whichever is lower. These caps apply to fines imposed per unlawful conduct. The Brazilian data protection authority was created on 28 December 2018 through Executive Order (MP) 869/2018. It will be composed of five commissioners, appointed by the President of the Republic, and advised by a National Council for the Protection of Personal Data and Privacy composed of 23 unpaid members: 11 members from different spheres of government and 12 members divided between four from the private sector, four from academia and four from civil society. The Executive Order also postpones the entry into force of the Brazilian General Data Protection Law to August 2020.

B. Canada

As noted in the 2018 International Outlook and Review, Canada opened for comments in 2017 a proposed regulation that would mandate reporting of privacy breaches under its Personal Information Protection and Electronic Documents Act (“PIPEDA”). On 1 November 2018, amendments to PIPEDA came into force. The law now establishes that, where an organization subject to PIPEDA experiences a data breach that gives rise to a “risk of significant harm”, it will be required to: (i) report the incident to the Office of the Privacy Commissioner of Canada; (ii) notify any affected individuals; and (iii) alert any other third parties that are in a position to reduce the risk of harm to affected individuals.
C. Other Jurisdictions: Argentina, Chile, Colombia, Mexico, Panamá and Uruguay

Finally, as explained in the 2018 International Outlook and Review, Argentina forged ahead with an overhaul of the country’s data protection regime by publishing, in 2017, a draft data protection bill that would align the country’s privacy laws with the GDPR’s requirements. More recently, the Argentinian data protection authority announced, on 20 September 2018, that the President of the Argentine Republic, Mauricio Macri, had sent a draft data protection bill to the National Congress of Argentina for consideration, seeking to reform the current law on the protection of personal data. The message attached to the bill indicates that its objective is to modernize the law in light of new technologies. The message also refers to the GDPR, and the bill includes provisions on data breach notification, privacy by design and by default, processing of data by third parties, DPIAs and the appointment of a DPO. In Chile, on 31 August 2018, the Superintendence of Banks and Financial Institutions announced that it had issued a series of modifications to Chapters 20-8 and 1-13 of the Updated Compilation of Standards relating to cybersecurity, including updates to the rules on the reporting of operational incidents. In particular, the modifications to Chapter 20-8 seek to improve the system for reporting security incidents by creating a digital platform, requiring (as of 1 October 2018) incidents to be reported within 30 minutes of their occurrence, and requiring entities to include specific information when reporting an incident.
In addition, a number of obligations were introduced, namely requirements to appoint a person at the executive level to communicate with the Superintendence of Banks and Financial Institutions (known as “SBIF”, its acronym in Spanish); to inform users and clients of incidents that affect the quality and continuity of services or the security of their personal data, or that are public knowledge; and to maintain a cybersecurity incident alert system that facilitates data sharing on incidents so that other entities can adopt any necessary measures. In relation to Chapter 1-13, the modifications establish cybersecurity as a special criterion in the SBIF’s evaluation of a bank’s management, and require reporting on cybersecurity management at least once a year. The SBIF will also evaluate whether an entity maintains a cybersecurity incident database. Moreover, on 25 October 2018, the Chilean Transparency Council announced that the President of Chile had assigned urgent status to the draft data protection bill currently being considered by the National Congress of Chile. In Colombia, the Financial Superintendence of Colombia issued, on 5 June 2018, two circulars introducing requirements on cybersecurity risk management for covered entities, as well as security standards applicable to online payment platforms, in order to enhance the protection of consumers’ personal financial information. In particular, the requirements include notifying consumers of cybersecurity incidents that affect the confidentiality or integrity of their information, as well as of the measures adopted in response to such incidents. Under these circulars, entities will also be required to establish a unit in charge of cybersecurity risk management and a strategy for reporting to supervisory authorities.
In relation to online payment platforms, the security standards introduced are expected to enable the platforms, which are not regulated by the Financial Superintendence of Colombia, to offer their services to financial entities, such as banks and payment networks, under the supervision of this authority. Additionally, a legislative proposal seeking to modify and supplement Statutory Law No. 1266 of 2008, concerning habeas data and financial information, was presented to the Senate of the Republic of Colombia on 26 July 2018. In Mexico, the National Institute of Access to Information and Data Protection was particularly active in 2018, issuing guidance papers on several data protection topics. In March, the National Institute issued recommendations on the processing of the Mexican voting card (a widely used ID) by companies and public entities subject to the Federal Law on the Protection of Personal Data Held by Private Parties 2010 and the General Law on the Protection of Personal Data Held by Public Entities 2017. In May, the National Institute issued guidance on biometric data, providing recommendations on how to process biometric data in compliance with the principles and obligations under those two laws, and clarifying when biometric data should be considered personal data. In June, the National Institute issued guidance on how to manage data security incidents, in order to assist companies, organizations and public entities in complying with the data protection law applicable to them. In August, the National Institute issued guidance for the implementation of a “Data Protection Program” by entities subject to the General Law on the Protection of Personal Data Held by Public Entities 2017.
Finally, in November, the National Institute issued guidance outlining the minimum criteria suggested for contracting cloud computing services that involve the processing of personal data. The guide covers provider reputation and identity; the minimum criteria the customer should consider to ensure that the provider has implemented security measures and has conducted a risk assessment for personal data; the provider’s return and destruction of personal data at the end of the service; and the provider’s conditions and practices regarding interoperability and portability. The guidance also includes checklists to help companies and individuals subject to the Federal Law on the Protection of Personal Data Held by Private Parties 2010 ensure compliance and analyze the risks they assume when purchasing cloud computing products and services. Moreover, on 26 June 2018, Mexico acceded to the Convention for the Protection of Individuals with Regard to Automatic Processing of Personal Data (known as “Convention 108”) and its additional protocol. In Uruguay, a bill on accountability and budget, containing provisions relating to data protection, is currently being analyzed by the Parliament of Uruguay. Additionally, on 29 October 2018, the data protection authority issued data protection guides on cookies, profiling, bring your own device and drones, providing recommendations on their use in order to raise awareness of the data protection issues that may arise from these technologies.

See Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC, OJ L 119, 4.5.2016, p. 1.
See Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, OJ L 281, 23.11.1995, pp. 31-50.  See GDPR, at Article 3.  See EDPB, Guidelines 3/2018 on the territorial scope of the GDPR (Article 3) – Version for public consultation (16 November 2018), available at https://edpb.europa.eu/sites/edpb/files/files/file1/edpb_guidelines_3_2018_territorial_scope_en.pdf.  See WP29, Guidelines on Transparency under Regulation 2016/679 (WP260 rev.01, 11 April 2018), available at https://ec.europa.eu/newsroom/article29/document.cfm?action=display&doc_id=51025.  See WP29, Guidelines on Consent under Regulation 2016/679 (WP259 rev.01; 10 April 2018), available at https://ec.europa.eu/newsroom/article29/document.cfm?action=display&doc_id=51030.  See GDPR, at Article 17.  See EU Data Protection Directive, at Articles 12 and 14; and Case C-131/12 Google Spain SL and Google Inc. v. AEPD and Mario Costeja González ECLI:EU:C:2014:317.  See WP29, Guidelines on Personal Data Breach Notification under Regulation 2016/679 (WP250 rev.01; 6 February 2018), available at https://ec.europa.eu/newsroom/article29/document.cfm?action=display&doc_id=49827.  See WP29, Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679 (WP251 rev.01; 6 February 2018), available at https://ec.europa.eu/newsroom/article29/document.cfm?action=display&doc_id=49826.  See GDPR, at Article 35.  See WP29, Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is “likely to result in a high risk” for the purposes of Regulation 2016/679 (WP248 rev.01; 4 October 2017), available at http://ec.europa.eu/newsroom/document.cfm?doc_id=47711.  See WP29, Guidelines on the right to data portability (WP242 rev.01; 5 April 2017), available at http://ec.europa.eu/newsroom/document.cfm?doc_id=44099.  
See GDPR, at Article 56(2).  See GDPR, at Article 56(1).  See GDPR, at Article 63.  See GDPR, at Article 66.  See WP29, Guidelines for Identifying a Controller or Processor’s Lead Supervisory Authority (WP244 rev.01; 5 April 2017), available at http://ec.europa.eu/newsroom/just/item-detail.cfm?item_id=50083.  See WP29, Guidelines on Data Protection Officers (“DPOs”) (WP243 rev.01; 5 April 2017), available at http://ec.europa.eu/newsroom/document.cfm?doc_id=44100.  See: https://edpb.europa.eu/our-work-tools/general-guidance/gdpr-guidelines-recommendations-best-practices_en.  The Investigation Update “Investigation into the use of data analytics in political campaigns”, 11.07.2018 is available at https://ico.org.uk/media/action-weve-taken/2259371/investigation-into-data-analytics-for-political-purposes-update.pdf.  The notice is available at https://ico.org.uk/media/action-weve-taken/mpns/2260051/r-facebook-mpn-20181024.pdf.  The press release is available at http://news.marriott.com/2019/01/marriott-provides-update-on-starwood-database-security-incident/.  For more information, the press release is available at https://www.cnil.fr/en/cnils-restricted-committee-imposes-financial-penalty-50-million-euros-against-google-llc  For more information, the decision is available at https://www.legifrance.gouv.fr/affichCnil.do?oldAction=rechExpCnil&id=CNILTEXT000038032552&fastReqId=2103387945&fastPos=1.  For more information, the press release is available at https://www.dataprotection.ie/en/news-media/press-releases/data-protection-commission-opens-statutory-inquiry-twitter.  See: https://ec.europa.eu/info/law/law-topic/data-protection/data-transfers-outside-eu/adequacy-protection-personal-data-non-eu-countries_en.  See European Commission, “EU and Japan sign Economic Partnership Agreement” (17 July 2018), available at http://europa.eu/rapid/press-release_IP-18-4526_en.htm.  See: http://europa.eu/rapid/press-release_IP-18-5433_en.htm.  
See EDPB, Opinion 28/2018 regarding the European Commission Draft Implementing Decision on the adequate protection of personal data in Japan (5 December 2018), available at https://edpb.europa.eu/sites/edpb/files/files/file1/2018-12-05-opinion_2018-28_art.70_japan_adequacy_en.pdf.  See IAPP, “South Korea’s EU adequacy decision rests on new legislative proposals” (27 November 2018), available at https://iapp.org/news/a/south-koreas-eu-adequacy-decision-rests-on-new-legislative-proposals/.  See Irish High Court Commercial, The Data Protection Commissioner v. Facebook Ireland Limited and Maximilian Schrems, 2016 No. 4809 P.  See CJEU, Case C-362/14, Maximillian Schrems v. Data Protection Commissioner (6 October 2015).  See CJEU, Case C-293/12, Digital Rights Ireland Ltd. v. Minister for Communications, Marine and Natural Resources et al. (8 April 2014).  See European Parliament, Adequacy of the protection afforded by the EU-US Privacy Shield (5 July 2018), available at http://www.europarl.europa.eu/sides/getDoc.do?type=TA&reference=P8-TA-2018-0315&format=XML&language=EN.  See European Commission, “Joint Press Statement from Commissioner Věra Jourová and Secretary of Commerce Wilbur Ross on the Second Annual EU-U.S. Privacy Shield Review” (19 October 2018), available at http://europa.eu/rapid/press-release_STATEMENT-18-6157_en.htm.  See Directive (EU) 2016/1148 of the European Parliament and of the Council of 6 July 2016 concerning measures for a high common level of security of network and information systems across the Union, OJ L 194, 19.7.2016, pp. 1-30, available at http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=uriserv:OJ.L_.2016.194.01.0001.01.ENG&toc=OJ:L:2016:194:TOC.  E.g., domain name systems (DNS) providers and top level domain (TLD) registries; see Article 4, NIS Directive.  See NIS Directive, at Article 7.  See NIS Directive, at Recital (57) and Article 3.  See NIS Directive, at Article 16(10).  See NIS Directive, at Articles 16(8) and (9).  
See ENISA, “Guidelines on assessing DSP security and OES compliance with the NISD security requirements” (28 November 2018), available at https://www.enisa.europa.eu/publications/guidelines-on-assessing-dsp-security-and-oes-compliance-with-the-nisd-security-requirements.  See ENISA, “Guideline on assessing security measures in the context of Article 3(3) of the Open Internet regulation” (12 December 2018), available at https://www.enisa.europa.eu/publications/guideline-on-assessing-security-measures-in-the-context-of-article-3-3-of-the-open-internet-regulation.  See https://www.enisa.europa.eu/publications/good-practices-for-security-of-iot.  See https://www.enisa.europa.eu/publications/towards-secure-convergence-of-cloud-and-iot.  See ENISA, “Cyber Europe 2018: After Action Report” (December 2018), available at https://www.enisa.europa.eu/publications/cyber-europe-2018-after-action-report/at_download/fullReport.  See https://ec.europa.eu/digital-single-market/en/proposal-eprivacy-regulation.  See http://ec.europa.eu/newsroom/document.cfm?doc_id=44103.  See http://www.europarl.europa.eu/sides/getDoc.do?type=REPORT&reference=A8-2017-0324&language=EN.  See draft ePrivacy Regulation, at Recital (13). See Explanatory Memorandum, at Section 3.2.  See draft ePrivacy Regulation, at Article 8(1).  However, in practice, the WP29 had already expressed the possibility that operators do not obtain consent for the setting and receipt of cookies in some of the circumstances now covered in the draft ePrivacy Regulation, provided that certain conditions are met. See WP29, Opinion 04/2012 on Cookie Consent Exemption (WP 194; 7 June 2012), available at http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2012/wp194_en.pdf.  See draft ePrivacy Regulation, at Articles 18 ff.  
See WP29, Opinion 01/2017 on the Proposed Regulation for the ePrivacy Regulation (2002/58/EC) (WP247; 4 April 2017), available at http://ec.europa.eu/newsroom/just/item-detail.cfm?item_id=50083.  See European Parliament’s proposal, available at http://www.europarl.europa.eu/sides/getDoc.do?type=REPORT&reference=A8-2017-0324&language=EN.  See: https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CONSIL:ST_10975_2018_INIT&from=EN.  See https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CONSIL:ST_13256_2018_INIT&from=EN.  See CJEU, Case C-210/16 Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v. Wirtschaftsakademie Schleswig-Holstein GmbH (5 June 2018).  See Opinion of Advocate General Bobek on Case C-498/16 Maximilian Schrems v. Facebook Ireland Limited.  The Draft FDPA is available in the official languages of Switzerland:
· French: https://www.ejpd.admin.ch/ejpd/fr/home/aktuell/news/2017/2017-09-150.html
· German: https://www.ejpd.admin.ch/ejpd/de/home/aktuell/news/2017/2017-09-150.html
· Italian: https://www.ejpd.admin.ch/ejpd/it/home/aktuell/news/2017/2017-09-150.html
An unofficial English version of the Draft FDPA is also available at https://www.dataprotection.ch/fileadmin/dataprotection.ch/user_upload/redaktion/Docs/Swiss_Data_Protection_Act__draft_of_September_2017__Walder_Wyss_convenience_translation_V010.pdf?v=1507206202.  See Draft FDPA, Article 4(b). Please note that the current FDPA protects information relating to legal entities as personal data.  See Draft FDPA, Articles 5(1) to (5).  See Draft FDPA, Articles 19 and 23 to 28.  See Draft FDPA, Article 20.  See Draft FDPA, Article 6, and GDPR, Article 25.  See Draft FDPA, Article 22.  See Draft FDPA, Article 57.  See FT Cyber Security, “China’s cyber security law rattles multinationals,” Financial Times (30 May 2017), available at https://www.ft.com/content/b302269c-44ff-11e7-8519-9f94ee97d996.  
See Alex Lawson, “US Asks China Not To Implement Cybersecurity Law,” Law360 (27 September 2017) available at https://www.law360.com/articles/968132/us-asks-china-not-to-implement-cybersecurity-law.  See Sophie Yan, “China’s new cybersecurity law takes effect today, and many are confused,” CNBC.com (1 June 2017), available at https://www.cnbc.com/2017/05/31/chinas-new-cybersecurity-law-takes-effect-today.html.  See Christina Larson, Keith Zhai, and Lulu Yilun Chen, “Foreign Firms Fret as China Implements New Cybersecurity Law”, Bloomberg News (24 May 2017), available at https://www.bloomberg.com/news/articles/2017-05-24/foreign-firms-fret-as-china-implements-new-cybersecurity-law.  See Clarice Yue, Michelle Chan, Sven-Michael Werner and John Shi, “China Cybersecurity Law update: Draft Guidelines on Security Assessment for Data Export Revised!,” Lexology (26 September, 2017), available at https://www.lexology.com/library/detail.aspx?g=94d24110-4487-4b28-bfa5-4fa98d78a105.  See http://www.npc.gov.cn/npc/xinwen/2018-09/10/content_2061041.htm (Press Release in Chinese).  See Singapore Personal Data Protection Commission, Proposed Advisory Guidelines on the Personal Data Protection Act For NRIC Numbers, published 7 November 2017, available at https://www.pdpc.gov.sg/docs/default-source/public-consultation-6—nric/proposed-nric-advisory-guidelines—071117.pdf?sfvrsn=4.  See Naïm Alexandre Antaki and Wendy J. Wagner, “No escaping notification: Government releases proposed regulations for federal data breach reporting & notification”, Lexology (6 September 2017), available at https://www.lexology.com/library/detail.aspx?g=0a98fd33-1f2c-4a52-98c0-cf1feeaf0b90; Ministry of Electronics & Information Technology, “White Paper of the Committee of Experts on a Data Protection Framework for India,” Government of India (27 November 2017), available at http://meity.gov.in/white-paper-data-protection-framework-india-public-comments-invited.  
See http://meity.gov.in/writereaddata/files/Personal_Data_Protection_Bill%2C2018_0.pdf  See IAPP, “GDPR matchup: Brazil’s General Data Protection Law” (4 October 2018), available at https://iapp.org/news/a/gdpr-matchup-brazils-general-data-protection-law/.  See Brazilian General Data Protection Law, Article 12.  See Brazilian General Data Protection Law, Article 12.  See Brazilian General Data Protection Law, Article 7.  See Brazilian General Data Protection Law, Article 7.  In Brazil, under local telecommunications regulations, users could request the portability of personal data related to a telephone number (Resolution 460/07 of the Brazilian National Telecommunications Agency, Anatel), available at http://www.anatel.gov.br/legislacao/resolucoes/22-2007/8-resolucao-460.  See Brazilian General Data Protection Law, Article 48.  See Brazilian General Data Protection Law, Article 41.  These amendments were implemented through the Digital Privacy Law of 2015, available at https://www.canlii.org/en/ca/laws/astat/sc-2015-c-32/121166/sc-2015-c-32.html.  See Office of the Australian Information Commissioner, “De-identification Decision-Making Framework”, Australian Government (18 September 2017), available at https://www.oaic.gov.au/agencies-and-organisations/guides/de-identification-decision-making-framework; Lyn Nicholson, “Regulator issues new guidance on de-identification and implications for big data usage”, Lexology (26 September 2017) available at https://www.lexology.com/library/detail.aspx?g=f6c055f4-cc82-462a-9b25-ec7edc947354; “New Regulation on the Deletion, Destruction or Anonymization of Personal Data,” British Chamber of Commerce of Turkey (28 September 2017), available at https://www.bcct.org.tr/news/new-regulation-deletion-destruction-anonymization-personal-data-2/64027; Jena M. 
Valdetero and David Chen, “Big Changes May Be Coming to Argentina’s Data Protection Laws,” Lexology (5 June 2017), available at https://www.lexology.com/library/detail.aspx?g=6a4799ec-2f55-4d51-96bd-3d6d8c04abd2.  See https://www.argentina.gob.ar/noticias/proteccion-de-datos-personales-al-congreso (press release only available in Spanish).  See https://www.sbif.cl/sbifweb/servlet/Noticia?indice=2.1&idContenido=12214 (press release only available in Spanish).  See https://www.consejotransparencia.cl/presidente-del-cplt-asegura-estar-cada-vez-mas-cerca-el-fin-del-abuso-tras-anuncio-de-urgencia-al-proyecto-de-proteccion-de-datos-personales/ (press release only available in Spanish).  See the press release of 5 June 2018, available at https://www.superfinanciera.gov.co/inicio/sala-de-prensa/comunicados-de-prensa-/comunicados-de-prensa–10082460 (press release only available in Spanish).  See http://leyes.senado.gov.co/proyectos/images/documentos/Textos%20Radicados/proyectos%20de%20ley/2018%20-%202019/PL%20053-18%20Habeas%20Data.pdf  The guide is available at http://inicio.inai.org.mx/DocumentosdeInteres/RecomendacionesCredencialV.pdf  The guide is available at http://inicio.ifai.org.mx/DocumentosdeInteres/GuiaDatosBiometricos_Web_Links.pdf  The guide is available at http://inicio.inai.org.mx/DocumentosdeInteres/Recomendaciones_Manejo_IS_DP.pdf  The guide is available at http://inicio.inai.org.mx/DocumentosdeInteres/DocumentoOrientadorPPDP.docx  The guide is available at http://inicio.ifai.org.mx/nuevo/ComputoEnLaNube.pdf  The draft bill is available at https://www.mef.gub.uy/innovaportal/file/24846/1/fundamentacion-del-articulado.pdf.  See https://www.datospersonales.gub.uy/inicio/institucional/noticias/urcdp_lanzo_nuevas_guias_proteccion_datos_personales (press release only available in Spanish). The following Gibson Dunn lawyers assisted in the preparation of this client alert: Ahmed Baladi, Alexander Southwell, Alejandro Guerrero, Clémence Pugnet and Francisca Couto. 
Gibson Dunn’s lawyers are available to assist with any questions you may have regarding these issues. For further information, please contact the Gibson Dunn lawyer with whom you usually work or any of the following leaders and members of the firm’s Privacy, Cybersecurity and Consumer Protection practice group: Europe Ahmed Baladi – Co-Chair, PCCP Practice, Paris (+33 (0)1 56 43 13 00, firstname.lastname@example.org) James A. Cox – London (+44 (0)207071 4250, email@example.com) Patrick Doris – London (+44 (0)20 7071 4276, firstname.lastname@example.org) Penny Madden – London (+44 (0)20 7071 4226, email@example.com) Jean-Philippe Robé – Paris (+33 (0)1 56 43 13 00, firstname.lastname@example.org) Michael Walther – Munich (+49 89 189 33-180, email@example.com) Kai Gesing – Munich (+49 89 189 33-180, firstname.lastname@example.org) Sarah Wazen – London (+44 (0)20 7071 4203, email@example.com) Vera Lukic – Paris (+33 (0)1 56 43 13 00, firstname.lastname@example.org) Alejandro Guerrero – Brussels (+32 2 554 7218, email@example.com) Asia Kelly Austin – Hong Kong (+852 2214 3788, firstname.lastname@example.org) Jai S. Pathak – Singapore (+65 6507 3683, email@example.com) United States Alexander H. Southwell – Co-Chair, PCCP Practice, New York (+1 212-351-3981, firstname.lastname@example.org) M. Sean Royall – Dallas (+1 214-698-3256, email@example.com) Debra Wong Yang – Los Angeles (+1 213-229-7472, firstname.lastname@example.org) Ryan T. Bergsieker – Denver (+1 303-298-5774, email@example.com) Richard H. Cunningham – Denver (+1 303-298-5752, firstname.lastname@example.org) Howard S. Hogan – Washington, D.C. (+1 202-887-3640, email@example.com) Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, firstname.lastname@example.org) Kristin A. Linsley – San Francisco (+1 415-393-8395, email@example.com) Shaalu Mehra – Palo Alto (+1 650-849-5282, firstname.lastname@example.org) Karl G. Nelson – Dallas (+1 214-698-3203, email@example.com) Eric D. 
Vandevelde – Los Angeles (+1 213-229-7186, firstname.lastname@example.org) Benjamin B. Wagner – Palo Alto (+1 650-849-5395, email@example.com) Michael Li-Ming Wong – San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, firstname.lastname@example.org) Questions about SEC disclosure issues concerning data privacy and cybersecurity can also be addressed to the following leaders and members of the Securities Regulation and Corporate Governance Group: James J. Moloney – Orange County, CA (+1 949-451-4343, email@example.com) Elizabeth Ising – Washington, D.C. (+1 202-955-8287, firstname.lastname@example.org) Lori Zyskowski – New York (+1 212-351-2309, email@example.com) © 2019 Gibson, Dunn & Crutcher LLP Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.
Law360 named Gibson Dunn one of its six Cybersecurity & Privacy Practice Groups of the Year [PDF] for 2018. The practice group was noted as “the trusted choice for leading technology companies” who “continues to do some of the most cutting-edge work in the space.” The firm’s Cybersecurity & Privacy practice was profiled on January 23, 2019. Gibson Dunn’s Privacy, Cybersecurity and Consumer Protection Practice Group represents clients across a wide range of industries in matters involving complex and rapidly evolving laws, regulations, and industry best practices relating to privacy, cybersecurity, and consumer protection. Our team includes the largest number of former federal cyber-crimes prosecutors of any law firm.
The Daily Journal named San Francisco partner Kristin Linsley to its 2019 list of the Top Cyber Lawyers in California [PDF]. Linsley has extensive experience in complex business and appellate litigation across a spectrum of subject areas, including technology and privacy, international and transnational law, and complex financial litigation. Her profile was published on January 23, 2019.
Gibson, Dunn & Crutcher LLP is pleased to announce its selection by Law360 as a Law Firm of the Year for 2018. Law360’s profile, “The Firms That Dominated in 2018,” [PDF] features the four firms that received the most Practice Group of the Year awards. Of the four, Gibson Dunn “led the pack with 11 winning practice areas” for “successfully securing wins in bet-the-company matters and closing high-profile, big-ticket deals for clients throughout 2018.” The awards were published on January 13, 2019. Law360 previously noted that Gibson Dunn “dominated the competition this year” for its Practice Groups of the Year, which were selected “with an eye toward landmark matters and general excellence.” Gibson Dunn is proud to have been honored in the following categories:
Appellate [PDF]: Gibson Dunn’s Appellate and Constitutional Law Practice Group is one of the leading U.S. appellate practices, with broad experience in complex litigation at all levels of the state and federal court systems and an exceptionally strong and high-profile presence and record of success before the U.S. Supreme Court.
Class Action [PDF]: Our Class Actions Practice Group has an unrivaled record of success in the defense of high-stakes class action lawsuits across the United States. We have successfully litigated many of the most significant class actions in recent years, amassing an impressive win record in trial and appellate courts, including the U.S. Supreme Court, in cases that have changed the class action landscape nationwide.
Competition [PDF]: Gibson Dunn’s Antitrust and Competition Practice Group serves clients in a broad array of industries globally in every significant area of antitrust and competition law, including private antitrust litigation between large companies and class action treble damages litigation; government review of mergers and acquisitions; and cartel investigations, internationally across borders and jurisdictions.
Cybersecurity & Privacy [PDF]: Our Privacy, Cybersecurity and Consumer Protection Practice Group represents clients across a wide range of industries in matters involving complex and rapidly evolving laws, regulations, and industry best practices relating to privacy, cybersecurity, and consumer protection. Our team includes the largest number of former federal cyber-crimes prosecutors of any law firm.
Employment [PDF]: No firm has a more prominent position at the leading edge of labor and employment law than Gibson Dunn. With a Labor and Employment Practice Group that covers a complete range of matters, we are known for our unsurpassed ability to help the world’s preeminent companies tackle their most challenging labor and employment matters.
Energy [PDF]: Across the firm’s Energy and Infrastructure, Oil and Gas, and Energy, Regulation and Litigation Practice Groups, our global energy practitioners counsel on a complex range of issues and proceedings in the transactional, regulatory, enforcement, investigatory and litigation arenas, serving clients in all energy industry segments.
Environmental [PDF]: Gibson Dunn has represented clients in the environmental and mass tort area for more than 30 years, providing sophisticated counsel on the complete range of litigation matters as well as in connection with transactional concerns such as ongoing regulatory compliance, legislative activities and environmental due diligence.
Real Estate [PDF]: The breadth of sophisticated matters handled by our real estate lawyers worldwide includes acquisitions and sales; joint ventures; financing; land use and development; and construction. Gibson Dunn additionally has one of the leading hotel and hospitality practices globally.
Securities [PDF]: Our securities practice offers comprehensive client services, including the defense and handling of securities class action litigation, derivative litigation, M&A litigation, internal investigations, and investigations and enforcement actions by the SEC, DOJ and state attorneys general.
Sports [PDF]: Gibson Dunn’s global Sports Law Practice represents a wide range of clients in matters relating to professional and amateur sports, including individual teams, sports facilities, athletic associations, athletes, financial institutions, television networks, sponsors and municipalities.
Transportation [PDF]: Gibson Dunn’s experience with transportation-related entities is extensive and includes the automotive sector as well as all aspects of the airline and rail industries, freight, shipping, and maritime. We advise in a broad range of areas that include regulatory and compliance, customs and trade regulation, antitrust, litigation, corporate transactions, tax, real estate, environmental and insurance.
With increasing regularity, the Federal Trade Commission (“FTC”) is seeking and obtaining large monetary remedies as “equitable monetary relief” pursuant to Section 13(b) of the FTC Act. Indeed, FTC settlements and judgments exceeding $100 million, and even $1 billion, are becoming commonplace. The Supreme Court, however, has never held that Section 13(b) of the FTC Act empowers the FTC to obtain monetary relief. Although multiple federal circuit courts have held that Section 13(b) provides the agency with this power, several weeks ago two Ninth Circuit judges issued a concurrence in FTC v. AMG Capital Management, LLC et al. calling for the full Ninth Circuit to reconsider this issue en banc in light of the Supreme Court’s 2017 decision in Kokesh v. SEC. Gibson Dunn partners Sean Royall, Blaine Evanson, and Rich Cunningham, and associate Brandon J. Stoker recently published an article discussing the AMG Capital Management concurrence in the Washington Legal Foundation’s The Legal Pulse blog. The article describes the concurrence and how it fits into the broader legal landscape around this issue, which is clearly poised for further attention from the federal appellate courts, including the Supreme Court. Ninth Circuit Judges Call for En Banc Review of the Federal Trade Commission’s Authority to Obtain Monetary Relief © 2019, Washington Legal Foundation, The Legal Pulse, January 15, 2019. Reprinted with permission. Gibson Dunn’s lawyers are available to assist in addressing any questions you may have regarding these developments. Please contact the authors of this Client Alert, the Gibson Dunn lawyer with whom you usually work, or one of the leaders and members of the firm’s Antitrust and Competition or Privacy, Cybersecurity and Consumer Protection practice groups: Washington, D.C. Scott D. Hammond (+1 202-887-3684, firstname.lastname@example.org) D.
Jarrett Arp (+1 202-955-8678, email@example.com) Adam Di Vincenzo (+1 202-887-3704, firstname.lastname@example.org) Howard S. Hogan (+1 202-887-3640, email@example.com) Joseph Kattan P.C. (+1 202-955-8239, firstname.lastname@example.org) Joshua Lipton (+1 202-955-8226, email@example.com) Cynthia Richman (+1 202-955-8234, firstname.lastname@example.org) Jeremy Robison (+1 202-955-8518, email@example.com) New York Alexander H. Southwell (+1 212-351-3981, firstname.lastname@example.org) Eric J. Stock (+1 212-351-2301, email@example.com) Los Angeles Daniel G. Swanson (+1 213-229-7430, firstname.lastname@example.org) Debra Wong Yang (+1 213-229-7472, email@example.com) Samuel G. Liversidge (+1 213-229-7420, firstname.lastname@example.org) Jay P. Srinivasan (+1 213-229-7296, email@example.com) Rod J. Stone (+1 213-229-7256, firstname.lastname@example.org) Eric D. Vandevelde (+1 213-229-7186, email@example.com) Orange County Blaine H. Evanson (+1 949-451-3805, firstname.lastname@example.org) San Francisco Rachel S. Brass (+1 415-393-8293, email@example.com) Dallas M. Sean Royall (+1 214-698-3256, firstname.lastname@example.org) Olivia Adendorff (+1 214-698-3159, email@example.com) Veronica S. Lewis (+1 214-698-3320, firstname.lastname@example.org) Mike Raiff (+1 214-698-3350, email@example.com) Brian Robison (+1 214-698-3370, firstname.lastname@example.org) Robert C. Walters (+1 214-698-3114, email@example.com) Denver Richard H. Cunningham (+1 303-298-5752, firstname.lastname@example.org) Ryan T. Bergsieker (+1 303-298-5774, email@example.com) © 2019 Gibson, Dunn & Crutcher LLP Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.
On December 28, 2018, a Task Group that includes U.S. Department of Health and Human Services (“HHS”) personnel and private-sector health care industry leaders published new guidance for health care organizations on cybersecurity best practices. The guidance—Health Industry Cybersecurity Practices: Managing Threats and Protecting Patients—is voluntary and creates no legal obligations. It is targeted to health care providers, payors, pharmaceutical companies, and medical device manufacturers. This publication is among the most comprehensive and detailed guidance now available to the health care industry on cybersecurity. While voluntary, the prescriptive advice and scalable tools in the new guidance may be a valuable resource for legal, compliance, IT, and information security professionals at health care organizations. Organizations that follow this guidance may decrease the likelihood that they will suffer a costly data breach, and in the event of a breach may be able to point to compliance with the guidance to show that they have implemented reasonable cybersecurity practices, thereby helping to defend against private lawsuits or government enforcement actions. This alert briefly describes the background and key takeaways from the guidance. Gibson Dunn is available to answer any questions you may have about how this guidance applies to your organization, as well as any other topics related to cybersecurity or privacy in the health care industry.
Background
The health care industry is a primary target for attacks by cyber-criminals. The threat is especially critical because by at least one measure the average cost of a data breach in the health sector is $408 per record, almost double that of the next highest industry.
In recent years, moreover, HHS’s Office for Civil Rights (“OCR”)—the office charged with enforcing the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”)—has demonstrated an increasing willingness to bring enforcement actions against even reputable and respected organizations that have suffered a data breach. The new guidance comes against this backdrop and as the result of the Cybersecurity Act of 2015, which required HHS to issue guidance through a “trusted platform and tighter partnership between the United States government and the private sector.” Under Section 405(d) of the Act, industry and government leaders formed a Task Group in May 2017 to create a set of “voluntary, consensus-based principles and practices to ensure cybersecurity in the Health Care and Public Health (HPH) sector.” This guidance is the result of 18 months of work by the Task Group.
The Guidance
Recognizing that it would be impossible to address every cybersecurity challenge in a single publication, the Task Group focused on five prevalent cybersecurity threats: 1) e-mail phishing attacks, 2) ransomware attacks, 3) loss or theft of equipment or data, 4) insider, accidental or intentional data loss, and 5) attacks against connected medical devices that may affect patient safety. For each of the five high-risk cybersecurity threats, the guidance describes the risk, lists specific vulnerabilities and the potential effects of these vulnerabilities, and offers a list of “practices to consider” to help minimize the threat.
The Task Group then identified a set of voluntary best practices and organized them into ten categories: 1) e-mail protection systems, 2) endpoint protection systems, 3) access management, 4) data protection and loss prevention, 5) asset management, 6) network management, 7) vulnerability management, 8) incident response, 9) medical device security, and 10) cybersecurity policies. Information regarding each of these practice categories is detailed in two supplementary technical volumes—one addressing the needs of small organizations and the other addressing the requirements of medium and large organizations—as well as a supplemental volume of additional resources and templates. The guidance also provides a toolkit for determining and prioritizing the cybersecurity practices that would be most effective, which can be used to assist organizations in conducting a cybersecurity risk assessment. The specific practices identified in the guidance are not intended to replace existing regulatory requirements or frameworks (such as the HIPAA Security Rule or the NIST Cybersecurity Framework). Instead, they are intended to be a supplemental resource for health care organizations, with the goal of “rais[ing] the cybersecurity floor across the health care industry.” Specific application and resource allocation will be up to each organization, and the guidance recognizes that each organization will need to tailor cybersecurity practices to its specific size, complexity, and type. The guidance provides a chart to assist in determining these categorizations. Importantly, the guidance does not authorize any causes of action or grounds for regulatory enforcement.
Conclusion
Because of the long shadow of HIPAA, the health care industry has long been among the most heavily regulated industries when it comes to cybersecurity practices. This new guidance offers an additional tool that health care organizations can use to gauge the adequacy of their systems and their preparedness for a cyber attack.
Given that HHS OCR is simultaneously seeking comments on how it might update HIPAA’s requirements, and the explosion of enforcement activity and lawsuits related to cybersecurity and privacy more generally, health care organizations would be well-served to evaluate this guidance and refine or enhance their plans to address cybersecurity issues that regulators and plaintiffs are likely to examine increasingly in the years to come.
Healthcare & Public Health Sector Coordinating Councils, Health Industry Cybersecurity Practices: Managing Threats and Protecting Patients (Dec. 28, 2018), https://www.phe.gov/Preparedness/planning/405d/Documents/HICP-Main-508.pdf.
Id. at 9.
See, e.g., Press Release, Department of Health and Human Services, Anthem Pays OCR $16 Million in Record HIPAA Settlement Following Largest U.S. Health Data Breach in History (Oct. 15, 2018), https://www.hhs.gov/about/news/2018/10/15/anthem-pays-ocr-16-million-record-hipaa-settlement-following-largest-health-data-breach-history.html; Press Release, Department of Health and Human Services, Five breaches add up to millions in settlement costs for entity that failed to heed HIPAA’s risk analysis and risk management rules (Feb. 1, 2018), available at https://www.hhs.gov/about/news/2018/02/01/five-breaches-add-millions-settlement-costs-entity-failed-heed-hipaa-s-risk-analysis-and-risk.html.
Health Industry Cybersecurity Practices: Managing Threats and Protecting Patients, at 4.
Id.
Id. at 6.
Id. at 26.
Id.
Id. at 11.
See Request for Information on Modifying HIPAA Rules to Improve Coordinated Care, 83 Fed. Reg. 64,302 (Dec. 14, 2018).
Gibson Dunn’s lawyers are available to assist in addressing any questions you may have regarding the above developments. Please contact the Gibson Dunn lawyer with whom you usually work, or the following authors: Ryan T. Bergsieker – Denver (+1 303-298-5774, firstname.lastname@example.org) Reid Rector – Denver (+1 303-298-5923, email@example.com) Josiah J.
Clarke – Denver (+1 303-298-5708, firstname.lastname@example.org) Please also feel free to contact the following practice group leaders: Alexander H. Southwell – Chair, Privacy, Cybersecurity and Consumer Protection Practice, New York (+1 212-351-3981, email@example.com) Daniel J. Thomasch – Co-Chair, Life Sciences Practice, New York (+1 212-351-3800, firstname.lastname@example.org) Tracey B. Davies – Co-Chair, Life Sciences Practice, Dallas (+1 214-698-3335, email@example.com) Ryan A. Murr – Co-Chair, Life Sciences Practice, San Francisco (+1 415-393-8373, firstname.lastname@example.org) Stephen C. Payne – Chair, FDA and Health Care Practice, Washington, D.C. (+1 202-887-3693, email@example.com) © 2019 Gibson, Dunn & Crutcher LLP Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.
Orange County partner Joshua Jessen is the author of “How Calif. Privacy Act Could Prompt Private Plaintiff Suits,” [PDF] published by Law360 on January 11, 2019.
Law360 named eight Gibson Dunn partners among its 2018 MVPs and noted that the firm had the most MVPs of any law firm this year. Law360 MVPs feature lawyers who have “distinguished themselves from their peers by securing hard-earned successes in high-stakes litigation, complex global matters and record-breaking deals.” Gibson Dunn’s MVPs are:
Christopher Chorba, a Class Action MVP [PDF] – Co-Chair of the firm’s Class Actions Group and a partner in our Los Angeles office, he defends class actions and handles a broad range of complex commercial litigation with an emphasis on claims involving California’s Unfair Competition and False Advertising Laws, the Consumers Legal Remedies Act, the Lanham Act, and the Class Action Fairness Act of 2005. His litigation and counseling experience includes work for companies in the automotive, consumer products, entertainment, financial services, food and beverage, social media, technology, telecommunications, insurance, health care, retail, and utility industries.
Michael P. Darden, an Energy MVP [PDF] – Partner in charge of the Houston office, Mike focuses his practice on international and U.S. oil & gas ventures and infrastructure projects (including LNG, deep-water and unconventional resource development projects), asset acquisitions and divestitures, and energy-based financings (including project financings, reserve-based loans and production payments).
Thomas H. Dupree Jr., an MVP in Transportation [PDF] – Co-partner in charge of the Washington, D.C. office, Tom has represented clients in a wide variety of trial and appellate matters, including cases involving punitive damages, class actions, product liability, arbitration, intellectual property, employment, and constitutional challenges to federal and state statutes. He has argued more than 80 appeals in the federal courts, including in all 13 circuits as well as the United States Supreme Court.
Joanne Franzel, a Real Estate MVP [PDF] – Joanne is a partner in the New York office, and her practice has included all forms of real estate transactions, including acquisitions, dispositions and financing, as well as office and retail leasing with anchor and shopping center tenants. She also has represented a number of clients in New York City real estate development, representing developers as well as users in various mixed-use projects, often with a significant public/private component.
Matthew McGill, an MVP in the Sports category [PDF] – A partner in the Washington, D.C. office, Matt practices appellate and constitutional law. He has participated in 21 cases before the Supreme Court of the United States, prevailing in 16. Spanning a wide range of substantive areas, those representations have included several high-profile triumphs over foreign and domestic sovereigns. Outside the Supreme Court, his practice focuses on cases involving novel and complex questions of federal law, often in high-profile litigation against governmental entities.
Mark A. Perry, an MVP in the Securities category [PDF] – Mark is a partner in the Washington, D.C. office and is Co-Chair of the firm’s Appellate and Constitutional Law Group. His practice focuses on complex commercial litigation at both the trial and appellate levels. He is an accomplished appellate lawyer who has briefed and argued many cases in the Supreme Court of the United States. He has served as chief appellate counsel to Fortune 100 companies in significant securities, intellectual property, and employment cases. He also appears frequently in federal district courts, serving both as lead counsel and as legal strategist in complex commercial cases.
Eugene Scalia, an Appellate MVP [PDF] – A partner in the Washington, D.C. office and Co-Chair of the Administrative Law and Regulatory Practice Group, Gene has a national practice handling a broad range of labor, employment, appellate, and regulatory matters.
His success bringing legal challenges to federal agency actions has been widely reported in the legal and business press.
Michael Li-Ming Wong, an MVP in Cybersecurity and Privacy – Michael is a partner in the San Francisco and Palo Alto offices. He focuses on white-collar criminal matters, complex civil litigation, data-privacy investigations and litigation, and internal investigations. Michael has tried more than 20 civil and criminal jury trials in federal and state courts, including five multi-week jury trials over the past five years.
The Federal Trade Commission (“FTC”) is increasingly focusing on the advertising, data privacy/security, and e-commerce processes of prominent companies marketing legitimate, valuable products and services, as compared to the types of fraudsters and shams that have been a central focus of FTC attention in the past. The FTC’s recently concluded action against DirecTV is emblematic of this trend. In FTC v. DirecTV, the FTC alleged that DirecTV’s marketing failed to adequately disclose that (a) the introductory discounted price lasted only twelve months while subscribers were bound to a 24-month commitment; (b) subscribers who cancelled early would be charged a cancellation fee; and (c) subscribers would automatically incur monthly charges if they did not cancel a premium channel package after a free three-month promotional period. On August 16, 2017, after hearing the FTC’s case-in-chief, Judge Gilliam of the U.S. District Court for the Northern District of California granted judgment for DirecTV on the majority of these claims. And earlier this week, the FTC agreed to voluntarily dismiss the remainder of its case with prejudice. Gibson Dunn partners Sean Royall and Rich Cunningham and associates Brett Rosenthal and Emily Riff recently published an article titled Lessons from FTC’s Loss in, and Subsequent Abandonment of, DirecTV Advertising Case in the Washington Legal Foundation’s The Legal Pulse blog. The article describes the case, the FTC’s evidence, and key takeaways for companies crafting advertising and marketing disclosures. Lessons from FTC’s Loss in, and Subsequent Abandonment of, DirecTV Advertising Case © 2018, Washington Legal Foundation, The Legal Pulse, October 23, 2018. Reprinted with permission. Gibson Dunn’s lawyers are available to assist in addressing any questions you may have regarding these developments. 
Please contact the authors of this Client Alert, the Gibson Dunn lawyer with whom you usually work, or one of the leaders and members of the firm’s Antitrust and Competition or Privacy, Cybersecurity and Consumer Protection practice groups: Washington, D.C. Scott D. Hammond (+1 202-887-3684, firstname.lastname@example.org) D. Jarrett Arp (+1 202-955-8678, email@example.com) Adam Di Vincenzo (+1 202-887-3704, firstname.lastname@example.org) Howard S. Hogan (+1 202-887-3640, email@example.com) Joseph Kattan P.C. (+1 202-955-8239, firstname.lastname@example.org) Joshua Lipton (+1 202-955-8226, email@example.com) Cynthia Richman (+1 202-955-8234, firstname.lastname@example.org) New York Alexander H. Southwell (+1 212-351-3981, email@example.com) Eric J. Stock (+1 212-351-2301, firstname.lastname@example.org) Los Angeles Daniel G. Swanson (+1 213-229-7430, email@example.com) Debra Wong Yang (+1 213-229-7472, firstname.lastname@example.org) Samuel G. Liversidge (+1 213-229-7420, email@example.com) Jay P. Srinivasan (+1 213-229-7296, firstname.lastname@example.org) Rod J. Stone (+1 213-229-7256, email@example.com) Eric D. Vandevelde (+1 213-229-7186, firstname.lastname@example.org) San Francisco Rachel S. Brass (+1 415-393-8293, email@example.com) Dallas M. Sean Royall (+1 214-698-3256, firstname.lastname@example.org) Veronica S. Lewis (+1 214-698-3320, email@example.com) Brian Robison (+1 214-698-3370, firstname.lastname@example.org) Robert C. Walters (+1 214-698-3114, email@example.com) Denver Richard H. Cunningham (+1 303-298-5752, firstname.lastname@example.org) Ryan T. Bergsieker (+1 303-298-5774, email@example.com) © 2018 Gibson, Dunn & Crutcher LLP Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.
On October 16, 2018, the Securities and Exchange Commission issued a report warning public companies about the importance of internal controls to prevent cyber fraud. The report described the SEC Division of Enforcement’s investigation of multiple public companies that had collectively lost nearly $100 million in a range of cyber-scams typically involving phony emails requesting payments to vendors or corporate executives. Although these types of cyber-crimes are common, the Enforcement Division notably investigated whether the failure of the companies’ internal accounting controls to prevent unauthorized payments violated the federal securities laws. The SEC ultimately declined to pursue enforcement actions, but nonetheless issued a report cautioning public companies about the importance of devising and maintaining a system of internal accounting controls sufficient to protect company assets. While the SEC has previously addressed the need for public companies to promptly disclose cybersecurity incidents, the new report sees the agency wading into corporate controls designed to mitigate such risks. The report encourages companies to calibrate existing internal controls, and related personnel training, to ensure they are responsive to emerging cyber threats. The report (issued to coincide with National Cybersecurity Awareness Month) clearly intends to warn public companies that future investigations may result in enforcement action.
The Report of Investigation
Section 21(a) of the Securities Exchange Act of 1934 empowers the SEC to issue a public Report of Investigation where deemed appropriate. While SEC investigations are confidential unless and until the SEC files an enforcement action alleging that an individual or entity has violated the federal securities laws, Section 21(a) reports provide a vehicle to publicize investigative findings even where no enforcement action is pursued.
Such reports are used sparingly, perhaps every few years, typically to address emerging issues where the interpretation of the federal securities laws may be uncertain. (For instance, recent Section 21(a) reports have addressed the treatment of digital tokens as securities and the use of social media to disseminate material corporate information.) The October 16 report details the Enforcement Division’s investigations into the internal accounting controls of nine issuers, across multiple industries, that were victims of cyber-scams. The Division identified two specific types of cyber-fraud – typically referred to as business email compromises or “BECs” – that had been perpetrated. The first involved emails from persons claiming to be unaffiliated corporate executives, typically sent to finance personnel directing them to wire large sums of money to a foreign bank account for time-sensitive deals. These were often unsophisticated operations, textbook fakes that included urgent, secret requests, unusual foreign transactions, and spelling and grammatical errors. The second type of business email compromise was harder to detect. Perpetrators hacked real vendors’ accounts and sent invoices and requests for payments that appeared to be for otherwise legitimate transactions. As a result, issuers made payments on outstanding invoices to foreign accounts controlled by impersonators rather than their real vendors, often learning of the scam only when the legitimate vendor inquired into delinquent bills. According to the SEC, both types of frauds often succeeded, at least in part, because responsible personnel failed to understand their company’s existing cybersecurity controls or to appropriately question the veracity of the emails.
The SEC explained that the frauds themselves were not sophisticated in design or in their use of technology; rather, they relied on “weaknesses in policies and procedures and human vulnerabilities that rendered the control environment ineffective.”
SEC Cyber-Fraud Guidance
Cybersecurity has been a high priority for the SEC dating back several years. The SEC has pursued a number of enforcement actions against registered securities firms arising out of data breaches or deficient controls. For example, just last month the SEC brought a settled action against a broker-dealer/investment-adviser that suffered a cyber-intrusion that had allegedly compromised the personal information of thousands of customers. The SEC alleged that the firm had failed to comply with securities regulations governing the safeguarding of customer information, including the Identity Theft Red Flags Rule. The SEC has been less aggressive in pursuing cybersecurity-related actions against public companies. However, earlier this year, the SEC brought its first enforcement action against a public company for alleged delays in its disclosure of a large-scale data breach. But such enforcement actions put the SEC in the difficult position of weighing charges against companies that are themselves victims of a crime. The SEC has thus tried to be measured in its approach to such actions, turning to speeches and public guidance rather than a large number of enforcement actions. (Indeed, the SEC has had to make the embarrassing disclosure that its own EDGAR online filing system had been hacked and sensitive information compromised.) Hence, in February 2018, the SEC issued interpretive guidance for public companies regarding the disclosure of cybersecurity risks and incidents. Among other things, the guidance counseled the timely public disclosure of material data breaches, recognizing that such disclosures need not compromise the company’s cybersecurity efforts.
The guidance further discussed the need to maintain effective disclosure controls and procedures. However, the February guidance did not address specific controls to prevent cyber incidents in the first place. The new Report of Investigation takes the additional step of addressing not just corporate disclosures of cyber incidents, but also the procedures companies are expected to maintain in order to prevent these breaches from occurring. The SEC noted that the internal controls provisions of the federal securities laws are not new, and based its report largely on the controls set forth in Section 13(b)(2)(B) of the Exchange Act. But the SEC emphasized that such controls must be "attuned to this kind of cyber-related fraud, as well as the critical role training plays in implementing controls that serve their purpose and protect assets in compliance with the federal securities laws." The report noted that the issuers under investigation had procedures in place to authorize and process payment requests, yet were still victimized, at least in part, "because the responsible personnel did not sufficiently understand the company's existing controls or did not recognize indications in the emailed instructions that those communications lacked reliability." The SEC concluded that public companies' "internal accounting controls may need to be reassessed in light of emerging risks, including risks arising from cyber-related frauds," and that companies "must calibrate their internal accounting controls to the current risk environment." Unfortunately, the vagueness of such guidance leaves the burden on companies to determine how best to address emerging risks. Whether a company's controls are adequate may be judged in hindsight by the Enforcement Division; not surprisingly, companies and individuals under investigation often find the staff asserting that, if the controls did not prevent the misconduct, they were by definition inadequate.
Here, the SEC took a cautious approach in issuing a Section 21(a) report highlighting the risk rather than publicly identifying and penalizing the companies that had already been victimized by these scams. However, companies and their advisors should assume that, with this warning shot across the bow, the next investigation of a similar incident may result in more serious action. Persons responsible for designing and maintaining a company's internal controls should consider whether improvements (such as enhanced training) are warranted; having now spoken on the issue, the Enforcement Division is likely to view corporate inaction as a factor in how it assesses the company's liability for future data breaches and cyber-frauds.

SEC Press Release (Oct. 16, 2018), available at www.sec.gov/news/press-release/2018-236; the underlying report may be found at www.sec.gov/litigation/investreport/34-84429.pdf.
SEC Press Release (Sept. 16, 2018), available at www.sec.gov/news/press-release/2018-213. This enforcement action was particularly notable as the first occasion on which the SEC relied upon the rules requiring financial advisory firms to maintain a robust program for preventing identity theft, thus emphasizing the significance of those rules.
SEC Press Release (Apr. 24, 2018), available at www.sec.gov/news/press-release/2018-71.
SEC Press Release (Oct. 2, 2017), available at www.sec.gov/news/press-release/2017-186.
SEC Press Release (Feb. 21, 2018), available at www.sec.gov/news/press-release/2018-22; the guidance itself can be found at www.sec.gov/rules/interp/2018/33-10459.pdf. The SEC provided in-depth guidance in this release on disclosure processes and considerations related to cybersecurity risks and incidents, which complements some of the points highlighted in the Section 21(a) report.

Gibson Dunn's lawyers are available to assist with any questions you may have regarding these issues.
For further information, please contact the Gibson Dunn lawyer with whom you usually work in the firm’s Securities Enforcement or Privacy, Cybersecurity and Consumer Protection practice groups, or the following authors: Marc J. Fagel – San Francisco (+1 415-393-8332, firstname.lastname@example.org) Alexander H. Southwell – New York (+1 212-351-3981, email@example.com) Please also feel free to contact the following practice leaders and members: Securities Enforcement Group: New York Barry R. Goldsmith – Co-Chair (+1 212-351-2440, firstname.lastname@example.org) Mark K. Schonfeld – Co-Chair (+1 212-351-2433, email@example.com) Reed Brodsky (+1 212-351-5334, firstname.lastname@example.org) Joel M. Cohen (+1 212-351-2664, email@example.com) Lee G. Dunst (+1 212-351-3824, firstname.lastname@example.org) Laura Kathryn O’Boyle (+1 212-351-2304, email@example.com) Alexander H. Southwell (+1 212-351-3981, firstname.lastname@example.org) Avi Weitzman (+1 212-351-2465, email@example.com) Lawrence J. Zweifach (+1 212-351-2625, firstname.lastname@example.org) Washington, D.C. Richard W. Grime – Co-Chair (+1 202-955-8219, email@example.com) Stephanie L. Brooker (+1 202-887-3502, firstname.lastname@example.org) Daniel P. Chung (+1 202-887-3729, email@example.com) Stuart F. Delery (+1 202-887-3650, firstname.lastname@example.org) Patrick F. Stokes (+1 202-955-8504, email@example.com) F. Joseph Warin (+1 202-887-3609, firstname.lastname@example.org) San Francisco Marc J. Fagel – Co-Chair (+1 415-393-8332, email@example.com) Winston Y. Chan (+1 415-393-8362, firstname.lastname@example.org) Thad A. Davis (+1 415-393-8251, email@example.com) Charles J. Stevens (+1 415-393-8391, firstname.lastname@example.org) Michael Li-Ming Wong (+1 415-393-8234, email@example.com) Palo Alto Paul J. Collins (+1 650-849-5309, firstname.lastname@example.org) Benjamin B. Wagner (+1 650-849-5395, email@example.com) Denver Robert C. Blume (+1 303-298-5758, firstname.lastname@example.org) Monica K. 
Loseman (+1 303-298-5784, email@example.com) Los Angeles Michael M. Farhang (+1 213-229-7005, firstname.lastname@example.org) Douglas M. Fuchs (+1 213-229-7605, email@example.com) Privacy, Cybersecurity and Consumer Protection Group: Alexander H. Southwell – Co-Chair, New York (+1 212-351-3981, firstname.lastname@example.org) M. Sean Royall – Dallas (+1 214-698-3256, email@example.com) Debra Wong Yang – Los Angeles (+1 213-229-7472, firstname.lastname@example.org) Christopher Chorba – Los Angeles (+1 213-229-7396, email@example.com) Richard H. Cunningham – Denver (+1 303-298-5752, firstname.lastname@example.org) Howard S. Hogan – Washington, D.C. (+1 202-887-3640, email@example.com) Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, firstname.lastname@example.org) Kristin A. Linsley – San Francisco (+1 415-393-8395, email@example.com) H. Mark Lyon – Palo Alto (+1 650-849-5307, firstname.lastname@example.org) Shaalu Mehra – Palo Alto (+1 650-849-5282, email@example.com) Karl G. Nelson – Dallas (+1 214-698-3203, firstname.lastname@example.org) Eric D. Vandevelde – Los Angeles (+1 213-229-7186, email@example.com) Benjamin B. Wagner – Palo Alto (+1 650-849-5395, firstname.lastname@example.org) Michael Li-Ming Wong – San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, email@example.com) Ryan T. Bergsieker – Denver (+1 303-298-5774, firstname.lastname@example.org) © 2018 Gibson, Dunn & Crutcher LLP Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.
California continues to lead the United States in focusing attention on the privacy and security of user data and devices. Last week, Governor Jerry Brown signed into law two identical bills requiring manufacturers to include "reasonable security feature[s]" on all devices that are "capable of connecting to the Internet" (commonly known as the Internet of Things). The law is described as the first of its kind in the United States, and comes just three months after passage of the California Consumer Privacy Act of 2018 ("CCPA"); both laws are set to take effect January 1, 2020. Collectively, these laws represent a dramatic expansion of data privacy law that will impact the products and processes of many companies. Also last week, Governor Brown signed into law Senate Bill 1121, which amends the CCPA primarily with respect to enforcement and clarifies exemptions relating to medical information.

Security of Connected Devices

The new law is aimed at protecting "connected devices" from unauthorized access, and requires "reasonable security feature[s]" proportional to the device's "nature and function" and the "information it may collect, contain, or transmit." There are various notable exclusions, particularly where the devices are covered by certain other laws, or where a company merely purchases devices for resale (or for branding and resale) in California. Nonetheless, the law is unique in that it may require security for Internet-connected products regardless of the type of information or data at issue—a contrast to the CCPA and other data privacy and security laws.

Who Must Comply with the Law?

Anyone "who manufactures, or contracts with another person to manufacture on the person's behalf, connected devices that are sold or offered for sale in California" is subject to the statute.
However, the law includes an explicit carve-out: "contract[ing] with another person to manufacture on the person's behalf" does not include a "contract only to purchase a connected device, or only to purchase and brand a connected device." Thus, if a company is merely purchasing whole units and reselling, or even branding and reselling—effectively without the ability to indicate specifications for the device—it will likely not be subject to the new law.

What's Required?

The law applies to manufacturers of "connected devices." A "connected device" is defined as one "capable of connecting to the Internet . . . and . . . assigned an Internet Protocol address or Bluetooth address." The number of products falling into this category is increasing at a remarkable rate, and the products span a multitude of applications, from consumer products (such as smart home features, including automatic lights or thermostats controlled remotely) to commercial use cases (such as electronic toll systems and "smart agriculture"). The law requires that such manufacturers "equip the device with a reasonable security feature or features" that is:
- Appropriate to the nature and function of the device;
- Appropriate to the information it may collect, contain, or transmit; and
- Designed to protect the device and its information from unauthorized access, destruction, use, modification, or disclosure.
The law does not specify what is "reasonable," and relies upon the manufacturer to determine what is appropriate to the device. As a result, "reasonable" will likely be further refined through enforcement actions (described below). However, the law does provide that a device will satisfy the provisions if it is "equipped with a means for authentication outside a local area network," and (1) each device is preprogrammed with a unique password, or (2) the user must create a "new means of authentication" (such as a password) before the device may be used.

What's Not Covered?
Notably, the law excludes certain devices or manufacturers, particularly where they are covered by other existing laws, and makes clear what the law does not do. For example, the law does not apply to:
- Any unaffiliated third-party software or applications the user adds to the device;
- Any provider of an electronic store, gateway, marketplace, or other means of purchasing or downloading software or applications;
- Devices subject to security requirements under federal law (e.g., FDA); and
- "Manufacturers" subject to HIPAA or the Confidentiality of Medical Information Act—at least "with respect to any activity regulated by those acts."

How Will It Be Enforced?

The law expressly does not provide for a private right of action, and it may only be enforced by the "Attorney General, a city attorney, a county counsel, or a district attorney." It further does not set forth any criminal penalty, include a maximum civil fine, or specify any other authorized relief. Nonetheless, the authorization of the enumerated entities to enforce the law presumably includes the authority to seek civil fines, as they can under other consumer protection statutes (for example, Section 17206 of the California Business & Professions Code).

What Can You Do?

If your company sells, or intends to sell, a product in California that connects to the Internet, consider:
- Whether the company is a "manufacturer";
- The security features of the device, if any;
- What security features might be reasonable given the nature and function of the device and the nature of the data collected or used;
- Possibilities for alternative or additional security measures for the specific device; and
- The engineering resources and timeline required to implement additional features.
Many connected devices on the market today already have authentication and security features, but even those that do may benefit from an evaluation of their sufficiency in preparation for this new law.
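For manufacturers inventorying their product lines against the statute's authentication safe harbor (for a device equipped with a means for authentication outside a local area network: either each unit ships with a unique preprogrammed password, or the user must create new credentials before first use), the check reduces to a simple decision rule. The sketch below is illustrative only; the `DeviceConfig` record and its field names are hypothetical, not statutory terms:

```python
from dataclasses import dataclass

@dataclass
class DeviceConfig:
    """Illustrative provisioning record for a connected device.

    Assumes the device is equipped with a means for authentication
    outside a local area network, the precondition for the safe harbor.
    """
    preprogrammed_password: str   # password set at the factory ("" if none)
    password_shared_across_units: bool  # True if every unit ships with the same default
    forces_setup_before_use: bool  # user must create new credentials on first use

def satisfies_safe_harbor(cfg: DeviceConfig) -> bool:
    """Apply the two statutory alternatives: a unique preprogrammed
    password per unit, or forced creation of a new means of authentication
    before the device may be used."""
    unique_factory_password = (
        bool(cfg.preprogrammed_password)
        and not cfg.password_shared_across_units
    )
    return unique_factory_password or cfg.forces_setup_before_use
```

Under this rule, a device shipping with a shared "admin" default and no forced setup fails, while one that forces the user to set a password on first boot passes; note that satisfying the safe harbor addresses only authentication, not the broader "reasonable security feature" obligation.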
Because the law may require actual product changes, rather than merely policy changes, addressing these issues early is important. Consultation with legal and information security professionals may be helpful.

Amendments to CCPA Signed by Governor Brown on September 23, 2018

As anticipated, the California Legislature has begun to pass amendments to the CCPA, though the current changes are relatively modest. The latest amendments, signed by Governor Brown on September 23, 2018, include:
- Extending the deadline for the California Attorney General ("AG") to develop and publish rules implementing the CCPA until July 1, 2020;
- Prohibiting the AG from enforcing the Act until either July 1, 2020, or six months after the publication of the regulations, whichever comes first;
- Limiting the civil penalties that the AG can impose to $2,500 for each violation of the CCPA or up to $7,500 for each intentional violation;
- Removing the requirement for a consumer to notify the AG within 30 days of filing a civil action in the event of a data breach and to then wait six months to see if the AG elects to pursue the case;
- Clarifying that consumers only have a right of action related to a business' alleged failure to "implement and maintain reasonable security procedures and practices" that results in a breach, and not for any other violations of the Act;
- Updating the definition of "personal information" to stress that certain identifiers (e.g., IP address, geolocation information, and web browsing history) only constitute personal information if the data can be "reasonably linked, directly or indirectly, with a particular consumer or household"; and
- Explicitly exempting entities covered by HIPAA, GLBA, and the DPPA, as well as California's Confidentiality of Medical Information Act and its Financial Information Privacy Act.
The foregoing amendments may not have been of major significance—they were passed on the last day of the most recent legislative session.
The California Legislature is expected to consider more substantive changes to the law when it reconvenes for the 2019–2020 session in January 2019, including addressing additional concerns regarding enforcement mechanisms, the law's broad scope, and the sweeping disclosure obligations. Companies that may be impacted by the CCPA should continue to monitor legislative and regulatory developments relating to the CCPA, and should begin planning for the implementation of this broad statute.

Assembly Bill 1906 and Senate Bill 327 contain identical language.
The California Consumer Privacy Act was the subject of a detailed analysis in a client alert issued by Gibson Dunn on July 12, 2018. That publication is available here.
The law will be enacted as California Civil Code Sections 1798.91.04 to 1798.91.06.
Cal. Civil Code § 1798.91.04(a)(1) and (a)(2).
Cal. Civil Code § 1798.91.05(c) and § 1798.91.06.
Cal. Civil Code § 1798.91.05(c).
Cal. Civil Code § 1798.91.05(c).
Cal. Civil Code § 1798.91.05(b).
Cal. Civil Code § 1798.91.04(a)(1), (a)(2), and (a)(3).
Cal. Civil Code § 1798.91.04(b) (emphasis added).
Authentication is simply defined as a "method of verifying the authority" of a user accessing the information or device. Cal. Civil Code § 1798.91.05(a).
Cal. Civil Code § 1798.91.06.
That said, those laws generally require stricter provisions for security measures.
Cal. Civil Code § 1798.91.06(e).
See Cal. Bus. & Prof. Code § 17204.
S.B. 1121, 2017–2018 Reg. Sess. (Cal. 2018).

The following Gibson Dunn lawyers assisted in the preparation of this client alert: Joshua A. Jessen, Benjamin B. Wagner, and Cassandra L. Gaedt-Sheckter. Gibson Dunn's lawyers are available to assist with any questions you may have regarding these issues.
For further information, please contact the Gibson Dunn lawyer with whom you usually work or the following leaders and members of the firm’s Privacy, Cybersecurity and Consumer Protection practice group: United States Alexander H. Southwell – Co-Chair, New York (+1 212-351-3981, email@example.com) M. Sean Royall – Dallas (+1 214-698-3256, firstname.lastname@example.org) Debra Wong Yang – Los Angeles (+1 213-229-7472, email@example.com) Christopher Chorba – Los Angeles (+1 213-229-7396, firstname.lastname@example.org) Richard H. Cunningham – Denver (+1 303-298-5752, email@example.com) Howard S. Hogan – Washington, D.C. (+1 202-887-3640, firstname.lastname@example.org) Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, email@example.com) Kristin A. Linsley – San Francisco (+1 415-393-8395, firstname.lastname@example.org) H. Mark Lyon – Palo Alto (+1 650-849-5307, email@example.com) Shaalu Mehra – Palo Alto (+1 650-849-5282, firstname.lastname@example.org) Karl G. Nelson – Dallas (+1 214-698-3203, email@example.com) Eric D. Vandevelde – Los Angeles (+1 213-229-7186, firstname.lastname@example.org) Benjamin B. Wagner – Palo Alto (+1 650-849-5395, email@example.com) Michael Li-Ming Wong – San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, firstname.lastname@example.org) Ryan T. Bergsieker – Denver (+1 303-298-5774, email@example.com) Europe Ahmed Baladi – Co-Chair, Paris (+33 (0)1 56 43 13 00, firstname.lastname@example.org) James A. 
Cox – London (+44 (0)207071 4250, email@example.com) Patrick Doris – London (+44 (0)20 7071 4276, firstname.lastname@example.org) Bernard Grinspan – Paris (+33 (0)1 56 43 13 00, email@example.com) Penny Madden – London (+44 (0)20 7071 4226, firstname.lastname@example.org) Jean-Philippe Robé – Paris (+33 (0)1 56 43 13 00, email@example.com) Michael Walther – Munich (+49 89 189 33-180, firstname.lastname@example.org) Nicolas Autet – Paris (+33 (0)1 56 43 13 00, email@example.com) Kai Gesing – Munich (+49 89 189 33-180, firstname.lastname@example.org) Sarah Wazen – London (+44 (0)20 7071 4203, email@example.com) Alejandro Guerrero – Brussels (+32 2 554 7218, firstname.lastname@example.org) Asia Kelly Austin – Hong Kong (+852 2214 3788, email@example.com) Jai S. Pathak – Singapore (+65 6507 3683, firstname.lastname@example.org) © 2018 Gibson, Dunn & Crutcher LLP Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.