
October 11, 2019 |
California Consumer Privacy Act Update: Regulatory Update

With the California Consumer Privacy Act of 2018 (“CCPA”) set to take effect on January 1, 2020, California Attorney General Xavier Becerra yesterday released the much-anticipated draft regulations operationalizing the CCPA.[1] The CCPA (codified in California Civil Code Sections 1798.100 to 1798.198) aims to give California consumers increased transparency and control over how companies use and share their personal information. It requires most businesses collecting information about California consumers to: disclose their data collection, use, and sharing practices to consumers; delete consumer data upon a consumer’s request; permit consumers to opt out of the sale or sharing of their personal information; and refrain from selling the personal information of consumers under the age of 16 without explicit consent.

As a reminder, the CCPA is a landmark privacy law with broad reach that some have compared to Europe’s General Data Protection Regulation (GDPR). Although the CCPA is a California law, it applies to any entity doing business in California and collecting California consumers’ personal information that meets certain thresholds, thereby impacting a wide range of companies. More information and our prior client alerts can be accessed here, including a summary of the CCPA and initial amendments to the CCPA.

Yesterday’s draft regulations flow from the law’s requirement that, on or before January 1, 2020, the Office of the Attorney General (OAG) promulgate and adopt implementing regulations for the CCPA.[2] The attorney general is now accepting public comments on the draft regulations through December 6, 2019, and the OAG plans to hold four public hearings across California (Sacramento, Los Angeles, San Francisco and Fresno) in early December to collect additional feedback.
If the OAG adheres to its previous guidance, we expect the final regulations to be promulgated 15 days after notice of any changes to the draft regulations, unless the changes are substantial, in which case another 45-day notice period will be triggered.[3] The attorney general’s power to enforce the law is delayed until either July 1, 2020 or six months after the final regulations are issued, whichever comes first. Given the timeline for public comments, it appears that enforcement will likely commence on July 1, 2020, and in any event no earlier than mid-June.

To provide some context for the scope of the draft regulations, we briefly summarize below a number of the key requirements. This description is not exhaustive, however, and you should consult with your regular Gibson Dunn counsel to determine how these draft regulations may affect you and your company. The public comment period is an important opportunity for companies handling consumer information to provide feedback on the draft regulations, so please feel free to contact any of the Gibson Dunn attorneys listed below, all of whom would be happy to assist in formulating responses in advance of the December 6, 2019 deadline.

Notice to Be Provided to Consumers

In keeping with the theme of transparency to consumers, the draft regulations generally require any notices to be in “plain, straightforward language [that] avoid[s] technical or legal jargon,” visible and readable (including on small screens), available in the languages in which the business provides information, and accessible to consumers with disabilities. Businesses should keep in mind that this emphasis on accessibility to consumers has been the backbone of both the CCPA and the regulations. Hence, it will be important for notices and policies to be drafted in plain language for a general audience.
More specifically, the draft regulations describe in some detail how companies should notify consumers of: (1) their data rights at the point of collection (including for brick-and-mortar institutions, which had not expressly been considered by the text of the CCPA); (2) their ability to opt out of the sale of personal information (a sample opt-out button or logo is to be added in a modified version of the regulations[4]); (3) any financial incentive or price or service difference offered in exchange for allowing personal information to be used or sold; and (4) the business’s privacy policy (which must be available in an additional format that allows a consumer to print it out as a separate document, and available in whatever form makes sense for the collection of information).[5]

Process Requirements for Businesses and Consumers

The draft regulations further describe how businesses should procedurally handle consumer data requests, including requests to opt out of the sale of information and requests to delete information.[6] The draft regulations also specify how businesses should verify consumers’ identities when they receive these data requests.[7] Notably, the regulations add further requirements: businesses must keep records of consumer requests for 24 months, and a business that buys, receives for commercial purposes, sells, or shares for commercial purposes the personal information of 4 million or more consumers must compile certain metrics and disclose them in its privacy policy.[8] In addition, the regulations require businesses to provide at least a placeholder response to consumer requests within 10 days, even though substantive responses are not due for 45 days (or 90 days from the date of the request, should an extension be taken within the initial 45 days).[9] Businesses are required to support at least two methods for submitting requests.
This includes, at a minimum, a toll-free telephone number and, if the business operates a website, an interactive webform accessible through the website or mobile application.[10] In other words, simply providing a contact email address in a privacy policy may not be sufficient (a webform will likely be required). Note, however, that the toll-free telephone number may not be required if the pending legislation AB-1564 is signed into law.[11] The draft regulations also require a two-step process for opt-ins following a previous decision to opt out, and for online requests to delete: first, a request submission, and then a separate confirmation (which could be a new email, form, click, etc.).[12]

Collection of Information Only From Sources Other than the Consumer

The draft regulations contain important information for businesses that obtain information from publicly available government sources rather than directly from consumers:

A business that does not collect information directly from consumers does not need to provide a notice at collection to the consumer, but before it can sell a consumer’s personal information, it shall do either of the following: (1) Contact the consumer directly to provide notice that the business sells personal information about the consumer and provide the consumer with a notice of right to opt-out in accordance with section 999.306; or (2) Contact the source of the personal information to: (a) Confirm that the source provided a notice at collection to the consumer in accordance with subsections a and b; and (b) Obtain signed attestations from the source describing how the source gave the notice at collection and including an example of the notice.
Attestations shall be retained by the business for at least two years and made available to the consumer upon request.[13]

Interestingly, this seems to eliminate the need for data scrapers, and other businesses that do not obtain information directly from the consumer, to provide notice at the time of collection. However, the requirement that such businesses provide notice or obtain attestations when the data is sold may be burdensome. The draft regulations do not seem to anticipate obtaining information from non-government public sources, such as publicly available personal data on private websites on the Internet. Under those circumstances, it would seem that a business collecting information from non-governmental sources (including information posted publicly by the consumers themselves on social media or other sites), and then selling that information (under the CCPA’s broad definition of sale), may have to obtain attestations from the companies and websites that host the data, with whom the business likely has no relationship at all. If these regulations go into effect unchanged (e.g., without some form of identified safe harbor), the attestation requirements may have a significant impact on data brokers that collect data from Internet sources.

Clarification of the Non-discrimination Requirement

There had been some speculation regarding how the CCPA’s non-discrimination provision would be enforced (pursuant to Civil Code section 1798.125, a business is not allowed to treat a consumer differently because the consumer exercised a right conferred by the CCPA).
The draft regulations, using examples, clarify that “a business may offer a price or service difference if it is reasonably related to the value of the consumer’s data as that term is defined in section 999.337.”[14] In other words, a business may charge a higher price to consumers who choose not to share their personal data, so long as the price differential is reasonably related to the “value of the consumer’s data.” Section 999.337 provides eight methods, one or more of which can be used to calculate this value.

Conclusion

While the draft regulations have provided much-needed clarity on a number of process-related questions, several areas of uncertainty remain. Previously, the OAG had indicated that, in addition to what the regulations now address, it would also clarify or define (1) categories of personal information, (2) unique identifiers (things used to identify an entity connected to the Internet), and (3) exceptions to the law (due to conflicts with state or federal laws, trade secrets, or other forms of intellectual property). However, the draft regulations do not appear to provide any significant guidance on these topics. It is important to use the opportunity the OAG has provided for comments to weigh in on the issues that remain unclear. Companies currently undergoing CCPA compliance efforts should continue to consider the additional insight gathered from these regulations, and we are available to assist with your inquiries as needed.

___________________________

[1] Press Release, Attorney General Becerra Publicly Releases Proposed Regulations under the California Consumer Privacy Act (October 10, 2019), available at https://oag.ca.gov/news/press-releases/attorney-general-becerra-publicly-releases-proposed-regulations-under-california. The entire text of the draft regulations is available at https://oag.ca.gov/sites/all/files/agweb/pdfs/privacy/ccpa-proposed-regs.pdf.
[2] Cal. Civ. Code § 1798.185(a).
[3] See https://oag.ca.gov/sites/all/files/agweb/pdfs/privacy/ccpa-public-forum-ppt.pdf.
[4] Draft Regulations § 999.306(e).
[5] See Draft Regulations §§ 999.305(a)(2), 999.306(a)(2), 999.307(a)(2), 999.308(a)(2).
[6] Draft Regulations §§ 999.312–315.
[7] Draft Regulations § 999.323.
[8] Draft Regulations § 999.317.
[9] Draft Regulations § 999.313.
[10] Draft Regulations § 999.312.
[11] AB-1564 is presently on Governor Newsom’s desk, awaiting final approval.
[12] Draft Regulations § 999.312(d), § 999.312(a).
[13] Draft Regulations § 999.305(d) (emphasis added).
[14] Draft Regulations § 999.336.

The following Gibson Dunn lawyers assisted in the preparation of this client update: Alex Southwell, Mark Lyon, Cassandra Gaedt-Sheckter, Arjun Rangarajan and Tony Bedel.

Gibson Dunn’s lawyers are available to assist in addressing any questions you may have regarding these developments. Please contact the Gibson Dunn lawyer with whom you usually work, or any member of the firm’s California Consumer Privacy Act Task Force or its Privacy, Cybersecurity and Consumer Protection practice group:

California Consumer Privacy Act Task Force
Ryan T. Bergsieker – Denver (+1 303-298-5774, rbergsieker@gibsondunn.com)
Cassandra L. Gaedt-Sheckter – Palo Alto (+1 650-849-5203, cgaedt-sheckter@gibsondunn.com)
Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com)
H. Mark Lyon – Palo Alto (+1 650-849-5307, mlyon@gibsondunn.com)
Arjun Rangarajan – Palo Alto (+1 650-849-5398, arangarajan@gibsondunn.com)
Alexander H. Southwell – New York (+1 212-351-3981, asouthwell@gibsondunn.com)
Deborah L. Stein (+1 213-229-7164, dstein@gibsondunn.com)
Eric D. Vandevelde – Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com)
Benjamin B. Wagner – Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com)

Please also feel free to contact any member of the Privacy, Cybersecurity and Consumer Protection practice group:

United States
Alexander H. Southwell – Co-Chair, PCCP Practice, New York (+1 212-351-3981, asouthwell@gibsondunn.com)
M. Sean Royall – Dallas (+1 214-698-3256, sroyall@gibsondunn.com)
Debra Wong Yang – Los Angeles (+1 213-229-7472, dwongyang@gibsondunn.com)
Olivia Adendorff – Dallas (+1 214-698-3159, oadendorff@gibsondunn.com)
Matthew Benjamin – New York (+1 212-351-4079, mbenjamin@gibsondunn.com)
Ryan T. Bergsieker – Denver (+1 303-298-5774, rbergsieker@gibsondunn.com)
Richard H. Cunningham – Denver (+1 303-298-5752, rhcunningham@gibsondunn.com)
Howard S. Hogan – Washington, D.C. (+1 202-887-3640, hhogan@gibsondunn.com)
Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com)
Kristin A. Linsley – San Francisco (+1 415-393-8395, klinsley@gibsondunn.com)
H. Mark Lyon – Palo Alto (+1 650-849-5307, mlyon@gibsondunn.com)
Karl G. Nelson – Dallas (+1 214-698-3203, knelson@gibsondunn.com)
Deborah L. Stein (+1 213-229-7164, dstein@gibsondunn.com)
Eric D. Vandevelde – Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com)
Benjamin B. Wagner – Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com)
Michael Li-Ming Wong – San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, mwong@gibsondunn.com)

Europe
Ahmed Baladi – Co-Chair, PCCP Practice, Paris (+33 (0)1 56 43 13 00, abaladi@gibsondunn.com)
James A. Cox – London (+44 (0)20 7071 4250, jacox@gibsondunn.com)
Patrick Doris – London (+44 (0)20 7071 4276, pdoris@gibsondunn.com)
Bernard Grinspan – Paris (+33 (0)1 56 43 13 00, bgrinspan@gibsondunn.com)
Penny Madden – London (+44 (0)20 7071 4226, pmadden@gibsondunn.com)
Michael Walther – Munich (+49 89 189 33-180, mwalther@gibsondunn.com)
Kai Gesing – Munich (+49 89 189 33-180, kgesing@gibsondunn.com)
Sarah Wazen – London (+44 (0)20 7071 4203, swazen@gibsondunn.com)
Vera Lukic – Paris (+33 (0)1 56 43 13 00, vlukic@gibsondunn.com)
Alejandro Guerrero – Brussels (+32 2 554 7218, aguerrero@gibsondunn.com)

Asia
Kelly Austin – Hong Kong (+852 2214 3788, kaustin@gibsondunn.com)
Jai S. Pathak – Singapore (+65 6507 3683, jpathak@gibsondunn.com)

© 2019 Gibson, Dunn & Crutcher LLP

Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.

September 16, 2019 |
Ninth Circuit Issues Decision in Closely Watched Data Scraping Case

On September 9, 2019, the Ninth Circuit issued its long-anticipated decision in hiQ v. LinkedIn, one of the most closely watched data scraping cases in years. Affirming the district court’s decision, the Ninth Circuit held that data analytics company hiQ was entitled to a preliminary injunction forbidding LinkedIn from denying hiQ access to publicly available LinkedIn member profiles.

Background

Many companies harvest or “scrape” electronic data from third parties—sometimes with the third party’s permission, sometimes without. In some cases, data analytics companies like hiQ scrape data, aggregate it, apply their own algorithms, and sell the resulting data analytics products and services. In other cases, companies may scrape data for internal research purposes. hiQ’s case against LinkedIn attracted significant interest from data hosting platforms, data analytics companies, and other companies that engage in data scraping, as well as from public interest organizations that were divided over the issue: the Electronic Privacy Information Center warned of the privacy risks associated with data scraping, while the Electronic Frontier Foundation and other entities emphasized the need for open access to information online.

The Issue in hiQ v. LinkedIn

LinkedIn is a professional networking website with over 500 million members, on which users post resumes and job listings and build connections with other members. LinkedIn members retain ownership of the information they submit, which they license non-exclusively to LinkedIn. Members can choose whether to make their LinkedIn profiles visible only to direct connections, to certain LinkedIn members, to all LinkedIn members, or—as relevant here—to the general public. hiQ Labs is a data analytics company that uses automated bots to scrape information from public LinkedIn profiles, and then aggregates that data to create “people analytics” tools that it sells to business clients.
In May 2017, LinkedIn sent hiQ a cease-and-desist letter, asserting that hiQ violated LinkedIn’s User Agreement, demanding that hiQ stop accessing and copying user data from LinkedIn, and warning hiQ that continued activity would violate state and federal law, including the Computer Fraud and Abuse Act (“CFAA”), the Digital Millennium Copyright Act (“DMCA”), and the California common law of trespass. Shortly after, hiQ filed suit, seeking injunctive and declaratory relief in order to continue scraping data from LinkedIn’s public pages. In August 2017, the district court granted hiQ’s motion for a preliminary injunction and ordered LinkedIn to refrain from erecting any legal or technical barriers to hiQ’s access to public profiles. The Ninth Circuit heard oral argument in March 2018.

The Ninth Circuit’s Opinion

In an opinion authored by Judge Marsha S. Berzon, the Ninth Circuit affirmed the district court’s grant of a preliminary injunction. First, the Ninth Circuit found that hiQ had demonstrated a likelihood of irreparable harm absent a preliminary injunction. Crediting the district court’s determination that hiQ’s entire business depends on access to public LinkedIn profiles, the Ninth Circuit found that the record supported hiQ’s assertions that, absent a preliminary injunction, it would be forced to breach existing contracts, forgo prospective deals, lay off most of its employees, and shutter its business. Second, the Ninth Circuit upheld the district court’s determination that the balance of hardships tips in hiQ’s favor, pointing out that LinkedIn has no protected property interest in the data contributed by its users, who retain ownership of their profiles, and that LinkedIn users who choose to make their profiles public have little expectation of privacy with respect to the information they post publicly.
Third, the Ninth Circuit held that, under the sliding-scale approach to the preliminary injunction factors, because the balance of hardships tipped decidedly in hiQ’s favor, hiQ satisfied the likelihood-of-success prong by raising serious questions going to the merits. The Ninth Circuit agreed with the district court that hiQ had shown a likelihood of success on its tortious interference with contract claim, pointing out that LinkedIn knew hiQ scraped data from its servers for hiQ’s own products and services, and that LinkedIn’s competitive business interests were insufficient to justify its interference with hiQ’s existing contracts. The Ninth Circuit rejected LinkedIn’s affirmative defense that hiQ had accessed LinkedIn data “without authorization” under the CFAA, 18 U.S.C. § 1030, and that the CFAA preempted hiQ’s state law claims. Authorization, the panel wrote, “is an affirmative notion, indicating that access is restricted to those specially recognized or admitted”: The wording of the statutory phrase “‘[a]ccess[] . . . without authorization,’ 18 U.S.C. § 1030(a)(2), suggests a baseline in which access is not generally available and so permission is ordinarily required.” That interpretation, Judge Berzon noted, is confirmed by the legislative history of the CFAA, which was enacted to prevent intentional computer hacking—an act “analogous to that of ‘breaking and entering.’” “Public LinkedIn profiles, available to anyone with an Internet connection,” the Court explained, therefore do not constitute information for which authorization or access permission is generally required. Further, the Ninth Circuit cautioned that the rule of lenity favors a narrow interpretation of the “without authorization” provision, as Section 1030 is primarily a criminal statute and statutes must be interpreted consistently in the criminal and civil contexts. Finally, the Ninth Circuit found that, on balance, the public interest favors granting the preliminary injunction. 
The Ninth Circuit observed that, although LinkedIn had an obvious interest in blocking abusive users and thwarting attacks on its servers, the injunction does not prevent it from employing anti-bot measures to combat such abuses. And permitting companies like LinkedIn that collect large amounts of data “to decide, on any basis, who can collect and use” user data posted publicly on their platforms “risks the possible creation of information monopolies that would disserve the public interest.”

What To Expect

The Ninth Circuit’s opinion—although framed narrowly as deferring to the district court’s determinations on the preliminary injunction record in this case—is likely to be relied upon by companies seeking to scrape publicly available data from public websites. The contours of Section 1030 liability have been the subject of competing interpretations in different Circuits. The First, Fifth, Seventh, and Eleventh Circuits have adopted a broad view of the CFAA, extending Section 1030 liability even to misuse or misappropriation of information lawfully accessed, as when a corporate employee with valid login credentials provides files to a competitor, see, e.g., United States v. Rodriguez, 628 F.3d 1258, 1263 (11th Cir. 2010); United States v. John, 597 F.3d 263, 271 (5th Cir. 2010); Int’l Airport Ctrs., L.L.C. v. Citrin, 440 F.3d 418, 420–21 (7th Cir. 2006); EF Cultural Travel BV v. Explorica, Inc., 274 F.3d 577, 581–84 (1st Cir. 2001), and permitting civil CFAA claims to proceed based on violations of a website’s terms of service, Sw. Airlines Co. v. Farechase, Inc., 318 F. Supp. 2d 435, 439–40 (N.D. Tex. 2004).
By contrast, the Second, Fourth, and Ninth Circuits have adopted a narrow view, interpreting the CFAA as a restriction on unauthorized access rather than on “mere use of a computer” (including use in violation of a website’s terms of service), and thus limiting Section 1030 liability to those who, through disingenuous means, gain access to data in a manner analogous to “breaking and entering.” H.R. Rep. No. 98–894, at 3706 (1984); see, e.g., United States v. Valle, 807 F.3d 508, 523–28 (2d Cir. 2015); WEC Carolina Energy Sols. LLC v. Miller, 687 F.3d 199, 205–06 (4th Cir. 2012); United States v. Nosal (Nosal I), 676 F.3d 854, 857–63 (9th Cir. 2012) (en banc). The decision in hiQ v. LinkedIn marks a long-anticipated addition to that landscape, reaffirming the Ninth Circuit’s narrow approach while providing additional clarity as to the scope of Section 1030’s “without authorization” provision.

Companies should not be too quick to view the Ninth Circuit’s opinion as an invitation or green light to scrape, however. The Ninth Circuit was careful to point out that “victims of data scraping are not without resort” and, in particular, that accessing and scraping data without the website owner’s consent may give rise to a common law tort claim for trespass to chattels. Going forward, we can expect that companies seeking to prevent data scraping will rely less on the CFAA and more on state law claims such as trespass to chattels. In addition, the Ninth Circuit’s opinion “is premised on a distinction between information presumptively accessible to the general public and information for which authorization is generally required.” Companies seeking to prevent scraping may attempt to demarcate data as non-public by requiring authorization or authentication measures or otherwise restricting access.

The following Gibson Dunn lawyers prepared this client update: Alexander Southwell, Matthew Benjamin, Alexandra Perloff-Giles and Erica Sollazzo Payne.
Gibson Dunn’s lawyers are available to assist with any questions you may have regarding these issues. For further information, please contact the Gibson Dunn lawyer with whom you usually work or any of the following leaders and members of the firm’s Privacy, Cybersecurity and Consumer Protection practice group:

United States
Alexander H. Southwell – Co-Chair, PCCP Practice, New York (+1 212-351-3981, asouthwell@gibsondunn.com)
M. Sean Royall – Dallas (+1 214-698-3256, sroyall@gibsondunn.com)
Debra Wong Yang – Los Angeles (+1 213-229-7472, dwongyang@gibsondunn.com)
Olivia Adendorff – Dallas (+1 214-698-3159, oadendorff@gibsondunn.com)
Matthew Benjamin – New York (+1 212-351-4079, mbenjamin@gibsondunn.com)
Ryan T. Bergsieker – Denver (+1 303-298-5774, rbergsieker@gibsondunn.com)
Richard H. Cunningham – Denver (+1 303-298-5752, rhcunningham@gibsondunn.com)
Howard S. Hogan – Washington, D.C. (+1 202-887-3640, hhogan@gibsondunn.com)
Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com)
Kristin A. Linsley – San Francisco (+1 415-393-8395, klinsley@gibsondunn.com)
Karl G. Nelson – Dallas (+1 214-698-3203, knelson@gibsondunn.com)
Eric D. Vandevelde – Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com)
Benjamin B. Wagner – Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com)
Michael Li-Ming Wong – San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, mwong@gibsondunn.com)

Europe
Ahmed Baladi – Co-Chair, PCCP Practice, Paris (+33 (0)1 56 43 13 00, abaladi@gibsondunn.com)
James A. Cox – London (+44 (0)20 7071 4250, jacox@gibsondunn.com)
Patrick Doris – London (+44 (0)20 7071 4276, pdoris@gibsondunn.com)
Penny Madden – London (+44 (0)20 7071 4226, pmadden@gibsondunn.com)
Jean-Philippe Robé – Paris (+33 (0)1 56 43 13 00, jrobe@gibsondunn.com)
Michael Walther – Munich (+49 89 189 33-180, mwalther@gibsondunn.com)
Kai Gesing – Munich (+49 89 189 33-180, kgesing@gibsondunn.com)
Sarah Wazen – London (+44 (0)20 7071 4203, swazen@gibsondunn.com)
Vera Lukic – Paris (+33 (0)1 56 43 13 00, vlukic@gibsondunn.com)
Alejandro Guerrero – Brussels (+32 2 554 7218, aguerrero@gibsondunn.com)

Asia
Kelly Austin – Hong Kong (+852 2214 3788, kaustin@gibsondunn.com)
Jai S. Pathak – Singapore (+65 6507 3683, jpathak@gibsondunn.com)

© 2019 Gibson, Dunn & Crutcher LLP

Attorney Advertising: The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.

September 9, 2019 |
Google and YouTube Reach Historic Settlement with FTC and New York AG Over Alleged COPPA Violations

On September 4, 2019, Google and its subsidiary YouTube agreed to pay a record $170 million fine to settle allegations by the Federal Trade Commission (“FTC”) and the New York Attorney General (“AG”) that YouTube harvested children’s personal data in violation of the Children’s Online Privacy Protection Act (“COPPA”) Rule, 16 C.F.R. § 312. The proposed settlement—which will require Google and YouTube to pay $136 million to the FTC and $34 million to the State of New York—represents the largest civil penalty ever imposed under COPPA since the legislation’s enactment in 1998, eclipsing the previous record of $5.7 million paid by video social networking app Musical.ly (now known as TikTok) earlier this year. The settlement is the latest in a string of aggressive, high-stakes enforcement actions against companies alleged to have committed privacy-related violations—a trend we expect to continue in coming years. Moreover, the deal signals a notable expansion of the circumstances in which third-party platforms are considered to be directed to children, or to possess actual knowledge that they are collecting personal information from users of a child-directed site or service, thereby potentially extending COPPA’s reach to businesses that previously may not have considered the need for COPPA compliance.

Background

The FTC’s COPPA Rule imposes certain obligations on operators of websites or online services directed to children under the age of 13 that collect, use, or disclose personal information from children, as well as on websites or online services that are deemed to be directed to children because they have actual knowledge that they collect personal information from users of other child-directed websites or online services.[1] Such obligations include providing notice of the operator’s data collection practices and obtaining parental consent prior to the collection of personal information from children.
A violation of the Rule constitutes an unfair or deceptive act or practice in or affecting commerce in violation of Section 5(a) of the FTC Act, 15 U.S.C. § 45(a), for which the FTC may seek civil penalties.[2]

The FTC and New York AG’s complaint alleges that Google and YouTube violated the COPPA Rule by collecting persistent identifiers—which track users’ actions on the Internet—from viewers of “child-directed channels” without first providing notice to parents and obtaining their consent, and that YouTube then used these identifiers to provide targeted advertising to underage viewers in exchange for nearly $50 million in advertising revenue. The complaint claims that content creators of child-directed channels are “operators” for purposes of the COPPA Rule because they collect children’s personal information, and that YouTube was aware that these channels were directed at children, marketed itself as a kid-friendly platform, used a content rating system that includes categories for children under 13, and specifically curated content for a separate mobile application called “YouTube Kids.” As such, the complaint contends, Google and YouTube had actual knowledge that they collected personal information, including persistent identifiers, from viewers of channels and content directed to children under 13, and thus are deemed to operate an online service directed to children under COPPA.

In addition to the $170 million civil penalty, the settlement will require Google and YouTube to notify channel owners that child-directed content may be subject to the COPPA Rule’s requirements and implement a system for YouTube channel owners to designate whether their content is directed at children. This is significant “fencing in” relief because COPPA does not itself require platforms that host and serve ads on child-directed content, but do not create content themselves, to inquire as to whether content is directed at children.
In addition, YouTube and Google must provide annual training to relevant employees regarding COPPA Rule compliance and are enjoined from violating the notice and consent provisions of the Rule in the future. In practice, these measures will place responsibility on YouTube as well as individual content creators to proactively identify any child-directed content on the platform and obtain the requisite notice and consent required under the Rule. YouTube has already publicly stated that it will begin limiting data collection on child-directed content and will no longer offer personalized ads on child-directed videos.[3]

The FTC approved the settlement in a 3-2 vote. In a joint statement, Chairman Joe Simons and Commissioner Christine Wilson characterized the settlement as “a significant victory” that “sends a strong message to children’s content providers and to platforms about their obligations to comply with the COPPA Rule.”[4] In a separate statement, Commissioner Noah Phillips expressed support for the settlement while urging Congress to enact privacy legislation that includes more detailed guidance as to how the FTC should calculate civil penalties in privacy cases, where harm is often difficult to quantify.[5] In their dissenting statements, Commissioners Rohit Chopra and Rebecca Kelly Slaughter criticized the proposed settlement as insufficient because it fails to (i) hold senior executives at Google and YouTube individually accountable, (ii) impose injunctive relief requiring YouTube to “fundamentally change its business practices,” and (iii) impose a monetary penalty of an amount sufficient to deter future misconduct.[6] The settlement is currently pending judicial approval in the United States District Court for the District of Columbia.
Key Takeaways: The FTC May Contend that COPPA Applies to Platforms, Ad Networks, and Others that Are Aware (or Reasonably Should Be Aware) that They Collect Personal Information from Users of Child-Directed Sites or Content – Despite the fact that YouTube offers products and services to the general public, requires users to be over the age of 13 for use of most features, and does not itself create content, the FTC and New York AG concluded that YouTube was covered by the COPPA Rule because it allegedly had “actual knowledge” that it was collecting personal information from viewers of channels and content directed to children under 13. Similarly, in December 2018, the New York AG took enforcement action against a non-consumer-facing internet service provider that does not itself operate a child-directed website because it was aware that several of its clients’ websites were directed to children under 13. These cases have potentially far-reaching implications for companies that offer apps, websites, and other services that do not target children or position themselves as serving children, but which, in fact, collect personal information from users of child-directed sites in some manner and have actual knowledge of such collection. Regulators Continue to Focus on Privacy Issues (Particularly Children’s Privacy Issues) – The YouTube settlement is another example of a long-running effort by regulators—in particular, the FTC and New York AG—to investigate and enforce against companies and individuals for privacy-related violations. In July, the FTC announced its largest monetary settlement to date in connection with alleged data privacy-related violations by a social media company, and the New York AG has pursued a number of actions based on alleged COPPA violations, including an $835,000 settlement in 2016 with several children’s brands that were allegedly tracking users to serve ads. 
In light of the increased focus on privacy-related violations in recent years, we anticipate that this trend will continue at both the federal and state level. Regulators Are Increasingly Willing to Collaborate on Privacy Enforcement Efforts – The joint effort by the FTC and New York AG against YouTube and Google is but one recent example of regulators pooling resources to enforce against large companies. In December 2018, twelve state Attorneys General, led by the Indiana Attorney General, filed the first ever multi-state data breach lawsuit against a healthcare information technology company and its subsidiary related to a breach that compromised the personal data of 3.9 million people. This is in line with a broader trend of states continuing to coordinate enforcement efforts through multi-state litigations arising from large-scale data breaches and other alleged violations. The FTC Is Re-Setting Its Standards for Calculating Civil Penalties to Force Companies to Reevaluate the Consequences of Noncompliance – In a statement accompanying the settlement, Chairman Simons and Commissioner Wilson noted that the YouTube settlement is nearly 30 times higher than the largest fine previously imposed under COPPA. And, as noted above, the FTC announced its largest-ever civil penalty in July 2019 as part of a settlement over alleged privacy-related violations—a self-proclaimed “paradigm shift” in consumer privacy enforcement. There, the FTC noted that the penalty was over 20 times greater than the largest fine under the EU’s General Data Protection Regulation (“GDPR”) and one of the largest civil penalties in U.S. history, surpassed only by cases involving widespread environmental damage and financial fraud. In sum, the FTC has expressed a willingness in recent years to impose civil penalties much higher than any previous fine as a way of setting a high-water mark to deter others from committing future violations. 
The FTC Continues to Resolve Enforcement Actions Using Highly Prescriptive Consent Orders – In the wake of the LabMD decision, the FTC has resolved numerous cases using highly prescriptive consent orders that mandate sweeping injunctive relief. In LabMD, the Eleventh Circuit found that an FTC cease and desist order mandating a “complete overhaul of LabMD’s data-security program to meet an indeterminable standard of reasonableness” was unenforceable because it lacked adequate specificity.[7] Since then, the FTC has increasingly used consent orders that require detailed injunctive measures to resolve enforcement actions. For example, this latest settlement explicitly requires YouTube to implement a system for channel owners to designate whether their content is directed at children, rather than simply prohibiting YouTube from violating the COPPA Rule again in the future. We can expect the Commission to continue this trend. Regulators Are Taking Enforcement Actions Based on Highly Technical Aspects of Privacy Compliance – Another noteworthy aspect of the YouTube settlement is that it demonstrates regulators’ ability and willingness to assess highly technical aspects of privacy compliance—such as persistent identifiers—as part of their investigations and enforcement efforts. This is a stark change from several years ago, when enforcement actions tended not to implicate such technical aspects of a company’s products and services. In light of this trend, companies should ensure close coordination among their in-house counsel, IT team, and outside counsel experienced with technical issues, in order to meaningfully evaluate and adopt controls to address and mitigate potential compliance risks. The complaint against Google and YouTube can be accessed at: https://www.ftc.gov/system/files/documents/cases/youtube_complaint.pdf. 
The proposed settlement with Google and YouTube can be accessed at: https://www.ftc.gov/system/files/documents/cases/172_3083_youtube_coppa_consent_order.pdf Gibson Dunn’s 2019 U.S. Cybersecurity and Data Privacy Outlook and Review can be accessed at: https://www.gibsondunn.com/us-cybersecurity-and-data-privacy-outlook-and-review-2019/ ______________________ [1] 16 C.F.R. § 312.2. [2] 15 U.S.C. § 6502(c); 15 U.S.C. § 57a(d)(3). [3] Susan Wojcicki, An update on kids and data protection on YouTube, YouTube Official Blog (Sept. 4, 2019), https://youtube.googleblog.com/2019/09/an-update-on-kids.html. [4] Statement of Joseph J. Simons & Christine S. Wilson Regarding FTC and People of the State of New York v. Google LLC and YouTube, LLC (Sept. 4, 2019), here. [5] Separate Statement of Commissioner Noah Joshua Phillips, United States of America and People of the State of New York v. Google LLC and YouTube, LLC (Sept. 4, 2019), here. [6] Dissenting Statement of Commissioner Rohit Chopra, In the Matter of Google LLC and YouTube, LLC (Sept. 4, 2019), here; Dissenting Statement of Commissioner Rebecca Kelly Slaughter, In the Matter of Google LLC and YouTube, LLC (Sept. 4, 2019), here. [7] LabMD, Inc. v. FTC, No. 16-16279, slip op. at 17-18 (11th Cir. June 6, 2018). The following Gibson Dunn lawyers prepared this client update: Alexander Southwell, Rich Cunningham, Olivia Adendorff, Ryan Bergsieker, Ashley Rogers and Lucie Duvall. Gibson Dunn’s lawyers are available to assist with any questions you may have regarding these issues. For further information, please contact the Gibson Dunn lawyer with whom you usually work or any of the following leaders and members of the firm’s Privacy, Cybersecurity and Consumer Protection practice group: United States Alexander H. Southwell – Co-Chair, PCCP Practice, New York (+1 212-351-3981, asouthwell@gibsondunn.com) M.
Sean Royall – Dallas (+1 214-698-3256, sroyall@gibsondunn.com) Debra Wong Yang – Los Angeles (+1 213-229-7472, dwongyang@gibsondunn.com) Olivia Adendorff (+1 214-698-3159, oadendorff@gibsondunn.com) Ryan T. Bergsieker – Denver (+1 303-298-5774, rbergsieker@gibsondunn.com) Richard H. Cunningham – Denver (+1 303-298-5752, rhcunningham@gibsondunn.com) Howard S. Hogan – Washington, D.C. (+1 202-887-3640, hhogan@gibsondunn.com) Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com) Kristin A. Linsley – San Francisco (+1 415-393-8395, klinsley@gibsondunn.com) Karl G. Nelson – Dallas (+1 214-698-3203, knelson@gibsondunn.com) Eric D. Vandevelde – Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com) Benjamin B. Wagner – Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com) Michael Li-Ming Wong – San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, mwong@gibsondunn.com) Europe Ahmed Baladi – Co-Chair, PCCP Practice, Paris (+33 (0)1 56 43 13 00, abaladi@gibsondunn.com) James A. Cox – London (+44 (0)20 7071 4250, jacox@gibsondunn.com) Patrick Doris – London (+44 (0)20 7071 4276, pdoris@gibsondunn.com) Penny Madden – London (+44 (0)20 7071 4226, pmadden@gibsondunn.com) Jean-Philippe Robé – Paris (+33 (0)1 56 43 13 00, jrobe@gibsondunn.com) Michael Walther – Munich (+49 89 189 33-180, mwalther@gibsondunn.com) Kai Gesing – Munich (+49 89 189 33-180, kgesing@gibsondunn.com) Sarah Wazen – London (+44 (0)20 7071 4203, swazen@gibsondunn.com) Vera Lukic – Paris (+33 (0)1 56 43 13 00, vlukic@gibsondunn.com) Alejandro Guerrero – Brussels (+32 2 554 7218, aguerrero@gibsondunn.com) Asia Kelly Austin – Hong Kong (+852 2214 3788, kaustin@gibsondunn.com) Jai S. Pathak – Singapore (+65 6507 3683, jpathak@gibsondunn.com) © 2019 Gibson, Dunn & Crutcher LLP Attorney Advertising:  The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.

July 10, 2019 |
Can GDPR Hinder AI Made in Europe?

Paris partner Ahmed Baladi is the author of “Can GDPR Hinder AI Made in Europe?” [PDF] published by Cybersecurity Law Report on July 10, 2019.

June 24, 2019 |
Supreme Court Holds That A Federal Ban on “Immoral or Scandalous” Trademarks Violates the First Amendment

Click for PDF Decided June 24, 2019 Iancu v. Brunetti, No. 18-302 Today, the Supreme Court held 6-3 that the Lanham Act’s prohibition on the registration of “Immoral or Scandalous” trademarks infringes the First Amendment. Background: Two terms ago, in Matal v. Tam, 582 U.S. __ (2017), the Supreme Court declared unconstitutional the Lanham Act’s ban on registering trademarks that “disparage” any “person[], living or dead.”  15 U.S.C. § 1052(a).  The Court held that a viewpoint-based ban on trademark registration is unconstitutional, and that the Lanham Act’s disparagement bar was viewpoint-based (permitting registration of marks when their messages celebrate persons, but not when their messages are alleged to disparage).  Against that backdrop, Erik Brunetti, the owner of a streetwear brand whose name sounds like a form of the F-word, sought federal registration of the trademark FUCT.  The U.S. Patent and Trademark Office denied Brunetti’s application under a provision of the Lanham Act that prohibits registration of trademarks that “[c]onsist[] of or comprise[] immoral[] or scandalous matter.”  15 U.S.C. § 1052(a).  On Brunetti’s First Amendment challenge, the Federal Circuit invalidated this “Immoral or Scandalous” provision of the Lanham Act on the ground that it impermissibly discriminated on the basis of viewpoint. Issue:  Does the Lanham Act’s prohibition on the federal registration of “Immoral or Scandalous” trademarks infringe the First Amendment right to freedom of speech? Court’s Holding:  Yes.  In an opinion authored by Justice Kagan on June 24, 2019, the Supreme Court held that the Lanham Act, which bans registration of “immoral … or scandalous matter,” violates the free speech rights guaranteed by the First Amendment because it discriminates on the basis of viewpoint. 
“If the ‘immoral or scandalous’ bar similarly discriminates on the basis of viewpoint, it must also collide with our First Amendment doctrine.” Justice Kagan, writing for the majority What It Means: The argument that the government advanced in this case—that speech is not restricted when you can call your brand or product anything you want even if you cannot get the benefit of federal trademark protection—will not save statutory bans on trademark registration that are viewpoint-based. The Court made clear that its decision was based on the broad reach of the Lanham Act’s ban:  “[T]he ‘immoral or scandalous’ bar is substantially overbroad.  There are a great many immoral and scandalous ideas in the world (even more than there are swearwords), and the Lanham Act covers them all.”  In his concurring opinion, Justice Alito emphasized that the Court’s decision “does not prevent Congress from adopting a more carefully focused statute that precludes the registration of marks containing vulgar terms that play no real part in the expression of ideas,” thus leaving room for legislators to develop a more narrowly tailored alternative. Unless and until a new law is proposed and passed, however, the U.S. Patent and Trademark Office will have no statutory basis to refuse federal registration of potentially vulgar, profane, offensive, disreputable, or obscene words and images. As always, Gibson Dunn’s lawyers are available to assist in addressing any questions you may have regarding developments at the Supreme Court.  Please feel free to contact the following practice leaders: Appellate and Constitutional Law Practice Allyson N. Ho +1 214.698.3233 aho@gibsondunn.com Mark A. Perry +1 202.887.3667 mperry@gibsondunn.com Related Practice: Intellectual Property Wayne Barsky +1 310.552.8500 wbarsky@gibsondunn.com Josh Krevitt +1 212.351.4000 jkrevitt@gibsondunn.com Mark Reiter +1 214.698.3100 mreiter@gibsondunn.com Related Practice: Fashion, Retail and Consumer Products Howard S. 
Hogan +1 202.887.3640 hhogan@gibsondunn.com

June 19, 2019 |
The EU Introduces a New Sanctions Framework in Response to Cyber-Attack Threats

Click for PDF In a previous client alert, we highlighted a recent U.S. sanctions regime aimed at deterring threats of election interference[1], which further expanded the U.S. menu of cyber-related sanctions.[2]  Across the Atlantic, in a step that demonstrates its stated determination to enhance the EU’s cyber defense capabilities[3], on May 17, 2019, the EU established a sanctions framework for targeted restrictive measures to deter and respond to cyber-attacks that constitute an external threat to the EU or its Member States.[4]  The new framework is expounded in two documents, Council Decision (CFSP) 2019/797 and Council Regulation 2019/796. The newly introduced framework is significant for two reasons.  First, the framework enables the EU to implement unilateral cyber sanctions, a move that expands the EU’s sanctions toolkit beyond traditional areas such as terrorism and other international relations-based grounds.[5]  Second, it represents a major, concrete measure that arose out of the EU’s continued interest in developing an open and secure cyberspace, and amid concerns about the malicious use of information and communications technologies by both State and non-State actors.  From the alleged plot by Russia to hack the Organisation for the Prohibition of Chemical Weapons in The Hague in April last year[6] to the cyber-attack on the German Parliament early this year[7], European leaders have been deeply concerned about future cyber-attacks on EU Member States.  In particular, in light of the European Parliament election that took place on May 23-26, 2019, the framework equips the EU with a potent economic instrument to punish cyber-attacks more directly and on a unified front.[8] Modality for Establishing the List of Sanctioned Parties Under the new framework, persons, entities and bodies subject to sanctions will be listed in the Annex to the Council Decision (CFSP) 2019/797 (“Annex I”).  
With a view to ensuring greater consistency in the listing of sanctioned parties, the Council has the sole authority to establish and amend Annex I as needed, and is to review Annex I “at regular intervals and at least every 12 months.”[9]  The Council will review its decision in light of observations or substantial new evidence presented to it. External Threats with a “Significant Effect” The framework applies to “cyber-attacks with significant effect, including attempted cyber-attacks with a potentially significant effect, which constitutes an external threat to the Union or its Member States.”[10]  To be external, it suffices, among other things, that the attack originates from outside the Union, uses infrastructure outside the Union, or is carried out with the support, at the direction, or under the control of a person outside the Union.[11]  The kinds of conduct treated as cyber-attacks include unauthorized access to and interference with IT systems, as well as data interference and interception.  The Council’s approach to assessing the “significant effect” is by and large result-oriented, focusing, inter alia, pursuant to Article 3 of the Council Regulation, on “(a) the scope, scale, impact or severity of disruption caused . . . (d) the amount of economic loss caused . . . (e) the economic benefit gained by the perpetrator [or]. . . (f) the amount or nature of data stolen or the scale of data breaches. . . .”[12] Expansive Reach of the Framework Under the framework, sanctioned persons and entities are those responsible for the cyber-attack, as well as those who attempted the attack, provided “financial, technical or material support” to it, or were otherwise involved in it (e.g. 
directing, encouraging, planning, and facilitating the attack).[13] It is also noteworthy that although the framework primarily targets attacks against Member States and the Union itself, sanctions measures under the framework can also be applied to cyber-attacks with a significant effect against “third States or international organisations,” if sanctions measures are deemed “necessary to achieve common foreign and security policy (CFSP) objectives.”[14]  As an initiative to deter cyber-attacks in general, the targets of cyber-attacks covered under this framework are also expansive, ranging from critical infrastructure to the storage of classified information, as well as essential services necessary for the maintenance and operation of essential social and economic activities, and government functions, including elections.[15] Sanctions Measures under the Framework The primary restrictive measures under the framework are asset freezes and travel bans.  Generally, all funds and economic resources “belonging to, owned, held or controlled by” the sanctioned person or entity will be frozen.[16]  Furthermore, “no funds or economic resources shall be made available directly or indirectly to or for the benefit of” the sanctioned party.[17] In broad terms, these EU financial sanctions are similar to the sanctions attendant to designation on the U.S. Specially Designated Nationals and Blocked Persons List. Comparison with the U.S. Sanctions Regime for Cyber Attacks In the U.S., besides the country-specific programs, the major source of authority for cyber-related sanctions is Executive Order 13694, titled “Blocking the Property of Certain Persons Engaging in Significant Malicious Cyber-Enabled Activities” (“E.O. 13694”), signed into effect by President Barack Obama on April 1, 2015.[18]  The recently promulgated Executive Order 13848 on “Imposing Certain Sanctions in the Event of Foreign Interference in a United States Election” (“E.O. 
13848”), signed by President Donald Trump, places further emphasis on threats of election interference via cyber means.[19] In comparison, the latecomer EU sanctions framework is by and large similar both in terms of the conduct it seeks to deter and the parties potentially subject to sanctions.  Like the U.S. sanctions program, the EU framework covers a wide range of significant interferences and expressly highlights interference with “public elections or the voting process” as one of the enumerated predicate cyber-attacks.[20]  Much like the U.S. program’s focus on “significant” “malicious cyber-enabled activities,” the focus of the EU framework on “willfully carried out” cyber-attacks “with significant effect” gives the Council substantial flexibility and discretion in determining what rises to the level of sanctionable conduct.[21]  In terms of parties covered, both E.O. 13694 (and subsequent E.O. 13848) and the EU framework sanction persons and entities who are responsible for the attack as well as those who are agents, or complicit by providing material assistance, in the commission of the cyber-attack. It is important to note that the EU framework expressly permits the imposition of sanctions on parties whose conduct is directed against “third [non-Member] States or international organisations,” insofar as the EU satisfies itself that the sanctions are necessary to achieve CFSP objectives, namely the EU’s Union-level foreign policy objectives.[22]  In comparison, in the U.S., E.O. 13694 seems to allude to the possibility of imposing sanctions for cyber-attacks against a third party through the language “threat to the . . . foreign policy . . . of the United States.”[23]  Given how recently the framework was adopted, it is unclear to what extent the EU will exercise its authority under this provision, and no other countries have yet commented on it.  
Nonetheless, it is encouraging that both regimes leave open the possibility of sanctions based on cyber-attacks targeting third states. Conclusion & Implications The new framework established by the Council represents a significant effort by the EU to stiffen its response to cyber-attacks.  The framework has broadened EU sanctions both in substance and in scope.  To the extent that the EU framework is comparable to the current U.S. cyber-related sanctions program, it reflects greater synchronization between the EU and the U.S. on the sanctions front.  For the time being, no names have been added to Annex I.  As the list grows, however, businesses should closely assess their existing business relationships with other companies and pay greater attention to their onboarding compliance due diligence.  On the other hand, because the decision to list and delist a sanctioned party is reserved for the Council, there is likely to be greater transparency and legal predictability for compliance purposes. ______________________ [1] See our client alert dated Sep. 25, 2018 entitled U.S. Authorizes Sanctions for Election Interference,  https://www.gibsondunn.com/us-authorizes-sanctions-for-election-interference/, for an analysis of E.O. 13848. [2] See Judith Lee, Cybersecurity Sanctions: A Powerful New Tool, LAW 360 (Apr. 02, 2015), https://www.gibsondunn.com/wp-content/uploads/documents/publications/Lee-Cybersecurity-Sanctions-A-Powerful-New-Tool-Law360.pdf, for an analysis by our Washington D.C. partner Judith Lee on the Obama-era executive order that forms the bulk of the current U.S. cyber-related sanctions program. [3] See Council Press Release 301/19, Declaration by the High Representative on behalf of the EU on respect for the rules-based order in cyberspace (Apr. 
12, 2019), https://www.consilium.europa.eu/en/press/press-releases/2019/04/12/declaration-by-the-high-representative-on-behalf-of-the-eu-on-respect-for-the-rules-based-order-in-cyberspace/. [4] Council Press Release 367/19, Cyber-attacks: Council is now able to impose sanctions (May 17, 2019), https://www.consilium.europa.eu/en/press/press-releases/2019/05/17/cyber-attacks-council-is-now-able-to-impose-sanctions/. [5] See Erica Moret and Patryk Pawlak, European Union Institute for Security Studies, Brief, The EU Cyber Diplomacy Toolbox: towards a cyber sanctions regime?, p. 2 (Jul. 12, 2017), https://www.iss.europa.eu/content/eu-cyber-diplomacy-toolbox-towards-cyber-sanctions-regime. [6] Joe Barnes, UK Plays Pivotal Role In EU’s New Cyber-Attack Sections Regime – ‘This Is Decisive Action’, Express (May 17, 2019), https://www.express.co.uk/news/uk/1128512/UK-news-EU-cyber-attack-section-regime-European-Council-latest-update. [7] Thorsten Severin, Andrea Shalal, German Government under Cyber Attack, Shores Up Defenses, Reuters (Mar. 1, 2018), https://www.reuters.com/article/us-germany-cyber/german-government-under-cyber-attack-shores-up-defenses-idUSKCN1GD4C8. [8] See Natalia Drozdiak, EU Agrees Powers to Sanction, Freeze Assets Over Cyber-Attacks, Bloomberg (May 17, 2019), https://www.bloomberg.com/news/articles/2019-05-17/eu-agrees-powers-to-sanction-freeze-assets-over-cyber-attacks. [9] Council Regulation 2019/796 of May 17, 2019, concerning restrictive measures against cyber-attacks threatening the Union or its Member States, preamble, art. 13, O.J. L 129I , 17.5.2019, p. 1–12, http://data.europa.eu/eli/reg/2019/796/oj (hereinafter “Council Regulation 2019/796”). [10] Council Decision (CFSP) 2019/797 of 17 May 2019, concerning restrictive measures against cyber-attacks threatening the Union or its Member States, art. 1(1), O.J. L 129I , 17.5.2019, p. 13–19, http://data.europa.eu/eli/dec/2019/797/oj (hereinafter “Council Decision 2019/797”). [11] Id. art. 1(2). 
[12] Id. art. 3.  The same language is also reflected in Council Regulation 2019/796, art. 2. [13] Council Decision 2019/797, supra note 10, art. 4. [14] Council Regulation 2019/796, supra note 9, art. 1(6). [15] Council Decision 2019/797, supra note 10, art. 1(4). [16] Id. art. 5(1). [17] Id. art. 5(2). [18] See supra note 2 for an analysis of the Executive Order.  See also Exec. Order No. 13694, 80 Fed. Reg. 18,077 (Apr. 2, 2015), https://www.treasury.gov/resource-center/sanctions/Programs/Documents/cyber_eo.pdf, subsequently amended by Executive Order 13757 of December 28, 2016. [19] See supra note 1.  See also Exec. Order No. 13848, 83 Fed. Reg. 46,843 (Sep. 12, 2018), https://www.federalregister.gov/documents/2018/09/14/2018-20203/imposing-certain-sanctions-in-the-event-of-foreign-interference-in-a-united-states-election. [20] Council Decision 2019/797, supra note 10, art. 1(4)(c). [21] See Judith Lee, supra note 2. [22] CFSP objectives, as the Council Decision notes, can be found in relevant provisions of Article 21 of the Treaty on European Union.  
A relevant excerpt of article 21 of the Treaty on European Union: The Union shall define and pursue common policies and actions, and shall work for a high degree of cooperation in all fields of international relations, in order to: (a) safeguard its values, fundamental interests, security, independence and integrity; (b) consolidate and support democracy, the rule of law, human rights and the principles of international law; (c) preserve peace, prevent conflicts and strengthen international security, in accordance with the purposes and principles of the United Nations Charter, with the principles of the Helsinki Final Act and with the aims of the Charter of Paris, including those relating to external borders; (d) foster the sustainable economic, social and environmental development of developing countries, with the primary aim of eradicating poverty; (e) encourage the integration of all countries into the world economy, including through the progressive abolition of restrictions on international trade; (f) help develop international measures to preserve and improve the quality of the environment and the sustainable management of global natural resources, in order to ensure sustainable development; (g) assist populations, countries and regions confronting natural or man-made disasters; and (h) promote an international system based on stronger multilateral cooperation and good global governance. Consolidated Version Of The Treaty On European Union, art. 21, O.J. C 326, 26.10.2012, p. 13–390, available online at http://data.europa.eu/eli/treaty/teu_2012/oj. [23] Compare Exec. Order No. 13694, supra note 18, sec. 1(a)(ii)(A), with Council Decision 2019/797, supra note 10, art. 1(6). The following Gibson Dunn lawyers assisted in preparing this client update: Judith Alison Lee, Adam Smith, Patrick Doris, Michael Walther, Nicolas Autet and Richard Roeder. 
Gibson Dunn’s lawyers are available to assist in addressing any questions you may have regarding the above developments.  Please contact the Gibson Dunn lawyer with whom you usually work, the authors, or any of the following leaders and members of the firm’s International Trade or Privacy, Cybersecurity and Consumer Protection practice groups: United States: Judith Alison Lee – Co-Chair, International Trade Practice, Washington, D.C. (+1 202-887-3591, jalee@gibsondunn.com) Ronald Kirk – Co-Chair, International Trade Practice, Dallas (+1 214-698-3295, rkirk@gibsondunn.com) Alexander H. Southwell – Co-Chair, Privacy, Cybersecurity & Consumer Protection Practice, New York (+1 212-351-3981, asouthwell@gibsondunn.com) Jose W. Fernandez – New York (+1 212-351-2376, jfernandez@gibsondunn.com) Marcellus A. McRae – Los Angeles (+1 213-229-7675, mmcrae@gibsondunn.com) Adam M. Smith – Washington, D.C. (+1 202-887-3547, asmith@gibsondunn.com) Christopher T. Timura – Washington, D.C. (+1 202-887-3690, ctimura@gibsondunn.com) Ben K. Belair – Washington, D.C. (+1 202-887-3743, bbelair@gibsondunn.com) Courtney M. Brown – Washington, D.C. (+1 202-955-8685, cmbrown@gibsondunn.com) Laura R. Cole – Washington, D.C. (+1 202-887-3787, lcole@gibsondunn.com) Stephanie L. Connor – Washington, D.C. (+1 202-955-8586, sconnor@gibsondunn.com) Henry C. Phillips – Washington, D.C. (+1 202-955-8535, hphillips@gibsondunn.com) R.L. Pratt – Washington, D.C. (+1 202-887-3785, rpratt@gibsondunn.com) Audi K. Syarief – Washington, D.C. (+1 202-955-8266, asyarief@gibsondunn.com) Scott R. Toussaint – Washington, D.C. 
(+1 202-887-3588, stoussaint@gibsondunn.com) Europe: Ahmed Baladi – Co-Chair, Privacy, Cybersecurity & Consumer Protection Practice, Paris (+33 (0)1 56 43 13 00, abaladi@gibsondunn.com) Peter Alexiadis – Brussels (+32 2 554 72 00, palexiadis@gibsondunn.com) Nicolas Autet – Paris (+33 1 56 43 13 00, nautet@gibsondunn.com) Attila Borsos – Brussels (+32 2 554 72 10, aborsos@gibsondunn.com) Patrick Doris – London (+44 (0)20 7071 4276, pdoris@gibsondunn.com) Sacha Harber-Kelly – London (+44 20 7071 4205, sharber-kelly@gibsondunn.com) Penny Madden – London (+44 (0)20 7071 4226, pmadden@gibsondunn.com) Steve Melrose – London (+44 (0)20 7071 4219, smelrose@gibsondunn.com) Benno Schwarz – Munich (+49 89 189 33 110, bschwarz@gibsondunn.com) Michael Walther – Munich (+49 89 189 33-180, mwalther@gibsondunn.com) Richard W. Roeder – Munich (+49 89 189 33-160, rroeder@gibsondunn.com)

June 7, 2019 |
Should Consumer Data Privacy Laws Apply To The Gov’t?

Palo Alto partner Mark Lyon and associates Cassandra Gaedt-Sheckter and Arjun Rangarajan are the authors of “Should Consumer Data Privacy Laws Apply To The Gov’t?” [PDF] published by Law360 on June 7, 2019.

April 30, 2019 |
California Consumer Privacy Act Update — California State Committees Vote on Amendments

Click for PDF In the last two weeks, California legislative committees voted on several amendments to the California Consumer Privacy Act (CCPA), which is due to go into effect January 1, 2020.  While each proposal requires additional approvals, including full Assembly and Senate votes, the committees’ determinations mark an important development in the ongoing roll-out of the CCPA, shedding light on what it will ultimately require and how to address compliance. The California Assembly’s Privacy and Consumer Protection Committee approved amendments that included narrowing the scope of personal information, and effectively exempting employee-related information from coverage under the Act.  In addition, the Senate Appropriations Committee unanimously approved S.B. 561 yesterday,[1] which would expand the private right of action against entities that violate the CCPA, and is supported by Attorney General Xavier Becerra.[2]  These amendments, and any other legislative amendments or clarifications, will be further supplemented by regulations promulgated by the Attorney General’s Office, still anticipated to be issued for public comment by Fall 2019. The following is a summary of each of the amendments voted on in the past week, and a chart exhibiting the key changes to the existing language of the CCPA.  As always, we will continue to monitor these important updates. Senate The Senate Judiciary Committee and the Senate Appropriations Committee both voted this month to augment the private right of action for violations of the CCPA with S.B. 561.  Under the current version of the CCPA, consumers have a private right of action only for certain unauthorized disclosures of their data. S.B. 561 would permit a private right of action for any violation of the CCPA, broadly expanding the potential exposure businesses may face.  The bill further removes the 30-day cure period for violations before claims can be brought by the Attorney General.  
Finally, the amendment removes the provision permitting businesses and third parties to seek guidance directly from the Attorney General, replacing it with a statement that the Attorney General may publish materials to provide general guidance on compliance.

Assembly

Several bills in the Assembly also continued to gain traction with a positive vote from the California Assembly’s Privacy and Consumer Protection Committee:

A.B. 25 redefines “consumer” to exclude employees, contractors, agents, and job applicants, so long as their personal information is collected and used by the business only in that context;

A.B. 873 modifies the definition of “personal information” to narrow its scope—including by removing information relating to a household, and information “capable of being associated with” a consumer—and also redefines “deidentified” data;

A.B. 1564 would require businesses to make available to consumers a toll-free telephone number or an email address for submitting requests, and would require businesses with websites to make those websites available to consumers to submit requests for information;

A.B. 846 would modify the way businesses can offer financial incentive plans to consumers in exchange for their data;

A.B. 1146 would exempt vehicle and ownership data collected by automotive dealers and shared with the manufacturers of the vehicle sold, if the vehicle information is shared pursuant to, or in anticipation of, a vehicle repair relating to a warranty or recall; and

A.B. 981 would exempt certain insurance institutions subject to the Insurance Information and Privacy Protection Act (IIPPA) from the CCPA, and would incorporate certain disclosure and other privacy requirements into the IIPPA to align with the CCPA.

Notably, a proposal to revoke and revamp the CCPA, A.B.
1760—which would have required obtaining opt-in consent from consumers before sharing (not just selling) personal information, and would have generally broadened consumers’ rights under the Act—was taken off hearing, and will not move forward, at least at this time.

Potential Impact of the Amendments on Businesses

Arguably the most important changes to the CCPA for businesses interacting with California consumers are the proposed amendments set out in S.B. 561; expanding the private right of action to any violation of the Act has the potential to significantly increase the number of suits brought by individuals, including data privacy class actions, and to magnify the resulting financial impact of the Act on businesses interacting with state residents.  As before, in anticipation of this potential amendment, businesses should work now to analyze the steps necessary to ensure compliance with the various provisions likely to go into effect, including as discussed in our previous client alerts (California Consumer Privacy Act of 2018 (July 2018) and New California Security of Connected Devices Law and CCPA Amendments (October 2018)).  In general, businesses should ensure that they understand the type, nature, and scope of consumer data they have collected, including where it is stored; create processes to comply with the disclosure and other, technically demanding rights (including a Do Not Sell opt-out link on their website, and a request verification and disclosure process); revise service provider agreements for compliance; and review their privacy policies, both internal and public, to ensure that they properly disclose how personal data is collected, used, and potentially shared with third parties.

Certain of the proposed Assembly bill amendments, on the other hand, may serve to narrow the impact on businesses, particularly with respect to the scope of personal information at issue.  The modifications in A.B.
25, clarifying that the CCPA is not intended to cover employees’ data, could minimize the impact on companies that generally do not collect California residents’ personal information other than as a result of employing Californians, and could also minimize the logistical issues that would otherwise arise if businesses had to allow employees to exercise the rights afforded by the Act.  Rather, it would shift the impact of the CCPA primarily to those businesses that rely on collecting data as part of their business model.

The scope of personal information would be further narrowed if A.B. 873 passes, as it may eliminate some of the broader-reaching—and more confusing—applications of the CCPA to household data and data that is “capable of being associated with” a consumer.  The remaining language focuses on information that is linked, directly or indirectly, to a particular consumer.  This change would also address concerns, expressed at multiple public forums on the CCPA, regarding how verifications for data requests should work when an individual requests household data.

A.B. 873 also redefines “deidentified.”  While several of the same guardrails would remain, the new definition would specifically require (1) contractual prohibitions barring recipients of data from reidentifying deidentified personal information, and (2) a public commitment not to reidentify the data.  These requirements may call for revisions to certain internal and third-party contract provisions, as well as modifications to the language in consumer-facing privacy policies.  As a result, it may be important for businesses to re-evaluate their contracts with suppliers, distributors, and contractors to ensure compliance for any use of deidentified data.

Logistically, A.B. 1564 would offer businesses some relief from providing a toll-free telephone number for requests related to the Act, offering instead an option of an email address or a telephone number, together with a website address for consumers to access.
While many businesses may have already included an email address for compliance with related laws, instituting a toll-free telephone number for such requests may impose additional logistical burdens on businesses under the current text of the law.

Finally, for entities offering customer loyalty programs, the new provisions of A.B. 846—replacing the financial incentive provisions—will require particular attention, if passed.  Primarily, businesses will need to ensure that the offerings and their value are “reasonably” related to the value of the data collected, though there may be latitude in what incentives are possible.

Comparison of Proposed Language to Original

The following chart provides a comparison of the key changes that would be made to the language of the CCPA by the more broadly applicable amendments currently moving through the California legislature.  The Original Language column sets out the current language of the Act, while the Proposed Amendment column contains what would be the final text if these amendments are adopted.  We will continue to monitor the progress of these amendments and will provide updates accordingly.[3]

Concept: Introducing Private Right of Action for Any Violation of the Act (S.B. 561)

Original Language: (a) (1) Any consumer whose nonencrypted or nonredacted personal information, . . . is subject to an unauthorized access . . . may institute a civil action for any of the following . . .

Proposed Amendment: (a) (1) Any consumer whose rights under this title are violated, or whose nonencrypted or nonredacted personal information . . . is subject to an unauthorized access . . . may institute a civil action for any of the following . . .

Concept: Excluding Employees from the Definition of Consumer (A.B. 25)

Original Language: (g) “Consumer” means a natural person who is a California resident . . .

Proposed Amendment: (g) (1) “Consumer” means a natural person who is a California resident . . . (g) (2) “Consumer” does not include a natural person whose personal information has been collected by a business in the course of a person acting as a job applicant to, an employee of, a contractor of, or an agent on behalf of the business, to the extent the person’s personal information is collected and used solely within the context of the person’s role as a job applicant to, an employee of, a contractor of, or an agent on behalf of the business.

Concept: Redefining “Deidentified” (A.B. 873)

Original Language: “Deidentified” means information that cannot reasonably identify, relate to, describe, be capable of being associated with, or be linked, directly or indirectly, to a particular consumer, provided that a business that uses deidentified information: (1) Has implemented technical safeguards that prohibit reidentification of the consumer to whom the information may pertain. (2) Has implemented business processes that specifically prohibit reidentification of the information. (3) Has implemented business processes to prevent inadvertent release of deidentified information. (4) Makes no attempt to reidentify the information.

Proposed Amendment: “Deidentified” means information that does not reasonably identify or link, directly or indirectly, to a particular consumer, provided that the business makes no attempt to reidentify the information, and takes reasonable technical and administrative measures designed to: (1) Ensure that the data is deidentified. (2) Publicly commit to maintain and use the data in a deidentified form. (3) Contractually prohibit recipients of the data from trying to reidentify the data.

Concept: Excluding Household Data and Information “Capable of Being Associated With” a Consumer from the Definition of “Personal Information” (A.B. 873)

Original Language: “Personal information” means information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household. Personal information includes, but is not limited to, the following if it identifies, relates to, describes, is capable of being associated with, or could be reasonably linked, directly or indirectly, with a particular consumer or household.

Proposed Amendment: “Personal information” means information that identifies, relates to, describes, or could reasonably be linked, directly or indirectly, with a particular consumer. Personal information may include, but is not limited to, the following if it identifies, relates to, describes, or could be reasonably linked, directly or indirectly, with a particular consumer.

Concept: Prescribing Methods of Contacting Businesses (A.B. 1564)

Original Language: (1) Make available to consumers two or more designated methods for submitting requests for information required to be disclosed pursuant to Sections 1798.110 and 1798.115, including, at a minimum, a toll-free telephone number, and if the business maintains an Internet Web site, a Web site address.

Proposed Amendment: (1) (A) Make available to consumers a toll-free telephone number or an email address for submitting requests for information required to be disclosed pursuant to Sections 1798.110 and 1798.115. (B) If the business maintains an internet website, make the internet website available to consumers to submit requests for information required to be disclosed pursuant to Sections 1798.110 and 1798.115.

Concept: Clarifying Non-discrimination Provision re Financial Incentives: Removing in Favor of Customer Loyalty Programs (A.B. 846)

Original Language: (a) (1) A business shall not discriminate against a consumer because the consumer exercised any of the consumer’s rights under this title, including, but not limited to, by: . . . (B) Charging different prices or rates for goods or services, including through the use of discounts or other benefits or imposing penalties. (C) Providing a different level or quality of goods or services to the consumer. (2) Nothing in this subdivision prohibits a business from charging a consumer a different price or rate, or from providing a different level or quality of goods or services to the consumer, if that difference is reasonably related to the value provided to the consumer by the consumer’s data. (b) (1) A business may offer financial incentives, including payments to consumers as compensation, for the collection of personal information, the sale of personal information, or the deletion of personal information. A business may also offer a different price, rate, level, or quality of goods or services to the consumer if that price or difference is directly related to the value provided to the consumer by the consumer’s data. (2) A business that offers any financial incentives pursuant to subdivision (a), shall notify consumers of the financial incentives pursuant to Section 1798.135. (3) A business may enter a consumer into a financial incentive program only if the consumer gives the business prior opt-in consent pursuant to Section 1798.135 which clearly describes the material terms of the financial incentive program, and which may be revoked by the consumer at any time. (4) A business shall not use financial incentive practices that are unjust, unreasonable, coercive, or usurious in nature.

Proposed Amendment: (a) (1) A business shall not discriminate against a consumer because the consumer exercised any of the consumer’s rights under this title, including, but not limited to, by: . . . (B) Charging higher prices or rates for goods or services, including through the use of discounts or other benefits or imposing penalties. (C) Providing a lower level or quality of goods or services to the consumer. (2) Nothing in this subdivision prohibits a business from offering a different price, rate, level, or quality of goods or services to a consumer, including offering its goods or services for no fee, if any of the following are true: (A) The offering is in connection with a consumer’s voluntary participation in a loyalty, rewards, premium features, discount, or club card program. (B) That difference is reasonably related to the value provided by the consumer’s data. (C) The offering is for a specific good or service whose functionality is reasonably related to the collection, use, or sale of the consumer’s data. (b) As used in this section, “loyalty, rewards, premium features, discount, or club card program” includes an offering to one or more consumers of lower prices or rates for goods or services or a higher level or quality of goods or services, including through the use of discounts or other benefits, or a program through which consumers earn points, rewards, credits, incentives, gift cards or certificates, coupons, or access to sales or discounts on a priority or exclusive basis.

[1]   Although approved unanimously, S.B. 561 was placed on the Suspense File, where the committee sends bills with an annual cost of more than $150,000, to be considered following budget discussions.  The bill will not move forward until the Appropriations Committee releases it for a vote.

[2]   The Senate Judiciary Committee had previously approved the bill 6-2 on April 9, 2019.

[3]   Please note that the foregoing chart does not include language modifications to the IIPPA (A.B.
981) or proposed amendments exempting information shared between automotive dealers and vehicle manufacturers (A.B. 1146), as they are of more limited application than the more general provisions that were included. If you have questions about those particular provisions, please reach out to discuss with us and we would be happy to provide further guidance. Gibson Dunn’s lawyers are available to assist in addressing any questions you may have regarding these developments.  Please contact the Gibson Dunn lawyer with whom you usually work, any member of the firm’s Privacy, Cybersecurity and Consumer Protection practice group, or the authors: H. Mark Lyon – Palo Alto (+1 650-849-5307, mlyon@gibsondunn.com) Cassandra L. Gaedt-Sheckter – Palo Alto (+1 650-849-5203, cgaedt-sheckter@gibsondunn.com) Maya Ziv – Palo Alto (+1 650-849-5336, mziv@gibsondunn.com) Privacy, Cybersecurity and Consumer Protection Group: United States Alexander H. Southwell – Co-Chair, New York (+1 212-351-3981, asouthwell@gibsondunn.com) M. Sean Royall – Dallas (+1 214-698-3256, sroyall@gibsondunn.com) Debra Wong Yang – Los Angeles (+1 213-229-7472, dwongyang@gibsondunn.com) Christopher Chorba – Los Angeles (+1 213-229-7396, cchorba@gibsondunn.com) Richard H. Cunningham – Denver (+1 303-298-5752, rhcunningham@gibsondunn.com) Howard S. Hogan – Washington, D.C. (+1 202-887-3640, hhogan@gibsondunn.com) Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com) Kristin A. Linsley – San Francisco (+1 415-393-8395, klinsley@gibsondunn.com) H. Mark Lyon – Palo Alto (+1 650-849-5307, mlyon@gibsondunn.com) Shaalu Mehra – Palo Alto (+1 650-849-5282, smehra@gibsondunn.com) Karl G. Nelson – Dallas (+1 214-698-3203, knelson@gibsondunn.com) Eric D. Vandevelde – Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com) Benjamin B. 
Wagner – Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com) Michael Li-Ming Wong – San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, mwong@gibsondunn.com) Ryan T. Bergsieker – Denver (+1 303-298-5774, rbergsieker@gibsondunn.com) Europe Ahmed Baladi – Co-Chair, Paris (+33 (0)1 56 43 13 00, abaladi@gibsondunn.com) James A. Cox – London (+44 (0)207071 4250, jacox@gibsondunn.com) Patrick Doris – London (+44 (0)20 7071 4276, pdoris@gibsondunn.com) Bernard Grinspan – Paris (+33 (0)1 56 43 13 00, bgrinspan@gibsondunn.com) Penny Madden – London (+44 (0)20 7071 4226, pmadden@gibsondunn.com) Jean-Philippe Robé – Paris (+33 (0)1 56 43 13 00, jrobe@gibsondunn.com) Michael Walther – Munich (+49 89 189 33-180, mwalther@gibsondunn.com) Nicolas Autet – Paris (+33 (0)1 56 43 13 00, nautet@gibsondunn.com) Kai Gesing – Munich (+49 89 189 33-180, kgesing@gibsondunn.com) Sarah Wazen – London (+44 (0)20 7071 4203, swazen@gibsondunn.com) Alejandro Guerrero – Brussels (+32 2 554 7218, aguerrero@gibsondunn.com) Asia Kelly Austin – Hong Kong (+852 2214 3788, kaustin@gibsondunn.com) Jai S. Pathak – Singapore (+65 6507 3683, jpathak@gibsondunn.com) © 2019 Gibson, Dunn & Crutcher LLP Attorney Advertising:  The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.

April 25, 2019 |
Gibson Dunn Earns 79 Top-Tier Rankings in Chambers USA 2019

In its 2019 edition, Chambers USA: America’s Leading Lawyers for Business awarded Gibson Dunn 79 first-tier rankings, of which 27 were firm practice group rankings and 52 were individual lawyer rankings. Overall, the firm earned 276 rankings – 80 firm practice group rankings and 196 individual lawyer rankings. Gibson Dunn earned top-tier rankings in the following practice group categories:

National – Antitrust
National – Antitrust: Cartel
National – Appellate Law
National – Corporate Crime & Investigations
National – FCPA
National – Outsourcing
National – Real Estate
National – Retail
National – Securities: Regulation
CA – Antitrust
CA – Environment
CA – IT & Outsourcing
CA – Litigation: Appellate
CA – Litigation: General Commercial
CA – Litigation: Securities
CA – Litigation: White-Collar Crime & Government Investigations
CA – Real Estate: Southern California
CO – Litigation: White-Collar Crime & Government Investigations
CO – Natural Resources & Energy
DC – Corporate/M&A & Private Equity
DC – Labor & Employment
DC – Litigation: General Commercial
DC – Litigation: White-Collar Crime & Government Investigations
NY – Litigation: General Commercial: The Elite
NY – Media & Entertainment: Litigation
NY – Technology & Outsourcing
TX – Antitrust

This year, 155 Gibson Dunn attorneys were identified as leading lawyers in their respective practice areas, with some ranked in more than one category. The following lawyers achieved top-tier rankings:  D.
Jarrett Arp, Theodore Boutrous, Jessica Brown, Jeffrey Chapman, Linda Curtis, Michael Darden, William Dawson, Patrick Dennis, Mark Director, Scott Edelman, Miguel Estrada, Stephen Fackler, Sean Feller, Eric Feuerstein, Amy Forbes, Stephen Glover, Richard Grime, Daniel Kolkey, Brian Lane, Jonathan Layne, Karen Manos, Randy Mastro, Cromwell Montgomery, Daniel Mummery, Stephen Nordahl, Theodore Olson, Richard Parker, William Peters, Tomer Pinkusiewicz, Sean Royall, Eugene Scalia, Jesse Sharf, Orin Snyder, George Stamas, Beau Stark, Charles Stevens, Daniel Swanson, Steven Talley, Helgi Walker, Robert Walters, F. Joseph Warin and Debra Wong Yang.

January 29, 2019 |
Illinois Supreme Court Finds BIPA Violations Actionable, Even With No “Actual Injury”

On January 25, 2019, in Rosenbach v. Six Flags Entertainment Corporation, the Illinois Supreme Court unanimously held that a plaintiff may be “aggrieved” under Illinois’ Biometric Information Privacy Act (“BIPA”)—with statutory standing to sue for significant statutory damages—even without alleging an “actual injury” caused by the BIPA violation.[1]  In so holding, the Court reversed the appellate court’s contrary conclusion and—at least for now—appears to have put to rest one outstanding question in several federal and state court proceedings regarding the scope and availability of BIPA’s private right of action.  The Court’s decision is likely to lead to an increase in BIPA litigation in Illinois.  Other states, including Texas and Washington, have biometric privacy statutes,[2] but the Illinois law is the only one that allows for a private right of action.

BIPA Background

Illinois enacted BIPA in 2008 in response to the increasing use of “biometric-facilitated financial transactions” in Illinois.
BIPA regulates the “collection, use, safeguarding, handling, storage, retention, and destruction of biometric identifiers and information,” including retina or iris scans, fingerprints, voiceprints, and scans of hand or face geometry.[3]  Among other requirements, BIPA requires private entities to develop and follow a written, publicly available policy for the retention and destruction of biometric identifiers, and to provide certain disclosures in writing and obtain a release before acquiring an individual’s biometric identifier or information.[4]  Persons “aggrieved by a violation” of BIPA have a private right of action under the statute and may sue for statutory remedies, including the greater of actual damages or liquidated damages of $1,000 (for negligent violations) or $5,000 (for intentional or reckless violations).[5]  BIPA’s private right of action has energized the Illinois plaintiffs’ bar, which in the last few years has filed dozens of proposed class action lawsuits against companies for their allegedly improper collection of alleged biometric information.  Plaintiffs in these cases have generally fallen into two categories: (1) employees of companies that allegedly utilize biometric information, such as fingerprints, for timekeeping purposes; and (2) customers of companies that use alleged biometric information to enhance the consumer experience.  The Rosenbach plaintiff fell into this second group.  Plaintiff Stacy Rosenbach—on behalf of her minor son, a customer of Six Flags Entertainment Corporation (“Six Flags”)—sued Six Flags after her son registered for a season pass at the amusement park.  Six Flags allegedly captured the thumbprints of season pass holders to facilitate entry into the park and limit loss from the unauthorized use of passes by non-pass-holders.
In her suit against Six Flags, Rosenbach alleged that Six Flags violated BIPA by capturing her son’s thumbprint without first providing written notice, obtaining written consent, and publishing a policy explaining how her son’s thumbprint would be used, retained, and destroyed.[6]  She alleged no actual harm beyond the violation of BIPA’s requirements.

The Issue in Rosenbach v. Six Flags

The question presented to the Illinois Supreme Court was whether a plaintiff is “aggrieved” under BIPA, and thus potentially eligible for statutory remedies including liquidated damages, when the only injury she alleges is that the defendant collected her biometric identifiers or biometric information without providing the required disclosures and obtaining written consent as required by the Act.[7]  The Second District Appellate Court held that a “technical violation” of the statute, without more, did not render a plaintiff “aggrieved” under BIPA.  Specifically, the appellate court stated that “there must be an actual injury, adverse effect, or harm in order for the person to be ‘aggrieved,’” and a “technical violation” alone does not suffice.[8]  If a “violation” were “actionable” by itself, the appellate court concluded, that “would render the word ‘aggrieved’ superfluous.”[9]

The Court’s Holding

Reversed.  The Illinois Supreme Court held that a plaintiff is “aggrieved” under BIPA—and has statutory standing to sue—when the plaintiff alleges a violation of her BIPA rights, even if the violation caused no “actual injury or adverse effect.”[10]  In other words, “[t]he violation, in itself, is sufficient to support the individual’s or customer’s statutory cause of action.”[11]  The Court found that BIPA creates a substantive right to control one’s own biometric information.
No-injury BIPA violations are not merely “technicalities,” the Court held, but “real and significant” harms to important rights created by the Illinois legislature.[12]  The Court also reasoned that the private right of action and remedies exist to prevent and deter violations of individuals’ BIPA rights.  Requiring would-be plaintiffs to wait to sue until they have suffered “actual injury” would defeat these purposes of the statute.[13]  Because the Rosenbach plaintiff alleged violations of his BIPA rights—Six Flags allegedly collected his fingerprints for use in a season pass without providing the statutorily mandated notices or publishing a data retention policy—the Supreme Court reversed the appellate court’s contrary decision and remanded the case to the trial court.

What to Expect

Expect more class action litigation on BIPA claims from the Illinois plaintiffs’ bar.  Companies that do business in Illinois and collect or use biometric identifiers or biometric information should examine their policies for BIPA compliance.
Biometric identifier is defined to mean “a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry.”[14]  Writing samples, signatures, photographs, demographic data, physical descriptions, and biological samples used for scientific testing are not biometric identifiers.[15]  Biometric information is any information “based on an individual’s biometric identifier used to identify an individual.”[16]

BIPA Basics:

Private entities may not collect biometric information or identifiers (“biometrics”) without first:  (1) providing written notice of the collection that describes the purpose and terms of the collection and storage, and (2) obtaining written consent.[17]

Private entities may not sell, rent, or disclose biometrics without prior written consent.[18]

Private entities also must develop and make publicly available a data retention policy that sets forth a “retention schedule and guidelines for permanently destroying [biometrics] when the initial purpose for collecting or obtaining [them] has been satisfied or within 3 years of the individual’s last interaction with the private entity, whichever occurs first.”[19]

Private entities must store and protect biometrics according to the reasonable standard of care of the entity’s industry, and in a manner that is as protective as or more protective than the manner in which the entity stores and protects other sensitive information.[20]

Expect additional developments in the federal courts regarding whether BIPA plaintiffs have Article III standing.  Post-Rosenbach, BIPA plaintiffs need not allege an “actual injury” beyond the statutory violation to state a claim under the statute.  But to satisfy the Article III standing requirements necessary to pursue a claim in federal court, plaintiffs may need to allege more than a statutory violation.
To date, federal courts have been split on what type of injury, short of economic harm, may be sufficient to create Article III standing for BIPA plaintiffs.[21]

Expect additional litigation over the scope of Illinois’ standing doctrine.  Amici for Six Flags urged the Illinois Supreme Court to consider an alternate ground for affirmance:  that Rosenbach lacked standing to sue under the Illinois constitution.  The Court did not address the issue.  In lieu of a statutory standing argument, more BIPA defendants may press a state constitutional standing argument in an effort to defeat plaintiffs’ claims.

Look for additional changes in BIPA’s terms.  This year, the Illinois State Senate will consider a bill narrowing the impact of BIPA.[22]

[1] 2019 IL 123186 (Ill. Jan. 25, 2019).
[2] See Tex. Bus. & Com. Code § 503.001 et seq.; Wash. Rev. Code § 19.375.010 et seq.
[3] 740 Ill. Comp. Stat. 14/5(a), (b), (g).
[4] 740 Ill. Comp. Stat. 14/15(a), (b).
[5] 740 Ill. Comp. Stat. 14/20.
[6] Rosenbach, 2019 IL 123186 at ¶¶ 4-9.
[7] Id. ¶ 14.
[8] Rosenbach v. Six Flags Entm’t Corp., 2017 IL App (2d) 170317, at ¶ 20 (Ill. App. Ct. 2017).
[9] Id. at ¶ 23.
[10] Rosenbach, 2019 IL 123186 at ¶ 33.
[11] Id. ¶ 33.
[12] Id. ¶ 34.
[13] Id. ¶ 37.
[14] 740 Ill. Comp. Stat. 14/10.
[15] Id.
[16] Id.
[17] 740 Ill. Comp. Stat. 14/15(b).
[18] 740 Ill. Comp. Stat. 14/15(c).
[19] 740 Ill. Comp. Stat. 14/15(a).
[20] 740 Ill. Comp. Stat. 14/15(e).
[21] Compare, e.g., Monroy v. Shutterfly, 2017 WL 4099846, at *8 n.5 (N.D. Ill. Sept. 15, 2017) (collection and violation of privacy interest create Article III standing for BIPA claimant), with Santana v. Take-Two Interactive Software, Inc., 717 F. App’x 12, 17 (2d Cir. 2017) (collection of biometrics without adequate notices creates no “risk of real harm” and therefore does not create Article III standing for BIPA claimant), and Rivera v. Google, Inc., No. 16-cv-02714, 2018 WL 6830332, at *6 (N.D. Ill. Dec.
29, 2018) (alleged privacy violation does not create Article III standing for BIPA claimant). [22] S.B. 3053, 2018 Reg. Sess. (Ill. 2018). Gibson Dunn’s lawyers are available to assist with any questions you may have regarding these issues.  For further information, please contact the Gibson Dunn lawyer with whom you usually work, any member of the firm’s Privacy, Cybersecurity and Consumer Protection or Labor and Employment practice groups, or the authors: Jason C. Schwartz – Washington, D.C. (+1 202-955-8242, jschwartz@gibsondunn.com) Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com) Erin Morgan – Washington, D.C. (+1 202-887-3577, emorgan@gibsondunn.com) Please also feel free to contact any of the following practice group leaders and members: Privacy, Cybersecurity and Consumer Protection Group: Alexander H. Southwell – Co-Chair, New York (+1 212-351-3981, asouthwell@gibsondunn.com) M. Sean Royall – Dallas (+1 214-698-3256, sroyall@gibsondunn.com) Debra Wong Yang – Los Angeles (+1 213-229-7472, dwongyang@gibsondunn.com) Ryan T. Bergsieker – Denver (+1 303-298-5774, rbergsieker@gibsondunn.com) Christopher Chorba – Los Angeles (+1 213-229-7396, cchorba@gibsondunn.com) Richard H. Cunningham – Denver (+1 303-298-5752, rhcunningham@gibsondunn.com) Howard S. Hogan – Washington, D.C. (+1 202-887-3640, hhogan@gibsondunn.com) Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com) Kristin A. Linsley – San Francisco (+1 415-393-8395, klinsley@gibsondunn.com) H. Mark Lyon – Palo Alto (+1 650-849-5307, mlyon@gibsondunn.com) Shaalu Mehra – Palo Alto (+1 650-849-5282, smehra@gibsondunn.com) Karl G. Nelson – Dallas (+1 214-698-3203, knelson@gibsondunn.com) Eric D. Vandevelde – Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com) Benjamin B. 
Wagner – Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com) Michael Li-Ming Wong – San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, mwong@gibsondunn.com) Labor and Employment Group: Catherine A. Conway – Co-Chair, Los Angeles (+1 213-229-7822, cconway@gibsondunn.com) Jason C. Schwartz – Co-Chair, Washington, D.C. (+1 202-955-8242, jschwartz@gibsondunn.com) © 2019 Gibson, Dunn & Crutcher LLP Attorney Advertising:  The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.

January 29, 2019 |
International Cybersecurity and Data Privacy Outlook and Review – 2019

Click for PDF As we do every year, in honor of Data Privacy Day—an international effort to raise awareness and promote privacy and data protection best practices—we recently offered Gibson Dunn's seventh annual Cybersecurity and Data Privacy Outlook and Review. In addition to that U.S.-focused report, we again offer this International Outlook and Review. Like recent years before it, 2018 saw significant developments in the evolution of the data protection and cybersecurity landscape in the European Union ("EU"):

Following the 2016 adoption of the General Data Protection Regulation ("GDPR") governing the collection, processing and transfer of personal data,[1] the EU's main privacy body, the European Data Protection Board ("EDPB"), took office. The EDPB and its predecessor, the Article 29 Working Party ("WP29"), issued a number of guidance documents throughout 2018 on the interpretation and application of the GDPR.

Several EU Member States continued to adapt their national legal frameworks, and began applying those laws and the GDPR following the GDPR's date of application on 25 May 2018.

The Council of the EU, which represents the governments and administrations of the EU Member States, pursued its internal discussions regarding the adoption of an EU regulation on private life and the protection of personal data in electronic communications, intended to repeal the currently applicable legal framework (the "ePrivacy Regulation").

EU Member States continued to work on the transposition and application of the EU Directive on the security of network and information systems ("NIS Directive").

Several objections were raised by EU institutions and before EU supervisory authorities and courts regarding different frameworks for international data transfers (e.g., the EU-U.S. Privacy Shield, the European Commission's Standard Contractual Clauses).
In addition to the EU, a number of bills were introduced and passed into law in other jurisdictions around the globe, including other European jurisdictions, the Asia-Pacific region, Canada and Latin America. We cover these topics and many more in this year's International Cybersecurity and Data Privacy Outlook and Review. While we do not attempt to address every development that occurred in 2018, this Review focuses on a number of the most significant developments affecting companies as they navigate the evolving cybersecurity and privacy landscape.

__________________________

Table of Contents

I. European Union
A.   EU GDPR: Its Main Elements, Implementation and Application
1.   GDPR
2.   Principal Elements of the GDPR
3.   Guidance Adopted by the Former WP29 and the Current EDPB
4.   National Data Protection Initiatives Implementing and Applying the GDPR
5.   GDPR cases, investigations and enforcement
a)   Data breaches and investigations
b)   GDPR investigations
B.   International Transfers: Adequacy Declarations and Challenges
1.   Adequacy Declarations
a)   Japan
b)   South Korea
2.   Challenges to Data Transfer Systems
a)   Challenges to Standard Contractual Clauses
b)   Challenges to the EU-U.S. Privacy Shield
C.   EU Cybersecurity Directive
1.   Adoption and Implementation of the EU Cybersecurity Directive
2.   Documents and Guidance Issued by ENISA
D.   Other EU Developments
1.   Reform of the ePrivacy Directive – the Draft EU ePrivacy Regulation
a)   The European Commission's ePrivacy Regulation Proposal
b)   The WP29 Opinion on the European Commission Proposal
c)   The European Parliament's Amended Proposal
d)   The Proposal of the Council of the EU
2.   CJEU Case Law
a)   The Determination of the Applicable Law and the Relevant Data Controller in the Context of Social Networks
b)   Claims Assignment
II. Developments in Other European Jurisdictions: Switzerland, Turkey and Russia
A.   Russia
B.   Switzerland
C.   Turkey
D.   Ukraine
III. Developments in Asia-Pacific
A.   China
B.   Singapore
C.   India
IV. Developments in Canada and in Latin America
A.   Brazil
B.   Canada
C.   Other Jurisdictions: Argentina, Chile, Colombia, Mexico, Panamá and Uruguay

__________________________

I.    European Union

A.    EU GDPR: Its Main Elements, Implementation and Application

1.    GDPR

On 25 May 2018, after a two-year "grace period," the GDPR became the main legislative act for the protection of personal data and privacy in the EU. The GDPR replaces the EU Data Protection Directive[2] and constitutes a set of data protection rules that are directly applicable to the processing of personal data across EU Member States.

2.    Principal Elements of the GDPR

As explained in the 2018 International Outlook and Review, the GDPR brought about a significant change in all aspects of the EU's data protection regime, revamping the substantive provisions regarding data protection law compliance and further developing and integrating its application and enforcement mechanisms. The core substantive elements of the GDPR include the following:

Extraterritorial Scope:  The GDPR applies not only to data controllers established in the EU, but also to organizations that either offer goods or services to individuals located in the EU or monitor their behavior, even if these organizations are not established in the EU and do not process data using servers in the EU.[3] On 23 November 2018, the EDPB published draft Guidelines on the territorial scope of the GDPR, which were subject to public consultation.[4]

Transparency Principle:  Under the GDPR, transparency is a general requirement applicable to three central areas: (i) the provision of information to data subjects; (ii) the way data controllers communicate with data subjects in relation to their rights under the GDPR; and (iii) how data controllers allow and facilitate the exercise of their rights by data subjects.
In April 2018, the WP29 published its Guidelines on transparency, which emphasized the importance of providing data subjects with clear and full information, comprehensible to the average data subject, and made available in layers.[5]

Consent of the Data Subjects:  The GDPR puts emphasis on the notion of consent of data subjects by providing further clarification and specification of the requirements for obtaining and demonstrating valid consent. In April 2018, the WP29 adopted Guidelines specifically dedicated to the concept of consent and focusing on the changes in this respect resulting from the GDPR.[6] In these Guidelines, the WP29 emphasized the importance of consent being given freely, and questioned the relevance of "consent" as a legal basis for data processing where consumers are, in practice, obliged to provide their personal data in order to, for example, engage and receive a service.

Right to Be Forgotten:  The GDPR further develops the "right to be forgotten" (formally known as the "right to erasure"), whereby personal data must be deleted when an individual no longer wants his or her data to be processed by a company and there are no legitimate reasons for retaining it.[7] This right was already introduced in the EU Data Protection Directive, and was the object of litigation before the Court of Justice of the EU ("CJEU") in Google Spain SL and Google Inc. v. AEPD and Mario Costeja González.[8] Among other points, the GDPR clarifies that this right is not absolute and will always be subject to the legitimate interests of the public, including the freedom of expression and historical and scientific research. The GDPR also obliges controllers who have received a request for erasure to inform other controllers of such request in order to achieve the erasure of any links to, or copies of, the personal data involved. This part of the GDPR may impose significant burdens on affected companies, as the creation of selective data destruction procedures often entails significant costs.

Data Breach Notification Obligation:  The GDPR requires data controllers to provide notice of serious security breaches to the competent supervisory authorities, also known as Data Protection Authorities ("DPAs"), without undue delay and, in any event, within 72 hours after becoming aware of any such breach. The WP29 has issued Guidelines explaining the mandatory breach notification and communication requirements of the GDPR, as well as some of the steps data controllers and data processors can take to meet these new obligations.[9]

Profiling Activities:  The GDPR specifically addresses the use of profiling and other automated individual decision-making. In February 2018, the WP29 issued Guidelines clarifying the provisions of the GDPR regarding profiling, in particular by defining in more detail what profiling is.[10]

Data Protection Impact Assessment ("DPIA"):  Where processing activities are deemed likely to result in a high risk to the rights and freedoms of data subjects, the GDPR requires that data controllers carry out, prior to the contemplated processing, an assessment of its impact on the protection of personal data.[11] However, the GDPR does not detail the specific criteria that need to be taken into account to determine whether any given processing activities represent a "high risk". Instead, the GDPR only provides a non-exhaustive list of examples falling within this scope. Similarly, no process for performing DPIAs is detailed in the GDPR. Considering the need for additional information in this respect, the WP29 issued Guidelines in October 2017 intended to clarify which processing operations must be subject to DPIAs and how they should be carried out.
[12] Privacy-Friendly Techniques and Practices:  "Privacy by design" is the idea that a product or service should be conceived from the outset to ensure a certain level of privacy for an individual's data. "Privacy by default" is the idea that a product or service's default settings should help ensure the privacy of individuals' data. The GDPR establishes privacy by design and privacy by default as essential principles. Accordingly, businesses should only process personal data to the extent necessary for their intended purposes and should not store it for longer than is necessary for those purposes. These principles will require data controllers to design data protection safeguards into their products and services from the inception of the product development process.

Data Portability:  The GDPR establishes a right to data portability, which is intended to make it easier for individuals to transfer personal data from one service provider to another. According to the WP29, as a matter of good practice, companies should develop means of answering data portability requests, such as download tools and Application Programming Interfaces. Companies should guarantee that personal data is transmitted in a structured, commonly used and machine-readable format, and they should be encouraged to ensure the interoperability of the data format provided in response to a data portability request. In April 2017, the WP29 issued Guidelines on the right to data portability, providing guidance on how to interpret and implement the right introduced by the GDPR.[13]

Competent Supervisory Authority:  To date, the monitoring of the application of EU data protection rules has fallen almost exclusively on the national DPAs. With the adoption of the GDPR, a complex set of rules has been established to govern the applicability of the rules to data controllers that have cross-border processing practices.

First, where a case relates only to an establishment of a data controller or processor in a single Member State, or substantially affects residents in only one Member State, the DPA of that Member State will have jurisdiction to deal with the case.[14] Second, in other cases concerning cross-border data processing, the DPA of the main establishment of the controller or processor within the EU will have jurisdiction to act as lead DPA for the cross-border processing of this controller or processor.[15] Articles 61 and 62 provide for mutual assistance and joint operations mechanisms, respectively, to ensure compliance with the GDPR. Furthermore, the lead DPA will need to follow the cooperation mechanism provided in Article 60 with other DPAs "concerned". Ultimately, the EDPB (where all EU DPAs and the European Commission are represented) will have decision-making powers in case of disagreement among DPAs as to the outcome of specific investigations.[16] Third, the GDPR establishes an urgency procedure that any DPA can use to adopt provisional, time-limited measures regarding data processing in cases of urgency. These measures will only be applicable in the DPA's own territory, pending a final decision by the EDPB.[17] In 2017, the WP29 issued Guidelines that aim to assist controllers and processors in the identification of their lead DPA.[18]

Governance:  Data controllers and processors may be required to designate a Data Protection Officer ("DPO") in certain circumstances. Small and medium-sized enterprises are exempted from the obligation to appoint a DPO insofar as data processing is not their core business activity. In April 2017, the WP29 issued Guidelines that clarify the conditions for the designation, position and tasks of the DPO to ensure compliance with the GDPR.[19]

These requirements will be supplemented by a much more rigid regime of fines for violations.
DPAs will be able to fine companies that do not comply with EU rules up to EUR 20 million or up to 4% of their global annual turnover, whichever is higher.

3.    Guidance Adopted by the Former WP29 and the Current EDPB

As indicated above, the main EU data protection body under the now-repealed EU Data Protection Directive—the WP29—has been replaced by the current EDPB, which took office on 25 May 2018. Both the WP29, until 25 May, and the EDPB, from 25 May onwards, have subjected to public consultation and adopted Guidelines on the interpretation and application of certain key provisions and aspects of the GDPR. These Guidelines, some of which have been discussed in sub-section I.A.2 above, include the following:[20]

GDPR applicability: EDPB Guidelines 3/2018 on the territorial scope of the GDPR (Article 3) – version for public consultation.
Requirements to obtain valid consent: Guidelines on consent under Regulation 2016/679, WP259 rev.01.
Information and transparency obligations: Guidelines on transparency under Regulation 2016/679, WP260 rev.01.
Automated decision-making and profiling: Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679, WP251 rev.01.
Right to data portability: Guidelines on the right to data portability under Regulation 2016/679, WP242 rev.01.
Data breach notification obligations: Guidelines on Personal data breach notification under Regulation 2016/679, WP250 rev.01.
Data protection impact assessment: Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is "likely to result in a high risk" for the purposes of Regulation 2016/679, WP248 rev.01.
Data Protection Officers: Guidelines on Data Protection Officers ("DPO"), WP243 rev.01.
Derogations from the obligation to maintain records of processing activities: Position Paper on the derogations from the obligation to maintain records of processing activities pursuant to Article 30(5) GDPR.
Certification bodies and criteria: EDPB Guidelines 4/2018 on the accreditation of certification bodies under Article 43 of the General Data Protection Regulation (2016/679); EDPB Guidelines 1/2018 on certification and identifying certification criteria in accordance with Articles 42 and 43 of the Regulation 2016/679 – version for public consultation.
Transfers of personal data outside the EU: EDPB Guidelines 2/2018 on derogations of Article 49 under Regulation 2016/679; Working Document Setting Forth a Co-Operation Procedure for the approval of "Binding Corporate Rules" for controllers and processors under the GDPR, WP 263 rev.01; Recommendation on the Standard Application for Approval of Controller Binding Corporate Rules for the Transfer of Personal Data, WP 264; Recommendation on the Standard Application form for Approval of Processor Binding Corporate Rules for the Transfer of Personal Data, WP 265; Working Document setting up a table with the elements and principles to be found in Binding Corporate Rules, WP 256 rev.01; Working Document setting up a table with the elements and principles to be found in Processor Binding Corporate Rules, WP 257 rev.01; Adequacy Referential, WP 254 rev.01.
Identification of the lead DPA: Guidelines for identifying a controller or processor's lead supervisory authority, WP244 rev.01.
Fines and penalties imposed by DPAs: Guidelines on the application and setting of administrative fines for the purposes of the Regulation 2016/679, WP 253.

4.    National Data Protection Initiatives Implementing and Applying the GDPR

Because the GDPR is a regulation, there is no need for EU Member States to transpose its provisions in order to render them applicable within their national legal systems. However, some Member States have nonetheless adapted their legal frameworks regarding data protection in light of the GDPR. The GDPR contains provisions granting Member States flexibility to implement such adaptations.
For example, Article 8 of the GDPR provides specific rules regarding the processing of the personal data of children below the age of 16; nevertheless, Member States may provide by law for a lower age, provided it is not below 13 years. Article 88 of the GDPR also enables Member States to set out more specific rules to ensure the protection of rights and freedoms in respect of the processing of employees' personal data in the employment context. Below is an overview of the national data protection reforms implemented throughout the EU during 2018:

Austria: Federal Act on the Protection of Individuals with regard to the Processing of Personal Data (the "Data Protection Act" (DSG), BGBl. I No. 165/1999), of 17 August 1999.

Belgium: Law on the creation of the Data Protection Authority, of 3 December 2017 (the "Institutional Law"); Law on the protection of natural persons with regard to the processing of personal data, of 30 July 2018 (the "Substantive Law"); Law on economic matters which introduces collective redress action, of 30 July 2018 (the "Collective Redress Law"); Law on the installation and use of cameras, of 21 March 2018 (the "Camera Law") [modifying the Law of 21 March 2017]; Law on the creation of an Information Security Committee, of 5 September 2018 (the "Information Security Law").

Bulgaria: On 30 April 2018, a draft law was introduced for public consultation, amending and supplementing the Personal Data Protection Act of 4 January 2002. Public consultations ended on 30 May 2018, and the draft law was submitted to the Parliament, where it is subject to further amendments.

Croatia: Act on the Implementation of the General Data Protection Regulation, of 27 April 2018.

Cyprus: Law on the Protection of Physical Persons Against the Processing of Personal Data and Free Movement of such Data, Law 125(I)/2018.

Czech Republic: Draft of the new Data Protection Act, intended to adapt the current national legal framework to the GDPR. The draft is in the legislative process, currently in its second reading in the Chamber of Deputies (the lower chamber of the Czech Parliament), and is expected to replace the current act on data protection.

Denmark: Danish Act on Data Protection, of 17 May 2018.

Estonia: Personal Data Protection Act (the "PDPA"), of 12 December 2018; Personal Data Protection Implementation Act.

Finland: Data Protection Act of Finland, which entered into force on 1 January 2018. Some minor amendments will be made to the Working Life Act (which aims to promote the protection of privacy and other rights safeguarding privacy in working life); a Government Proposal regarding these amendments was submitted in July 2018. The amendments have not yet been passed, but the objective is for the amended act to enter into force as soon as possible.

France: Ordinance No. 2018-1125, of 12 December 2018; Law No. 2018-493 on the protection of personal data, of 20 June 2018; Decree No. 2018-687, of 1 August 2018.

Germany: German Federal Data Protection Act, of 5 July 2017.

Greece: Greece has not yet issued a national law implementing the GDPR. On 5 March 2018, a public consultation on the new law was completed; however, the draft has not yet been submitted to the Greek Parliament.

Hungary: Amendment to Act CXII of 2011 on the Right of Informational Self-Determination and on Freedom of Information.

Ireland: Data Protection Act 2018, of 24 May 2018.

Italy: Law No. 163, of 6 November 2017, adopting specific provisions with respect to the GDPR; Legislative Decree 101/2018, of 10 August 2018.

Latvia: Personal Data Processing Law, of 21 June 2018.

Lithuania: Law on Legal Protection of Personal Data, of 16 July 2018.

Luxembourg: Law on the organization of the National Data Protection Commission ("CNPD"), of 1 August 2018.

Malta: Data Protection Act 2018 (Chapter 586 of the Laws of Malta), of 28 May 2018, and the Regulations issued under it.

Netherlands: Dutch GDPR Implementation Act, of 16 May 2018.

Poland: Personal Data Protection Act, of 24 May 2018.

Portugal: On 26 March 2018, the Portuguese government published a Draft Law for the implementation of the GDPR and associated national derogations. On 3 May 2018, the Draft was submitted to the Portuguese Parliament for discussion and is currently being studied by a special group of the Portuguese Parliament. The applicable law is still Law no. 67/98, of 26 October (as amended by Law 103/2015, of 24 August) on personal data protection.

Romania: Law no. 190/2018 on the measures for the application of the GDPR.

Slovakia: Act No. 18/2018 Coll. on the Protection of Personal Data, which implements the GDPR, was adopted by the Slovak Parliament on 29 November 2017, published in the Collection of Laws on 30 January 2018, and entered into force on 25 May 2018; the Decree of the Office for Personal Data Protection no. 158/2018 Coll. on the Data Protection Impact Assessment Procedure.

Slovenia: The new Slovenian Data Protection Act (the "ZVOP-2") is currently in the legislative pipeline and will repeal the current Data Protection Act (the "ZVOP-1").

Spain: Organic Law 3/2018 on the protection of personal data and guarantee of digital rights, of 5 December 2018.

Sweden: Data Protection Act (2018:218) with its complementary provisions (2018:19), of 19 April 2018.

United Kingdom: Data Protection Act 2018, of 23 May 2018.

5.    GDPR cases, investigations and enforcement

In the course of 2018, EU data protection authorities continued their enforcement action against companies and organizations for violations of their pre-GDPR legal regimes (i.e., under the EU Data Protection Directive).
Furthermore, soon after the GDPR became applicable and Member States adapted their legal frameworks in light of it, investigations regarding data breaches and potential infringements of the GDPR rules began. The most significant cases are set out below.

a)    Data breaches and investigations

In the UK, the Information Commissioner's Office ("ICO") has been particularly active in investigating unauthorized or illegal access to, and loss of, personal data. In early 2017, a number of media reports in The Observer newspaper claimed that a data analytics service had worked for the Leave.EU campaign during the EU referendum, providing data services that supported micro-targeting of voters. In March 2017, the ICO announced that it would begin a review of evidence as to the potential risks arising from the use of data analytics in the political process. Following that review, the ICO announced in May 2017 that a broader formal investigation into the use of data analytics in political campaigns would be launched, in order to ascertain whether there had been any misuse of personal data or breaches of data protection law by the campaigns, on both sides, during the referendum. In addition to the potential links between this data analytics organization and Leave.EU, which gave rise to the investigation, the ICO later identified further lines of enquiry covering 30 organizations. According to an official investigation update, the investigation is considering both regulatory and criminal issues, namely failure to properly comply with the Data Protection Principles, failure to properly comply with the Privacy and Electronic Communications Regulations, and potential offences under the Data Protection Act 1998.
[21] So far, although the investigation is still ongoing, the ICO has issued one of the organizations involved a monetary penalty in the sum of GBP 500,000 for lack of transparency and security issues relating to the collection, processing and storage of data, constituting breaches of the first and seventh data protection principles under the Data Protection Act 1998.[22]

In November 2018, the ICO also announced it was investigating an international hotel management company after a data breach had been brought to its attention. According to public sources, personal data including credit card details, passport numbers and dates of birth of up to 300 million people had been stolen in a cyber-attack on the parent company of the international hotel management company.[25]

In France, the company Optical Center was fined EUR 250,000 by the French National Data Protection Commission ("CNIL") for failing to secure its website. Through the website it was possible to access hundreds of customer invoices containing health data and, in some cases, the social security numbers of the data subjects concerned. This is one of the highest sanctions ever imposed by the CNIL before the GDPR came into force and illustrates the seriousness with which the CNIL is approaching data protection and data breach violations.

In another matter from before the application of the GDPR, in Hungary, the Hungarian regulator imposed a fine of HUF 20 million (approx. EUR 62,000, the maximum fine under the Hungarian act implementing the EU Data Protection Directive) on the Hungarian Church of Scientology for serious breaches of the local Data Protection Act.

b)    GDPR investigations

In addition to the cases mentioned above, GDPR investigations have also proliferated in most Member States, based on facts occurring and being brought to the attention of supervisory authorities after 25 May 2018.
On 25 and 28 May 2018, the CNIL received group complaints from the associations None Of Your Business and La Quadrature du Net. In these complaints, the associations alleged that Google LLC did not have a valid legal basis to process the personal data of the users of its services, particularly for the purposes of customizing and delivering targeted ads. After an investigation period, and on the basis of online inspections conducted, the CNIL found that two types of GDPR breaches had occurred: a breach of transparency and information obligations, and a violation of the obligation to have a legal basis for customizing and delivering targeted ads. On these grounds, the CNIL imposed a financial penalty of EUR 50 million on Google LLC on 21 January 2019.[26] In particular, the CNIL considered that Google users were not able to fully understand the scope of the processing operations carried out by Google LLC and that the purposes of these processing operations were described in an overly generic and vague manner. Similarly, the information communicated was considered not clear enough for users to understand that the legal basis of the processing operations for ad targeting is consent, and not the legitimate interest of the company. Finally, the CNIL noted that information on data retention periods was not provided for some categories of data.[27]

In Ireland, an online news and social networking service is currently being investigated by the Irish privacy authorities over its refusal to give a user information about how it tracks users when they click on links posted on the service. The company refused to disclose the data it recorded when a user clicked on links in other people's posts, claiming that providing this information would take a disproportionate effort.
In December 2018, the Irish Data Protection Commission opened a statutory inquiry into the company's compliance with the relevant provisions of the GDPR, following receipt of a number of breach notifications from the company since the introduction of the GDPR.[28]

B.    International Transfers: Adequacy Declarations and Challenges

1.    Adequacy Declarations

Both under the former EU Data Protection Directive and the current GDPR, transfers of personal data outside of the EU are generally prohibited unless, inter alia, the European Commission formally concludes that the legislation of the country of destination protects the data adequately. Thus far, the European Commission has recognized only the following countries as providing adequate protection to personal data: Andorra, Argentina, Canada (commercial organizations), the Faroe Islands, Guernsey, Israel, the Isle of Man, Jersey, New Zealand, Switzerland, Uruguay and the U.S. (limited to the EU-U.S. Privacy Shield framework).[29] In the course of 2018, adequacy talks proceeded with regard to two major Asian economies: Japan and South Korea.

a)    Japan

Negotiations with the EU on a finding of reciprocal adequacy took place over the course of recent years and concluded on 17 July 2018. Upon the conclusion of these negotiations, the EU and Japan agreed to recognize each other's regimes for the protection of personal data as adequate, thereby enabling safe transfers of personal data between the EU and Japan. This arrangement is meant to complement the EU-Japan Economic Partnership Agreement,[30] enabling European and Japanese companies to benefit from free data flows, as well as from privileged access to approximately 650 million European and Japanese consumers. On 5 September 2018, the European Commission formally launched the procedure for the finding of adequacy of the data protection regime in Japan.
[31] In issuing its draft adequacy decision to cover transfers of personal data to Japan, the European Commission highlighted the following commitments that Japan made to improve the protection of EU personal data:

Japan committed to adopt a set of rules providing individuals in the EU whose personal data are transferred to Japan with additional safeguards that will bridge several differences between the data protection systems of the two jurisdictions. These additional safeguards will strengthen, for example, the protection of sensitive data, the conditions under which EU data can be further transferred from Japan to another third country, and the exercise of individual rights of access and rectification. These rules will be binding on Japanese companies importing data from the EU, and they will be enforceable by the Japanese independent data protection authority and by Japanese courts.

The Japanese government also gave assurances to the EU regarding safeguards concerning access by Japanese public authorities for criminal law enforcement and national security purposes, ensuring that any use of personal data would be limited to what is necessary and proportionate, and subject to independent oversight and effective redress mechanisms.

Japan committed to implement a complaint-handling mechanism to investigate and resolve complaints from Europeans regarding access to their data by Japanese public authorities. This new mechanism will be administered and supervised by the Japanese independent data protection authority.

On 5 December 2018, the EDPB issued its opinion on the draft adequacy decision prepared by the European Commission with regard to Japan.[32] Although the EDPB praised the efforts of the European Commission to reach an understanding with the Japanese government, a number of outstanding points were identified as being crucial for the finding of adequacy of the Japanese data protection regime.
In particular:
–        The EDPB remarked that the system for monitoring the new adequacy architecture, which combines the existing Japanese legal framework with specific Supplementary Rules applicable to EU personal data, will pose certain challenges for ensuring compliance by Japanese entities and enforcement by the Personal Information Protection Commission (“PPC”).
–        The EDPB raised some concerns regarding the possibility of onward transfers of EU data from Japan to third countries that are only subject to a Japanese adequacy decision, but not to an adequacy decision from the European Commission.
–        The EDPB also expressed some concerns in relation to the consent and transparency obligations of data controllers.  As opposed to EU data protection law, the use of consent as a basis for the processing and transfer of personal data has a central role in the Japanese legal system.  Some inconsistencies in the definition of consent under EU and Japanese law, such as the existence of “free” consent or the introduction of the right to withdraw consent, could be interpreted to cast doubt on data subjects’ ability to exercise genuine control over their personal data.
–        The EDPB raised some questions regarding the availability and accessibility of the “helpline” of the Japanese data protection authority for EU data subjects.  Certain important documentation is available only in the Japanese-language version of official websites, if at all, making it difficult for EU data subjects to rely on Japanese data protection regulations.
In addition to the opinion issued by the EDPB, the draft adequacy decision will be subject to the following procedure:
–        Consultation of a committee composed of representatives of the Member States (comitology procedure);
–        Update of the European Parliament Committee on Civil Liberties, Justice and Home Affairs;
–        Adoption of the adequacy decision by the College of Commissioners. 
b)    South Korea Negotiations between the EU and South Korean authorities took place in the course of 2018 with a view to adopting an adequacy decision.  Although the negotiations have remained confidential so far, it has been reported that the main concerns of the EU authorities relate to the independence and powers of the South Korean data protection authority. [33]  While the Personal Information Protection Act of 2011 created a Personal Information Protection Commission, the independence of this body, which lacks enforcement powers, has been questioned.  The South Korean Homeland and Security Ministry is tasked with the enforcement of the Personal Information Protection Act. On 15 November 2018, amendments to the Personal Information Protection Act were submitted to the South Korean National Assembly in order to grant enforcement powers and functions to the Personal Information Protection Commission. 2.    Challenges to Data Transfer Systems a)    Challenges to Standard Contractual Clauses As noted in the 2018 International Outlook and Review, on 3 October 2017, the Irish High Court referred the issue of the validity of the standard contractual clauses decisions to the CJEU for a preliminary ruling. [34]  The proceedings before the CJEU are still ongoing, and a ruling is expected in 2019 or 2020. If the CJEU decides to invalidate the standard contractual clauses, this ruling would, in all likelihood, have a tremendous impact on businesses around the world, many of which rely on these legal guarantees to ensure an adequate level of data protection for data transfers outside the EU. b)    Challenges to the EU-U.S. Privacy Shield On 12 July 2016, the European Commission formally approved the EU-U.S. Privacy Shield, a framework for navigating the transatlantic transfer of data from the EU to the United States.  The Privacy Shield replaced the EU-U.S. Safe Harbor framework, which was invalidated by the CJEU on 6 October 2015 in the case Maximilian Schrems v. 
Data Protection Commissioner. [35]  We provided an in-depth explanation of the Privacy Shield and a discussion of the Schrems decision in the 2018 International Outlook and Review. Since the adoption of the Privacy Shield program in 2016, approximately 4,000 companies have adhered to the Privacy Shield framework, making legally enforceable commitments to comply with the Privacy Shield rules and principles.  However, the success of the Privacy Shield has not sheltered it from challenges directed at it by politicians, DPAs and individuals across Europe. On 16 September 2016, Digital Rights Ireland Ltd., an organization that had previously been successful in obtaining the repeal of other EU legislation concerning personal data, [36] brought an action against the European Commission decision approving the EU-U.S. Privacy Shield.  On 22 November 2017, the CJEU declared the action inadmissible, thereby giving some relief to the companies relying on this framework to transfer personal data to the U.S. Notwithstanding this, on 5 July 2018 the European Parliament adopted a non-binding resolution recommending the suspension of the EU-U.S. Privacy Shield unless certain corrective actions were adopted by the U.S. administration, including: fully aligning the Privacy Shield with the GDPR, and making the Privacy Shield fully compliant with the recommendations issued by the WP29 on 28 November 2017. [37] In October 2018, EU Commissioner Věra Jourová, Secretary of Commerce Wilbur Ross, and members of the respective EU and U.S. administrations and authorities met on the occasion of the second annual review of the Privacy Shield. [38]  During these meetings, the governments of both jurisdictions discussed the nomination and functioning of the Privacy and Civil Liberties Oversight Board and of the Privacy Shield Ombudsman Mechanism, which are important elements to guarantee the application and enforcement of the Privacy Shield. 
Finally, it is notable that although the case pending before the CJEU on the referral from the Irish High Court primarily concerns standard contractual clauses, a number of the questions posed by the Court refer to the adoption of the Privacy Shield and its influence on the overall assessment of standard contractual clauses. C.    EU Cybersecurity Directive 1.    Adoption and Implementation of the EU Cybersecurity Directive In the EU, cybersecurity legislation addressing incidents affecting essential service and digital service providers is primarily covered by the NIS Directive, [39] adopted on 6 July 2016. As explained in the 2018 International Outlook and Review, the NIS Directive is the first set of cybersecurity rules to be adopted at the EU level, adding to an already complex array of laws with which companies must comply when implementing security and breach response processes.  It aims to set a minimum level of cybersecurity standards and to streamline cooperation between EU Member States at a time of increasingly frequent cybersecurity breaches. The NIS Directive is not directly applicable by authorities and courts, and set a deadline for Member States to transpose it into national law by May 2018.  Thus, in the course of the last year, Member States have endeavored to adopt the necessary regulations and empower the appropriate authorities to transpose, apply and enforce the NIS Directive. The final text of the NIS Directive sets out separate cybersecurity obligations for (i) essential service providers and (ii) digital service providers:
–        Essential service providers include actors in the energy, transport, banking and financial markets, as well as health, water and digital infrastructure [40] sectors.
–        Digital service providers include online marketplaces, search engines and cloud services (with an exemption for companies with fewer than 50 employees), but not social networks, app stores or payment service providers. 
The clear aim of the NIS Directive is to harmonize the EU Member State rules applicable to the security levels of network and information systems across the EU.  However, given the strategic character of certain services covered by the NIS Directive, it confers some powers and a margin of discretion on Member States.  For example, the NIS Directive mandates each EU Member State to adopt a national strategy on the security of network and information systems, defining the objectives, policies and measures envisaged with a view to achieving the aims of the NIS Directive. [41]  Thus, despite the ability of Member States to seek the assistance of the European Union Agency for Network and Information Security (“ENISA”), the development of a strategy will remain a national competence.  Furthermore, as far as operators of essential services are concerned, EU Member States will identify the relevant operators subject to the NIS Directive and may impose stricter requirements than those laid down in the NIS Directive (in particular with regard to matters affecting national security). [42] In contrast, Member States should not identify digital service providers (as the NIS Directive applies to all digital service providers within its scope) and, in principle, may not impose any further obligations on such entities. [43]  The European Commission retains powers to adopt implementing rules regarding the application of the security and notification requirements applicable to digital service providers. [44]  It is expected that these rules will be developed in cooperation with ENISA and stakeholders, and will enable a uniform treatment of digital service providers across the EU.  In addition, the competent authorities will only be able to carry out supervisory activities when there is evidence that a digital service provider is not complying with its obligations under the NIS Directive. 
Another tool for coordination among authorities will be the envisaged “Cooperation Group”, similar to the WP29 currently operating under the 1995 Data Protection Directive.  The Cooperation Group will bring together the regulators of all EU Member States, who have different legal cultures and hold different approaches to IT and security matters (e.g., those affecting national security).  It is therefore expected that the European Commission will play an active role in building trust and consensus among the Cooperation Group’s members with a view to providing meaningful and clear guidance to businesses. 2.    Documents and Guidance Issued by ENISA In the course of 2018, ENISA has been particularly active in issuing guidance and evaluating the responsiveness of EU authorities, stakeholders and systems in responding to cyberattacks.  In particular:
–        ENISA has published a number of guidance documents aimed at assisting private parties in their evaluation of security measures adopted in application of EU instruments, such as the NIS Directive [45] and the Open Internet Regulation. [46]
–        Following the trend toward increased use of consumer products and services relying on cloud services and the Internet of Things, ENISA has issued a number of guidance documents providing companies with an overview of the potential risks and redress measures in this context.  These include the “Good practices for Security of Internet of Things in the context of Smart Manufacturing” of November 2018 [47] and the working document “Towards secure convergence of Cloud and IoT” of September 2018. [48]
–        On 6-7 June 2018, ENISA held Cyber Europe 2018, a yearly exercise that simulates an intense, realistic crisis caused by a large number of cybersecurity incidents.  During the exercise, cooperation among the EU Member States was found to have improved at the technical level and to be efficient.  
However, ENISA also noted that the private sector had to prioritize investment in IT security, particularly with regard to essential service operators. [49] D.    Other EU Developments 1.    Reform of the ePrivacy Directive – the Draft EU ePrivacy Regulation As explained in the 2018 International Outlook and Review, 2016 saw the initiation of the procedures for the reform of the EU’s main set of rules on ePrivacy, the ePrivacy Directive.  In this context, further to a public consultation held by the European Commission, the first proposal of the future EU ePrivacy Regulation (the “draft ePrivacy Regulation”) was released on 10 January 2017. [50] In 2017, the draft ePrivacy Regulation was the subject of an opinion of the WP29 (4 April 2017) [51] and an amended version issued by the European Parliament (20 October 2017). [52] Since then, in the course of 2018, internal discussions have been ongoing at the level of the Council of the EU, which have culminated in the issuance of two revised versions of the draft ePrivacy Regulation, dated 10 July and 19 October 2018.  Given this progress, the ePrivacy Regulation is expected to be adopted in 2019. a)    The European Commission’s ePrivacy Regulation Proposal The Commission’s ePrivacy Regulation proposal released in January 2017 sought to adapt the reform of the ePrivacy regime to the feedback received from stakeholders and the WP29.  In summary, the draft ePrivacy Regulation prepared by the European Commission constituted a more comprehensive piece of legislation that aimed to resolve certain open issues identified in the application of the ePrivacy Directive: Regulation versus Directive: The European Commission’s proposal to replace the ePrivacy Directive with a Regulation means that its terms will in principle apply directly across all EU Member States, and will not require transposition at national level (e.g., via the adoption of laws by the parliaments of the different Member States).  
This decision is consistent with the approach adopted with regard to the GDPR.  Although Member States will still be given some freedom to deviate from the ePrivacy Regulation (particularly in the area of national security), the choice to adopt a Regulation will increase the homogeneous application of the ePrivacy Regulation across all EU Member States. Alignment with the GDPR:  A number of provisions in the draft ePrivacy Regulation of the European Commission demonstrated alignment with the GDPR.  For example, like the GDPR, the draft ePrivacy Regulation had a broad territorial scope and applied to the provision of electronic communication services (e.g., voice telephony, SMS services) from outside the EU to residents in the EU. As indicated below, the draft ePrivacy Regulation also aimed to close the gap with the GDPR from an enforcement perspective, by empowering DPAs to monitor the application of the privacy-related provisions of the draft ePrivacy Regulation under the conditions established in the GDPR. From a substantive perspective, the definitions of a number of legal concepts used in both the GDPR and the draft ePrivacy Regulation were also aligned (e.g., the conditions for “consent”, the “appropriate technical and organizational measures to ensure a level of security appropriate to the risks”). Inclusion of OTT Service Providers:  In response to the feedback of stakeholders, the draft ePrivacy Regulation indicated that the new Regulation will apply to providers of services that run over the Internet (referred to as “over-the-top” or “OTT” service providers), such as instant messaging services, video call service providers and other interpersonal communications services. [53] Cookies and Other Connection Data:  Like the ePrivacy Directive, the draft ePrivacy Regulation contained a provision that addressed the circumstances under which the storage and collection of data on users’ devices is lawful.  
These practices may still be based on the prior consent obtained from users.  In the absence of users’ consent, according to the draft ePrivacy Regulation, it would still be possible to carry out these practices provided that: [54] they serve the purpose of carrying out (not facilitating) the transmission of a communication over an electronic communications network; or they are necessary (albeit not strictly necessary) for providing: (i) a service requested by the end user; or (ii) first-party web audience measuring. The recitals of the draft ePrivacy Regulation suggested that the circumstances under which consent would not be required could be interpreted more broadly than under the current ePrivacy Directive. [55] By contrast, the draft ePrivacy Regulation contained a new set of seemingly more stringent rules applicable to the “collection of information emitted by terminal equipment to enable it to connect to another device and/or to network equipment”. Supervisory Authorities and EDPB:  One of the novelties introduced by the draft ePrivacy Regulation was a section devoted to the appointment and powers of national supervisory authorities. [56]  The relevant provisions clarify that the DPAs responsible for monitoring the application of the GDPR shall also be responsible for monitoring the application of the provisions of the draft ePrivacy Regulation related to privacy in electronic communications, and that the rules on competence, cooperation and powers of action of DPAs foreseen in the GDPR also apply to the draft ePrivacy Regulation. b)    The WP29 Opinion on the European Commission Proposal Following the release of the European Commission’s proposal, the WP29 issued its opinion on the proposed draft ePrivacy Regulation in April 2017. 
[57] While the WP29 welcomed the proposal and the choice of a regulation as the regulatory instrument, it highlighted four points of “grave concern” that would “lower the level of protection enjoyed under the GDPR” if adopted, and made recommendations in this respect concerning:
–        The rules concerning the tracking of the location of terminal equipment, for instance WiFi tracking, which are inconsistent with the rules of the GDPR.  The WP29 advised the European Commission to “promote a technical standard for mobile devices to automatically signal an objection against such tracking”.
–        The conditions under which content and metadata can be analyzed, which should be limited:  consent of all end-users (senders and recipients) should be the principle, with limited exceptions for “purely personal purposes”.
–        Barriers used by some websites to completely block access to the service unless visitors agree to third-party tracking, known as “tracking walls,” which should be explicitly prohibited to give individuals the choice to refuse such tracking while still being able to access the website.
–        Terminal equipment and software, which should offer “privacy protective settings” by default, in addition to allowing the user to adjust these settings.
The WP29 indicated that it expected its concerns to be addressed during the ongoing legislative process. c)    The European Parliament’s Amended Proposal In October 2017, the European Parliament proposed an amended version of the European Commission’s proposed draft ePrivacy Regulation, [58] which introduced more stringent rules on the use of personal data and on the respect of users’ privacy.  Some of the notable changes include:
–        The prohibition on blocking access to a service solely because the user has refused the processing of personal data that is not necessary for the functioning of the service. 
–        The requirement for providers of electronic communications services to ensure the confidentiality of the data, for instance through end-to-end encryption and the prohibition of backdoors.
–        The requirement for browsers to block third-party cookies by default until the user has adjusted his or her cookie settings.
–        The prohibition of “cookie walls” and cookie banners that prevent the use of the service unless users agree to all cookies.
d)    The Proposal of the Council of the EU In addition to the Parliament’s version of the draft ePrivacy Regulation, the Council of the EU has also published a number of working proposals and amendments.  The two latest documents related to the draft ePrivacy Regulation were published on 10 July and 19 October 2018, and they introduced some important changes to the proposals of the European Commission and of the European Parliament. On 10 July 2018, the EU Council published some revisions to the draft ePrivacy Regulation, which focused primarily on the following key points: [59]
–        The draft introduced the possibility for “further compatible processing of electronic communications metadata”.  This amendment suggests a broadening of the scope of permissible processing for research purposes, which would enable private parties to pursue research and innovation.  The Council of the EU also called for the draft ePrivacy Regulation to be “more future-proof”, providing flexibility to accommodate developments in a rapidly changing digital environment.
–        Other amendments made by the EU Council sought to clarify the lawfulness of processing operations carried out in the course of operators’ daily business.  For example, new language introduced in Article 6(2)(b) clarified that the processing of metadata for the purposes of calculating and billing interconnection payments is permitted.
–        The EU Council also sought to clarify the rules applicable to the storage and processing of data on end-users’ equipment.  
Pursuant to the Council’s revisions, the responsibility for obtaining consent for the storage of a cookie or similar identifier lies with the entity that collects information from end-users’ terminal equipment, such as an information society service provider or an ad network provider.  However, these entities may request another party to obtain consent on their behalf.  The Council’s amendments also clarify that the end-user’s consent to the storage of a cookie or similar identifier may also entail consent for subsequent readings of the cookie in the context of a revisit to the same website domain initially visited by the end-user. The EU Council also suggested the deletion of the entire Article 10 of the draft ePrivacy Regulation, and the respective recitals, which obliged software providers to inform the end user whenever privacy settings are updated. On 19 October 2018, the EU Council issued a new revised version of the draft ePrivacy Regulation, which included further edits and amendments in addition to those published in July. [60] One of the most significant changes introduced to the draft ePrivacy Regulation is the recognition of the ability of information society services to use tracking technologies on the computers of individuals, without consent, for websites that partly or wholly finance themselves through advertisement, provided that the information obligations have been complied with and that the user “has accepted this use” of the data (as opposed to requiring full-blown consent). The EU Council also added to the draft ePrivacy Regulation a new Article 6(1)(c), which allows the processing of electronic communications data when necessary to ensure the security and protection of terminal equipment.  
This and other similar changes introduced by the Council aim at achieving a certain coherence between these provisions and the security obligations to which information society services are subject, enabling the latter to use security tools that require the processing of data contained in the terminal equipment without obtaining prior consent. 2.    CJEU Case Law 2018 also witnessed important cases before the CJEU on the application of the EU Data Protection Directive, the GDPR and the ePrivacy Directive. a)    The Determination of the Applicable Law and the Relevant Data Controller in the Context of Social Networks On 5 June 2018, the CJEU delivered a ruling in Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v. Wirtschaftsakademie Schleswig-Holstein GmbH, which clarified the definition of data controller, the determination of the applicability of national data protection legislation, and the powers of DPAs in cases concerning controllers established in multiple Member States. [61] First, the CJEU indicated that administrators of webpages hosted by third parties (e.g., fan pages hosted by social networks) that knowingly make use of the services (e.g., audience statistics) may be considered (co)controllers of the data processed in the context of visitors’ traffic to the webpage.  In doing so, the CJEU recognized the joint responsibility of the operator of the third-party website (e.g., the social network) and the administrator of the webpage (e.g., a fan page) in relation to the processing of the personal data of visitors to that page, which is deemed to contribute to ensuring more complete protection of the rights of persons visiting a fan page. 
Second, the CJEU found that, while an establishment of a controller focused on the sale of advertising space and other marketing activities may be subject to the laws and the powers of the DPA of the Member State where it is established, such laws and powers may not extend to an establishment of the same controller located in another Member State. The judgment in Wirtschaftsakademie was followed by an Opinion of the EU’s Advocate General Michal Bobek in Fashion ID GmbH & Co. KG v. Verbraucherzentrale NRW e.V., which also addressed the question of determining who is the data controller in the context of the use of tools to collect and transmit cookie data (e.g., social plug-ins).  The Advocate General found that an entity or organization which has embedded a third-party plug-in in its website, causing the collection and transmission of the user’s personal data, must be considered a controller, even if it is unable to influence the data processing operation resulting from the functioning of the plug-in. However, the Advocate General observed that a controller’s joint responsibility should be limited to those operations for which it effectively co-determines the means and purposes of the processing of the personal data. The Advocate General proceeded to note that, where the processing of cookie data resulting from the use of plug-ins is based on the legitimate interests of controllers or third parties, the legitimate interests of both the website operator and the plug-in provider, as joint controllers, should be taken into account, and an assessment should be made balancing those interests against the rights of the data subjects. 
Finally, the Advocate General concluded that the consent of the data subject has to be given to the website operator which has embedded the third-party content, that the EU Data Protection Directive must be interpreted as meaning that the obligation to inform also applies to that website operator, and that both consent and information must be given before the data are collected and transferred. However, he noted that the extent of those obligations shall correspond to that operator’s joint responsibility for the collection and transmission of the personal data. b)    Claims Assignment As indicated in the 2018 International Outlook and Review, Mr. Schrems started legal proceedings against Facebook Ireland Limited before a court in Austria, which raised the question of whether jurisdiction was established in the domicile of a consumer claimant who had been assigned claims by other consumers, thus opening up the possibility of collecting consumer claims from around the world. On 14 November 2017, Advocate General Bobek delivered his opinion in the Maximilian Schrems v. Facebook Ireland Limited case pending before the CJEU. [62]  Advocate General Bobek opined that a consumer cannot invoke, at the same time as his own claims, claims on the same subject assigned by other consumers domiciled in other places in the same Member State, in other Member States, or in non-Member States. On 25 January 2018, the CJEU concurred with the Advocate General’s opinion, finding that a consumer may assert his own claims in the courts of the place where he is domiciled, but not claims assigned by other consumers domiciled in the same Member State, in other Member States or in non-Member State countries. II.    Developments in Other European Jurisdictions: Switzerland, Turkey and Russia The increasing impact of digital services in Europe, as well as the overhaul brought about by the GDPR in the EU, have led certain jurisdictions in the vicinity of the EU to improve their data protection regulations. A.   
 Russia Local data privacy laws have been heavily enforced, reflecting the activity of the Russian Data Protection Authority in monitoring and enforcing data protection compliance. As of 1 July 2017, the administrative sanctions in Russia for certain privacy violations have been significantly increased. For example, data processing operations in excess of the consent provided by a data subject may result in a fine of RUB 75,000 (approx. USD 1,200; approx. EUR 1,000).  Criminal prosecution and prison sanctions are also possible for certain types of privacy violations.  Another type of enforcement action under Russian law is the blocking of online resources. Thus, if the processing of personal data on a website or in an app violates data protection laws, access to that website or app may be restricted for Russian users pursuant to a court decision. The most well-known and widely debated blocking related to LinkedIn, which has been blocked since 2016 and remains unavailable to Russian users. This is not the only example: other websites, with smaller user bases, have been blocked in recent years. The Russian Data Protection Authority has been targeting large digital multinationals in the last few years.  For example, in 2017, Telegram was fined RUB 800,000 (approx. USD 14,000; approx. EUR 10,500) by Russian courts for failing to provide the Russian Federal Security Service with the decoding keys for access to personal data, as required by the Russian Data Protection Act.  In doing so, the Russian courts disregarded Telegram’s arguments based on its lack of control over the encoding and decoding processes of its instant messaging service.  On 22 October 2018, Russian courts rejected Telegram’s appeal against the fine. 
The Telegram case shows that, if the technology used by a service provider (insofar as the services relate to communications on the Internet) does not allow state authorities to access unencrypted information, this may be deemed a breach of Russian data protection and cybersecurity laws. B.    Switzerland To prepare for the entry into force of the GDPR, the Swiss government has issued a draft of a new Data Protection Act (the “Draft FDPA”) [63] that aims to:
–        Modernize Swiss data protection law and, to a certain extent, align it with the requirements of the GDPR; and
–        Maintain Switzerland’s adequacy status granted by the European Commission, to ensure the free flow of personal data between the EU and Switzerland.
The Draft FDPA was published by the Swiss Federal Council on 15 September 2017.  The Draft FDPA, which will replace the Federal Act on Data Protection of 19 June 1992 (the “FADP”), has the following characteristics: The concept of “sensitive” or “special categories” of personal data under the Draft FDPA covers a wide range of categories of data, including personal data in the “intimate sphere” (e.g., fears, dreams, therapies), biometric data which clearly identifies an individual (e.g., pictures), data on administrative or criminal proceedings and sanctions, and data on social security measures. [64] The Draft FDPA contains a list of basic principles for the processing of personal data which are broadly equivalent to those contained in the GDPR. [65]  By contrast, unlike under the GDPR, the processing of personal data will not require any legal basis under the Draft FDPA (such as consent), unless such processing leads to an unlawful violation of privacy (i.e., the processing of personal data does not comply with the basic data processing principles). Similarly to the GDPR, under the Draft FDPA data subjects have the right to request access, rectification or erasure of their personal data and not to be subject to automated decision-making. 
[66]  However, in contrast to the GDPR, the Draft FDPA does not provide for a right to data portability. The Draft FDPA contains a duty for companies to carry out a DPIA in specific situations, which closely mimic the scenarios envisaged by the GDPR. [67]  The Draft FDPA also contains an obligation of privacy by design and by default broadly equivalent to that of the GDPR, [68] which compels companies and organizations to set up technical and organizational measures in order for the data processing to meet the data protection rules.  However, the Draft FDPA does not foresee any sanctions or penalties for a violation of these obligations (as opposed to the GDPR). The Draft FDPA includes a general obligation for companies to report to the Federal Data Protection and Information Commissioner (“FDPIC”), as soon as possible, data breaches which are likely to result in a high risk to the privacy or the fundamental rights of data subjects. [69]  A notification of the data breach to data subjects may also be required if it is necessary for the protection of data subjects or if such notification is ordered by the FDPIC.  The Draft FDPA does not foresee criminal sanctions for a violation of the obligation to notify data breaches, unless notification of data subjects is to be made based on an order from the FDPIC. The refusal to comply with the FDPIC’s order may be criminally sanctioned with a fine of up to CHF 250,000. [70] Under the Draft FDPA, it will no longer be the FDPIC who provides guidance on the adequacy level of third countries. The Draft FDPA delegates the qualification of adequacy to the Federal Council, which will determine the countries providing an adequate level of data protection.  One may expect, however, that the Federal Council will closely follow the adoption of adequacy decisions by the European Commission. 
With regard to the authorities’ investigations and fines, the FDPIC has the right to investigate on its own initiative or upon request; it may take investigation measures and is entitled to issue certain administrative measures.  These investigation proceedings are governed by administrative procedural law, and are subject to review by the Federal Administrative Court.  However, the FDPIC does not have the power to impose any fines or penalties.  Instead, data protection violations lead to personal criminal liability of individuals, subject to fines of up to CHF 250,000 that will be imposed by the ordinary courts in Switzerland.

Until the Draft FDPA is finally enacted, the current FADP of 19 June 1992 remains applicable.  The Swiss Federal Council initially aimed to enact the Draft FDPA in August 2018.  However, in January 2018, the relevant parliamentary commission required that the Draft FDPA be split into two parts to allow more time for deliberation.

For companies that anticipate being affected by both the Draft FDPA and the GDPR, it may be advisable to adjust all of their processing of personal data to the standards provided under the GDPR.  If the implementation and application of the Draft FDPA leads to certain obligations being leaner than those contained in the GDPR, these adjustments may be made in the course of the data processing activities (e.g., not granting the exercise of certain rights where those rights are not covered by the Draft FDPA, provided that the GDPR does not apply).  To the extent that the Draft FDPA goes beyond the GDPR, the additional requirements should be implemented for any processing subject to the current FADP or, once enacted, the Draft FDPA.

C.    Turkey

Throughout 2018, the Turkish data protection authority (the “KVKK”) has issued a number of regulations and guidance documents addressing various issues related to the application and enforcement of the Turkish Data Protection Act No. 6698 of 2016.
These regulations and guidance documents include the following:

Processing of sensitive personal data: On 7 March 2018, the KVKK published a decision regarding the processing of special categories of personal data.  Pursuant to this decision, data controllers must adopt a separate policy and procedure for the protection of special categories of personal data.  The decision further determined special conditions and requirements applicable to the mediums where such data are stored, the persons who have access to such data, and the transfer of such data.

Transparency and information obligations: On 10 March 2018, the KVKK published the Communique on Procedures and Principles regarding the Obligation of Data Controllers to Inform, which lays out the content and methodology that shall be followed by entities and organizations to provide information to data subjects, for example within the scope of their privacy notices.

Security measures: On 19 January 2018, the KVKK published a guidance document on the security of personal data in order to assist entities and organizations in their compliance with data protection and security obligations, specifically focusing on technical and administrative measures.  The KVKK provided further detailed guidance on the matter in its decision of 31 January 2018 (2018/10).

Registration of data controllers: Pursuant to the KVKK Regulation on the Data Controller Registry, published on 30 December 2017, data controllers not exempted from registration by the KVKK must include their details in the KVKK Registry before proceeding to process personal data.  Controllers may register online by uploading the required information to the KVKK Registry system.  The KVKK also set out the grace periods for different entities in its decision of 19 July 2018 (2018/88).

Registration of e-marketing approvals and rejections: In 2018, Turkey adopted Law No.
7061 Amending Certain Tax Laws and Other Laws, which empowers the Ministry of Customs and Commerce to put in place a system to record the approvals and rejections received by companies for the purposes of e-marketing.  This measure was later followed by a decision adopted by the KVKK, mandating all entities and organizations to cease their marketing operations unless covered by one of the exceptions provided for by the Turkish Data Protection Act or based on consent.

Data subject requests: On 10 March 2018, the KVKK also published the Communique on Procedures and Principles of Applications to Data Controllers, which lays out the procedure for data subjects to exercise their rights against data controllers and data controllers’ obligations with regard to such requests.

D.    Ukraine

In Ukraine, on 23 October 2018, the Parliamentary Commissioner for Human Rights issued a draft law aiming to align the Law on Personal Data with the GDPR.  The draft law was further updated on 30 October 2018, and is subject to additional revisions until it is finally filed by the Cabinet of Ministers with the Ukrainian Parliament.  As it currently stands, the draft law contains the following main amendments:

The draft law sets out the legal bases upon which an entity may process personal data, including consent, the performance of a contract to which the data subject is a party, and the fulfilment of a legal obligation.

The draft law borrows from the GDPR a number of principles and definitions, including the concepts of personal data, data processing, profiling and pseudonymisation.

Like the GDPR, the draft law also regulates aspects such as the rights of data subjects, the appointment of DPOs, the notification of data breaches and the transfer of personal data to third countries and organizations.

In addition to the draft data protection law, on 9 May 2018, the Law on Basic Principles of Ukraine’s Cyber Security came into force.
The Cyber Security Law mainly applies to “critical infrastructure”, and lays down the regulatory framework for a number of measures to be adopted in implementation of the Law.

III.    Developments in Asia-Pacific

In an increasingly connected world, 2018 also saw many other countries try to get ahead of the challenges within the cybersecurity and data protection landscape.  Several international developments bear brief mention here:

A.    China

As noted in the 2018 International Outlook and Review, China’s Cybersecurity Law was adopted on 1 June 2017, becoming the first comprehensive Chinese law to regulate the management and protection of digital information by companies.  The law also imposes significant restrictions on the transfer of certain data outside of the mainland (data localization), enabling government access to such data before it is exported.[71]

Despite protests and petitions by governments and multinational companies, the implementation of the Cybersecurity Law continues to progress with the aim of regulating the behavior of many companies in protecting digital information.[72]  While the stated objective is to protect personal information and individual privacy and, according to a government statement in China Daily, a state media outlet, to “effectively safeguard national cyberspace sovereignty and security,” the law in effect gives the Chinese government unprecedented access to network data for essentially all companies in the business of information technology.[73]  Notably, key components of the law disproportionately affect multinationals because the data localization requirement obligates international companies to store data domestically and to undergo a security assessment by supervisory authorities for important data that needs to be exported out of China.
Though the law imposes more stringent rules on critical information infrastructure operators (whose information could compromise national security or public welfare) than on network operators (a category that could include virtually all businesses using modern technology), the law effectively subjects a majority of companies to government oversight.  As a consequence, the reality for many foreign companies is that these requirements are likely to be onerous, increase the costs of doing business in China, and heighten the risk of exposure to industrial espionage.[74]  Despite the release of additional draft guidelines meant to clarify certain provisions of the law, the general outlook is that the law is still a work in progress, with its scope and definitions still vague and uncertain.[75]  Nonetheless, companies should endeavor to assess their data and information management operations to evaluate the risks of the expanding scope of the data protection law, as well as their risk appetite for compliance with the Chinese government’s access to their network data.

More recently, on 10 September 2018, the National People’s Congress of China announced, as part of its legislative agenda, that its Standing Committee would consider draft laws with relatively mature conditions, including a draft personal information protection law and a draft data security law.[76]

B.    Singapore

As indicated in the 2018 International Outlook and Review, the Personal Data Protection Commission of Singapore issued, on 7 November 2017, proposed advisory guidelines for the collection and use of national registration identification numbers.  The guidance, which covers a great deal of personal and biometric data, emphasized the obligations of companies to ensure policies and practices are in place to meet the obligations for data protection under the Personal Data Protection Act of 2012.
The Commission gives businesses and organizations 12 months from the date of publication to review their processes and implement necessary changes to ensure compliance.[77]

C.    India

As noted in the 2018 International Outlook and Review, India issued a white paper in 2017 with the aim of drafting a data protection bill to “ensure growth of the digital economy while keeping personal data of citizens secure and protected”.[78]  Further to the publication of this white paper, the Ministry of Electronics and Information Technology published, on 27 July 2018, the Personal Data Protection Bill (the “Bill”) and the Data Protection Committee Report (the “Report”).[79]

The Bill comprises 15 chapters and addresses data protection obligations, including grounds for processing personal data and sensitive personal data, the personal and sensitive data of children, data principal rights, transparency, accountability measures and the transfer of personal data outside India.  In particular, according to its Article 1, the Bill shall apply to the processing of personal data where such data has been collected, disclosed, shared or otherwise processed within the territory of India, and to the processing of personal data by the State, any Indian company, any Indian citizen or any person or body of persons incorporated or created under Indian law.  Notwithstanding the above, the Bill also applies to the processing of personal data by data fiduciaries or data processors not present in the territory of India, if they carry out processing of personal data in connection with (i) any business carried on in India, (ii) any systematic activity of offering goods or services to data principals within the territory of India, or (iii) any activity which involves profiling of data principals within the territory of India.  Moreover, the Bill provides that a data protection authority would be established and penalties would be imposed for violations of its obligations.
In particular, Article 69(1) of the Bill establishes penalties that may extend up to five crore rupees (i.e., approx. USD 700,000; approx. EUR 620,000) or 2% of the data fiduciary’s total worldwide turnover in the preceding financial year, whichever is higher, if the data fiduciary contravenes its obligations to take prompt and appropriate action in response to a data security breach, undertake a DPIA, conduct a data audit or appoint a DPO, or if it fails to register with the relevant authority.  Where the data fiduciary contravenes any of its obligations regarding the processing of personal and/or sensitive data, the need to adhere to security safeguards or the applicable provisions on the transfer of personal data outside India, the Bill establishes a penalty that may extend up to 15 crore rupees or 4% of the data fiduciary’s total worldwide turnover in the preceding financial year, whichever is higher.  In addition, the Report addresses, among other things, existing approaches to data protection, key definitions of the Bill and recommendations received from the white paper consultation.

IV.    Developments in Canada and in Latin America

The overhaul of data protection rules in important jurisdictions around the globe has also impacted Canada and Latin America, where some local administrations have bolstered their respective legislation and undertaken initiatives to bring their frameworks closer to that of the EU.

A.    Brazil

In Brazil, a new General Data Protection Law was adopted on 14 August 2018 after several years of discussions among decision-makers.
[80]  Although the Brazilian Law is more lenient than the GDPR and contains fewer explanations regarding the interpretation and application of its provisions, a number of commonalities can be found between the Law and the GDPR, including the following:

Like the GDPR, the Brazilian General Data Protection Law generally excludes anonymized data from its scope of application, except where the anonymization process used can be reversed using solely the entity’s own resources, or where it can be reversed by applying reasonable efforts.  For this purpose, it is understood that anonymized data is data that cannot be attributed to an identifiable person using reasonable means.[81]

In setting out the obligations of entities processing personal data, the Brazilian General Data Protection Law also considers the conditions under which such processing is taking place.  For example, while (as indicated above) anonymized data may generally be considered to be excluded from the scope of application of the Law, it contains a specific provision whereby anonymized data may fall within the scope of the Law if it is used to evaluate certain aspects of a natural person (e.g., the behavioral profile of a person, if he/she is identifiable).[82]

The Brazilian Law is also based on the basic principle that data processing operations are forbidden unless they rely on one of its previously established legal bases.  The Law contains 10 legal bases, which include five legal bases drawn from the GDPR, plus five additional bases:[83] data processing for the exercise of rights in legal proceedings; data processing for research by study entities (provided that, whenever possible, the data is anonymized); data processing for the protection of an individual’s health; data processing for the protection of credit; and data processing and sharing by the public administration as required for public policy enforcement under law or contract.
In Brazil, consent is also defined as a freely given, informed and unambiguous indication of the data subject’s agreement to the processing of his or her personal data.  Furthermore, the Law focuses on empowering data subjects with meaningful control and choice regarding their personal data.[84]

As regards the rights of data subjects, the Brazilian Law also includes a general right to data portability, which was first envisaged by the GDPR.[85]  This right obliges controllers to transfer data subjects’ personal data to another controller at the data subjects’ request.

The Law also contains a general obligation to report incidents regarding the processing of personal data to the national authority and to the data subjects, within a reasonable timeframe.  The notification shall include information such as a description of the personal data affected and the data subjects and entities involved, a description of the technical and security measures used for the protection of personal data, the reasons for the delay in the case of late notifications, and a description of the measures adopted to mitigate or redress the effects of the incident.[86]

The Brazilian General Data Protection Law contains a general obligation to appoint a DPO, applicable to data controllers only.[87]  However, the Brazilian data protection authority may issue further guidance qualifying the situations where such obligation may no longer apply.

Finally, like the GDPR, the Law prescribes the obligation to carry out a “Report on the Impact on Personal Data Protection” in certain situations where a data processing operation may pose risks to civil liberties and fundamental rights.  Like the GDPR, the Brazilian General Data Protection Law provides that personal data can be transferred to third countries that ensure an adequate level of protection or based on appropriate safeguards.
The safeguards under both laws are basically the same, except for legally binding instruments between public authorities/bodies, which constitute a safeguard under the GDPR but, under the Brazilian Law, are limited to purposes of international legal cooperation among intelligence, investigation and prosecution bodies (at least until the Brazilian data protection authority regulates the international transfer mechanisms).

Fines under the Brazilian General Data Protection Law are capped at 2% of the turnover in Brazil in the preceding year or BRL 50 million (approximately USD 13 million), whichever is lower.  These caps are applied to fines imposed per unlawful conduct.

The Brazilian data protection authority was created on 28 December 2018, through Executive Order (MP) 869/2018.  It will be composed of five commissioners, to be appointed by the President of the Republic, and advised by a National Council for the Protection of Personal Data and Privacy, composed of 23 unpaid members: 11 members from different spheres of government and 12 members divided among four from the private sector, four from academia and four from civil society.  The Executive Order also postpones the entry into force of the Brazilian General Data Protection Law to August 2020.

B.    Canada

As noted in the 2018 International Outlook and Review, Canada opened for comment a proposed regulation in 2017 that would mandate the reporting of privacy breaches under its Personal Information Protection and Electronic Documents Act of 2015 (“PIPEDA”).  On 1 November 2018, some amendments to PIPEDA came into force.
[88]  The law now establishes that, where an organization subject to PIPEDA experiences a data breach that gives rise to a “risk of significant harm”, it will be required to: (i) report the incident to the Office of the Privacy Commissioner of Canada; (ii) notify any affected individuals; and (iii) alert any other third parties that are in a position to reduce the risk of harm to affected individuals.

C.    Other Jurisdictions: Argentina, Chile, Colombia, Mexico, Panamá and Uruguay

Finally, as explained in the 2018 International Outlook and Review, Argentina forged ahead with an overhaul of the country’s data protection regime by publishing in 2017 a draft data protection bill that would align the country’s privacy laws with the GDPR requirements.[89]  More recently, the Argentinian data protection authority announced, on 20 September 2018, that the President of the Argentine Republic, Mauricio Macri, had sent a draft data protection bill to the National Congress of Argentina for consideration, seeking to reform the current law on the protection of personal data.  The message attached to the bill indicates that its objective is to modernize the law in light of new technologies, and also makes reference to the GDPR; the bill includes provisions on data breach notification, privacy by design and by default, the processing of data by third parties, DPIAs and the appointment of a DPO.[90]

In Chile, on 31 August 2018, the Superintendence of Banks and Financial Institutions announced that it had issued a series of modifications to Chapters 20-8 and 1-13 of the Updated Compilation of Standards relating to cybersecurity, including updates to the rules on the reporting of operational incidents.
In particular, the modifications to Chapter 20-8 seek to improve the system for the reporting of security incidents by creating a digital platform, requiring, as of 1 October 2018, incidents to be reported within 30 minutes of their occurrence, and requiring entities to include specific information when reporting an incident.  In addition, a number of obligations were also introduced, namely requirements to appoint a person, at the executive level, to communicate with the Superintendence of Banks and Financial Institutions (known as “SBIF”, its acronym in Spanish); to inform users and clients of incidents that affect the quality and continuity of services or the security of their personal data, or that are of public knowledge; and to maintain a cybersecurity incident alert system to facilitate data sharing on incidents, in order to allow other entities to adopt any necessary measures.  In relation to Chapter 1-13, the modifications establish cybersecurity as a special criterion in the SBIF’s evaluation of a bank’s management, and provide for a requirement to report on cybersecurity management at least once a year.  In addition, the SBIF will also evaluate whether an entity maintains a cybersecurity incident database.[91]  Moreover, on 25 October 2018, the Chilean Transparency Council announced that the President of Chile had changed the status of the draft data protection bill currently being considered by the National Congress of Chile to urgent status.[92]

In Colombia, the Financial Superintendence of Colombia issued, on 5 June 2018, two circulars introducing requirements on cybersecurity risk management for covered entities, as well as security standards applicable to online payment platforms, in order to enhance the protection of consumers’ personal financial information.
In particular, the requirements include notifying consumers of cybersecurity incidents that affect the confidentiality or integrity of their information, as well as of the measures adopted in response to such incidents.  With the publication of these circulars, entities will also be required to establish a unit in charge of cybersecurity risk management and a strategy concerning the sending of reports to supervisory authorities.  In relation to online payment platforms, the security standards introduced are expected to enable the platforms, which are not regulated by the Financial Superintendence of Colombia, to offer their services to financial entities, such as banks and payment networks, under the supervision of this authority.[93]  Additionally, a legislative proposal seeking to modify and supplement Statutory Law No. 1266 of 2008, concerning habeas data and financial information, was presented to the Senate of the Republic of Colombia on 26 July 2018.[94]

In Mexico, the National Institute of Access to Information and Data Protection has been particularly active in 2018, issuing guidance papers on several data protection topics.  In March, the National Institute issued recommendations on the processing of the Mexican voting card (a widely used ID) by companies and public entities subject to the provisions of the Federal Law on the Protection of Personal Data Held by Private Parties 2010 and the General Law on the Protection of Personal Data Held by Public Entities 2017.[95]  In May, the National Institute issued guidance on biometric data, providing recommendations on how to process biometric data in compliance with the principles and obligations under the Federal Law on the Protection of Personal Data Held by Private Parties 2010 and the General Law on the Protection of Personal Data Held by Public Entities 2017, and clarifying when biometric data should be considered personal data.
[96]  In June, the National Institute issued guidance on how to manage data security incidents, in order to assist companies, organizations and public entities in complying with the corresponding data protection law.[97]  In August, the National Institute issued guidance for the implementation of a “Data Protection Program” by those entities subject to the General Law on the Protection of Personal Data Held by Public Entities 2017.[98]  Finally, in November, the National Institute issued guidance outlining the minimum criteria suggested for the contracting of cloud computing services that involve the processing of personal data.  The guide covers provider reputation and identity, minimum criteria to be considered by the customer to ensure that the provider has implemented security measures and has conducted a risk assessment for personal data, the provider’s return and destruction of personal data at the end of the service, and the conditions and practices of the provider regarding interoperability and portability.  The guidance also includes checklists for companies and individuals subject to the Federal Law on the Protection of Personal Data Held by Private Parties 2010 to help them ensure compliance and analyze the risks they assume when contracting cloud computing products and services.[99]  Moreover, on 26 June 2018, Mexico acceded to the Convention for the Protection of Individuals with Regard to Automatic Processing of Personal Data (known as “Convention 108”) and its additional protocol.

In Uruguay, a bill on accountability and budget, containing provisions relating to data protection, is currently being analyzed by the Parliament of Uruguay.[100]  Additionally, the data protection authority issued, on 29 October 2018, data protection guides on cookies, profiling, bring your own device and drones, providing recommendations on their use in order to raise awareness of data protection issues that may arise from the use of these technologies.
[101]   [1]   See Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC, OJ L 119 4.5.2016, p. 1.   [2]   See Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, OJ L 281, 23.11.1995, pp. 31-50.   [3]   See GDPR, at Article 3.   [4]   See EDPB, Guidelines 3/2018 on the territorial scope of the GDPR (Article 3) – Version for public consultation (16 November 2018), available at https://edpb.europa.eu/sites/edpb/files/files/file1/edpb_guidelines_3_2018_territorial_scope_en.pdf.   [5]   See WP29, Guidelines on Transparency under Regulation 2016/679 (WP260 rev.01, 11 April 2018), available at https://ec.europa.eu/newsroom/article29/document.cfm?action=display&doc_id=51025.   [6]   See WP29, Guidelines on Consent under Regulation 2016/679 (WP259 rev.01; 10 April 2018), available at https://ec.europa.eu/newsroom/article29/document.cfm?action=display&doc_id=51030.   [7]   See GDPR, at Article 17.   [8]   See EU Data Protection Directive, at Articles 12 and 14; and Case C-131/12 Google Spain SL and Google Inc. v. AEPD and Mario Costeja González ECLI:EU:C:2014:317.   [9]   See WP29, Guidelines on Personal Data Breach Notification under Regulation 2016/679 (WP250 rev.01; 6 February 2018), available at https://ec.europa.eu/newsroom/article29/document.cfm?action=display&doc_id=49827.   [10]   See WP29, Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679 (WP251 rev.01; 6 February 2018), available at https://ec.europa.eu/newsroom/article29/document.cfm?action=display&doc_id=49826.   [11]   See GDPR, at Article 35.   
[12]   See WP29, Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is “likely to result in a high risk” for the purposes of Regulation 2016/679 (WP248 rev.01; 4 October 2017), available at http://ec.europa.eu/newsroom/document.cfm?doc_id=47711.   [13]   See WP29, Guidelines on the right to data portability (WP242 rev.01; 5 April 2017), available at http://ec.europa.eu/newsroom/document.cfm?doc_id=44099.   [14]   See GDPR, at Article 56(2).   [15]   See GDPR, at Article 56(1).   [16]   See GDPR, at Article 63.   [17]   See GDPR, at Article 66.   [18]   See WP29, Guidelines for Identifying a Controller or Processor’s Lead Supervisory Authority (WP244 rev.01; 5 April 2017), available at http://ec.europa.eu/newsroom/just/item-detail.cfm?item_id=50083.   [19]   See WP29, Guidelines on Data Protection Officers (“DPOs”) (WP243 rev.01; 5 April 2017), available at http://ec.europa.eu/newsroom/document.cfm?doc_id=44100.   [20]   See: https://edpb.europa.eu/our-work-tools/general-guidance/gdpr-guidelines-recommendations-best-practices_en.   [21]   The Investigation Update “Investigation into the use of data analytics in political campaigns”, 11.07.2018 is available at https://ico.org.uk/media/action-weve-taken/2259371/investigation-into-data-analytics-for-political-purposes-update.pdf.   [22]   The notice is available at https://ico.org.uk/media/action-weve-taken/mpns/2260051/r-facebook-mpn-20181024.pdf.   [25]   The press release is available at http://news.marriott.com/2019/01/marriott-provides-update-on-starwood-database-security-incident/.   [26]   For more information, the press release is available at https://www.cnil.fr/en/cnils-restricted-committee-imposes-financial-penalty-50-million-euros-against-google-llc   [27]   For more information, the decision is available at https://www.legifrance.gouv.fr/affichCnil.do?oldAction=rechExpCnil&id=CNILTEXT000038032552&fastReqId=2103387945&fastPos=1.   
[28]   For more information, the press release is available at https://www.dataprotection.ie/en/news-media/press-releases/data-protection-commission-opens-statutory-inquiry-twitter.   [29]   See: https://ec.europa.eu/info/law/law-topic/data-protection/data-transfers-outside-eu/adequacy-protection-personal-data-non-eu-countries_en.   [30]   See European Commission, “EU and Japan sign Economic Partnership Agreement” (17 July 2018), available at http://europa.eu/rapid/press-release_IP-18-4526_en.htm.   [31]   See: http://europa.eu/rapid/press-release_IP-18-5433_en.htm.   [32]   See EDPB, Opinion 28/2018 regarding the European Commission Draft Implementing Decision on the adequate protection of personal data in Japan (5 December 2018), available at https://edpb.europa.eu/sites/edpb/files/files/file1/2018-12-05-opinion_2018-28_art.70_japan_adequacy_en.pdf.   [33]   See IAPP, “South Korea’s EU adequacy decision rests on new legislative proposals” (27 November 2018), available at https://iapp.org/news/a/south-koreas-eu-adequacy-decision-rests-on-new-legislative-proposals/.   [34]   See Irish High Court Commercial, The Data Protection Commissioner v. Facebook Ireland Limited and Maximilian Schrems, 2016 No. 4809 P.   [35]   See CJEU, Case C-362/14, Maximillian Schrems v. Data Protection Commissioner (6 October 2016).   [36]   See CJEU, Case C-293/12, Digital Rights Ireland Ltd. v. Minister for Communications, Marine and Natural Resources et al (8 April 2014).   [37]   See European Parliament, Adequacy of the protection afforded by the EU-US Privacy Shield (5 July 2018), available at http://www.europarl.europa.eu/sides/getDoc.do?type=TA&reference=P8-TA-2018-0315&format=XML&language=EN.   [38]   See European Commission, “Joint Press Statement from Commissioner Věra Jourová and Secretary of Commerce Wilbur Ross on the Second Annual EU-U.S. Privacy Shield Review” (19 October 2018), available at http://europa.eu/rapid/press-release_STATEMENT-18-6157_en.htm.   
[39]   See Directive (EU) 2016/1148 of the European Parliament and of the Council of 6 July 2016 concerning measures for a high common level of security of network and information systems across the Union, OJ L 194, 19.7.2016, pp. 1-30, available at http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=uriserv:OJ.L_.2016.194.01.0001.01.ENG&toc=OJ:L:2016:194:TOC.   [40]   E.g., domain name systems (DNS) providers and top level domain (TLD) registries; see Article 4, NIS Directive.   [41]   See NIS Directive, at Article 7.   [42]   See NIS Directive, at Recital (57) and Article 3.   [43]   See NIS Directive, at Article 16(10).   [44]   See NIS Directive, at Articles 16(8) and (9).   [45]   See ENISA, “Guidelines on assessing DSP security and OES compliance with the NISD security requirements” (28 November 2018), available at https://www.enisa.europa.eu/publications/guidelines-on-assessing-dsp-security-and-oes-compliance-with-the-nisd-security-requirements.   [46]   See ENISA, “Guideline on assessing security measures in the context of Article 3(3) of the Open Internet regulation” (12 December 2018), available at https://www.enisa.europa.eu/publications/guideline-on-assessing-security-measures-in-the-context-of-article-3-3-of-the-open-internet-regulation.   [47]   See https://www.enisa.europa.eu/publications/good-practices-for-security-of-iot.   [48]   See https://www.enisa.europa.eu/publications/towards-secure-convergence-of-cloud-and-iot.   [49]   See ENISA, “Cyber Europe 2018: After Action Report” (December 2018), available at https://www.enisa.europa.eu/publications/cyber-europe-2018-after-action-report/at_download/fullReport.   [50]   See https://ec.europa.eu/digital-single-market/en/proposal-eprivacy-regulation.   [51]   See http://ec.europa.eu/newsroom/document.cfm?doc_id=44103.   [52]   See http://www.europarl.europa.eu/sides/getDoc.do?type=REPORT&reference=A8-2017-0324&language=EN.   [53]   See draft ePrivacy Regulation, at Recital (13).  
See Explanatory Memorandum, at Section 3.2.   [54]   See draft ePrivacy Regulation, at Article 8(1).   [55]   However, in practice, the WP29 had already expressed the possibility that operators do not obtain consent for the setting and receipt of cookies in some of the circumstances now covered in the draft ePrivacy Regulation, provided that certain conditions are met.  See WP29, Opinion 04/2012 on Cookie Consent Exemption (WP 194; 7 June 2012), available at http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2012/wp194_en.pdf.   [56]   See draft ePrivacy Regulation, at Articles 18 ff.   [57]   See WP29, Opinion 01/2017 on the Proposed Regulation for the ePrivacy Regulation (2002/58/EC) (WP247; 4 April 2017), available at http://ec.europa.eu/newsroom/just/item-detail.cfm?item_id=50083.   [58]   See European Parliament’s proposal, available at http://www.europarl.europa.eu/sides/getDoc. do?type=REPORT&reference=A8-2017-0324& language=EN.   [59]   See: https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CONSIL:ST_10975_2018_INIT&from=EN.   [60]   See https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CONSIL:ST_13256_2018_INIT&from=EN.   [61]   See CJEU, Case C-210/16 Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v. Wirtschaftsakademie Schleswig-Holstein GmbH (5 June 2018).   [62]   See Opinion of Advocate General Bobek on Case C-498/16 Maximilian Schrems v. Facebook Ireland Limited.   
[63]  The Draft FDPA is available in the official languages of Switzerland: ·          French: https://www.ejpd.admin.ch/ejpd/fr/home/aktuell/news/2017/2017-09-150.html ·         German: https://www.ejpd.admin.ch/ejpd/de/home/aktuell/news/2017/2017-09-150.html ·         Italian: https://www.ejpd.admin.ch/ejpd/it/home/aktuell/news/2017/2017-09-150.html An unofficial English version of the Draft FDPA is also available at https://www.dataprotection.ch/fileadmin/dataprotection.ch/user_upload/redaktion/Docs/Swiss_Data_Protection_Act__draft_of_September_2017__Walder_Wyss_convenience_translation_V010.pdf?v=1507206202   [64]   See Draft FDPA, Article 4(b). Please note that the current FDPA protects information relating to legal entities as personal data.   [65]   See Draft FDPA, Articles 5(1) to (5).   [66]   See Draft FDPA, Articles 19 and 23 to 28.   [67]   See Draft FDPA, Article 20.   [68]   See Draft FDPA, Article 6, and GDPR, Article 25.   [69]   See Draft FDPA, Article 22.   [70]   See Draft FDPA, Article 57.   [71]   See FT Cyber Security, “China’s cyber security law rattles multinationals,” Financial Times (30 May 2017), available at https://www.ft.com/content/b302269c-44ff-11e7-8519-9f94ee97d996.   [72]   See Alex Lawson, “US Asks China Not To Implement Cybersecurity Law,” Law360 (27 September  2017) available at https://www.law360.com/articles/968132/us-asks-china-not-to-implement-cybersecurity-law.   [73]   See Sophie Yan, “China’s new cybersecurity law takes effect today, and many are confused,” CNBC.com (1 June 2017), available at https://www.cnbc.com/2017/05/31/chinas-new-cybersecurity-law-takes-effect-today.html.   [74]   See Christina Larson, Keith Zhai, and Lulu Yilun Chen, “Foreign Firms Fret as China Implements New Cybersecurity Law”, Bloomberg News (24 May 2017), available at https://www.bloomberg.com/news/articles/2017-05-24/foreign-firms-fret-as-china-implements-new-cybersecurity-law.   
[75]   See Clarice Yue, Michelle Chan, Sven-Michael Werner and John Shi, “China Cybersecurity Law update: Draft Guidelines on Security Assessment for Data Export Revised!,” Lexology (26 September, 2017), available at https://www.lexology.com/library/detail.aspx?g=94d24110-4487-4b28-bfa5-4fa98d78a105.   [76]   See http://www.npc.gov.cn/npc/xinwen/2018-09/10/content_2061041.htm (Press Release in Chinese).   [77]   See Singapore Personal Data Protection Commission, Proposed Advisory Guidelines on the Personal Data Protection Act For NRIC Numbers, published 7 November 2017, available at https://www.pdpc.gov.sg/docs/default-source/public-consultation-6—nric/proposed-nric-advisory-guidelines—071117.pdf?sfvrsn=4.   [78]   See Naïm Alexandre Antaki and Wendy J. Wagner, “No escaping notification: Government releases proposed regulations for federal data breach reporting & notification”, Lexology (6 September 2017), available at https://www.lexology.com/library/detail.aspx?g=0a98fd33-1f2c-4a52-98c0-cf1feeaf0b90; Ministry of Electronics & Information Technology, “White Paper of the Committee of Experts on a Data Protection Framework for India,”  Government of India (27 November  2017), available at http://meity.gov.in/white-paper-data-protection-framework-india-public-comments-invited.   [79]   See http://meity.gov.in/writereaddata/files/Personal_Data_Protection_Bill%2C2018_0.pdf   [80]   See IAPP, “GDPR matchup: Brazil’s General Data Protection Law” (4 October 2018), available at https://iapp.org/news/a/gdpr-matchup-brazils-general-data-protection-law/.   [81]   See Brazilian General Data Protection Law, Article 12.   [82]   See Brazilian General Data Protection Law, Article 12.   [83]   See Brazilian General Data Protection Law, Article 7.   [84]   See Brazilian General Data Protection Law, Article 7.   
[85]   In Brazil, under local telecommunications regulations, users could request the portability of personal data related to a telephone number (Resolution 460/07 of the Brazilian National Telecommunications Agency, Anatel), available at http://www.anatel.gov.br/legislacao/resolucoes/22-2007/8-resolucao-460.   [86]   See Brazilian General Data Protection Law, Article 48.   [87]   See Brazilian General Data Protection Law, Article 41.   [88]   These amendments were implemented through the Digital Privacy Law of 2015, available at https://www.canlii.org/en/ca/laws/astat/sc-2015-c-32/121166/sc-2015-c-32.html.   [89]   See Office of the Australian Information Commissioner, “De-identification Decision-Making Framework”, Australian Government (18 September  2017), available at https://www.oaic.gov.au/agencies-and-organisations/guides/de-identification-decision-making-framework; Lyn Nicholson, “Regulator issues new guidance on de-identification and implications for big data usage”, Lexology (26 September 2017) available at https://www.lexology.com/library/detail.aspx?g=f6c055f4-cc82-462a-9b25-ec7edc947354; “New Regulation on the Deletion, Destruction or Anonymization of Personal Data,” British Chamber of Commerce of Turkey (28 September  28, 2017), available at https://www.bcct.org.tr/news/new-regulation-deletion-destruction-anonymization-personal-data-2/64027; Jena M. Valdetero and David Chen, “Big Changes May Be Coming to Argentina’s Data Protection Laws,” Lexology (5 June 2017), available at https://www.lexology.com/library/detail.aspx?g=6a4799ec-2f55-4d51-96bd-3d6d8c04abd2.   [90]   See https://www.argentina.gob.ar/noticias/proteccion-de-datos-personales-al-congreso (press release only available in Spanish).   [91]   See https://www.sbif.cl/sbifweb/servlet/Noticia?indice=2.1&idContenido=12214 (press release only available in Spanish).   
[92]   See https://www.consejotransparencia.cl/presidente-del-cplt-asegura-estar-cada-vez-mas-cerca-el-fin-del-abuso-tras-anuncio-de-urgencia-al-proyecto-de-proteccion-de-datos-personales/ (press release only available in Spanish).   [93]   See the press release of 5 June 2018, available at https://www.superfinanciera.gov.co/inicio/sala-de-prensa/comunicados-de-prensa-/comunicados-de-prensa–10082460 (press release only available in Spanish).   [94]   See http://leyes.senado.gov.co/proyectos/images/documentos/Textos%20Radicados/proyectos%20de%20ley/2018%20-%202019/PL%20053-18%20Habeas%20Data.pdf   [95]   The guide is available at http://inicio.inai.org.mx/DocumentosdeInteres/RecomendacionesCredencialV.pdf   [96]   The guide is available at http://inicio.ifai.org.mx/DocumentosdeInteres/GuiaDatosBiometricos_Web_Links.pdf   [97]   The guide is available at http://inicio.inai.org.mx/DocumentosdeInteres/Recomendaciones_Manejo_IS_DP.pdf   [98]   The guide is available at http://inicio.inai.org.mx/DocumentosdeInteres/DocumentoOrientadorPPDP.docx   [99]   The guide is available at http://inicio.ifai.org.mx/nuevo/ComputoEnLaNube.pdf   [100] The draft bill is available at https://www.mef.gub.uy/innovaportal/file/24846/1/fundamentacion-del-articulado.pdf.   [101] See https://www.datospersonales.gub.uy/inicio/institucional/noticias/urcdp_lanzo_nuevas_guias_proteccion_datos_personales (press release only available in Spanish). The following Gibson Dunn lawyers assisted in the preparation of this client alert: Ahmed Baladi, Alexander Southwell, Alejandro Guerrero, Clémence Pugnet and Francisca Couto. Gibson Dunn’s lawyers are available to assist with any questions you may have regarding these issues. 
For further information, please contact the Gibson Dunn lawyer with whom you usually work or any of the following leaders and members of the firm’s Privacy, Cybersecurity and Consumer Protection practice group: Europe Ahmed Baladi – Co-Chair, PCCP Practice, Paris (+33 (0)1 56 43 13 00, abaladi@gibsondunn.com) James A. Cox – London (+44 (0)207071 4250, jacox@gibsondunn.com) Patrick Doris – London (+44 (0)20 7071 4276, pdoris@gibsondunn.com) Penny Madden – London (+44 (0)20 7071 4226, pmadden@gibsondunn.com) Jean-Philippe Robé – Paris (+33 (0)1 56 43 13 00, jrobe@gibsondunn.com) Michael Walther – Munich (+49 89 189 33-180, mwalther@gibsondunn.com) Kai Gesing – Munich (+49 89 189 33-180, kgesing@gibsondunn.com) Sarah Wazen – London (+44 (0)20 7071 4203, swazen@gibsondunn.com) Vera Lukic – Paris (+33 (0)1 56 43 13 00, vlukic@gibsondunn.com) Alejandro Guerrero – Brussels (+32 2 554 7218, aguerrero@gibsondunn.com) Asia Kelly Austin – Hong Kong (+852 2214 3788, kaustin@gibsondunn.com) Jai S. Pathak – Singapore (+65 6507 3683, jpathak@gibsondunn.com) United States Alexander H. Southwell – Co-Chair, PCCP Practice, New York (+1 212-351-3981, asouthwell@gibsondunn.com) M. Sean Royall – Dallas (+1 214-698-3256, sroyall@gibsondunn.com) Debra Wong Yang – Los Angeles (+1 213-229-7472, dwongyang@gibsondunn.com) Ryan T. Bergsieker – Denver (+1 303-298-5774, rbergsieker@gibsondunn.com) Richard H. Cunningham – Denver (+1 303-298-5752, rhcunningham@gibsondunn.com) Howard S. Hogan – Washington, D.C. (+1 202-887-3640, hhogan@gibsondunn.com) Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com) Kristin A. Linsley – San Francisco (+1 415-393-8395, klinsley@gibsondunn.com) Shaalu Mehra – Palo Alto (+1 650-849-5282, smehra@gibsondunn.com) Karl G. Nelson – Dallas (+1 214-698-3203, knelson@gibsondunn.com) Eric D. Vandevelde – Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com) Benjamin B. 
Wagner – Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com) Michael Li-Ming Wong – San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, mwong@gibsondunn.com) Questions about SEC disclosure issues concerning data privacy and cybersecurity can also be addressed to the following leaders and members of the Securities Regulation and Corporate Governance Group: James J. Moloney – Orange County, CA (+1 949-451-4343, jmoloney@gibsondunn.com) Elizabeth Ising – Washington, D.C. (+1 202-955-8287, eising@gibsondunn.com) Lori Zyskowski – New York (+1 212-351-2309, lzyskowski@gibsondunn.com) © 2019 Gibson, Dunn & Crutcher LLP Attorney Advertising:  The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.

January 28, 2019 |
U.S. Cybersecurity and Data Privacy Outlook and Review – 2019

Click for PDF In honor of Data Privacy Day—an international effort to raise awareness and promote privacy and data protection best practices—we offer this seventh edition of Gibson Dunn’s United States Cybersecurity and Data Privacy Outlook and Review.

In recent years, companies have had to navigate a rapidly evolving set of cybersecurity and privacy challenges.  If anything, the pace of this evolution increased in 2018.  Federal agencies jockeying for prominence were joined by an increasingly active set of state Attorneys General and other state regulators in enforcing privacy and cybersecurity standards.  The California Consumer Privacy Act brought privacy regulation in the United States one step closer to prescriptive European-style controls.  With greater frequency, the plaintiffs in privacy class actions survived early attempts to dismiss their claims.  Biometric information privacy acts were an active battleground for litigation.  And questions regarding the government’s ability to access data, whether stored on servers outside the United States or on a cellphone in a target’s possession, came into sharp legislative and judicial focus.

This Review places these and other 2018 developments in broader context, addressing: (1) the regulation of privacy and data security, including enforcement by federal and state authorities, new regulatory guidance, and key legislative developments; (2) trends in civil litigation, including privacy class actions, interceptions and eavesdropping, biometric information privacy acts, device hacking, and the development of the cybersecurity insurance market; and (3) the collection of electronically stored information by the government, including the extraterritoriality of subpoenas and warrants and the collection of data from electronic devices.
While we do not attempt to address every development that occurred in 2018, this Review focuses on a number of the most significant developments affecting companies as they navigate the evolving cybersecurity and privacy landscape. Our companion International Cybersecurity and Data Privacy Outlook and Review addresses a number of developments of interest to U.S. and international companies alike.  These include the entry into force of the European Union’s General Data Protection Regulation (“GDPR”), challenges to the EU-U.S. Privacy Shield framework, further developments around the EU Directive on the Security of Network and Information Systems (“NIS Directive”) and the EU ePrivacy Regulation, and changes to the privacy and cybersecurity legal landscape in Brazil, Canada, and China, among other countries.

____________________

TABLE OF CONTENTS

I.    REGULATION OF PRIVACY AND DATA SECURITY
A.  Enforcement and Guidance
1.   Federal Trade Commission
2.   Department of Health and Human Services and HIPAA
3.   Securities and Exchange Commission
4.   Other Federal Agencies
5.   State Attorneys General
6.   New York Department of Financial Services
B.  Legislative Developments
1.   Federal Legislative Developments
2.   State Legislative Developments
II.   CIVIL LITIGATION
A.  Privacy Litigation
1.   Class Action Litigation
2.   Settlements
B.  Interceptions and Eavesdropping
1.   Email Scanning
2.   Call Recording
3.   Other “Interceptions”
C.  Telephone Consumer Protection Act
D.  Video Privacy Protection Act
E.   Biometric Information Privacy Acts
F.   Internet of Things and Device Hacking
1.   Legislation
2.   Regulatory Guidance
3.   Litigation
G.  Computer Fraud & Abuse Act
H.  Cybersecurity Insurance
III.  GOVERNMENT DATA COLLECTION
A.  Electronic Communications Privacy Act Reform
B.  Extraterritoriality of Subpoenas and Warrants and the CLOUD Act
C.  Foreign Intelligence Surveillance Act Section 702 Reauthorization
D.  Collection of Cellphone and Audio Data
IV.   CONCLUSION

____________________

I.    REGULATION OF PRIVACY AND DATA SECURITY

A.    Enforcement and Guidance

1.    Federal Trade Commission

The Federal Trade Commission (“FTC” or “Commission”) remained one of the most active and aggressive regulators of privacy and data security in 2018.  Operating with new Commissioners and new leadership, the FTC announced eight enforcement actions related to privacy and data security issues as well as a broad-ranging policy review.  We address highlights from each of these developments below.

a.    Leadership Changes and Policy Review

This year saw an overhaul of the FTC’s leadership as President Trump appointed a new Chairman and filled each of the remaining four commissioner seats with new appointees.  After a lengthy period during which the Commission was operating at less than full strength, on May 1, 2018, Joseph Simons was sworn in as Chairman of the FTC.[1]  Simons previously served in roles at the FTC as Director of the Bureau of Competition, Associate Director for Mergers, and Assistant Director for Evaluation.[2]  He was joined by four new Commissioners—Noah Joshua Phillips, Rebecca Kelly Slaughter, and Rohit Chopra, who were confirmed in May,[3] and Christine S. Wilson, who was confirmed in September (to fill departing Commissioner Maureen K. Ohlhausen’s seat).[4]

The priorities of the newly constituted Commission remain unclear, but there are strong indications that changes may be forthcoming.  As the FTC attempts to balance the current Administration’s push for deregulation, growing public pressure for additional privacy enforcement, and the agency’s institutional desire for prominence, the form those changes will take is not yet clear.
Indeed, the Commission announced this fall that it would undertake a “comprehensive re-examination of the FTC’s approach to consumer privacy”—the first such review since 2012—as part of its ongoing “Hearings Initiative,” a series of policy reviews on a wide range of issues.[5]  The FTC’s review of data security and privacy issues has included public hearings and an invitation for comments, and will extend into the spring of 2019.

b.    Data Security and Privacy Enforcement

Even in the midst of these leadership changes and the policy review, the FTC continued to announce enforcement actions in this space, in some tension with the purportedly business-friendly administration, although many of these actions notably included no monetary remedies.

Toy Manufacturer.  In January 2018, the FTC pursued its first children’s privacy case involving Internet-connected toys.[6]  The agency settled with a toy maker in a case alleging that the company violated the Children’s Online Privacy Protection Act (“COPPA”) by collecting personal information from children without notice or parental consent.[7]  The FTC also alleged that the company failed to take reasonable steps to secure the collected data as required under COPPA and that it falsely stated in its privacy policy that personal information obtained through its platforms would be encrypted.[8]  As part of the settlement, the company will pay $650,000.[9]

Mobile Phone Manufacturer.
The FTC entered into a settlement with a mobile phone manufacturer in April 2018 over allegations that the company allowed collection of consumers’ personal information, such as the contents of text messages and location data.[10]  The manufacturer allegedly collected this information without consent and despite promises that the data would be kept secure and private.[11]  Specifically, the FTC alleged that the company falsely claimed that only data needed to perform requested services would be collected and that the company had implemented appropriate controls to safeguard consumer information, but in practice failed to do so.[12]  The final settlement, which includes no fines or penalties, prohibits the manufacturer from making misrepresentations about its data security and privacy measures, and mandates that the manufacturer establish, implement, and maintain a data security program.[13]

Financial Services Firm.  In May 2018, the FTC entered into a settlement with a financial services firm over allegations that the company failed to provide users of its payment service with information about users’ ability to transfer funds.[14]  The service allegedly told users that money credited to their accounts could be transferred to bank accounts, without disclosing that the funds could be frozen or removed based on the service’s review of the underlying transaction.[15]  The FTC also alleged that the service’s default settings misled consumers about the privacy options for their transactions.[16]  The FTC further alleged that the service misrepresented its security systems and violated the Gramm-Leach-Bliley Act’s Safeguards and Privacy Rules by failing to maintain adequate security measures and failing to send privacy notices to consumers.[17]  The final settlement, which includes no fines or penalties, prohibits the service from misrepresenting any material restrictions on the use of its service, its privacy control settings, and its level of security.[18]  The order also requires the service to make disclosures to consumers relating to its transaction and privacy practices.[19]  With respect to the alleged Gramm-Leach-Bliley Act violations, the service is prohibited from violating the Safeguards and Privacy Rules, and is required to obtain biennial third-party assessments of its compliance for the next 10 years.[20]

EU-U.S. Privacy Shield Enforcement.  The FTC brought five actions against companies regarding false claims of certification under the EU-U.S. Privacy Shield framework, which establishes a process to allow companies to transfer consumer data from the European Union to the United States.[21]  In July 2018, the FTC settled charges with one company over allegations that the company falsely claimed on its website that it was in the process of being certified under the EU-U.S. Privacy Shield framework.  In fact, the company had allegedly started an application but had not taken the necessary steps to participate in the framework.  In September, the FTC announced it had reached settlements with four additional companies over false claims of certification.[22]  Each of these companies claimed to be in compliance with the Privacy Shield, despite allowing their certifications to lapse or never obtaining certification in the first place.[23]  Each of the five settlements prohibits the companies from misrepresenting the extent to which they participate in any privacy or data security program, but none included monetary payments.[24]

c.    Eleventh Circuit Issues Important Decision in LabMD Case

As we highlighted in last year’s Review, LabMD, a now-defunct medical lab company, appealed an FTC order finding that the company failed to reasonably protect its customers’ personal information from data breaches and requiring implementation of a comprehensive information security program to prevent future breaches.
This long-running case has been one of the highest-profile FTC data security enforcement actions, testing the boundaries of the FTC’s authority.  The company, in pursuing the litigation, has argued forcefully that the FTC overstepped its enforcement authority because, among other reasons, no consumer was injured as a result of the data breach.[25]

In June 2018, the Court of Appeals for the Eleventh Circuit issued an important decision holding that the FTC’s cease-and-desist order, which directed the company to implement a variety of security measures, was unenforceable.[26]  In the decision, the court assumed, arguendo, that the company’s “negligent failure to implement and maintain a reasonable data-security program constituted an unfair act or practice under Section 5(a),” and therefore did not reach the question of whether consumers had been injured.[27]  Nonetheless, the court held that even assuming the behavior was an unfair act or practice, the FTC’s order failed to enjoin any specific act or practice.[28]  Instead, the court held that the FTC’s order inappropriately required the company “to overhaul and replace its data security program to meet an indeterminable standard of reasonableness,” requiring the court to vacate the order.[29]

By calling into question the FTC’s ability to fashion an enforceable order in data security cases, the Eleventh Circuit decision puts the burden on the FTC to define more clearly the conduct it is challenging and the remedies it seeks.  Going forward, we will be watching carefully to see how the FTC structures its orders to navigate these issues.

2.    Department of Health and Human Services and HIPAA

Despite operating with a lower budget in 2018 than in previous years, the Department of Health and Human Services (“HHS”) continued its strong enforcement efforts against patient privacy violations, including imposing its largest fine to date, while also considering major regulatory overhauls to the Health Insurance Portability and Accountability Act (“HIPAA”) regulations.  But HHS is not the only entity pursuing healthcare privacy violations: 2018 also saw the first multi-state data breach lawsuit brought by the Attorneys General of several states alleging violations of HIPAA.  These developments are addressed below.

a.    HHS OCR Enforcement

There were several notable HIPAA-related settlements and judgments during 2018:

Health Insurer.  In October 2018, a large health insurer agreed to pay HHS’s Office for Civil Rights (“OCR”) $16 million and take “substantial corrective action” in response to alleged HIPAA violations related to a series of cyber-attacks in 2015, whereby hackers obtained electronic protected health information (“ePHI”) relating to more than 79 million individuals.[30]  The settlement almost tripled the previous high-water mark for HIPAA enforcement settlements, which was set in 2016 and matched in 2017.[31]  OCR justified the high settlement by alleging that the insurer “failed to implement appropriate measures for detecting hackers who had gained access to their system to harvest passwords and steal people’s private information.”[32]  The insurer had allegedly “failed to conduct an enterprise-wide risk analysis, had insufficient procedures to regularly review information system activity, failed to identify and respond to suspected or known security incidents, and failed to implement adequate minimum access controls to prevent the cyber-attackers from accessing sensitive ePHI.”[33]  Taken together, these alleged violations and the seriousness of data breaches in the healthcare space led HHS to seek the high settlement.

Dialysis Provider.  In February 2018, HHS reached a $3.5 million settlement with a national dialysis provider following a series of five separate breach reports alleging incidents that occurred in 2012 at five different locations.[34]  Emphasizing the need for risk assessments and risk analysis, HHS indicated that the “number of breaches, involving a variety of locations and vulnerabilities, highlights why there is no substitute for an enterprise-wide risk analysis for a covered entity.”[35]

Cancer Center.  In June 2018, in a case that was not settled, an HHS Administrative Law Judge (“ALJ”) ruled against a hospital-based cancer center, finding on summary judgment that the center had violated HIPAA following the theft or loss of a laptop and two USB thumb drives containing unencrypted ePHI in 2012 and 2013, and assessed a $4.3 million penalty.[36]  Key to the ALJ’s ruling was the center’s purported failure to address its risk assessment findings related to encryption.  Specifically, the ALJ found evidence that the center had known since at least 2006 about the high risk to ePHI stemming from the potential use of unencrypted devices, but failed to implement remediation until 2011, and even then did so inadequately.[37]

Medical Records Company.  In February 2018, HHS OCR agreed to a $100,000 settlement with a now-bankrupt medical records storage company, a reminder that a covered entity’s obligations under HIPAA do not end when the company goes out of business.[38]  Even after the company ceased operations as part of an unrelated litigation, HHS alleged that the company had allowed an unauthorized individual to transport PHI, and declared that the “careless handling of PHI is never acceptable,” agreeing to take the settlement out of the liquidated assets designated for distribution to creditors and others.[39]

b.    Request for Public Comments on Reforming HIPAA

In addition to bringing enforcement actions, HHS also initiated a far-ranging review of HIPAA regulations, including asking for public comments on how it can amend HIPAA to “remove regulatory obstacles and decrease regulatory burdens in order to facilitate efficient care coordination and/or case management and to promote the transformation to value-based healthcare, while preserving the privacy and security of PHI.”[40]  The request for comments includes 54 questions, and interested parties must submit comments by February 12, 2019.

c.    State AGs Bring Multi-State Action Premised on HIPAA

Outside of HHS—in the first-ever multi-state data breach lawsuit alleging violations of HIPAA—twelve state Attorneys General,[41] led by Indiana Attorney General Curtis T. Hill Jr., filed a complaint in Indiana federal court against a healthcare information technology company and its subsidiary related to a breach discovered in 2015 that compromised the personal data of 3.9 million people.[42]  Notably, the lawsuit alleges that the company failed to protect ePHI in the hands of its business associate after a breach related to a third-party web application run by the company.[43]  The case was filed in December 2018, and Gibson Dunn will continue to monitor developments.

d.    HHS Issues Guidance on Cybersecurity Practices for the Healthcare Industry

In late December 2018, HHS released detailed guidance on cybersecurity practices in the healthcare space.[44]  The publication was the result of a public-private task force formed under a legislative mandate to develop practical and cost-effective cybersecurity guidelines.[45]  While adoption of the practices outlined in the guidance is voluntary, informed implementation could support an effort to demonstrate reasonable care in a negligence case related to cybersecurity, while failure to implement them could invite allegations that reasonable care was not taken.
The guidance includes two technical volumes—one for small organizations and one for medium and large organizations—that are organized under the 10 “most effective” cybersecurity practices as identified by the task force.[46]  These practices are not intended to be exhaustive or applicable to every entity, and the document encourages tailoring cybersecurity controls to the specific healthcare entity.[47]

3.    Securities and Exchange Commission

As anticipated, the Securities and Exchange Commission (“SEC”) devoted increased attention in 2018 to cybersecurity enforcement and to regulatory activity, particularly around cryptocurrency and initial coin offerings.

a.    Cybersecurity and Data Breaches

SEC Guidance.  In February 2018, the SEC announced new guidance to assist public companies in understanding their disclosure obligations with respect to cybersecurity risks and incidents and to highlight the importance of cybersecurity policies and procedures.[48]  The guidance was the SEC’s first major pronouncement on these issues since 2011,[49] and the new guidance “reinforc[es] and expand[s] the previous guidance,” including by emphasizing “the importance of cybersecurity policies and procedures and the application of insider trading prohibitions in the cybersecurity context.”

Insider Trading Charges.  In March and June 2018, the SEC charged two employees of a credit reporting agency with insider trading in advance of the company’s September 2017 disclosure of a data breach affecting nearly 150 million people.  Specifically, the SEC charged the credit agency’s former Chief Information Officer and a former manager.[50]  The manager was charged after he deduced that a website for an unnamed client affected by the breach was in fact for consumers of the credit reporting agency.[51]  The charges underscore the importance of implementing internal policies and controls to prevent trading on non-public information related to cybersecurity incidents, and the personal risk to executives who trade without first disclosing such information.

Internal Accounting Controls.  In October 2018, the SEC issued a report cautioning public companies about the importance of internal controls to prevent cyber fraud.  The report described the SEC Enforcement Division’s investigation into whether nine unidentified companies that were victims of cyber-related fraud had sufficient internal accounting controls in place to satisfy their obligations under Sections 13(b)(2)(B)(i) and (iii) of the Securities Exchange Act of 1934.  The SEC ultimately decided not to pursue enforcement actions against the nine companies, but advised issuers and other market participants to consider cyber-related threats when devising and maintaining a system of internal accounting controls.[52]  The report points to the SEC’s growing interest in corporate controls designed to mitigate cyber risks, and should be understood as a warning to public companies that future investigations could lead to enforcement actions.[53]

b.    Cryptocurrency

In addition to regulatory efforts by other financial regulators such as the Federal Reserve, the Commodity Futures Trading Commission, the Federal Deposit Insurance Corporation, and state agencies, the SEC made regulation of cryptocurrencies and protection of investors from the risks associated with investment in digital assets a major focus in 2018.
In January 2018, the SEC filed a complaint against a cryptocurrency platform[54] and obtained a court order halting an allegedly fraudulent initial coin offering (“ICO”).  For the first time in connection with an ICO, the court approved a receiver to secure various cryptocurrencies held by the platform.[55] SEC Chairman Jay Clayton later testified before Congress, noting that the SEC monitors cryptocurrency-related activities of brokers, dealers, investment advisers, and other market participants it regulates.  Clayton advised that ICO market participants should assess whether a coin or token is a security and, if so, comply with the registration and other requirements of the federal securities laws.[56] 4.    Other Federal Agencies Although not as active or far-reaching as actions by the FTC, HHS, or the SEC, other federal agencies also continue to make headlines in the data security and privacy space.  This year in particular, there were notable developments at the Federal Communications Commission (“FCC”), Consumer Financial Protection Bureau (“CFPB”), and Department of Defense (“DoD”). a.    Federal Communications Commission i.    FCC Robocall Initiative The FCC and FTC joined together to host two events aimed at preventing illegal robocalls and caller ID spoofing.[57]  The agencies hosted a Policy Forum in March 2018 to discuss the challenges posed by robocalls and the efforts being taken by both agencies to protect consumers.  The agencies also hosted a Technology Expo for consumers in April 2018, featuring technologies, devices, and applications to curb illegal robocalls.[58]  In announcing the events, leaders of both agencies highlighted the invasion of privacy consumers experience when receiving robocalls and the prevalence of consumer complaints on the issue.[59] ii.      ACA Int’l v. FCC As discussed in further detail in Section II.C. below, in March 2018, the D.C.
Circuit issued a ruling that changes the rules for what constitutes an auto-dialer.[60]  The court held that the FCC’s use of the phrase “automatic telephone dialing system”—as interpreted in the FCC’s 2015 omnibus Declaratory Ruling and Order (the “omnibus order”)—was unreasonably broad under the Administrative Procedure Act (“APA”) because it effectively encompassed any uninvited call or message from any smartphone, due to smartphones’ potential to randomly dial numbers if a downloaded app could provide them with such capabilities.[61]  The court also set aside as arbitrary and capricious the omnibus order’s imposition of liability for calling reassigned numbers without prior consent, even if the consent had been given by the number’s previous holder.[62]  The court upheld the omnibus order’s conclusion that “a called party may revoke consent at any time and through any reasonable means”—orally or in writing—“that clearly expresses a desire not to receive further messages.”[63]  In its decision, the court noted that the FCC was working to develop a new regime to avoid the reassignment issues involved in ACA International.[64]  As discussed below, the FCC approved new reassignment rules in June 2018. iii.    FCC Rulemaking Slamming and Cramming.  In June 2018, the FCC approved new rules relating to “slamming,” the unauthorized change of a consumer’s preferred telephone company, and “cramming,” the imposition of unauthorized charges on a consumer’s phone bill.[65]  The rules prevent phone companies from using deceptive tactics to obtain verification from consumers to switch service providers.[66]  Under the new rules, material misrepresentations will invalidate any alleged authorization given by a consumer to switch providers.[67]  Phone companies may face a five-year suspension from using third-party verification procedures for abusive practices.[68] Reassigned Numbers Database.
In December 2018, the FCC adopted new rules to establish a reassigned numbers database.[69]  The rules address unwanted calls to consumers who receive a phone number that previously belonged to someone else.[70]  Previously, businesses and other callers would call such consumers while trying to reach a number’s prior holder, not realizing the number had been reassigned.[71]  The rules establish a single, comprehensive database with information provided by phone companies—and for callers who use the database, the rules will provide a safe harbor from liability for any calls to reassigned numbers caused by database error.[72] b.    Consumer Financial Protection Bureau In December 2018, the Senate confirmed Kathy Kraninger as director of the CFPB.[73]  Kraninger’s appointment follows a contentious battle between Trump appointee Mick Mulvaney, a critic of the agency, and deputy director Leandra English, who both claimed to be the lawful acting chief of the bureau.[74]  During her first day, Kraninger indicated a desire to distance herself from Mulvaney’s approach and promised that the agency “absolutely will take the enforcement actions to the full extent of the law and make sure we are protecting consumers.”[75]  Kraninger also stated that a priority is examining the bureau’s measures to secure the consumer data it collects.[76] The CFPB’s own security measures were called into question toward the end of 2017 after reports of a data breach, and the CFPB subsequently implemented a freeze on the collection of personally identifiable information (“PII”).[77]  In May 2018, however, acting director Mick Mulvaney lifted the freeze after an independent review concluded that the CFPB’s cybersecurity defenses were secure.[78] Given the changes in leadership, and Mulvaney’s efforts to scale back enforcement efforts, it is no surprise that the CFPB was not active in the enforcement area during 2018.
In the wake of the data breach at a major credit reporting agency in 2017, for example, the company initially disclosed in its SEC filings that the CFPB was investigating the company.  Since then, conflicting reports have claimed that Mulvaney declined to issue subpoenas or schedule interviews with the company’s leadership, even though the probe remains open.[79] c.    Department of Defense Defense Department Cyber Strategy.  In September 2018, the DoD issued a Cyber Strategy report outlining the DoD’s “vision for addressing this threat and implementing the priorities of the National Security Strategy and National Defense Strategy for cyberspace.”[80]  The report identified key cyberspace objectives of the department,[81] and explained that to achieve these goals, DoD will focus on developing cyber capabilities for warfighting and countering malicious cyber-attacks.[82]  This includes a focus on strengthening relationships with private-sector entities that have advanced cyber capabilities, in order to leverage the skills, resources, capabilities, and perspectives of those outside DoD.[83] “Do Not Buy” List For Foreign Software Vendors.  In July 2018, the DoD announced its creation, with the help of the intelligence community, of a “do not buy” list for defense suppliers.[84]  The list identifies certain vendors whose software originates in Russia or China, with the aim of helping the industry “steer clear of potentially problematic” products, those which “don’t operate in a way consistent” with defense standards.[85]  The “do not buy” list is part of a larger effort by the federal government to prevent Russian and Chinese penetration into defense and industrial systems.[86] 5.    State Attorneys General State Attorneys General continued to play a key role in data privacy and security matters this past year, acting at the forefront of concerted efforts to bring enforcement actions and regulate the technology industry. a.
Collaboration Among Attorneys General In 2018, states continued the trend of coordinating enforcement efforts with each other to settle multi-state litigations involving large-scale data breaches.  For example, as discussed above, in December 2018, eleven Attorneys General filed a federal complaint in the Northern District of Indiana against an electronic medical records company, alleging that it violated provisions of HIPAA and state data and consumer protection laws when hackers stole protected health information relating to millions of individuals.  The suit marks the first time Attorneys General joined to file an action stemming from a HIPAA-related data breach in federal court.[87] b.    Developments Within States In May 2018, the New Jersey Attorney General announced plans to create a Data Privacy & Cybersecurity (“DPC”) Section within his office, under the authority of the Affirmative Civil Enforcement Practice Group, in response to increasing threats to online privacy.[88]  The DPC Section will take over the work of privacy and data security investigations and litigation from the Office’s Consumer Fraud Prosecution section.[89] Also in May 2018, the New Jersey Attorney General, in partnership with New Jersey’s Division of Consumer Affairs, entered into a settlement with a Chinese app developer resolving the Division’s investigation into allegations that the company violated COPPA and the New Jersey Consumer Fraud Act (“CFA”) by collecting information from children under the age of 13 without parental consent.[90]  The developer agreed to pay $100,000 in fines and change its apps to prevent the collection of children’s data.[91] In June 2018, the New York Attorney General announced that several major business and consumer organizations endorsed the office’s Stop Hacks and Improve Electronic Data Security Act (“SHIELD Act”) of 2017.
The Act would require companies to adopt “reasonable” safeguards for sensitive data and expand the categories of data triggering reporting requirements.[92]  The Attorney General concurrently released a “Small Business Guide to Cybersecurity in New York State,”[93] which offers advice to small businesses on how to secure sensitive data and respond to data breaches. In September 2018, the New Mexico Attorney General filed suit against several mobile app developers for allegedly collecting personal data from children under the age of 13 without parental consent in violation of COPPA as well as of state law.[94]  The Attorney General expressed concerns that such data collection creates the “unacceptable risk of data breach and access from third parties” who may “exploit and harm” children.[95] In December 2018, the D.C. Attorney General filed a lawsuit against Facebook alleging that the company allowed Cambridge Analytica to gain access to information on D.C. residents for use in targeted ad campaigns during the 2016 presidential election.[96]  The Attorney General’s complaint alleges violations of the District’s Consumer Protection Procedures Act.[97] 6.    New York Department of Financial Services The New York Department of Financial Services (“NYDFS”) is the most active state cybersecurity regulator in the nation.  
As noted in last year’s Review, New York’s Cybersecurity Regulation, 23 NYCRR 500, proposed by NYDFS in September 2016, became effective March 1, 2017, and marks a sweeping effort to impose cybersecurity obligations on a broad set of regulated institutions.[98] Specifically, 23 NYCRR 500 applies to all entities licensed or otherwise regulated by NYDFS, including state-chartered banks, licensed lenders, private bankers, foreign banks licensed to operate in New York, mortgage companies, insurance companies, and, by extension, service providers to such regulated entities (“Covered Entities”).[99] All Covered Entities were already required:

By August 2017, to designate a Chief Information Security Officer; implement an overall cybersecurity program and appropriate cybersecurity policies; regulate access privileges; develop an incident response plan; and be able to notify the Superintendent of Financial Services within 72 hours of a cybersecurity incident;

By March 2018, to conduct a risk assessment; implement cybersecurity monitoring, testing, and personnel training; maintain effective controls, such as multi-factor authentication; and have the Chief Information Security Officer prepare a written report to the board of directors; and

By September 2018, to develop and periodically update policies and procedures to secure in-house and externally developed applications; detect unauthorized access to, use of, or tampering with nonpublic information; and implement limits on data retention, audit trails to detect and respond to cybersecurity incidents, and controls, such as encryption, to protect nonpublic information.[100]

For certain of the above measures, the board of directors or a senior officer of the company must certify that the company is in compliance by next month, February 15, 2019.[101] In addition, by March 1, 2019, the final transitional compliance deadline, Covered Entities must ensure that the third-party vendors with whom they do business also have
adequate cybersecurity policies.  Specifically, Section 500.11 requires Covered Entities to “implement written policies and procedures designed to ensure the security of Information Systems and Nonpublic Information that are accessible to, or held by, Third Party Service Providers,” including “relevant guidelines for due diligence and/or contractual protections relating to Third Party Service Providers.”[102] The Superintendent of the Department of Financial Services, Maria T. Vullo, announced that the Department will incorporate cybersecurity in all of its regulatory examinations, including adding cybersecurity-related questions to “first day letters” (i.e., notices the Department issues to launch examinations of financial services companies).[103] The Department has also ventured into an enforcement role.  In June 2018, the Department, along with seven other state regulatory agencies, entered into a consent order with a national credit reporting agency stemming from its 2017 data breach.  Although the order does not impose a fine, it requires the agency to develop a risk assessment, establish a formal internal audit program, and improve board oversight of information security, vendor management, patch management, and information technology operations.[104] Additionally, on July 3, 2018, the Department adopted a new regulation, 23 NYCRR 201.07, requiring consumer credit reporting agencies (“CCRAs”) to register with the Department and to comply with the cybersecurity regulations under 23 NYCRR 500, with a final deadline of December 31, 2019, to comply with all requirements, including section 500.11.[105] Other states are expected to follow suit in enacting similar cybersecurity regulations.[106] B.    Legislative Developments 1.    Federal Legislative Developments 2018 saw a flurry of congressional activity in the area of cybersecurity, particularly compared to 2017.  
The most significant piece of privacy legislation to be signed into law was the Clarifying Lawful Overseas Use of Data Act (“CLOUD Act”) (see Section III.B.).  In addition, Congress reauthorized Section 702 of the Foreign Intelligence Surveillance Act (see Section III.C.) and took steps towards addressing cybersecurity, data privacy, and robocalling, though few of those bills have become law. a.      Enacted Legislation i.    The CLOUD Act As discussed further in Section III.B. below, on March 23, 2018, Congress passed, and President Trump signed into law, the CLOUD Act,[107] which amends the Stored Communications Act of 1986 (“SCA”)[108] to allow the federal government to obtain warrants to compel service providers to turn over customer data stored outside of the United States and to enter into bilateral data-sharing agreements with foreign governments for law enforcement purposes.  Service providers may move to quash warrants obtained by the government if there is a material risk that compliance with the request would violate the laws of a foreign government.[109]  As discussed further in Section III.B., the legislation stemmed from litigation between the federal government and a prominent tech company that had reached the U.S. Supreme Court.[110] The CLOUD Act was supported by several tech giants, who argued that it appropriately balances individual privacy rights against law enforcement needs and would help reduce international legal disputes.[111]  Privacy and civil liberties organizations, on the other hand, expressed concern that it does not provide sufficient procedural protections for cross-border access to consumer information.[112] ii.      Foreign Surveillance As discussed further in Section III.C.
below, in January 2018, Congress reauthorized Section 702 of FISA for another six years without any significant changes.[113]  Section 702 allows the government to collect foreign communications without a warrant; however, the reauthorization does require the FBI to obtain a court order based on probable cause to access the communications of U.S. persons in criminal investigations unrelated to national security.[114]  Additionally, the reauthorization resumes the controversial “abouts” collection program, which allows the government to collect communications that contain a reference to a target (i.e., communications “about” a target), instead of just communications to or from a target, pending written notice to Congress.[115] b.      Proposed Legislation i.        TRACED Act On November 15, 2018, Senators John Thune (R-SD) and Edward Markey (D-MA) introduced the Telephone Robocall Abuse Criminal Enforcement and Deterrence (“TRACED”) Act,[116] which would amend the Communications Act of 1934 to authorize the FCC to crack down on illegal robocalls by creating authentication rules for voice service providers to prevent “caller ID spoofing.”[117]  The bill provides for significant penalties against telemarketers and scammers that use automatic dialing services, imposing a $10,000 fine for each call made in intentional violation of the law,[118] and increases the statute of limitations (to three years, up from one) for the FCC to bring an action against violators.[119]  It also brings together various federal agencies, state Attorneys General, and other non-federal entities to report to Congress on improving enforcement measures and directs the FCC to promulgate rules designed to stop texts and calls made using unauthenticated numbers.[120]  There has been no additional action on the bill in 2019. ii.      
Cybersecurity and Data Breach Notification There was little agreement in Congress this year over how to respond to data breaches such as the breach of a prominent consumer credit reporting agency in 2017.  On December 11, 2018, Republican and Democratic leaders on the House Oversight Committee released dueling reports responding to that breach, with the Democratic report chastising House Republicans for not demanding stricter cybersecurity laws.[121]  The Democratic report recommended increasing financial penalties for data breaches, strengthening the FTC’s enforcement authority over credit agencies, and passing legislation that would create a framework for notifying victims of data breaches.[122]  In contrast, the Republican report suggested forming more public-private partnerships, requiring credit reporting agencies to be more transparent with consumers, and providing a “government-wide framework of cybersecurity and data security risk-based requirements” for federal contractors.[123] The proposed Consumer Privacy Protection Act of 2017,[124] which was introduced by Senator Mark Warner (D-VA) at the end of that year and would require disclosure of security breaches and implementation of comprehensive consumer privacy and data security programs by certain commercial entities, did not appear to make any headway in 2018.  Its future in the new Congress is uncertain. iii.    Email Collection by Law Enforcement As discussed further in Section III.A. below, efforts to reform the Electronic Communications Privacy Act (“ECPA”) fell short in 2018, despite advocacy from the technology industry and privacy organizations.  
Controversially, ECPA still allows the government to obtain a court order (not a search warrant) directing service providers to grant access to emails after 180 days have passed.[125]  In 2018, the House passed the Email Privacy Act (“EPA”) as part of the Fiscal Year 2019 National Defense Authorization Act (“NDAA”) to impose a warrant requirement for access to emails over 180 days old.[126]  As in 2017, the bill lost steam in the Senate, where the final version of the NDAA passed without EPA’s reforms or the broader ECPA Modernization Act of 2017 introduced by Senators Mike Lee (R-UT) and Patrick Leahy (D-VT) last year.[127] 2.    State Legislative Developments In 2018, states continued to supplement federal law with their own data privacy regulations.  The trend in state legislation has broadly been to tighten controls and provide higher levels of consumer protection, with some exceptions.  California and New York have led the way with substantial data privacy and cybersecurity regulations implemented last year, and other states have enacted laws pertaining to data breaches, cybersecurity, and online privacy. a.    
Data Breach Legislation In 2018, Alabama and South Dakota joined the national trend of enacting data breach notification legislation, meaning that all 50 states (as well as the District of Columbia, Guam, Puerto Rico, and the Virgin Islands) now have data breach notification laws in place.[128]  The Alabama Data Breach Notification Act of 2018 generally requires entities to notify subjects of a breach involving their electronically stored “sensitive personally identifying information,” which includes health information and other private details that could lead to access of sensitive data, no later than 45 days after learning of the breach.[129]  South Dakota’s data breach notification law provides a similar scope of protection, but requires notification within 60 days absent exceptional circumstances.[130]  Other states, such as Louisiana, amended existing data breach notification laws with more detail regarding definitions, timelines, and data disposal in case of a breach.[131] b.    California Consumer Privacy Act of 2018 On June 28, 2018, California passed the California Consumer Privacy Act of 2018 (“CCPA”), which will broadly raise the bar for companies, regardless of where located, that handle the personal information of California consumers.  The law is scheduled to go into effect on January 1, 2020, with enforcement delayed until July 1, 2020 (or possibly later, see below),[132] and is projected to affect over 500,000 companies.[133]  Taking a cue from the EU’s General Data Protection Regulation (“GDPR”), the CCPA represents a much more comprehensive and stringent approach to data privacy than most existing privacy laws in the United States.
Since its passage, however, various concerns have been raised about the law, which was hastily enacted to prevent an even more onerous privacy initiative from being presented to voters on the November 2018 ballot.[134]  Since its passage, the CCPA has been amended once, and further amendments are expected prior to its effective date.[135] The CCPA requires businesses that collect personal information relating to California consumers to, among other things: (1) disclose what personal information is collected and the purposes for which that information is used; (2) delete a consumer’s personal information if requested to do so, unless it is necessary for the business to maintain such information for certain purposes; (3) disclose what personal information is sold or shared and to whom; (4) stop selling a consumer’s personal information if requested to do so (i.e., the “right to opt out”), unless the consumer is under 16 years of age, in which case the business is required to obtain affirmative authorization to sell the consumer’s information (i.e., the “right to opt in”); and (5) not discriminate against a consumer for exercising any of the aforementioned rights, including by denying goods or services, charging different prices, or providing a different level or quality of goods or services, subject to certain exceptions.[136] With one exception, the CCPA does not include a private right of action, and thus the law will largely be enforced by the California Attorney General (as opposed to consumers filing private lawsuits).  
The exception is that consumers whose non-encrypted or unredacted personal information has been accessed, exfiltrated, stolen, or disclosed “as a result of the business’ violation of the duty to implement and maintain reasonable security procedures and practices appropriate to the nature of the information to protect the personal information” may initiate a civil lawsuit.[137] The Attorney General has until July 2020 to develop and publish implementing regulations for the CCPA and cannot enforce the law until six months after their publication or July 1, 2020, whichever comes first.[138]  Given the rulemaking timeline, enforcement is not expected to begin before mid-2020. For a detailed discussion of the original Act, see our July 12, 2018 client alert.  For a discussion of the September amendments, see our October 5, 2018 client alert. c.    Other California Legislation In September 2018, California also passed an expansive law regulating “connected devices,” broadly defined as any device with the ability to connect to the internet that is assigned an Internet Protocol (“IP”) or Bluetooth address.[139]  The law requires that every connected device have a “reasonable security feature” that is: “(1) Appropriate to the nature and function of the device.  (2) Appropriate to the information it may collect, contain, or transmit.
(3) Designed to protect the device and any information contained therein from unauthorized access, destruction, use, modification, or disclosure.”[140] The law does not specify what a reasonable security feature entails, although it does indicate that a connected device has a reasonable security feature if it is “equipped with a means for authentication outside a local area network” with a “preprogrammed password [] unique to each device manufactured” or “contains a security feature that requires a user to generate a new means of authentication before access is granted to the device for the first time.”[141]  Beyond these examples, future enforcement actions will likely flesh out what qualifies as a reasonable security feature. d.    Other Cybersecurity Legislation Ten states also enacted legislation in 2018 relating to consumers requesting security freezes on their credit reports.[142]  These statutes largely require credit reporting agencies to honor consumers’ requests to freeze their credit reports without charging a fee, in order to provide added security in the event of a data breach. But not every legislative action has afforded heightened consumer protection.  For example, effective November 2, 2018, Ohio enacted a “safe harbor” provision that allows businesses that have established written cybersecurity programs meeting industry-standard requirements to claim an affirmative defense against tort claims alleging a failure to implement reasonable security controls.[143]  The law notably does not impose liability on businesses that fail to meet the standard; rather, it can only be used as an affirmative defense.[144] II.    CIVIL LITIGATION A.    Privacy Litigation 1.    Class Action Litigation a.    High-Profile Incidents and Related Litigation in 2018 In 2018, there were numerous attacks on technology, hospitality, retail, healthcare and other companies that exposed personal data and resulted in litigation. i.        Technology Companies Social Media Network.
On March 17, 2018, the New York Times reported that British political consulting firm Cambridge Analytica had obtained information on more than 50 million users of Facebook.[145]  Shareholders brought several derivative lawsuits that were consolidated in the Northern District of California, and the social media network’s motion to dismiss is currently pending before the court.[146]  A number of consumer class actions were also consolidated in the Northern District of California.  At the end of December 2018, the social media network filed its reply in support of its motion to dismiss, which contends, among other arguments, that the plaintiffs have not suffered any actual or concrete harm.[147] Social Media Network.  In October 2018, Google announced that it was shutting down its social network following a report by the Wall Street Journal that a bug had exposed the profiles of hundreds of thousands of users for three years.[148]  On December 10, 2018, the company disclosed another bug that had exposed the profile data of 52.5 million users, including data such as name, age, email address, and occupation.[149]  The company now faces a class action complaint in the Northern District of California,[150] and several shareholder lawsuits in New York and California federal courts, including one brought by an investment fund owned by the state of Rhode Island.[151] ii.      Political Breaches Russia’s alleged interference with the 2016 presidential election continued to draw attention throughout the year.  On May 8, 2018, the Senate Intelligence Committee released a briefing that concluded Russia was engaged in efforts to undermine the integrity of the 2016 elections.[152]  It stated that Russian hackers had breached the security of election computers in several states and were “in a position to” alter voter registration data.
There was ultimately, however, no evidence that Russia actually changed vote tallies or the registration information of voters.[153]  On July 13, 2018, the U.S. Department of Justice indicted twelve Russian intelligence officers for hacking the systems of the Democratic Congressional Campaign Committee, the Democratic National Committee (“DNC”), and Hillary Clinton’s presidential campaign, as well as for conspiring to hack state boards of elections and U.S. companies that supplied election software.[154] On April 20, 2018, the DNC brought suit against the Russian government, Donald J. Trump for President, Inc., and Wikileaks in a New York federal court.  The complaint asserts that the defendants conspired to hack the Democratic Party in order to benefit President Trump’s campaign, including by illegally accessing the DNC’s emails, donor information, opposition research, and strategic plans.[155]  On December 10, 2018, Donald J. Trump for President, Inc. and Wikileaks moved to dismiss the suit on First Amendment grounds, and for failure to adequately state a claim.[156] iii.    Consumer Information International Hotel Management Company.  On November 30, 2018, an international hotel management company announced a data breach that potentially exposed information on approximately 500 million guests.[157]  Following the announcement, consumers filed putative class actions in Maryland, Illinois, Massachusetts, California, and New York.[158]  A shareholder also brought suit in a New York federal court.[159] Sports Apparel Company.
On March 30, 2018, a fitness apparel brand announced that an “unauthorized party” had acquired account information, including usernames, email addresses, and hashed passwords, for around 150 million users of its fitness-tracking app.[160]  The hacker did not gain access to payment card data because the company stored that information separately.[161]  In the wake of the disclosure, an app user initiated a class action lawsuit, and the company is currently seeking to arbitrate the matter.[162]

Department Stores.  On April 2, 2018, the owner of two national department stores announced an attack potentially affecting five million customers.[163]  Hacker group JokerStash Syndicate claimed that it stole five million credit card and debit card numbers and had been releasing them for sale on the dark web.[164]  JokerStash has been linked to several past breaches, including those of a national grocery chain and a fast casual restaurant chain.[165]  Consumers filed several class actions in the wake of the April announcement, including in New York, California, Delaware, and Tennessee federal courts.[166]  On August 1, 2018, the Judicial Panel on Multidistrict Litigation rejected a request by one of the plaintiffs to centralize the lawsuits in New York.[167]

On April 4, 2018, a separate department store revealed that a cyberattack on its online customer service vendor exposed the payment information of 100,000 customers.[168]  An airline company also used the same vendor and estimated that the incident may have affected several hundred thousand of its customers as well.[169]  Consumers filed at least one class action lawsuit following the announcement.[170] And after a data breach left consumer data exposed from April to June 2018, a customer of another large department store brought a complaint in an Alabama federal court alleging that the company failed to adequately protect consumer data such as names, addresses, and credit card numbers.[171]

iv.    Healthcare

Data breaches of healthcare data have been rising for years, according to a study of annual health data breaches, and 2018 was no different.[172]  On July 30, 2018, a hospital health system suffered a phishing attack that impacted the records of 1.4 million patients.[173]  Some of the healthcare provider’s employees had transmitted their login credentials in response to an email that falsely appeared to be from a company executive.[174]  The attack exposed information such as patient names, dates of birth, medical and treatment information, lab results, social security numbers, driver’s license numbers, insurance information, and payment information.[175]  This was the second successful phishing attack on the hospital in 2018.[176]  The first breach affected 16,000 patients.[177]  The hospital group faces a class action in Wisconsin federal court and has moved to dismiss the matter on the theory that the plaintiffs did not properly plead traceable harm.[178] Other healthcare providers also suffered data breaches that exposed the patient data of more than 500,000 patients each.[179]

b.    Update on High-Profile Data Breach Cases from Prior Years

i.    District Court Litigation

Consumer Credit Reporting Agency.  A consumer credit reporting agency faced a series of lawsuits following a hack of its computer system in 2017 that exposed the names, social security numbers, addresses, and other PII of more than 140 million people.[180]  On December 6, 2017, the class actions were consolidated in the Northern District of Georgia.[181]  Since then, the agency has been fighting to dismiss various parts of the cases.  
On July 17, 2018, the agency moved to dismiss dozens of banks’ and credit unions’ claims on standing grounds, arguing that the financial institutions had not adequately alleged that any fraudulent charges had been made on payment cards they had issued.[182]  And on July 30, 2018, the agency moved to dismiss the small business plaintiffs on standing grounds as well, arguing that businesses cannot bring claims arising from injuries that their owners allegedly suffered.[183]  On December 14, 2018, the court heard oral argument on the motion to dismiss the consumer, financial institution, and small business plaintiffs.[184]

Restaurant Chain.  A restaurant chain faced lawsuits by financial institutions as a result of a 2017 data breach that affected its payment card data.[185]  On October 24, 2018, the federal court in Colorado dismissed the majority of the claims, including those under negligence and trade secret law.[186]  The court allowed the counts under California’s unfair competition law, New Hampshire’s consumer protection law, and tort law to proceed.[187]

ii.    Appellate Litigation

Federal Agency.  In 2017, the District Court for the District of Columbia dismissed a class action suit filed after the Office of Personnel Management (“OPM”) suffered a breach that affected the data of past and present U.S. government employees.[188]  In May 2018, various groups, including federal employee unions and privacy organizations, urged the D.C. Circuit to revive the litigation.[189]  The groups disagreed with the district court’s finding that the plaintiffs had not pleaded an actual injury and lacked Article III standing.[190]  On November 2, 2018, the D.C. Circuit heard oral argument, and the parties are awaiting a decision.[191]

Health Insurance.  In 2017, the D.C. 
Circuit ruled that members of a Maryland insurer could proceed with a class action lawsuit alleging that their personal information was stolen in a 2014 data breach.[192]  The parties disputed whether the plaintiffs had sufficient Article III standing based on a substantial risk of non-imminent future harm.[193]  On February 20, 2018, the Supreme Court denied without comment the company’s petition for certiorari, which, if granted, would have made it the first data breach case to reach the Supreme Court.[194]  In its petition, the company had argued that Supreme Court guidance was necessary to resolve a circuit split over whether the exposure of personal data satisfied the standing requirement.[195]

Online Retailer.  On March 8, 2018, the Ninth Circuit revived a class action filed in response to a 2012 data breach that affected the data of 24 million online shoppers.[196]  The panel ruled that the plaintiffs had adequately shown standing because of the sensitivity of the information exposed (including credit card numbers) and the risk of identity theft, phishing, and pharming.[197]  On August 20, 2018, the retailer filed a petition for certiorari and urged the Supreme Court to resolve a circuit split over whether exposure to a data breach constitutes an injury under Article III.[198]

National Bookstore.  On April 11, 2018, the Seventh Circuit revived a proposed class action alleging that a large bookstore failed to secure customers’ financial data during a 2012 security breach.[199]  The panel held the customers adequately pleaded injuries, which included money spent on credit-monitoring services and time spent “to set things straight.”[200]  The panel sent the proceedings back to the lower court to adjudicate the merits and decide whether the proposed class should be certified.[201]

c.    Circuit Split on Standing in Post-Spokeo 2018

Following the Supreme Court’s decision in Spokeo, Inc. v. 
Robins,[202] circuits continue to be split over how to satisfy the Article III standing requirement in data breach cases.  The D.C. Circuit in Attias v. CareFirst Inc. found that the mere exposure of personal information and risk of identity theft are sufficient to demonstrate standing.[203]  CareFirst filed a petition for certiorari, arguing that the circuit split was ripe for clarification; on February 16, 2018, the Supreme Court denied certiorari.[204] The Ninth Circuit also ruled in In re Zappos that victims of a data breach adequately pleaded standing because the information exposed in the data breach (credit card numbers) was sensitive, and because stolen data could be used to harm the plaintiffs.[205]  The Ninth Circuit reasoned that Clapper v. Amnesty International was not applicable because that case involved a challenge to surveillance procedures authorized by the Foreign Intelligence Surveillance Act, and thus involved a more speculative threat of identity theft and unique national security concerns.[206]  Zappos subsequently submitted a petition for certiorari, and it remains to be seen whether the high court will take up the case.[207]  The petition was distributed on November 20, 2018, which suggests that the Court may come to a decision soon. By contrast, the Second, Fourth, and Eighth Circuits have found that the risk of identity theft or credit card fraud does not constitute a concrete harm.[208]

d.    Shareholder Derivative Suits

Data privacy incidents typically spark both class actions brought by consumers (like those discussed above) and suits brought by shareholders.  This year was no different. 
First, a restaurant chain settled a derivative action pending in the Southern District of Ohio brought by certain shareholders before there was even a consolidated complaint and before the court had assessed whom to appoint as lead counsel.[209]  The lawsuit stems from the company’s disclosure that malware had affected the company’s point-of-sale system, enabling credit card data to be stolen from 300 of its franchised restaurants.[210]  The settlement agreement, which is currently pending approval by the district court, does not provide for the payment of any funds, but instead would require the company to implement certain remedial cybersecurity measures to prevent future breaches.[211]  Notably, the settlement provides for a newly created board-level Technology Committee with oversight over the company’s cybersecurity and information technology, requires the company to maintain its advisory council of franchisee representatives, and requires the company to either provide certain foundational security services to its franchisees or designate an approved vendor for such services.[212]

In the wake of this settlement, two prominent class action securities fraud lawsuits were filed following data security incidents:

Technology Company. 
After Google announced on October 8, 2018, that a problem with its software had exposed the personal profile data, including names, email addresses, birth dates, profile photos, places lived, occupation, and relationship status, of nearly half a million users, several shareholders filed a proposed class action for damages under the Securities Exchange Act.[213]  The plaintiffs allege that by not disclosing the technical problem back in March 2018 when it was first discovered, and instead disclosing in October after reports in The Wall Street Journal, the company deceived investors and caused shares to be traded at inflated prices.[214]  Once the court appoints a lead plaintiff and lead counsel, the parties will meet and confer regarding the filing or designation of an operative complaint.[215]

Education Materials and Service Provider. On September 25, 2018, a company providing educational materials and services to high school and college students announced that an unauthorized party had gained access to the user data of approximately 40 million users, causing the company’s share price to fall significantly.  Days later, plaintiff shareholders filed a class action lawsuit in the Northern District of California against the company.[216]  The complaint alleges that the defendants’ failure to disclose in the company’s quarterly press release that it did not maintain sufficient data security measures constituted materially false or misleading statements, and it points to the fall in the company’s share price following disclosure of the data breach to support its allegation that share prices were artificially inflated by the conduct.[217]  On December 10, 2018, the case was consolidated with a substantively similar putative class action complaint before Judge Charles Breyer of the Northern District of California, who will appoint a lead plaintiff and lead counsel.[218]

2.    Settlements

In 2018, companies reached settlements over some of the largest data breaches on record.  
In addition, the Supreme Court considered the legality of the cy pres settlement at issue in a 2013 privacy class action against Google.

a.    Health Insurer’s Settlement

In 2015, a large health insurer announced that hackers had gained access to the names, birth dates, social security numbers, home addresses, and other personal information of approximately 79 million people.[219]  In August 2017, the Northern District of California preliminarily approved a settlement of the numerous class action lawsuits brought by consumers.[220]  On August 15, 2018, Judge Koh approved the settlement.[221]  In a lengthy opinion, she determined that the settlement avoided the risk, expense, and duration of further litigation, and that the amount was adequate when compared both to other data breach settlements and to the damages calculation of the plaintiffs’ expert.[222]

b.    Consumer Credit Reporting Agency’s Settlement

In September 2015, a mobile telecommunications company announced that the data of 15 million customers had been hacked on databases belonging to a consumer credit reporting agency, which the company used to conduct its credit checks.[223]  The breach compromised names, addresses, and dates of birth, and may have implicated social security and driver’s license numbers.[224]  The telecommunications customers brought negligence claims against the credit reporting agency, and 32 cases were consolidated in December 2015 in the Central District of California.[225]  On November 12, 2018, the plaintiffs moved for preliminary approval of a $22 million settlement fund.[226]  The settlement would also include credit monitoring services and an additional $11.7 million in remedial and enhanced security measures.[227]

c.    
Supreme Court’s Review of a 2013 Cy Pres Award

During this term, the Supreme Court considered the legality of cy pres-only settlements, which provide no direct compensation to class members and instead distribute settlement proceeds to public interest organizations that further the interests served by the class action litigation.  On October 31, 2018, the Supreme Court heard oral arguments regarding the legality of a 2013 settlement for a privacy class action that claimed a large technology company shared users’ search queries with website owners.[228]  Per the settlement terms, of the $8.5 million settlement amount, $5,000 would go to three named plaintiffs, $2.15 million to class counsel, and $5.3 million to various internet privacy non-profit organizations.[229]  The Competitive Enterprise Institute, a conservative think tank, argued that the settlement violated Federal Rule of Civil Procedure 23(e), which requires that class action settlements be “fair, reasonable, and adequate.”[230] During arguments, Justices Sotomayor and Breyer seemed to suggest the current system was working, as courts rarely approve cy pres-only settlements.  Chief Justice Roberts and Justices Alito and Kavanaugh expressed doubt about whether distributions to cy pres beneficiaries, who often had connections to class counsel rather than to the class members, actually constituted relief.[231]  Several justices also queried whether the plaintiffs had standing to bring the class action in the first place.[232]  In a rare move, the Court responded by ordering the parties to brief the issue following oral arguments.[233]  A decision on the matter is forthcoming.

d.    Comparison of Settlements of Data Breach Claims from 2015-2018

To place this year’s settlements in historical context, below are details of a number of significant settlements over the past few years. 
Health Insurer[234]
Approval: August 15, 2018
Data Type: Personal Information
Relief to the Class: $115 million for, among other things, class members’ out-of-pocket expenses and credit monitoring services; security practice changes
Service Awards, Fees, & Costs: Up to $3 million in costs and $37.95 million in fees, to be covered by the $115 million settlement payment

Home Improvement Retailer (Financial Institution Class)[235]
Approval: September 22, 2017
Data Type: Card Data
Relief to the Class: $25 million for class claims; up to $2.25 million to certain sponsored entities; security practice changes
Service Awards, Fees, & Costs: Up to $2,500 for each class representative; $710,000 in litigation costs; $15.3 million in fees

Home Improvement Retailer (Consumer Class)[236]
Approval: August 23, 2016
Data Type: Card Data
Relief to the Class: Up to $13 million for class claims; up to $6.5 million for 18 months of credit monitoring services; security practice changes
Service Awards, Fees, & Costs: $1,000 for each representative plaintiff; $166,925 in costs; $7.536 million in fees

Department Store (Financial Institution Class)[237]
Approval: May 12, 2016
Data Type: Card Data
Relief to the Class: Up to $20.25 million for class claims; $19.108 million to MasterCard; reportedly up to $67 million for Visa’s claims against Target[238]
Service Awards, Fees, & Costs: $20,000 for 5 representative plaintiffs; $2.109 million in costs; $17.8 million in fees

Entertainment Company[239]
Approval: April 6, 2016
Data Type: Login and Personal Information
Relief to the Class: Up to $2 million for preventative losses; up to $2.5 million for claims for identity theft losses; up to two years of credit monitoring services
Service Awards, Fees, & Costs: $3,000 for each named plaintiff; $1,000 for each plaintiff who initially filed an action; $2.588 million in fees

Healthcare Services Company[240]
Approval: February 3, 2016
Data Type: Health Information
Relief to the Class: $7.5 million in cash payment; up to $3 million for class claims; one year of credit monitoring services (offered during remediation); security practice changes
Service Awards, Fees, & Costs: $50,000 in incentive payments for class representatives; $7.45 million in fees and costs

Department Store (Consumer Class)[241]
Approval: November 17, 2015
Data Type: Card Data
Relief to the Class: Up to $10 million for claims; security practice changes
Service Awards, Fees, & Costs: $1,000 for three deposed plaintiffs; $500 for other plaintiffs; $6.75 million in fees

Social Networking Service[242]
Approval: September 15, 2015
Data Type: Login Information
Relief to the Class: Up to $1.25 million for claims; security practice changes
Service Awards, Fees, & Costs: $5,000 for the named plaintiff; $26,609 in costs; $312,500 in fees

Computer Software Company[243]
Approval: August 13, 2015 (voluntary dismissal)
Data Type: Login and Card Data
Relief to the Class: Security practice changes and audit
Service Awards, Fees, & Costs: $5,000 to each individual plaintiff; $1.18 million in fees

Entertainment Company[244]
Approval: May 4, 2015
Data Type: Card Data and Personal Information
Relief to the Class: Up to $1 million for identity theft losses; benefit options including free games and themes or month subscription, unused wallet credits, virtual currency; some small cash payments
Service Awards, Fees, & Costs: $2.75 million in fees

B.    Interceptions and Eavesdropping

1.    Email Scanning

This year saw fewer developments than 2017 in class action lawsuits alleging that technology companies violated state and federal laws by scanning user emails.  Nonetheless, companies operating electronic communications services should continue to monitor such lawsuits, as they concern potentially massive proposed classes including all or many users of such services.

Email Web Service.  
On June 6, 2018, a web service that unsubscribes users from mailing lists, newsletters, and other unwanted emails prevailed on consent grounds in its motion to dismiss claims under the ECPA, the SCA, and California’s Invasion of Privacy Act (“CIPA”).[245]  The plaintiffs asserted that the web service intercepted and accessed users’ emails without consent or authorization, or exceeded authorization by accessing emails for the purpose of extracting and selling consumer data.[246]  The court noted that “[a]ll of the Complaint’s statutory claims depend on a lack of consent.”[247]  Plaintiffs alleged that they consented for the web service “to access their emails only for the limited purpose of cleaning up their inboxes, and that they did not allow [the company] to sell their data for market research purposes.”[248]  But the court rejected this proposition because “the privacy policy reserves the right to do exactly what [the company] did:  ‘collect and use your commercial transactional messages and associated data to build anonymous market research products and services with trusted business partners.’”[249]

2.    Call Recording

In recent years, there have been a number of civil and criminal cases brought against both businesses and individuals for recording phone calls without the requisite consent.  The recording of telephone conversations is governed by a patchwork of federal and state law.  At the federal level, the Wiretap Act permits the recording of phone calls, so long as one party to the call consents to the recording.[250]  The vast majority of states have similarly adopted a “one-party” consent requirement, while a minority of states have adopted either a “two-party” or “all-party” consent requirement.  Most of the call recording cases brought in recent years have been against companies for large-scale recordings of commercial calls, rather than individual illicit recordings. 
Although nearly a dozen states have all-party consent laws, lawsuits for call recording under the CIPA continue to expand.[251]  The growing trend of recording employee and customer calls for quality assurance purposes, combined with ongoing notice and consent issues, ensures that this will be an evergreen target for lawsuits.  Furthermore, in 2016, the U.S. District Courts for the Southern District of California and the Central District of California determined that non-California plaintiffs may assert CIPA claims against California defendants where the alleged violations occurred in California.[252]  These developments escalate potential liability risk and encourage businesses to remain attentive.

Banking Institution.  On July 10, 2018, the Western District of Pennsylvania determined that the plaintiff had alleged sufficient facts to establish claims for intentional interception of a wire communication and for invasion of privacy under Pennsylvania common law, denying the bank’s motion to dismiss.[253]  The plaintiff alleges that the bank used an automated system to make and record over 35 debt collection calls to his cellular phone without his consent or any prior business relationship.[254]  Each call notified the plaintiff that recording would occur, but the court (noting Pennsylvania’s longstanding two-party consent rule) was not persuaded by an implied or implicit consent theory because “Plaintiff was a party to multiple telephone calls in which he actively exchanged words with Defendant; accordingly, his consent was required to record the calls,” and because “[Plaintiff] explicitly declined to consent to interception of his calls by Defendant.”[255]

Banking Institution.  
On January 16, 2018, a California Court of Appeal reversed summary judgment granted to a large bank on CIPA call recording claims brought by the mother of an employee.[256]  The plaintiff alleged that the bank recorded 316 phone calls between her and her daughter, and one call with her daughter’s coworkers, on a company phone line.[257]  The employer’s “Electronic Monitoring and Device Use” policy authorized its employees to use company telephones for personal calls and expressly warned that their “personal calls may be recorded.”[258]  The employer argued that “it did not ‘intentionally’—for purposes of sections 632(a) and 632.7(a)—record” and that “‘the mere act of [an employer] installing a recording device on company phones and ‘by chance’ recording non-work related calls between [Rojas] and [her d]aughter does not satisfy the ‘intentional’ requirement of [s]ections 632 and 632.7.’”[259]  The Court of Appeal disagreed, citing guidance from the California Supreme Court:  “the recording of a confidential conversation is intentional if the person using the recording equipment does so with the purpose or desire of recording a confidential conversation, or with the knowledge to a substantial certainty that his use of the equipment will result in the recordation of a confidential conversation.”[260]  Under this standard, the court determined that the employer failed to meet its burden of showing that it lacked the requisite intent.[261]

3.    Other “Interceptions”

In the Internet of Things (“IoT”) age, new technologies allow for new forms of surreptitious recording and tracking.  This year saw a number of developments, including new, creative theories of Wiretap Act violations.

Television Manufacturer.  
On October 4, 2018, a defendant television manufacturer entered into a settlement agreement in response to a putative class action in the Central District of California.[262]  In the original lawsuit, the plaintiffs alleged that the company violated the ECPA and the Video Privacy Protection Act (“VPPA”), and asserted several state law fraud, negligent misrepresentation, and consumer protection claims.[263]  The plaintiffs alleged that the defendant used smart TVs to secretly collect, and distribute to advertisers, information on customer viewing habits so that advertisers could deliver targeted advertising in real time.[264]  The plaintiffs’ second consolidated complaint alleged that the television software took samples of the programming displayed at any point in time and sent “fingerprints” of those samples to a centralized matching server to compare against already existing fingerprints in the database, a process that operates sufficiently fast to provide “at least some context-sensitive content substantially simultaneously with at least one targeted video.”[265]  The October 4, 2018 settlement agreement requires the defendant to pay a $17 million settlement and take several additional steps to remedy the situation (including changing its disclosures for new customers, and adding a disclosure to the guide that accompanies new TV purchases).[266]

Mattress Company, Men’s Retailer, and Outdoor Retailer.  
On July 12, 2018, a mattress company, a men’s clothing company, and an active outdoor retailer faced allegations of using certain software to de-anonymize online users and “observe their keystrokes, mouse clicks, and communications with the e-commerce retailers’ websites.”[267]  The plaintiff, alleging that the code functions as a “wiretap,” brought a putative class action in the Southern District of New York, alleging that the use of such software violated the Wiretap Act and the ECPA, as well as the SCA and New York’s General Business Law.[268]  But the defendants prevailed on a motion to dismiss.  The court determined that the plaintiffs’ Wiretap Act claims fail “because § 2511 is a one-party consent statute . . . . [and i]t is clear that the Retailers were parties to the communications and [the defendant] had their consent.”[269]  The plaintiff’s ECPA claims failed because “there is no private cause of action under § 2512.”[270]  The SCA claims failed for insufficient pleadings:  the plaintiff “offers nothing more than ‘labels and conclusions’ that the communications were held in electronic storage.”[271]

Mortgage Lender.  On November 9, 2018, a mortgage lender faced a similar lawsuit (with the addition of a common law intrusion upon seclusion claim) for allegedly using similar de-anonymizing and keylogging software.[272]  Again, the defendant prevailed on a motion to dismiss because the plaintiff “admit[ted] that any allegedly intercepted communications were made on” the company’s website, making the defendant “a party to the communication.”[273]  The court dismissed without prejudice the plaintiff’s state common-law claim for intrusion upon seclusion because “it appears that [the plaintiff] might be able to allege sufficient facts for this Court to exercise original jurisdiction over his intrusion upon seclusion claim.”[274]

C.    
Telephone Consumer Protection Act

As in past years, 2018 included several notable actions and developments under the Telephone Consumer Protection Act (“TCPA”).[275] The highlight came in March, when the D.C. Circuit published its long-anticipated opinion in ACA International v. FCC.[276]  That case interpreted the FCC’s 2015 omnibus Declaratory Ruling and Order (the “omnibus order”), which, among other things, defined what qualifies as an “automatic telephone dialing system” (“ATDS”).[277]  The applicable statute defines an ATDS as “equipment which has the capacity—(A) to store or produce telephone numbers to be called, using a random or sequential number generator; and (B) to dial such numbers.”[278]  The D.C. Circuit vacated two of the omnibus order’s interpretations of this definition:

First, ACA International held that the omnibus order unreasonably defined “capacity.”  Under the omnibus order, whether equipment has the “capacity” to qualify as an ATDS turns on the equipment’s potential functionality, rather than its current capabilities.  In concluding this definition was unreasonable, the court emphasized that “any smartphone, with the addition of software, can gain the statutorily enumerated features of an autodialer.”[279]  Accordingly, if a device’s “capacity” under the TCPA turns on its potentiality, then “under the Commission’s approach” “all smartphones . . . meet the statutory definition of an autodialer”—an “untenable” result.[280]

Second, ACA International vacated the omnibus order’s definitions of when a device can (1) “store or produce telephone numbers to be called, using a random or sequential number generator” and (2) “dial such numbers,”[281] because the omnibus order failed to make clear how a device meets these two requirements.  
In some places, “the order convey[ed] that equipment needs to have the ability to generate random or sequential numbers that it can then dial” to meet these requirements.[282]  But other times, the order suggested “equipment c[ould] meet the statutory definition even if it lacks that capacity.”[283]  Though the court acknowledged it “might be permissible for the Commission to adopt either interpretation,” endorsing these competing contentions at the same time fell below the bar of reasoned decision-making.[284] In the months since ACA International, courts have started to grapple with important questions left unanswered by the decision.  In particular, there is an emerging circuit split over what functionality a device must have in order to qualify as an ATDS. In Dominguez v. Yahoo, the Third Circuit concluded a device that sent text messages to phone numbers manually entered into the system did not qualify as an ATDS.[285]  This was because the system did not have the “present capacity to function as an autodialer by generating random or sequential telephone numbers and dialing those numbers.”[286]  Dominguez thus stands for the proposition that a device can only qualify as an ATDS under the Act when the device has the present capacity to place calls to randomly generated or sequential numbers.[287] The Ninth Circuit, in contrast, defined ATDS far more broadly in Marks v. 
Crunch San Diego.[288]  That case centered on a web-based marketing platform designed to send promotional text messages to a list of stored telephone numbers.[289]  Relying on the context and structure of the TCPA, the court held that a device that calls a stored list of numbers—rather than numbers generated randomly or sequentially—could, in fact, qualify as an ATDS.[290]  In so doing, the court relied on two aspects of the TCPA:  (1) the fact that, in other provisions, the Act allowed an ATDS to call selected numbers, and (2) the fact that when Congress amended the statute, it did not amend the definition of an ATDS, even though under the amended statute “equipment that dial[ed] from a list of individuals who owe a debt to the United States” was, in fact, an ATDS, although it was exempted from the TCPA.[291]

The FCC has taken notice of these decisions.  On May 14, 2018, the FCC sought public comments on the open questions that ACA International raised.[292]  And just a few months after that, on October 3, 2018, the FCC issued another notice seeking public comments on the TCPA’s definition of ATDS in light of Marks v. Crunch San Diego.[293]  We expect the FCC may publish a new order interpreting this TCPA issue sometime in 2019.

Related to FCC interpretation of the law, the Supreme Court recently granted the petition for certiorari in PDR Network v. Carlton & Harris Chiropractic, a case that raises important questions about a 2006 FCC Rule interpreting the term “unsolicited advertisement.”[294]  The district court concluded that it need not defer to and apply the 2006 Rule because it unambiguously contradicted the statute.[295]  The Fourth Circuit reversed the decision.  It held that Chevron’s deferential framework[296] did not apply in this context because the Hobbs Act grants the D.C. 
Circuit exclusive jurisdiction to enjoin, set aside, suspend, and determine the validity of “FCC interpretations of the TCPA.”[297]  It follows, the Fourth Circuit reasoned, that it (and every circuit besides the D.C. Circuit) lacks jurisdiction to set aside the FCC’s interpretations of the Act and, instead, must defer to them.[298]  Nevertheless, on November 13, 2018, the Supreme Court granted a petition for a writ of certiorari on the following question:  whether the Hobbs Act required the district court in this case to accept the FCC’s legal interpretation of the TCPA.[299]  The argument date has not yet been set.

Finally, Congress has also taken a keen interest in the TCPA.  Indeed, both the House and the Senate are considering new legislation that would amend the TCPA.  The Stopping Bad Robocalls Act, introduced in the House by Congressman Frank Pallone Jr. and in the Senate by Senator Edward J. Markey, would replace the definition of ATDS with “robocall.”[300]  This new definition would explicitly cover devices that make calls using “numbers stored on a list.”[301]  Similarly, Massachusetts Senator Ed Markey and South Dakota Senator John Thune introduced the TRACED Act in the Senate, which would (1) “broaden[] the authority of the FCC to levy civil penalties of up to $10,000 per call” for those “who intentionally flout telemarketing restrictions,” and (2) extend the window for the FCC to take civil enforcement action against intentional violations to up to three years after a robocall is placed, instead of one year.[302]  We will be watching carefully whether either of these bills gains traction.

D.    Video Privacy Protection Act

Compared to the TCPA, there were relatively few significant developments in 2018 regarding the VPPA. In fact, one of the only notable decisions on the VPPA this year was a district court decision in White v. 
Samsung Electronics America, which followed binding Third Circuit precedent and dismissed the plaintiff’s claim that smart TVs violated the VPPA because the TVs “monitor[ed] and track[ed] consumers’ viewing habits and record[ed] consumers’ voices” and “then transmit[ted] that information.”[303]  The court based its decision on the fact that, in the Third Circuit, data points such as IP address, MAC address, and WiFi access point do not qualify as PII.  And since the data the smart TVs allegedly recorded was “the same type of static digital identifiers the Third Circuit ha[d] determined” did not constitute PII, the court dismissed the complaint.[304] In last year’s Review, we also discussed certain circuits’ (namely, the First, Third, and Ninth Circuits’) approach to the scope of PII, which the Act defines as including “information which identifies a person as having requested or obtained specific video materials or services from a video tape service provider.”[305]  This year, no additional circuits considered the definition of PII in the context of the VPPA, but we expect that other circuits will do so in coming years. E.    Biometric Information Privacy Acts The biometric technology space also was active this year.  As corporations and institutions increasingly incorporate the use of biometric information in their technologies and operations, states have raced to craft policies protecting the privacy rights of their residents. These issues were frequently in the headlines.
For example, earlier this year, the iconic Madison Square Garden began using facial recognition technology to enhance its security by cross-referencing the faces of individuals entering the stadium with photographs in a database of individuals who have caused security issues in the past.[306]  Similarly, news broke in mid-December that Taylor Swift is using facial recognition at her concerts to identify stalkers.[307]  And a major credit card company introduced new technology that allows users to scan their fingerprints onto a biometric card from their homes.[308]  The saved fingerprint scan obviates the need for users to remember their PIN or provide a signature to authenticate transactions; users need only their thumbs. In light of this increasing use, and given the sensitivity of these data (which, unlike a password, cannot be changed), several states have enacted biometric information privacy acts (“BIPAs”).  Illinois, Texas, and Washington were the first, and most recently, California enacted the California Consumer Privacy Act of 2018 (“CCPA”)—discussed further in Section I.B.2. above—which explicitly includes “biometric information” in its definition of “personal information.”[309] While Texas’s and Washington’s BIPAs do not confer a private right of action, leaving enforcement to the state Attorneys General (similar to the current version of the CCPA), Illinois’s does.[310]  As in past years, therefore, BIPA cases in Illinois were a source of active litigation, with at least thirteen cases actively litigated during 2018.  Generally speaking, these cases were brought by either employees objecting to their employer’s practice of collecting biometric information, or consumers objecting to companies doing the same.  Many of the key conflicts this year were focused on constitutional or statutory standing, the latter of which the Illinois Supreme Court resolved just days ago. Earlier in the year, in Sekura v.
Krishna Schaumburg Tan, Inc.,[311] a tanning salon customer brought BIPA claims against the salon for allegedly collecting her fingerprints without providing the statutorily required disclosure concerning the salon’s retention policy, and for allegedly disclosing her fingerprints to a third-party vendor.[312]  Plaintiff initially survived the salon’s motion to dismiss, but shortly thereafter, the Appellate Court of Illinois, Second District, held in 2017 in Rosenbach v. Six Flags Entertainment Corporation[313] that standing under the Act required an “injury or adverse effect” in addition to a violation of the Act.  This caused the trial court to reconsider and grant dismissal.[314]  Plaintiff appealed, but prior to the hearing, the U.S. District Court for the Northern District of Illinois held in Dixon v. Washington and Jane Smith Community—Beverly[315] that the disclosure of personal information to a third-party vendor constitutes an injury-in-fact, and therefore satisfies plaintiffs’ standing burden under Article III.
Because Plaintiff alleged just that, the Illinois Appellate Court reversed the dismissal and remanded the matter to the trial court.[316] The Illinois Supreme Court then weighed in on the Rosenbach case, determining that a plaintiff is “aggrieved” under BIPA and has statutory standing to sue, even without alleging an “actual injury or adverse effect.”[317]  The Court found that BIPA confers on individuals a substantive right to control their biometric information, and that no-injury BIPA violations are not merely “technicalities,” but “real and significant” harms to important rights created by the legislature.[318]  The Court also reasoned that the private right of action and remedies exist to prevent and deter violations of individuals’ BIPA rights, and that requiring would-be plaintiffs to wait to sue until they have suffered “actual injury” would defeat these purposes of the statute.[319]  Because the Rosenbach plaintiff alleged violations of his BIPA rights—Six Flags allegedly collected his fingerprints for use in a season pass without providing the statutorily mandated notices or publishing a data retention policy—the Supreme Court reversed the appellate court’s contrary conclusion and remanded the case to the trial court. Rosenbach may not be the final word on BIPA’s private right of action.  This year, the Illinois State Senate also will consider a bill narrowing the impact of Illinois’s BIPA.[320]  We will discuss any judicial or legislative BIPA developments in our next update. F.    Internet of Things and Device Hacking There have been a number of developments in the past year regarding connected devices—the Internet of Things (“IoT”)—as the internet has become more widely accessible on consumer products.
Indeed, it is estimated that 55 billion IoT devices will be in use worldwide by the year 2025.[321]  Many of these recent developments have involved attempts by policymakers to mitigate cybersecurity risks associated with connected devices, though the ever-changing nature of these threats has created challenges for regulators. 1.    Legislation a.    California Passes IoT Law In September, California became the first state to pass a cybersecurity law requiring security features for “smart” devices and IoT-connected products.[322]  On September 28, Governor Jerry Brown signed the two identical bills requiring manufacturers of connected devices to implement “reasonable security feature[s]” designed to protect the devices from unauthorized access.[323]  Devices that are accessible outside of a local network will be in compliance with the law if they either (1) have a unique, preprogrammed password or (2) require users to generate a new means of authentication before they first use the device.[324]  The bills require compliance by January 1, 2020 and exempt certain categories, such as third-party software.[325]  The bills do not create a private right of action for consumers.[326] The legislation has received mixed reviews since its enactment.
Some, like the California Manufacturers and Technology Association (“CMTA”), have characterized it as “[an] innovation-stifling measure[] [that] not only fail[s] to protect consumers, but will drive away California manufacturing investment.”[327]  Instead, the CMTA has recommended what it considers a fairer approach that “would ensure that all connected devices are compliant and secure, no matter where they are produced.”[328]  Similarly, Robert Graham, a security researcher, has criticized the law for “do[ing] little to improve security, while doing a lot to impose costs and harm innovation” because it requires manufacturers to add costly security features, rather than removing unsecure features.[329]  Others, like Bruce Schneier, Harvard University’s “security guru,” have lauded the bills for being a good first step in regulating a largely unregulated industry, saying, “[a] California law that manufacturers have to adhere to in California is going to help everybody.”[330] b.    United States Congress Considers IoT Legislation On November 28, the House of Representatives unanimously passed the SMART IoT Act, though the Senate did not pass the bill before the close of the 115th session of Congress.  
The Act would have required the Department of Commerce to conduct a study of the IoT industry in the United States, including identifying how the IoT is currently regulated, and to submit a report to Congress.[331] In addition, Congress is currently considering the following legislation aimed at regulating the IoT:

The Internet of Things Cybersecurity Improvement Act, which would require companies that transact with the federal government to ensure their IoT devices are patchable (i.e., able to be periodically upgraded), do not contain known vulnerabilities (or disclose known vulnerabilities), use standard network protocols, and do not contain hard-coded passwords;[332]

The Securing the IoT Act, which would require the FCC to create cybersecurity standards for certifying wireless equipment;[333]

The Developing Innovation and Growing the Internet of Things (DIGIT) Act, which would require the U.S. Secretary of Commerce to convene a “working group of Federal stakeholders” to create recommendations and a report to Congress on the IoT, and the FCC to obtain public comments regarding spectrum needs relating to the IoT;[334]

The Cyber Shield Act, which would create a voluntary labeling and grading system for IoT devices by requiring the Secretary of Commerce to establish a voluntary program to “identify and certify covered products with superior cybersecurity and data security through voluntary certification and labeling”;[335] and

The IoT Consumer Tips to Improve Personal Security Act, which would require the FTC to develop cybersecurity resources to educate consumers about the purchase and use of IoT devices.[336]

2.    Regulatory Guidance a.
Consumer Product Safety Commission Holds Public Hearing on IoT and Product Safety Issues; FTC Issues Comment in Response In May, the Consumer Product Safety Commission (“CPSC”) received testimony from a variety of stakeholders at a public hearing regarding issues relating to IoT safety, and subsequently solicited comments from the public.[337]  Among other things, the panelists highlighted the need to address product liability issues as well as privacy and security risks created by the IoT.[338] Per the request of the CPSC during this effort, the FTC issued a comment in June on the topic.[339]  The FTC advocated for flexible standards, suggesting “there is no ‘one size fits all’ approach to securing IoT devices,” that companies should engage in periodic risk assessments evaluating their security programs, that IoT manufacturers should have oversight over service providers, and that products should be continuously updated and patched to meet evolving security threats.[340]  The Commission also noted that its enforcement actions in the IoT space “send an important message to companies about the need to secure and protect internet-connected devices,” and that the FTC “continues to devote substantial resources in this area . . . to foster competition and innovation in the IoT marketplace while protecting the safety of consumers.”[341] b.    New FTC Commissioner Calls for More Robust Protections in IoT Space In October, FTC Commissioner Rebecca Slaughter offered suggestions for strengthening protections in the IoT space in a speech at the Internet of Things Global Summit.[342]  Slaughter noted that consumer trust in the IoT space includes both “ensuring that . . . 
devices are reasonably secure” and “ensuring that consumers have a clear and accurate picture of what data their devices collect and how that data is stored and used.”[343]  She also highlighted three common issues with IoT technologies that the FTC sees regularly:  (1) “very basic failures in product design and pre-release testing”; (2) companies not foreseeing and addressing “credible alerts about potential vulnerabilities”; and (3) “challenges in the deployment of updates and patches.”[344]  Slaughter also advocated for federal privacy legislation similar to the GDPR and California’s CCPA that would provide the FTC with “rule-making authority, coupled with civil penalties in the areas of data security and privacy,” and require the creation of a “Bureau of Technology” within the Commission to provide expertise in competition and consumer protection cases.[345] 3.    Litigation Connected Vehicles.  In July, the Southern District of Illinois partially certified a class action against an automobile manufacturer, which alleges that several of the defendant’s vehicles “suffer from potentially catastrophic design defects which allow third parties to remotely take control of the vehicles over the Internet while they are being driven.”[346]  Although the judge certified three state-based classes of drivers in Michigan, Illinois, and Missouri, he refused to certify a nationwide class of drivers, noting that doing so would require “highly individualized inquiries” to determine the underlying state law claims.[347]  A trial has been scheduled for October 2019.[348] Smart Home Devices.
In October, a manufacturer of smart TVs agreed to settle a class action lawsuit claiming that it collected and sold its customers’ viewing histories to third-party advertisers without their consent.[349]  Per the settlement agreement, the company agreed to pay $17 million and to take additional measures to enhance its disclosures regarding data collection.[350]  In the early days of January 2019, the court preliminarily approved the settlement.[351]  The company had previously paid $2.2 million to the FTC and the State of New Jersey in 2017 over similar allegations.[352] G.    Computer Fraud & Abuse Act The Computer Fraud and Abuse Act (“CFAA”) generally prohibits “access[ing] a computer without authorization or exceeding authorized access . . . .”[353]  Because the CFAA does not clearly define either “authorized” or “access,” courts have adopted disparate interpretations of these terms.  The First, Fifth, Seventh, and Eleventh Circuits generally define these terms broadly, allowing liability both for accessing digital information without authorization and for improperly using information an individual was otherwise authorized to access.[354]  Conversely, the Second, Fourth, and Ninth Circuits define the terms narrowly, allowing liability only where an individual accesses information without authorization.[355]  Although the past year did not bring the circuits closer to harmony, it did include novel developments regarding the CFAA’s application to employees accessing their employers’ computers, and third parties accessing generally available websites in allegedly unauthorized ways. Two cases in particular addressed whether an individual may be liable under the CFAA for allegedly misusing information he or she was otherwise authorized to access.  The courts hearing these cases—applying the prevailing CFAA interpretations of their respective circuits—reached opposite conclusions.  In Teva Pharm. USA, Inc. v.
Sandhu, a pharmaceutical company alleged that a former employee had passed the company’s trade secrets to the CEO of a competitor and filed suit against the former employee and others for CFAA violations, misappropriation of trade secrets, and various state law tort claims.[356]  The U.S. District Court for the Eastern District of Pennsylvania denied Defendant’s motion to dismiss in part, but granted dismissal of the company’s CFAA claims.  Because “[c]ourts within this district universally subscribe to the narrow approach, barring liability where the employee has authorization to access the computer to obtain the information,”[357] the Court held that “an employee who misuses information she was authorized to obtain cannot be held liable” under the CFAA.[358] The U.S. District Court for the Northern District of Illinois took up similar questions in Hill v. Lynn.[359]  The case centered on two co-founders of a software application company whose relationship had soured to the point that one co-founder, Lynn, cut off the other co-founder Hill’s access to the company’s email systems, downloaded source code Hill had created, and then deleted the code from the company’s systems.[360]  Hill sued Lynn for CFAA violations, fraud, and unjust enrichment; Lynn moved to dismiss, arguing in part that she had not accessed the application’s code “without authorization.”[361]  The Court granted the motion to dismiss in part, but denied it as to Hill’s CFAA claims because, although “Lynn did have some kind of authorization to access the account,” under Seventh Circuit precedent, “an employee who violates her fiduciary duty to her employer forfeits her authorization to access her employer’s computers.”[362]  Taken together, Sandhu and Hill suggest that the United States Supreme Court may eventually have to decide how to interpret the CFAA as it applies to access to employers’ digital information; but until then, companies should be on notice that the venue of any potential
CFAA litigation may play an outsized role in the litigation’s outcome. This past year also saw courts applying the CFAA to third parties accessing generally available websites in potentially unauthorized ways, particularly through the use of computer “bots”—programs that access computers and websites to quickly acquire large quantities of information.  Most notably, in Ticketmaster L.L.C. v. Prestige Entm’t, Inc., the online ticket vendor brought CFAA claims against individuals who used bots to purchase large quantities of tickets through the site, in violation of the company’s Terms of Use.[363]  The Defendants filed a motion to dismiss, which the U.S. District Court for the Central District of California granted in part with respect to the company’s CFAA claims.  The Court noted that “a violation of the terms of use of a website—without more—cannot establish liability under the CFAA.  However, a defendant can run afoul of the CFAA when he or she has no permission to access a computer or when such permission has been revoked explicitly.”[364]  Because the company had not explicitly revoked users’ permission to access its site, and instead had merely sent cease-and-desist letters, the Court ruled that it had failed to state a claim for relief, warranting dismissal.  The company then amended its complaint, which the Court found to be sufficiently well-pled to survive dismissal because it alleged violations of the CFAA for “both access without authorization and situations where a defendant possesses some authorization, but acts in excess of that authorization.”[365]  Similarly, in Sandvig v. Sessions, the U.S. District Court for the District of Columbia adopted the “narrow interpretation” of the CFAA by ruling that a research team’s plan to conduct studies using bots and fictitious online profiles would not violate the CFAA’s authorization requirements.[366] H.    Cybersecurity Insurance 1.    
State of the Market As predicted in last year’s Review, the cybersecurity insurance market has continued to expand over the past year, with reports estimating a 25% growth rate.[367]  While cyber-insurance penetration is around 30% for all businesses in the U.S., the rate is at an even higher 70% for Fortune 500 companies.[368]  The value of the cyber insurance sector is currently estimated at $2 billion, but is expected to rise to about $10 to $15 billion in the next ten years.[369] There are two primary reasons for the continued growth of the cyber-insurance market.  First, large-scale and widely publicized hacks have made many companies aware of the costs and business interruptions associated with cyber-attacks,[370] and an increase in cybercrimes has highlighted the importance of cyber-insurance.[371]  Second, a growing regulatory regime governing data privacy and the resulting threats of fines and liabilities have encouraged businesses to purchase coverage.[372] In response to growing demand, many insurers have expanded their cyber-insurance offerings,[373] including coverage for GDPR violations,[374] and have begun to offer a wide range of preventive services, such as phishing awareness campaigns, incident preparedness coaching, and regulatory readiness assessments.[375] Although experts report that the industry is moving towards “all-risk” coverage,[376] insurers are proceeding cautiously in the face of evolving risks, with some limiting coverage to events triggered only by unauthorized activity or to costs that businesses must legally incur.[377]  Other insurers are carefully evaluating companies’ security practices when determining whether to provide coverage.[378]  As demonstrated below, a mismatch of expectations over coverage has continued to trigger disputes between insurance companies and policy holders.[379] 2.    State of the Law – Key Cases a.
Computer Fraud Insurance Provisions In 2018, both the Second and Sixth Circuits ruled in favor of insurance policy holders, holding that computer fraud provisions contained in the companies’ insurance agreements covered losses related to email-based “phishing” schemes.[380] On July 6, 2018, the Second Circuit upheld the lower court’s decision in Medidata Solutions, Inc. v. Federal Insurance Co.[381]  As described in last year’s Review, the company brought suit against its insurer to enforce their insurance agreement’s computer fraud provision, which the company claimed covered “phishing activity,” resulting from employees wiring $4.7 million to cybercriminals.[382]  The district court determined that the policy provided coverage for the company’s losses because the criminal activity was “deceitful and dishonest access.”[383]  The Second Circuit affirmed, reasoning that while “no hacking occurred,” the cybercriminals “crafted a computer-based attack that manipulated the [plaintiff’s] email system, which the parties do not dispute constitutes a ‘computer system’ within the meaning of the policy.”[384]  In reaching that decision, the Second Circuit distinguished a 2015 New York state court decision, Universal American Corp.,[385] reasoning that in that case, the court found lack of coverage because the fraud “only incidentally involved the use of computers,”[386] whereas in the instant case, the company’s computer system itself was violated.[387]  The Second Circuit also rejected the insurer’s argument that the company failed to sustain a “direct loss” as a result of the attack, reasoning that the “spoofing attack was the proximate cause of [the company’s] losses,” because the attack initiated a “chain of events,” which “unfolded rapidly,” leading the company to transfer funds to the fraudsters the very same day.[388] A week later, the Sixth Circuit released its decision in American Tooling Center, Inc. v. 
Travelers Casualty and Surety Company of America, similarly siding with the policy holder when it reversed the lower court’s decision and ruled that damages resulting from phishing activity were covered by the computer fraud provision of the company’s insurance agreement.  The district court had previously granted summary judgment for the insurer, reasoning that “[a]lthough fraudulent emails were used to impersonate a vendor and dupe [the plaintiff] into making a transfer of funds, such emails do not constitute the ‘use of any computer to fraudulently cause a transfer.’”[389]  The Sixth Circuit disagreed, reasoning that the fraudster’s sending of emails to induce the transfer of money constituted “computer fraud” for the purposes of the insurance agreement because the fraudster used a computer to send those emails.  The court continued that, if the insurance company had wished to confine the definition of “computer fraud” to “hacking and similar behaviors,” it could have done so.[390]  The court also held that the fraudulent emails “directly caused” a “direct loss” to the insured company, because they precipitated a “chain of events,” including a “series of internal actions,” leading to the transfer of money to the fraudster.[391] b.    Commercial General Liability Insurance Policies In 2018, the courts continued to grapple with coverage for data breach litigation costs.  For example, in St. Paul Fire & Marine Insurance Company v.
Rosen Millennium, Inc.,[392] the insurance company filed a declaratory judgment action against its insured, a data security services provider, seeking a declaration that the insurer did not have to defend the security company against a suit by a hotel chain that alleged that the company’s negligence caused a data breach potentially exposing the chain’s customers’ credit card information.[393]  The insurer argued that the personal injury provision of the commercial general liability insurance (“CGL”) policies—which provided coverage for, inter alia, “making known” a customer’s credit card information—did not cover the breach at hand because the breach was perpetrated by a third party and “did not result from [the defendant’s own] business activities.”[394]  The district court agreed,[395] finding the reasoning in Innovak International, Inc. v. Hanover Insurance Company[396] to be persuasive.  There, the company’s CGL policy similarly insured against the publication of “material that violates a person’s right to privacy,”[397] and the court ruled that “the only plausible interpretation of the insurance policy is that it requires the insured to be the publisher of the private information,” noting that “construing the policy to include the acts of third parties would be expanding coverage beyond what the insurance carriers were knowingly entering into.”[398]  Because the policy at issue in St. Paul Fire defined “personal injury” similarly to the policy in Innovak, the court in St. Paul Fire applied the same third-party perpetrator distinction.[399]  Defendants in St. Paul Fire filed their notice of appeal and their brief is expected in mid-January 2019.[400] c.    Financial Institution Bonds A third area of contention facing courts in 2018 was whether financial institution bonds cover losses resulting from data breaches.  On June 28, 2018, a bank filed a complaint in the U.S. 
District Court for the Western District of Virginia against its insurer for failing to provide coverage for losses resulting from a data breach that exposed customer debit card information.  The bank had claimed that the Computer & Electronic (“C&E”) Crime Rider to the financial institution bond granted by the insurer covered the loss resulting from the breach.[401]  The insurance company denied coverage under the C&E Crime Rider, arguing that the losses fell under the bond’s Debit Card Rider, which maintained a significantly lower coverage limit.[402]  In its complaint, the bank argued that the C&E Crime Rider controlled, because the criminal activity compromised the bank’s systems, and the losses did not arise from the hackers’ stealing debit card information directly from customers.[403]  The insurer answered, raising a number of defenses based on the policy language, including the ability to deny coverage for losses resulting from “fraud or dishonesty of a natural person.”[404]  As with other cases regarding coverage of insurance policies for breach-related losses, the outcome of this case will likely be determined by extensive contract interpretation. III.    GOVERNMENT DATA COLLECTION A.    
Electronic Communications Privacy Act Reform The ECPA,[405] passed in 1986 and amended in 1994, 2001, 2006, and 2008, was enacted to protect electronic information from unauthorized access.[406]  The ECPA has three main provisions:  Title I, the Wiretap Act, prohibits the interception and disclosure of another person’s oral or electronic communications unless an exception applies, such as the government obtaining a court order authorizing surveillance;[407] Title II, the Stored Communications Act (“SCA”), protects from compelled disclosure electronically stored information held by service providers;[408] and Title III authorizes the government to install devices, subject to a court order, that capture dialed numbers from placed calls, without intercepting the content of those calls.[409]  To obtain a court order pursuant to Title III, the government must generally show that the information sought is relevant to an ongoing investigation.[410] As briefly discussed above in Section I.B.1., there were two main efforts to reform the ECPA in 2018, only one of which was successful.  On March 23, 2018, Congress passed, and President Trump signed, the Clarifying Lawful Overseas Use of Data Act (“CLOUD Act”), which amended the SCA to allow the government to obtain a court order for customer data stored overseas.[411]  Please see Section III.B. for further discussion of the CLOUD Act. The second reform effort this year, the Email Privacy Act (“EPA”),[412] failed to pass the Senate despite bipartisan support in the House of Representatives.  The EPA—included as an amendment to the House’s National Defense Authorization Act for Fiscal Year 2019 (“NDAA”)[413]—would have codified the Sixth Circuit’s 2010 decision in United States v.
Warshak requiring law enforcement officials to obtain a warrant based on probable cause when seeking the content of email communications.[414]  It also would have ended the ECPA’s current “180-day rule,” which allows the government to obtain email communications without a warrant after they have been held “in electronic storage” for more than 180 days.[415]  A number of prominent technology companies and civil liberties groups publicly voiced their support for the bill in a July letter to the Senate and House Committees on Armed Services.[416]  However, the EPA was not passed as part of the Senate’s version of the NDAA and the House ultimately conceded during the conference committee process.[417]  This represents the second time the EPA has failed to get through the Senate.  In 2017, Senators Mike Lee (R-UT) and Patrick Leahy (D-VT) proposed the ECPA Modernization Act of 2017, which included the EPA and other reforms, but it was referred to the Senate Judiciary Committee and never received another vote.[418]  With the newly Democratic-controlled House and Republican-controlled Senate, it is unclear what the future holds for the EPA or other reforms of the ECPA. B.    Extraterritoriality of Subpoenas and Warrants and the CLOUD Act On October 16, 2017, the United States Supreme Court granted certiorari for review of United States v. 
Microsoft Corporation, which concerns the scope of the government’s power to obtain information stored overseas under the SCA.[419]  The case involves Microsoft’s challenge to a federal warrant seeking data stored at a Microsoft facility in Ireland.[420]  Microsoft argued that the warrant was an inappropriate extraterritorial application of the SCA because the law’s proper focus is on where electronic communications are stored, and that a search and seizure occurs in the jurisdiction of the storage.[421]  The government, on the other hand, argued that the particular extraterritorial concerns should not be outcome determinative because the SCA’s focus is on the disclosure of information, not storage.[422]  During oral argument in February, the Court suggested that Congress may be the best arbiter and may update the law in response to this “brave new world” of extraterritorial data storage.[423] In March, Congress passed the Clarifying Lawful Overseas Use of Data Act (“CLOUD Act”) as part of the Consolidated Appropriations Act, 2018, Pub. L. 115–141.  As briefly discussed above in Section I.B.1, the CLOUD Act amends the SCA by adding section 2713, which expands the geographic scope of the prior law by stating, in relevant part, that a “[service provider] shall . . . preserve, backup, or disclose the contents of a wire or electronic communication and any record or other information pertaining to a customer or subscriber within such provider’s possession, custody, or control, regardless of whether such communication, record, or other information is located within or outside of the United States.”[424]  In response to the CLOUD Act, the Supreme Court issued a short per curiam opinion in United States v. Microsoft Corp. noting the mootness of the issue and vacating and remanding the case.[425] To soften its broad reach, the CLOUD Act provides safeguards that limit the extraterritorial application to certain jurisdictions.
Specifically, a motion to quash a subpoena for extraterritorial data may be granted if the required disclosure would cause the provider to violate the laws of a “qualifying foreign government.”[426]  Executive branch officials will determine which countries qualify based on several factors, including whether the foreign country has entered into an appropriate executive agreement, and whether the foreign country “affords robust substantive and procedural protections for privacy and civil liberties in light of the data collection . . . .”[427] C.    Foreign Intelligence Surveillance Act Section 702 Reauthorization FISA[428] was passed in 1978, amended in 2008, and reauthorized for another six years in January 2018.  As briefly discussed above in Section I.B.1, the purpose of FISA is to allow the U.S. government to acquire foreign intelligence information through electronic surveillance.[429]  Foreign intelligence information is defined as information that allows the U.S. government to protect the country against hostile acts, sabotage or terrorism, or clandestine intelligence activities by a foreign power or agent.[430] FISA established a specialized tribunal, the Foreign Intelligence Surveillance Court (“FISC”), to review the Attorney General’s applications for authorization of electronic surveillance.[431]  In 2017 (the most recent year for which data are available), the FISC received 1,614 applications, of which 1,147 were granted without any modification, 391 were granted with modification, and 76 were denied in full or in part.[432] FISA Section 702, passed in 2008 as part of the FISA Amendments Act, authorizes the collection of communications from foreign persons reasonably believed to be located outside of the United States.[433]  Critics of FISA, and of Section 702 in particular, contend that the Act violates the First and Fourth Amendments of the Constitution by allowing law enforcement to sweep up communications passing through the United States, significantly 
increasing the scope of the government’s surveillance power.[434] Privacy advocates saw the pending reauthorization of FISA in late 2017 and early 2018 as an opportunity for reform, but those efforts were largely unsuccessful.  The FISA Amendments Reauthorization Act of 2017—passed on January 18, 2018 and signed into law the next day—extends FISA for six years.  It reauthorizes the collection of not only those communications to or from the target of an investigation, but also those communications that simply contain a reference to a target, pending written notice to Congress that includes a FISC authorization of the program.[435]  This is often referred to as “abouts” collection.  One notable change, however, is a new requirement that the FBI obtain a court order based on probable cause to access the communications of U.S. persons in criminal investigations unrelated to national security.[436]  FISA will next be subject to reauthorization at the end of 2023.[437] D.    Collection of Cellphone and Audio Data This year, a number of court decisions addressed the issue of individuals’ privacy rights with respect to cellphone and audio data.  Although most of the decisions bolstered such rights by narrowing the government’s ability to collect and search such data without a warrant, one Fourth Circuit case demonstrates that courts may be unwilling to curtail the government’s ability to conduct warrantless searches pursuant to the border search exception. 1.    Supreme Court Protects Cellphone Data in Carpenter v. United States A key issue for the Supreme Court this past year was whether the government must obtain a warrant in order to collect an individual’s cell-site location information (“CSLI”).[438]  In Carpenter v. 
United States, the petitioner, an individual convicted of robbery, challenged the government’s acquisition of several months’ worth of CSLI, which identified the specific cell towers with which his phone connected while making and receiving calls.[439]  On appeal, Carpenter, in conjunction with a number of individuals and entities who filed amici briefs, argued that the government violated his Fourth Amendment rights when it obtained the location records from his wireless carrier without a warrant.[440]  In response, the government argued that, pursuant to the “third-party doctrine” established in the 1979 Supreme Court case Smith v. Maryland, individuals have no reasonable expectation of privacy in information they voluntarily surrender to third parties, including CSLI.[441] The Supreme Court ruled in favor of the petitioner in a June 2018 decision authored by Chief Justice Roberts and joined by the Court’s four liberal members.[442]  The Court held that the Fourth Amendment requires the government to obtain a warrant to access CSLI, except in exigent circumstances.[443]  In doing so, the Court reasoned that  “[i]n light of the deeply revealing nature of CSLI, . . .  
the fact that such information is gathered by a third party does not make it any less deserving of Fourth Amendment protection.”[444]  The Court also noted that individuals do not truly “share” CSLI with cellphone companies in the normal sense of the term because there is no “affirmative act on the part of the user.”[445]  The Court did note, however, that the decision was limited to the collection of historical CSLI covering an extended period of time, declining to consider whether the government would be allowed to collect information covering a shorter period of time without a warrant.[446]  Further, the Court refused to overrule the third-party doctrine, instead emphasizing that CSLI is “qualitatively different” from other information the Court had previously allowed the government to obtain from third parties without a warrant (e.g., telephone numbers, bank records).[447] This decision continues the Supreme Court’s trend in recent years of increasingly limiting the government’s ability to access electronic personal information.[448]  Companies that collect data from users may now have a stronger basis for resisting requests made by law enforcement for CSLI without a warrant. 2.    Massachusetts District Court Rules Fourth Amendment Protections Apply to Searches and Seizures of Cellphones at U.S. Border On May 9, a federal district court in Massachusetts rejected the government’s contention that Fourth Amendment protections do not apply to searches and seizures of cellphones at the U.S. border.[449]  In Alasaad v. Nielsen, several U.S. citizens challenged the ability of federal officers at U.S. ports to search electronic devices without a warrant, contending that the government must have probable cause to suspect a violation of immigration or customs laws to do so.[450]  The court agreed, rejecting the government’s argument that the ruling in Riley v. 
California[451]—which requires a warrant for digital searches of cell phones incident to arrest—does not apply to the border search context, and instead reasoning that “the Supreme Court and First Circuit have acknowledged that digital searches are different too since they ‘implicate privacy concerns far beyond those implicated’ in a typical container search.”[452] 3.    Fourth Circuit Rules No Warrant Needed for Cellphone Border Probe On May 18, the Fourth Circuit held that a “month-long, off-site forensic analysis” of a cellphone constituted a “nonroutine border search” and thus required some measure of individualized suspicion on the part of law enforcement officials.[453]  The defendant in United States v. Kolsuz, a Turkish citizen, was detained at an airport and charged with arms smuggling after attempting to board a flight to Turkey, at which point Customs and Border Protection officers took custody of his phone for several weeks to search its contents.[454]  On appeal, the defendant argued that the district court erred by finding the probe to have been a “nonroutine border search justified by reasonable suspicion,”[455] contending that the privacy interests in cellphones are substantial enough to require a warrant, even under the border exception.[456]  The Fourth Circuit affirmed the district court’s decision, ruling that the search was properly categorized as a border search; however, it declined to decide what measure of individualized suspicion was appropriate, even though the reasonable suspicion standard was met here, and instead concluded that the agents’ reasonable reliance on precedent was enough to preclude suppression.[457]  The court described the border search exception as “broad enough to reach [this] search,” despite the “temporal and spatial distance between the off-site analysis of the phone and the defendant’s attempted departure at the airport.”[458] IV.    
CONCLUSION We expect 2019 to be another significant year in the development and application of data privacy and cybersecurity law.  As technology and data collection become more sophisticated, companies and governments will continue to explore the potential permissible uses of personal information.  At the same time, the public will continue to debate the ideal balance between the benefits of big data and concerns for privacy and security.  We will be tracking these important issues in the year ahead.  Gibson Dunn is available to address any privacy or cyber concerns your business may face.     [1]    Press Release, Federal Trade Commission, Joseph Simons Sworn in as Chairman of the FTC (May 1, 2018), https://www.ftc.gov/news-events/press-releases/2018/05/joseph-simons-sworn-chairman-ftc.     [2]    Id.     [3]    Press Release, Federal Trade Commission, Phillips, Slaughter, and Chopra Sworn in as FTC Commissioners (May 2, 2018), https://www.ftc.gov/news-events/press-releases/2018/05/phillips-slaughter-chopra-sworn-ftc-commissioners.     [4]    Press Release, Federal Trade Commission, Christine S. Wilson Sworn in as FTC Commissioner (Sept. 26, 2018), https://www.ftc.gov/news-events/press-releases/2018/09/christine-s-wilson-sworn-ftc-commissioner.     [5]    Press Release, Federal Trade Commission, FTC Announces Sessions on Consumer Privacy and Data Security As Part of its Hearings on Competition and Consumer Protection in the 21st Century (Oct. 26, 2018), https://www.ftc.gov/news-events/press-releases/2018/10/ftc-announces-sessions-consumer-privacy-data-security-part-its.     [6]    Press Release, Federal Trade Commission, Electronic Toy Maker VTech Settles FTC Allegations That it Violated Children’s Privacy Law and the FTC Act (Jan. 8, 2018), https://www.ftc.gov/news-events/press-releases/2018/01/electronic-toy-maker-vtech-settles-ftc-allegations-it-violated.     [7]    Id.     [8]    Id.     [9]    Id.     
[10]    Press Release, Federal Trade Commission, Mobile Phone Maker BLU Reaches Settlement with FTC over Deceptive Privacy and Data Security Claims (Apr. 20, 2018), https://www.ftc.gov/news-events/press-releases/2018/04/mobile-phone-maker-blu-reaches-settlement-ftc-over-deceptive.     [11]    Id.     [12]    Id.     [13]    Decision and Order, In the Matter of BLU Products, Inc., Docket No. C-4657 (F.T.C. Sept. 6, 2018), https://www.ftc.gov/system/files/documents/cases/172_3025_c4657_blu_decision_and_order_9-10-18.pdf.     [14]    Press Release, Federal Trade Commission, PayPal Settles FTC Charges that Venmo Failed to Disclose Information to Consumers About the Ability to Transfer Funds and Privacy Settings; Violated Gramm-Leach-Bliley Act (Feb. 27, 2018), https://www.ftc.gov/news-events/press-releases/2018/05/ftc-gives-final-approval-settlement-paypal-related-allegations.     [15]    Id.     [16]    Id.     [17]    Id.     [18]    Id.     [19]    Id.     [20]    Id.     [21]    Press Release, Federal Trade Commission, California Company Settles FTC Charges Related to Privacy Shield Participation (July 2, 2018), https://www.ftc.gov/news-events/press-releases/2018/07/california-company-settles-ftc-charges-related-privacy-shield.     [22]    Press Release, Federal Trade Commission, FTC Reaches Settlements with Four Companies That Falsely Claimed Participation in the EU-U.S. Privacy Shield (Sept. 27, 2018), https://www.ftc.gov/news-events/press-releases/2018/09/ftc-reaches-settlements-four-companies-falsely-claimed.     [23]    Id.     [24]    Press Release, Federal Trade Commission, California Company Settles FTC Charges Related to Privacy Shield Participation (July 2, 2018), https://www.ftc.gov/news-events/press-releases/2018/07/california-company-settles-ftc-charges-related-privacy-shield; Press Release, Federal Trade Commission, FTC Reaches Settlements with Four Companies That Falsely Claimed Participation in the EU-U.S. Privacy Shield (Sept. 
27, 2018), https://www.ftc.gov/news-events/press-releases/2018/09/ftc-reaches-settlements-four-companies-falsely-claimed.     [25]    Id.     [26]    LabMD, Inc. v. Fed. Trade Comm’n, 894 F.3d 1221, 1227 (11th Cir. 2018).     [27]    Id. at 1237.     [28]    Id. at 1236.     [29]    Id.     [30]    Press Release, Department of Health and Human Services, Anthem Pays OCR $16 Million in Record HIPAA Settlement Following Largest U.S. Health Data Breach in History (Oct. 15, 2018), available at https://www.hhs.gov/about/news/2018/10/15/anthem-pays-ocr-16-million-record-hipaa-settlement-following-largest-health-data-breach-history.html.     [31]    Id.     [32]    Id.     [33]    Id.     [34]    Press Release, Department of Health and Human Services, Five breaches add up to millions in settlement costs for entity that failed to heed HIPAA’s risk analysis and risk management rules (Feb. 1, 2018), available at https://www.hhs.gov/about/news/2018/02/01/five-breaches-add-millions-settlement-costs-entity-failed-heed-hipaa-s-risk-analysis-and-risk.html.     [35]    Id.     [36]    Press Release, Department of Health and Human Services, Judge rules in favor of OCR and requires a Texas cancer center to pay $4.3 million in penalties for HIPAA violations (June 18, 2018), available at https://www.hhs.gov/about/news/2018/06/18/judge-rules-in-favor-of-ocr-and-requires-texas-cancer-center-to-pay-4.3-million-in-penalties-for-hipaa-violations.html.     [37]    Id.     [38]    Press Release, Department of Health and Human Services, Consequences for HIPAA violations don’t stop when a business closes (Feb. 13, 2018), available at https://www.hhs.gov/about/news/2018/02/13/consequences-hipaa-violations-dont-stop-when-business-closes.html.     [39]    Id.     [40]    Request for Information on Modifying HIPAA Rules To Improve Coordinated Care, 83 Fed. Reg. 64302 (proposed Dec. 14, 2018) (to be codified at 45 C.F.R. pts. 
160, 164), available at https://www.federalregister.gov/documents/2018/12/14/2018-27162/request-for-information-on-modifying-hipaa-rules-to-improve-coordinated-care.     [41]    The states represented in this lawsuit are Arizona, Arkansas, Florida, Indiana, Iowa, Kansas, Kentucky, Louisiana, Minnesota, Nebraska, North Carolina, and Wisconsin.     [42]    See Complaint, State of Arizona v. Med. Informatics Eng’g, Inc., No. 3:18-cv-00969 (N.D. Ind. Dec. 04, 2018), ECF No. 1.     [43]    Id.     [44]    U.S. Dep’t of Health & Human Servs. and Healthcare & Public Health Sector Coordinating Councils, Health Industry Cybersecurity Practices: Managing Threats and Protecting Patients (Dec. 28, 2018), https://www.phe.gov/Preparedness/planning/405d/Documents/HICP-Main-508.pdf.     [45]    Id.     [46]    Id.     [47]    Id.     [48]    Press Release, Securities and Exchange Commission, SEC Adopts Statement and Interpretive Guidance on Public Company Cybersecurity Disclosures (Feb. 21, 2018), https://www.sec.gov/news/press-release/2018-22; Commission Statement and Guidance on Public Company Cybersecurity Disclosures, 17 C.F.R. pts. 229, 249, available at https://www.sec.gov/rules/interp/2018/33-10459.pdf.     [49]    See CF Disclosure Guidance: Topic No. 2 – Cybersecurity (Oct. 13, 2011), available at https://www.sec.gov/divisions/corpfin/guidance/cfguidance-topic2.htm.     [50]    SEC v. Jun Ying, No. 1:18-cv-01069-CAP (N.D. Ga. Mar. 14, 2018).     [51]    SEC v. Bonthu, No. 1:18-cv-03114-MLB (N.D. Ga. June 28, 2018).     [52]    Cyber-Related Frauds, Exchange Act Release No. 84429 (Oct. 16, 2018), available at https://www.sec.gov/litigation/investreport/34-84429.pdf.     [53]    Gibson Dunn, SEC Warns Public Companies on Cyber-Fraud Controls (Oct. 17, 2018), https://www.gibsondunn.com/sec-warns-public-companies-on-cyber-fraud-controls/.     [54]    Complaint, SEC v. AriseBank, No. 3-18CV-00186-M (N.D. Tex. Jan. 25, 2018), ECF No. 
2, 2018 WL 623772; First Amended Complaint, SEC v. AriseBank, No. 3:18-cv-00186-M (N.D. Tex. Feb. 2, 2018), ECF No. 21, 2018 WL 1250524.     [55]    Ex Parte Order Granting Emergency Ex Parte Motion for Temporary Restraining Order, SEC v. AriseBank, No. 3-18CV-00186 (N.D. Tex. Jan. 25, 2018), ECF No. 11; Press Release, Securities and Exchange Commission, SEC Halts Alleged Initial Coin Offering Scam (Jan. 30, 2018), https://www.sec.gov/news/press-release/2018-8.     [56]    Chairman’s Testimony on Virtual Currencies: The Roles of the SEC and CFTC, Testimony Before the Senate Committee on Banking, Housing, and Urban Affairs (Feb. 6, 2018), https://www.sec.gov/news/press-release/2018-8.     [57]    Event, Federal Communications Commission, Fighting the Scourge of Illegal Robocalls (Mar. 23, 2018), https://www.fcc.gov/fcc-ftc-robocalls-forum.     [58]    Event, Federal Communications Commission, Stop Illegal Robocalls Expo (Apr. 23, 2018), https://www.fcc.gov/news-events/events/2018/04/stop-illegal-robocalls-expo.     [59]    Press Release, Federal Communications Commission, FCC and FTC to Host Joint Policy Forum and Consumer Expo to Fight the Scourge of Illegal Robocalls (Mar. 7, 2018), https://www.ftc.gov/news-events/press-releases/2018/03/ftc-fcc-host-joint-policy-forum-consumer-expo-fight-scourge.     [60]    ACA Int’l v. Fed. Commc’ns Comm’n, 885 F.3d 687 (D.C. Cir. 2018).     [61]    Id. at 692-94.     [62]    Id. at 692.     [63]    Id. at 709.     [64]    Id. at 709.     [65]    Press Release, Federal Communications Commission, FCC Adopts New Consumer Protections Against ‘Slamming’ and ‘Cramming’ (June 7, 2018), https://www.fcc.gov/document/fcc-adopts-new-consumer-protections-against-slamming-and-cramming; FCC, Report and Order, Protecting Consumers from Unauthorized Carrier Changes and Related Unauthorized Charges, File No. 17-169 (June 7, 2018), https://docs.fcc.gov/public/attachments/FCC-18-78A1.pdf.     [66]    Id.     [67]    Id.     [68]    Id.     
[69]    Press Release, Federal Communications Commission, FCC Establishes Reassigned Phone Numbers Database to Help Reduce Unwanted Calls to Consumers (Dec. 12, 2018), https://docs.fcc.gov/public/attachments/DOC-355526A1.pdf; FCC, Order, Advanced Methods to Target and Eliminate Unlawful Robocalls, File No. 17-59 (Dec. 12, 2018), https://docs.fcc.gov/public/attachments/FCC-17-151A1.pdf.     [70]    Id.     [71]    Id.     [72]    Id.     [73]    Jim Puzzanghera, New CFPB Director Kathy Kraninger says she won’t be puppet of Mick Mulvaney, Los Angeles Times (Dec. 11, 2018), http://www.latimes.com/business/la-fi-kathy-kraninger-cfpb-20181211-story.html.     [74]    Id.     [75]    Id.     [76]    Id.     [77]    Lalita Clozel, CFPB to Resume Private Consumer Data Collection, Wall Street Journal, (May 31, 2018), https://www.wsj.com/articles/cfpb-to-resume-private-consumer-data-collection-1527796179.     [78]    Id.     [79]    Sylvan Lane, Equifax says consumer bureau still probing hack despite report it eased off, The Hill, (Mar. 2, 2018), https://thehill.com/policy/finance/376437-equifax-says-consumer-bureau-still-probing-hack-despite-report-it-eased-off.     [80]    Department of Defense, Summary: Department of Defense Cyber Strategy 2018 (Sept. 18, 2018), available at https://media.defense.gov/2018/Sep/18/2002041658/-1/-1/1/CYBER_STRATEGY_SUMMARY_FINAL.PDF.     [81]    Id. at 3.     [82]    Id. at 6.     [83]    Id. at 7.     [84]    Daniel Wilson, DoD Making ‘Do Not Buy’ List for Foreign Software Vendors, Law360, (July 27, 2018), https://www.law360.com/articles/1067768/dod-making-do-not-buy-list-for-foreign-software-vendors.     [85]    Id.     [86]    Id.     [87]    Press Release, N.E. Attorney Gen., Attorney General Files First Multi-State HIPAA-Related Data Breach Lawsuit (Dec. 3, 2018), available at https://ago.nebraska.gov/news/attorney-general-files-first-multi-state-hipaa-related-data-breach-lawsuit.     
[88]    Press Release, State of N.J., Office of the Attorney Gen., AG Grewal Announces Creation of New Enforcement Unit to Protect Data Privacy of New Jersey’s Residents (May 7, 2018), available at https://nj.gov/oag/newsreleases18/pr20180507b.html.       [89]    State of N.J., Office of the Attorney Gen., Data Privacy and Security, https://www.nj.gov/oag/law/dpc.htm (last visited 12/19/18).     [90]    N.J. Div. of Consumer Affairs, NJ Division of Consumer Affairs Announces $100,000 Settlement with App Developer Resolving Investigation Into Alleged Violations of Children’s Online Privacy Law (May 8, 2018), available at https://www.njconsumeraffairs.gov/News/Pages/05082018.aspx.     [91]    Id.     [92]    Press Release, N.Y. State Office of the Attorney Gen., A.G. Underwood Announces Broad Support for Shield Act from Major Business and Consumer Groups (June 5, 2018), available at https://ag.ny.gov/press-release/ag-underwood-announces-broad-support-shield-act-major-business-and-consumer-groups.     [93]    N.Y. State Office of the Attorney Gen., Small Business Guide to Cybersecurity in New York State (June 5, 2018), available at https://ag.ny.gov/sites/default/files/nyag_data_security_small_business_guide.pdf.     [94]    Press Release, N.M. Attorney Gen, AG Balderas Announces Lawsuit Against Tech Giants Who Illegally Monitor Child Location, Personal Data (Sept. 12, 2018), available at https://www.nmag.gov/uploads/PressRelease/48737699ae174b30ac51a7eb286e661f/AG_Balderas_Announces_Lawsuit_Against_Tech_Giants_Who_Illegally_Monitor_Child_Location__Personal_Data_1.pdf; Complaint, Balderas v. Tiny Lab Productions et al., No. 1:2018-cv-00854 (D.N.M. Sept. 11, 2018).     [95]    Id.     [96]    Press Release, D.C. Office of the Attorney Gen., AG Racine Sues Facebook for Failing to Protect Millions of Users’ Data (Dec. 19, 2018), available at https://oag.dc.gov/release/ag-racine-sues-facebook-failing-protect-millions.     [97]    Complaint, District of Columbia v. 
Facebook, Inc., No. 2018 CA 008715 (D.C. Super. Ct. Dec. 19, 2018).     [98]    See 23 NYCRR 500, available at http://www.dfs.ny.gov/legal/regulations/adoptions/dfsrf500txt.pdf; see also, e.g., Gibson Dunn, New York State Department of Financial Services Announces Proposed Cybersecurity Regulations (Sept. 19, 2016), https://www.gibsondunn.com/new-york-state-department-of-financial-services-announces-proposed-cybersecurity-regulations/; Gibson Dunn, New York State Department of Financial Services Revises Proposed Cybersecurity Regulations (Jan. 5, 2017), https://www.gibsondunn.com/new-york-state-department-of-financial-services-revises-proposed-cybersecurity-regulations/.     [99]    Nate Lord, What Is the NYDFS Cybersecurity Regulation? A New Cybersecurity Compliance Requirement for Financial Institutions, Digital Guardian (Sept. 19, 2018), https://digitalguardian.com/blog/what-nydfs-cybersecurity-regulation-new-cybersecurity-compliance-requirement-financial.     [100]    See Gibson Dunn, New York State Department of Financial Services, supra note 98.     [101]    Covered Entities were required to submit a certificate of compliance by February 15, 2018 for the measures required to be completed by August 28, 2017.     [102]    23 NYCRR 500.11(a)-(b); Barry R. Temkin & Kenneth M. Labbate, NY Department of Financial Services Cybersecurity Regulations: An Update, New York Law Journal (June 28, 2018, 2:30 PM), https://www.law.com/newyorklawjournal/2018/06/28/062918ny_temkin2/.     [103]    Press Release, N.Y. Dep’t of Fin. Servs., DFS Superintendent Vullo Issues Cybersecurity Filing Deadline Reminder (Jan. 22, 2018), https://www.dfs.ny.gov/about/press/pr1801221.htm.     [104]    Press Release, N.Y. Dep’t of Fin. Servs., DFS Takes Additional Action To Hold Equifax Accountable for Massive 2017 Data Breach (June 27, 2018), https://www.dfs.ny.gov/about/press/pr1806271.htm.     
[105]    23 NYCRR 201.07, available at https://www.dfs.ny.gov/legal/regulations/adoptions/dfsrf201txt.pdf.     [106]    More State Cybersecurity Regulation Ahead for Financial Services Industry?, LexisNexis State Net Capitol Journal (Mar. 8, 2018), https://www.lexisnexis.com/communities/state-net/b/capitol-journal/archive/2018/03/08/more-state-cybersecurity-regulation-ahead-for-financial-services-industry.aspx.     [107]    Pub. L. 115-141 § 101 (2018).     [108]    18 U.S.C. § 2701, et seq.     [109]    Pub. L. 115-141.     [110]    United States v. Microsoft Corp., 138 S.Ct. 1186, 584 U.S. __ (2018).     [111]    Letter from Apple et al. to Senator Orrin Hatch et al., U.S. Senate (Feb. 6, 2018), available at https://www.scribd.com/document/374641879/Tech-Companies-Letter-of-Support-for-Senate-CLOUD-Act-020618#download&from_embed.     [112]    See, e.g., Neema S. Guliani & Naureen Shah, The CLOUD Act Doesn’t Help Privacy and Human Rights: It Hurts Them, Lawfare (Mar. 16, 2018), https://www.lawfareblog.com/cloud-act-doesnt-help-privacy-and-human-rights-it-hurts-them.     [113]    Pub. L. 115-118, § 101(a).     [114]    Id.     [115]    Id. § 103.     [116]    S. 3655, 115th Cong. (2018), available at https://www.congress.gov/bill/115th-congress/senate-bill/3655/text.     [117]    S. 3655 § 3.     [118]    Id. § 2(a)(1).     [119]    Id.     [120]    Id. § 5.     [121]    Joseph Marks, The Cybersecurity 202: Republicans and Democrats are feuding over the Equifax breach, Washington Post (Dec. 11, 2018), https://www.washingtonpost.com/news/powerpost/paloma/the-cybersecurity-202/2018/12/11/the-cybersecurity-202-republicans-and-democrats-are-feuding-over-the-equifax-breach/5c0e9ec91b326b67caba2b5c/?utm_term=.092af4a07bd9.     [122]    Dem. Staff of H. Comm. on Oversight and Gov’t Reform and Comm. 
on Sci., Space and Tech., 115th Cong., What the Next Congress Should Do to Prevent a Recurrence of the Equifax Data Breach 3-7 (2018), available at https://democrats-oversight.house.gov/sites/democrats.oversight.house.gov/files/Equifax%20Minority%20Report%20-%20FINAL%2012-10-2018.pdf.     [123]    Maj. Staff of H. Comm. on Oversight and Gov’t Reform, 115th Cong., The Equifax Data Breach 94-96 (2018), available at https://oversight.house.gov/wp-content/uploads/2018/12/Equifax-Report.pdf; see also Joseph Marks, The Cybersecurity 202: Republicans and Democrats are feuding over the Equifax breach, Washington Post (Dec. 11, 2018), available at https://www.washingtonpost.com/news/powerpost/paloma/the-cybersecurity-202/2018/12/11/the-cybersecurity-202-republicans-and-democrats-are-feuding-over-the-equifax-breach/5c0e9ec91b326b67caba2b5c/?utm_term=.a267f7aa93c9.     [124]    S. 2124, 115th Cong. (2018), available at https://www.congress.gov/bill/115th-congress/senate-bill/2124.     [125]    18 U.S.C. § 2703.     [126]    See David Ruiz, Email Privacy Act Comes Back, Hopefully to Stay, Electronic Frontier Found. (May 29, 2018), https://www.eff.org/deeplinks/2018/05/email-privacy-act-comes-back-hopefully-stay; Letter from ACT: The App Association et al. to John McCain et al., U.S. Senate (July 13, 2018), https://cdt.org/files/2018/07/Email-Privacy-NDAA-sign-on-letter-final.pdf.     [127]    See H.R. Rep. No. 115-874, at 965 (2018).     [128]    National Conference of State Legislatures, 2018 Security Breach Legislation (Oct. 12, 2018), available at http://www.ncsl.org/research/telecommunications-and-information-technology/2018-security-breach-legislation.aspx.     [129]    S.B. 318 (Ala. 2018), available at https://legiscan.com/AL/bill/SB318/2018.     [130]    S.B. 62, 93rd Legis. Sess. (S.D. 2018), available at https://legiscan.com/SD/bill/SB62/2018.     [131]    S.B. 361 (La. 
2018), available at https://legiscan.com/LA/rollcall/SB361/id/75044; see also National Conference of State Legislatures, 2018 Security Breach Legislation (Oct. 12, 2018), available at http://www.ncsl.org/research/telecommunications-and-information-technology/2018-security-breach-legislation.aspx.     [132]    Cal. Civ. Code § 1798.100.     [133]    Rita Heimes and Sam Pfeifle, New California privacy law to affect more than half a million US companies, Int’l Ass’n of Privacy Prof’ls., (July 2, 2018), available at https://iapp.org/news/a/new-california-privacy-law-to-affect-more-than-half-a-million-us-companies/.     [134]    See e.g., Letter from California Chamber of Commerce et al. to Senator Bill Dodd (Aug. 6, 2018), available at http://netchoice.org/wp-content/uploads/SB-1121-Final-Author-Coalition-Letter-2.8.7.2018.pdf.     [135]    Id.     [136]    Cal. Civ. Code § 1798.100, et seq.     [137]    Cal. Civ. Code § 1798.150(a)(1).     [138]    S.B. 1121 (Cal. 2018).     [139]    Cal. Civ. Code § 1798.91.04.     [140]    Cal. Civ. Code § 1798.91.04(a).     [141]    Cal. Civ. Code § 1798.91.04(b).     [142]    National Conference of State Legislatures, 2018 Security Breach Legislation (Oct. 12, 2018), available at http://www.ncsl.org/research/telecommunications-and-information-technology/2018-security-breach-legislation.aspx.     [143]    S.B. 220, 132nd Leg. Sess. (Ohio 2018), https://legiscan.com/OH/text/SB220/id/1811629.     [144]    Id.     [145]    Matthew Rosenberg et al., How Trump Consultants Exploited the Facebook Data of Millions, N.Y. Times (Mar. 17, 2018), https://www.nytimes.com/2018/03/17/us/politics/cambridge-analytica-trump-campaign.html.     [146]    Facebook, Inc.’s Motion to Dismiss Plaintiffs’ Consolidated Shareholder Derivative Complaint, In re Facebook, Inc. Shareholder Derivative Privacy Litigation, No. 18-CV-01792 (N.D. Cal. Aug. 10, 2018), ECF Nos. 69-71.     [147]    See Plaintiffs’ Opposition to Motion of Defendant Facebook, Inc. 
to Dismiss Plaintiffs’ Consolidated Complaint, In re: Facebook, Inc. Consumer Privacy User Profile Litigation, No. 18-MD-02843 (N.D. Cal. Nov. 30, 2018), ECF No. 208; Reply in Support of Motion of Defendant Facebook, Inc. to Dismiss Plaintiffs’ Consolidated Complaint, In re: Facebook, Inc. Consumer Privacy User Profile Litigation, No. 18-MD-02843 (N.D. Cal. Dec. 21, 2018), ECF Nos. 233-34.     [148]    David Thacker, Expediting Changes to Google+, Google (Dec. 10, 2018), https://www.blog.google/technology/safety-security/expediting-changes-google-plus/; Douglas MacMillan & Robert McMillan, Google Exposed User Data, Feared Repercussions of Disclosing to Public, Wall Street Journal (Oct. 8, 2018), https://www.wsj.com/articles/google-exposed-user-data-feared-repercussions-of-disclosing-to-public-1539017194.     [149]    Lily Hay Newman, A New Google+ Blunder Exposed Data from 52.5 Million Users, Wired (Dec. 10, 2018), https://www.wired.com/story/google-plus-bug-52-million-users-data-exposed/.     [150]    Complaint, Matic v. Google, No. 18-cv-06164 (N.D. Cal. Oct. 8, 2018), ECF No. 1.     [151]    Complaint, Mawardy v. Google, No. 18-cv-05704 (E.D.N.Y. Oct. 11, 2018), ECF No. 1; Complaint, Wicks v. Google, No. 18-cv-06245 (N.D. Cal. Oct. 11, 2018), ECF No. 1; Rhode Island, Rhode Island Suing Google Over Data Breach, Rhode Island, https://www.ri.gov/press/view/34829.     [152]    Nicholas Fandos & Michael Wines, Russia Tried to Undermine Confidence in Voting Systems, Senators Say, N.Y. Times (May 8, 2018), https://www.nytimes.com/2018/05/08/us/politics/russia-2016-election-hackers.html.     [153]    Id.     [154]    The United States Department of Justice Office of Public Affairs, Grand Jury Indicts 12 Russian Intelligence Officers for Hacking Offenses Related to the 2016 Election, Department of Justice (July 13, 2018), https://www.justice.gov/opa/pr/grand-jury-indicts-12-russian-intelligence-officers-hacking-offenses-related-2016-election.     
[155]    Complaint, Democratic National Committee v. The Russian Federation, No. 18-cv-03501 (S.D.N.Y. Apr. 20, 2018), ECF No. 1.     [156]    Memorandum of Law in Support of Defendant WikiLeaks’s Pre-Trial Motion to Dismiss the First Amended Complaint, Democratic National Committee v. the Russian Federation, No. 18-CV-3501 (S.D.N.Y. Dec. 7, 2018), ECF No. 206.     [157]    Marriott International, Marriott Provides Update on Starwood Database Security Incident, Marriott International (Jan. 4, 2019), http://news.marriott.com/2019/01/marriott-provides-update-on-starwood-database-security-incident/.     [158]    See, e.g., Complaint, Vetter v. Marriott Int’l, Inc., No. 19-cv-00094-RWT (D. Md. Jan. 9, 2019), ECF No. 1; Fox v. Marriott Int’l, Inc., No. 18-07936 (N.D. Ill. Dec. 1, 2018), ECF No. 1; Complaint, Perkins v. Marriott Int’l, Inc., No. 18-cv-12477 (D. Mass. Nov. 30, 2018), ECF No. 1.     [159]    Complaint, McGrath v. Marriott Int’l, Inc., No. 18-cv-06845 (E.D.N.Y. Dec. 1, 2018), ECF No. 1.     [160]    Hamza Shaban, Under Armour announces data breach, affecting 150 million MyFitnessPal app accounts, Washington Post (Mar. 29, 2018), https://www.washingtonpost.com/news/the-switch/wp/2018/03/29/under-armour-announces-data-breach-affecting-150-million-myfitnesspal-app-accounts.     [161]    MyFitnessPal, MyFitnessPal Account Security Issue: Frequently Asked Questions, https://content.myfitnesspal.com/security-information/FAQ.html.     [162]    Motion to Compel Arbitration and to Dismiss or Stay Litigation, Murray v. Under Armour, Inc., 18-cv-04032 (C.D. Cal. May 29, 2018), ECF No. 12.     [163]    Hudson’s Bay Co. Customer Data Sec. Breach Litig., 326 F. Supp. 3d 1372 (U.S. Jud. Pan. Mult. Lit. 2018); see also Robert McMillan & Suzanne Kapner, Saks, Lord & Taylor Hit With Data Breach, Wall Street Journal (Apr. 2, 2018), https://www.wsj.com/articles/saks-lord-taylor-hit-with-data-breach-1522598460.     [164]    Id.     [165]    Id.
[166]    See Class Action Complaint, Joseph v. Saks Inc., No. 18-cv-04563 (S.D.N.Y. May 23, 2018), ECF No. 1; Class Action Complaint, Rudolph v. Saks & Co. LLC, No. 18-cv-05107 (C.D. Cal. June 8, 2018), ECF No. 1; Class Action, Beekman v. Lord & Taylor, LLC, No. 18-cv-00521 (D. Del. Apr. 5, 2018), ECF No. 1; Complaint—Class Action, Sacklow v. Saks Inc., No. 3:18-cv-00360 (M.D. Tenn. Apr. 11, 2018), ECF No. 1.     [167]    Order Denying Transfer, In re: Hudson’s Bay Co. Customer Sec. Data Breach Litig., MDL No. 2847 (U.S. Jud. Pan. Mult. Lit. Aug. 1, 2018), ECF No. 50.     [168]    Sears Holdings Corporation Staff, Statement on Data Security Incident, Sears Holdings (Apr. 4, 2018), https://blog.searsholdings.com/shc-updates/statement-on-data-security-incident/.     [169]    Delta Air Lines, Inc., Information on [24]7.AI Cyber Incident, Delta (Apr. 7, 2018), https://www.delta.com/content/www/en_US/response.html.     [170]    Complaint, Naini v. Delta Air Lines, Inc., No. 18-cv-02876 (C.D. Cal. Apr. 6, 2018), ECF No. 1.     [171]    Sean Keane, Macy’s Breach Exposed Customer Data, Credit Card Numbers, CNET (July 11, 2018), https://www.cnet.com/news/macys-data-breach-may-have-seen-customer-info-stolen/; Complaint, Carroll v. Macy’s Inc., No. 18-cv-01060 (N.D. Ala. July 9, 2018), ECF No. 1.     [172]    Thomas H. McCoy Jr., M.D., Temporal Trends and Characteristics of Reportable Health Data Breaches, 2010-2017, 320 Journal of the American Medical Association 1282 (2018).     [173]    Unity Point Health, Notice Regarding Security Incident, UnityPoint Health, https://www.unitypoint.org/filesimages/About/Security%20Substitute%20Notification.pdf.     [174]    Id.     [175]    Id.     [176]    Beth Jones Sanborn, UnityPoint Health System Hit With Cyberattack Affecting 16,000 Patients, Healthcare IT News (Apr. 20, 2018), https://www.healthcareitnews.com/news/unitypoint-health-system-hit-cyberattack-affecting-16000-patients.     [177]    Id.
[178]    Defendant’s Memorandum of Law in Support of Its Motion to Dismiss Plaintiffs’ Second Amended Complaint, Fox v. Iowa Health System d/b/a UnityPoint Health, No. 18-cv-00327 (W.D. Wis. June 29, 2018), ECF No. 8.     [179]    Beth Jones Sanborn, LifeBridge Health Reveals Breach That Compromised Health Data of 500,000 Patients, Health IT News (May 23, 2018), https://www.healthcareitnews.com/news/lifebridge-health-reveals-breach-compromised-health-data-500000-patients; DDS Issues Notice of Potential Breach of Confidential Information, State of California Department of Developmental Services, https://www.dds.ca.gov/SecurityNotice/.     [180]    See, e.g., Complaint, Allen et al. v. Equifax, Inc., No. 1:17-cv-04544 (N.D. Ga. Nov. 10, 2017), ECF No. 1; see also Wolf Richter, Equifax’s Data Breach Will Cost It for Months to Come, Business Insider (Nov. 11, 2017), http://www.businessinsider.com/equifax-data-breach-will-keep-costing-it-for-months-to-come-2017-11.     [181]    See In re: Equifax, Inc. Customer Data Security Breach Litigation, No. 17-md-2800 (N.D. Ga.).     [182]    Memorandum in Support of Defendants’ Motion to Dismiss the Financial Institutions’ Consolidated Amended Complaint, In re: Equifax, Inc. Customer Data Security Breach Litigation, No. 17-md-2800 (N.D. Ga. July 16, 2018), ECF No. 435.     [183]    Memorandum of Law in Support of Motion to Dismiss Consolidated Small Business Class Action Complaint, In re: Equifax, Inc. Customer Data Security Breach Litigation, No. 17-md-2800 (N.D. Ga. Aug. 8, 2018), ECF No. 441.     [184]    Motion Hearing Held on 12/14/2018, In re: Equifax, Inc. Customer Data Security Breach Litigation, No. 17-md-2800 (N.D. Ga. Dec. 14, 2018), ECF No. 534.     [185]    See, e.g., Complaint, Bellwether Comm. Credit Union v. Chipotle Mexican Grill, Inc., No. 1:17-cv-01102 (D. Colo. May 4, 2017), ECF No. 1.     [186]    Order, Bellwether Community Credit Union v. Chipotle Mexican Grill, Inc., No. 17-cv-1102 (D. Colo. Oct.
24, 2018), ECF No. 83.     [187]    Id.     [188]    In re: U.S. Office of Pers. Mgmt. Data Sec. Breach Litig., 266 F. Supp. 3d 1 (D.D.C. 2017).     [189]    Brief for Appellants National Treasury Employees Union, In re: U.S. Office of Personnel Management Data Security Breach Litigation, No. 17-5217 (D.C. Cir. May 10, 2018), ECF No. 83; Brief of Amici Curiae Electronic Privacy Information Center (EPIC), In re: U.S. Office of Personnel Management Data Security Breach Litigation, No. 17-5217 (D.C. Cir. May 17, 2018), ECF No. 42.     [190]    Id.     [191]    Louis C. LaBrecque, Unions, Federal Agency Set for Court Clash in Data Breach Case, Bloomberg Law (Nov. 2, 2018), https://news.bloomberglaw.com/daily-labor-report/unions-federal-agency-set-for-court-clash-in-data-breach-case.     [192]    Attias v. CareFirst Inc., 865 F.3d 620, 622 (D.C. Cir. 2017).     [193]    Id. at 623.     [194]    CareFirst, Inc. v. Attias, 138 S. Ct. 981 (2018).     [195]    Petition for a Writ of Certiorari, CareFirst, Inc. v. Attias, No. 17-641 (U.S. Oct. 30, 2017).     [196]    In re Zappos.com, Inc., 888 F.3d 1020, 1022 (9th Cir. 2018).     [197]    Id.     [198]    Petition for a Writ of Certiorari, Zappos.com, Inc. v. Stevens, No. 18-225 (U.S. Aug. 20, 2018).     [199]    Dieffenbach v. Barnes & Noble, Inc., 887 F.3d 826, 829 (7th Cir. 2018).     [200]    Id. at 828.     [201]    Id. at 830.     [202]    136 S. Ct. 1540 (2016).     [203]    Attias v. Carefirst, Inc., 865 F.3d 620, 628 (D.C. Cir. 2017).     [204]    CareFirst, Inc. v. Attias, 138 S. Ct. 981 (2018); Petition for a Writ of Certiorari, CareFirst, Inc. v. Attias, 865 F.3d 620 (D.C. Cir. Aug. 15, 2017).     [205]    In re Zappos.com, Inc., 888 F.3d 1020 (9th Cir. 2018).     [206]    Id.     [207]    Petition for a Writ of Certiorari, Zappos.com, Inc. v. Stevens, No. 18-225 (U.S. Aug. 20, 2018).     [208]    Whalen v. Michaels Stores, Inc., 689 F. App’x 89 (2d Cir. 2017); Beck v. McDonald, 848 F.3d 262 (4th Cir.), cert. 
denied sub nom. Beck v. Shulkin, 137 S. Ct. 2307 (2017); In re SuperValu, Inc., 870 F.3d 763 (8th Cir. 2017).     [209]    See Complaint, In re The Wendy’s Co. Shareholder Derivative Action, No. 1:16-cv-01153 (S.D. Ohio Dec. 16, 2016), ECF No. 1; Memorandum of Law in Support of Plaintiff Thomas Caracci’s Motion for a Status Conference, In re The Wendy’s Co. Shareholder Derivative Action, No. 1:16-cv-01153 (S.D. Ohio Dec. 6, 2018), ECF No. 48.  A consumer class action lawsuit against Wendy’s arising out of the same data breach also settled.  See Torres v. Wendy’s International, LLC, No. 6:16-cv-00210 (M.D. Fla.).     [210]    Plaintiff James Graham’s Motion for Preliminary Approval of Derivative Litigation Settlement, In re The Wendy’s Co. Shareholder Derivative Action, No. 1:16-cv-01153, 2018 WL 2328335 (S.D. Ohio May 6, 2018), ECF No. 41.     [211]    Id.     [212]    Id.     [213]    Class Action Complaint, Wicks v. Alphabet, Inc., No. 4:18-cv-06245, 2018 WL 4941767 (N.D. Cal. Oct. 11, 2018), ECF No. 1.     [214]    Id.     [215]    Order Granting Stipulation, Wicks v. Alphabet, Inc., No. 4:18-cv-06245 (N.D. Cal. Nov. 7, 2018), ECF No. 14.     [216]    Class Act[ion] Complaint, Shah v. Chegg, Inc., No. 3:18-cv-05956 (N.D. Cal. Sept. 27, 2018), ECF No. 1.     [217]    Id.     [218]    See Order, Shah v. Chegg, Inc., No. 3:18-cv-06714 (N.D. Cal. Dec. 12, 2018), ECF No. 10; Class Action Complaint, Kurland v. Chegg, Inc., No. 3:18-cv-06714, 2018 WL 5835331 (N.D. Cal. Nov. 5, 2018), ECF No. 1.     [219]    See In re Anthem, Inc. Data Breach Litig., 162 F. Supp. 3d 953, 967 (N.D. Cal. 2016).     [220]    See generally Order Granting Motion for Preliminary Approval of Class Action Settlement, In re Anthem, No. 5:15-md-02617-LHK (N.D. Cal. Aug. 25, 2017), ECF No. 903.     [221]    See Order, In re Anthem, No. 5:15-md-02617-LHK (N.D. Cal. Aug. 15, 2018), ECF No. 1046.     [222]    Id.
[223]    Letter to Customers: T-Mobile’s CEO on Experian’s Data Breach, https://www.t-mobile.com/customers/experian-data-breach.     [224]    Id.     [225]    Class Action Settlement Agreement and Release, In re Experian Data Breach Litigation, No. 15-CV-01592 (C.D. Cal. Nov. 12, 2018), ECF No. 285.     [226]    Id.     [227]    Id.     [228]    See Brief for Petitioners, Frank v. Gaos, No. 17-961 (U.S. July 9, 2018).     [229]    Jimmy Hoover, Google Settlement Snubbed Class Members, Justices Told, Law360 (Oct. 31, 2018), https://www.law360.com/articles/1097634/google-settlement-snubbed-class-members-justices-told.     [230]    Brief for Petitioners, Frank v. Gaos, No. 17-961 (U.S. July 9, 2018).     [231]    Ronald Mann, Argument Analysis: Justices Skeptical of “Cy Pres” Class-Action Settlements, SCOTUSBlog (Nov. 1, 2018), http://www.scotusblog.com/2018/11/argument-analysis-justices-skeptical-of-cy-pres-class-action-settlements/.     [232]    Id.     [233]    Order, Frank v. Gaos, No. 17-961 (U.S. Nov. 6, 2018).     [234]    See Order, In re Anthem, No. 5:15-md-02617-LHK (N.D. Cal. Aug. 15, 2018), ECF No. 1046.     [235]    See Final Order and Judgment at 3–6, In re Home Depot, No. 1:14-md-02583-TWT (N.D. Ga. Sept. 22, 2017), ECF No. 343.     [236]    Order Granting Final Approval of Class Action Settlement and Final Judgment, In re Home Depot, No. 1:14-md-02583-TWT (N.D. Ga. Aug. 23, 2016), ECF No. 260 (adopting Settlement Agreement, ECF No. 181-2); Order Granting Consumer Plaintiffs’ Motion for Service Awards, Attorneys’ Fees and Litigation Expense Reimbursement, No. 1:14-md-02583-TWT (N.D. Ga. Aug. 23, 2016), ECF No. 261 (adopting Settlement Agreement, ECF No. 181-2).     [237]    Mem. and Order Granting Mot. for Final Approval of Financial Institutions’ Class Action Settlement and Mot. for Att’y Fees and Expenses and Service Payments, In re Target, No. 0:14-md-02522-PAM (D. Minn. May 12, 2016), ECF No. 758 (adopting Settlement Agreement, ECF No. 653-1).
[238]    Robin Sidel, Target to Settle Claims Over Data Breach, Wall St. J. (Aug. 18, 2015, 5:10 PM ET), http://www.wsj.com/articles/target-reaches-settlement-with-visa-over-2013-data-breach-1439912013.     [239]    Final Approval of Class Settlement, In re Sony, No. 2:14-cv-09600-RGK-E (C.D. Cal. Apr. 6, 2016), ECF No. 165 (approving Settlement Agreement, ECF No. 146-1); Order on Mot. for Att’y Fees, Costs, and Service Awards at 3, In re Sony, No. 2:14-cv-09600-RGK-E (C.D. Cal. Apr. 12, 2016), ECF No. 166.     [240]    St. Joseph Health System Med. Info. Cases, JCCP No. 4716 (Cal. Sup. Ct.).  Gibson Dunn represented St. Joseph in this case.     [241]    Mem. and Order Granting Mot. for Final Approval of Consumer Settlement and Mot. for Payment of Service Awards and Fees and Expenses, In re Target, No. 0:14-md-02522-PAM (D. Minn. Nov. 16, 2016), ECF No. 645 (approving Settlement Agreement, ECF No. 358-1).     [242]    Order Granting Final Approval of Class Action Settlement, In re LinkedIn User Privacy Litig., No. 12-CV-03088-EJD (N.D. Cal. Sept. 15, 2015), ECF No. 147 (approving Settlement Agreement, ECF No. 145-1).     [243]    Mot. for Approval of Voluntary Dismissal, In re Adobe Systems Inc. Privacy Litig., No. 5:13-CV-05226-LHK (N.D. Cal. June 9, 2015), ECF No. 87; Settlement Agreement, In re Adobe Systems Inc. Privacy Litig., No. 5:13-CV-05226-LHK (N.D. Cal. June 9, 2015), ECF No. 87-2.     [244]    Min. Order Granting Motion for Settlement, In re Sony Gaming Networks & Customer Data Sec. Breach Litig., No. 3:11-md-02258 (S.D. Cal. May 4, 2015), ECF No. 210; Settlement Agreement, In re Sony Gaming Networks, No. 3:11-md-02258 (S.D. Cal. June 13, 2014), ECF No. 190-2.     [245]    Cooper v. Slice Techs., Inc., No. 17-CV-7102 (JPO), 2018 WL 2727888, at *5 (S.D.N.Y. June 6, 2018).     [246]    Id.     [247]    Id.     [248]    Id. at *4.     [249]    Id. (quoting the agreement language in the plaintiffs’ complaint).     [250]    18 U.S.C. § 2511(2)(d).
[251]    Cal. Penal Code § 630, et seq.     [252]    See Bona Fide Conglomerate, Inc. v. SourceAmerica, No. 3:14-CV-00751-GPC, 2016 WL 3543699, at *6 (S.D. Cal. June 29, 2016) (citing Valentine v. NebuAd, Inc., 804 F. Supp. 2d 1022, 1028 (N.D. Cal. 2011)); see also Carrese v. Yes Online Inc., No. 16-CV-05301-SJO, 2016 WL 6069198, at *4 (C.D. Cal. Oct. 13, 2016).     [253]    Mulder v. Wells Fargo Bank, N.A., No. 2:18-CV-00029, 2018 WL 3750627, at *7 (W.D. Pa. July 10, 2018).     [254]    Id. at *1.     [255]    Id. at *4-5.     [256]    Rojas v. HSBC Card Servs. Inc., 20 Cal. App. 5th 427 (Ct. App. 2018).     [257]    Id. at 430.     [258]    Id.     [259]    Id. at 433-34.     [260]    Id. at 435 (quoting People v. Superior Court of Los Angeles Cty., 70 Cal. 2d 123, 134 (1969)) (alterations omitted).     [261]    Id.     [262]    Plaintiff’s Motion for Preliminary Approval of Proposed Class Action Settlement (Unopposed), In re Vizio, Inc., Consumer Privacy Litig., No. 8:16-ml-02693-JLS-KES (C.D. Cal. Oct. 4, 2018), ECF No. 282-2.     [263]    Complaint, In re Vizio, Inc., Consumer Privacy Litig., No. 8:16-ml-02693-JLS-KES (C.D. Cal. Mar. 23, 2017), ECF No. 1.     [264]    Id.     [265]    Second Consolidated Amended Complaint, In re Vizio, Inc., Consumer Privacy Litig., No. 8:16-ml-02693-JLS-KES (C.D. Cal. Mar. 23, 2017), ECF No. 136.     [266]    Plaintiff’s Motion for Preliminary Approval of Proposed Class Action Settlement (Unopposed), In re Vizio, Inc., Consumer Privacy Litig., No. 8:16-ml-02693-JLS-KES (C.D. Cal. Oct. 4, 2018), ECF No. 282-2.     [267]    Cohen v. Casper Sleep Inc., No. 17CV9325, 2018 WL 3392877, at *1 (S.D.N.Y. July 12, 2018).     [268]    Id.     [269]    Id. at *3.     [270]    Id. at *4.     [271]    Id. at *5 (quoting Ashcroft v. Iqbal, 556 U.S. 662, 678 (2009)).     [272]    Allen v. Quicken Loans Inc., No. CV1712352ESMAH, 2018 WL 5874088, at *2 (D.N.J. Nov. 9, 2018).     [273]    Id. at *4 (internal quotation marks omitted).     [274]    Id.
at *12.     [275]    47 U.S.C. §§ 227 et seq.     [276]    885 F.3d 687 (D.C. Cir. 2018).  Gibson Dunn represented the U.S. Chamber of Commerce, one of the petitioners, in this case.     [277]    Id. at 695.     [278]    47 U.S.C. § 227(a)(1).     [279]    Id. at 696-97.     [280]    Id. at 697.     [281]    47 U.S.C. § 227(a)(1).     [282]    ACA International, 885 F.3d at 701.     [283]    Id. at 702.     [284]    Id. at 703.     [285]    894 F.3d 116, 120-21 (3d Cir. 2018).     [286]    Id. at 121.     [287]    Id.     [288]    904 F.3d 1041 (9th Cir. 2018).     [289]    Id. at 1051-53.     [290]    Id.     [291]    Id. at 1052.     [292]    Federal Communications Commission, Consumer And Governmental Affairs Bureau Seeks Comment On Interpretation Of The Telephone Consumer Protection Act In Light Of The D.C. Circuit’s ACA International Decision (May 14, 2018), available at https://ecfsapi.fcc.gov/file/0514497027768/DA-18-493A1.pdf.     [293]    Federal Communications Commission, Consumer And Governmental Affairs Bureau Seeks Further Comment On Interpretation Of The Telephone Consumer Protection Act In Light Of The Ninth Circuit’s Marks v. Crunch San Diego, LLC Decision (Oct. 3, 2018), available at https://ecfsapi.fcc.gov/file/0514497027768/DA-18-493A1.pdf.     [294]    No. 17-1705, 2018 WL 3127423 (2018).     [295]    Id.     [296]    See Chevron, U.S.A., Inc. v. Nat. Res. Def. Council, Inc., 467 U.S. 837, 843 (1984).     [297]    Carlton & Harris Chiropractic v. PDR Network, 882 F.3d 459, 464 (4th Cir. 2018).     [298]    Id.     [299]    No. 17-1705, 2018 WL 3127423 (U.S. Nov. 13, 2018).     [300]    Griffin Connolly, Lawmakers Want to Curb Those Pesky Robocalls to Your Phone, Roll Call (June 11, 2018), available at https://www.rollcall.com/news/policy/lawmakers-want-curb-pesky-robocalls-phone.     [301]    Id.
[302]    United States Senate Committee on Commerce, Science, & Transportation, Press Release, Bipartisan TRACED Act Cracks Down on Illegal Robocall Scams (Nov. 16, 2018), available at https://www.commerce.senate.gov/public/index.cfm/pressreleases?ID=91889B92-62FE-4AF1-A6A4-D26E7E2F296F.     [303]    White v. Samsung Electronics America, Inc., et al., No. 17-1775 (D.N.J. Sept. 26, 2018).     [304]    Id. at *4-5.     [305]    18 U.S.C. § 2710(a)(3).     [306]    Kevin Draper, Madison Square Garden Has Used Face-Scanning Technology on Customers, New York Times (Mar. 13, 2018), available at https://www.nytimes.com/2018/03/13/sports/facial-recognition-madison-square-garden.html.     [307]    Sopan Deb and Natasha Singer, Taylor Swift Said to Use Facial Recognition to Identify Stalkers, New York Times (Dec. 13, 2018), available at https://www.nytimes.com/2018/12/13/arts/music/taylor-swift-facial-recognition.html.     [308]    Mastercard Biometric Card FAQs, Mastercard.com, available at https://www.mastercard.us/en-us/merchants/safety-security/biometric-card.html.     [309]    California Consumer Privacy Act of 2018, Cal. Civ. Code § 1798.140(o)(1)(E).     [310]    740 Ill. Comp. Stat. Ann. 14/20.     [311]    2018 WL 4699213 (Ill. App. Ct. 1st Dist. Sept. 28, 2018).     [312]    Id. at *1.     [313]    2017 WL 6523910 (Ill. App. Ct. 2d Dist. Dec. 21, 2017).  Rosenbach v. Six Flags Entertainment Corp. is discussed in further detail in last year’s Year-End Update:  https://www.gibsondunn.com/us-cybersecurity-and-data-privacy-outlook-and-review-2018/.     [314]    2018 WL 4699213, at *1.     [315]    2018 WL 2445292 (N.D. Ill. May 31, 2018).     [316]    2018 WL 4699213, at *1.     [317]    Rosenbach v. Six Flags Entm’t Corp., No. 123186 ¶ 1 (Ill. Jan. 25, 2019).     [318]    Id. ¶ 34.     [319]    Id. ¶ 37.     [320]    S.B. 3053, 2018 Reg. Sess. (Ill. 2018).
[321]    Peter Newman, The Internet of Things 2018 Report: How the IoT is Evolving to Reach the Mainstream with Businesses and Consumers, Business Insider Intelligence (Feb. 26, 2018), available at http://www.businessinsider.com/the-internet-of-things-2017-report-2018-2-26-1.     [322]    Senate Bill No. 327, Assembly Bill No. 1906, California 2017-2018 Regular Session (codified at Title 1.81.26 of Part 4 of Division 3 of the California Civil Code); see Gibson Dunn Client Alert: New California Security of Connected Devices Law and CCPA Amendments (Oct. 5, 2018), available at https://www.gibsondunn.com/new-california-security-of-connected-devices-law-and-ccpa-amendments/.     [323]    Id. at § 1798.91.04(a), 1798.91.05(b).     [324]    Id. at § 1798.91.04(b).     [325]    Id. at § 1798.91.06(a)-(b).     [326]    Id. at § 1798.91.06(e).     [327]    See Theo Douglas, California Governor Approves Bills Tightening Security, Privacy of IoT Devices, Govtech.com (Sept. 28, 2018), available at http://www.govtech.com/applications/Two-Bills-Before-California-Governor-Would-Tighten-Security-Privacy-of-IoT-Devices.html.  The CMTA explained that because the bill only applies to California manufacturers, it creates a “loophole” for imported devices to avoid the security feature requirements, thereby making the state less attractive for manufacturers.     [328]    Id.     [329]    Robert Graham, California’s bad IoT law, Errata Security blog (Sept. 10, 2018), available at https://blog.erratasec.com/2018/09/californias-bad-iot-law.html#.XCY-9cL2bmi.     [330]    Derek Hawkins, The Cybersecurity 202: California’s Internet of Things cybersecurity bill could lay groundwork for federal action, The Washington Post (Sept. 17, 2018).     [331]    H.R. 6032, 115th Cong. (2018).
[332]    Fact Sheet, Internet of Things Cybersecurity Improvement Act of 2017, available at https://www.warner.senate.gov/public/_cache/files/8/6/861d66b8-93bf-4c93-84d0-6bea67235047/8061BCEEBF4300EC702B4E894247D0E0.iot-cybesecurity-improvement-act—fact-sheet.pdf.     [333]    H.R. 1234, 115th Cong. (2018).     [334]    S. 88 and H.R. 686, 115th Cong. (2017).  The Act was passed by the Senate in August 2017 but has not yet passed the House.     [335]    H.R. 4163 and S. 2020, 115th Cong. (2017).     [336]    S. 2234, 115th Cong. (2017).     [337]    Notice by Consumer Product Safety Commission, 83 FR 13122, available at https://www.federalregister.gov/documents/2018/03/27/2018-06067/the-internet-of-things-and-consumer-product-hazards.     [338]    Public Hearing on the Internet of Things and Consumer Product Hazards, U.S. Consumer Product Safety Commission (May 16, 2018), available at https://cpsc.gov/s3fs-public/Panelists%20Presentations%20-%20IoT%20and%20Consumer%20Product%20Hazards%20%20Public%20Hearing%20-%20May%2016%202018.pdf?q3A.aOH4qiLleXB3TybNrHi9mwt4yM77.     [339]    Comments of the Staff of the Federal Trade Commission’s Bureau of Consumer Protection, In the Matter of The Internet of Things and Consumer Product Hazards, Docket No. CPSC-2018-007 (June 15, 2018), https://www.ftc.gov/system/files/documents/advocacy_documents/comment-staff-federal-trade-commissions-bureau-consumer-protection-consumer-product-safety/p185404_ftc_staff_comment_to_the_consumer_product_safety_commission.pdf.     [340]    Id. at 6-8.     [341]    Id. at 4, 11.     [342]    Prepared Remarks of Commissioner Rebecca Kelly Slaughter, Visions and Goals for the Future of IoT in the USA and Globally, Federal Trade Commission (Oct. 4, 2018), available at https://www.ftc.gov/system/files/documents/public_statements/1414540/20181004_prepared_remarks_of_commissioner_slaughter_for_the_forum_global_6th_annual_iot_global.pdf.     [343]    Id. at 2.     [344]    Id. at 3-4.     [345]    Id.
at 5-6.     [346]    Order Granting in part and Denying in part Defs.’ Mot. for Summ. Judgment at 1, Flynn v. FCA US LLC, No. 15-cv-00855-MJR-DGW (S.D. Ill. July 5, 2018), ECF No. 399; Compl., Flynn v. FCA US LLC, No. 15-cv-00855-MJR-DGW, 2017 WL 3592040 (S.D. Ill. Dec. 22, 2015).     [347]    Order Granting in part and Denying in part Defs.’ Mot. for Summ. Judgment at 1, Flynn v. FCA US LLC, No. 15-cv-00855-MJR-DGW (S.D. Ill. July 5, 2018), ECF No. 399.  As discussed in last year’s Review, in August 2017, the court dismissed all of the plaintiffs’ claims that possible future car hacking could cause injury or death but allowed plaintiffs to pursue claims that they overpaid for the vehicles in light of the alleged system vulnerabilities.  Flynn v. FCA US LLC, No. 15-cv-00855-MJR-DGW, 2017 WL 3592040, at *5 (S.D. Ill. Aug. 21, 2017).     [348]    Order at 1, Flynn v. FCA US LLC, No. 15-cv-00855-MJR-DGW, 2017 WL 3592040, at *5 (S.D. Ill. Nov. 29, 2018), ECF No. 448.     [349]    Plaintiffs’ Notice of Motion and Motion for Preliminary Approval of Proposed Class Action Settlement (Unopposed), In re: Vizio, Inc., Consumer Privacy Litigation, No. 8:16-ml-02693 (C.D. Cal. Oct. 4, 2018), ECF No. 282.     [350]    Id.     [351]    Order Granting Plaintiffs’ Motion for Preliminary Approval of Class Action Settlement, In re: Vizio, Inc., Consumer Privacy Litigation, No. 8:16-ml-02693 (C.D. Cal. Jan. 4, 2019), ECF No. 297.     [352]    VIZIO to Pay $2.2 Million to FTC, State of New Jersey to Settle Charges It Collected Viewing Histories on 11 Million Smart Televisions without Users’ Consent, Federal Trade Commission (Feb. 6, 2017), available at https://www.ftc.gov/news-events/press-releases/2017/02/vizio-pay-22-million-ftc-state-new-jersey-settle-charges-it.     [353]    18 U.S.C. § 1030.     [354]    See EF Cultural Travel BV v. Explorica Inc., 274 F.3d 577, 581-82 (1st Cir. 2001); United States v. John, 597 F.3d 263, 272-73 (5th Cir. 2010); Int’l Airport Ctrs., LLC v. Citrin, 440 F.3d 418, 420-21 (7th Cir.
2006); United States v. Rodriguez, 628 F.3d 1258, 1263-64 (11th Cir. 2010).     [355]    See United States v. Valle, 807 F.3d 508, 523-28 (2d Cir. 2015); WEC Carolina Energy Sols. LLC v. Miller, 687 F.3d 199, 204-07 (4th Cir. 2012); United States v. Nosal, 676 F.3d 854, 856-63 (9th Cir. 2012) (en banc).     [356]    291 F. Supp. 3d 659, 666 (E.D. Pa. 2018).     [357]    Id. at 669.     [358]    Id. at 670.     [359]    No. 17 C 06318, 2018 WL 2933636 (N.D. Ill. June 12, 2018).     [360]    Id. at *1-2.     [361]    Id. at *3.     [362]    Id.     [363]    Ticketmaster L.L.C. v. Prestige Entm’t, Inc., 306 F. Supp. 3d 1164 (C.D. Cal. 2018).     [364]    Id. at 1175.     [365]    Ticketmaster L.L.C. v. Prestige Entm’t W., Inc., 315 F. Supp. 3d 1147, 1171–72 (C.D. Cal. 2018).     [366]    315 F. Supp. 3d 1, 23 (D.D.C. 2018).     [367]    Cyber attack victims face disputes with insurers, Financial Times (Dec. 2, 2018), https://www.ft.com/content/3679fd84-e9c2-11e8-a34c-663b3f553b35.     [368]    Alicja Grzadkowska, How cybercrime and coverage evolved in 2018, Insurance Business America (Dec. 12, 2018), https://www.insurancebusinessmag.com/us/news/cyber/how-cybercrime-and-coverage-evolved-in-2018-118721.aspx.     [369]    Justin Lynch, Cyberattacks are increasing, and so is cyber insurance, Fifth Domain (Dec. 10, 2018), https://www.fifthdomain.com/industry/2018/12/10/cyberattacks-are-increasing-and-so-is-cyber-insurance.     [370]    See Scott Neil, Marriott breach underlines cyber-risk scale, Royal Gazette (Dec. 7, 2018), http://www.royalgazette.com/re-insurance/article/20181207/marriott-breach-underlines-cyber-risk-scale.     [371]    See Erin Illman & Alex Purvis, 2 Recent Decisions May Affect your Cyber Policy, Law360 (Nov. 2, 2018), https://www.law360.com/articles/1098216.     [372]    See Scott Neil, Marriott breach underlines cyber-risk scale, Royal Gazette (Dec.
7, 2018), http://www.royalgazette.com/re-insurance/article/20181207/marriott-breach-underlines-cyber-risk-scale; Jeff Sistrunk, Top Insurance Legislation & Regulation Stories of 2018, Law360 (Dec. 13, 2018), https://www.law360.com/articles/1109766/top-insurance-legislation-regulation-stories-of-2018.     [373]    See, e.g., Sompo International Forms Cyber Team, Expands Insurance Offering, Insurance Journal (Dec. 11, 2018), https://www.insurancejournal.com/news/national/2018/12/11/511645.htm; AXA XL Adds Cybersecurity Services to Cyber Insurance Program, Insurance Journal (Nov. 30, 2018), https://www.insurancejournal.com/news/national/2018/11/30/510695.htm.     [374]    See, e.g., Sompo International Forms Cyber Team, Expands Insurance Offering, Insurance Journal (Dec. 11, 2018), https://www.insurancejournal.com/news/national/2018/12/11/511645.htm.     [375]    See, e.g., AXA XL Adds Cybersecurity Services to Cyber Insurance Program, Insurance Journal (Nov. 30, 2018), https://www.insurancejournal.com/news/national/2018/11/30/510695.htm.     [376]    See Alicja Grzadkowska, How cybercrime and coverage evolved in 2018, Insurance Business America (Dec. 12, 2018), https://www.insurancebusinessmag.com/us/news/cyber/how-cybercrime-and-coverage-evolved-in-2018-118721.aspx.     [377]    See Terry Gangcuangco, Cyber insurance hit with barrage of criticism as disputes mount, Insurance Business UK (Dec. 3, 2018), https://www.insurancebusinessmag.com/uk/news/cyber/cyber-insurance-hit-with-barrage-of-criticism-as-disputes-mount-117720.aspx.     [378]    See Justin Lynch, Cyberattacks are increasing, and so is cyber insurance, Fifth Domain (Dec. 10, 2018), https://www.fifthdomain.com/industry/2018/12/10/cyberattacks-are-increasing-and-so-is-cyber-insurance.     [379]    See Cyber attack victims face disputes with insurers, Financial Times (Dec. 2, 2018), https://www.ft.com/content/3679fd84-e9c2-11e8-a34c-663b3f553b35.
[380]    Jeff Sistrunk, The Biggest Property & Casualty Insurance Decisions of 2018, Law360 (Dec. 14, 2018), https://www.law360.com/articles/1102073/the-biggest-property-casualty-insurance-decisions-of-2018.     [381]    Medidata Sols. Inc. v. Fed. Ins. Co., 729 F. App’x 117, 119 (2d Cir. 2018).     [382]    Medidata Sols., Inc. v. Fed. Ins. Co., No. 15-CV-907 (ALC), 2017 WL 3268529, at *1–2 (S.D.N.Y. July 21, 2017).     [383]    Id. at *5.     [384]    Medidata Sols. Inc., 729 F. App’x at 118.     [385]    Id. (citing Universal Am. Corp. v. Nat’l Union Fire Ins. Co. of Pittsburgh, Pa., 25 N.Y.3d 675, 681 (2015)).     [386]    Id.     [387]    Id.     [388]    Id. at 119.     [389]    American Tooling Ctr., Inc. v. Travelers Cas. and Sur. Co. of Am., No. 16-12108, 2017 WL 3263356, at *3 (E.D. Mich. Aug. 1, 2017).     [390]    American Tooling Ctr., Inc. v. Travelers Cas. and Sur. Co. of Am., 895 F.3d 455, 462 (6th Cir. 2018).     [391]    Id. at 463.     [392]    St. Paul Fire & Marine Ins. Co. v. Rosen Millennium, Inc., 2018 WL 4732718, at *1 (M.D. Fla. 2018).     [393]    Id.     [394]    Id. at *5.     [395]    Id. at *6.     [396]    Innovak International, Inc. v. Hanover Insurance Company, 280 F. Supp. 3d 1340 (M.D. Fla. 2017).     [397]    Id. at 1343.     [398]    St. Paul Fire, 2018 WL 4732718, at *5 (quoting Innovak International, Inc., 280 F. Supp. 3d at 1342, 1348) (internal alterations and citations omitted).     [399]    Id.     [400]    St. Paul Fire & Marine Ins. v. Rosen Hotels & Resorts, Inc., et al., No. 18-14427-A (11th Cir. 2018).     [401]    Complaint, Nat’l Bank of Blacksburg v. Everest Nat’l Insurance Co., No. 7:18-cv-00310-GEC (W.D. Va. June 28, 2018).     [402]    Id.     [403]    Id.     [404]    Answer, Nat’l Bank of Blacksburg v. Everest Nat’l Insurance Co., No. 7:18-cv-00310-GEC (W.D. Va. July 20, 2018).     [405]    18 U.S.C. §§ 2510-22.     [406]    U.S. Dep’t of Justice, Electronic Communications Privacy Act of 1986 (ECPA), 18 U.S.C.
§§ 2510-22 (July 30, 2017), https://it.ojp.gov/privacyliberty/authorities/statutes/1285.     [407]    18 U.S.C. § 2511.     [408]    18 U.S.C. §§ 2701-12.     [409]    Id.     [410]    Id.     [411]    Clarifying Lawful Overseas Use of Data Act, Pub. L. No. 115-141, §§ 101-106 (2018).     [412]    H.R. 387, 115th Cong. (2017).     [413]    H.R. 5515, 115th Cong. (2018).     [414]    United States v. Warshak, 631 F.3d 266 (6th Cir. 2010).     [415]    18 U.S.C. § 2703.     [416]    Letter from ACT: The App Association et al. to John McCain et al., U.S. Senate (July 13, 2018), https://cdt.org/files/2018/07/Email-Privacy-NDAA-sign-on-letter-final.pdf.     [417]    H.R. Rep. No. 115-874, at 965 (2018).     [418]    S. 1654, 115th Cong. (2017) (noting that the bill was twice referred to the Senate Judiciary Committee).     [419]    United States v. Microsoft Corp., 138 S. Ct. 356 (2017).     [420]    United States v. Microsoft Corp., 138 S. Ct. 1186, 1187 (2018).     [421]    Brief for Respondent at 20-37, United States v. Microsoft Corp., 138 S. Ct. 1186 (2018) (No. 17-2).     [422]    Brief for Petitioner at 21-25, United States v. Microsoft Corp., 138 S. Ct. 1186 (2018) (No. 17-2).     [423]    Oral Argument at 6, United States v. Microsoft Corp., 138 S. Ct. 1186 (2018) (No. 17-2).     [424]    18 U.S.C. § 2713 (emphasis added).     [425]    Microsoft, 138 S. Ct. at 1188 (explaining that “[n]o live dispute remains between the parties over the issue with respect to which certiorari was granted.”); see also Gibson Dunn Client Alert: Supreme Court Holds That Recent Legislation Moots Dispute Over Emails Stored Overseas (April 17, 2018), available at https://www.gibsondunn.com/supreme-court-holds-that-recent-legislation-moots-dispute-over-emails-stored-overseas/.     [426]    18 U.S.C. § 2703(h)(2).     [427]    18 U.S.C. § 2523.     [428]    50 U.S.C. §§ 1801-1805.     [429]    H. Permanent Select Comm.
on Intelligence, FISA Section 702, https://intelligence.house.gov/fisa-702/.     [430]    50 U.S.C. § 1801(e).     [431]    50 U.S.C. § 103(a).     [432]    James C. Duff, Report of the Director of the Administrative Office of the U.S. Courts on Activities of the Foreign Intelligence Surveillance Courts for 2017, Administrative Office of the United States Courts (Apr. 25, 2018), http://www.uscourts.gov/sites/default/files/ao_foreign_int_surveillance_court_annual_report_2017.pdf.     [433]    H. Permanent Select Comm. on Intelligence, FISA Section 702, supra note 429.     [434]    See, e.g., United States v. Hasbajrami, No. 11-CR-623 (JG), 2016 WL 1029500 at *16 (E.D.N.Y. Mar. 18, 2016) (denying defendant’s constitutional challenge to Section 702 arguing that the FBI was required to obtain a warrant before acquiring U.S. persons’ communications incidentally gathered through lawful targeting of foreign persons).     [435]    Pub. L. 115-118, § 103.     [436]    Id. § 101(a).     [437]    Id. § 203(a)(1).     [438]    See Gibson Dunn Client Alert: Supreme Court Holds That Individuals Have Fourth Amendment Privacy Rights In Cell Phone Location Records (June 22, 2018), available at https://www.gibsondunn.com/supreme-court-holds-that-individuals-have-fourth-amendment-privacy-rights-in-cell-phone-location-records/.     [439]    Carpenter v. United States, 138 S. Ct. 2206, 2212 (2018).     [440]    Id. at 2213.     [441]    Id. at 2219; see Smith v. Maryland, 442 U.S. 735, 743-44 (1979).     [442]    Carpenter, 138 S. Ct. at 2223.     [443]    Id. at 2220, 2223.     [444]    Id. at 2223.     [445]    Id. at 2220 (citing Riley v. California, 124 S. Ct. 2473, 2484 (2014)).     [446]    Carpenter, 138 S. Ct. at 2220.     [447]    Id. at 2216.     [448]    See United States v. Jones, 565 U.S. 400, 404 (2012); Riley, 124 S. Ct. at 2485.     [449]    Alasaad v. Nielsen, No. 17-CV-11730-DJC, 2018 WL 2170323, at *20 (D. Mass. May 9, 2018).     [450]    Id. at *14.     
[451]    Riley, 124 S. Ct. at 2485.     [452]    Alasaad, 2018 WL 2170323, at *20.     [453]    United States v. Kolsuz, 890 F.3d 133, 136-37 (4th Cir. 2018).     [454]    Id. at 136.     [455]    Id.     [456]    Id. at 137.     [457]    Id.     [458]    Id. The following Gibson Dunn lawyers prepared this client update: Alexander Southwell, Ryan Bergsieker, Eric Vandevelde, Howard Hogan, Josh Jessen, Lindsey Young, Jeremy Smith, Amy Chmielewski, Reid Rector, Cassandra Gaedt-Scheckter, Alexandra Perloff-Giles, Tony Bedel, Zoey Goldnick, Lucie Duvall, Josiah Clarke, Craig Streit, Jon Newmark, Sheri Pan, Jacob Rierson, and Luke Sullivan. Gibson Dunn’s lawyers are available to assist with any questions you may have regarding these issues.  For further information, please contact the Gibson Dunn lawyer with whom you usually work or any of the following leaders and members of the firm’s Privacy, Cybersecurity and Consumer Protection practice grou: United States Alexander H. Southwell – Co-Chair, New York (+1 212-351-3981, asouthwell@gibsondunn.com) M. Sean Royall – Dallas (+1 214-698-3256, sroyall@gibsondunn.com) Debra Wong Yang – Los Angeles (+1 213-229-7472, dwongyang@gibsondunn.com) Ryan T. Bergsieker – Denver (+1 303-298-5774, rbergsieker@gibsondunn.com) Christopher Chorba – Los Angeles (+1 213-229-7396, cchorba@gibsondunn.com) Richard H. Cunningham – Denver (+1 303-298-5752, rhcunningham@gibsondunn.com) Howard S. Hogan – Washington, D.C. (+1 202-887-3640, hhogan@gibsondunn.com) Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com) Kristin A. Linsley – San Francisco (+1 415-393-8395, klinsley@gibsondunn.com) H. Mark Lyon – Palo Alto (+1 650-849-5307, mlyon@gibsondunn.com) Shaalu Mehra – Palo Alto (+1 650-849-5282, smehra@gibsondunn.com) Karl G. Nelson – Dallas (+1 214-698-3203, knelson@gibsondunn.com) Eric D. Vandevelde – Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com) Benjamin B. 
Wagner – Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com) Michael Li-Ming Wong – San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, mwong@gibsondunn.com) Europe Ahmed Baladi – Co-Chair, Paris (+33 (0)1 56 43 13 00, abaladi@gibsondunn.com) James A. Cox – London (+44 (0)207071 4250, jacox@gibsondunn.com) Patrick Doris – London (+44 (0)20 7071 4276, pdoris@gibsondunn.com) Penny Madden – London (+44 (0)20 7071 4226, pmadden@gibsondunn.com) Michael Walther – Munich (+49 89 189 33-180, mwalther@gibsondunn.com) Vera Lukic – Paris (+33 (0)1 56 43 13 00, vlukic@gibsondunn.com) Kai Gesing – Munich (+49 89 189 33-180, kgesing@gibsondunn.com) Sarah Wazen – London (+44 (0)20 7071 4203, swazen@gibsondunn.com) Alejandro Guerrero – Brussels (+32 2 554 7218, aguerrero@gibsondunn.com) Asia Kelly Austin – Hong Kong (+852 2214 3788, kaustin@gibsondunn.com) Jai S. Pathak – Singapore (+65 6507 3683, jpathak@gibsondunn.com) © 2019 Gibson, Dunn & Crutcher LLP Attorney Advertising:  The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.

January 24, 2019 |
The French Data Protection Authority Imposes a €50 Million Fine on Google LLC

Click for PDF On January 21, 2019, the French Data Protection Authority (the “CNIL”) issued a public ruling against Google LLC. In its ruling, the CNIL imposed a penalty of 50 million euros on Google LLC for breach of EU transparency and information obligations and lack of valid consent for targeted advertising purposes. With this decision, the CNIL becomes the first European data protection authority to levy a significant sanction against a major global internet company under the General Data Protection Regulation (EU) 2016/679 (“GDPR”). This client alert lays out the main points of the decision. I.     Context of the decision In May 2018, two organizations (None Of Your Business and La Quadrature du Net) initiated actions before the CNIL pursuant to Article 80 of the GDPR, which gives data subjects the right to mandate a not-for-profit body, organization or association to exercise rights and bring claims on their behalf. Through these actions, the two organizations asserted claims on behalf of around 10,000 individuals. II.     Complaint investigation procedure carried out by the CNIL Following the collective complaints, the CNIL started an investigation. On June 1, 2018, pursuant to the cooperation provisions of the GDPR, the CNIL submitted the complaints to its European counterparts to assess whether it was competent to handle them. As a reminder, the GDPR provides for a one-stop-shop mechanism under which an organization established in the EU has, as its sole interlocutor, the supervisory authority of its “main establishment” (also called the “lead supervisory authority”). This lead authority must then coordinate with the other national data protection authorities before taking a decision. 
In this case, the CNIL concluded that Google Ireland Limited was not the main establishment of Google LLC in the European Union, as it did not have management powers over the processing operations at issue. Therefore, in the absence of a main establishment permitting the identification of a lead supervisory authority, the CNIL concluded that it had jurisdiction over Google LLC. In order to investigate the complaints, the CNIL conducted an online inspection in September 2018. The objective was to verify the compliance of the personal data processing carried out by Google LLC with the French Data Protection Act and the GDPR, by analyzing a user’s journey and the documents to which they have access when creating a Google account and configuring a mobile device running Android. III.     The breaches identified by the CNIL The CNIL sanctioned Google LLC for (1) a lack of transparency and unsatisfactory information, and (2) a lack of valid consent for the processing of personal data for advertising personalization. Breach of transparency and information obligations The CNIL found that the information Google provided to its users did not meet the requirements of accessibility, clarity and intelligibility provided for in Article 12 of the GDPR. i. The information provided by Google LLC is not easily accessible to users because essential information (including, among other things, the purposes for which the data are processed, the length of time the data are kept, and the categories of data collected for targeted advertising purposes) is excessively spread across several documents, which contain buttons and links that must be activated to access additional information. Relevant information is accessible only after several steps, sometimes involving up to five or six actions. ii. The information provided by Google LLC is not sufficiently clear and comprehensible to users because the purposes of the processing are described in an overly generic and vague manner. 
Similarly, the information provided is not clear enough for the user to understand that Google LLC is relying on the consent of the data subjects to process their personal data for targeted advertising purposes. Moreover, the CNIL found that mandatory information required by Article 13 of the GDPR is missing (notably the data retention period). Failure to comply with the obligation to rely on a legal basis for the processing of data for advertising personalization The CNIL noted that Google LLC relies on the consent of the data subjects to process personal data for advertising personalization purposes. Yet the CNIL found that this consent is not validly obtained, as it is (i) not sufficiently informed, and (ii) not specific and unambiguous. i. The consent of users is not sufficiently informed because the information on the processing is diluted across several documents and does not allow the user to appreciate its extent. For example, in the section dedicated to the “Personalization of ads,” the user is not informed of the multitude of services, sites and applications involved in this processing (Google Search, YouTube, Google Home, Google Maps, the Play Store, Google Photos…) and therefore of the volume of data processed and combined. ii. The consent of users is not specific because, by ticking the boxes “I accept Google’s terms of use” and “I agree that my information may be used as described above and as specified in the privacy policy,” and then selecting “Create an account,” users accept “as a whole package” all of the processing of their personal data carried out by Google LLC, including that relating to personalized advertising. iii. The consent of users is not unambiguous because the option to display personalized ads is pre-checked by default and does not involve any positive act by the user, as required by the GDPR (for example, ticking a box that is not pre-checked). IV.     
The financial sanction pronounced by the CNIL According to Article 83, paragraph 5 of the GDPR, the infringement of certain provisions of the GDPR shall be subject to administrative fines of up to 20 million euros or 4% of the company’s total worldwide annual turnover for the preceding financial year, whichever is higher. Google LLC generated revenues of $109.7 billion (approximately €96 billion) in 2017. Although the CNIL could therefore have imposed a fine of up to 4% of Google LLC’s total worldwide annual turnover (approximately €3.84 billion), it decided that a financial sanction of €50 million, corresponding to roughly 0.05% of Google LLC’s worldwide annual turnover, was justified in this case. The CNIL notably took the following criteria into consideration in assessing the amount of the fine: – the breaches by Google LLC concern essential principles of the GDPR; – the infringements are continuous; – a large number of data subjects are concerned; and – Google’s business model is partly based on targeted advertising, so Google LLC should pay particular attention to its responsibilities under the GDPR when implementing such advertising. This was the first occasion on which the CNIL applied the administrative fines introduced by the GDPR. Before the CNIL’s decision, the largest penalty imposed by another data protection authority under the GDPR was €400,000, in Portugal. With the Google fine, the CNIL sends a clear warning that its sanctions represent not a simple increase but a change of scale. The CNIL also sent a strong signal to companies subject to the GDPR that their privacy policies and consent flows will be closely scrutinized and non-compliance thoroughly enforced. Gibson Dunn’s lawyers are available to assist with any questions you may have regarding these issues.  
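The Article 83(5) arithmetic described above is straightforward to verify. The short sketch below is illustrative only: the function name is our own, and the figures are the rounded approximations quoted in this alert, not audited financials.

```python
# Illustrative sketch of the GDPR Article 83(5) fine-ceiling arithmetic
# discussed above. Figures are the approximations quoted in the alert.

def article_83_5_ceiling(annual_turnover_eur: float) -> float:
    """The ceiling is the higher of EUR 20 million or 4% of worldwide annual turnover."""
    return max(20e6, 0.04 * annual_turnover_eur)

google_turnover_2017 = 96e9   # ~EUR 96 billion (converted from $109.7 billion)
ceiling = article_83_5_ceiling(google_turnover_2017)
fine = 50e6                   # the CNIL's actual sanction

print(f"Maximum possible fine: EUR {ceiling / 1e9:.2f} billion")          # EUR 3.84 billion
print(f"Fine as share of turnover: {fine / google_turnover_2017:.2%}")    # 0.05%
```

The same two-line calculation shows why the €50 million sanction, although a record at the time, sits at roughly 0.05% of turnover against a theoretical ceiling of about €3.84 billion.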
For further information, please contact the Gibson Dunn lawyer with whom you usually work, any member of the firm’s Privacy, Cybersecurity and Consumer Protection practice group, or the authors: Ahmed Baladi – Co-Chair, Paris (+33 (0)1 56 43 13 00, abaladi@gibsondunn.com) Alexander H. Southwell – Co-Chair, New York (+1 212-351-3981, asouthwell@gibsondunn.com) Vera Lukic – Paris (+33 (0)1 56 43 13 00, vlukic@gibsondunn.com) Please also feel free to contact any of the following practice group leaders and members: Europe Ahmed Baladi – Co-Chair, Paris (+33 (0)1 56 43 13 00, abaladi@gibsondunn.com) James A. Cox – London (+44 (0)207071 4250, jacox@gibsondunn.com) Patrick Doris – London (+44 (0)20 7071 4276, pdoris@gibsondunn.com) Penny Madden – London (+44 (0)20 7071 4226, pmadden@gibsondunn.com) Michael Walther – Munich (+49 89 189 33-180, mwalther@gibsondunn.com) Vera Lukic – Paris (+33 (0)1 56 43 13 00, vlukic@gibsondunn.com) Kai Gesing – Munich (+49 89 189 33-180, kgesing@gibsondunn.com) Sarah Wazen – London (+44 (0)20 7071 4203, swazen@gibsondunn.com) Alejandro Guerrero – Brussels (+32 2 554 7218, aguerrero@gibsondunn.com) Asia Kelly Austin – Hong Kong (+852 2214 3788, kaustin@gibsondunn.com) Jai S. Pathak – Singapore (+65 6507 3683, jpathak@gibsondunn.com) United States Alexander H. Southwell – Co-Chair, New York (+1 212-351-3981, asouthwell@gibsondunn.com) M. Sean Royall – Dallas (+1 214-698-3256, sroyall@gibsondunn.com) Debra Wong Yang – Los Angeles (+1 213-229-7472, dwongyang@gibsondunn.com) Ryan T. Bergsieker – Denver (+1 303-298-5774, rbergsieker@gibsondunn.com) Christopher Chorba – Los Angeles (+1 213-229-7396, cchorba@gibsondunn.com) Richard H. Cunningham – Denver (+1 303-298-5752, rhcunningham@gibsondunn.com) Howard S. Hogan – Washington, D.C. (+1 202-887-3640, hhogan@gibsondunn.com) Joshua A. Jessen – Orange County/Palo Alto (+1 949-451-4114/+1 650-849-5375, jjessen@gibsondunn.com) Kristin A. 
Linsley – San Francisco (+1 415-393-8395, klinsley@gibsondunn.com) H. Mark Lyon – Palo Alto (+1 650-849-5307, mlyon@gibsondunn.com) Shaalu Mehra – Palo Alto (+1 650-849-5282, smehra@gibsondunn.com) Karl G. Nelson – Dallas (+1 214-698-3203, knelson@gibsondunn.com) Eric D. Vandevelde – Los Angeles (+1 213-229-7186, evandevelde@gibsondunn.com) Benjamin B. Wagner – Palo Alto (+1 650-849-5395, bwagner@gibsondunn.com) Michael Li-Ming Wong – San Francisco/Palo Alto (+1 415-393-8333/+1 650-849-5393, mwong@gibsondunn.com) © 2019 Gibson, Dunn & Crutcher LLP Attorney Advertising:  The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.

January 23, 2019 |
Law360 Names Gibson Dunn Among Its Cybersecurity & Privacy 2018 Practice Groups of the Year

Law360 named Gibson Dunn one of its six Cybersecurity & Privacy Practice Groups of the Year [PDF] for 2018. The practice group was described as “the trusted choice for leading technology companies” and as a group that “continues to do some of the most cutting-edge work in the space.” The firm’s Cybersecurity & Privacy practice was profiled on January 23, 2019. Gibson Dunn’s Privacy, Cybersecurity and Consumer Protection Practice Group represents clients across a wide range of industries in matters involving complex and rapidly evolving laws, regulations, and industry best practices relating to privacy, cybersecurity, and consumer protection. Our team includes the largest number of former federal cyber-crimes prosecutors of any law firm.

January 23, 2019 |
Kristin Linsley Named Top Cyber Lawyer 2019 by Daily Journal

The Daily Journal named San Francisco partner Kristin Linsley to its 2019 list of the Top Cyber Lawyers in California [PDF].  Linsley has extensive experience in complex business and appellate litigation across a spectrum of subject areas, including technology and privacy, international and transnational law, and complex financial litigation.  Her profile was published on January 23, 2019.

January 13, 2019 |
Gibson Dunn Named a 2018 Law Firm of the Year

Gibson, Dunn & Crutcher LLP is pleased to announce its selection by Law360 as a Law Firm of the Year for 2018; its profile, “The Firms That Dominated in 2018,” [PDF] features the four firms that received the most Practice Group of the Year awards. Of the four, Gibson Dunn “led the pack with 11 winning practice areas” for “successfully securing wins in bet-the-company matters and closing high-profile, big-ticket deals for clients throughout 2018.” The awards were published on January 13, 2019. Law360 previously noted that Gibson Dunn “dominated the competition this year” for its Practice Groups of the Year, which were selected “with an eye toward landmark matters and general excellence.” Gibson Dunn is proud to have been honored in the following categories: Appellate [PDF]: Gibson Dunn’s Appellate and Constitutional Law Practice Group is one of the leading U.S. appellate practices, with broad experience in complex litigation at all levels of the state and federal court systems and an exceptionally strong and high-profile presence and record of success before the U.S. Supreme Court. Class Action [PDF]: Our Class Actions Practice Group has an unrivaled record of success in the defense of high-stakes class action lawsuits across the United States. We have successfully litigated many of the most significant class actions in recent years, amassing an impressive win record in trial and appellate courts, including the U.S. Supreme Court, in matters that have changed the class action landscape nationwide. Competition [PDF]: Gibson Dunn’s Antitrust and Competition Practice Group serves clients in a broad array of industries globally in every significant area of antitrust and competition law, including private antitrust litigation between large companies and class action treble damages litigation; government review of mergers and acquisitions; and cartel investigations, internationally across borders and jurisdictions. 
Cybersecurity & Privacy [PDF]: Our Privacy, Cybersecurity and Consumer Protection Practice Group represents clients across a wide range of industries in matters involving complex and rapidly evolving laws, regulations, and industry best practices relating to privacy, cybersecurity, and consumer protection. Our team includes the largest number of former federal cyber-crimes prosecutors of any law firm. Employment [PDF]: No firm has a more prominent position at the leading edge of labor and employment law than Gibson Dunn. With a Labor and Employment Practice Group that covers a complete range of matters, we are known for our unsurpassed ability to help the world’s preeminent companies tackle their most challenging labor and employment matters. Energy [PDF]: Across the firm’s Energy and Infrastructure, Oil and Gas, and Energy, Regulation and Litigation Practice Groups, our global energy practitioners counsel on a complex range of issues and proceedings in the transactional, regulatory, enforcement, investigatory and litigation arenas, serving clients in all energy industry segments. Environmental [PDF]: Gibson Dunn has represented clients in the environmental and mass tort area for more than 30 years, providing sophisticated counsel on the complete range of litigation matters as well as in connection with transactional concerns such as ongoing regulatory compliance, legislative activities and environmental due diligence. Real Estate [PDF]: The breadth of sophisticated matters handled by our real estate lawyers worldwide includes acquisitions and sales; joint ventures; financing; land use and development; and construction. Gibson Dunn additionally has one of the leading hotel and hospitality practices globally. 
Securities [PDF]: Our securities practice offers comprehensive client services including in the defense and handling of securities class action litigation, derivative litigation, M&A litigation, internal investigations, and investigations and enforcement actions by the SEC, DOJ and state attorneys general. Sports [PDF]: Gibson Dunn’s global Sports Law Practice represents a wide range of clients in matters relating to professional and amateur sports, including individual teams, sports facilities, athletic associations, athletes, financial institutions, television networks, sponsors and municipalities. Transportation [PDF]: Gibson Dunn’s experience with transportation-related entities is extensive and includes the automotive sector as well as all aspects of the airline and rail industries, freight, shipping, and maritime. We advise in a broad range of areas that include regulatory and compliance, customs and trade regulation, antitrust, litigation, corporate transactions, tax, real estate, environmental and insurance.

January 15, 2019 |
Ninth Circuit Judges Call for En Banc Review of the Federal Trade Commission’s Authority to Obtain Monetary Relief

Click for PDF With increasing regularity, the Federal Trade Commission (“FTC”) is seeking and obtaining large monetary remedies as “equitable monetary relief” pursuant to Section 13(b) of the FTC Act.  Indeed, FTC settlements and judgments exceeding $100 million, and even $1 billion, are becoming commonplace. The Supreme Court, however, has never held that Section 13(b) of the FTC Act empowers the FTC to obtain monetary relief.  Although multiple federal circuit courts have held that Section 13(b) provides the agency with this power, several weeks ago two Ninth Circuit judges issued a concurrence in FTC v. AMG Capital Management, LLC et al. calling for the full Ninth Circuit to reconsider this issue en banc in light of the Supreme Court’s 2017 decision in Kokesh v. SEC. Gibson Dunn partners Sean Royall, Blaine Evanson, and Rich Cunningham, and associate Brandon J. Stoker recently published an article discussing the AMG Capital Management concurrence in the Washington Legal Foundation’s The Legal Pulse blog.  The article describes the concurrence and how it fits into the broader legal landscape around this issue, which is clearly poised for further attention from the federal appellate courts, including the Supreme Court. Ninth Circuit Judges Call for En Banc Review of the Federal Trade Commission’s Authority to Obtain Monetary Relief (click on link) © 2019, Washington Legal Foundation, The Legal Pulse, January 15, 2019. Reprinted with permission. Gibson Dunn’s lawyers are available to assist in addressing any questions you may have regarding these developments. Please contact the authors of this Client Alert, the Gibson Dunn lawyer with whom you usually work, or one of the leaders and members of the firm’s Antitrust and Competition or Privacy, Cybersecurity and Consumer Protection practice groups: Washington, D.C. Scott D. Hammond (+1 202-887-3684, shammond@gibsondunn.com) D. 
Jarrett Arp (+1 202-955-8678, jarp@gibsondunn.com) Adam Di Vincenzo (+1 202-887-3704, adivincenzo@gibsondunn.com) Howard S. Hogan (+1 202-887-3640, hhogan@gibsondunn.com) Joseph Kattan P.C. (+1 202-955-8239, jkattan@gibsondunn.com) Joshua Lipton (+1 202-955-8226, jlipton@gibsondunn.com) Cynthia Richman (+1 202-955-8234, crichman@gibsondunn.com) Jeremy Robison (+1 202-955-8518, wrobison@gibsondunn.com) New York Alexander H. Southwell (+1 212-351-3981, asouthwell@gibsondunn.com) Eric J. Stock (+1 212-351-2301, estock@gibsondunn.com) Los Angeles Daniel G. Swanson (+1 213-229-7430, dswanson@gibsondunn.com) Debra Wong Yang (+1 213-229-7472, dwongyang@gibsondunn.com) Samuel G. Liversidge (+1 213-229-7420, sliversidge@gibsondunn.com) Jay P. Srinivasan (+1 213-229-7296, jsrinivasan@gibsondunn.com) Rod J. Stone (+1 213-229-7256, rstone@gibsondunn.com) Eric D. Vandevelde (+1 213-229-7186, evandevelde@gibsondunn.com) Orange County Blaine H. Evanson (+1 949-451-3805, bevanson@gibsondunn.com) San Francisco Rachel S. Brass (+1 415-393-8293, rbrass@gibsondunn.com) Dallas M. Sean Royall (+1 214-698-3256, sroyall@gibsondunn.com) Olivia Adendorff (+1 214-698-3159, oadendorff@gibsondunn.com) Veronica S. Lewis (+1 214-698-3320, vlewis@gibsondunn.com) Mike Raiff (+1 214-698-3350, mraiff@gibsondunn.com) Brian Robison (+1 214-698-3370, brobison@gibsondunn.com) Robert C. Walters (+1 214-698-3114, rwalters@gibsondunn.com) Denver Richard H. Cunningham (+1 303-298-5752, rhcunningham@gibsondunn.com) Ryan T. Bergsieker (+1 303-298-5774, rbergsieker@gibsondunn.com) © 2019 Gibson, Dunn & Crutcher LLP Attorney Advertising:  The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.

January 14, 2019 |
U.S. Department of Health and Human Services Issues New Guidance on Voluntary Cybersecurity Practices for Health Care Industry

Click for PDF On December 28, 2018, a Task Group that includes U.S. Department of Health and Human Services (“HHS”) personnel and private-sector health care industry leaders published new guidance for health care organizations on cybersecurity best practices.[1]  The guidance—Health Industry Cybersecurity Practices: Managing Threats and Protecting Patients—is voluntary and creates no legal obligations.  It is targeted to health care providers, payors, pharmaceutical companies, and medical device manufacturers. This publication is among the most comprehensive and detailed guidance now available to the health care industry on cybersecurity.  While voluntary, the prescriptive advice and scalable tools in the new guidance may be a valuable resource for legal, compliance, IT, and information security professionals at health care organizations.  Organizations that follow this guidance may decrease the likelihood that they will suffer a costly data breach, and in the event of a breach may be able to point to compliance with the guidance to show that they have implemented reasonable cybersecurity practices, thereby helping to defend against private lawsuits or government enforcement actions. This alert briefly describes the background and key takeaways from the guidance.  Gibson Dunn is available to answer any questions you may have about how this guidance applies to your organization, as well as any other topics related to cybersecurity or privacy in the health care industry. Background The health care industry is a primary target for attacks by cyber-criminals.  
The threat is especially critical because by at least one measure the average cost of a data breach in the health sector is $408 per record, almost double that of the next highest industry.[2]  In recent years, moreover, HHS’s Office for Civil Rights (“OCR”)—the office charged with enforcing the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”)—has demonstrated an increasing willingness to bring enforcement actions against even reputable and respected organizations that have suffered a data breach.[3] The new guidance comes against this backdrop and as the result of the Cybersecurity Act of 2015, which required HHS to issue guidance through a “trusted platform and tighter partnership between the United States government and the private sector.”[4]  Under Section 405(d) of the Act, industry and government leaders formed a Task Group in May 2017 to create a set of “voluntary, consensus-based principles and practices to ensure cybersecurity in the Health Care and Public Health (HPH) sector.”[5]  This guidance is the result of 18 months of work by the Task Group. The Guidance Recognizing that it would be impossible to address every cybersecurity challenge in a single publication, the Task Group focused on five prevalent cybersecurity threats: 1) e-mail phishing attacks, 2) ransomware attacks, 3) loss or theft of equipment or data, 4) insider, accidental or intentional data loss, and 5) attacks against connected medical devices that may affect patient safety.[6]  For each of the five high risk cybersecurity threats, the guidance describes the risk, lists specific vulnerabilities and the potential effects of these vulnerabilities, and offers a list of “practices to consider” to help minimize the threat. 
The Task Group then identified a set of voluntary best practices and organized them into ten categories: (1) E-mail Protection Systems; (2) Endpoint Protection Systems; (3) Access Management; (4) Data Protection and Loss Prevention; (5) Asset Management; (6) Network Management; (7) Vulnerability Management; (8) Incident Response; (9) Medical Device Security; and (10) Cybersecurity Policies.  Information regarding each of these practice categories is detailed in two supplementary technical volumes—one addressing the needs of small organizations and the other addressing the requirements of medium and large organizations—as well as a supplemental volume of additional resources and templates.  The guidance also provides a toolkit for determining and prioritizing the cybersecurity practices that would be most effective, which can be used to assist organizations in conducting a cybersecurity risk assessment.[7] The specific practices identified in the guidance are not intended to replace existing regulatory requirements or frameworks (such as the HIPAA Security Rule or the NIST Cybersecurity Framework).  Instead, they are intended to be a supplemental resource for health care organizations, with the goal of “rais[ing] the cybersecurity floor across the health care industry.”[8]  Specific application and resource allocation will be up to each organization, and the guidance recognizes that each organization will need to tailor cybersecurity practices to its specific size, complexity, and type.  The guidance provides a chart to assist in determining these categorizations.[9]  Importantly, the guidance does not authorize any causes of action or grounds for regulatory enforcement. Conclusion Because of the long shadow of HIPAA, the health care industry has long been among the most heavily regulated industries when it comes to cybersecurity practices.  This new guidance offers an additional tool that health care organizations can use to gauge the adequacy of their systems and their preparedness for a cyber attack.  
Given that HHS OCR is simultaneously seeking comments on how it might update HIPAA’s requirements,[10] and given the explosion of enforcement activity and lawsuits related to cybersecurity and privacy more generally, health care organizations would be well served to evaluate this guidance and refine or enhance their plans to address cybersecurity issues that regulators and plaintiffs are likely to examine increasingly in the years to come.    [1]   Healthcare & Public Health Sector Coordinating Councils, Health Industry Cybersecurity Practices: Managing Threats and Protecting Patients (Dec. 28, 2018), https://www.phe.gov/Preparedness/planning/405d/Documents/HICP-Main-508.pdf.    [2]   Id. at 9.    [3]   See, e.g., Press Release, Department of Health and Human Services, Anthem Pays OCR $16 Million in Record HIPAA Settlement Following Largest U.S. Health Data Breach in History (Oct. 15, 2018), https://www.hhs.gov/about/news/2018/10/15/anthem-pays-ocr-16-million-record-hipaa-settlement-following-largest-health-data-breach-history.html; Press Release, Department of Health and Human Services, Five breaches add up to millions in settlement costs for entity that failed to heed HIPAA’s risk analysis and risk management rules (Feb. 1, 2018), available at https://www.hhs.gov/about/news/2018/02/01/five-breaches-add-millions-settlement-costs-entity-failed-heed-hipaa-s-risk-analysis-and-risk.html.    [4]   Health Industry Cybersecurity Practices: Managing Threats and Protecting Patients, at 4.    [5]   Id.    [6]   Id. at 6.    [7]   Id. at 26.    [8]   Id.    [9]   Id. at 11.    [10]   See Request for Information on Modifying HIPAA Rules to Improve Coordinated Care, 83 Fed. Reg. 64,302 (Dec. 14, 2018). Gibson Dunn’s lawyers are available to assist in addressing any questions you may have regarding the above developments.  Please contact the Gibson Dunn lawyer with whom you usually work, or the following authors: Ryan T. 
Bergsieker – Denver (+1 303-298-5774, rbergsieker@gibsondunn.com) Reid Rector – Denver (+1 303-298-5923, rrector@gibsondunn.com) Josiah J. Clarke – Denver (+1 303-298-5708, jclarke@gibsondunn.com) Please also feel free to contact the following practice group leaders: Alexander H. Southwell – Chair, Privacy, Cybersecurity and Consumer Protection Practice, New York (+1 212-351-3981, asouthwell@gibsondunn.com) Daniel J. Thomasch – Co-Chair, Life Sciences Practice, New York (+1 212-351-3800, dthomasch@gibsondunn.com) Tracey B. Davies – Co-Chair, Life Sciences Practice, Dallas (+1 214-698-3335, tdavies@gibsondunn.com) Ryan A. Murr – Co-Chair, Life Sciences Practice, San Francisco (+1 415-393-8373, rmurr@gibsondunn.com) Stephen C. Payne – Chair, FDA and Health Care Practice, Washington, D.C. (+1 202-887-3693, spayne@gibsondunn.com) © 2019 Gibson, Dunn & Crutcher LLP Attorney Advertising:  The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.

January 11, 2019 |
How Calif. Privacy Act Could Prompt Private Plaintiff Suits

Orange County partner Joshua Jessen is the author of “How Calif. Privacy Act Could Prompt Private Plaintiff Suits,” [PDF] published by Law360 on January 11, 2019.

November 28, 2018 |
Law360 Names Eight Gibson Dunn Partners as MVPs

Law360 named eight Gibson Dunn partners among its 2018 MVPs and noted that the firm had the most MVPs of any law firm this year. Law360 MVPs feature lawyers who have “distinguished themselves from their peers by securing hard-earned successes in high-stakes litigation, complex global matters and record-breaking deals.” Gibson Dunn’s MVPs are:

Christopher Chorba, a Class Action MVP [PDF] – Co-Chair of the firm’s Class Actions Group and a partner in our Los Angeles office, he defends class actions and handles a broad range of complex commercial litigation with an emphasis on claims involving California’s Unfair Competition and False Advertising Laws, the Consumers Legal Remedies Act, the Lanham Act, and the Class Action Fairness Act of 2005. His litigation and counseling experience includes work for companies in the automotive, consumer products, entertainment, financial services, food and beverage, social media, technology, telecommunications, insurance, health care, retail, and utility industries.

Michael P. Darden, an Energy MVP [PDF] – Partner in charge of the Houston office, Mike focuses his practice on international and U.S. oil & gas ventures and infrastructure projects (including LNG, deep-water and unconventional resource development projects), asset acquisitions and divestitures, and energy-based financings (including project financings, reserve-based loans and production payments).

Thomas H. Dupree Jr., a Transportation MVP [PDF] – Co-partner in charge of the Washington, D.C. office, Tom has represented clients in a wide variety of trial and appellate matters, including cases involving punitive damages, class actions, product liability, arbitration, intellectual property, employment, and constitutional challenges to federal and state statutes. He has argued more than 80 appeals in the federal courts, including in all 13 circuits as well as the United States Supreme Court.

Joanne Franzel, a Real Estate MVP [PDF] – Joanne is a partner in the New York office, and her practice has included all forms of real estate transactions, including acquisitions, dispositions, and financing, as well as office and retail leasing with anchor and shopping center tenants. She also has represented a number of clients in New York City real estate development, representing developers as well as users in various mixed-use projects, often with a significant public/private component.

Matthew McGill, a Sports MVP [PDF] – A partner in the Washington, D.C. office, Matt practices appellate and constitutional law. He has participated in 21 cases before the Supreme Court of the United States, prevailing in 16. Spanning a wide range of substantive areas, those representations have included several high-profile triumphs over foreign and domestic sovereigns. Outside the Supreme Court, his practice focuses on cases involving novel and complex questions of federal law, often in high-profile litigation against governmental entities.

Mark A. Perry, a Securities MVP [PDF] – Mark is a partner in the Washington, D.C. office and Co-Chair of the firm’s Appellate and Constitutional Law Group. His practice focuses on complex commercial litigation at both the trial and appellate levels. He is an accomplished appellate lawyer who has briefed and argued many cases in the Supreme Court of the United States. He has served as chief appellate counsel to Fortune 100 companies in significant securities, intellectual property, and employment cases. He also appears frequently in federal district courts, serving both as lead counsel and as legal strategist in complex commercial cases.

Eugene Scalia, an Appellate MVP [PDF] – A partner in the Washington, D.C. office and Co-Chair of the Administrative Law and Regulatory Practice Group, Gene has a national practice handling a broad range of labor, employment, appellate, and regulatory matters. His success bringing legal challenges to federal agency actions has been widely reported in the legal and business press.

Michael Li-Ming Wong, a Cybersecurity and Privacy MVP – Michael is a partner in the San Francisco and Palo Alto offices. He focuses on white-collar criminal matters, complex civil litigation, data-privacy investigations and litigation, and internal investigations. Michael has tried more than 20 civil and criminal jury trials in federal and state courts, including five multi-week jury trials over the past five years.