Cyber-security and Data Privacy Outlook and Review: 2013

April 16, 2013

As we saw in 2012 and now in 2013, the attention paid to cyber-security has reached new heights:  hacking, data privacy, and cyber-espionage have remained prominent in daily headlines, and cyber-security took on unprecedented importance when President Obama focused on it in his State of the Union address. 

Announcing a new executive order to increase sharing of critical cyber information and calling for legislative action to protect our networks and data, President Obama explained:

America must also face the rapidly growing threat from cyber-attacks.  We know hackers steal people’s identities and infiltrate private e-mail.  We know foreign countries and companies swipe our corporate secrets.  Now our enemies are also seeking the ability to sabotage our power grid, our financial institutions, and our air traffic control systems.  We cannot look back years from now and wonder why we did nothing in the face of real threats to our security and our economy.  That’s why, earlier today, I signed a new executive order that will strengthen our cyber defenses by increasing information sharing, and developing standards to protect our national security, our jobs, and our privacy.  Now, Congress must act as well, by passing legislation to give our government a greater capacity to secure our networks and deter attacks.

This Presidential attention is but one notable event in a year that has seen significant legal developments in the data privacy and security areas as the world continues on its path of unprecedented technological transition.

Gibson Dunn’s Information Technology and Data Privacy group has detailed the key data privacy and security events of the past year and anticipated trends for the coming year.  This Outlook and Review covers six core areas:  (1) class actions and civil litigation related to data privacy and security; (2) FTC and regulatory activity; (3) criminal enforcement; (4) federal legislative activity; (5) data security; and (6) select international developments in the European Union and Asia Pacific Region.

TABLE OF CONTENTS

I.          Class Action and Civil Litigation Developments

A.        Article III Standing
B.        Assertion of Novel Substantive Claims in Data Privacy Class Actions
C.        Enforceability of Forum Selection Clauses and Arbitration Agreements
D.        Move Away From Third Parties and Toward First-Party Developers

II.        FTC and Regulatory Activity

A.        This Means War:  Fighting the FTC
B.        History Sniffing
C.        Dumpster Diving
D.        Voyeuristic Rental Computer Software
E.         Children’s Privacy

III.       Criminal Enforcement

A.        Prosecution and Suicide of Aaron Swartz
B.        United States v. Scheinberg, No. 10-CR-336 (S.D.N.Y. Mar. 10, 2011)
C.        In Re:  Grand Jury Subpoena Duces Tecum Dated March 25, 2011, United States v. John Doe, Nos. 11-12268 & 11-15421 (11th Cir. Feb. 23, 2012)
D.        United States v. Dotcom, No. 12CR3 (E.D. Va. Jan. 5, 2012)
E.         United States v. Nosal, 642 F.3d 781 (9th Cir. 2011), rev’d No. 10-10038 (9th Cir. Apr. 10, 2012) (en banc)
F.         WEC Carolina Energy Solutions LLC v. Miller, 687 F.3d 199 (4th Cir. 2012)
G.        Economic Espionage Act

IV.       Legislative Developments

A.        HIPAA–Expanding Coverage of Business Associates and Subcontractors
B.        Amendment to the Video Privacy Protection Act Allowing Easier Customer Consent to Share Video Viewing Information
C.        Developments Regarding the California Online Privacy Protection Act and Mobile Apps
D.        FTC Adopts Long-Awaited Modifications to the Children’s Online Privacy Protection Rule
E.         Executive Order Improving Critical Infrastructure Cybersecurity
F.         Privacy Legislative Outlook for 2013

V.        Data Security

A.        State and Local Governments
B.        United States Sentencing Guidelines
C.        Foreign Theft of Trade Secrets

VI.       International Developments

A.        European Union

1.         Ambitious Requests behind Proposed New Privacy Regulation & E-Discovery
2.         Interpretation of EU Cookie Consent Requirement
3.         Cloud Computing & Binding Corporate Rules for Data Processors
4.         Enforcement of Data Protection Rules in the EU
5.         France
6.         Germany
7.         Spain
8.         The Netherlands

B.        Asia Region

1.         China
2.         Other Jurisdictions

I.   Class Action and Civil Litigation Developments

In 2012 and the first quarter of 2013, the plaintiffs’ bar continued to be extremely active in filing and prosecuting class actions asserting a variety of claims relating to the alleged unauthorized collection, use, or disclosure of consumer data, or arising out of widely publicized data breaches. 

                    A.   Article III Standing

Defendants continued to mount Article III standing challenges in most of these cases, but there was more variation in the outcomes in 2012 than in 2011 (when several putative Internet privacy class actions were dismissed for lack of Article III standing).  This was partially due to the plaintiffs’ bar adapting to a series of 2011 decisions that held that the alleged unauthorized collection of a plaintiff’s "personal information" does not constitute an injury in fact under Article III of the United States Constitution.  See, e.g., In re Specific Media Flash Cookies Litig., Case No. 10-CV-1256-GW, 2011 WL 1661532 (C.D. Cal. Apr. 28, 2011); In re iPhone Application Litig., No. 11-MD-02250-LHK, 2011 WL 4403963 (N.D. Cal. Sept. 20, 2011); Low v. LinkedIn, No. 11-CV-01468-LHK, 2011 WL 5509848 (N.D. Cal. Nov. 11, 2011).

Latching onto a line of authority that the injury required by Article III "may exist solely by virtue of ‘statutes creating legal rights, the invasion of which creates standing,’" (Edwards v. First American Corp., 610 F.3d 514, 517 (9th Cir. 2010)), plaintiffs in putative privacy class actions now routinely assert causes of action in their complaints that allege violations of federal statutes that do not have an injury requirement.  The most popular statutes utilized by plaintiffs for this purpose are the Wiretap Act (18 U.S.C. §§ 2510, et seq.) and the Stored Communications Act ("SCA") (18 U.S.C. §§ 2701, et seq.), statutes that typically are not even implicated by the alleged conduct.  For electronic platforms that offer video content, plaintiffs also increasingly have alleged violations of the Video Privacy Protection Act ("VPPA") (18 U.S.C. § 2710), a statute enacted by Congress in 1988 after Robert Bork’s video rental history was published during his Supreme Court nomination.  In 2012, privacy practitioners were anxiously awaiting the Supreme Court’s anticipated ruling in First American Corp. v. Edwards, 131 S. Ct. 3022 (2011), a decision many hoped would resolve the issue of whether an alleged statutory violation alone is sufficient to create Article III standing where the plaintiff fails to allege any actual harm.  On June 28, 2012, however, the Supreme Court held that certiorari had been improvidently granted and dismissed the case, leaving the Ninth Circuit’s decision intact.  Thus, whether the mere assertion that a statutory right was violated is sufficient to establish injury sufficient for Article III standing remains an open question.

District courts have reached different conclusions on the issue.  For example, in In re iPhone Application Litigation, the plaintiffs included Wiretap Act and SCA claims in their amended complaint after their initial complaint, which did not assert alleged violations of these statutes, was dismissed for lack of Article III standing in 2011.  Defendants moved to dismiss the amended complaint for lack of standing, but the court held that "a violation of the Wiretap Act or the Stored Communications Act may serve as a concrete injury for the purposes of Article III injury analysis," even though the court held that plaintiffs could not state a claim under either statute and dismissed both claims with prejudice.  844 F. Supp. 2d 1040, 1055 (N.D. Cal. 2012).  See also Gaos v. Google, Inc., 2012 WL 109446, at *3 (N.D. Cal. Mar. 29, 2012) (holding that "a violation of one’s statutory rights under the SCA is a concrete injury"); Cousineau v. Microsoft Corp., No. 11-CV-01438-JCC (W.D. Wash. June 22, 2012) (motion to dismiss for lack of Article III standing denied where plaintiff alleged SCA violation).  In contrast, at the end of 2012, the court in In re Google, Inc. Privacy Policy Litigation dismissed plaintiffs’ complaint for lack of Article III standing even though plaintiffs had asserted a claim under the Wiretap Act.  Case No. C 12-01382 PSG, 2012 U.S. Dist. LEXIS 183041 (N.D. Cal. Dec. 28, 2012).

In addition to asserting statutory claims to try to overcome the Article III hurdle, plaintiffs in 2012 also increasingly alleged (1) harm to their electronic devices in the form of unexpected "resource consumption," and (2) "overpayment" theories of harm, which allege that plaintiffs would not have purchased the good or service at issue or would have paid less for it had the "true facts" been disclosed to them.  For example, plaintiffs in In re iPhone Application Litigation included both types of allegations in their amended complaint, which allowed their complaint to partially survive a second motion to dismiss.  However, it is important for defendants not to lose sight of the standing issue (which goes to the court’s subject matter jurisdiction), even if the complaint survives a standing challenge based on the pleadings.  See Lujan v. Defenders of Wildlife, 504 U.S. 555, 561 (1992) (plaintiffs bear the burden of proving standing under Article III "with the manner and degree of evidence required at the successive stages of the litigation.  At the pleading stage, general factual allegations of injury resulting from the defendant’s conduct may suffice," but "[i]n response to a summary judgment motion, . . . the plaintiff can no longer rest on such ‘mere allegations,’ but must ‘set forth’ by affidavit or other evidence ‘specific facts’ to support standing.").

Moreover, even where plaintiffs assert "overpayment" theories of harm, when they cannot identify the alleged misrepresentations they relied upon in deciding to purchase a product or service, courts still will dismiss the complaint at the pleadings stage for lack of Article III standing.  See, e.g., Pirozzi v. Apple Inc., Case No. 12-CV-01529 YGR, 2012 U.S. Dist. LEXIS 180530, at *10 (N.D. Cal. Dec. 20, 2012) (dismissing complaint for lack of Article III standing where "Plaintiff fail[ed] to allege specifically which statements she found material to her decision to purchase a[] . . . Device or App"). 

Finally, although it was not expressly an Article III decision, the Western District of Washington issued an opinion in a putative class action involving an alleged data breach (Grigsby v. Valve Corp.) in which the court held that, due to the potential exorbitant costs of discovery, a "complex, large-scale case such as a class action should naturally have a higher plausibility threshold" under Twombly and Iqbal "than a simple case."  2012 U.S. Dist. LEXIS 179096, at *10 (W.D. Wash. Nov. 14, 2012).  The court held that "context matters," stating "this is a class action which may require voluminous and costly e-discovery.  Defending this action will require substantial expense and effort by [the defendant] even if discovery ultimately reveals there are no damages whatsoever … In light of this, Plaintiffs’ complaint must rise to a higher plausibility threshold than it would if it were a garden-variety tort claim or claim brought by [an individual plaintiff] alone."  Id. at *10-12.  In dismissing the complaint (with leave to amend), the court held that plaintiffs had not satisfied this threshold with their conclusory allegations of harm.  Id. at *12-13.  In contrast, the Southern District of California held in In re Sony Gaming Networks and Customer Data Security Breach Litigation that plaintiffs had "articulated sufficient particularized and concrete harm to sustain a finding of injury-in-fact at this stage in the pleadings" because they had alleged "that their sensitive Personal Information was wrongfully disseminated, thereby increasing the risk of future harm."  2012 U.S. Dist. LEXIS 146971, at *50 (S.D. Cal. Oct. 11, 2012).  The court stated that "even though Sony alleges no harm has yet occurred, in certain circumstances, as the Court finds pertinent here, future harm may be regarded as a cognizable loss sufficient to satisfy Article III’s injury-in-fact requirement."  Id.  
However, even though the Sony court found plaintiffs had adequately alleged injury for purposes of Article III standing, the court dismissed plaintiffs’ negligence claim for lack of harm.  See id. at *64 ("While Plaintiffs have currently alleged enough to assert Article III standing to sue based on an increased risk of future harm, the Court finds such allegations insufficient to sustain a negligence claim under California law.").

                    B.   Assertion of Novel Substantive Claims in Data Privacy Class Actions

Many of the claims asserted in putative privacy class actions in 2012 and early 2013 were familiar:  unfair competition, negligence, and other alleged violations of consumer protection laws (such as California’s Consumers Legal Remedies Act).  However, 2012 also saw the assertion of claims under statutes that have been less frequently litigated, such as the VPPA, which creates significant monetary exposure via a minimum $2,500 per-person liquidated damages provision for "video tape service providers" that knowingly disclose "personally identifiable information concerning any consumer," subject to certain exceptions.  Plaintiffs asserting violations of the VPPA typically argue that the website publisher has violated the statute by disclosing plaintiffs’ video viewing information in connection with a device identifier to third party analytics companies or advertising networks.

On August 10, 2012, in In re Hulu Privacy Litigation, a federal magistrate judge in the Northern District of California denied Hulu’s motion to dismiss a VPPA claim, ruling that the VPPA may extend to online video-streaming services.  Case No. C 11-03764 LB, 2012 U.S. Dist. LEXIS 112916 (N.D. Cal. Aug. 10, 2012).  Plaintiffs alleged that Hulu had "knowingly and without . . . [their] consent disclosed to third parties . . . [their] video viewing selections and personally identifiable information, knowing that such disclosure included the disclosure of [their] personally identifying information . . . and their requests for and/or obtaining of specific video materials and/or services from Hulu," in violation of the VPPA.  Id. at *11.  In moving to dismiss, Hulu argued, inter alia, that it was not a "video tape service provider" within the meaning of the Act, and the court acknowledged that "the online streaming mechanism of delivery here did not exist when Congress enacted the statute in 1988."  Id. at *13-18.  Nonetheless, the court rejected Hulu’s argument, holding that the legislative history of the VPPA "confirms that Congress was concerned with protecting the confidentiality of private information about viewing preferences regardless of the business model or media format involved."  Id. at *17.

Another statute that was heavily litigated in 2012 was California’s "Shine the Light" statute, Cal. Civ. Code § 1798.83, which allows California residents to request information from businesses about their third-party information sharing practices.  Under Shine the Light, a business must make certain disclosures to customers upon request if the business "has an established business relationship with a customer and has within the immediately preceding calendar year disclosed personal information [of that customer] … to third parties … for the third parties’ direct marketing purposes."   Cal. Civ. Code § 1798.83(a). The disclosures must contain the names and addresses of any third parties with whom the information was shared and must also identify the categories of information that the business shares with third parties, though they need not specify any specific individual’s personal information or whether that particular person’s information has been shared.[1]

To facilitate disclosure, businesses must designate an email address, postal address, or toll-free number to which customers should send Shine the Light requests.  Once a business designates one or more Shine the Light contact points, the business must disseminate the contact point(s) by training employees to provide the contact point information, adding a section to the company website describing customers’ Shine the Light rights and providing contact point information, or by making the contact point information readily available at every location within California where the business regularly has contact with customers.  Shine the Light includes its own statutory penalties provision, which sets the remedy for a willful, intentional or reckless violation at a maximum of $3,000, while the penalty for other violations is a maximum of $500.[2]  Cal. Civ. Code § 1798.84(c).  In addition, prevailing plaintiffs are entitled to recover damages, reasonable attorney’s fees, and costs.

Late 2011 and early 2012 saw the filing of nearly a dozen putative class actions in California courts based on Shine the Light.  The complaints did not fare well, however, and several of them were dismissed at the pleadings stage.  The Central District of California issued the most significant of these rulings on June 14, 2012 in Boorstein v. Men’s Journal LLC, Case No. CV 12-771, 2012 U.S. Dist. LEXIS 83101 (C.D. Cal.).  The plaintiff in that case, who subscribed to the magazine Men’s Journal and visited the magazine’s website, alleged that Men’s Journal failed to provide required Shine the Light disclosures and that its conduct violated California’s Unfair Competition Law ("UCL") (Cal. Bus. & Prof. Code § 17200 et seq.).  Critically, the plaintiff did not allege that he sought Shine the Light disclosures from the defendant and was denied them, but merely that the defendant failed to comply with the law by providing the required contact point information.  The court held that the plaintiff had failed to allege a cognizable injury under the statute and dismissed the complaint–ultimately with prejudice after plaintiff failed to "even attempt to cure the defects" in the initial complaint.  2012 WL 3791701 (C.D. Cal. Aug. 17, 2012).  Dismissals followed for the same reasons in at least three other federal suits against different defendants based on similar allegations.  See Miller v. Hearst Commc’ns, Inc., No. 12-cv-00733, 2012 WL 3205241 (C.D. Cal. Aug. 3, 2012); King v. Condé Nast Publications, No. 12-cv-00719, 2012 WL 3186578 (C.D. Cal. Aug. 3, 2012); Murray v. Time, Inc., No. 12-00431, 2012 WL 3634387 (N.D. Cal. Aug. 24, 2012).  Additionally, at least one California state court has sustained a demurrer on similar grounds, holding that "[a] statute designed to provide useful information to a customer, upon request, cannot be violated if there was no request made.  No violation means there was no injury as a result of the alleged statutory violation."  Reguiero v. XO Group, Inc., Case No. 30-2012-00535636, Order Ruling on Demurrer (Orange Cnty. Sup. Ct. July 3, 2012). 

The plaintiffs in Boorstein and other cases have appealed the dismissals to the Ninth Circuit Court of Appeals.  Additionally, several of the state court actions are effectively stayed pending the Ninth Circuit’s ruling.  Thus, while plaintiffs seeking to pluck Shine the Light from obscurity have suffered some recent setbacks, the future of such claims remains unclear, and businesses should continue to take steps to comply with Shine the Light by updating privacy policies as necessary and responding to Shine the Light requests for disclosure. 

                    C.   Enforceability of Forum Selection Clauses and Arbitration Agreements

Another issue that was litigated in several putative privacy class actions in 2012 (and late 2011) was the particular forum in which an action should be litigated.  Some defendants moved to enforce forum selection clauses, with mixed results.  For example, in Harris v. comScore, Inc., 825 F. Supp. 2d 924 (N.D. Ill. 2011), a putative class action alleging that comScore improperly obtained and used personal information from plaintiffs’ computers after plaintiffs downloaded and installed comScore’s software, comScore moved to enforce a forum selection clause contained in comScore’s User License Agreement that provided that exclusive jurisdiction would reside in Virginia.  The court, however, accepted plaintiffs’ allegations that "the terms of service were obscured during the installation process ‘in such a way that the average, non-expert consumer would not notice the hyperlink’ to them" and accordingly denied the motion.  Id. at 926-27.

In contrast, in Opperman v. Path, Inc., Case No. A-12-CA-219-SS, a case alleging that several mobile apps accessed address book information on iPhones, the Western District of Texas granted defendants’ motion to transfer the case to the Northern District of California based in part on the fact that "the Apple Terms and Conditions"–to which all plaintiffs had agreed–"include[d] the following forum selection clause:  ‘You expressly agree that exclusive jurisdiction for any claim or dispute with Apple or relating in any way to your use of the iTunes Service resides in the courts in the State of California.’"  The court found that "[t]he presence of such a clause," while not dispositive, "is a significant factor that figures centrally in the district court’s calculus under [28 U.S.C.] § 1404(a)."

Motions to compel arbitration also were filed by defendants in some of these cases.  For example, in In re Zappos.Com, Inc. Customer Data Security Breach Litigation, 3:12-CV-00325-RCJ-VPC, 2012 U.S. Dist. LEXIS 141803 (D. Nev. Sept. 27, 2012), a data breach case, Zappos moved to compel arbitration based on a provision in the terms of use on the website that stated that all disputes would be submitted to confidential arbitration in Las Vegas, Nevada.  Id. at *17-18.  The court denied the motion, holding that customers did not agree to the terms of use, which were inconspicuous and buried in the middle/bottom of every webpage.  Id. at *18-22.  Additionally, because Zappos reserved the right to change its terms of use at any time without notice, the court concluded that the arbitration agreement in the terms of use was illusory and unenforceable.  Id. at *23-26.  Finally, the court declined to apply equitable estoppel because plaintiffs’ breach of contract claims did not rely on the contract they were seeking to avoid, but rather on other statements found on the website.  Id. at *26-27. 

In another large privacy multi-district litigation, In re Carrier IQ, Inc. Consumer Privacy Litigation, Case No. 3:12-md-02330-EMC, which asserts federal and state claims against several mobile device manufacturers and Carrier IQ relating to the alleged tracking of smartphone user activity through Carrier IQ’s diagnostic tool, the defendants in late 2012 moved to compel arbitration based on the fact that the plaintiffs had entered into wireless service agreements with their wireless carriers (AT&T Mobility, Sprint, and Cricket) that required mandatory individual arbitration of any disputes related to the carriers’ services.  Defendants argued that because plaintiffs’ claims against them were "intertwined with their wireless service agreements . . . and are based on allegations of concerted and interdependent misconduct by [p]laintiffs’ wireless service providers, the doctrine of equitable estoppel requires [p]laintiffs to arbitrate their claims against [d]efendants."  Nov. 20, 2012 Motion.  See also In re Apple iPhone 3G Prods. Liab. Litig., 859 F. Supp. 2d 1084, 1095-96 (N.D. Cal. 2012).  Plaintiffs had initially included the wireless carriers as defendants in the case but dropped them in their consolidated amended complaint.  The motion is scheduled to be heard later in 2013.

                    D.   Move Away From Third Parties and Toward First-Party Developers

Finally, one other noteworthy change in 2012 was a move away from third-party companies (e.g., advertising networks and analytics firms) as defendants and toward first-party developers (with whom consumers directly interface) as defendants.  Unlike claims against third parties with whom consumers have no relationship, claims against first-party developers allow plaintiffs to more easily attempt to invoke consumer protection or false advertising laws, as well as to allege "overpayment" theories of harm where they have paid for a good or service.  By way of example, although several app developers originally were defendants in In re iPhone Application Litigation (e.g., NPR and The New York Times), plaintiffs voluntarily dismissed the developers from the case in 2011.  However, in the more recent cases alleging that various apps collected address book information from plaintiffs’ smartphones, plaintiffs named as defendants the app developers themselves (most notably Path).

II.   FTC and Regulatory Activity

While the FTC continued throughout 2012 to apply its Section 5 authority to investigate and combat "unfair or deceptive" conduct, half-way through the year the FTC faced an unexpected challenge when one of its targets decided to "fight back" and contest that authority.  Whether this challenge will be just a bump in the road or will result in limitations on the FTC’s authority in the data privacy arena remains to be seen.

                    A.   This Means War:  Fighting the FTC

In a case that the data privacy legal community is watching closely, the global hospitality company Wyndham Worldwide Corporation sought to challenge the FTC’s authority to regulate data security practices.  

In August 2012, the FTC brought an action against Wyndham and three of its subsidiaries alleging data security failures that led to three data breaches at Wyndham hotels in less than two years.  The FTC alleges that these failures led to fraudulent charges on consumers’ accounts, over $10.6 million in fraud loss, and the export of over 500,000 consumers’ payment card account information to an Internet domain address registered in Russia.  The FTC further accuses defendants of engaging in deceptive acts or practices in violation of Section 5 of the FTC Act by falsely representing in their privacy policies that they had implemented reasonable and appropriate measures to protect personal information against unauthorized access.  Finally, the FTC claims defendants committed unfair acts or practices by failing to employ reasonable and appropriate measures to protect personal information against unauthorized access.

Wyndham filed a motion to dismiss in November 2012, taking the FTC head on.  In its motion to dismiss, Wyndham observed that in the FTC’s Report to Congress in 2000, the FTC specifically disclaimed the authority to mandate data security standards through Section 5’s "unfair . . . practices" language, admitting that it "lacks the authority to require firms to adopt information practice policies or abide by the unfair information practice principles on their Web sites, or portions of their Web sites, not directed to children."  Wyndham pointed out that Congress has enacted a vast array of data security laws governing certain limited contexts, such as the Fair Credit Reporting Act (imposing data privacy requirements on consumer reporting agencies), the Gramm-Leach-Bliley Act (mandating data security requirements for financial institutions), the Health Insurance Portability and Accountability Act (addressing the security and privacy of health data), and the Children’s Online Privacy Protection Act (enacting special protections for children).  Wyndham argued that this restricts the FTC’s authority to specified circumstances and indicates that Congress does not believe the FTC has plenary authority over data security practices.  In fact, Congress has considered–and rejected–broad data security legislation numerous times over the last decade, in spite of the FTC’s repeated calls for such legislation.[3]

In response, the FTC argued that Wyndham mischaracterized the Report to Congress in 2000 and that in fact Congress bestowed broad power on the FTC under Section 5 "to address unanticipated practices in a changing economy."  The FTC asserted that its power to address data security is not limited to rulemaking, but was intended to proceed by case-by-case enforcement.  The FTC argued that its case against Wyndham is a run-of-the-mill application of its existing statutory authority to ensure that entities use reasonable measures to protect information collected about consumers.

On March 25, 2013, the District Court for the District of Arizona transferred the case to the District Court for the District of New Jersey, where Wyndham will need to re-file its motion to dismiss.  Wyndham’s unprecedented battle is one data privacy lawyers are following closely–both sides have presented interesting arguments, but neither side hits a clear home run.  The way in which this case plays out could have far-reaching effects on FTC enforcement authority.

                    B.   History Sniffing

In a cautionary tale regarding the necessity of accurate and complete privacy disclosures, online advertising company Epic Marketplace settled charges that it used "history sniffing" to secretly and illegally gather data from millions of consumers about whether they had previously visited more than 54,000 Web sites.  These sites included those related to sensitive medical and financial issues ranging from fertility and incontinence to debt relief and personal bankruptcy.  "History sniffing" is the practice of using code to secretly determine from an Internet user’s Web browser whether that user has previously visited a given Web site.
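The mechanics of the technique can be sketched in a few lines of browser script.  The sketch below is a hypothetical, simplified illustration of the well-known CSS :visited variant of history sniffing; the complaint does not disclose Epic’s actual code, and the `wasVisited` function name and the specific color value are assumptions for illustration only.  The technique exploited the fact that browsers styled visited links differently, and a script could read that styling back:

```javascript
// Hypothetical illustration of CSS :visited history sniffing
// (not Epic Marketplace's actual code).  The page's stylesheet is
// assumed to contain:  a:visited { color: rgb(255, 0, 0); }
// The script creates a link to a target URL and reads back its
// computed color to infer whether the user has visited that site.
function wasVisited(url) {
  const link = document.createElement("a");
  link.href = url;
  document.body.appendChild(link);
  // If the browser applied the :visited style, the user has been there.
  const visited = getComputedStyle(link).color === "rgb(255, 0, 0)";
  document.body.removeChild(link);
  return visited;
}
```

Modern browsers now deliberately report the unvisited style from getComputedStyle for this reason, so the sketch no longer works as written; at the time of the alleged conduct, however, it allowed a page to probe thousands of URLs without the user’s knowledge.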

While Epic’s privacy policy disclosed that it "automatically receives and records anonymous information that your browser sends whenever you visit a website which is part of the Epic Marketplace Network," the FTC alleged in its complaint that this representation was deceptive.  The privacy policy implied that Epic collected information about consumers’ visits to websites only within the Epic Marketplace, when, in fact, Epic used history sniffing to determine whether consumers had visited webpages that were outside the Epic Marketplace Network.  The FTC also asserted that Epic’s failure to specifically disclose the use of history sniffing in its privacy policy constituted a material omission because that information would have been material to consumers’ decisions regarding whether to use Epic’s opt-out mechanism.

As part of the settlement, Epic agreed to refrain from making further misrepresentations regarding the privacy or confidentiality of consumer data or the use of software code on webpages to determine whether a user has previously visited a webpage.  Epic also agreed to refrain from collecting or using any data obtained by history sniffing.  Epic did not pay any penalties as part of the settlement.

This case is yet another example of FTC enforcement action against companies that do not ensure that their privacy policies accurately and fully describe how they collect, use, store, and disclose personal information, including transparency on the emerging types of technologies that are used in the information collection process.

                    C.   Dumpster Diving

Another case that settled this year, United States v. PLS Financial Services, Inc., demonstrated the continued importance of the simplest of analog data security measures:  the shredder.  PLS, which operates 300 payday loan stores in nine states, disseminated Privacy Notices to customers representing that the company "maintain[s] physical electronic and procedural safeguards that comply with federal regulations to guard your nonpublic information."  Contrary to these assertions, the FTC alleged that PLS failed to properly implement and monitor procedures regarding the disposal of sensitive information.  PLS’s failure meant that intact documents containing sensitive customer information (social security numbers, wage information, loan applications, etc.) were disposed of in multiple publicly-accessible dumpsters outside PLS stores.  The FTC asserted that PLS’s failure to take reasonable measures to protect sensitive documents during disposal violated a number of federal laws governing the use of sensitive information by financial institutions and consumer reporting agencies, including the Fair Credit Reporting Act and the FTC’s Disposal Rule, Safeguards Rule, and Privacy Rule.  The FTC also alleged a Section 5 violation for falsely stating in the Privacy Notices that PLS had implemented reasonable and appropriate measures to protect sensitive consumer information from unauthorized access.

As part of the settlement, PLS agreed to injunctive relief prohibiting future violations of the FTC Act, FCRA, and implementing rules.  PLS also agreed to pay a civil penalty in the amount of $101,500 for violations of the FCRA and implementing rules.  Finally, PLS agreed to implement a comprehensive information security program, and to have that program audited biennially for 20 years.

In a world where digital data privacy has become the frontline concern, it is important not to forget the analog basics:  companies should make sure they follow through with their data security and privacy policies both electronically and physically and pay attention to proper disposal methods. 

                    D.   Voyeuristic Rental Computer Software

DesignerWare, a software company, developed a product called PC Rental Agent that it licensed to stores in the computer rent-to-own industry.  When installed on a rented computer, PC Rental Agent allowed a rent-to-own store to disable the computer remotely when a consumer was late on payment, had stopped communicating with the store, or had otherwise violated the rental contract.  The notable aspect of PC Rental Agent was that, through an add-on called Detective Mode, rent-to-own stores could use DesignerWare’s servers to track the physical location of a computer and surreptitiously monitor the activities of the computer’s user, including by keystroke logging, capturing screenshots, and taking pictures using the computer’s webcam.  Additionally, the software collected consumer information for DesignerWare by triggering fake pop-up registration windows that tricked consumers into providing personal information.

DesignerWare recommended–but did not require, contractually or otherwise–that its licensee rent-to-own stores disclose the presence of PC Rental Agent on a rented computer at the time the consumer signed the initial rental agreement.  Additionally, while DesignerWare recommended that its licensees install and activate Detective Mode only to locate and identify the person in possession of a lost or stolen computer, it neither monitored its own collection of Detective Mode data nor limited its licensees’ access to that data to ensure that the information was obtained and used only for this designated purpose. 

The FTC’s action against DesignerWare alleged multiple violations of Section 5 of the FTC Act for this conduct.[4]  Specifically, the FTC alleged that the gathering and disclosure of consumer personal information, the software’s enabling of rent-to-own stores to engage in unfair acts or practices, and the use of fake pop-up registration notices all constituted unfair or deceptive acts or practices in violation of Section 5.

DesignerWare settled the FTC’s charges by entering into a consent order that (1) permanently prohibited it from using, licensing, selling or otherwise providing any monitoring technology, (2) limited its use of location tracking technology and required clear notice and express user consent, (3) prohibited it from deceptively gathering consumer information, (4) required it to delete or destroy any improperly collected data, and (5) forbade it from making any misrepresentations about its privacy and data security practices.  DesignerWare paid no penalties in connection with the settlement, and was required to provide a compliance report within sixty days of the service of the order and upon the FTC’s request thereafter.

                    E.   Children’s Privacy

In a case that illustrates the FTC’s aggressive focus on children’s privacy (also reflected in legislative developments discussed below), the maker of a social networking app, Path Inc., paid $800,000 to settle charges that it illegally collected children’s personal information without parental consent.  Path also settled charges that it deceived users of all ages by collecting personal and contact information from users’ cell phones without permission, agreeing to establish a comprehensive privacy program and undergo privacy reviews every other year for 20 years. 

The FTC’s complaint against Path–whose app allows users to share journal entries, photos, the user’s location, and other personal information–alleged that Path’s user interface was misleading.  Even if the user did not opt to allow it, the app automatically collected and stored information from the user’s address book–including names, addresses, phone numbers, email addresses, and other personal information.  Moreover, the complaint alleged, Path’s privacy policy misled users about the personal information Path collected, claiming only to collect limited information such as IP address and browser type.  Finally, the FTC charged Path with violating the Children’s Online Privacy Protection Act (COPPA) Rule for collecting personal information from approximately 3,000 children under the age of 13, as shown through the dates of birth provided at account registration, without first getting parental consent.

As discussed in greater detail below, the COPPA Rule requires that online sites and services directed to children, as well as those with actual knowledge that they are collecting information from child users, notify parents and obtain their consent before collecting, using, or disclosing personal information from children under the age of 13.  Sites covered by the Rule must also post clear and transparent privacy policies.  The FTC alleged that Path violated the COPPA Rule by failing to adequately post an explanation of its use of children’s personal information, not providing parents with notice, and not verifying parental consent before collecting a child’s personal information.

Together with the $800,000 civil penalty and periodic privacy review mentioned above, the settlement requires Path to delete information obtained from children under the age of 13 and any personal information it obtained when its allegedly deceptive practices were in place.  The settlement also prohibits Path from making misrepresentations in its privacy policy regarding the collection of personal information. 

III.   Criminal Enforcement

2012 was yet another busy year for U.S. law enforcement officials prosecuting cybercrime and computer-related criminal activity.  The U.S. government continues to vigorously pursue computer-related threats and cybercrime under the Computer Fraud and Abuse Act ("CFAA"), 18 U.S.C. § 1030.  The prosecution and subsequent suicide of Aaron Swartz, however, have recently led to closer media scrutiny of the criminal treatment of "hacking" activity.  It remains to be seen how this high-profile case may affect future enforcement decisions.

                    A.   Prosecution and Suicide of Aaron Swartz

Aaron Swartz was a talented computer programmer and a zealous advocate of the view that information should be free.  One of his earlier exploits entailed downloading 19 million pages of court documents from PACER using computers in public libraries and then uploading the documents to the cloud for anyone to access.  In 2010, he accessed MIT’s network with the aim of downloading as many academic journal articles as possible from JSTOR (short for "Journal Storage"), a digital library available through paid subscription.  He downloaded almost five million documents and was subsequently arrested and charged in July 2011 with two counts of wire fraud and eleven violations of the CFAA.  The charges could have resulted in 35 years in jail and a $1 million fine.  Federal prosecutors reportedly offered Mr. Swartz a plea bargain that would recommend six months in a minimum security facility.  Mr. Swartz committed suicide on January 11, 2013, at the age of 26.

The belief that Mr. Swartz’s suicide stemmed from the prosecution has led supporters to recommend changes to the CFAA.  For example, Representative Zoe Lofgren (D-CA) has proposed a bill to amend the CFAA to exempt violations of contractual obligations, such as terms of service agreements.  Prosecutors defended their actions, with the U.S. Attorney for Massachusetts, Carmen Ortiz, stating that "this office’s conduct was appropriate in bringing and handling this case. The career prosecutors handling this matter took on the difficult task of enforcing a law they had taken an oath to uphold, and did so reasonably."  January 16, 2013 Statement of United States Attorney Carmen Ortiz Regarding the Death of Aaron Swartz.  U.S. Attorney Ortiz noted that prosecutors recognized that there was no evidence that Mr. Swartz downloaded the material for personal gain and that the recommended six months of incarceration was an "appropriate sentence that matched the alleged conduct."  Id. 

                    B.   United States v. Scheinberg, No. 10-CR-336 (S.D.N.Y. Mar. 10, 2011)

As described in our 2011 Year-End Data Privacy and Security Update, in April 2011, the Department of Justice announced criminal and civil charges against the founders of the three largest Internet poker companies operating within the United States–PokerStars, Full Tilt Poker, and Absolute Poker–alleging that the companies engaged in bank fraud, money laundering, and violations of gambling laws including the Unlawful Internet Gambling Enforcement Act.  The FBI also seized the domain names of five related websites.

On July 31, 2012, the DOJ reached a settlement with PokerStars and Full Tilt Poker that resolved all civil charges against the companies.  Under the settlement, PokerStars took control of all of Full Tilt Poker’s assets, agreed to forfeit $547 million to the U.S. government, and agreed to make $184 million available to repay non-U.S. customers of Full Tilt Poker, for a total of $731 million.  PokerStars and Full Tilt Poker have reopened for non-U.S. customers, but no longer serve U.S. players.  Individuals in the United States are prevented from playing on the sites through IP filtering, and PokerStars has suspended accounts of players who have tried to evade the blocking of U.S. players.  Several of the individual defendants have pled guilty to various bank fraud and gambling charges, but the criminal cases against the majority of the individual defendants, including the principals of PokerStars and Full Tilt Poker, are ongoing.
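
The mechanics of that kind of IP filtering can be sketched in a few lines.  This is an illustration only, assuming a static table of U.S.-attributed address blocks; the CIDR ranges and function names are hypothetical, and production services typically rely on commercial geolocation databases:

```python
import ipaddress

# Hypothetical geo-IP table: CIDR blocks attributed to U.S. providers.
# Real implementations consult a commercial geolocation database.
US_BLOCKS = [
    ipaddress.ip_network("12.0.0.0/8"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def is_us_player(ip: str) -> bool:
    """Return True if the address falls within a U.S.-attributed block."""
    addr = ipaddress.ip_address(ip)
    return any(addr in block for block in US_BLOCKS)

def admit(ip: str) -> str:
    # Connections resolving to U.S. address space are refused service.
    return "blocked" if is_us_player(ip) else "allowed"
```

Because geolocation tables are imperfect and players can route through proxies, a scheme like this is typically paired with account-level monitoring, which is consistent with the account suspensions described above.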

                    C.   In Re:  Grand Jury Subpoena Duces Tecum Dated March 25, 2011, United States v. John Doe, Nos. 11-12268 & 11-15421 (11th Cir. Feb. 23, 2012)

On February 23, 2012, the Eleventh Circuit ruled that an individual could not be compelled to decrypt the contents of his hard drives because of the Fifth Amendment privilege against self-incrimination.  John Doe had been served with a subpoena to appear before a grand jury with the unencrypted contents of hard drives previously seized from him as part of a child pornography investigation.  The hard drives were encrypted using the "TrueCrypt" program, which meant the government could not find any files or prove that any files existed on the hard drives.  After Doe refused to decrypt the hard drives and invoked his Fifth Amendment privilege, a district court judge rejected his arguments and held him in contempt of court.  This case marked the first time a circuit court of appeals has weighed in on the issue, despite several rulings by district courts in recent years, including United States v. Fricosu, No. 10-CR-00509-01-REB (D. Colo. May 6, 2011).  The Eleventh Circuit found that "Doe’s decryption and production of the contents of the drives would be testimonial, not merely a physical act" because the decryption "would require the use of the contents of Doe’s mind."  The ruling also distinguished the holdings of Fricosu and other precedent because in those cases the trial judges found that the purported testimony being compelled was a foregone conclusion, whereas in Doe’s situation the government had no evidence that the hard drives contained incriminating evidence.

                    D.   United States v. Dotcom, No. 12CR3 (E.D. Va. Jan. 5, 2012)

As described in our 2011 Year-End Data Privacy and Security Update, in January 2012, the Department of Justice announced that it had charged Megaupload, an online file-sharing website, and seven individuals with operating an "international organized criminal enterprise" engaged in racketeering, money laundering, and copyright infringement.  In its indictment, the government alleged that Megaupload generated more than $175 million in revenue and led to more than $500 million in damage to copyright holders.  A district court judge ordered the seizure of 18 domain names affiliated with Megaupload, and four of the individual defendants were arrested in New Zealand.

There have been several developments in the case this past year in New Zealand, where Kim Dotcom is a legal resident.  On June 28, 2012, the High Court of New Zealand ruled that the warrants to seize Dotcom’s property were invalid because they were overbroad.  On September 24, 2012, the Prime Minister of New Zealand John Key admitted that the New Zealand Government Communications Security Bureau had unlawfully intercepted Dotcom’s communications to assist in the investigation of Megaupload, which the agency was not permitted to do because Dotcom was a New Zealand resident.  Key requested an inquiry by the New Zealand Inspector General of Intelligence and Security into the incident.  The Department of Justice continues to seek the extradition of Dotcom to the United States.

                    E.   United States v. Nosal, 642 F.3d 781 (9th Cir. 2011), rev’d No. 10-10038 (9th Cir. Apr. 10, 2012) (en banc)

In April 2011, a three-judge panel of the Ninth Circuit reversed the district court’s granting of David Nosal’s motion to dismiss the indictment, holding that an employee "exceeds authorized access" within the meaning of the CFAA when an employee violates an employer’s computer access restrictions, including use restrictions.  Similar to a growing number of employee disloyalty cases involving confidential computer data, Nosal, a former employee of an executive search firm, allegedly enlisted his former colleagues to obtain confidential information from one of his former employer’s databases, which Nosal planned to use to establish his own competing executive search firm.  Notably, the colleagues were authorized to access the data at issue, but the executive search firm had a policy in place that forbade disclosing such confidential information.

On rehearing en banc, the Ninth Circuit held 9-2, in an opinion by Chief Judge Kozinski, that "exceeds authorized access" must be read narrowly and does not extend to violations of an employer’s use restrictions.  Otherwise, the CFAA would criminalize innocuous, everyday violations of employers’ use restrictions, such as personal use of a work computer.  Moreover, corporate use policies are often unclear and subject to change without notice, making it especially problematic for criminal liability to hinge on violations of their terms.  Finally, the court acknowledged that its holding put the Ninth Circuit in disagreement with other circuits on this issue.

                    F.   WEC Carolina Energy Solutions LLC v. Miller, 687 F.3d 199 (4th Cir. 2012)

The Fourth Circuit in WEC Carolina Energy reached the same conclusion as the Ninth Circuit in its en banc decision in Nosal that "authorized access" under the CFAA should be narrowly interpreted.  The case involved a suit by WEC Carolina Energy Solutions, LLC ("WEC") against its former employee, Willie Miller, and his assistant, Emily Kelley.  According to WEC, Miller and Kelley downloaded WEC’s confidential and proprietary documents and emailed them to Miller’s personal email account shortly before Miller resigned from WEC to work for WEC’s competitor, Arc Energy Services, Inc. ("Arc").  Among other claims, WEC alleged that Miller and Kelley had violated the CFAA when they accessed WEC’s computers to send Miller the confidential WEC documents.  The district court, however, dismissed the CFAA claim, on the ground that Miller and Kelley did not act "without authorization" or "exceed authorized access" within the meaning of the CFAA.

The Fourth Circuit affirmed.  The Court explained that there are currently "two schools of thought" regarding the meaning of authorized access under the CFAA.  Id. at 203.  The first–which has been adopted by the Seventh Circuit, and which had been endorsed by the Ninth Circuit’s now superseded panel decision in Nosal–"holds that when an employee accesses a computer or information on a computer to further interests that are adverse to his employer . . . [he] los[es] any authority he has to access the computer or any information on it."  Id. (citing Int’l Airport Ctrs., LLC v. Citrin, 440 F.3d 418, 420-21 (7th Cir. 2006)).  The second–set forth in the Nosal en banc decision–"interprets ‘without authorization’ and ‘exceeds authorized access’ literally and narrowly, limiting the terms’ application to situations where an individual accesses a computer or information on a computer without permission."  Id.  Adopting the latter interpretation, the Fourth Circuit held that the terms "without authorization" and "exceeds authorized access" "apply only when an individual accesses a computer without permission or obtains or alters information on a computer beyond that which he is authorized to access."  Id. at 206.  Because Miller and Kelley only violated WEC’s computer use restrictions–but were authorized to access the computers at issue–the Court found that they did not access a computer without authorization or exceed their authorized access.  Id. at 206-07.

                    G.   Economic Espionage Act

2012 was likewise a busy year for prosecutions under the Economic Espionage Act ("EEA"), with some significant developments in the criminal enforcement of trade secret protection.  Referrals from companies facing trade secret misappropriation serve as the basis for a large percentage of prosecutions in this arena.  In fact, a 2012 report on EEA prosecutions found that in more than 90% of cases, "the defendant was an ‘insider,’ and had access to the trade secret because he was an employee of the victim, or worked for a vendor or contractor of the victim."[5]  That was certainly the case in United States v. Aleynikov–arguably the most significant EEA decision of the year. 

On April 11, 2012, the Second Circuit Court of Appeals reversed the conviction of Sergey Aleynikov, a former computer programmer for a financial institution, who had been found guilty of stealing computer source code for the company’s high-frequency trading ("HFT") program in violation of the National Stolen Property Act ("NSPA") and the EEA.  United States v. Aleynikov, 676 F.3d 71 (2d Cir. 2012).

On appeal, Aleynikov argued that the company’s computer source code was not "related to or included in a product that is produced for or placed in interstate or foreign commerce," as required by the EEA.  Id. at 73.  The Second Circuit agreed, reasoning that the company’s HFT program was not "produced for" or "placed in" interstate commerce, as the company "had no intention of selling its HFT system or licensing it to anyone."  Id. at 82.  Because the HFT program was "not designed to enter or pass in commerce, or to make something that does," the Court found that Aleynikov’s theft of the computer source code for the HFT program "was not an offense under the EEA."  Id.  In a concurring opinion, Judge Calabresi wrote that Congress probably intended to criminalize this type of conduct and expressed the "hope that Congress will return to the issue and state, in appropriate language, what I believe they meant to make criminal in the EEA."  Id. at 83.

Congress took note of this judicial expression of hope in passing the Trade Secrets Clarification Act, which took effect in December 2012, and limited the effect of the Aleynikov decision by striking the requirement in 18 U.S.C. § 1832(a) that the trade secret at issue be "produced for" or "placed in" interstate commerce.  Under the Act, the trade secret need only be "a product or service used in or intended for use in" interstate commerce. 

IV.   Legislative Developments

2012 featured important new regulatory and legislative activity likely to have a substantial impact on commercial entities in the areas of medical privacy, customers’ media viewing histories, mobile app development, and cybersecurity.  Those developments are discussed in the subsections below.

                    A.   HIPAA–Expanding Coverage of Business Associates and Subcontractors

On January 25, 2013, the Department of Health and Human Services ("HHS") published in the Federal Register several hundred pages of modifications to and commentary on the Health Insurance Portability and Accountability Act of 1996 ("HIPAA") Privacy and Security rules.  The most significant changes, mandated under the Health Information Technology for Economic and Clinical Health Act ("HITECH Act"), modify the HIPAA Privacy, Security, and Enforcement Rules as they pertain to "business associates" of covered entities.

Under the new regulations, "business associates" and their subcontractors are now directly subject to many of HIPAA’s Privacy and Security Rules.  Previously, for a person or entity to be subject to HIPAA as a business associate–and the civil and criminal penalties from violations thereunder–the person or entity had to have a business associate agreement with a covered entity.  Now, a "business associate" is a person or entity that, on behalf of a covered entity, "creates, receives, maintains, or transmits protected health information for a function or activity regulated by this subchapter, including claims processing or administration, data analysis, processing or administration, utilization review, quality assurance, patient safety activities listed at 42 CFR 3.20, billing, benefit management, practice management, and repricing," or, importantly, any "subcontractor that creates, receives, maintains, or transmits protected health information on behalf of the business associate."  78 Fed. Reg. 5566, at 5688 (amending 45 C.F.R. § 160.103).  As commentary to the rule states, "A subcontractor is then a business associate where that function, activity, or service involves the creation, receipt, maintenance, or transmission of protected health information."  Id. at 5573.

Business associates and their subcontractors thus defined are directly liable for improper uses and disclosures of protected health information; must disclose protected health information for HHS compliance investigations; must disclose protected health information to a covered entity or individual when either of the latter requests an electronic copy of the same; and are subject to the "minimum necessary" rule requiring covered entities to take reasonable steps to limit most uses or disclosures of protected health information to the minimum necessary to accomplish their intended purpose.  The changes are intended to extend HIPAA coverage to any entities that create or receive protected health information on behalf of a covered entity to perform its health care functions, "no matter how far ‘down the chain’ the information flows."  Id. at 5574. 

Moreover, HHS has redefined "breach" under the Breach Notification rule in 45 CFR 164.402 to clarify that any impermissible access, use, or disclosure of protected health information is presumptively a breach unless a covered entity or business associate can show a low probability that protected health information was compromised.  To evaluate that probability, a risk assessment no longer includes the prior harm standard, but rather involves weighing the nature and extent of protected health information involved; the identity of the unauthorized user or recipient; whether the information was actually received or viewed; and any factors mitigating the risk.  HHS intended the redefinition to make analysis of a potential breach more objective, focusing on the risk that protected health information was compromised, rather than a more subjective analysis of whether the breach caused harm to an individual. 
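
As an illustration only, the four-factor assessment can be modeled as a simple weighing function.  The rule prescribes factors to weigh, not a formula; the numeric scores, weights, and "low probability" threshold below are entirely hypothetical:

```python
# Illustrative sketch of the revised 45 C.F.R. 164.402 risk assessment.
# Each factor is an assessed risk estimate between 0.0 and 1.0;
# mitigation reduces the overall risk. Scores and threshold are hypothetical.
def breach_presumed(nature_and_extent: float,
                    unauthorized_recipient: float,
                    actually_viewed: float,
                    mitigation: float,
                    low_probability_threshold: float = 0.25) -> bool:
    """Return True unless there is a demonstrably low probability
    that protected health information was compromised."""
    risk = (nature_and_extent + unauthorized_recipient + actually_viewed) / 3
    risk *= (1.0 - mitigation)
    return risk > low_probability_threshold  # True -> treat as a breach
```

The point of the revised standard is captured by the default: an impermissible use or disclosure is presumed to be a breach, and the covered entity or business associate bears the burden of showing the probability of compromise is low.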

Other changes further limit the unauthorized use, disclosure, or sale of protected health information for marketing and fundraising purposes; expand individuals’ rights to receive electronic copies of their health information; modify certain authorization requirements regarding child immunization and access to decedents’ health information; adopt changes to enforcement of HIPAA noncompliance due to willful neglect; prohibit most health plans from disclosing genetic information for underwriting purposes; clarify that protected health information is now protected for 50 years after the subject’s death (rather than having no limitation); allow patients paying for treatment out of pocket to restrict insurers from accessing their protected health information; and require modifications to and disclosure of covered entities’ privacy notices (requiring revised notices of material changes to be provided within 60 days).  Importantly, "business associates" are not required to provide a notice of privacy practice or designate a privacy official under the revised regulations.  A covered entity may delegate those responsibilities to the business associate, but liability thereunder would be contractual rather than regulatory.

The modifications are effective as of March 26, 2013; compliance is required within 180 days of that date (September 23, 2013). 

                    B.   Amendment to the Video Privacy Protection Act Allowing Easier Customer Consent to Share Video Viewing Information

A new amendment to the 1988 Video Privacy Protection Act ("VPPA"), 18 U.S.C. § 2710, opens the door to permitting companies to share customers’ video viewing history, with the customer’s consent.

The VPPA was originally enacted in 1988 to prevent the wrongful disclosure of customers’ video rental or sale records, or viewing histories of other similar media.  The VPPA was passed after Supreme Court nominee Robert Bork’s video rental history was published by a newspaper outlet.  The act thus bans the sharing of a person’s video rental history without the person’s written consent.

Representative Bob Goodlatte (R-Va.) introduced a bill, H.R. 6671, on December 17, 2012 to amend the VPPA.  The bill, signed into law on January 10, 2013 following its passage in the Senate, makes it easier for companies to obtain customer consent to share the customer’s viewing history information.  The amendment replaces 18 U.S.C. § 2710(b)(2)(B)–a subsection of the provision permitting covered entities to disclose a consumer’s personally identifiable information–with new language allowing a consumer to give informed written consent to disclose his or her personally identifiable information through the Internet.  Such consent must be "in a form distinct and separate from any form setting forth other legal or financial obligations of the consumer"; given at the time the request for the disclosure is made; and provided with a clear and conspicuous means by which the consumer may withdraw their consent.  Importantly, the amendment permits consumers to give such consent in advance for up to two years, or until the consumer revokes that consent.

A covered entity ("video tape service provider") includes not merely video stores, but "any person, engaged in the business, in or affecting interstate or foreign commerce, of rental, sale, or delivery of prerecorded video cassette tapes or similar audio visual materials."  18 U.S.C. § 2710(a)(4).  The amendment benefits, for example, video streaming companies by permitting such entities to integrate greater sharing of customers’ viewing history on social media and beyond, targeting advertising on that basis, after obtaining customers’ consent to do so.

                    C.   Developments Regarding the California Online Privacy Protection Act and Mobile Apps

In the realm of state law, one of the more important developments in 2012 was California Attorney General Kamala Harris’s issuance of warning letters to companies that failed to post privacy policies for mobile apps, followed by controversial recommended privacy practices for such apps.  The letters and report reflect the Attorney General’s view that even non-California-based companies may be subject to enforcement under California state law.

Not to be confused with the federal Children’s Online Privacy Protection Act ("COPPA") that shares its acronym, the California Online Privacy Protection Act, Cal. Bus. & Prof. Code §§ 22575–22579 (2004) ("Cal-OPPA"), is a state law addressing consumers generally, rather than children.  It requires that a covered "operator" of a commercial website (1) post a privacy policy identifying what personally identifiable information it collects (including the names of third parties with which it may share that information); (2) describe the process by which one may review and request changes to his personally identifiable information; (3) describe the process by which the operator notifies users of material changes to the policy; and (4) identify the policy’s effective date.  Id. § 22575(b).  Cal-OPPA defines a covered "operator" broadly, including any person or entity owning a commercial website or online service "that collects and maintains personally identifiable information from" California consumers who use or visit the site or service.  Id. § 22577(c).  As such, according to the California Attorney General, a company that has a mobile app may be governed by Cal-OPPA regardless of where the company is located–so long as it collects and maintains personally identifiable information from California residents.

To that end, Attorney General Harris formally notified "scores" of mobile app developers in late 2012 that they were not in compliance with Cal-OPPA.  The companies were given 30 days to conspicuously post, within their apps, a privacy policy (or a link to one) informing users what personally identifiable information is collected from them and how it will be used.  The letters warned that Cal-OPPA violations may result in penalties of up to $2,500 each time an allegedly unlawful app is downloaded by a California consumer.  The warning letters were followed in December 2012 by the commencement of a lawsuit against Delta Airlines–a company headquartered in Georgia and incorporated in Delaware–for allegedly distributing a mobile application without a privacy policy in violation of Cal-OPPA.  The case is currently pending in San Francisco County Superior Court, and Delta Airlines has filed a demurrer, which has yet to be ruled upon.

In January 2013, Attorney General Harris followed up by releasing a 23-page report recommending that app developers, app store operators, and other entities limit data collection and retention; avoid using global device identifiers that could be correlated across apps; encrypt data; limit employee access to users’ personal data; use enhanced "special notices" to draw users’ attention to an app’s privacy practices; and simplify the language used in privacy policies.  The Attorney General’s office admitted that "[t]he recommendations go beyond the law," but noted that the state’s goal was to educate mobile developers as to California’s view of best practices in the mobile app arena.  The FTC in fact later released a similar report recommending several mobile privacy practices, including increased "just-in-time" disclosures to mobile app consumers regarding the information being collected from them, improved coordination between mobile platforms and developers, and do-not-track mechanisms for mobile users.  These developments suggest that mobile privacy should continue to be an important focus of state and federal authorities in the months and years to come.

                    D.   FTC Adopts Long-Awaited Modifications to the Children’s Online Privacy Protection Rule 

The Children’s Online Privacy Protection Act of 1998 ("COPPA"), 15 U.S.C. § 6501, et seq., and its implementing rule adopted by the FTC (the "COPPA Rule") are designed to give parents greater control over the personal information that websites and online services collect from children under the age of 13.  Specifically, the statute and COPPA Rule require (a) operators of websites or online services that are directed to children under the age of 13, and (b) operators of websites and online services that have actual knowledge that they are collecting personal information from children under the age of 13, to provide notice to parents and obtain their verifiable consent before collecting, using, or disclosing personal information from children under the age of 13. 

On December 19, 2012, the FTC adopted final changes to the COPPA Rule that significantly expand the scope of businesses and personal information covered by COPPA.  Key modifications include:

  • The scope of "personal information" covered by the COPPA Rule is expanded to include geolocation information, photographs, and videos, as well as persistent identifiers that can be used to recognize users over time and across different websites or online services.  The original COPPA Rule covered individually identifiable information about a child, such as the child’s full name, home address, email address, telephone number, or other information that would allow someone to identify or contact the child, as well as other types of information, such as hobbies and interests, when that information was tied to individually identifiable information.  The COPPA Rule modifications expand the categories of covered information to include geolocation information, as well as videos, photographs, and audio files that contain a child’s image or voice.  In addition, and perhaps most controversially, the COPPA Rule now covers the collection of persistent identifiers that can be used to recognize users over time and across different websites or online services, such as IP addresses and mobile device IDs, when those persistent identifiers are not collected solely to support a website’s internal operations.  Hence, a child-directed website is now required to obtain verifiable consent from a child’s parents before using persistent identifiers stored in cookies to serve behaviorally targeted advertisements on its site.
  • Child-directed websites and services that integrate outside services such as plugins or advertising networks are now responsible for the collection practices of those third parties.  The final Rule expands the definition of "operator" to reach child-directed online websites and services that do not themselves collect personal information from children, but that "benefit" when they allow personal information to be collected on their sites by third-party online services and plugins.  Hence, a child-directed website that collects no information about its users, but that integrates a social media plugin permitting visitors to post personal information on third-party websites, may now be found to be in violation of the COPPA Rule if it does not obtain verifiable parental consent before allowing children under the age of 13 to use the plugin.
  • General audience plugins and ad networks are now covered by the COPPA Rule whenever they have actual knowledge that they are collecting personal information from websites that are directed to children.  The final Rule expands the definition of "a website or online service directed to children" to include general-audience plugin providers and ad networks that have actual knowledge that they are collecting personal information through a child-directed website or online service.  The FTC does not specify a precise test for what constitutes "knowledge" under this portion of the rule, deeming the inquiry "highly fact specific."  The Commission does advise, however, that the actual knowledge standard will likely be met in most cases when "(a) a child-directed content provider directly communicates the child-directed nature of its content to the online service; or (b) a representative of the online service recognizes the child-directed nature of the content."
  • Sites and services that are likely to attract, but that do not target, children may rely upon age gates to identify children under the age of 13.  The final COPPA Rule provides that a website or online service that contains subject matter that is likely to attract children under the age of 13, but that does not target children under the age of 13 as its primary audience, will not be considered "directed to children" if it asks users to identify their birthdates, and then refrains from collecting information from users who self-identify as being under the age of 13.
  • New data retention and deletion requirements.  The final COPPA Rule requires websites and online services to retain personal information collected online from a child for only as long as is reasonably necessary to fulfill the purpose for which the information was collected.  The operator must further delete such information using reasonable measures to protect against unauthorized access to, or use of, the information in connection with its deletion.
  • Strengthened confidentiality and security requirements.  The final COPPA Rule requires operators to establish and maintain reasonable procedures to protect the confidentiality, security, and integrity of personal information collected from children, including reasonable steps to release children’s personal information only to service providers and third parties that are capable of maintaining the confidentiality, security, and integrity of such information, and that provide assurances that they will maintain the information in such a manner.
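The age-screening mechanism described in the fourth bullet above can be illustrated with a short sketch.  This is a hypothetical illustration only, not legal guidance or a description of any required implementation; the function names and the simple birthdate arithmetic are our own assumptions about one way a mixed-audience site might screen out users under 13 before collecting personal information.

```python
from datetime import date
from typing import Optional

# Hypothetical sketch of a COPPA-style age gate for a mixed-audience site.
# A neutral age screen asks each user for a birthdate, and the site refrains
# from collecting personal information from users who self-identify as
# being under 13 (absent verifiable parental consent).

COPPA_AGE_THRESHOLD = 13  # COPPA applies to children under the age of 13


def age_on(birthdate: date, today: date) -> int:
    """Return the user's age in whole years as of `today`."""
    years = today.year - birthdate.year
    # Subtract one year if this year's birthday has not yet occurred.
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years


def may_collect_personal_info(birthdate: date, today: Optional[date] = None) -> bool:
    """True if the site may collect personal information from this user
    without first obtaining verifiable parental consent."""
    today = today or date.today()
    return age_on(birthdate, today) >= COPPA_AGE_THRESHOLD
```

For example, a user born July 2, 2001 would be 11 years old on July 1, 2013, so collection would be blocked pending verifiable parental consent, while a user born July 1, 2000 turns 13 that day and would pass the screen.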

The modifications to the COPPA Rule go into effect on July 1, 2013.

                    E.   Executive Order Improving Critical Infrastructure Cybersecurity

On February 12, 2013, President Obama signed an executive order seeking to strengthen the cybersecurity of critical infrastructure by directing the development of a public-private cybersecurity framework and by increasing information sharing between the public and private sectors.

Homeland Security must, within 150 days, draw on public and private sector experts to "identify critical infrastructure where a cybersecurity incident could reasonably result in catastrophic regional or national effects on public health or safety, economic security, or national security."  Id. § 9(a).  Homeland Security must also confidentially notify owners and operators of critical infrastructure that they have been identified as such.  Id. § 9(c).  The list must be updated and provided to the President on an annual basis.  Id. § 9(a).

The order’s central directive tasks the National Institute of Standards and Technology with leading the development of a cybersecurity framework incorporating consensus and international standards and best practices.  The framework is to provide a "prioritized, flexible, repeatable, performance-based, and cost-effective approach, including information security measures and controls, to help owners and operators of critical infrastructure identify, assess, and manage cyber risk."  Id. § 7(b).  The preliminary framework must be published within 240 days, and a final version must be published within one year.

Regarding information sharing, the order directs the Attorney General, the Secretary of Homeland Security, and the Director of National Intelligence to issue instructions and processes within 120 days ensuring the timely production and dissemination of unclassified reports of cyber threats targeting a specific entity.  Id. § 4(a)-(b).  The order further requires Homeland Security and the Secretary of Defense to establish procedures to expand the Enhanced Cybersecurity Services program to all critical infrastructure sectors, including a voluntary information sharing program between the public and private sector.  Id. § 4(c).  The order further requires Homeland Security to expedite processing of security clearances for appropriate personnel of critical infrastructure owners and operators, and to expand consultation with private sector experts regarding cybersecurity issues.  Id. §§ 4(d)-(e), 6.[6]  Other provisions of the order include:  requiring Homeland Security to establish a voluntary incentive-based program supporting the cybersecurity framework (§ 8); calling for agencies to review existing cybersecurity regulations for sufficiency under the framework (§ 10); and requiring agencies to incorporate safeguards for privacy and civil liberties in accordance with the Fair Information Practice Principles ("FIPPs") and other best practices (§ 5).

The full text of the order is available on the White House website.

                    F.   Privacy Legislative Outlook for 2013

2013 is also poised to be a year filled with important privacy law developments, particularly in light of President Obama’s call for cybersecurity legislation in his State of the Union address, followed just days later by allegations that a unit of China’s People’s Liberation Army orchestrated a massive wave of security breaches on U.S. government and commercial computer systems.

In addition to cybersecurity, additional potential legislative topics on the horizon for 2013 may include:  an increased focus on mobile privacy and geolocation; the privacy implications of domestic aerial drone use; DNA collection from criminal defendants prior to conviction; and rulemaking regarding airport security screening procedures. 

Gibson Dunn will continue to monitor these and other developments in the privacy sector.

V.   Data Security

Data breaches continue to grow in both number and scale.  This past year saw major hacks at Zappos (24M customer accounts), Stratfor (private U.S. intelligence firm; 5M e-mails), Global Payments (1.5M credit card numbers), LinkedIn (6.5M passwords), eHarmony (1.5M passwords), Yahoo (0.5M passwords), Nationwide Mutual (1.1M customer accounts), and Wyndham Worldwide (600K credit card numbers).  According to industry reports, this past year also saw a sharp increase in browser-related exploits, such as luring an individual to a trusted website that has been infected with malicious code.  Using browser vulnerabilities, attackers can install malware on the target system.  In addition, the rise of "bring your own device" policies in the corporate world has led to security challenges for organizations.  For example, many large organizations reported that security breaches were caused by their own staff, most commonly through ignorance of security practices.

Stuxnet, the advanced computer worm discovered in June 2010, continued to make news.  The worm is believed to have been created by the governments of the United States and Israel to attack Iran’s nuclear facilities.  Researchers uncovered a new variant of Stuxnet that was released in 2007, two years before any other known variant.  This newly discovered variant sabotaged uranium enrichment equipment in different ways from later variants, and researchers believe that the 2007 variant was operational inside Iran’s nuclear facilities.

                    A.   State and Local Governments

This past year saw a dramatic increase in the number of breaches at state and local governments.  Leading the pack was the South Carolina Department of Revenue, where an employee fell for a phishing e-mail that allowed hackers to steal 75GB of data containing the social security numbers, credit card numbers, and bank account information of 3.8M residents.  The data also contained information about 700,000 businesses.  The governor faulted outdated IRS standards, which did not require social security numbers to be encrypted.  Another major hack affected the New York State Electric & Gas Company, in which 1.8M customer files were stolen that included social security numbers and some financial information.  Investigations of the hack faulted out-of-date data security standards.  Other notable breaches occurred at the California Department of Social Services (700K employees’ payroll information), the Utah Department of Health (780K citizens’ health information), and the California Department of Child Support Services (800K health and financial records).  Many of these attacks could have been prevented by following up-to-date security standards.

                    B.   United States Sentencing Guidelines

The hacker group Anonymous hacked the website of the U.S. Sentencing Commission in January 2013 in retaliation for what it perceived as the overzealous prosecution leading to the suicide of Aaron Swartz, discussed above in Section III.A.  Anonymous distributed encrypted files that it claims are government documents, and threatened to release the decryption keys if the government failed to reform the cybercrime laws under which Swartz was prosecuted.

                    C.   Foreign Theft of Trade Secrets

In February 2013, the computer security firm Mandiant reported that a Chinese military unit had hacked at least 115 American companies, putting their confidential information at risk.  In 2012, in the wake of similar reports of hacking at major American companies, the United States government took steps to address this growing concern.  Congress enacted both the Foreign and Economic Espionage Penalty Enhancement Act of 2012 and the Theft of Trade Secrets Clarification Act, which aim to increase liability and penalties for trade secret theft.  Moreover, on February 12, 2013, the President signed the Executive Order on Improving Critical Infrastructure Cybersecurity discussed above and, on February 20, 2013, announced a five-prong strategy to mitigate the theft of American trade secrets.  The five prongs are:  (1) focusing diplomatic efforts to protect trade secrets overseas; (2) promoting voluntary best practices by private industry to protect trade secrets; (3) enhancing domestic law enforcement operations; (4) improving domestic legislation; and (5) increasing public awareness and stakeholder outreach.

VI.   International Developments

                    A.   European Union

                                        1.   Ambitious Requests behind Proposed New Privacy Regulation & E-Discovery

Following the release of the European Commission’s proposed new privacy regulation in January 2012, European Union ("EU") institutions have been engaged in reviewing the proposal, which would replace the 1995 Data Protection Directive.  As part of the first phase of the legislative process, the European Parliament issued two reports in December 2012 in which it called for stronger protection of personal data in the EU.  The reports made specific recommendations, such as establishing users’ right "to be forgotten" and not to be subject to profiling; emphasizing the extraterritorial effect of EU data protection rules where data on EU individuals are affected; and transposing the "one-stop-shop" principle to data protection (a principle further bolstered by the presentation, on January 9, 2013, of a bill proposing the creation of an EU data protection authority).  Despite numerous warnings from trade associations and corporations regarding the potential impediments to business that such recommendations may entail, the European Commission has welcomed the adoption of these reports.[7]  It appears that a number of the ambitious proposals set out in the reports might be included in the final legislation.

As regards E-Discovery, the Article 29 Working Party responded favorably to the approach recommended in the International Principles on Discovery, Disclosure & Data Protection (see Gibson Dunn’s 2012 Year-End Electronic Discovery and Information Law Update).

                                        2.   Interpretation of EU Cookie Consent Requirement

In June 2012, the Article 29 Working Party (comprising the data protection authorities of the EU Member States) adopted an opinion (the "Opinion on Cookie Consent Exemption") setting out the situations and conditions under which website operators are exempt from the duty to obtain users’ consent before storing or accessing cookies on users’ devices.[8]  The Opinion clarifies that explicit consent must be requested and obtained from users before storing or accessing cookies used for behavioral advertising, analytics, or market research (e.g., social plug-in tracking cookies, third-party advertising cookies, and first-party analytics cookies).  The publication of the Opinion was followed by the appearance of "cookie consent requests" on many websites, indicating its far-reaching effect.

                                        3.   Cloud Computing & Binding Corporate Rules for Data Processors

The collection and processing of data by cloud service providers is a growing concern for technology corporations, as well as for data protection authorities.  The Article 29 Working Party released an opinion in July 2012 (the "Cloud Computing Opinion") in which it addressed some outstanding questions from the sector.  After emphasizing the risks of wide-scale deployment of cloud computing services, the Cloud Computing Opinion clarifies that controllers established in the EU that decide to contract for cloud computing services are required to choose a processor providing sufficient guarantees of technical security and organizational measures governing data processing, and must ensure compliance with those measures.[9]  On September 27, 2012, in the framework of a proposal to improve the utilization of cloud computing by EU businesses and governments, the European Commission highlighted the importance of adopting the new EU data protection framework to ensure protection of customers’ data stored in cloud services, as well as to guarantee the mobility of that data.[10]

On December 21, 2012, the Article 29 Working Party approved the use of Binding Corporate Rules ("BCRs") for data processors as of January 1, 2013.[11]  BCRs, which were already an option for data controllers, constitute an internal code of conduct regarding data privacy and security, and will help certify that transfers of personal data outside the EU performed by a processor take place in accordance with EU data protection rules.

                                        4.   Enforcement of Data Protection Rules in the EU

                    EU-led Google Privacy Policy Investigation

In February 2012, the French Commission for IT and Freedoms ("CNIL") was appointed by the Article 29 Data Protection Working Party to lead the analysis and scrutiny of Google’s new privacy policy for its user services.  Following the investigation, the EU data protection authorities addressed a joint letter to Google that included recommendations to provide clearer information to users and to offer them improved control over the combination of data across Google’s numerous services.  The EU data protection authorities also suggested that Google modify its tools to avoid excessive collection of data.  This is the first time that the EU data protection authorities have sent a jointly signed letter urging a private party to comply with EU rules.

Separately from the privacy policy investigation, in July 2012, Google admitted to the CNIL and the UK Information Commissioner’s Office ("ICO") that it had failed to completely delete private data gathered by Google Street View cars, including individuals’ wireless connection passwords and login details.  While the ICO suggested, on a preliminary basis, that Google had made several improvements to enhance privacy, the CNIL has asked Google to provide the data still in its possession while the investigation continues. 

                    Other Examples of Enforcement by EU Data Protection Authorities

At the national level, EU data protection authorities have been actively enforcing national data protection regulations.

In the UK, the ICO has shown a continuing willingness to impose sanctions on data controllers for breaches of data protection legislation.  In contrast to the four monetary penalties issued in 2011, the ICO imposed fines totaling over GBP 2.5 million (approximately USD 3.9 million) across twenty-three decisions adopted throughout 2012 on the basis of the UK Data Protection Act ("DPA").  In June 2012, the Belfast Health and Social Care Trust received a penalty of GBP 225,000 (approximately USD 362,000) following a serious security breach involving sensitive personal data of thousands of patients (including medical records, X-rays, scans, and lab results) and staff (mainly unopened pay slips).  A few months later, in November 2012, the ICO fined spam "texters" GBP 440,000 (approximately USD 600,000, almost reaching the maximum of GBP 500,000 set by the DPA) for sending huge volumes of text messages without the recipients’ consent and without identifying the sender.  Furthermore, the GBP 250,000 (approximately USD 395,000) fine imposed on Sony in January 2013 for a breach that leaked the details of millions of UK users suggests that the ICO will continue to strongly enforce the DPA throughout this year.

In Germany, social networks were a main area of concern for privacy regulators in 2012.  For example, the Federal Commissioner for Data Protection recently criticized the new terms of use of Couchsurfing, which in his view compel users to entirely abandon control of their personal data; he noted that this "is not permitted under German and EU privacy laws."  On another front, Facebook became a target of privacy regulators and consumer associations:  in 2012, the Federation of German Consumer Organizations ("VZ") and the Hamburg and Kiel Commissioners for Data Protection made various claims that certain Facebook features violated privacy laws or that pseudonymous accounts must be permitted.  Facebook is challenging all of these claims, and no final decisions have been rendered.  Nonetheless, these actions show that regulators and courts in Germany are willing to enforce privacy laws much more strictly than in the past (including against foreign companies), and that topics such as valid consent and the lawful transfer of data can be expected to play a major role in future enforcement practice.

In Belgium, the Privacy Commission ("CPVP") opened an investigation on January 3, 2013 following a data breach in the systems of SNCB Europe (the national railway operator) affecting several thousand users’ data.[12]  The file has reportedly been passed on to the prosecutor, and SNCB Europe faces a possible fine of up to EUR 150,000 (approximately USD 196,000).

                                        5.   France

As personal data becomes central to an increasingly digital world, the CNIL stated in a July 2012 press release that it would rethink its role and response tools to better address online data flows.[13]  With the rise of online social networking, users in France are calling for greater transparency and the ability to manage and control their own data.  The CNIL has proposed educational solutions, including programs designed to raise awareness of the reasoned use of digital technology.

In a report issued in September 2012,[14] the CNIL examined the implications of connected TV for users’ right to privacy.  The French authority has worked with other professionals in this sector to determine the actual use of connected TV services and to define better regulations for this technology.

In November 2012, the CNIL published an English version of its two "advanced" security and privacy risk management guides[15].  They consist of a privacy risk management methodology and a catalogue of measures and best practices to help organizations choose the appropriate controls to protect their data processing operations.

                                        6.   Germany

                    Cloud Computing

2012 was, in some respects, also the "Year of the Cloud" in Germany.  At its April 2012 meeting, the International Working Group on Data Protection in Telecommunications (the "Berlin Group") released a "Working Paper on Cloud Computing – Privacy and data protection issues" that identified and commented on certain main areas of risk linked to cloud computing, including (i) breaches of information security, such as breaches of the confidentiality, integrity, or availability of (personal) data; (ii) data being transferred to jurisdictions that do not provide adequate data protection; (iii) acts in violation of laws and principles of privacy and data protection; (iv) the controller accepting standard terms and conditions that give the cloud service provider too much leeway; and (v) cloud service providers or their subcontractors using the controllers’ data for their own purposes without the controllers’ knowledge or permission.  The Berlin Commissioner concluded with a commitment that "privacy may not evaporate in the cloud."

Similarly, the Federal Commissioner for Data Protection recently complained about "the often unmindful and uninhibited" treatment by cloud service providers of user data and stressed that "users very often do not even know which data is collected and for what purpose."  Given that German and EU law require that express, informed and voluntary consent be provided by users for the collection, use, and particularly for the transfer of personal data, it can be reasonably expected that cloud computing will be under increased scrutiny by German regulators in 2013.

                    Expiry of Transition Period for Marketing

The transition period for the processing and use of data collected for marketing purposes prior to September 2009 expired on August 31, 2012.  Therefore, as of September 1, 2012, personal data may be processed or used for marketing purposes only if the data subject has given his or her consent.  Under German law, a consent declaration must clearly specify the purpose and scope of the processing and use of the data.  Notwithstanding this consent requirement, certain exceptions still apply; for example, no consent is required if a company uses a limited set of personal data (namely, the name, title, degree, address, year of birth, and profession) that it has itself collected from its customers for its own marketing purposes.

                    Draft Bill of Employee Data Protection Act

As explained in Gibson Dunn’s 2011 Year-End Privacy and Security Update, the German legislature has been debating a draft Employee Data Protection Act.  The bill, which would provide much more detailed (and in some respects more restrictive) regulation of the use of employee data, is still in the parliamentary process but currently on hold.  In general, the Employee Data Protection Act was intended to implement specific restrictions on employee background screenings, video surveillance at work, the use of social media, positioning systems, and employee health checks.  For international companies, the bill would facilitate data transfers to affiliated companies outside the European Union.  The bill is no longer expected to be enacted in the coming months and remains subject to ongoing public debate and developments at the EU level.

                    Employee Statements in Social Media

Social media gained importance in civil litigation in Germany, both as evidence and in terms of degree of publicity.  In several court proceedings, private employee statements about co-workers or employers published on social media platforms (such as Facebook and Twitter) played a central role.  These statements were considered "public" (in particular where an individual was followed by or connected to a significant number of other employees of the same company), rather than personal opinions expressed within a closed group of friends.  Therefore, they were found to potentially justify termination of employment or result in the personal liability of the employee in question.

                                        7.   Spain

In May 2012, the Spanish Audiencia Nacional (National High Court) made a reference for a preliminary ruling to the Court of Justice of the European Union (the "CJEU") regarding users’ "right to be forgotten" in search engines.  The reference includes questions raised in a claim brought by Google against five decisions issued by the Spanish Data Protection Agency ("AEPD") requiring Google to remove from its organic search results certain links to websites containing information that affected individuals’ privacy.[16]  There is little doubt that the CJEU’s judgment will strongly influence, if not determine, the outcome of the legislative process in Brussels.

                                        8.   The Netherlands

Following a joint investigation by the Canadian and Dutch data protection authorities (in itself, a milestone in global privacy protection), the Privacy Commissioner of Canada ("OPC") and the Dutch Data Protection Authority ("CBP") released on January 28, 2013 their findings regarding the handling of personal information by mobile messaging platform WhatsApp.  The OPC and the CBP found that WhatsApp had violated Canadian and Dutch privacy laws, but also indicated that the company had taken steps to implement many recommendations to make its product safer from a privacy standpoint.  The CBP further noted that there are still a series of outstanding issues which it will examine in the second phase of its investigation.[17]

                    B.   Asia Region 

                                        1.   China

2012 was a year of rapid change for China’s data privacy regulatory framework.  Prior to last year, China’s data privacy laws were limited to an ambiguous right to privacy in the PRC constitution, as well as scattered laws and guidelines regulating the use of personal information by certain industries and government agencies.  2012 saw several major initiatives, signaling that China may be starting to enact personal information protections in line with global standards.  These include the following:

  • As discussed in Gibson Dunn’s 2011 Year-End Data Privacy and Security Update, the Ministry of Industry and Information Technology ("MIIT") introduced "Several Provisions on Regulating the Market Order for Internet Information Services"[18] (the "IIS Provisions"), which took effect on March 15, 2012.  The IIS Provisions cover a wide range of topics regarding consumer protection on the Internet.  In terms of data privacy, the IIS Provisions set forth certain limits on Internet information service providers’ ("IISPs") collection, use, and transmission of "user personal information" ("UPI"), defined as "user-related information that can be used to identify the user, either by itself or in combination with other information."  Among other requirements, IISPs may not collect UPI or transmit it to third parties without users’ consent, and must clearly inform the user of the methods, contents, and purpose of the data collection.  Violations are subject to fines of up to RMB 30,000.
  • On December 28, 2012, China’s legislature adopted the "Decision of the Standing Committee of the National People’s Congress on Strengthening Online Information Protection."[19]  The brief law prohibits the collection of digital personal data by "network service providers and other enterprises and public institutions" "during the course of business" when done without the consent of the data subject, and requires the disclosure of the collection’s objective, scope, and method.  The law also mandates that network service providers and other enterprises and public institutions adopt technological measures to ensure the security of the information, and make public their rules regarding data collection and use.  Notably, the law empowers the private citizen who discovers online data divulging his or her identity or privacy or otherwise infringing on his or her rights, or who is "harassed" by commercial electronic information to request that the network service provider delete related information or take other remedial measures.
  • China’s Standardization Administration issued non-binding guidelines on data privacy, entitled "Information Security Technology–Guidelines for Personal Information Protection within Public and Commercial Services Information Systems,"[20] which came into effect on February 1, 2013.  The guidelines adopt a definition of personal information similar to that of the IIS Provisions, and also contain clauses regarding the collection of "sensitive" versus "general" personal information.  Collection of "sensitive" personal information requires the data subject’s express consent, while "tacit consent" may suffice when collecting "general" personal information.  Only the least amount of personal information necessary to achieve the stated purpose for data collection may be collected, and once the stated purpose is achieved, the information collected must be deleted.  The guidelines also contain provisions regarding consent, disclosure to the data subject, and international data transfers.

                                        2.   Other Jurisdictions

Other notable legislative developments across Asia include:

  • Singapore adopted its first-ever data privacy law (the "Personal Data Protection Act"),[21] which came into effect on January 2, 2013.  The new law requires consent in most cases prior to collecting data, and contains provisions regarding transfers of data abroad and disclosures to data subjects prior to data collection.
  • Portions of Taiwan’s new "Personal Data Protection Act"[22] went into force on October 1, 2012.  The law, which was originally approved by the legislature in May 2010, provides mechanisms by which citizens may request the removal of personal data used for marketing purposes, and contains provisions governing the collection of health records. 
  • Philippine President Benigno Aquino signed the country’s comprehensive Data Privacy Act of 2012[23] into law on August 15, 2012.  The Act establishes privacy as a fundamental human right, and contains requirements for data collectors regarding consent, transfer, and the handling of sensitive personal data.

                [1]               In lieu of responding to Shine the Light requests, a business can choose to comply with the statute by adopting, and disclosing to the public in its privacy policy, either (a) a policy of not disclosing customers’ personal information to third parties for those third parties’ direct marketing purposes unless the customer first affirmatively agrees to that disclosure, or (b) a policy of not disclosing customers’ personal information to third parties for those third parties’ direct marketing purposes if the customer has exercised an option that prevents that information from being disclosed for those purposes.  Cal. Civ. Code § 1798.83(c)(2). 

                [2]               Unless the violation is willful, intentional, or reckless, a business has a complete defense if it remedies a defective disclosure by providing a full, accurate disclosure within 90 days of learning the violation has occurred.  Cal. Civ. Code § 1798.84(d). 

                [3]               On the merits, Wyndham also challenges the FTC’s deception claim on the grounds that the FTC improperly conflated Wyndham’s corporate privacy policy–which applies only to data collected by the parent corporation–and the data collection practices of the independently owned Wyndham-branded hotels where the data breaches occurred.

                [4]               The FTC also filed complaints against six rent-to-own companies that purportedly installed and used DesignerWare’s software to gather information in connection with collecting or attempting to collect debts pursuant to consumer rental contracts.

                [5]               See Peter Toren, "A Report on Prosecutions Under the Economic Espionage Act," Trade Secret Law Summit, AIPLA Annual Meeting (Oct. 23, 2012), available at (last visited January 7, 2013). 

[6]               The information sharing provisions in the executive order evoke and provide an alternative to the February 13, 2013 reintroduction of H.R. 3523, the Cyber Intelligence Sharing and Protection Act ("CISPA") by Representatives Mike Rogers (R-Mich.) and C.A. Ruppersberger (D-Md.).  CISPA was originally introduced in 2011, but stalled after being referred to the Select Committee on Intelligence in May 2012.

                [7]               See:

                [8]               Opinion 04/2012 on Cookie Consent Exemption, Article 29 Data Protection Working Party, 7 June 2012, available at:

                [9]               See Opinion 05/2012 on Cloud Computing, Article 29 Data Protection Working Party, 1 July 2012, available at:

                [10]             See:

                [16]             See Reference for a preliminary ruling from the Audiencia Nacional lodged on March 9, 2012 in Case C-131/12 Google Spain, S.L., Google Inc. v Agencia Española de Protección de Datos.

                [17]             See:

                [18]             Several Provisions on Regulating the Market Order for Internet Information Services (规范互联网信息服务市场秩序若干规定) available at

                [19]             Decision of the Standing Committee of the National People’s Congress on Strengthening Online Information Protection (全国人民代表大会常务委员会关于加强网络信息保护的决定) available at

                [20]             Information Security Technology – Guidelines for Personal Information Protection within Public and Commercial Services Information Systems (信息安全技术 公共及商用服务信息系统个人信息保护指南) available at

                [21]             Personal Data Protection Act 2012, No. 26 of 2012 (approved Oct. 15, 2012) (Singapore).

                [22]             Personal Data Protection Act, Presidential Decisions Directive No. 09900125121 (approved May 26, 2010) (Taiwan).

                [23]             Data Privacy Act 2012, Rep. Act No. 10173 (approved Aug. 15, 2012) (Philippines).

Gibson, Dunn & Crutcher LLP 

The following Gibson Dunn attorneys assisted in preparing this client alert:  Amanda Aycock, Alejandro Guerrero Perez, Brandon Halter, Joshua Jessen, Scott Mellon, Vivek Narayanadas, Shawn Rodriguez, Alexander H. Southwell, Oliver Welch, Susannah Wright and Adam Yarian. 

Gibson, Dunn & Crutcher’s lawyers are available to assist with any questions you may have regarding these issues.  For further information, please contact the Gibson Dunn lawyer with whom you work or any of the following members of the Information Technology and Data Privacy Group:

United States
S. Ashlie Beringer – Co-Chair, Palo Alto (+1 650-849-5219, [email protected])
M. Sean Royall – Co-Chair, Dallas (+1 214-698-3256, [email protected])
Alexander H. Southwell – Co-Chair, New York (+1 212-351-3981, [email protected])
Debra Wong Yang – Co-Chair, Los Angeles (+1 213-229-7472, [email protected])
Howard S. Hogan – Member, Washington, D.C. (+1 202-887-3640, [email protected])
Karl G. Nelson – Member, Dallas (+1 214-698-3203, [email protected])

Europe
James A. Cox – Member, London (+44 207 071 4250, [email protected])
Andrés Font Galarza – Member, Brussels (+32 2 554 7230, [email protected])
Kai Gesing – Member, Munich (+49 89 189 33-180, [email protected])
Bernard Grinspan – Member, Paris (+33 1 56 43 13 00, [email protected])
Alejandro Guerrero Perez – Member, Brussels (+32 2 554 7218, [email protected])
Jean-Philippe Robé – Member, Paris (+33 1 56 43 13 00, [email protected])
Michael Walther – Member, Munich (+49 89 189 33-180, [email protected])

Asia
Kelly Austin – Member, Hong Kong (+852 2214 3788, [email protected])

Jai S. Pathak – Member, Singapore (+65 6507 3683, [email protected])

Questions about SEC disclosure issues concerning data privacy and cybersecurity can also be addressed to any of the following members of the Securities Regulation and Corporate Disclosure Group:

Amy L. Goodman – Co-Chair, Washington, D.C. (202-955-8653, [email protected])
James J. Moloney – Co-Chair, Orange County, CA (949-451-4343, [email protected])
Elizabeth Ising – Member, Washington, D.C. (202-955-8287, [email protected])

© 2013 Gibson, Dunn & Crutcher LLP

Attorney Advertising:  The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.