Cybersecurity and Privacy Law Developments in Q2 of 2020
Cybersecurity and privacy law is evolving rapidly as lawmakers, government agencies and plaintiffs respond to the growth of new technologies, privacy concerns and cyberattacks. Businesses are facing new compliance obligations, greater legal uncertainty, and expanding liability risk from data breaches and privacy scandals. This trend will only increase as social-distancing measures in response to the COVID-19 public health emergency drive people and businesses to greater reliance on digital and telecommunications services. Keeping track of the many legal developments can be challenging, but Robinson Bradshaw attorneys are here to help. We publish quarterly updates to highlight noteworthy developments of cybersecurity and privacy law from the previous quarter. Click here to subscribe to our Cybersecurity and Privacy list and receive future updates via email, and click here to view all of our quarterly updates.
The second quarter of 2020 was marked by privacy and cybersecurity legal developments related directly or indirectly to the COVID-19 pandemic. These included the Department of Health and Human Services Office of Civil Rights’ announcement that it would exercise enforcement discretion to allow certain uses of protected health information to combat the pandemic; European Data Protection Board guidelines and proposed federal legislation in the United States designed to address privacy challenges related to contact tracing and related public health measures; and the New York attorney general’s announcement of a letter agreement with Zoom Video Communications, Inc., to address cybersecurity concerns that came to light due to the service’s widespread usage during the pandemic. In addition, the legal landscape has continued to evolve in California, whose attorney general submitted the final version of his proposed regulations to implement the California Consumer Privacy Act (CCPA), and where privacy advocates announced they had collected sufficient signatures to qualify their proposed California Privacy Rights Act (CPRA) for the November 2020 ballot in California. Finally, the growth of privacy and cybersecurity litigation has continued apace, with the Seventh Circuit joining the Ninth Circuit in recognizing standing to sue for violations of Illinois’s Biometric Information Privacy Act (BIPA); with Equifax reaching major settlements to resolve claims related to its 2017 data breach; and with a Virginia federal court ordering Capital One to disclose a forensic report about its data breach last year despite the bank’s assertion of privilege.
If you have questions about any of the legal developments highlighted in this quarterly update, please contact any member of our Cybersecurity and Privacy Practice Group for assistance. This quarterly update was prepared with the assistance of Cecilia Rambarat, a rising 3L student at UNC School of Law.
State Law Developments
- California; CPRA. On May 4, Californians for Consumer Privacy announced that it had collected over 900,000 signatures to qualify its proposed California Privacy Rights Act for the November 2020 ballot in California. The CPRA would amend the California Consumer Privacy Act to strengthen California privacy requirements even further and establish the California Privacy Protection Agency to enforce these requirements. The proposed law would create a new category of “sensitive personal information” and give consumers new rights regarding such data. California consumers would also have the right to request the correction of their personal information held by a business. The CPRA would also enhance children’s privacy rights by requiring opt-in consent to sell or share the personal data of consumers under the age of 16, and by tripling fines for violating this requirement. Finally, the CCPA’s limited private right of action for data breaches would be expanded to include breaches affecting a consumer’s email address in combination with a password or security question and answer.[1]
- California; CCPA. On June 1, the office of the California attorney general submitted to the California Office of Administrative Law (OAL) the final version of his proposed regulations implementing the CCPA. The OAL’s typical 30-day review period has been extended by an additional 60 calendar days due to the COVID-19 pandemic. The attorney general requested an expedited review of the CCPA regulations. However, absent such expedited review, the regulations might not become effective until as late as Oct. 1 – three months after the law became enforceable by the California attorney general on July 1.[2] Please see our CCPA Practice Tip Series for comprehensive guidance to help businesses comply with the CCPA and the attorney general’s proposed regulations.
[1] The text of the proposed California Privacy Rights Act may be found here: https://www.robinsonbradshaw.com/assets/htmldocuments/Proposed_CPRA.pdf
[2] The text of the final proposed CCPA regulations may be found here: https://www.oag.ca.gov/sites/all/files/agweb/pdfs/privacy/oal-sub-final-text-of-regs.pdf
Federal Law Developments
- HHS; Health Privacy. On April 2, the Office of Civil Rights (OCR) of the Department of Health and Human Services published a notice of enforcement discretion to allow certain uses of protected health information by business associates during the COVID-19 public health emergency. Generally, business associates are only permitted to use or disclose protected health information for the purposes set forth in their business associate agreements with covered entities. However, during the COVID-19 emergency, OCR found that public health officials needed immediate access to certain protected health information. Therefore, in order to encourage and allow for rapid sharing of data, OCR has waived penalties against business associates who, in good faith, use or disclose a covered entity’s protected health information for public health or health oversight activities, even if the business associate agreement between the business associate and the covered entity does not provide for such use. The business associate must inform the covered entity of such use or disclosure within 10 days of the use or disclosure.[3]
- FTC; Artificial Intelligence. On April 8, the Federal Trade Commission (FTC) issued new guidance on the use of artificial intelligence (AI) and algorithms. The FTC’s guidance emphasized that the use of AI tools should be transparent, explainable, fair and empirically sound, while fostering accountability. For transparency, the guidance instructs that companies must be careful not to mislead consumers about their interactions with AI, and they should not collect sensitive data secretly in order to feed an algorithm. In addition, the guidance observes that companies that make automated decisions based on information from a third-party vendor may be required by the Fair Credit Reporting Act to provide the consumer with an “adverse action” notice. Furthermore, the guidance emphasizes that companies must explain their decisions to consumers. This includes explaining decisions if a consumer is denied something of value based on an algorithmic decision; disclosing key factors affecting an outcome, if an algorithm is used to assign risk scores to consumers; and, if a company changes the terms of a deal based on automated tools, informing consumers of the reasons for the change. The guidance also notes that companies should ensure their algorithms and AI tools are fair and empirically sound. Under this standard, companies should ensure AI tools do not discriminate against protected classes; should give consumers access and an opportunity to correct information used to make decisions about them; should validate and revalidate AI models to ensure that they work as intended; and should ensure any data about consumers provided to others for use in automated decision-making is correct and up-to-date, regardless of whether a company is designated as a consumer reporting agency. Finally, the guidance encourages companies to hold themselves accountable for compliance, ethics, fairness and nondiscrimination. Companies must take the necessary steps to protect their AI tools from unauthorized use, and must regularly assess their accountability mechanisms.[4]
- Federal Privacy Bill. On May 7, a group of Republican senators led by Sen. Roger Wicker (R-Miss.) introduced the COVID-19 Consumer Data Protection Act of 2020. The Act would require companies under the jurisdiction of the FTC to obtain affirmative express consent from individuals to collect, process or transfer their personal health, device, geolocation or proximity information for the purposes of tracking the spread of COVID-19. The Act would also require companies to disclose to consumers, at the point of collection, how their data will be handled, to whom it will be transferred and how long it will be retained. Furthermore, companies would be required to issue a public transparency report every 30 days describing the data collected. Under the Act, individuals would be given the opportunity to opt out of the collection, use or transfer of their geolocation, proximity or personal health information. Entities covered by the Act would be required to delete or de-identify an individual’s information when it is no longer being used for a specified COVID-19 related purpose. Lastly, such entities would need to establish data minimization requirements for the collection, processing or transfer of an individual’s data, and would also need to maintain reasonable administrative, technical and physical data security policies and practices to protect against security risks.[5]
- Federal Privacy Bill. On May 14, Democrats in both the House and Senate introduced the Public Health Emergency Privacy Act. The Act would put temporary rules in place regarding the collection, use and disclosure of personal data, including physical and behavioral health data, such as geolocation data, proximity data and demographic data collected for the purpose of tracking, screening, monitoring, contact tracing or otherwise responding to COVID-19. In addition, the rules imposed by the proposed law would only apply during the course of the Public Health Emergency as declared by the Secretary of Health and Human Services. Covered organizations would be required to take a number of steps to secure the personal data and to protect the privacy of individuals whose data has been collected, used or disclosed. The Act would also require the FTC to promulgate regulations regarding data that was collected or disclosed prior to its enactment, and would prevent government entities from using data to interfere with an individual’s right to vote. Enforcement authority would be given to both the FTC and state attorneys general, and a private right of action would be created for negligent, reckless, willful and intentional violations.[6]
- Federal Privacy Bill. On June 1, Sen. Cantwell (D-Wash.), Sen. Cassidy (R-La.) and Sen. Klobuchar (D-Minn.) introduced the Exposure Notification Privacy Act. The proposed law would regulate “exposure notification” apps that allow individuals to receive automated alerts if they have been exposed to COVID-19. Commercial entities or nonprofits that operate “automated exposure notification services” would be subject to strict legal requirements and would have to collaborate with public health authorities. In addition to the stringent legal requirements that app providers would have to follow, the proposed law also features strong anti-discrimination provisions that would apply to restaurants, educational institutions, hotels, retailers and other places of public accommodation. These provisions would make it unlawful for certain establishments to use the data from the automated exposure notification services to deny people entry or services or otherwise discriminate against them. The Act’s requirements would be enforced by the FTC and state attorneys general.[7]
- HHS; Health Privacy. On June 12, OCR issued guidance outlining how covered health care providers may use protected health information (PHI) to contact recovered COVID-19 patients regarding the donation of blood and plasma to help treat other patients with COVID-19. By way of background, covered providers are permitted to disclose PHI without patient authorization for, among other reasons, such provider’s health care operations, which include population-based activities relating to improving health and case management and care coordination activities. OCR reasoned that the use of this PHI to facilitate the supply of donated blood and plasma would be “expected to improve the provider’s ability to conduct case management for patient populations that have or may become infected with COVID-19.” However, OCR noted that this use would not be permitted if it were otherwise considered marketing under HIPAA. Thus, for example, a covered provider could not disclose this information to a third-party blood and plasma donation center to allow the center to contact the patients to request donations for its own purposes.[8]
[3] The enforcement discretion notification can be found here: https://www.hhs.gov/sites/default/files/notification-enforcement-discretion-hipaa.pdf
[4] The FTC’s guidance on using artificial intelligence and algorithms may be found here: https://www.ftc.gov/news-events/blogs/business-blog/2020/04/using-artificial-intelligence-algorithms?utm_source=govdelivery
[5] The text of the proposed COVID-19 Consumer Data Protection Act of 2020 may be found here: https://www.commerce.senate.gov/services/files/A377AEEB-464E-4D5E-BFB8-11003149B6E0
[6] The text of the proposed Public Health Emergency Privacy Act may be found here: https://www.robinsonbradshaw.com/assets/htmldocuments/Proposed_PHEPA.pdf.
[7] The text of the proposed Exposure Notification Privacy Act may be found here: https://www.cantwell.senate.gov/imo/media/doc/Exposure%20Notification%20Privacy%20Bill%20Text.pdf
[8] At the time of publication, OCR’s guidance on contacting former COVID-19 patients could be found at: https://www.hhs.gov/sites/default/files/guidance-on-hipaa-and-contacting-former-covid-19-patients-about-blood-and-plasma-donation.pdf.
Foreign Law Developments
- Europe; GDPR. On April 21, the European Data Protection Board (EDPB) published two sets of guidelines on (i) the processing of health data for research purposes related to the COVID-19 pandemic and (ii) geolocation and other contact tracing tools. The guidelines on research data discuss the provisions of the EU’s General Data Protection Regulation (GDPR) which may permit the processing of health data where necessary for research. They also address international transfers of research data. The guidelines on contact tracing seek to clarify principles for the “proportionate” use of location data in this context, while reiterating the EDPB’s view that use of contact tracing apps should be voluntary and should rely on proximity data rather than tracing individual movements. Later, on April 24, the EDPB issued three letter responses to inquiries seeking clarification of specific points in the guidelines.[9]
- Brazil; LGPD. On April 29, the president of Brazil issued a Provisional Measure to delay the effective date of Brazil’s new privacy law, the Lei Geral de Proteção de Dados Pessoais (LGPD), from August 2020 to May 2021 based on the COVID-19 pandemic. Provisional Measures are emergency executive orders carrying the force of law. However, due to the complexity of the Provisional Measures law and the status of other Brazilian legislation, the actual length of this delay to the LGPD’s effective date remains unclear. The LGPD was passed in 2018 and designed to be similar, though not identical, to the EU’s GDPR. For example, the law calls for similar access and deletion rights for data subjects; contract provisions between controllers and processors of personal data; and data impact assessments. In addition, the LGPD will have extraterritorial application to those who collect or process personal data in Brazil.
- Europe; GDPR. On May 4, the EDPB adopted guidelines on consent under the GDPR. The guidelines provide more information regarding the validity of consent obtained through cookie walls, and whether scrolling through a webpage could constitute clear and affirmative consent under the GDPR. In particular, the guidelines clarified that cookie walls, which prevent users who do not accept the use of cookies from accessing a site or mobile app, are unlawful because the consent is not given freely. Furthermore, the guidelines explained that scrolling or swiping through a webpage, or similar user activity, does not constitute affirmative action to meet the conditions for valid consent under the GDPR.[10]
- Europe; GDPR. On June 24, the European Commission issued its two-year report on the GDPR. In general, the Commission deemed the GDPR to have been a significant success thus far, highlighting citizen awareness of privacy rights, the fostering of a “compliance culture” among businesses, and growing enforcement by national Data Protection Authorities. The Commission also praised ongoing “adequacy decisions,” which simplify the flow of data between EU member states and countries, such as Japan, that have received such decisions. However, the Commission identified a major need for greater harmonization because current approaches to the GDPR continue to reflect national fragmentation.[11]
[9] Details about the EDPB guidelines on COVID-19 research and contact tracing can be found here: https://edpb.europa.eu/news/news/2020/european-data-protection-board-twenty-third-plenary-session-edpb-adopts-further-covid_en. Details about the EDPB’s follow-up letter guidance can be found here: https://edpb.europa.eu/news/news/2020/twenty-fourth-plenary-session-edpb-doubles-down-covid-19-guidance-newly-adopted_en.
[10] The EDPB’s guidelines on consent under the GDPR can be found here: https://edpb.europa.eu/sites/edpb/files/files/file1/edpb_guidelines_202005_consent_en.pdf.
[11] Details about the European Commission’s two-year report on the GDPR can be found here: https://ec.europa.eu/commission/presscorner/detail/en/ip_20_1163.
Litigation and Enforcement
- Equifax; Data Breach Litigation. In April and May, Equifax, Inc., agreed to settle multiple cases arising from its 2017 data breach that exposed the sensitive personal information of approximately 147 million people in the United States. These settlements are in addition to the $175 million Equifax has already agreed to pay in response to other federal, state and consumer claims stemming from the breach. On April 10, the city of Chicago reached a $1.5 million settlement to resolve its lawsuit against Equifax regarding the breach. The lawsuit alleged fraud and negligence, noting that the national credit bureau’s inadequate data security measures led to the breach. On April 15, Equifax agreed to pay $19.5 million to resolve claims by the Indiana attorney general that Equifax put profits ahead of data security in the run-up to the massive 2017 data breach. On April 17, Massachusetts became the last state to settle with Equifax over the breach. This settlement requires Equifax to pay $18.2 million and make “significant” changes to business practices to fall into step with the state’s robust data security law. Finally, on May 15, Equifax agreed to pay $5.5 million to a putative class of thousands of banks and credit unions, who alleged damages related to helping individuals whose sensitive personal information was compromised, and to spend at least $25 million on data security.
- BIPA; Seventh Circuit. On May 5, in Bryant v. Compass Group USA, Inc., the U.S. Court of Appeals for the Seventh Circuit held that federal courts can hear claims by plaintiffs alleging their biometric information was collected without informed consent as required by Section 15(b) of the Illinois Biometric Information Privacy Act, even if such plaintiffs suffered no harm beyond the alleged violation of BIPA. The case involved a call center employee who was required to use her fingerprint to order from vending machines at work. The employee sued the vending machine company, alleging that it had not obtained her informed consent prior to collecting her fingerprints. The Seventh Circuit held this alleged violation of BIPA constituted a “concrete injury-in-fact” which gave the employee standing to sue in federal court. However, the court held that the employee’s claims tied to BIPA’s Section 15(a) – requiring that a data retention schedule and other information be made generally available to the public – could not be heard. The court reasoned that the company’s obligation under Section 15(a) ran to the public generally as opposed to the specific individuals whose fingerprints were collected. The Seventh Circuit’s decision to allow the lawsuit to proceed falls in line with a similar ruling by the Ninth Circuit last year. However, these cases stand in contrast to a Second Circuit decision in 2017 which held that a plaintiff lacked standing to bring BIPA claims absent allegations that their biometric information was disseminated or misused.[12]
- New York; Cybersecurity. On May 7, the New York attorney general announced a letter agreement with the popular video-conferencing service provider Zoom Video Communications, Inc., regarding its cybersecurity and privacy practices. The attorney general had announced an investigation into Zoom on March 30 after widespread usage during the COVID-19 pandemic led to reports of cybersecurity and privacy shortcomings by the video-conferencing provider. Under the letter agreement, Zoom shall implement a comprehensive information security program, including risk assessment and security code reviews; shall employ specific additional cybersecurity practices, including encryption and security protocols; shall enhance its privacy practices by offering users information about privacy controls, providing certain user-facing privacy controls for free accounts and removing a LinkedIn Navigator feature; shall protect users from abuse by updating its acceptable use policy and by accepting and investigating reports of violations; and shall implement certain audits and testing protocols, with results provided to the New York attorney general. Many of the terms of the letter agreement with Zoom are consistent with cybersecurity provisions of the New York SHIELD Act that went into effect on March 21, 2020.[13]
- BIPA; N.D. Ill. On June 1, an Illinois federal court determined that video-service giant Vimeo could not compel arbitration of a proposed class action suit claiming that its video and slideshow app Magisto violated BIPA. The proposed class action alleges Vimeo violated BIPA by collecting data about users’ face geometry from photos and videos uploaded to the Magisto app without satisfying statutory requirements. Vimeo argued that its terms of service meant any such lawsuit was subject to individual arbitration. The court disagreed, holding that although the company’s terms of service created a binding agreement to arbitrate with users, the lawsuit’s BIPA claims fell under an exception to that agreement for claims “related to, or arising from … invasion of privacy.” The court concluded that this language rendered the exception broad enough to include claims arising from an alleged violation of BIPA and noted that the Illinois Supreme Court previously held that such violations constitute “an invasion, impairment or denial of [a] statutory right.” Accordingly, the court denied Vimeo’s motion to compel arbitration and allowed the case to proceed.[14]
- FTC; COPPA. On June 4, the FTC reached a settlement with app developer HyperBeard, along with the company’s CEO and managing director, to resolve alleged violations of the Children’s Online Privacy Protection Act Rule (COPPA Rule). According to the complaint, the defendants violated the COPPA Rule in a number of HyperBeard’s apps directed to children by allowing third-party ad networks to collect personal information in the form of persistent identifiers without notifying parents or obtaining verifiable parental consent. The persistent identifiers, such as advertising IDs and cookies, were then used to deliver targeted ads to the users. Under the settlement, HyperBeard received a suspended penalty of $4 million (reduced to $150,000 based on the defendants’ inability to pay), and the company must delete and not benefit from any personal data it collected from children under 13 in violation of the COPPA Rule. FTC Commissioner Phillips dissented on the ground that the $4 million penalty is excessive, reasoning that the amount of actual harm caused by the violation should be more central to the calculation of civil penalties. FTC Chairman Simons issued a statement defending the large penalty, calculated based on HyperBeard’s improper gains, as an effective deterrent to violations of the COPPA Rule because “violation should not be more profitable than compliance.”[15]
- FCC; Robocalls. On June 9, the Federal Communications Commission (FCC) floated a $225 million fine – its largest to date – against two men and their businesses Rising Eagle and JSquared Telecom for facilitating 1 billion illegal robocalls. In particular, they allegedly spoofed the caller IDs of insurance providers such as Cigna, Aetna and Blue Cross Blue Shield – causing those providers’ lines to be flooded by angry callbacks – and then transferred call recipients to lesser-known entities selling short-term, limited-duration insurance plans. The same day, a group of state attorneys general, including the North Carolina attorney general, filed their own charges against these defendants under the Telephone Consumer Protection Act and state telemarketing laws.[16]
- Data Breach Litigation; E.D. Va. On June 25, in the case In re Capital One Customer Data Security Breach Litigation, a federal judge in Virginia ordered Capital One to disclose a report of forensic analysis explaining how a cybercriminal was able to steal 106 million applicants’ sensitive data last year. The data breach occurred after a former software engineer stole 140,000 Social Security numbers and about 80,000 linked bank account numbers from people who applied for Capital One accounts. The bank had argued that the forensic report was protected from disclosure as privileged “work product” because it had been prepared to help Capital One’s legal counsel defend lawsuits expected to be filed after the breach was announced. Rejecting this argument, the court reasoned that Capital One likely would have commissioned the report even if it did not expect legal action.[17]
[12] The Seventh Circuit’s decision may be found at Bryant v. Compass Grp. USA, Inc., 958 F.3d 617 (7th Cir. 2020).
[13] The New York attorney general’s letter agreement with Zoom may be found here: https://ag.ny.gov/sites/default/files/nyag_zoom_letter_agreement_final_counter-signed.pdf.
[14] The Illinois federal court’s decision may be found at Acaley v. Vimeo, Inc., No. 19 C 7164, 2020 WL 2836737 (N.D. Ill. June 1, 2020).
[15] For the FTC’s announcement of the settlement with HyperBeard and links to the complaint and to the statements of FTC Commissioner Phillips and FTC Chairman Simons, see https://www.ftc.gov/news-events/press-releases/2020/06/developer-apps-popular-children-agrees-settle-ftc-allegations-it.
[16] For a copy of the complaint, see https://www.robinsonbradshaw.com/assets/htmldocuments/TCPA_Complaint.pdf.
[17] The court’s decision may be found at In re Capital One Customer Data Security Breach Litigation, No. 1:19-md-02915 (E.D. Va. June 25, 2020).