On January 28, 2021, privacy professionals around the world will celebrate Data Privacy Day. This year, we decided to mark the occasion by gathering our team's predictions for what we expect to be the biggest privacy law stories of 2021 and beyond.
Last year we wrote a similar article, attempting to predict how the privacy landscape would unfold in 2020. We got some things right (e.g., the emergence of CCPA 2.0). But, let's be honest, in March everything changed, including privacy law. As spring turned into summer, our writing focused on the privacy law implications of COVID-19, including contact tracing, no-contact temperature taking, and the unanticipated collection of health information, among other unexpected topics. We also took note of developments overseas, including the Court of Justice of the European Union's Schrems II decision and the emergence of Brazil's federal privacy law, the LGPD.
If there was one takeaway from 2020 from a privacy law perspective, it was this: while it is impossible to predict its path, privacy law is growing and evolving rapidly, almost daily, and in nearly every corner of the world. With that, we turn to our 2021 predictions.
State Privacy Law
Perhaps the best way for us to start our predictions for how privacy law will evolve in 2021 is to focus on what is certain. In November 2020, California voters passed the California Privacy Rights Act (CPRA) which, when fully effective, will significantly change the California Consumer Privacy Act (CCPA). One of those changes is the creation of the California Privacy Protection Agency. As explained in our prior post, by mid-March, lawmakers and the Governor will need to appoint their respective members of the five-member board of the Agency. The Agency will begin its regulatory rulemaking process in July 2021, which it will need to finalize by July 1, 2022. In the meantime, the California Attorney General’s office is still pursuing amendments to the CCPA’s regulations. In short, there can be little doubt that California will still dominate privacy law news in 2021.
Washington Privacy Act 3.0
As we reported earlier this month, lawmakers in Washington once again introduced the Washington Privacy Act (WPA). Readers of this blog will know that we have painstakingly tracked the WPA for the past two years. In both 2019 and 2020, Washington lawmakers were unable to reach a compromise that would see Washington become the second state to enact CCPA/GDPR-like consumer privacy legislation. Will 2021 be any different?
For the WPA to pass, it is possible, if not likely, that further compromises will need to be made between privacy and business advocates. In particular, privacy advocates have long sought a private right of action and argued that anything short of one would leave the bill's provisions unenforceable. During a January 14 public hearing, privacy advocates again raised these concerns. A competing bill, which will include a private right of action, also is expected to be introduced in the Washington House of Representatives.
If the WPA does pass, it would (as it currently stands) take effect on July 31, 2022.
Businesses also will want to know how the WPA compares with the CCPA and CPRA. Undoubtedly, some provisions of the WPA pull directly from the CCPA/CPRA. However, the WPA also is heavily influenced by the GDPR, as reflected in its use of GDPR terminology such as personal data, controllers, and processors. Indeed, readers familiar with those privacy regimes will readily recognize numerous provisions in the WPA.
As in the past two years, 2021 began with a number of other state legislatures proposing CCPA/GDPR-like privacy legislation. As of January 27, 2021, bills have been introduced in Connecticut, Oklahoma, Minnesota, Mississippi, New York, and Virginia (not including Washington), and more bills are expected to follow. There appears to be some momentum in New York, where numerous bills already have been filed and Governor Cuomo announced on January 15 that he will be advancing a privacy bill this year.
To track all of these developments, on February 17, members of Husch Blackwell’s Data Privacy & Cybersecurity team will host a webinar to discuss all of the CCPA-like privacy bills proposed across the country. To register, click here.
Illinois Biometric Information Privacy Act (BIPA)
2020 was an active year for litigation under the Illinois Biometric Information Privacy Act (BIPA), which requires disclosures, consents and use restrictions when a person’s “biometric identifiers” (such as finger and hand prints, retina scans, and facial geometry) or “biometric information” (information derived from biometric identifiers) are captured, stored or used. In 2020, several hundred new class actions were filed under BIPA, courts issued mostly plaintiff-favorable rulings, and Facebook agreed to a record-setting $650 million class settlement.
In 2021, the wave of case filings is expected to continue, and several key BIPA issues are now on appellate court dockets. Whether BIPA has a five-year statute of limitations (as plaintiffs argue) or a one- or two-year statute of limitations (as defendants assert) is set to be decided in two different cases before the Illinois Appellate Court. The Illinois Supreme Court will decide early in 2021 whether to review a lower appellate court ruling finding that BIPA claims brought by employees against their employers are not barred by the Illinois Workers' Compensation Act. Because more than 90% of the BIPA cases on file are brought in the employment context (mostly involving the use of finger- and hand-scanning time clocks), a reversal by the Illinois Supreme Court would substantially stem the tide of BIPA litigation. In addition, courts in some of the earlier-filed BIPA cases that have progressed beyond the pleading stage will be called on to consider some of the few remaining factual defenses involving the technology of the time clock devices and software programs alleged to violate BIPA, as well as the scope of the government independent contractor exemption in cases where some, but not all, of the defendant's business was performed for a governmental entity. Finally, in early 2021, the court in the Facebook BIPA case is expected to grant final approval of the $650 million settlement, under which millions of Illinois Facebook users will receive payments estimated at around $350 each.
In addition to court rulings on key BIPA issues, 2021 may also bring action from the Illinois Legislature. Over the past several years, multiple bills have been introduced that would amend BIPA to refine definitions, limit damages, and set a statute of limitations. Most have failed on the House or Senate floor. Most recently, in February 2020, SB 3593 was introduced in the Illinois Senate. The bill proposed a one-year statute of limitations; a limitation of damages for negligent violations to actual damages (rather than the greater of liquidated damages of $1,000 or actual damages); and, for willful violations, a limitation to actual damages plus liquidated damages up to the amount of actual damages (rather than the greater of liquidated damages of $5,000 or actual damages). The 2020 legislative session ended, however, without a vote on SB 3593. As with the court rulings expected in 2021, corporations should watch for changes and clarifications to the law from the Legislature as well.
Proposed State Biometric Information Privacy Acts
Legislators in Maryland and New York have introduced biometric privacy legislation during their 2021 legislative sessions. The text of both bills is virtually identical to the Illinois statute, which has now been in effect for over a decade. This is the fourth time that both legislatures have considered biometric privacy legislation.
If passed, the bills would limit the collection and disclosure of biometric information and biometric identifiers (data) by private sector entities. Both bills apply to consumer and employee data and require private sector entities possessing this data to develop publicly available, written policies establishing their retention schedules and guidelines for destroying the data. Before collecting this data, companies would be required to provide notice to persons whose data will be collected and obtain a written release from those persons for the collection and retention of the data. After collection, companies would be required to protect the data from disclosure pursuant to the reasonable standards of care applicable to their industry sectors.
Similar to the Illinois law, the Maryland and New York bills, if enacted, would create private rights of action, permitting individuals to recover the greater of (a) actual damages or (b) liquidated damages of $1,000 for negligent violations and $5,000 for intentional or reckless violations. The only substantive differences between the Maryland and New York bills are their effective dates. Maryland's bill would go into effect on January 1, 2022 if signed into law. The timeline for businesses to come into compliance with New York's bill would be much shorter: only 90 days if the bill is enacted in its present form.
State Tax on Personal Data Sales
On January 11, Oregon State Representative Pam Marsh introduced House Bill 2392, which proposes a tax on businesses engaged in selling personal information. The tax would apply at a rate of 5 percent to gross receipts generated from the sale of taxable personal information in the state. The bill broadly defines taxable personal information as any personal information that a business accumulates from the Internet.
In recent years, a handful of state lawmakers have similarly taken aim at a data sales tax, citing the high value of personal data to businesses as new technology grows. However, many of the proposals have been met with stiff opposition from tech companies. For example, in 2017, Washington State Representative Norma Smith introduced a bill imposing a 3.3 percent tax on revenues from the sale of personal information. In 2018, the bill hit a wall in the House Committee on Appropriations.
Additionally, New York state legislators introduced legislation to the State Senate in 2019 and then an identical bill to the State Assembly in 2020. The bills proposed to amend tax law with a new tax on the gross income that businesses derive from sharing personal data. The bills did not see any movement in the State Senate or Assembly beyond introduction and referral to committee. In 2019, California Governor Gavin Newsom assigned a team to work with data scientists and legislators to create a “data dividend” that businesses would pay to the state or reimburse to consumers. There have yet to be any updates from the team.
Although the frequency of these proposals has increased, it is not yet certain whether a proposal like the one in Oregon will follow a similarly unsuccessful trajectory.
Vermont’s Face and Voice Recognition Legislation
In 2020, a growing number of cities banned facial recognition in response to law enforcement use and surveillance practices. Outside of these bans, facial recognition use is often governed by biometric laws. However, on January 14, Vermont State Representative Matthew Birong introduced a bill promoting general consumer data protection that also specifically addresses the use of face and voice recognition technology. With respect to a consumer, the bill would prohibit a person from (1) scanning the face of a nonuser in a photograph; (2) using facial or voice recognition technology unless the consumer opts in to the use; (3) using facial or voice recognition technology for a purpose other than product development; and (4) using for marketing purposes any feature that will store conversations. It would also require a person to (5) delete quality-enhancement data after 21 days and (6) disclose the use of facial recognition technology on a physical sign at the front of a business location. The Vermont bill stands alone, and may continue to do so for the foreseeable future, considering that states seem to be focused on proposing broad biometric laws that also encompass face and voice recognition technology.
Federal Privacy and Cybersecurity Law
Federal Privacy Legislation
With the Democrats taking control of the White House and Congress, privacy advocates have begun to predict that 2021 could be the year the United States finally enacts GDPR-like federal privacy legislation. Single-party control certainly increases the chances of federal privacy legislation; no one will dispute that point. But it remains to be seen whether Democrats will use the next (at least two) years to push through such legislation or will focus on other bills.
Colleges and universities should expect new guidance from the U.S. Department of Education (ED) on data privacy and information security in 2021. On December 18, 2020, Federal Student Aid announced that it is finalizing the Campus Cybersecurity Program framework. The Program framework will be a multi-year phased implementation to ensure that institutions and their third-party servicers comply with National Institute of Standards and Technology (NIST) Special Publication 800–171 Rev. 2. The Program framework will start with a self-assessment to help ED understand the education community’s readiness to comply with NIST 800–171 Rev 2. ED has stated that its “intention is to partner and collaborate with IHEs [institutes of higher education], and other organizations, to enhance the resilience and maturity across IHEs by establishing a cybersecurity baseline, sharing information, and overseeing compliance with NIST 800–171 Rev. 2 and other cybersecurity requirements.” ED expects to release additional information about the Program framework later in 2021. You can access the FSA Announcement here.
The IoT Cybersecurity Improvement Act of 2020 (IoT Act), enacted near the end of 2020, directs NIST to publish standards and guidelines for federal agencies on the appropriate use of IoT devices connected to government information systems. The deadline for NIST to publish these standards and guidelines is March 5, 2021, less than 45 days away.
Three months later, on June 3, the IoT Act requires NIST, the Department of Homeland Security, and the Office of Management and Budget to publish guidelines on the receipt of IoT security vulnerabilities that are related to government information systems, and the disclosure and resolution of those vulnerabilities. These disclosure guidelines will apply not only to agencies but also to contractors and subcontractors providing those information systems or IoT devices. Presumably, contractors and subcontractors will have to establish programs and procedures to receive security vulnerability information about their IoT devices and to disseminate solutions for those vulnerabilities.
The final component of the IoT Act goes into effect at the end of next year, on December 5, 2022. After that date, the IoT Act will prohibit every federal agency from entering or renewing procurement contracts that involve the use of IoT devices if the agency's chief information officer has determined that the use of those IoT devices will prevent the agency from complying with the NIST standards and guidelines.
The IoT Act will have the greatest impact on contractors delivering IoT products and services to the federal government, but these security requirements for all federal procurements of IoT devices will be a fundamental shift in United States law, which historically has applied an industry-sector model to cybersecurity and data privacy requirements. Because the federal government is the planet’s largest customer of goods and services, the NIST standards could become the baseline for private sector contractual obligations and/or industry standards for IoT devices if economies of scale encourage IoT manufacturers to apply the NIST standards to all IoT devices, not just ones intended for the federal government.
Financial Institution Rulemaking
The Dodd-Frank Act, passed in 2010, created the Consumer Financial Protection Bureau (CFPB) and gave it authority to promulgate rules implementing Section 1033 of the Act. Under Section 1033, upon request, a financial services provider must make available to a consumer information that it controls or possesses regarding the consumer, including, for example, transaction, charge, and usage data. The Dodd-Frank Act defines consumer very broadly to include a consumer's agent or representative acting on the consumer's behalf, which would seem to include, for example, entities such as data aggregators. In short, pursuant to Section 1033, with a consumer's authorization, a financial services provider could be required to provide all of the consumer's information in its possession to a third party (e.g., a data aggregator).
The CFPB took the first official step toward promulgating rules under this Section last fall by issuing an Advance Notice of Proposed Rulemaking (ANPR). Responses to the lengthy ANPR are due February 4. Unless the new CFPB director scraps the plans, we expect the CFPB to begin work on proposed rules in 2021. How quickly that will happen remains to be seen; we suspect it will take some time for the CFPB to digest the responses to the ANPR and grapple with the significant tension the law presents between consumer-authorized access to data, on the one hand, and privacy and data security, on the other.
Healthcare Privacy and Security
The pandemic of 2020 tested the mettle of our nation's healthcare system in many unexpected and profound ways. As healthcare delivery was rapidly restructured to accommodate COVID-19 diagnosis and treatment and socially distanced care, bad actors simultaneously began to exploit the increased number of vulnerabilities in health information systems created by telehealth platforms, patient portals, and the inattention of stressed, overworked staff. The result was an unprecedented number of cyberattacks, culminating in an October 28, 2020 alert from the Cybersecurity and Infrastructure Security Agency (CISA) addressing the plague of ransomware activity targeting the healthcare and public health sector.
As the pandemic is only slowly abating, we can expect more of the same types of incidents in 2021, and healthcare providers will need to remain hyper-vigilant in their efforts to monitor and patch systems to stay ahead of cybercriminals. Once that work is done, attention can turn to complying with the more "mundane" changes to privacy and security requirements that took effect in 2020, including changes to 42 CFR Part 2 (with newly revised confidentiality protections against unauthorized disclosure and use of substance-use-disorder patient records, designed to facilitate better coordination of care by federally assisted programs) and implementation of the Information Blocking provisions (which proscribe unreasonable practices likely to interfere with, prevent, or discourage access, exchange, or use of electronic health information).
Finally, healthcare providers and other institutions and employers that became involved with COVID-19 testing and vaccination programs in 2020 should expect state and federal agencies to turn their gaze to information practices that were not closely monitored or enforced during the height of the pandemic. The latitude in protecting the privacy and security of information once accorded to those on the front lines combatting COVID-19 likely will no longer be given.