In this series we’ve looked at recent developments in United States data privacy and security laws, primarily at the state level, that are transforming retention schedules and data disposal from merely prudent practices into compliance requirements.


Today’s companion post explores how the California Consumer Privacy Act (CCPA), without statutory provisions explicitly requiring data minimization or storage limitation, nevertheless incentivizes covered businesses to carefully manage retention and disposal of personal information (PI).  But less than two years from now, the script gets flipped, with California mandating both data minimization and storage limitation for businesses covered by the California Privacy Rights Act (CPRA).

The CPRA became law through a November 2020 ballot initiative.  Generally effective on January 1, 2023, the CPRA makes sweeping changes to the CCPA, including new provisions that directly require data retention management and data disposal.  Under the CPRA, covered businesses:

  • Must inform consumers how long the business intends to retain each category of PI the business collects, or if that is not possible, the criteria used to determine the retention period.
  • Must not retain PI for longer than is reasonably necessary and proportionate for the disclosed purpose(s) of collection or processing.

Cal. Civ. Code § 1798.100(a)(3) & (c) (effective January 1, 2023).  Thus, for the first time under any U.S. federal or state comprehensive data privacy law, the CPRA will explicitly and directly require covered businesses (1) to manage the CPRA’s broad range of PI under data retention schedule rules disclosed through notice to consumers, and (2) to dispose of PI once it is no longer required for legal compliance or reasonably necessary for the disclosed purposes for its collection and use.
Continue Reading Less data is more than ever: the CPRA and beyond


The California Consumer Privacy Act, effective January 1, 2020, was the United States’ first state-level comprehensive data privacy law.  And the CCPA blogging blitzkrieg has not been merely hype – the CCPA presages a fundamental shift in U.S. privacy law.

The statute was a bit convoluted in its original form, almost as if the California legislature had hurriedly cobbled it together in a week’s time to avoid different provisions becoming law through a ballot initiative spearheaded by private activists, and which would have been essentially immune to subsequent direct amendment by the legislature (oops, that’s actually what happened).  Today’s CCPA is also the product of a flurry of legislative clean-up amendments, supplemented by now-final California regulations (not that anything is ever quite final in California), and with a few targeted statutory amendments effective now due to last November’s adoption of the CPRA by ballot referendum.

Much thoughtful guidance is available elsewhere on the CCPA’s scope, applicability, and the various consumer rights it creates, including notice/transparency, access, deletion, and sale opt-out.  Our narrow focus here is on whether and how the CCPA affects the need of covered businesses (1) to manage PI with retention scheduling and (2) to dispose of PI once no longer necessary.


Continue Reading Less data is more than ever: the CCPA


Last week’s post was a whirlwind history tour of U.S. data privacy law, homing in on the privacy principles of data minimization and storage limitation.  The punchline was that unlike most foreign data privacy regimes, and with but few exceptions, U.S. data privacy laws have focused primarily on notice and consent and have avoided requiring businesses (1) to manage data under a retention schedule and (2) to dispose of personal data once no longer necessary for legal compliance or business need.

This began to change in state laws focused on a small niche of privacy – biometric data privacy.  Data security for biometric data is becoming a staple of state-level breach notification statutes (to date, in 17 states and the District of Columbia) and of some states’ laws that affirmatively require reasonable data security programs for protected personal information.  But state-level data privacy laws for biometric data have been outliers.

Illinois’ Biometric Information Privacy Act (BIPA) became effective in 2008.  BIPA has been blogged about endlessly, largely because, after a bit of a sleepy start, its provisions allowing private-party class actions for statutory damages (thereby bypassing the standing impediment vexing many privacy and data security claimants) thrust BIPA to center stage in headline-grabbing litigation.

Our focus here is on a particular provision in BIPA:
Continue Reading Less data is more than ever: state biometric data privacy laws


Forgive me, but to fully appreciate the impact of state data privacy laws on managing records retention and disposing of unnecessary data, a bit of history is needed (if you’re allergic to history, skip this post).  Our focus is through the narrow lens of two key elements of data privacy regimes: data minimization (only collecting the minimum of personal data needed for the collection purposes) and storage limitation (only keeping personal data for as long as needed for these purposes).

United States data privacy law is a global outlier.  That’s ironic, given that the building blocks of modern data privacy law, the Fair Information Practice Principles (FIPPs), were first expressed in a 1973 report by the U.S. Department of Health, Education, and Welfare, Records, Computers, and the Rights of Citizens.  As originally framed, the FIPPs (Transparency, Access, Choice, Correction, and Quality/Protection) did not speak directly to data minimization or storage limitation; they did not expressly call for minimizing the collection of personal data or for deleting personal data once its collection purpose was satisfied.

If data privacy were a religion, and the FIPPs its original Word, what came next was inevitable – inspiration spread globally and resulted in various denominations, each restating and taking the core beliefs in different directions, as influenced by cultural factors and, with data privacy law, governing philosophies:
Continue Reading Less data is more than ever: for context, a ridiculously brief history of U.S. data privacy law

Businesses in the United States have a new imperative to carefully manage records retention and promptly dispose of unnecessary information (and no, it’s not due to GDPR or other global privacy law developments).  Recent changes in U.S. data security and privacy laws, and the trends they portend, are elevating the disposal of unnecessary data from a risk management strategy to a compliance requirement.

Managing data volumes has always been prudent.  Using retention schedules to curb relentless data growth remains an established, sensible way to keep business operations efficient, manage storage expense, mitigate ediscovery costs, and limit data security and privacy exposures.  Perhaps the most trenchant explanation was offered by former U.S. District Court Magistrate Judge John Facciola:  “If your clients don’t have a records management system, they may as well take their money out into the parking lot and set it on fire.”

But as a matter of pure legal compliance, U.S. federal and state laws have historically followed a “mandatory minimum” retention approach, requiring that businesses keep specified records for at least a mandated retention period, but not compelling disposal.  With precious few exceptions, U.S. businesses have not been legally required to (1) manage data with retention schedules and (2) dispose of unnecessary data.  And U.S. privacy and data security laws have generally been silent on retention periods for protected information.  For example, HIPAA and its Privacy and Security Standards impose no retention period on covered entities for protected health information (PHI); the Gramm-Leach-Bliley Act (GLBA) and its federal functional regulators’ privacy regulations and Interagency Security Guidelines do not explicitly require financial institutions to dispose of unnecessary nonpublic customer information (NPI); and the FACTA Disposal Rule only speaks to how, not when, to compliantly dispose of consumer report information.

Well … that was then, and this is a new now, driven by recent changes in U.S. data security and privacy laws.  I’ll dig deeper into these developments in upcoming posts, but here are the high points:
Continue Reading For U.S. businesses, less data is more than ever

Courtesy of Wikipedia, To Serve Man (The Twilight Zone)

To truly appreciate just how we are served by the digital economy, we must revisit Damon Knight’s award-winning 1950 short story To Serve Man.  Popularized by a beloved 1962 TV episode of The Twilight Zone, Knight’s tale tells of aliens coming to Earth to bring humans “peace and plenty.”  Courtesy of the aliens’ advanced technologies, we soon enjoy the global benefits of unlimited electrical power, inexhaustible food, and the end of warfare.  And better yet, humans are invited to visit the aliens’ home planet, a galactic paradise.

Meanwhile, a skeptical person toils to decipher the aliens’ cryptic language, in order to read a purloined alien book and come to understand their motives for such astounding beneficence toward humankind.  The book’s translated title is reassuring – “To Serve Man.”  Only later is our intrepid translator able to decipher the book’s first paragraph, revealing that it is not a treatise on helping humanity.  It’s a cookbook.

The digital revolution has indeed brought us benefits on a global scale, unimaginable just a few decades ago.  The Internet informs us, social media connect us, and our apps and devices support us.  All problems solved, right?

But something is wrong in our advanced-technology-paradise.  The digital economy traffics in something of great value – our information – and we remain largely oblivious to the basis of our “bargain.”  The signs are right there, in front of us, like a book waiting to be read.  For example, consider this from The Atlantic:
Continue Reading How the digital economy serves us

In today’s landmark ruling, the Illinois Supreme Court held that private lawsuits seeking statutory damages and injunctions for violation of the Illinois Biometric Information Privacy Act (BIPA) may be pursued by “aggrieved” persons without alleging any actual injury or adverse effect.

BIPA, enacted in Illinois back in 2008, was the seminal state statutory privacy

I keep getting asked about Cambridge Analytica and Facebook.  And no one seems to like my response – I’m frankly amazed that this all took so long to blow up.  How long?  How about since 1973.  That’s when the U.S. Department of Health, Education, and Welfare first articulated the Fair Information Practice Principles (FIPPs or FIPs) in its report Records, Computers, and the Rights of Citizens: Report of the Secretary’s Advisory Committee on Automated Personal Data Systems.  The FIPPs went on to become bedrock global privacy principles, and central to them are the principles of notice and consent.

As the FTC later explained in Privacy Online: A Report to Congress:

1. NOTICE/AWARENESS
The most fundamental principle is notice. Consumers should be given notice of an entity’s
information practices before any personal information is collected from them….

2. CHOICE/CONSENT
The second widely-accepted core principle of fair information practice is consumer choice
or consent. At its simplest, choice means giving consumers options as to how any personal
information collected from them may be used….

These mechanisms – notice and consent – are what make a self-governing privacy system work.  If someone (such as Facebook) is going to obtain and use our personal data, they should first give us notice of how they will use it (such as provide or sell it to others), and then we make a choice – we either consent and provide our data, or we don’t.  The government may enforce these representations and choices under fair trade practices laws, such as FTC Act Section 5, but the rules themselves are made in the marketplace.

There has to be some source of governance.  The alternative to self-governance through notice and consent is governance by government, with legislators and regulators making the rules for how our data is handled.  There’s quite a bit of that in the EU and elsewhere, but in the United States, outside of specific sectors such as healthcare (HIPAA), education (FERPA), and financial services (GLBA & FCRA), there’s little such regulation.  In the U.S. we’ve made a policy decision to largely self-govern the privacy of personal data.

Fast forward from 1973 and, especially in our Internet-driven, U.S. self-regulatory environment, we’ve got a large, smoking crater – precious little government regulation, and even less personal responsibility.  Let’s face it.  We don’t actually pay attention to privacy policies and terms of use, and we don’t actually make informed choices on our consent to data practices for our personal information.  Under our self-governing privacy system, look in the mirror.  The enemy is ourselves.


Continue Reading (But wait, I didn’t) notice and consent

WiFi provider Purple recently added a “Community Service Clause” to its usual terms and conditions for wireless service:

The user may be required, at Purple’s discretion, to carry out 1,000 hours of community service. This may include the following:

  • Cleansing local parks of animal waste
  • Providing hugs to stray cats and dogs
  • Manually relieving sewer blockages
  • Cleaning portable lavatories at local festivals and events
  • Painting snail shells to brighten up their existence
  • Scraping chewing gum off the streets

More than 22,000 people accepted these terms during Purple’s two-week-long T&C gambit, with only one attentive person claiming the prize Purple offered to anyone who noticed this silliness.  Purple conducted this experiment “to highlight the lack of consumer awareness when signing up to use free WiFi.”  Winners include snails, local parks, sewer lines, and stray dogs and cats, now the potential beneficiaries of up to 22 million community service hours.  The clear loser?  Those. Who. Don’t. Read. Notices.
Continue Reading Reading privacy policies to avoid surrendering your firstborn child