Having too much data causes problems beyond needless storage costs, workplace inefficiencies, and uncontrolled litigation expenses.  Keeping data without a legal or business reason also exacerbates data security exposures.  To put it bluntly, businesses that tolerate troves of unnecessary data are playing cybersecurity roulette … with even larger caliber ammunition.

Surprisingly few U.S. data security laws and standards expressly require that protected data be compliantly disposed of once legal and business-driven retention periods expire.  PCI DSS v3.2.1, Requirement 3.1, provides “[k]eep cardholder data storage to a minimum by implementing data retention and disposal policies ….”  HIPAA regulations mandate that business associate agreements require service providers, upon contract termination, to return or destroy all PHI received or created on the covered entity’s behalf, if feasible.  Alabama and Colorado require that records containing state-level PII be disposed of when such records are no longer needed.  And biometric data privacy laws in Illinois, Texas, and Washington generally require that biometric data be disposed of once it has served its authorized purpose.

Instead, most such laws and standards focus on securely sanitizing or destroying storage media.  For example, the NIST Cybersecurity Framework v. 1.1 includes as a security control (PR.IP-6) that “[d]ata is destroyed according to policy,” and ISO 27002 (§ 8.3.2) provides that “[m]edia should be disposed of securely when no longer required, using formal procedures.”

But data security is not achieved by simply running through a checklist of explicit compliance requirements – it requires assessing risks and establishing effective security controls.  And one of the most powerful security controls is not keeping too much data for too long.
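The data-minimization control described above can be partially automated.  The sketch below is a minimal illustration under assumed conditions, not a compliance tool: it flags files whose last modification predates a hypothetical seven-year retention window.  The `RETENTION_DAYS` value and the flat file-based layout are assumptions; a real program would draw its retention periods, by record type, from the organization's own retention schedule.

```python
import time
from pathlib import Path

# Hypothetical retention window; a real policy would set this per record type.
RETENTION_DAYS = 7 * 365


def files_past_retention(root: str, retention_days: int = RETENTION_DAYS) -> list[Path]:
    """Return files under `root` whose last modification predates the retention window."""
    cutoff = time.time() - retention_days * 24 * 60 * 60
    return sorted(
        p for p in Path(root).rglob("*")
        if p.is_file() and p.stat().st_mtime < cutoff
    )
```

Flagged files would then be reviewed against legal holds and remaining business needs before disposal; deleting automatically, without that review, would itself create legal risk.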
Continue Reading

Most people have elevated stress during the holiday season — work, travel, family, money, time.  And holiday stress can make people inattentive, tired, frustrated, and willing to take shortcuts, especially when it comes to computer and Internet use.  This is when mistakes happen.  It’s when we decide to evade policy by emailing work home or by using unsecured airport Wi-Fi because our plane is delayed.  It’s also when malicious acts of information theft, sabotage, and fraud can more easily occur and go undetected.

According to a recent survey, insider threats — as opposed to outside actors — can account for nearly 75% of cyber incidents.  These incidents arise from the actions of employees, suppliers, customers, and former employees.  Law firms are not exempt, particularly small to midsize firms.  In fact, smaller firms typically have fewer resources to devote to cybersecurity and rely more heavily on outside suppliers.

End-of-year activities also make law firms especially vulnerable to insider threats, whether inadvertent or malicious: the push to bill and collect more hours, time-sensitive legal matters that must be resolved before the end of the calendar year, year-end tax accounting, case and client reviews, and bonus calculations.  Lawyers and their staff feel the strain of extra hours, looming deadlines, and sometimes contentious clients at the same time we all feel holiday pressures at home.

What is at risk?
Continue Reading

Testing for technical vulnerabilities is a key part of security risk assessment.  To get the straight scoop on technical vulnerabilities, and how they’re exploited, why not ask a hacker?

Dave Chronister is an ethical hacker, a Certified Information Systems Security Professional, and the co-founder and managing partner of Parameter Security.  To borrow from the Farmers Insurance commercials, Dave knows a thing or two because he’s seen a thing or two.  He started early – Dave wrote his first computer program before age 8, and as a teenager he ran a large networked bulletin board system, through which he first experienced war dialing and the underground world of hacking.

Dave and his Parameter Security team perform technical security assessments (ethical hacking penetration services, code & device reviews, and social engineering exercises), post-incident forensic investigation, and training.  Dave regularly appears as a cybersecurity expert on CNBC, CNN, Fox Business, and MSNBC, and he writes and speaks internationally on hacking and system security.

I recently asked Dave for his thoughts on the current hacking landscape, and especially on why technical vulnerability testing is crucial to an overall security risk assessment. Here’s what he shared:
Continue Reading

Would you take a deposition by solely following a template of standard questions, without assessing the unique issues and circumstances of the case?  Or conduct transaction due diligence by simply marching through a generic punch list, without assessing the unique aspects of the company, the deal, and the industry?  Of course not.  Your law firm’s data security posture is no different – you need a security risk assessment to understand your firm’s unique vulnerabilities to security threats, and to identify which security controls are already adequate for your firm and which other safeguards are needed.

But assessing security risks is more than merely a good idea.  Conducting a security risk assessment is also a compliance requirement under virtually every U.S. regulatory data security regime and security standard.  Some of these risk assessment requirements apply directly to lawyers and firms, such as rules of professional conduct and, for firms that are business associates of HIPAA covered entities, the HIPAA Security Standards.  Other such laws directly govern the firm’s clients, who in turn increasingly require risk assessments of their law firms as service providers.  And taken together, these statutes, regulations, and standards requiring security risk assessments have coalesced into general expectations for what constitutes reasonable data security.


Continue Reading

I had a nagging worry that something was wrong with my car, so I finally decided to take it to the dealer.  I couldn’t exactly describe my concern, except there was an intermittent, “funny noise” coming from somewhere in the front end.  An unscrupulous dealer would have taken me down a long path of parts replacement, beginning with tires, then wheels, then tie rods, and on and on, perhaps never fixing the real problem.  Fortunately, my dealer was honest and performed diagnostics, ultimately discovering that the rack and pinion was failing.  The part was under warranty, so the repair cost me nothing and my funny noise is gone.

Was my worry constructive?  Yes.  It also went hand-in-hand with my own risk assessment.  What were the chances that the noise foretold a failure that would cause an accident?  Would I or others be hurt in the accident?  As it turned out, a failure could have been catastrophic.  In this scenario, I could prudently act on my worry because I had a basic understanding and control of the situation.  But it’s not always easy to act on worries—particularly if you don’t understand the issues or potential risks.

It’s reasonable these days for everyone, particularly lawyers, to have a nagging worry about information security.  That’s where independent risk assessment comes in.  Most lawyers know just enough about accounting and finance to manage their firms profitably, calling in experts when needed.  The same should be true for information security.  An independent security risk assessment not only identifies risks, it also educates the firm about likely threats and vulnerabilities.
Continue Reading

It’s a common complaint – most U.S. laws requiring data security never cough up the specifics of what must be done to comply. Unlike other areas of business regulation, data security requirements seem hopelessly vague:

  • Several states’ PII laws require businesses to implement and maintain “reasonable security procedures and practices” to protect PII from unauthorized access, destruction, use, modification, or disclosure.
  • Regulations under the Gramm-Leach-Bliley Act compel financial institutions to have a “reasonably designed” comprehensive information security program with administrative, technical, and physical safeguards to protect the security, confidentiality, and integrity of customer information.
  • FACTA regulations require that consumer report information be disposed of “by taking reasonable measures to protect against unauthorized access to or use of the information….”
  • HIPAA covered entities and business associates must address the security standards for ePHI in a way that protects against “reasonably anticipated threats or hazards” to ePHI security or integrity.
  • The FTC enforces reasonable data security under Section 5 of the FTC Act, which prohibits unfair or deceptive acts or practices in commerce, without explicitly mentioning data security and without any supporting regulatory standards for specific data safeguards.

Obviously, we can’t just put “remember to have reasonable data security” in a compliance checklist or internal audit protocol, because “reasonable” tells us nothing concrete about what specific security controls are needed to be compliant.  So, why do these laws stop short of telling us specifically what to do?


Continue Reading