Anytime we conduct a training session, we turn blue in the face repeating the importance of conducting an accurate and thorough risk analysis of electronic PHI (ePHI). In the event of a breach or an audit, one of the first items the Office for Civil Rights (OCR) will ask for is the risk analysis. OCR has clearly lost patience with entities that choose not to, or simply fail to, perform an adequate risk analysis. Earlier this month, Advocate Health Care Network (Advocate Health) agreed to pay a massive $5.55 million to settle multiple violations of the Health Insurance Portability and Accountability Act of 1996 (HIPAA). The settlement is the largest to date against a single entity.
Houston (Astros), We Have a Problem
Those in the privacy and data security (or baseball) world should be familiar with the St. Louis Cardinals and Houston Astros hacking incident. Former Cardinals scouting director Chris Correa was recently sentenced to 46 months and ordered to pay restitution after pleading guilty to five counts of unauthorized access of a protected (Astros) computer, bringing an end to the federal criminal investigation.

To recap the highlights: Correa accessed the Astros' proprietary player-information database, Ground Control, which contained the Astros' "collective baseball knowledge" drawn from player statistics; the impressions and opinions of the team's scouts, coaches, statisticians and doctors; and other sources. Correa also accessed the email accounts of several members of the Astros' front office, including "Victim A" (likely former Cardinals executive and current Astros general manager Jeff Luhnow), "Victim B" (likely former Cardinals and current Astros sabermetrician Sig Mejdal), and at least one other person. According to the Astros, Correa accessed Ground Control at least 60 times on 35 different days over a 15-month period; one can only speculate as to the breadth and depth of his access to the Astros' email system. The intrusions initially appeared to have emanated from a device housed in a condominium in Jupiter, Florida (the Cardinals' spring training home), but given the lengthy period involved, they likely came from other devices in other locations as well. Correa gained access to the Astros' systems by using Luhnow's Cardinals passwords, which were "similar" to his Astros passwords. He both reviewed and downloaded Ground Control information.
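The password detail carries the practical lesson: credentials that are merely "similar" to old ones are easy to guess. As a minimal, hypothetical sketch (the passwords and threshold below are illustrative, not drawn from the case), an organization could screen new passwords for similarity to known or previously compromised ones:

```python
# A minimal, hypothetical sketch of the defensive lesson here: reject new
# passwords that closely resemble ones already known (for example, credentials
# used at a prior employer or exposed in an earlier breach). Standard library only.
from difflib import SequenceMatcher

def too_similar(candidate, known_passwords, threshold=0.8):
    """Return True if the candidate closely resembles any previously used password."""
    return any(
        SequenceMatcher(None, candidate.lower(), old.lower()).ratio() >= threshold
        for old in known_passwords
    )

previous = ["Cardinals2011!"]                    # hypothetical old credential
print(too_similar("Cardinals2012!", previous))   # True: a trivial variant
print(too_similar("x9#Tq!mRw4@z", previous))     # False: unrelated password
```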
What Brexit means for privacy and data protection
Now that the shock has worn off and our 401(k)s have (somewhat) stabilized, we can begin to assess the implications that the UK’s historic vote to leave the EU may have for global privacy and data protection rules. While much uncertainty exists, companies should not panic, as there will not be any immediate changes.
The Precision Medicine Initiative: White House privacy and security guidelines released
Precision medicine is an innovative approach to medical treatment that takes into account individual differences in people’s genes, environments, and lifestyles. The promise of precision medicine is delivering the right treatments, at the right time, to the right person. It gives medical professionals the resources they need to target treatments to the specific illnesses their patients face. Although the term “precision medicine” is relatively new, the concept has been a part of healthcare for many years. For example, a person who needs a blood transfusion is not given blood from a randomly selected donor; instead, the donor’s blood type is matched to the recipient’s to reduce the risk of complications.
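To make the matching idea concrete, here is a minimal illustrative sketch (a toy lookup, not clinical guidance) of the logic behind the transfusion example: treatment is selected to fit the individual rather than drawn at random.

```python
# Illustrative red-cell compatibility lookup: recipient blood type -> acceptable donor types.
COMPATIBLE_DONORS = {
    "O-":  {"O-"},
    "O+":  {"O-", "O+"},
    "A-":  {"O-", "A-"},
    "A+":  {"O-", "O+", "A-", "A+"},
    "B-":  {"O-", "B-"},
    "B+":  {"O-", "O+", "B-", "B+"},
    "AB-": {"O-", "A-", "B-", "AB-"},
    "AB+": {"O-", "O+", "A-", "A+", "B-", "B+", "AB-", "AB+"},
}

def is_match(donor_type, recipient_type):
    """Return True if red cells from donor_type are compatible with recipient_type."""
    return donor_type in COMPATIBLE_DONORS[recipient_type]

print(is_match("O-", "A+"))   # True: O- is the universal red-cell donor
print(is_match("A+", "O-"))   # False: an O- recipient can only receive O-
```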
Marketing in the age of data security
Technology has changed the way businesses market themselves to consumers. Businesses can now identify shifting consumer preferences, launch highly targeted advertising campaigns, and communicate instantly with potential customers. The one thing all of this new marketing has in common? Consumer data. As marketing technologies evolve, companies should be aware that the myriad data security regulations apply not only to how companies conduct their business, but to how they market it as well.
Haunted by the past
Antiquated privacy laws are haunting businesses that base their privacy policies on current statutory language. Most laws intended to protect individuals’ privacy rights were designed with decades-old technology in mind. While this problem has been gaining attention for its impact on individuals’ privacy rights, businesses have also felt the effect of archaic privacy laws. Due to the public’s overwhelmingly favorable views toward privacy rights, businesses are becoming increasingly vulnerable to distorted interpretations of outdated laws.
What’s new with the Cybersecurity Information Sharing Act?
The Cybersecurity Act of 2015, signed into law on Dec. 18, has four titles that address longstanding concerns about cybersecurity in the United States, such as cybersecurity workforce shortages, infrastructure security, and gaps in business knowledge related to cybersecurity. This post distills the risks and highlights the benefits for private entities that may seek to take advantage of Title I of the Cybersecurity Act of 2015 – the Cybersecurity Information Sharing Act of 2015 (“CISA”).
It’s been clear for many years that greater information-sharing between companies and with the government would help fight cyber threats. The barriers to such sharing have been (1) liability exposure for companies that collect and share such information, which can include personally identifiable information, and (2) institutional and educational impediments to analyzing and sharing information effectively.
CISA is designed to remove both of these information-sharing barriers. First, CISA provides immunity to companies that share “cyber threat indicators and defensive measures” with the federal government in a CISA-authorized manner. Second, CISA authorizes, for a “cybersecurity purpose,” both the use and sharing of defensive measures and the monitoring of information systems. CISA also mandates that federal agencies establish privacy protections for shared information and publish procedures and guidelines to help companies identify and share cyber threat information. Notably, companies are not required to share information in order to receive information on threat indicators and defensive measures, nor are they required to act upon information received, though the absence of a statutory duty will not shield companies from ordinary “failure to act” negligence claims.
What’s the new EU-U.S. Privacy Shield made of?
Marvel fans know that Captain America’s shield is extraordinary, but exactly what it’s made of remains unknown – Vibranium? Adamantium? Unobtanium (oops, wrong movie)? For the time being, similar mystery shrouds the specifics of the new EU-U.S. Privacy Shield. Four months ago we posted on the European Court of Justice’s ruling that the U.S.-EU Safe Harbor was invalid. This Tuesday the European Commission announced that negotiations with the U.S. had successfully yielded a new vehicle for compliant cross-border transfers of EU residents’ personal data, dubbed the EU-U.S. Privacy Shield. Until its details are disclosed, however, the specific features of the Privacy Shield remain murky.
Why encryption is less secure than you think
Not all encryption tools are created equal. Just ask the folks at Microsoft, who recently demonstrated that encrypted electronic medical record databases can leak information. It turns out that CryptDB, a SQL database add-on developed at MIT that allows encrypted data to be searched, can be attacked by combining search queries with information in the public domain. More on this in a minute. In the meantime, let’s consider the assumption that encryption is inviolate/infrangible/impervious to hacks. As I mentioned in an earlier post, encryption algorithms are too complex for most laypersons to understand, but we should at least wrap our heads around the concept that encryption is not a “set it and forget it” technology, nor is it foolproof.
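To make the leakage concrete, here is a minimal sketch (hypothetical data and a toy cipher, not the Microsoft researchers’ actual attack) of the underlying idea: searchable schemes that encrypt deterministically map equal plaintexts to equal ciphertexts, so ciphertext frequencies can be lined up against publicly available statistics to recover values without ever breaking the cipher.

```python
# Toy illustration of a frequency-analysis attack on deterministically encrypted data.
import hashlib
from collections import Counter

def det_encrypt(value, key):
    """Toy deterministic 'encryption': same input + key always yields the same output."""
    return hashlib.sha256(key + value.encode()).hexdigest()[:12]

# Hypothetical encrypted column of patient admission types.
key = b"secret-key"
plaintext_column = ["Emergency"] * 70 + ["Elective"] * 25 + ["Newborn"] * 5
ciphertext_column = [det_encrypt(v, key) for v in plaintext_column]

# Publicly available statistics (e.g., state hospital discharge data) tell the
# attacker the expected ranking of admission types from most to least common.
public_ranking = ["Emergency", "Elective", "Newborn"]

# Rank ciphertexts by observed count and align them with the public ranking.
ct_by_frequency = [ct for ct, _ in Counter(ciphertext_column).most_common()]
recovered = dict(zip(ct_by_frequency, public_ranking))

for ct, guess in recovered.items():
    print(f"ciphertext {ct} -> guessed plaintext '{guess}'")
```

The point is not that the encryption is broken; the ciphertexts themselves are never decrypted. The structure the scheme preserves to make searching possible is what gives the attacker a foothold.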
Cops or Robbers: PHI, the IRS and IDRs
HIPAA and the IRS. There isn’t a whole lot of guidance out there about what to do when the IRS knocks on your organization’s door and asks for protected health information. Should the agency be treated as a cop or robber?
The most risk-averse approach for a HIPAA-covered entity or business associate is to treat the IRS as a potential thief and draw the deadbolt when it comes to data requests involving PHI. Such a tack would, among other things, comply fully with HIPAA’s minimum necessary requirement and, frankly, reinforce the Everyman attitude toward the agency. Moreover, PHI produced in response to an information document request (IDR) is unlikely to be treated under 45 CFR 164.512 as a disclosure required by law, a disclosure for an administrative proceeding, or a disclosure for a law enforcement purpose, because the IRS appears to lack the authority to compel compliance with an IDR. However, we should be careful not to view the IRS with automatic HIPAA suspicion; in some circumstances the IRS does perform a legitimate healthcare oversight function for which it may receive PHI without individual authorization, consistent with HIPAA’s treatment/payment/operations exception.