Technology vendors building solutions for healthcare deployments love to talk about encryption and how it safeguards patient data. It’s the silver bullet that allows physicians and patients alike to embrace new apps and tools. Symptoms may include increased confidence, decreased stress, and a hearty belief in the power of technology.
But what if that encryption was creating a false sense of security? What if the technology wasn’t providing a shield for ePHI at all?
Say goodbye to privacy, say goodbye to HIPAA compliance… and say hello to breach notifications and financial penalties.
Safe Harbor, as outlined in the HITECH Act, provides a good-faith basis for determining whether ePHI has actually been exposed when a device with access to it is stolen or misplaced.
It is based on the concept that strong encryption, properly deployed, will thwart even a determined attacker with physical access to an authorized device. Thus, even when a laptop, mobile device, or external hard drive is lost, the data inside is considered intact and uncompromised, provided it was properly encrypted.
This is a key distinction, and it is the difference between a breach notification (causing a significant hit to the brand and future revenues as well as serious financial penalties) and Safe Harbor (causing a large exhale of relief and a flurry of high-fives).
Here’s the rub – how is strong encryption differentiated from weak encryption for the purposes of HIPAA compliance?
Nobody would actually admit that their installed crypto was subpar or poorly integrated, but that makes all the difference in the world. Certain cryptographic algorithms have been shown to be defective and withdrawn from use; Dual EC DRBG, for example, was removed from NIST’s approved list after researchers demonstrated that it was likely backdoored.
RSA’s once-popular (and now discontinued) BSAFE toolkit shipped with this algorithm as its default random number generator, and the NSA knew exactly how to exploit it… as likely did hackers who didn’t work for a three-letter agency. If this algorithm were still in active use, say in a supposedly compliant EHR portal on a stolen physician’s laptop, should Safe Harbor be invoked? No; the algorithm wasn’t safe or secure, and the data was not properly obfuscated.
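The lesson generalizes beyond any one product: applications should draw randomness from the platform’s vetted CSPRNG rather than a hand-configured DRBG. As a minimal illustration (not drawn from any vendor mentioned here), Python’s standard-library `secrets` module does exactly that:

```python
import secrets

# Draw key material from the operating system's CSPRNG rather than a
# hand-configured DRBG. The OS generator is maintained, audited, and
# reseeded by the platform -- the property Dual EC DRBG deployments lacked.
key = secrets.token_bytes(32)      # 256-bit key material
token = secrets.token_urlsafe(16)  # URL-safe one-time token

print(len(key))  # 32
```

The same principle applies in any language: prefer the platform’s system CSPRNG (e.g., /dev/urandom, getrandom, or the OS keystore APIs) over anything an application configures itself.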
Different types of encryption methods
The National Institute of Standards and Technology (NIST) established a program to test encryption. Not all encryption is equal, so NIST and its Canadian counterpart, the Communications Security Establishment (CSE), work in tandem through the Cryptographic Module Validation Program (CMVP) to analyze, test, and validate that crypto modules are functioning properly and deploying approved algorithms (certified by the CAVP, the Cryptographic Algorithm Validation Program, naturally).
The validation programs rely heavily on an expert network of approved testing laboratories to work directly with vendors and coordinate certification with NIST/CSE. All algorithms and modules are tested for conformance with the Federal Information Processing Standard (FIPS) 140-2.
This standard was first established by NIST in 1994, and the ‘dash two’ at the end of 140-2 denotes the revision that was approved in 2001. While that standard may seem old, make no mistake that the CMVP can act quickly. As soon as the Dual EC DRBG vulnerability was disclosed, CMVP removed it as an approved algorithm for FIPS 140-2 products.
FIPS 140-2 validation is required for all encryption used by federal agencies, from the Pentagon to the Department of Education. It is the policy of the United States Government that encryption not certified to the FIPS 140-2 benchmark is considered equivalent to plaintext.
Essentially this means that crypto is useless until proven otherwise, a blunt but accurate sentiment. Other sectors have adopted the standard as their own, as well, with increasingly strict adherence in state and local government, finance, and utilities. Either encryption is validated or it is not. It’s very black-and-white.
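To make that black-and-white determination concrete, here is a hypothetical sketch of the only question that should matter when a device goes missing; the class and function names are invented for illustration, not taken from any regulation or product:

```python
from dataclasses import dataclass

@dataclass
class LostDevice:
    """Hypothetical record for a lost or stolen device holding ePHI."""
    ephi_encrypted: bool          # was the ePHI encrypted at rest?
    module_fips_validated: bool   # does the crypto module hold a FIPS 140-2 certificate?

def qualifies_for_safe_harbor(device: LostDevice) -> bool:
    # Unvalidated crypto is treated as plaintext, so both conditions
    # must hold -- there is no partial credit.
    return device.ephi_encrypted and device.module_fips_validated

# A laptop with strong-sounding but unvalidated encryption fails the test:
print(qualifies_for_safe_harbor(LostDevice(True, False)))  # False
print(qualifies_for_safe_harbor(LostDevice(True, True)))   # True
```

The point of the sketch is that there is no third branch for “encrypted, but with unknown software”; by NIST’s policy that case collapses into the plaintext one.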
Healthcare encryption requirements
Unfortunately, healthcare vendors have been happy to explore the gray zone created by the minimal guidance in the HIPAA Security Rule regarding implementation of cryptographic controls.
For example, HIPAA 164.312(e)(2)(ii) states, “Encryption (Addressable): Implement a mechanism to encrypt [Electronic Protected Health Information] whenever deemed appropriate.”
This subjective determination of ‘appropriateness’ is one of the biggest gotchas in healthcare compliance. Take a poll of HIPAA consultants and see how many think that it’s a good idea to find encryption unnecessary on a self-assessment. One out of 100 perhaps, and that one won’t last long in the business.
It’s just a bad idea to take the risk that the Office for Civil Rights (OCR) or the Department of Health and Human Services (HHS) would disagree with your determination.
Due to the confusion surrounding HIPAA technical requirements, NIST created a special publication, NIST SP 800-66: An Introductory Resource Guide for Implementing the HIPAA Security Rule, which adds depth and breadth by mapping the HIPAA security controls to a standard security controls framework.
Following that chain unravels the standards: HHS enforces HIPAA, HIPAA’s guidance points to NIST, and NIST demands the use of FIPS 140-2 validated cryptography across the board.
Ensuring patient privacy with data encryption
So remember how ‘appropriateness’ was one of the biggest gotchas in healthcare compliance? The other is the selective enforcement of validated crypto in Safe Harbor cases.
OCR and HHS have begun cracking down on breaches, but they have been lenient thus far. They have been willing to overlook the fact that the deployed encryption software has often been uncertified and of unknown quality.
By NIST’s rules, that crypto should be considered equivalent to plaintext and excluded from Safe Harbor. It would be downright demoralizing for many of these covered entities to be exposed for using subpar crypto and forced to disclose their breaches, but sparing them sends the wrong message.
It’s glossing over a key distinction and allowing technology vendors in the healthcare space to continue cutting corners and building mediocre products. The apparent willingness to cut providers a break on this requirement puts patient ePHI at risk and only benefits the vendors.
HHS should be leaning hard on those vendors, cracking down, and making it obvious that validated encryption will be table stakes for HIPAA compliant solutions.
Certified crypto is not on a spectrum of most to least effective. It’s binary. It is either tested and validated, or it is not.
NIST knows it, OCR knows it, HHS knows it, and the vendors know it.
It’s time to step up and enforce it. No more free passes for vendors. No more Safe Harbor allowances for providers when the ePHI is absolutely in danger of unauthorized access.
Patient privacy should come first, and that means penalties for unvalidated crypto. Once providers are faced with fines, they will send the message to the vendors by only procuring solutions with validated encryption. Unvalidated, unknown encryption is an absolute wild card and nobody should gamble their right to privacy on it.
Ray Potter is the CEO and co-founder of SafeLogic. Previously, he founded Apex Assurance Group and led the Security Assurance program at Cisco Systems. Ray currently lives in Palo Alto with his family and enjoys cycling and good bourbon, although definitely not at the same time.