HIPAA, the Health Insurance Portability and Accountability Act, defines patient rights and standards for protecting health information. We briefly explored this topic in the post Exploring Regulated Information: HIPAA Data back in December. Now let’s go into a HIPAA deep dive.
HIPAA Deep Dive
The main focus of HIPAA is to improve health insurance accessibility for people who change employers or leave the workforce. It also addresses the electronic transmission of health data in Title II, Subtitle F of the Act, entitled “Administrative Simplification”.
The Administrative Simplification provisions cover four key areas:
- National standards for electronic transmission
- Unique health identifiers for providers, employers, health plans, and individuals
- Privacy Standards
- Security Standards
Privacy and Security Rules
The Privacy Rule addresses the storing, accessing, and sharing of health information. Plain and simple.
The Security Rule is more specific: it governs how health information must actually be protected, both at rest and in transit. Compliance with the Security Rule includes performing a risk analysis; implementing reasonable and appropriate security measures; and documenting and maintaining policies, procedures, and other required documentation. Policies and procedures should be living documents, meaning they are adaptable. As such, compliance is not a one-time goal but an ongoing process. An organization required to adhere to these rules is called a covered entity. A covered entity must implement policies and procedures to:
- ensure the confidentiality, integrity, and availability of all electronic protected health information.
- protect against any reasonably anticipated threats or hazards to the security of such information.
- protect against any reasonably anticipated uses or disclosures that are not permitted.
The HIPAA Security Standards took effect on April 21, 2003. Covered entities had until April 21, 2005 to achieve compliance; small health plans got an additional year, until April 21, 2006.
Exact security measures are not defined. However, because most protected health information is now electronic, it is imperative to implement security best practices such as the following:
- Audit policies, procedures, logs, etc.
- Create and maintain policies for usage of workstations.
- Create measures to keep protected health information accurate and intact.
- Develop emergency protocols or procedures.
- Encrypt protected health information.
- Keep activity logs.
- Keep offsite backups.
- Keep production and testing environments separate.
- Limit physical and account access to servers to personnel on an as-needed basis.
- Protect workstations with automatic logoffs or timeouts.
- Use a firewall and permit remote access only through a VPN.
- Use account controls for access and identify each user.
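Several of the practices above (account controls, identifying each user, and keeping activity logs) can be sketched together. The following is a minimal illustration, not a compliant implementation: the user names, roles, and permission sets are hypothetical, and a real system would pull identities from an identity provider and write audit entries to tamper-resistant storage.

```python
import logging
from datetime import datetime, timezone

# Hypothetical role and permission tables -- illustrative only.
USER_ROLES = {"dr_lee": "clinician", "temp_admin": "billing"}
PERMISSIONS = {"clinician": {"read_phi", "write_phi"}, "billing": {"read_billing"}}

logging.basicConfig(level=logging.INFO)
audit = logging.getLogger("phi_audit")

def access_phi(user: str, action: str, record_id: str) -> bool:
    """Grant access only when the user's role permits the action,
    and write an audit entry for every attempt, allowed or not."""
    role = USER_ROLES.get(user, "")
    allowed = action in PERMISSIONS.get(role, set())
    audit.info(
        "time=%s user=%s role=%s action=%s record=%s allowed=%s",
        datetime.now(timezone.utc).isoformat(), user, role or "unknown",
        action, record_id, allowed,
    )
    return allowed

print(access_phi("dr_lee", "read_phi", "rec-1001"))      # allowed
print(access_phi("temp_admin", "read_phi", "rec-1001"))  # denied, but still logged
```

Note that the denied attempt is logged as well; the point of an activity log is to record every access attempt, not only the successful ones.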
A security incident refers to the unauthorized access, use, disclosure, modification, or destruction of information; an attempt to do any of the above; or interference with system operation. A confirmed incident involving protected health information may rise to the level of a reportable breach. Response and reporting requirements include:
- identifying and responding to security incidents.
- mitigating harmful effects of security incidents.
- documenting security incidents and the subsequent outcomes.
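The documentation requirement above implies keeping a structured record of each incident and its outcome. A minimal sketch of such a record follows; the field names are hypothetical and not mandated by HIPAA, which leaves the exact documentation format to the organization.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

# Hypothetical incident record -- field names are illustrative, not mandated.
@dataclass
class SecurityIncident:
    description: str
    detected_at: str
    response: str = ""
    mitigation: str = ""
    outcome: str = ""

def document_incident(incident: SecurityIncident) -> str:
    """Serialize the incident record for retention alongside the
    organization's other required documentation."""
    return json.dumps(asdict(incident), indent=2)

incident = SecurityIncident(
    description="Repeated failed logins against the PHI database from an unknown host",
    detected_at=datetime.now(timezone.utc).isoformat(),
    response="Source address blocked; affected accounts locked",
    mitigation="Forced password resets; VPN access rules tightened",
    outcome="Investigation found no evidence that PHI was accessed",
)
print(document_incident(incident))
```

The three required activities map onto the record: detection populates `description` and `detected_at`, mitigation fills `response` and `mitigation`, and the documented outcome closes the loop.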
Examples of when notification is required:
- A user (employee, contractor, or third-party provider) has obtained unauthorized access to personal information maintained in either paper or electronic form.
- An intruder has broken into one or more databases that contain personal information on an individual.
- Computer equipment such as a workstation, laptop, tablet, writable disks, or other electronic media containing personal information on an individual has been lost or stolen.
- A department or unit has not properly disposed of records containing personal information on an individual.
- A third-party service provider has experienced any of the incidents above, affecting the original organization’s data containing personal information.
Examples that MAY not require notification:
- The organization is able to retrieve the stolen personal information and, based on an investigation, can reasonably conclude that the retrieval took place before the information was copied, misused, or transferred to another person who could misuse it.
- The organization determines the personal information on an individual was improperly disposed of but can establish the information was not retrieved or used before it was properly destroyed.
- An intruder accessed files that contain only individuals’ names and addresses.
- A laptop computer is lost or stolen, but the data is encrypted and may only be accessed with a secure token or similar access device.
Note: in each case the organization should, after an investigation, reasonably conclude that misuse of the information is unlikely and that appropriate steps have been taken to safeguard the interests of affected individuals.
Who Must Follow HIPAA Rules?
Any company or agency that deals with protected health information (PHI) must follow the rules to make sure privacy and security measures are in place.
More specifically, covered entities (CE) and business associates (BA) that need to be HIPAA compliant include:
- Covered healthcare providers including hospitals, clinics, regional health services, and individual medical practitioners that provide treatment, process payments and perform healthcare operations.
- Healthcare clearinghouses including those that process insurance and other healthcare information.
- Health plans, including insurers, HMOs, Medicaid, Medicare prescription drug card sponsors, flexible spending accounts, and public health authorities, as well as employers, schools, or universities that collect, store, or transmit electronic protected health information (EPHI) to enroll employees or students in health plans.
- Their business associates including private sector vendors and third-party administrators that have access to health information and provide support in some way. Subcontractors and business associates of business associates must be in compliance as well.
HITECH and Enforcement
The Health Information Technology for Economic and Clinical Health (HITECH) Act, passed in 2009, further supports private, streamlined accessibility of health records for healthcare providers and patients. Its requirements were expanded in 2013 by the HIPAA Omnibus Rule, which strengthened enforcement and increased penalties for non-compliance. Both accidental and intentional violations can incur substantial fines. The HITECH Act was a response to the growth of health information technology: the increased use of electronic health information added convenience for proper use but also opened the door to more compromises.
Certified versus Compliant
There are many companies offering HIPAA certifications, as well as general IT providers and healthcare providers claiming to be HIPAA certified. However, no certification program is recognized by the federal bodies that govern the HIPAA standard: the Department of Health and Human Services (HHS) and its Office for Civil Rights (OCR). Obtaining a HIPAA certification does not add value for employers or clients the way an accredited certification would.
On the other end of the spectrum, being compliant is not the same as being certified. A third party cannot make your organization HIPAA compliant. They can offer risk assessments and recommendations but compliance is your organization’s duty. Even if the third party auditing your organization certifies you, there is no guarantee OCR will find you in compliance.
What other HIPAA background information do you think is important?
- See this FAQ and this Marketing Scheme Warning from the HHS website
- Industry Insiders Say Don’t Bother with HIPAA Certs at TechRepublic
- Free Training Materials from the HHS website