Chapter 12. HIPAA: The Far-Reaching Healthcare Regulation

Doctors and administrators evaluating health IT software frequently ask, “Is it HIPAA compliant?” We usually answer, “You tell us what it means to be HIPAA compliant, and we will tell you whether or not the software is.” HIPAA is probably the most ironic acronym in healthcare. It stands for the Health Insurance Portability and Accountability Act. Although HIPAA has succeeded largely in making health information more “accountable,” it is usually the first excuse for not making it portable.

The regulation of health IT systems has long been a complex and evolving area. Ultimately, the courts and the choices by regulators about what to really enforce determine what any regulation really means. Recent court cases and enforcement have created a much more solid context for defining just what regulations mean. Previously, the federal government was criticized for lacking enforcement for HIPAA and other health-IT-related law. HITECH—the same law that created the meaningful use funding—has changed this, providing new mechanisms for enforcement that should ensure that regulations will really stick. This chapter looks at HIPAA, with a nod to some other federal laws. Each state has a collection of laws that can be just as important to know for your state.

The first basic task is to gain an understanding of how health IT is affected by HIPAA. There are at least two reliable ways to do this. The easy way is to read the summaries that HHS provides for the HIPAA security and privacy rules. They lay out most of what you need to understand. For those who prefer the hard way, you can gain an even deeper understanding by reading the actual regulations yourself. This way, you do not take our word, or that of anyone else, regarding what the regulations actually say.

So far, the regulation is short enough that a technically informed layman can skim through it pretty quickly. Most of the law applies to processes and training that have nothing to do with health IT. Print it out and then use a highlighter to mark sections that you do not understand, or that you feel are especially important in your environment. As you do this, remember that we said reading HIPAA was useful and possible for a layman, but we did not say it would be fun.

Review these resources with your own environment in mind and make notes as you go over the content. At the end of this exercise, you might feel that you have everything well in hand or, alternatively, that you really need a healthcare lawyer. A specialist lawyer will understand HIPAA as a living document, subject to interpretation by judges and juries, but you should do some work to be prepared for that level of insight. You will be better off hearing a health lawyer’s advice with your own understanding of the law. Consider first asking your lawyer to explain what you did not understand about the literal text, so that you can have some confidence in your own understanding. Many in the healthcare world choose not to hire expensive lawyers and opt instead for seminars and other less expensive ways to ensure compliance. Either way, too much discussion of HIPAA is based on what the regulation ought to say, rather than what it does say. Reading the regulations, or at least the HHS site, can help you.

Regulators and the courts understand that the technology is moving faster than the regulations can keep up. If you are operating in a fashion that is in keeping with the spirit of the regulations, which are generally designed to enforce respect for patient privacy, then you will probably be fine. This is not a blank check by any stretch of the imagination, but if you are doing something innovative, can demonstrate you are trying to protect privacy as best you can, and are not obviously violating the law, you probably do not have anything to be worried about. (This is just an observation, of course, not legal advice.)

On the other hand, if you are ignoring your responsibility to read the rules, and you are not following industry best practices or the laws in question, keep in mind that these regulations come with serious fines and in some cases potential jail time. This is something to take very seriously, as the HITECH act has made the already stiff penalties of HIPAA far more potentially painful. Make no mistake: if you flout HIPAA, it could land you in prison or bankrupt your organization. Take it seriously.

With that in mind, we begin our discussion of health IT regulations with HIPAA, the most significant and expansive health privacy law in the United States.

HIPAA is a bundle of provisions that touch on numerous health industry matters, passed by the U.S. Congress during the Clinton administration in 1996. Although the law created substantial changes in the healthcare industry, it has lain somewhat dormant until recent years. Only recently has HHS begun seriously enforcing the rule, a laxness that has postponed the court decisions that normally end up determining what the details of any law actually mean in practice. Moreover, the HITECH act also modified the provisions of HIPAA substantially. Hopefully this will allow us to discuss the law in a way that will not be quickly outdated. Still, the law changes quickly, and it is important to use recent resources when creating a compliance plan.

The first thing to understand about HIPAA is that it does not cover all healthcare data or all people who know something about a patient’s health.

HIPAA was put in place to protect the privacy of individuals from the enormous power that players in the healthcare industry have, relative to other people in our lives. If you tell Aunt Jenny that you have a serious healthcare condition, Aunt Jenny can walk right into the offices of a major newspaper and take out an ad that details your health problem for the world to see. You might call telling Aunt Jenny your secret a “privacy choice,” specifically a “bad privacy choice.” HIPAA has nothing to say about that. Your doctor often knows information about your health before you do, and often knows information that you would prefer that no one knows, but that you must share to get healthcare. As a natural result, your doctor has a tremendous amount of data without you having made any kind of privacy choice. HIPAA places a burden on your clinical providers, and other “covered entities” in the healthcare system—health insurance companies and health care insurance claims clearinghouses—to restore a patient’s ability to make privacy choices.

Generally, HIPAA covers anyone who is obviously in a position of power regarding your health data, but does not cover Aunt Jenny or anyone else with whom you might consider sharing your secrets. HIPAA comes into play when a healthcare provider or other covered entity has been given, or generates, healthcare data about a patient. Then, and only then, does that entity have certain obligations regarding that data under HIPAA.

Moreover, HIPAA extends to organizations and persons with whom that entity shares that data, organizations and persons known as business associates. So most of the discussion regarding any given data point should begin with, “Is this data HIPAA-covered?” Happily, this question is rarely hard to answer. If you are providing IT services to the healthcare provider as an employee, all data the patient gives you and all data you generate about the patient is always covered. If you are not an employee (and often even if you are), you will be asked to sign what is known as a Business Associate Agreement, which will spell out clearly that all data you receive from the covered entity is HIPAA-covered. HITECH has several provisions that put those given secondary access to patient data (business associates in HIPAA speak) on a largely equal footing with the original HIPAA covered entity. If you come in contact with patient data that is HIPAA covered, you are a HIPAA covered entity of one kind or another, and must have appropriate policies and procedures in place to handle this patient information appropriately.

For most readers, that can serve as the end of the discussion on when HIPAA does and does not apply.

If the preceding discussion does not sufficiently clarify your position, you can use a general rule of thumb: if you are involved with the generation, routing, or processing of electronic medical billing, you are probably a covered entity under HIPAA. That includes clearinghouses and medical billing companies that route medical bills, payers that receive medical bills, and of course healthcare providers who originate the electronic bills. If, somehow, you are still not sure given this rule of thumb, then you need to carefully read the document that provides complete flowcharts answering this question.

If you are still not sure, act as if you are covered (by putting a hold on any sharing of patient data) until you can consult with a lawyer who specializes in healthcare law.

If at this point, you are sure that your health information service is not covered by HIPAA—congratulations, your life will probably be much easier. You are probably providing a PHR, or legally speaking, you should consider yourself one. From a legal perspective, a PHR is software that stores patient health information obtained either directly from the patient, or from another organization following a request from that patient to release that information. Ironically, the patient’s right to request and receive this data is also detailed in another part of HIPAA. You are basically in the same boat with Aunt Jenny: you get health data as the result of a patient choosing to give you that data. Contract law, user agreements, or a privacy policy covers your obligation to patient privacy.

Even if you are a PHR, you still need to care about HIPAA! The HITECH act created new rules, enforced by the Federal Trade Commission (FTC; HIPAA is typically enforced by HHS) regarding breach notices. The FTC breach notices will be discussed shortly. We’ll see more reasons as we go along.

At this point, you should feel comfortable knowing that you either are or are not a HIPAA-covered entity. The industry term for data that is protected under HIPAA is protected health information (PHI). As a covered entity, you have PHI, and you need to take steps to protect it. HHS often refers to e-PHI to delineate PHI on computers. Note that PHI is either identifiable or easily reidentifiable. If you truly separate the identity of the patient from a set of health data, that health data is no longer PHI. Truly separating PHI from the identity of the patient is called deidentification and is much harder than it seems, and you should assume that you have not done so successfully until you are absolutely certain that you have.

The HIPAA regulations list 18 categories of identifying information; if a record has even one of them, it cannot be considered deidentified. They range from the obvious (patient name, date of birth, Social Security number) to the less obvious (IP address, state of residence, age if the patient is over 89), to the typical government catch-all (anything else that could be used to identify a patient).

HIPAA allows deidentified data to be shared freely, primarily to enable research. If you are guessing about whether you have truly deidentified a set of patient data, you are playing a very dangerous game. Here is a rule of thumb for deidentification: If you are using an algorithm to deidentify patient data, it had better be more reliable than a human doing the same process.
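To make the rule of thumb concrete, here is a minimal sketch of a deidentification gate: before a record is released for research, scan it for fields that fall into HIPAA’s identifier categories and refuse to call it deidentified if any are populated. The field names and the category subset are illustrative assumptions, not a complete implementation of the 18-category rule.

```python
# Fields we treat as direct identifiers (a subset of HIPAA's 18 categories;
# the field names themselves are hypothetical).
IDENTIFIER_FIELDS = {
    "name", "date_of_birth", "ssn", "ip_address", "email",
    "phone", "medical_record_number", "street_address",
}

def is_deidentified(record: dict) -> bool:
    """Return True only if no identifier field carries a value.

    Note the conservative default: ages over 89 are themselves an
    identifier category, so an 'age' value of 90+ also fails the check.
    """
    for field, value in record.items():
        if field in IDENTIFIER_FIELDS and value not in (None, ""):
            return False
        if field == "age" and isinstance(value, int) and value > 89:
            return False
    return True

record = {"name": "", "age": 47, "diagnosis": "J45.40"}
assert is_deidentified(record)

record["name"] = "John Doe"
assert not is_deidentified(record)
```

Even a checker like this only catches the obvious cases; it says nothing about reidentification through combinations of “safe” fields, which is exactly why deidentification is harder than it seems.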

HIPAA came with two titles. Title I is all about changes to regulation for health insurance, and will not be covered here. The second title directed HHS to create several “administrative simplification rules” that would improve the handling of healthcare data. Out of this process came at least five rules covering the following:

  • The privacy rule

  • The security rule

  • The transactions and code sets rule

  • The unique identifiers rule

  • The enforcement rule

Of these, we can dismiss some as not relevant to our discussion on health IT. The enforcement rule, for instance, details how enforcement will take place, and sets penalties for infractions of other rules. For our purposes it is enough to say that HIPAA is enforced and that penalties can be very stiff, including jail time for blatant offenders.

The transactions and code sets rule dictates specific code sets for electronic billing. These are discussed in Chapter 3 and Chapter 10.

The unique identifiers rule creates a unique numbering system, called the National Provider Identifier (NPI) system, for all healthcare providers and healthcare plans in the country. Every doctor or other healthcare provider who either prescribes medications or bills Medicare, Medicaid, or other health insurance is required to have an NPI. As a result, almost all healthcare providers have NPI records. Moreover, many of the corporations involved in the delivery of healthcare are listed in the directory. The NPI directory is published, including updates, each month from the National Plan and Provider Enumeration System (NPPES) website. The NPI directory is also available as a RESTful API resource, which is more fully discussed in Chapter 10.
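As a sketch of what working with the NPI directory looks like, the following builds a lookup URL for the public NPI Registry API and parses a trimmed sample response. The base URL and parameter names reflect the public NPPES API as commonly documented, but verify them against the current NPPES documentation before relying on this; the sample JSON body is fabricated for illustration.

```python
import json
from urllib.parse import urlencode

# Public NPI Registry endpoint (verify against current NPPES documentation).
NPPES_API = "https://npiregistry.cms.hhs.gov/api/"

def npi_lookup_url(npi_number: str) -> str:
    """Build a lookup URL for a single 10-digit NPI."""
    return NPPES_API + "?" + urlencode({"version": "2.1", "number": npi_number})

# In a real lookup you would fetch this URL (e.g., with urllib.request)
# and parse the JSON body. Here we parse a trimmed sample response.
sample_body = '{"result_count": 1, "results": [{"number": "1234567890"}]}'
payload = json.loads(sample_body)
if payload["result_count"] > 0:
    matched_npi = payload["results"][0]["number"]
```

Because the directory is public, this kind of lookup needs no credentials, which makes the NPI one of the few pieces of healthcare data you can query freely.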

The privacy rule has few direct health IT implications, but forms the heart of HIPAA’s aspirations with regard to privacy. Moreover, new requirements created by meaningful use essentially require automation of what previously were manual operations under the HIPAA privacy rule.

The bulk of specific health IT regulation takes place in the security rule. This is where the law specifies things like encryption standards. Computer security professionals should be well-versed in the language of the security rule, the contents of which are readily understandable by those with IT training. For healthcare professionals, the requirements of the security rule can be somewhat intimidating, because it uses very specific terminology regarding information security in many of its requirements.

HHS summarizes the responsibilities detailed in the security rule for covered entities like this:

  • Ensure the confidentiality, integrity, and availability of all e-PHI they create, receive, maintain, or transmit.

  • Identify and protect against reasonably anticipated threats to the security or integrity of the information.

  • Protect against reasonably anticipated, impermissible uses or disclosures.

  • Ensure compliance by their workforce.

That is a perfect 10,000-foot summary.

A more detailed examination of the security and privacy rules, along with amendments to HIPAA from the HITECH act, the current and future requirements of meaningful use, and a healthy dose of industry best practices, allows us to create the following list of requirements for health IT systems. Because we are mixing what you are legally required to do with industry standards and common sense here, you cannot use this discussion to calculate your legal responsibilities, but this does make for a pretty good starting to-do list. It takes you above and beyond HIPAA compliance, and although it might be legal to ignore parts of this advice, you should have a very good reason for doing so, and should probably hire a lawyer to confirm that what you are ignoring is prudent.

  • The core goals are confidentiality, integrity, and availability (CIA) for PHI. Learn the meaning of CIA in relation to patient data, embrace its implications, and recognize that in health IT we are responsible for best practices and not just legal compliance. Doing the minimum here will eventually burn you. A good rule of thumb is that it is usually worth spending money in the long run to achieve better security—so long as you spend it on things that really matter, and not just a false sense of security. If better security comes at the cost of patient care, you have some fudge room. If you are obviously embracing security costs that do not impact patient care (e.g., a properly maintained firewall and antivirus software), regulators and enforcing agencies might be more lenient when you make choices to ignore security best practices in a way that obviously benefits patients.

  • Patients have the right to get copies of their health data within 30 days of a request. Meaningful use requires mechanisms for patients to receive this data in both electronic and paper form. EHR and other health IT systems should probably support several modes of export, including sending data to a tethered PHR (defined in Chapter 6), emailing the patient record to the patient on the Direct network (see Chapter 10), providing an electronic copy on a burned CD-ROM or USB stick, and printing the contents of the electronic record clearly. One of the most substantial fines ($4.3 million) ever levied for HIPAA violations concerned this simple issue.

  • Patients have the right to make amendments to the contents of their health records, if they view the data as incorrect or incomplete. Eventually, meaningful use will likely require that this occur using HIE mechanisms. Mechanisms should be in place to note that certain data has “patient comments” or that certain data is “patient sourced.” Data that comes directly from a patient-controlled PHR should never be excluded from the core record even if it is viewed as inaccurate (as opposed to other data coming in from the HIE). Most important, even modifications to paper charts should have electronic logs, so that you can demonstrate compliance with the “patient amendment” rule.

  • Patients have the right to get a list of all the disclosures of their health data. A disclosure here means that a covered entity shares HIPAA-covered PHI with another organization. When a doctor sends clinical information to a health insurer for payment, for instance, that counts as a disclosure under HIPAA. HITECH extends this to include detailed reporting of electronic data disclosures over the last three years. It is a good bet that this disclosure will someday need to be delivered in an electronic format (e.g., Direct, CD-ROM, USB). One ethical procedure that goes beyond compliance with this rule is to simply include the patient in all communication regarding the patient. There is no reason that all electronic communication regarding the patient could not be copied and automatically sent to the patient, too. This becomes much simpler as the result of the Direct Project. (However, I am not suggesting you copy the patient on all communication or disclosure—think about how many minute disclosures and shares there are in the course of an episode of illness. See the author’s call for restraint on interpreting privacy rules.) Notification for disclosures, no matter what approach is taken, can never be completely automated. Some HIPAA disclosures, related to warrants for instance, must not be reported to the patient.

  • Covered entities must appoint a privacy officer and have a mechanism to receive complaints regarding HIPAA violations or requests from patients who want to exercise their rights under HIPAA. It is reasonable to assume that regulations will eventually require digital accounting of the work performed by this individual.

  • There must be a privacy policy in place, overseen by the privacy officer, and audited annually by this officer for compliance.

  • There must be a security policy in place, with yearly audits to enforce compliance. These audits should specifically perform risk analysis, which evaluates the risks to e-PHI against the technologies used to protect that e-PHI in an ongoing basis.

  • Both the security and privacy policies and their audits should specifically cover e-PHI.

  • Those policies that are sitting on the shelf collecting dust in the boss’s office do not cut it. HITECH has substantially changed HIPAA, and those who do not update their policies and procedures to match may be regarded as “willful neglectors.”

  • You should have HIPAA audits whenever you have a HIPAA event and at least yearly otherwise. Keep the results of the audits, and the changes you make as the result of your audits, for at least six years.

  • Your policies should include how to respond to issues discovered during audits.

  • Both the security policies and the privacy policies should encompass the process of hiring and firing employees. There should be some digital log that shows that when an employee left an organization, his access to clinical information systems containing PHI was removed.

  • Classes of employees must have access only to corresponding classes of PHI. In the IT security industry this is called role-based access control. If everyone has access to all clinical information, you must specifically justify why this is needed in some kind of dated and formal document. Your billing and scheduling staff, for instance, do not need full access to the patient record to do their jobs. They need access only to the data needed to perform their jobs, and both policy and software should enforce these limits. HHS calls this level of access “minimum necessary.” Attempts to circumvent these boundaries should be logged. HHS calls this “access controls.”
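
The “minimum necessary” idea above can be sketched in a few lines of role-based access control: each role maps to the record sections it may read, and attempts outside those boundaries are denied and recorded. The role and section names here are illustrative assumptions, not a real EHR’s access model.

```python
# Each role sees only the record sections it needs (illustrative names).
ROLE_PERMISSIONS = {
    "clinician": {"demographics", "clinical_notes", "medications", "billing_codes"},
    "billing":   {"demographics", "billing_codes"},
    "scheduler": {"demographics"},
}

denied_attempts = []  # in production this would go to a tamper-evident audit log

def can_access(role: str, section: str) -> bool:
    """Allow access only within the role's boundary; log anything else."""
    allowed = section in ROLE_PERMISSIONS.get(role, set())
    if not allowed:
        denied_attempts.append((role, section))
    return allowed

assert can_access("billing", "billing_codes")
assert not can_access("scheduler", "clinical_notes")
assert denied_attempts == [("scheduler", "clinical_notes")]
```

The key design point is that the denial itself is data: logging circumvention attempts is part of the requirement, not an optional nicety.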

  • Regularly train staff regarding the handling of PHI, including e-PHI.

  • If your organization hands over PHI to a business associate, you must have some policy to ensure that basic protection is being taken with regard to the PHI you have provided.

  • You have obligations to protect the accessibility and integrity of the PHI under your care. You must have an emergency plan to ensure that PHI is not lost in case of an emergency. Remember that an emergency can mean anything from a hurricane, to an act of war, to a simple hard drive failure. Your plan for recovering PHI should include off-site encrypted backup. Because encryption makes restoring backups more fragile, your emergency plan should include at least yearly “test restores” to verify that off-site encrypted backups can be re-created without access to anything on-site. (For instance, if you keep your encryption keys on-site and you lose them when you are hit by a tornado, it does not help you to have off-site encrypted backups.) HHS calls this “integrity controls.”
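
One concrete piece of a “test restore” can be sketched as follows: after restoring an off-site backup to a scratch location, compare checksums of the restored files against a manifest captured at backup time. This only verifies integrity of the restored copies; the decryption and key management steps are assumed to have happened already, and the file names are hypothetical.

```python
import hashlib
import tempfile
from pathlib import Path

def checksum(path: Path) -> str:
    """SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify_restore(manifest: dict, restore_dir: Path) -> list:
    """Return names of files that are missing or corrupt after a restore."""
    failures = []
    for name, expected in manifest.items():
        restored = restore_dir / name
        if not restored.exists() or checksum(restored) != expected:
            failures.append(name)
    return failures

# Tiny end-to-end demonstration with a scratch directory.
with tempfile.TemporaryDirectory() as tmp:
    src = Path(tmp) / "chart.db"
    src.write_bytes(b"patient data")
    manifest = {"chart.db": checksum(src)}   # captured at backup time
    assert verify_restore(manifest, Path(tmp)) == []
```

A yearly drill that runs this kind of check from a bare machine, with nothing but the off-site media and the off-site key copy, is what proves the plan actually works.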

  • You must limit access to physical copies of health data (both paper files and EHR servers, etc.). You should probably have systems in place for logging physical access to healthcare data, although this might be prohibitively expensive. At a minimum, access to servers storing e-PHI should require traditional metal keys.

  • You should generally have a physical access plan that includes facility security plans and logs of visitor sign-ins and escorts. Of course, you really need to escort guests, assuming they would otherwise be able to access information systems containing PHI. For simplicity, your patient workflow should never allow patients to wander into areas that contain servers with PHI. No guest access control or logging system will be effective if strangers can physically access an area containing PHI media.

  • Electronic access to PHI should be limited and logged. Given the importance of the uptime of clinical systems, this logging should be part of a comprehensive logging and monitoring strategy. If someone asks you whether Dr. Smith accessed John Doe’s electronic patient record last Tuesday, you should be able to definitively answer that question. HHS refers to this as “audit controls.”
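
The “Dr. Smith last Tuesday” question above can be sketched as an append-only audit log keyed by user, record number, and timestamp. Note that entries reference record numbers only, so the log itself stays out of PHI territory unless it is joined with the patient index (which must itself be protected). The names and storage are illustrative; a real system would use durable, tamper-evident storage.

```python
from datetime import date, datetime

audit_log = []  # in production: append-only storage, not an in-memory list

def log_access(user: str, record_id: str) -> None:
    """Record that a user touched a patient record, with a timestamp."""
    audit_log.append({"user": user, "record": record_id, "at": datetime.now()})

def accessed_on(user: str, record_id: str, day: date) -> bool:
    """Answer: did this user access this record on this day?"""
    return any(e["user"] == user and e["record"] == record_id
               and e["at"].date() == day
               for e in audit_log)

log_access("dr_smith", "111111")
assert accessed_on("dr_smith", "111111", date.today())
assert not accessed_on("dr_jones", "111111", date.today())
```

The discipline of logging record numbers rather than names also anticipates the next point: logs that contain patient names become PHI themselves.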

  • Because you are required to log certain events to be HIPAA compliant, it might be possible that PHI is slipping into the logs themselves. It is not, in itself, PHI for your log to read: “Dr. Smith accessed patient record #111111 last Tuesday.” But if it reads “Dr. Smith accessed the record for John Doe last Tuesday,” or if the log ties patient record #111111 to John Doe elsewhere, the log is PHI, and must be kept private. And if your system logs might contain PHI, they must always be treated as though they do contain it. The very fact that a person is attending your clinical facility is technically PHI, and that means if you have the name of a person other than a staff member in your logs, it should be presumed to be PHI unless proven otherwise by an internal audit. For simplicity, try to keep patient names and birthdays out of the logs.

  • You need a strategy to ensure that patients and guests will not be able to see other patients’ health information on screens. Usually that means screen savers with passwords, and monitors that cannot easily be viewed by patients and guests. It also means that there should be some type of auto-logout mechanism in your EHR. Be aware of what “pops up” on screens when people log back in. Again, even the fact that a person is a patient is generally considered PHI. If a patient sees a neighbor’s name on the screen after a clinician dismisses the screen saver, that is considered a breach. This issue becomes more important as clinicians consider using computers to go over the electronic record with the patient in the exam room. This is a wonderful idea, but you must ensure that no patient names continue to be visible when the next patient enters the room.

  • When being transferred across open networks, PHI must be encrypted. The Internet, obviously, is an open network. If you fail to have a properly configured firewall, or if you have a rogue WiFi hotspot, your network might be classified as an open network. (This is especially true if it becomes obvious that you knew you had a leaky firewall or rogue hotspot.) Most agree that network traffic should be encrypted where possible, even across local networks, to protect against a “closed” network suddenly becoming an “open” network. HHS calls this “transmission security.”
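
A minimal sketch of enforcing encryption in transit, using Python’s standard ssl module: build a client context that refuses pre-TLS-1.2 protocols and always verifies the server certificate. The hostname in the comment is a placeholder, not a real endpoint.

```python
import ssl

# create_default_context() enables certificate verification and hostname
# checking by default; we additionally refuse old protocol versions.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

# context.wrap_socket(sock, server_hostname="ehr.example.org") would then
# refuse any connection that cannot negotiate TLS 1.2 or better.
assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname
```

Centralizing this kind of context construction in one place makes it much harder for an application to accidentally open an unencrypted or unverified channel.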

  • You must have mechanisms in place to protect against data erasure, both accidental and purposeful. Your backup strategy is an important part of this, but if data can be deleted at the EHR level, and this deletion is then replicated down through your backup systems, you might have a problem.

  • Engage in ongoing risk management. Suppose you decide that you are going to use Mac OS X on your network to avoid getting an antivirus solution. If viruses targeting Mac OS X become common, your risk management process should pick up on that fact and you should invest in OS X antivirus technology. The environment changes, and your IT processes should change, too.

  • When you are retiring hardware of any kind, assume that it has PHI on it and erase all data completely. You do not want to be that guy who sold a populated EHR server on eBay accidentally, and yes, it has happened. Have a policy that specifically mandates that decommissioned servers, workstations, laptops, cell phones, and portable drives are all erased before being retired. For most organizations, especially smaller ones, it is simpler and less expensive to assume that every computer or device has PHI than to track which ones have PHI. If something can store digital information, and it is being thrown away, wipe it just in case.
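
A simple software wipe for individual files can be sketched as: overwrite the contents with random bytes, flush to disk, then delete. Be aware this is a best-effort sketch only; on SSDs and journaling filesystems, overwriting in place is not guaranteed to destroy all copies, so physical destruction or full-disk encryption from day one is the safer policy for retired hardware.

```python
import os
import secrets
import tempfile
from pathlib import Path

def wipe_file(path: Path) -> None:
    """Overwrite a file with random bytes, sync, and delete it."""
    size = path.stat().st_size
    with open(path, "r+b") as f:
        f.write(secrets.token_bytes(size))  # overwrite with random data
        f.flush()
        os.fsync(f.fileno())                # push the overwrite to disk
    path.unlink()                           # then remove the file

# Demonstration on a scratch file.
fd, name = tempfile.mkstemp()
os.close(fd)
tmp = Path(name)
tmp.write_bytes(b"e-PHI goes here")
wipe_file(tmp)
assert not tmp.exists()
```

For whole devices, the same “assume it holds PHI” policy applies: wipe or destroy everything, rather than trying to track which machines touched patient data.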

  • Although HIPAA “trumps” less restrictive state-level privacy laws, HIPAA and other federal and state privacy laws sometimes conflict. The general rule is that whichever rule is more protective of patient privacy prevails. It is not always easy to identify which one that is, however, and you do not want to get caught in a turf war between federal and state regulators. If you find that your reading of state law and HIPAA collide, hire a health care IT lawyer. This type of issue is precisely what their expertise is for. HITECH provisions allow state attorneys general to enforce HIPAA, and to provide funds to state governments through HIPAA-related fines.

  • HIE is wonderful, but different states have very different laws regarding how to handle PHI. Merely transferring certain health data across state lines could get you in legal hot water. In some cases, even different cities within different states have different rules.

  • Computer viruses have a unique ability to negatively impact the integrity of PHI. An ad-hoc, consumer-grade antivirus strategy might work for smaller sites. If you have more than 50 computers in your clinical organization, you probably need to have some kind of centralized antivirus solution. If you have more than 100 computers, lacking centralized antivirus software would probably be regarded as irresponsible.

  • If a device is mobile, or might be, encrypt it if you can. Prefer whole-drive encryption approaches, because this ends up being simpler for end users. Do not presume to know which mobile devices will be used for PHI and which will not. There are too many good, cheap encryption technologies (e.g., TrueCrypt; see http://www.truecrypt.org/) to neglect this. If a mobile device cannot easily support encryption, make sure that your training emphasizes that such mobile devices should never store PHI. Alternatively, you can ban mobile devices entirely: no laptops, smartphones, or USB sticks allowed. This type of approach is unlikely to be well-received or easy to enforce, though.

  • Recognize that encryption is not a data-loss prevention strategy. It is a data-loss mitigation strategy. There is evidence to suggest that data encryption does not actually reduce the incidence of data loss. Many organizations underinvest in important efforts to protect patient data because they are “encrypting it.” But remember that if you lose the encryption keys along with the encrypted data, the encryption does not help much. If you are encrypting data on your central EHR server, and you encrypt the network connection to your workstations, you still need to worry about copy and paste. Most of the time, privacy breaches are based on either clear human error, or in some cases, insiders who are intentionally betraying the trust the institution has placed in them. Either way, the encryption cannot protect your patients’ privacy if the people who have the “keys” are either intentionally or unintentionally failing to protect the data. If you have to choose between good training and background checks on one hand, and encryption on the other, focus on the human element, and postpone the deployment of encryption.

  • When using encryption to protect health data, default to encryption built on the NIST-approved AES 256-bit standard. This standard, also called Rijndael by the Belgian developers who invented it, has been endorsed by several different HITECH-related standards. However, specific encryption standards can become unsafe to rely on overnight. Pay attention to news announcing a new standard from NIST replacing AES 256, but for now it is the gold standard.

Many of the requirements mentioned here, and specifically in HIPAA, are “addressable” rather than required. That means that you have to follow the recommendation, provide a reasonable alternative, or document why you did nothing. Generally, if you have to do something that the law forbids because of some extenuating circumstance, document the event carefully as an exception to policy and do not try to hide it from auditors. If you are required to report events, do so in a timely manner. Doing otherwise could mean that you are guilty of “willful neglect,” which mandates that HHS deliver a stiff penalty if they find out. Screw up and be honest, and you will suffer some. Screw up and hide it, and you will suffer greatly.

The HHS website is careful in its wording on the distinction between “addressable” and “required” specifications, and that wording is worth reading in full. HHS is equally explicit about the vastly differing resources that different clinical environments can offer.

HHS intends HIPAA to be interpreted and implemented differently at different sites. No one should think that a “one-size-fits-all approach” will work with HIPAA compliance. In reality, there are hundreds of different approaches to HIPAA compliance that would probably pass muster with an HHS audit, and there are hundreds of approaches that would seem right but that HHS might find wanting.

Given that HHS is being reasonable regarding your resource constraints, it becomes worthwhile to approach HIPAA compliance with some ingenuity.

Just because you are following regulations does not mean that you have to be stodgy or traditional, or lay out a lot of money. The Department of Veterans Affairs (VA), after some very embarrassing incidents in which patient data was lost on stolen laptops, created one of the most draconian information security schemes on the planet. This has had a twofold effect. First, it is nearly impossible to get anything unusual done in the VA environment: USB sticks do not work, web browsers are stuck at ancient versions, and so on. Second, precisely because it is so hard to get anything done, many who work at the VA tend to work around the rules by working outside the VA network.

Being draconian about security ensures that your users will revert to silent rebellion. If it becomes apparent, in a legal situation, that management was aware of this kind of policy-reality dissonance, the benefits of a super-strict policy can quickly be negated. Instead, try to find solutions to difficult problems by thinking outside the box. A common policy at the VA and other institutions with strict policies is to forbid normal USB sticks. USB sticks are really useful for countless tasks that have nothing to do with PHI, so forbidding them only marginally improves security (they normally do not hold PHI anyway) and dramatically reduces productivity.

Instead of forbidding the use of USB sticks, make their use safer. Use the following methods to ensure that when a USB stick containing PHI is lost, you can be certain that its contents were encrypted, and it is much more likely that the USB stick will be discovered.

First, train your staff on how to use TrueCrypt, or another encryption product of your choice, to safely store PHI on USB drives. Then visit the hardware store and buy several large pieces of sanded plywood (larger than 15 inches by 15 inches, so that each would be very difficult to fit in a backpack or briefcase). Using a permanent marker, write the following on both sides of the plywood: “This USB key is to be used internally at Acme Clinic only. Please do not take me home. Please always encrypt me with TrueCrypt/Whatever. If found outside Acme Clinic, please return to Acme Clinic at 1000 Acme Way, Houston, TX 77004, (713) 111-1111 for a $100 no-questions-asked reward.”

Drill a hole in the corner of each plywood piece and use a large key ring to attach the USB key to the plywood. Make sure both the key ring and the USB key are very sturdy. Put a QR code sticker on the USB key itself with the same “reward if found” content embedded, and also put a plain-text file called readme.txt on the drive with the same information.

The problem with USB sticks and laptops is that well-meaning employees accidentally take them home and then get their backpacks or briefcases stolen. By having a very inconvenient keychain and a constant reminder to use encryption with the USB stick at all times, you can get all the benefits of an in-clinic sneaker net, without the problems associated with wanton USB stick use. Make it clear that regular USB sticks are strictly forbidden, and back that up with audits of USB mounting logs. Name the USB keys something like XYZAcmeClinicUSB1 so that when they are mounted on a computer they can be differentiated from outside USB sticks.
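The audit step above can be as simple as scanning mount records for volume labels that lack the approved prefix. The sketch below assumes a hypothetical log file that records one mounted volume label per line; the log path, the labels, and the `XYZAcmeClinicUSB` prefix are illustrative, not from any particular tool.

```shell
# Hypothetical mount log: one mounted volume label per line.
cat > usb-mount.log <<'EOF'
XYZAcmeClinicUSB1
XYZAcmeClinicUSB4
SANDISK_PERSONAL
XYZAcmeClinicUSB2
EOF

# Flag any mounted volume whose label does not carry the approved prefix.
grep -v '^XYZAcmeClinicUSB' usb-mount.log
```

In this sample the audit would flag `SANDISK_PERSONAL` as an outside USB stick; a real audit would feed in labels gathered from your operating system's mount logs.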

You can easily make 10 of these using sturdy USB drives bought in bulk for under $100. This, plus the use of TrueCrypt on all Windows laptops and servers and the whole-disk encryption technology that is built into most modern Linux servers, makes a viable alternative to an enterprise encryption architecture that could cost hundreds of thousands of dollars.

Consider marking your in-house laptops with similarly garrulous markings. (For heaven’s sake, do not mark a laptop as potentially having PHI without also encrypting the laptop’s hard drive.)

HIPAA does not require you to encrypt your servers. However, mass deployment of encryption across every computer in your environment can sometimes be easier than just encrypting laptops. Having a policy that enforces the use of inexpensive, open source hard drive encryption technology can be a much cheaper alternative to purchasing an encryption management system that can effectively manage an environment where some systems are encrypted and other systems are not.

Almost all networking security components such as firewalls, network intrusion detection systems, and antivirus products have robust open source alternatives that can save an organization tremendous amounts of money, assuming they are competently maintained. Alternatively, Microsoft provides slightly more expensive products that almost always work together well and might need less maintenance. In either case, any clinical organization is going to need help merging the traditional work of IT (just keeping computers running) and health IT (keeping EHRs and their satellite systems running). Make sure that your “normal” IT people are working in concert with your health IT people. An HHS HIPAA audit is a bad time to learn that your organization’s technical people were conducting a turf war at the expense of properly handling PHI. Consider bringing in an outside team of lawyers and health IT consultants to do a mock audit to identify holes in your system that you are too close to see.

If you have a breach of HIPAA-covered patient data, you must report it, both to HHS and to the patients whose data was breached. Moreover, the FTC enforces the same rules for non-HIPAA-covered PHR providers. The breach notification rules are one of the most significant new regulations on the PHR industry.

There are several magic numbers to remember regarding breach notification: 500, 60, and 10.

If a breach of unsecured PHI impacts more than 500 patients, a covered entity or PHR provider is required to report the breach more quickly, within 60 days, and is also required to notify the media. Large breaches are essentially made public automatically on the HHS online “wall of shame.” Remember, a single lost laptop or USB device could easily contain information for thousands of patients. This is why the industry is so concerned with device encryption: PHI on an encrypted device is not considered unsecured, so if the device is lost or stolen, the loss is not considered a breach and no embarrassing notification is necessary.

If fewer than 500 unsecured records are breached, you can save some embarrassment. In this case you only need to report the breach to HHS or the FTC and, of course, the patients involved in the breach.

This brings us to the magic number 10. If, when attempting to contact patients for whom breaches have occurred, it is discovered that contact information for more than 10 of the patients is out of date, then the breach must also be published either in a major newspaper where the patient lives, or on the home page of the clinical provider. Of course, this could be even more embarrassing than the notices required when more than 500 records are lost.
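The 500/60/10 rules described above can be restated as a short decision script. This is an informal summary of the rules as this chapter describes them, not legal guidance; the function name and the yes/no encryption flag are invented for illustration.

```shell
# Summarize which breach notifications apply, per the 500/60/10 rules.
# Arguments: patients affected, stale contact records, data encrypted (yes/no).
breach_notifications() {
    affected=$1; stale_contacts=$2; encrypted=$3

    # Encrypted data is not "unsecured" PHI, so its loss is not a breach.
    if [ "$encrypted" = "yes" ]; then
        echo "data was secured: not treated as a reportable breach"
        return
    fi

    # More than 500 patients: faster reporting plus media notice.
    if [ "$affected" -gt 500 ]; then
        echo "notify HHS, the affected patients, and the media within 60 days"
    else
        echo "notify HHS (or the FTC) and the affected patients"
    fi

    # More than 10 stale contacts: public notice is also required.
    if [ "$stale_contacts" -gt 10 ]; then
        echo "also publish a notice in a major newspaper or on your home page"
    fi
}

breach_notifications 1200 3 no
```

Running the example prints the large-breach case: notification to HHS, patients, and the media within 60 days.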

As a result of this rule, PHR companies especially have become much more careful to gather alternative means of contacting consumers. Breach notices can be sent to patients via first-class mail, or via email if the patient has agreed to electronic notices. Because of the breach notification rules, many clinics and hospitals in the United States have begun asking patients for email addresses for the first time, and PHR companies frequently require two email addresses as well as a postal mail address.

The FTC definition of a PHR company broadly includes the notion of anyone offering to manage health information for consumers on the Internet. Specifically, from an FTC site on the Health Breach Notification Rule:

If you have an online service that allows consumers to store and organize medical information from many sources in one online location, you’re a vendor of personal health records.

This rule likely applies to anyone setting up shop as a PHR, a patient social network, or any other service that “manages health information” for consumers. It is unclear, at this point, just how broadly the health and fitness industry will be covered under the rule.

If you are a health information service provider supporting a PHR or a HIPAA-covered entity, then no matter how many people were involved, you have only 60 days to notify that entity so that it can initiate its own disclosure responsibilities. For the most part, as a PHR or covered entity, if your business associate has a breach, your responsibilities are the same as if you had the breach yourself. As a business associate or service provider, merely sending a notification is not sufficient; you must confirm that the entity actually received the message.

For the seasoned security professional, HIPAA can seem like a lightweight regulation.

For the most part, HIPAA simply mandates some very high-level requirements regarding your policies, your workflows, and the audits that ensure that the two are continually connected. The goal that your policies and audits should seek is to keep the outsiders entirely away from PHI, without interfering too much with the work that the insiders need to do with that PHI.