Two recent incidents demonstrate dramatically and publicly one of the biggest security risks facing the health care industry and others across corporate America—the risk of insider misuse of data access. Whether the abuse involves relatively innocuous "peeking" at celebrity or family records or more malicious conduct driven by personal vendettas or identity theft, companies need to pay significantly closer attention to how they control and police employee access to the personal data of others. This is a real risk affecting companies across the country, in a wide range of industries.

Recent Events

The past several years have witnessed a growing series of incidents involving medical facilities—usually hospitals—and employees who have peeked at the medical records of "celebrities." The list of affected "celebrities" continues to grow. In some of these cases, numerous employees have been terminated. In others, criminal charges have been pursued.

The Octomom Case

The most recent situation involves the first-ever penalty under the new California health care security law. In this case—which everyone has presumed to involve the Octomom's records even though this has not been confirmed publicly—the Bellflower Hospital in California was fined $250,000 after 23 employees of the hospital and affiliated companies accessed these medical records without authorization. The government finding in the case indicated that the breaches extended beyond the specific hospital in question, to other hospitals in the same corporate family, and continued even after initial reports to the state regulators about the breach. The state regulators also found that the security efforts to protect patient privacy were insufficient. Many of the employees were terminated or resigned, while others faced lesser disciplinary action. Unlike certain other "celebrity" cases, there has been no indication (yet?) that this information was sold or leaked to the media.

The Johns Hopkins Case

The Johns Hopkins Hospital system also faced a recent problem involving employee access. While the facts of this case remain somewhat murky, Johns Hopkins became aware (through various reports) of identity theft problems involving a variety of individuals whose only apparent connection was a visit to the Johns Hopkins Hospital. Upon investigating, Johns Hopkins was led "to the suspicion that one Johns Hopkins employee, assigned to work in the patient registration area and who had access to patient identifying information as part of her job duties, may have been the source of information used for these fraudulent activities" (quoting the Johns Hopkins notice letter sent to the Maryland Attorney General's Office). This employee accessed information involving more than 10,000 patients over the course of 13 months, although much of this access, according to Johns Hopkins, was in the normal course of the employee's duties. While Johns Hopkins could not definitively conclude that this employee's access led to the fraudulent activity, it was forced to provide notice to all of these individuals, and credit monitoring to some, based on the results of its investigation.

What Should You Be Doing? Understanding the Risks

While inappropriate insider access is a single category of security risk, it encompasses three separate problems that need to be addressed. It is critical for companies to understand the differences among these problems when developing an appropriate response plan.

First, the most prominent examples concern "celebrity" records. Insiders at a wide variety of companies—most prominently health care providers—have reviewed celebrity records to which they had access by virtue of their employment but no "legitimate" reason to view. The motivation for this access ranges from simple curiosity to more malicious situations—typically involving the sale of information to media outlets. The Octomom case obviously fits this situation—and the hospital in question was fined at least in part because it reasonably could have anticipated the problems involving this individual, given the media attention that had been paid to her case.

Second, insiders review information about individuals with whom they have some relationship—ranging from family and friends to ex-girlfriends, individuals with whom they have had a run-in, or disgruntled neighbors. Again, the motivation may range from relatively innocuous curiosity (e.g., wanting to learn whether a family friend is okay after surgery) to gathering damaging information in pursuit of a personal vendetta. This is a similar problem to the "celebrity" issue—but with much different approaches to "solving" it—because there is no reasonable way for a company to identify in advance the individuals whose data might be reviewed. The category of "celebrities" can at least be defined, even if imprecisely. This "friends, family and acquaintances" category could encompass anyone.

Third, this access can involve true criminal activity—access to data for purposes such as identity theft or health care fraud. Employees can use their access to obtain personal information that can then be used to commit crimes. This category is even broader—for any employee, the "victim" could be anyone whose data is stored by the employer. The Johns Hopkins example is instructive in this regard. Johns Hopkins identified more than 10,000 patients whose information had been accessed by the individual in question. According to its investigation, "we believe that these accesses most likely were for appropriate business purposes." There was a smaller number of patients (somewhere between 31 and 526) for whom there were indications of inappropriate activity.

Approaches to "Fixing" This Problem

Limiting Access

Obviously, the first approach to this problem is to limit access to data. Whether mandated by specific laws or regulations (such as the Health Insurance Portability and Accountability Act) or simply as a matter of good practice, companies should review their employees' access to data and impose effective controls wherever possible. This can be a legitimate solution to the "celebrity" problem: it may be both reasonable and feasible for an employer (a hospital, for example) to dramatically cut down on employee access to "celebrity" medical records. While hospitals often are reluctant to impose too many restrictions on patient medical records—because access may be needed in an emergency—the risks in these "celebrity" situations may be sufficient to justify restricting the data in this way.

But most companies have no realistic way of reducing access to deal with the "friends and family" situation or the more malicious situations involving identity theft or other kinds of fraud. This is particularly true for "customer service" or "customer-facing" employees, who need access to data about a large number of individuals to perform their job functions. While it might be possible to restrict a customer service employee's access to reduce this risk, the likely result is that the customer service function would suffer, perhaps significantly. Accordingly, most companies understandably are reluctant to rely on this approach. Do note, however, that almost any company can take reasonable steps to reduce—at least somewhat—the access that the employee base as a whole has to individual information. This simply requires an aggressive review of access, and the appropriate technological and procedural steps to reduce that access where feasible, as illustrated in the sketch below.
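
To make this front-end approach concrete, the sketch below illustrates one way a records system might combine role-based restrictions with a logged "break-glass" exception for records flagged as likely targets of "peeking." This is a minimal illustration under stated assumptions only; the role names, the flagged_high_profile field, and the AccessDenied error are invented for the example and do not describe any particular hospital's system.

```python
# Minimal sketch of front-end access control: role-based checks plus a
# logged "break-glass" override for records flagged as high-profile.
# Role names, the flagged_high_profile field, and AccessDenied are
# illustrative assumptions, not any particular hospital's system.

from dataclasses import dataclass
from datetime import datetime


class AccessDenied(Exception):
    """Raised when a role or record flag does not permit the requested access."""


@dataclass
class PatientRecord:
    patient_id: str
    flagged_high_profile: bool = False  # records likely to attract "peeking"


@dataclass
class AuditEntry:
    employee_id: str
    patient_id: str
    timestamp: datetime
    reason: str


AUDIT_LOG = []  # every access, routine or break-glass, lands here

# Roles permitted to open patient records at all; customer-facing roles are
# deliberately narrower than clinical ones.
ROLE_CAN_VIEW = {
    "physician": True,
    "nurse": True,
    "registration_clerk": True,
    "billing_clerk": False,
}


def open_record(employee_id, role, record, reason, break_glass=False):
    """Return the record if the role allows it; flagged records require an
    explicit break-glass assertion, and every access is written to the log."""
    if not ROLE_CAN_VIEW.get(role, False):
        raise AccessDenied(f"role {role!r} may not open patient records")

    if record.flagged_high_profile and not break_glass:
        raise AccessDenied("flagged record: break-glass access and a reason are required")

    AUDIT_LOG.append(AuditEntry(employee_id, record.patient_id,
                                datetime.utcnow(), reason))
    return record
```

The point of the break-glass path is that emergency access is never blocked outright, but it always leaves an audit entry with a stated reason—which feeds directly into the back-end review discussed next.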

Policing Access More Aggressively and Effectively

Where front-end controls are insufficient, a more aggressive "back-end" approach is needed.

The lack of an effective "back-end" approach is one of the most obvious security failures at companies today. Where front-end access cannot be controlled, companies need to find a way to monitor employee access, so that there are proactive and effective means of policing how employees actually access data. For example, with "celebrity" records, it may be critical to have an audit trail tracking every employee's access to the medical records. Obviously, this is "easier" in an electronic setting than when the records are on paper. Companies need to review whether audit trails can be implemented more broadly. And someone needs to make sure that the audit logs are in fact reviewed. Companies need to be reviewing patterns of behavior—whether it is access to records where there is no demonstrable reason for the access, access that is more substantial than would be expected, or changes made to records without an appropriate trail. They also need to take direct and aggressive action against employees who misuse their access to data—even in relatively innocuous situations.
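
As a rough illustration of what this kind of back-end review might look like in practice, the sketch below flags employees who opened records of flagged patients, or whose volume of distinct-patient accesses far exceeds the median for their role. The audit-log format, the role-based baseline, and the threshold ratio are all illustrative assumptions; a real monitoring program would tune these rules to its own data and legal obligations.

```python
# Minimal sketch of back-end audit review: flag employees who opened
# records of flagged patients, or whose volume of distinct-patient
# accesses far exceeds the median for their role. Field names and the
# default 3x threshold are illustrative assumptions.

from collections import defaultdict
from statistics import median


def review_audit_log(entries, employee_roles, flagged_patients, ratio=3.0):
    """entries: iterable of (employee_id, patient_id) pairs for the review period.
    employee_roles: dict mapping employee_id to role (e.g., "registration_clerk").
    flagged_patients: set of patient_ids marked as high-profile.
    Returns a dict of employee_id -> list of human-readable findings."""
    findings = defaultdict(list)

    # Count the distinct patients each employee touched, and note any access
    # to a flagged record regardless of volume.
    per_employee = defaultdict(set)
    for emp, patient in entries:
        per_employee[emp].add(patient)
        if patient in flagged_patients:
            findings[emp].append(f"accessed flagged record {patient}")

    # Compare each employee's access volume against the median for the same role.
    by_role = defaultdict(list)
    for emp, patients in per_employee.items():
        by_role[employee_roles.get(emp, "unknown")].append((emp, len(patients)))

    for role, counts in by_role.items():
        baseline = median(n for _, n in counts)
        for emp, n in counts:
            if baseline > 0 and n > ratio * baseline:
                findings[emp].append(
                    f"{n} distinct patients accessed vs. role median of {baseline:.0f}")

    return dict(findings)
```

Whatever the specific rules, the output of such a review is only useful if someone is assigned to read it and empowered to escalate—which is precisely the gap at many companies today.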

Conclusion

On the whole, companies—in the health care industry and otherwise—need to understand and recognize this real-world problem and undertake reasonable steps to mitigate these risks. In general, companies in all industries—particularly those with customer service or other employees who are relatively low paid, often transient, and yet have broad access to personal information—need to do a better and more complete job of effectively policing how employees access data.