Consumers are increasingly turning to health apps for a variety of medical and wellness-related purposes. This has in turn caused greater amounts of data—including highly sensitive information—to flow through these apps. These data troves can trigger significant compliance responsibilities for the app developer, along with significant legal and contractual risk. It is mission-critical to the successful development (and future viability) of a health app to consider the privacy issues up front (an approach known as “privacy by design”), because it is far cheaper to build privacy in than to remediate later.
(Note: This was originally posted as part 6 of a 7-part series on Building a Health App? on our sister blog, Health Law & Policy Matters.)
The primary federal law of concern to health app developers is the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”). To determine whether HIPAA applies, developers need to first understand the nature and the source of the personal data associated with their app. HIPAA only applies to protected health information (“PHI”) which is broadly defined as information that:
- is created or received by a covered entity (i.e., a health care provider, health plan or health clearinghouse);
- relates to the past, present or future mental or physical health of an individual; and
- identifies the individual.
Some apps will avoid HIPAA by virtue of the app not interacting with data that is created or received by a provider, plan or clearinghouse. For example, apps that require end users to input their own health information may not have to comply with HIPAA. However, other developers providing apps on behalf of covered entities are not so lucky. In these cases, the developer is likely to be considered a “business associate” of the covered entity. A business associate is broadly defined as anyone who stores, collects, maintains, or transmits PHI on behalf of a covered entity. Again, context is crucial when determining whether a developer is a business associate. A developer offering a diabetes app on behalf of an insurer, and using PHI of the insurer, would be considered a business associate, but a developer independently offering the same exact app to the general public, and using health information volunteered by the public, would not. These distinctions can be subtle, and developers should take time to determine their position within this framework before marketing—and ideally before developing—their app. Fortunately, the U.S. Department of Health and Human Services Office for Civil Rights (OCR) has released guidance that analyzes various scenarios involving health apps. It is a must-read for any developer whose app comes anywhere near health information.
Sometimes HIPAA is unavoidable given the functionality of the app and the context in which it is being provided. In these cases, the developer should not take its HIPAA obligations lightly. Developers will face a number of obligations, many of which emanate from HIPAA’s Security Rule. The Security Rule, which applies to electronic PHI (“ePHI”), requires both covered entities and business associates to ensure the confidentiality, integrity, and availability of all ePHI that the entity creates, receives, maintains, or transmits. To do this, the developer is obligated to implement administrative, physical and technical safeguards for the ePHI, and must conduct a thorough documented analysis of its systems in order to determine how it will implement these safeguards. Policies and procedures must also be created and enforced, and the developer’s workforce must be trained. A developer that fails to comply with HIPAA can be subject to steep fines and penalties.
Many developers contract with cloud services providers (“CSPs”) for various services, including storage of user data. These “downstream” relationships may also be subject to HIPAA. Last year, OCR published guidance to assist developers and their CSPs with HIPAA compliance. It sets out a principle that is useful for developers and CSPs alike:
When a covered entity engages the services of a CSP to create, receive, maintain, or transmit ePHI (such as to process and/or store ePHI), on its behalf, the CSP is a business associate under HIPAA. Further, when a business associate subcontracts with a CSP to create, receive, maintain, or transmit ePHI on its behalf, the CSP subcontractor itself is a business associate.
The guidance delves deeper into the application of HIPAA in the cloud, and should be reviewed by developers that use CSPs. If the parties determine that the CSP is a business associate, the parties must enter into their own business associate agreement (often referred to as a “sub-business associate agreement” or a “subcontractor agreement”) that is no less strict than the upstream agreement between the developer and the covered entity. The guidance also addresses the effect of encryption on the application of HIPAA to CSPs. In the past, some CSPs have questioned whether they are a business associate if they handle only encrypted ePHI for which they do not have the decryption key. The guidance clarified that encryption of its client’s data does not allow the CSP to avoid being a business associate.
Contrary to popular belief, HIPAA does not mandate encryption. Under the Security Rule, encryption is an “addressable” implementation specification. This means that it is required only if a covered entity or business associate determines that it is reasonable and appropriate to implement. It will almost certainly be the case that implementing encryption is reasonable and appropriate for health app developers. Developers who determine that encryption is not reasonable or appropriate should identify and document compensating security controls to support their contention that encryption is not required for their app. When implementing encryption, developers should focus on encrypting data on the device, as well as data in transit, at rest in the cloud, and at rest on the developer’s own systems. Properly encrypting PHI has the added benefit of providing a safe harbor against breach notification. Under HIPAA, the developer could leave a laptop full of PHI on the subway, and, as long as the data is encrypted in a manner that satisfies certain federal standards, the loss will not be considered a breach. Since the covered entity is ultimately responsible for reporting breaches to individuals (and potentially the media), avoiding such breaches can help protect the developer’s reputation.
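To make the at-rest encryption point concrete, here is a minimal illustrative sketch in Python using the third-party cryptography package’s Fernet recipe (AES-based authenticated encryption). This is an assumption-laden example, not compliance advice: the library choice is ours, and whether a given scheme qualifies for the breach-notification safe harbor depends on meeting the applicable federal (NIST) standards and on sound key management, not on any particular library.

```python
# Illustrative sketch only -- not legal or compliance advice.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# In a real deployment the key would be generated, stored, and rotated under
# the developer's documented key-management policy (e.g., in a KMS or HSM),
# never hard-coded alongside the data it protects.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a record before writing it to disk or the cloud ("at rest").
record = b"patient: Jane Doe, glucose: 110 mg/dL"
ciphertext = fernet.encrypt(record)

# The ciphertext is unreadable without the key; with the key, the record
# round-trips intact.
assert fernet.decrypt(ciphertext) == record
```

Note that encrypting the stored data is only one piece: the same reasoning applies separately to data on the device and data in transit (e.g., TLS), and the key must be kept out of the hands of anyone who could lose the laptop.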
Breaches of health information may also be reportable under certain state data breach notification laws, including California’s. See the Mintz Matrix for information on particular states. As with HIPAA, encryption has the benefit of providing a safe harbor against breach notification in many of these states. In any event, incident response plans should be developed that take all these differences into account.
This post has focused primarily on HIPAA, but that’s not the only law that can be implicated by health apps. For example, some of the federal consumer protection laws enforced by the Federal Trade Commission (“FTC”) also apply to health app developers. Consideration should also be given to whether an app will be used by those under the age of 13. If so, the app must also comply with the Children’s Online Privacy Protection Act (“COPPA”), and verifiable parental consent requirements may come into play before you may even offer the app in an app store.
Addressing privacy and data security issues during development will pay dividends down the road and help the developer reduce financial and reputational risk. This will require a thorough examination of the source and the type of data involved in the app, as well as the relationships between the developer and the parties upstream and downstream of it. If HIPAA applies, developers should conduct the required Security Rule analyses and carefully negotiate business associate agreements with their covered entities and subcontractors. Developers should also take advantage of the increasing amount of guidance and tools provided by the federal government. In 2013, the FTC published its Mobile App Marketing Guidance, which is a good resource for developers as they move forward. Last May, the FTC also published “App Developers: Start with Security,” which contains important considerations regarding app security. And last but not least, OCR has released a platform to provide developers (and any other interested stakeholders) a sounding board to ask questions and voice concerns about how HIPAA applies to app developers.