The world of mental health apps is a growing market, and one that has seen a significant surge in the wake of the COVID-19 pandemic. With the consequent lockdowns and social isolation taking their toll, more and more people are thinking of using apps to help maintain positive mental health. At the same time, there is an increasing focus on mental health programmes at ‘place level’, delivered by community healthcare providers and local authorities.
As this market develops and the COVID recovery phase begins, how will the applicable legal and regulatory considerations affect it? To what extent can mental health and wellbeing apps (MH apps) bridge the supply/demand gap compared to more traditional face-to-face mental health support solutions?
Ability to address issues arising from COVID
Mental health issues have become a growing problem during the pandemic, in terms of both supply and demand. The impact the pandemic has had on our wellbeing is visible at every turn, whether from social isolation, the significant disruption to routine or the loss of recreational and stress-relieving outlets for many of us. At the same time, mental health practitioners have had to reduce services or have been redeployed in order to combat the clinical consequences of COVID-19 as providers across the healthcare sector face unprecedented demand. The inevitable consequence is that gaining access to mental health services has become harder and harder at the very time when the need for such services is at its height.
MH apps could be ideally placed to step into this gap and provide relief for patients and practitioners alike. They have the potential to allow for COVID-safe, socially distanced support via video and voice calls. They could have a broader reach, through using interactive and text-based tools that develop through artificial intelligence and machine learning and reduce the need for practitioner input. Examples of such tools include automated chat services, or ‘chatbots’.
However, can remote support such as this deliver the same benefits as face-to-face support? Is there a way that MH apps can capture the subtle interactions that take place when speaking face to face? Will they be able to create an emotionally safe ‘virtual space’, in which people are comfortable communicating about difficult subjects? Health inequalities have existed for a long time and have been exacerbated by COVID. There are growing concerns that the expansion into digital healthcare could worsen inequality for patients who lack the experience or do not have access to smartphones, computers, or reliable broadband services. This is a challenge the MH app market will need to overcome if it is to truly flourish.
Legal and regulatory impacts
One key regulatory area that has an unavoidable impact on the features and production of MH apps is privacy. Many MH apps currently on the market focus on guidance, tutorials and advice in a way that provides a one-way service to their users. By using this model, these MH apps are able to keep the processing of users’ personal information to a minimum. Provided they only request the basic user data necessary to use this level of service (e.g. registration details), these MH apps can avoid most of the more stringent obligations that apply when medical data is involved.
However, if MH apps are to move into the space now occupied by more traditional mental health services, there is a need for services that are more personalised. Such services involve the use of more sensitive health information provided by users. Under data protection rules, the use of health data is prohibited unless one of a short list of exemptions applies. Some MH apps may rely on the exemption for the provision of health treatment. This exemption can cover a wide range of data use, but use has to be overseen by a health or social care professional. Many MH apps will choose instead to rely on the exemption that applies if someone has consented to the use of their data. This has the advantage of transparency and puts the user in control of their information.
Liability and reputational risk
A major concern for many suppliers is the risk of liability, and providers of MH apps are no exception. To the extent that a therapeutic service is offered, it is likely to establish a ‘duty of care’ between the provider and the app user. What is the ‘standard of care’ that would be expected? It may not be straightforward to apply the ‘Bolam’ test (the reasonable, competent practitioner), which is the standard for a defence to a clinical negligence claim.
Any individual practitioners behind the technology will be mindful of their various professional obligations, for example under GMC guidance, but may find it difficult to meaningfully apply in this new context obligations and guidance created in an analogue age of face-to-face care. We already see problems and uncertainty simply in moving face-to-face assessments into an online setting using video call technology.
At an organisational level there are potential liability exposures connected to many of the regulatory responsibilities that cannot be avoided, especially when dealing with medical services. For example, there is the potential for regulatory fines and personal injury claims by app users.
A less frequently considered but equally important issue is the reputational risk that may follow an incident. Once reputational damage is done it can be very difficult to repair, so the management of reputational risk is an important part of any operating model. This is why so many MH and other health apps choose to put the user in control of how their data may be used – to avoid the reputational damage that may be done by claims or regulatory investigation into breach of privacy rights.
It is important to consider liabilities sensibly, and while there may be some room for balancing the liability risk between commercial partners, the risk of liability exposure to users is harder to limit. In many cases, the best solution may be to budget for appropriate insurance cover in case the worst should happen.
In many cases, consumer law will apply to MH apps that charge for their services. In such circumstances, the terms and conditions will be expected to conform to minimum standards, including obligations of fairness and transparency. This restricts the ability of MH apps to limit their liability to app users. It is worth noting that this will not be the case where an MH app is made freely available by the NHS, but even in this scenario developers will still need to adopt similar principles for the purposes of meeting relevant NHS criteria.
One specific aspect of potential liability is around the need for patient consent. In the traditional doctor/patient context the law is now very clear that consent is only valid if informed by all the information that is important to the patient – i.e. what the reasonable patient would want to know (not just what the reasonable doctor would tell her) and what this particular individual patient wants to know. This is already difficult to reconcile with reliance on generic, pre-printed or other fixed material, in the absence of a real dialogue through which the clinician can establish what is important to this individual patient and provide the information accordingly. The new GMC Guidance on consent from November 2020, for example, is framed in terms of shared decision-making and meaningful dialogue between clinician and patient. Providers will need to rise to this challenge in creative ways.
Medical device regulation
MH apps that remain more within the scope of general wellbeing or educational services may not fall under the regulatory gaze of the Medicines and Healthcare products Regulatory Agency (MHRA), but where and when the services provided by MH apps become more complex, providers will need to confront the possibility that their app may be deemed a medical device.
Many people (including some app developers) are unaware that stand-alone software used in a healthcare setting may be a medical device, and if it is, it is an offence to sell or supply it unless it conforms to the relevant legal standards. So when does software become a medical device?
UK medical device regulations have been amended to explicitly include software within the scope of the legal framework, but it is also necessary to consider the intended use and purpose of the software to determine whether or not it constitutes a device. Stand-alone software must have a ‘medical purpose’ to qualify as a medical device, and it is also helpful to consider the action it performs on data. For example, an app that just notifies a patient about appointments is unlikely to be categorised as a medical device, but if it incorporates modules that provide additional analysis that contributes to diagnosis or therapy (for example, a patient medication module), then this may be classified as a medical device.
Similarly, the purpose of the software itself must be medical, not just applied in a medical context. For example, an information system that records and stores data relating to clinical observations will not usually be considered a medical device in itself, but if it incorporates a functionality that provides additional diagnostic or therapeutic information then it may qualify as a device.
This distinction between medical and non-medical purposes can be nuanced, depending on the context. An example given by the regulator, the MHRA, is an app that uses an accelerometer to detect falls in epileptic patients. This is likely to be regulated as a medical device because its purpose is medical, whereas the use of the same app to alert a carer when an elderly person gets up out of bed in the social care context would not be regulated as a medical device.
The MHRA has published useful guidance on the regulation of health apps, suggesting that there are a number of key words that are likely to contribute to the MHRA determining that an app is a medical device, including: amplify, analysis, interpret, alarms, calculates, controls, converts, detects, diagnose, measures, and monitors.
The burden of medical device regulation in the context of MH apps will most commonly lie with the developers, but healthcare providers will also need to ensure that any apps they purchase and provide to patients satisfy the relevant regulatory requirements – not least due to post-marketing surveillance obligations and the liability that may arise as a result of the use of apps.
If an MH app is being offered as part of a public sector service, such as in conjunction with the NHS, it will need to comply with equality rules, including the Public Sector Bodies (Websites and Mobile Applications) (No. 2) Accessibility Regulations 2018. Developers will need to be aware of this when designing MH apps in this sphere. In practice this means MH apps should be designed for easy navigation on various devices and should avoid difficult to read font formats or colour contrasts that could be problematic for users with visual impairments.
Apps endorsed by the NHS
One route available to app developers is to design an app with the intention of supplying it into the NHS, or via the NHS app store. This is something that aligns well with the NHS plans for increased digitisation, and NHSX is currently responsible for overseeing the early stages of an assessment framework known as the Digital Technology Assessment Criteria (DTAC). The DTAC provides a ‘baseline assessment criteria that validates the suitability and function of digital health technologies for use by the NHS, social care staff or directly by citizens’.
Reflecting the regulatory landscape that MH apps inhabit, the DTAC focuses on assessing five key aspects of a service: clinical safety, data protection, technical assurance, interoperability, and usability/accessibility. Covering these aspects helps to ensure that any apps used in conjunction with NHS services have suitable clinical foundations, but also that they handle patient data responsibly and are sufficiently robust from a technological perspective.
While the DTAC scheme is still in its early stages, it is clear that it and the NHS initiative on digitisation, alongside established frameworks such as the NICE evidence standards framework, have significant potential to further boost the developing market for MH apps, and in doing so may also help to frame a responsible regulatory standard within the market.