IBM's Watson, the natural-language processing computer perhaps most famous in popular culture for obliterating its opponents on a January 2011 evening of Jeopardy!, has more recently been making news in the healthcare industry. A growing number of care providers are buying or renting Watson's computing power to assist their medical professionals and staff with decision-making. The possibilities for using big data to improve outcomes for patients are myriad and exciting.
With these new tools come new responsibilities. As contextual computing devices become more widely available and, presumably, less expensive and more conveniently sized, how will health providers' standard of care change in response? How can the healthcare industry use this technology to improve care and save lives while also managing the potential liability exposure that comes with reliance on massive amounts of data from myriad sources that no single human brain could ever contextualize or verify? Many important questions spring from this change - questions revolving around patient privacy, managing inputs, ethics in research, data protection and the potential for breach, and how to use the cold numbers of statistics to assess and improve the unquantifiable goal of "better quality of life." For health systems' risk managers and in-house counsel, innovation also brings new concerns about managing potential liability. The availability and use of big data analytics injects uncertainty into some already slippery concepts, including the "duty of care."
A court's determination of the standard of care in a medical malpractice action today varies from jurisdiction to jurisdiction, but it generally turns on the court's analysis of what a reasonably prudent doctor would do. This concept is fluid, and for good reason - the reasonably prudent doctor of 2015 obviously makes different choices from the reasonably prudent doctor of 1955, and the court system can use expert testimony and common sense to allow the standard of care to evolve as quickly as medicine does. The availability of systems like Watson will necessarily change the standard of care in medical malpractice actions: eventually, large health systems may be required to employ contextual computing in order to provide even the bare minimum of adequate care, especially as the technology becomes cheaper and easier to use. That much is virtually certain. What is harder to define is how human practitioners should engage with the technology. To what degree will a doctor be allowed to disagree with the computer?
Although not much is yet known, given the infancy of contextual computing, some concepts can be extrapolated. For example, a reasonably prudent doctor generally must read the patient's chart and consider the full range of available information when making treatment decisions. In Breeden v. Anesthesia West, P.C., 656 N.W.2d 913 (Neb. 2003), an anesthesiologist familiar with a patient's chart failed to check the nurse's latest notes just before putting the patient under anesthesia, and missed an important notation that might have changed his decision-making. The patient suffered brain damage and sued, and the anesthesiologist pointed the finger at the nurse, arguing that she should have brought the new notes to his attention. The court disagreed, holding that the anesthesiologist's duty to check the chart could not be delegated to another healthcare provider. This conclusion could plausibly be extended to non-human healthcare providers: a court could one day hold that while a doctor may use contextual computing to improve treatment decision-making, he cannot blame the computer for a mistake he could have prevented by reading the paper chart or asking questions of the nursing staff. In other words, a doctor could use a computer but would not be permitted to delegate to that computer the duty to know the patient's pertinent, available medical information.
On the other side of the coin, a large health system already has a duty to ensure that patient charts are complete. In Johnson v. Hillcrest Health Center, 70 P.3d 811 (Okla. 2003), labs were ordered and yielded concerning results, but those results were never entered into the patient's chart, and the patient later died. The court held that this failure to put pertinent information into the chart could constitute a breach of the standard of care. The hospital had a duty (in that case, one imposed by an Oklahoma statute) to enter test results, services rendered, and other relevant information into a treating patient's chart. The hospital argued that it should escape liability because it had a policy and training program in place that should have been followed. The court rejected this defense - the data was available, it never made it into the patient's chart, and the hospital could not shield itself from liability with its good policies and procedures. One day this concept could be extended to encompass a duty to include (and analyze) the massive amounts of data that are becoming available on most Americans. Already, some large health systems are using consumer data gleaned from credit card transactions and public records to create risk profiles on certain patients and to introduce lifesaving interventions earlier based on predictive models. As data brokers begin to offer data sets that paint a picture of the "whole patient," it may become harder for a "reasonably prudent" doctor to exclude that information from the patient's digital "chart" or to ignore it when making treatment decisions.
The "standard of care" in medical malpractice is often defined in court by experts in the field. As healthcare providers grapple with the integration of contextual computing into their care models, different jurisdictions will have to adapt the definition of the reasonably prudent doctor based on what those experts say. It is even possible that one day a natural-language processing computer like Watson may itself be the expert called to assist a court in determining the standard of care. Watson's move from the Jeopardy! podium to the witness stand is not inevitable, as juries may continue to prefer a human witness. What is inevitable is contextual computers' move into the treatment rooms and health centers of America. As health systems integrate these computers into patient care models, the concept of the reasonably prudent doctor must remain at the forefront of their risk management decision-making.