In 10 years, I think, almost all cars produced will be autonomous.
It will be like having a horse. People have horses, which is cool. There will be people who have
non-autonomous cars, like people have horses. It would just be unusual to use that as a mode of transport.
- Elon Musk, Tesla CEO, National Governors Association Summer Meeting, 2017
It wouldn't be the first time the common law turned to equestrian custom for key insights on tort liability standards. Entire works have, in fact, been written on the topic, like Clifford Pannam's The Horse and the Law. It is therefore no surprise that a significant amount of case law exists on the topic in Canada: people getting kicked by horses, the care veterinarians give to horses, or, perhaps most commonly, horses being spooked by automobiles and causing damage to persons or property.
Putting elevators and autopilots to the side for a moment, how—if at all—might the common law's treatment of horses be useful for creating liability standards for new and emerging technologies, say autonomous vehicles? Where—as in Canada—no court has yet attributed liability for damages involving autonomous (i.e. self-driving) vehicles, the common law's evolving and incremental nature invites jurists to draw parallels with similar, previously decided sets of facts: enter horse, elevator, and autopilot case law.
Autonomous vehicles in Canada
Autonomous vehicles are computer-controlled vehicles able to drive themselves without a human operator. As Elon Musk observes above, autonomous vehicles will likely become more common—perhaps even dominant—in the not-too-distant future, given the host of safety and efficiency benefits they are intended to bring.
However, their adoption will inevitably be linked to injuries, whether to persons or property—as has already been the case in Arizona and Florida, where a pedestrian and a driver, respectively, were killed in accidents involving autonomous vehicles. While under conventional tort law principles drivers typically have a duty to exercise reasonable care, "[a]utonomous vehicles raise new liability questions on the road because the vehicles themselves can act negligently, independent of the human driver's intentions." Who, then, should the law hold responsible for compensating injuries involving autonomous vehicles?
Neither Canada nor Ontario has adopted comprehensive legislation targeting autonomous vehicles and the potential liability issues they raise. In 2016, Ontario's Ministry of Transportation launched a 10-year pilot program allowing the testing of autonomous vehicles on Ontario roads, and amendments to that program took effect in January 2019. While allowing certain autonomous vehicles on public roads, the policy makes clear that drivers remain ultimately responsible for the "full care and control of vehicles." It states:
"Automated vehicles equipped with SAE Level 3 technology that are available for public purchase in Canada can be driven on Ontario roads. These vehicles will no longer be restricted to registered pilot participants […] A human driver is required at all times to take back the driving task when alerted to do so by the vehicle. Drivers will need to be in full care and control of vehicles with SAE Level 3 technology and all existing laws (such as distracted, careless and impaired driving laws) will continue to apply to drivers of these vehicles. Drivers are responsible for the safe operation of these vehicles at all times."
Legal issues relating to autonomous vehicles
There are several legal issues relating to autonomous vehicles and the tort liability that could arise from them. At the outset, it is important to understand that injured parties looking to recover for injuries arising from an autonomous vehicle accident would likely claim against all of the parties who may have been responsible. The driver, represented by his or her insurance company, would likely be the low-hanging fruit.
Additionally, we would expect to see claims against (1) the manufacturer of the vehicle, (2) the creator of the vehicle's software, (3) the driver's "employer", if any (i.e. a taxi company or on-demand service), and even (4) the city or province for failing to take reasonable precautionary measures. It is, of course, no surprise that the latter parties would make attractive defendants owing to their deeper pockets. Therefore, not only would conventional tort law principles be relevant, but other legal principles relating to product liability, vicarious liability, and agency could also come into play.
As David King noted in "Putting the Reins on Autonomous Vehicle Liability: Why Horse Accidents Are the Best Common Law Analogy", absent any specific legislation governing autonomous vehicles, "these liability questions are expected to be answered through the incremental common law system [...] This means courts will draw analogies and distinctions between autonomous vehicle accidents and pre-existing case law precedent."
Most notably, these vehicles can be characterized as a manufactured form of personalized transport in which, as King describes, "at least one critical function of the vehicle's control is delegated to a computer." Similar—though imperfect—parallels can be drawn with horses, elevators, and autopilots.
Horses, for example, have historically been — and in some parts of the world continue to be — forms of personal transportation. What they share with autonomous vehicles is that the driver (rider) has only partial control over the vehicle (horse), which retains an ability to act independently of the driver. Likewise, both horses and autonomous vehicles can be dangerous, and there is a significant volume of Canadian case law on riders being sued for personal and property-related injuries involving horses acting "autonomously." Since horses themselves cannot be found liable, these cases often turn on conventional negligence principles: whether the risk of injury was reasonably foreseeable in the circumstances and, if so, whether the rider took reasonable care.
From the Ontario Ministry of Transportation's statement above, it appears that such a test would be relevant for assessing the liability of drivers of autonomous vehicles. Since drivers are ultimately responsible for the "full care and control of vehicles," a driver's failure to recognize foreseeably dangerous situations and act with all due care could give rise to liability (e.g. failure to respond to built-in warning signals or to maintain one's car or software).
The horse analogy is, of course, imperfect, but still has considerable relevance. Just as a horse may panic and ignore all efforts of the rider to control it, it is conceivable that an autonomous vehicle may malfunction and fail to accept manual control. Likewise, while horses were sometimes panicked by cars when both shared the streets, there may also be adverse interactions between autonomous and human-controlled vehicles due to conflicting behavioural expectations.
Elevators and autopilots (e.g. autopilot systems on aircraft) are two further examples. Both are related to personalized transport, and the failure of either can cause injury. As with autonomous vehicles, victims of such injuries would similarly look beyond the operator and sue third parties, such as manufacturers or software developers, owners, or even landlords under occupiers' liability.
As King notes, injuries resulting from elevator accidents in which the occupant had little or no control and did not contribute to the accident are often resolved under occupiers' and manufacturers' liability. Unlike drivers of autonomous vehicles, however, people using elevators generally have less control, and the risks they face are less foreseeable, making it difficult to attribute blame to them: there are generally fewer warning mechanisms, and users are restricted in their ability to observe risks originating outside the elevator.
On the other hand, injuries resulting from the failure of autopilot systems generally see liability attributed to operators, manufacturers, and owners, although there may be complex indemnity provisions among these parties. As King notes, however, it would be rare to find a case attributing liability exclusively to a manufacturer without also finding some negligence on the part of the operator. Still, autopilots are not perfect analogies, given the professional training that autopilot operators often receive and the complexity (i.e. twists and turns) of everyday driving as compared with the longer, more rigid routes generally flown on autopilot.
While analogizing from injuries involving horses, elevators, or autopilots sheds light on how liability could be attributed for injuries resulting from autonomous vehicles—most likely a mix of manufacturers' liability and negligence on the part of drivers—Canada will likely see specific case law emerge on these issues before any regulations. Some automotive manufacturers have even gone so far as to pledge to accept full liability whenever their cars are involved in accidents while in "autonomous mode," as Volvo has done. Such a pledge is not only a marketing tactic; it also stems from a wider belief that the autonomous vehicles of the future will be safe, reliable, and cause fewer accidents and injuries than today's human-operated vehicles.
As discussions surrounding the testing and regulation of autonomous vehicles intensify, manufacturers, public interest groups, and consumer groups would be well advised to take active or intervening roles in discussions and negotiations to promote their interests.