The tragic accident in which Joshua Brown was killed while his Tesla Model S was operating in Autopilot mode, its sensors having failed to differentiate between the side of a lorry trailer and the brightly lit sky, has inevitably sparked fresh discussion about who is to blame for such an accident.

While this case does not involve a purely autonomous vehicle, the Tesla incident is a topical example of the liability risks and issues that will arise as driverless cars become a reality. Liability regimes are inherently complex and the law has often been slow to adapt to new technologies; a good example is existing data privacy law in the age of the internet and cloud computing. Whilst liability for harm done by driverless cars is not currently specifically regulated by statute in the UK, in our view it is likely that, in the short term at least, it can generally be addressed by the current liability framework.

This post provides a brief summary of the types of legal liability that might apply to manufacturers in Tesla’s position if (or, more likely, when) such an incident takes place in the UK: from common law negligence and product liability through to criminal and contractual liability, depending on the particular circumstances.

The law of negligence has adapted and evolved over time to capture new concepts through case law and precedent. The traditional three-part test for liability in negligence requires that (1) a duty of care is owed to the person who has suffered the harm, (2) the harm was caused by the defendant, and (3) the harm was foreseeable. It is likely that limb (1) is satisfied, since Tesla owed the victim a duty of care, but it is not clear under limb (2) that Tesla itself “caused” the harm.

Clearly the Model S is a highly complex machine dependent on an ecosystem of contributors in the design process. If Tesla manufactured the offending sensor then it may be said to have “caused” the harm, but if Tesla procured the sensor from a third party, the chain of causation may be broken. Indeed, the complex nature of such a machine means that identifying the root cause may be similarly difficult. The principle of contributory negligence also recognises the role played by the victim himself if he did not use the Autopilot feature in the way it is intended to be used: Tesla describes the system as "an assist feature that requires [the user] to keep [their] hands on the steering wheel at all times" and tells users to "maintain control and responsibility for [their] vehicle". While a claim in negligence could clearly be pursued, we can also see how the complexity and novelty of automated systems might cause it to fail.

Alternatively, if the accident was caused by a defective part (here, the sensors), it will likely be treated as a product liability issue, imposing strict liability on the manufacturer, i.e. Tesla or the third party from which it procured the sensor. The Product Liability Directive (85/374/EEC), implemented in the UK by the Consumer Protection Act 1987, imposes strict liability on the manufacturer of a defective product where the product is not as safe as people are generally entitled to expect. This regime may well apply to autonomous vehicles, especially if they are marketed as “driverless cars”, which is a further commercial consideration for manufacturers. However, a manufacturer may avoid liability under this legislation if the state of scientific and technical knowledge at the time the product was put into circulation was not such as to enable the existence of the defect to be discovered. This defence may apply to cases arising from the introduction of autonomous vehicles into everyday life, and it mirrors the third (foreseeability) limb of a negligence claim, as set out above.

Legislators may decide to develop criminal sanctions for manufacturers who do not comply with suitable standards of safety and care when designing and manufacturing autonomous vehicle technology. Under the Corporate Manslaughter and Corporate Homicide Act 2007, a corporation convicted of such an offence may be ordered to publicise its failings and be given an unlimited fine, with a guideline starting point of 5% or 10% of turnover depending on the company’s plea.

If the defect is discovered before any such accident, the Consumer Rights Act 2015 and the common law of contract give consumers protection and rights to damages, repair and replacement. If not, while Tesla and/or its third parties are likely to have primary responsibility to the victim, they may have scope to bring contractual claims against members of their supply chain, for example the supplier of the component that caused the harm. This does not extinguish Tesla’s (or the third party’s) legal liability, but it could allow them to pass damages awards down to their supply chain, at least mitigating their financial loss.

Some commentators have raised the concept of assigning legal personality to the driverless car itself, recognising that autonomous machines can cause damage independently of their manufacturer; liability would then rest with the car rather than the manufacturer. While an interesting concept, we think assigning legal personality to machines is some way off and, in any event, it simply sidesteps the issue of who is ultimately liable (which, as the law stands, still needs to be a person or organisation capable of paying damages or serving a criminal penalty).

What is clear is that where traditional legal principles strain to assign liability to those we feel should, at least morally, be held responsible, regulators must ensure an effective legal remedy for victims of harm caused by autonomous systems.