Although it is unclear when autonomous vehicles will hit the roads, what is clear is that this technology is coming. From automakers and suppliers to lawmakers and insurers, stakeholders will want to carefully consider the following issues and their legal implications before consumers take this technology for a spin.

Telematics Data Privacy and Security

Telematics systems generate a wealth of data, including location, direction, and speed of travel. And interest abounds in how that data can be used, such as for usage-based insurance programs and in crash-related investigations. With data come concerns about privacy: how is the data stored and for how long, who has access to it, and what constitutes consent for its collection? In November 2014, 19 automakers committed to a set of principles to protect data streamed or downloaded from onboard computers. This is a start, but further discussions will be needed as the technology evolves.

Privacy concerns are closely followed by those about data security — and there is no shortage of breaking news regarding data breaches. If your credit card is stolen, the card can be cancelled and replaced. It is much more complicated if your car is hacked. The U.S. military has successfully conducted tests on cars with telematics in which hackers were able to operate the trunk and windshield wipers, engage the brakes, and turn off the engine.

Manufacturers need to be prepared to take security to the next level in order to protect drivers from cyber thieves seeking to sell GPS coordinates and VINs to car thieves, and from attempts by hackers to commandeer vehicles. Protocols will need to be put in place to administer security patches, and responsibility for them should be spelled out in vendor contracts. Additionally, crisis-preparedness plans should be developed in the event that a breach occurs or a vulnerability is exploited.

Regulating Driverless Cars

Another gray area for manufacturers and suppliers lies in what changes may come in government regulations and safety standards for autonomous vehicles. These are complex issues, as the constitutional division of responsibility between state and federal government means regulatory changes will come from both fronts. California has taken the lead with the Department of Motor Vehicles’ new testing rules requiring that drivers be able to take “immediate physical control” of a vehicle in an emergency. These guidelines have already led to design changes for Google’s driverless car prototype. Other substantial issues include whether there should be federal standards regulating usage under different conditions, such as climate, terrain, and traffic density.

Automakers need to be prepared to stay ahead of the regulatory curve, as laws at the state and federal levels will likely change in tandem with technological developments. Being proactive and engaging lawmakers now may help avoid some of the disconnect between proposed regulations and driverless technology. It also is important to remember that even if a car meets federal standards, its design, production, or operation could still be subject to tort concepts of negligence. As such, product liability claims can arise from negligent design and defects.

Ethics (and Overrides) for Telematics Technology

Some of the most pressing ethical and legal issues surrounding autonomous vehicle technology relate to its real-world application. Can crash-avoidance software be programmed to make the decisions drivers are traditionally responsible for making? Telematics systems use sensors and electromagnetic waves to detect pedestrians, cars, and other obstacles. But when an emergency arises and the system must choose between two bad outcomes (e.g., swerve to avoid colliding with a car but potentially strike a pedestrian, or collide with the car to avoid the pedestrian), how will the computer respond, and who is liable for that decision?

The idea of driverless cars represents the ultimate in distracted driving — the lines between driver and passenger are blurred. States will need to determine the requirements for vehicle operation (e.g., will the vehicle operator still need to be a licensed driver, and will there be restrictions on usage relating to time of day, weather, or school zones?). NHTSA has not updated its policy on driverless vehicles since May 2013, when it stated that it does not recommend that states authorize the operation of self-driving vehicles for purposes other than testing.

Many of the questions regarding liability will likely be resolved through the courts. In the event of a crash, who is liable when the vehicle is not under the driver’s control? To what extent, and how quickly, was the driver able to override the system? Whether or not the driver overrode the system, is the technology provider or the car manufacturer responsible? Determining the assumption of risk will not be a cut-and-dried decision. Looking beyond the vehicle operator, investigators will closely examine product design, maintenance, and so forth, opening the door to potential recalls.