As loyal followers of our blog may have seen, I often write about “autonomous” or so-called “self-driving” cars and their legal implications. For a couple of recent examples, please see here and here. The technology – which has the potential to save lives, reduce pollution, and eliminate traffic jams – is not without detractors. I won’t weigh in on the debate about the pros and cons in this post — other than to note that an auto accident last week involving a “self-driving” vehicle in Tempe, Arizona generated some interesting headlines. Here are just a few, which were replete with photographs of a heavily damaged Ford Edge and a self-driving Volvo SUV on its side:

Uber has suspended its self-driving-car tests in Pittsburgh and Arizona after a big accident over the weekend;

Uber suspends self-driving car program after bizarre accident;

Uber Puts Brakes on Its Self-Driving Car Efforts Following a High-Speed Crash;

Word of a “big,” “bizarre,” and “high-speed” crash involving a self-driving car will certainly get your attention — it conjures up an image of a self-driving car that simply went “haywire.” Only after reading through the articles, however, does it become clear that the autonomous vehicle was not responsible for the crash, which, fortunately, did not result in any serious injuries. Indeed, the self-driving car had the right of way, and the crash occurred when a human-driven vehicle failed to yield while turning left. Thus, if anything, the accident in Tempe demonstrates the potential complexities that arise when autonomous and human-driven cars share the road. As one autonomous vehicle analyst explained:

“Robots don’t drive like humans . . . That’s a good thing. Humans are terrible at driving, but other humans know this and adjust our driving to account for what regular people would do on the road. There are many unwritten rules of driving that humans can quickly adjust to that robots will not, and this will lead to accidents.” The analyst further noted that if all cars on the road were autonomous, accidents would decline, but “[w]hile they are mixed together, the inflexibility of computers may lead to accidents that wouldn’t have happened before even as some other accidents are prevented.”

As I have reported in the past, automated vehicle technology has been advancing at a rapid pace, with significant government support from NHTSA. Whether accidents involving driverless cars — even those where the automated technology is not to blame — will erode that support remains to be seen. At a minimum, companies involved in the technology should assume that every “accident,” no matter how minor and regardless of fault, will be examined under the media’s microscope.