While just a few years ago self-driving cars seemed more like science fiction than reality, it now appears they will be regular fixtures on the road sooner rather than later. While Google is still developing its system, which uses advanced radar, GPS, and an array of cameras, the high-end electric car company Tesla has already managed to get its models on the road.
As our Boston car accident lawyers can explain, the problem arises when one of these autonomous cars crashes. Are we dealing with a typical car accident lawsuit in which a driver is at fault, or is the fault the car company's, so that the case must be filed as a products liability lawsuit? In the latter scenario, the plaintiff would be a foreseeable accident victim and could have a valid claim against the car manufacturer.
As it turns out, a Tesla did crash in 2016, in what was the first fatal self-driving car crash in U.S. history. According to a recent news article from Wired, the U.S. government is saying that the manufacturer is somewhat responsible for the crash, but not fully liable.
As noted in this article, the company that manufactured the Tesla Model S involved in the 2016 crash has repeatedly warned customers that the automation system is not perfect and that drivers must pay attention and be ready to take control of the vehicle immediately should something go wrong with the autopilot system. Tesla has also said the autopilot is only to be used on highways with clear markings on the road surface and clear entrance ramps and exit lanes.
In the 2016 accident, the car was on a normal road when a large truck turned left. The autopilot system, which was designed only for highway use, did not detect the truck, and the vehicle crashed into it. The occupant of the Tesla, who was not actively driving because the autopilot system was engaged, was killed in the collision.
The National Highway Traffic Safety Administration (NHTSA) conducted a full investigation and determined that the crash was not the fault of the company, since it had adequately warned drivers that the car was not to be put in autopilot unless it was on a highway and that human drivers must remain alert at all times, ready to take over in an instant should something go wrong.
Even though NHTSA concluded it was operator error, because the driver used the system where it was not designed to be used, the National Transportation Safety Board (NTSB) conducted its own investigation and found that the car maker was somewhat liable, because it designed the car in such a way that this type of accident could easily occur.
NTSB found that the fatal 2016 crash resulted from a combination of human error and insufficient safeguards against misuse of the system. Since the accident, the company has made alterations to the system. There is now a sensor that can determine whether the car is on a highway. If the system detects traffic and determines the car is not on a well-marked highway, it will give the driver three distinct warnings before slowing the car down and shutting the engine off.
This is what is known as a subsequent remedial measure. When a defendant learns of a problem after an accident has occurred, or already knew about the problem, and takes steps to correct the issue, the law does not consider that an admission of liability. Public policy favors this approach, because the alternative is for companies to do nothing and hope that another person is not injured. On the other hand, a negligently performed effort to fix an issue before the accident occurred may come into evidence, depending on the facts of the case.
If you are injured in an accident in Massachusetts, call Jeffrey Glassman Injury Lawyers for a free and confidential appointment — (617) 777-7777.
Additional Resources:
Tesla Bears Some Blame for Self-Driving Crash Death, Feds Say, September 13, 2017, By Aarian Marshall, Wired.
More Blog Entries:
Girl, 10, Struck And Killed While Standing Outside Broken-Down Car, July 23, 2017, CBS Boston