Who’s to Blame When Driverless Cars Have an Accident?
Thomas Edison has been quoted as saying, “I have not failed. I’ve just found 10,000 ways that won’t work.” As modern automotive technology continues to advance, we should remember that innovation always carries risk. Every new invention comes with a period of heightened risk and potential for errors, and the self-driving vehicle is no different. While this new technology is getting off the ground, lives are being lost in car accidents. Whether it is Google’s early experiments with self-driving technology or Uber’s failed attempt to put autonomous vehicles on the road in Arizona, crashes do occur.
Unlike a traditional car crash, determining liability in these accidents can be more challenging, as there may be a question as to whether driver error or a manufacturer defect caused the accident.
The Society of Automotive Engineers (SAE) outlines six automation levels:
- Level 0: No automation.
- Level 1: Driver assistance. A single feature, such as adaptive cruise control or lane-keeping assistance, helps the driver.
- Level 2: Partial automation. The vehicle can control steering and speed together, but the driver must still remain in control and monitor the road.
- Level 3: Conditional automation. The vehicle can drive itself under certain conditions, but the driver must be attentive and alert, ready to take over when prompted.
- Level 4: High automation. Within defined conditions, a driver is not required, but can take over if he or she wants to.
- Level 5: Complete automation. These vehicles may not even have a steering wheel or pedals.
According to the National Highway Traffic Safety Administration (NHTSA), human error causes about 94% of all auto accidents, which claim roughly 35,000 lives in the U.S. each year. The goal of automation is to save lives and improve the quality of automotive transportation.
Who is Responsible When Driverless Vehicles Crash?
There are generally three possibilities when there is a crash involving an autonomous car.
- Driver was in control of vehicle. In many instances, just because a vehicle has a setting that allows a driver to take a brief break from operating the vehicle, it does not mean the driver is always using the self-driving features. It is important to determine whether the vehicle was in self-driving mode at the time of the crash. If not, then human error is to blame.
- Driver was not in control of vehicle (driver to blame). If it is determined that the vehicle was operating in self-driving mode at the time of the collision, the next question is whether it should have been. Many crashes involving self-driving cars are due to a driver using the features inappropriately. For instance, a driver may take a nap while letting the vehicle take over. No vehicle currently on the market offers that level of automation; a driver must still remain alert and awake at all times.
- Driver was not in control of vehicle (car to blame). In other cases, the driver was alert and awake but unable to avoid a crash because of a vehicle malfunction. When this happens, the auto manufacturer could be liable for a defective navigation system or other safety controls that failed to prevent the crash.
Getting Help From Long Island Car Accident Attorneys When Hurt by Self-Driving Cars
The experienced Long Island personal injury lawyers of the Law Office of Cohen & Jaffe, LLP serve the communities of Long Island and nearby areas, including Nassau County, Suffolk County, and Queens. With more than 100 years of collective experience representing clients injured by the negligence of others, you can count on Cohen & Jaffe, LLP. Call or visit us online to schedule a confidential free consultation today.
For a free legal consultation, call 516-358-6900.