Autonomously operated vehicles are a rapidly advancing technological reality, but the era of the mass-market self-driving car has not yet dawned. With the recent death of a Tesla driver whose vehicle was operating in Autopilot mode, hard questions are being directed at the developers of these automobiles.
Designers are confronting difficult questions about the safety of this technology, and they are being forced to ask themselves whether these vehicles are safe enough for mass consumer use on public roadways. Many of the researchers working on the technology say yes, but they acknowledge that a truly autonomous vehicle is not yet available to drivers. According to the Society of Automotive Engineers (SAE), the cars currently for sale on the market are not fully autonomous.
The misconception that truly autonomous cars are already available for roadway use is widespread among the general public. The distinction matters because each level of driving automation places different requirements on the driver. There are several defined levels of vehicle automation, and understanding the differences between them can help advance progress toward a safe self-driving vehicle.
The Tesla accident being investigated by the National Highway Traffic Safety Administration should help improve the science and research being conducted in this field. The accident, which led to the death of an Ohio man, occurred when his car struck a tractor-trailer that was crossing the roadway to make a turn.
While Tesla has acknowledged that the accident may have been the fault of the onboard computer the car uses to make driving decisions, the driver's own responsibility remains critical to the safe operation of these vehicles. In this crash, both the driver and the computer failed to anticipate the movement of the tractor-trailer. The driver, in this instance, was still required to perform the "dynamic driving tasks" his vehicle demanded of him.
These autonomous vehicles are capable of steering, accelerating, and decelerating the car, but they also demand that the human driver maintain a level of awareness of the driving conditions that allows them to take over at any given moment. Today's dependence on this technology is premature, and the level of competence attributed to these vehicles is unwarranted. The fact that the cars can avert crashes in some cases does not mean they will catch every instance of danger.
In the month before the fatal crash, the same driver was nearly involved in a similar accident with a boom-lift truck that was moving across lanes. The car was able to steer away and avoid the truck, and a video of the incident was lauded as a success for autonomous driving.
This false sense of security has given drivers the impression that these vehicles are more self-reliant than the evidence shows. A study conducted by Stanford University identified driver disengagement as the most dangerous side effect of autonomous vehicles. Simulations have highlighted the threat: when drivers have been disengaged from controlling the vehicle for an extended period, their reaction times are dangerously delayed.
The fatal crash of the Tesla Model S may not be indicative of an inherent lack of safety in autonomous vehicles. It was the first fatality recorded after more than 130 million miles of Autopilot driving, but it does suggest that more communication needs to happen between manufacturers and drivers.
Those who choose to operate these vehicles must understand that they are ultimately responsible for the safe operation of their car. If drivers hope to avoid a motor vehicle accident when operating a self-driving car, they must recognize that they cannot disengage from their driving environment even for a moment.
There are occasions when motor vehicle accidents are not avoidable, but if you believe you were involved in a crash caused by driver negligence, you should contact the attorneys at the Law Offices of Cohen and Jaffe, LLP. We have offices on Long Island, and we focus on personal injury and motor vehicle accident cases.