The National Transportation Safety Board (NTSB) has released its preliminary findings from its investigation of a Tesla Model X crash that occurred while the vehicle was in Autopilot mode.
The report identifies extenuating circumstances, though these relate to the death of the driver rather than to the autonomous driving system, and its findings could have damaging repercussions for the development of autonomous vehicles. In the three seconds before the crash, the vehicle, under the control of Autopilot, accelerated from 62 to 70.8 mph, with no evidence of pre-crash braking or evasive steering detected.
“The NTSB continues to work with the California Highway Patrol and the California Department of Transportation to collect and analyse data, including all pertinent information relating to the vehicle operations and roadway configuration,” the report states. “All aspects of the crash remain under investigation as the NTSB determines the probable cause, with the intent of issuing safety recommendations to prevent similar crashes.”
Looking through the recorded performance data, the Autopilot system was engaged on four separate occasions during the 32-minute trip, including continuous operation for the final 18 minutes 55 seconds prior to the crash. During this period, the vehicle issued two visual alerts and one auditory alert for the driver to place his hands on the steering wheel, all made at least 15 minutes prior to the incident. Eight seconds prior to the crash, the Tesla was following a lead vehicle and travelling at about 65 mph before it began a left steering movement. From 3 seconds prior to the crash up to the time of impact with the crash attenuator, the Tesla's speed increased from 62 to 70.8 mph, with no pre-crash braking or evasive steering movement detected. In the final six seconds, the vehicle did not detect the driver's hands on the steering wheel.
Tests of autonomous vehicles are still required to have a human in the driver's seat, and Tesla may well point to the driver's inaction as the cause of the crash rather than a failure of its system. Even so, the report does little to reinforce the insistence that autonomous vehicles are safer than human-operated ones. That claim has been the fuel powering the campaign to normalise autonomous vehicles among the general public, though a few incidents in recent months have set back this mission.
Aside from this incident, an Uber vehicle in self-driving mode struck a woman walking her bicycle across a road in Arizona in March. Since that incident, Uber has suspended all self-driving projects in the state.
The development of autonomous vehicles is a sensitive undertaking, as the technology is strongly dependent on trust. Regulators and users have to trust the technology to make safe and effective decisions in a timely and logical manner. Removing the element of control from the driver is a big step for everyone involved, and any incident involving an autonomous vehicle will put a considerable dent in the progress of the technology.