On May 7, 2016, Joshua Brown died when his Tesla Model S collided with a truck on a Florida highway. The crash was followed by endless controversy over the safety of the Autopilot system. After months of investigation, Tesla's driver-assistance system was cleared, but the National Transportation Safety Board returned to the matter to clarify a few points.

Tesla Autopilot

Tesla's Autopilot contributed to the fatal crash of the computer-assisted electric car in May 2016 because the software's safeguards were too lax. That is the assessment of U.S. accident investigators at the NTSB, who announced their findings this week.

Tesla said it would review the investigators' recommendations. The NTSB (National Transportation Safety Board) investigates airplane crashes and other accidents across the entire U.S. transportation system. Based on its investigations, it issues recommendations, including ones that shape how such systems are developed.

Tom's Guide pointed out that one of the main recommendations is for drivers of Autopilot-equipped cars to stay alert even when the driving assistant is switched on: "it's critical that drivers understand its limitations and heed all warnings."

The death

The 40-year-old driver died in the accident in early May 2016, when his Autopilot-controlled Tesla crashed into a truck on a highway.

Wired reported that "when a tractor trailer turning left crossed into the Model S's lane, the system did not recognize it—and the car crashed into its side, killing Brown instantly."

The Tesla driver relied too heavily on the driving assistant. The Autopilot system had worked as the American carmaker described, but it was not designed for this situation.

Tesla said that the driver had not been paying attention to traffic, even though the company requires drivers to keep an eye on the road at all times while Autopilot is engaged.

According to an NTSB report, the software had prompted the Tesla driver to put his hands on the wheel. After the accident, Tesla tightened its safeguards, making it impossible to keep your hands off the steering wheel for an extended period.

The company had always stressed that the Autopilot assistant does not make a Tesla a self-driving car. Nevertheless, some drivers left the system fully in control, as YouTube videos showed.

After the fatal crash

Immediately after the accident, Tesla made hands-on-the-wheel checks mandatory, and it recently appears to be adding facial recognition as well to verify driver attention. According to the NTSB, Tesla's fault was allowing Autopilot to be used outside the conditions it was designed for, which require due and constant attention from the driver.