Musk says Autopilot was not in use in fatal Tesla crash in Texas

A Tesla logo on a Model S at a Tesla dealership in New York. Lucas Jackson | Reuters

On Monday, Tesla CEO Elon Musk tweeted a denial that his company’s automated driving systems were involved in a fatal accident in Spring, Texas.

Two federal agencies, the National Highway Traffic Safety Administration and the National Transportation Safety Board, are now investigating the crash.

Local police said in multiple press interviews that, according to their preliminary investigation, no one appeared to be behind the wheel of the 2019 Tesla Model S when it swerved off the road, hit a tree and caught fire.

Musk wrote in his tweet Monday, “Datalogs recovered so far show that Autopilot was not engaged and this car had not purchased FSD. In addition, standard Autopilot would require lanes to engage, which this street didn’t.”

Tesla markets its automated driving systems under the brand names Autopilot and Full Self-Driving, or FSD. It is also releasing a “beta” version of FSD software to some customers who have purchased the premium FSD option, which costs $10,000.

Neither Autopilot nor FSD can operate Tesla’s electric vehicles under all normal driving conditions, and the company’s owner’s manuals warn drivers to use the systems only with “active supervision.”

Autopilot, which is now standard in Tesla vehicles, doesn’t always identify lane markings correctly. For example, it can mistake sealed cracks in the pavement or bike lanes for lane markings.

The system can also be defeated or misused by drivers. A teenage driver recently demonstrated in a stunt video shared on social media that he could leave the driver’s seat while his Tesla’s Autopilot system remained engaged.
