It happens too often. Someone spots a Tesla owner asleep while cruising down the highway, their car under the control of Tesla’s Autopilot driver assistance system. The next thing you know, it’s all over social media.
You may be wondering how Tesla was able to bring this product to public roads. Are there no regulations for such functions? Isn’t this a safety issue? According to a report from the Los Angeles Times, it really comes down to a lack of government oversight.
The Trump administration focused its efforts on rolling back fuel economy requirements, arguing that cars would become both cheaper and safer as a result. That didn’t happen, and it’s a mystery why Trump thought it would. One explanation is that he knew nothing about cars.
Sadly, fuel economy and emission control rollbacks were just about the only things Trump’s NHTSA did. The agency’s important regulatory oversight stalled for four years without an administrator at the helm, and now the Biden administration is left with a pile of neglected tasks to dig through. As the Times report makes clear, NHTSA has been pretty much hands-off when it comes to driver assistance systems, particularly Tesla’s misleadingly named Autopilot:
Officially, the National Highway Traffic Safety Administration discourages such behavior; it ran an awareness campaign last fall using the hashtag #YourCarNeedsYou. But its reach is no match for the marketing of Tesla itself, which recently said it will start selling a ‘Full Self Driving’ software package – a term it has been using since 2016 despite criticism and the caveats in the company’s own fine print – on a subscription basis starting this quarter.
That NHTSA has so far declined to confront Tesla directly over the issue should come as no surprise for an agency that took a hands-off approach to a wide range of cases under the Trump administration.
“Inactive,” said Carla Bailo, CEO of the Center for Automotive Research, summarizing NHTSA’s previous four years. “Asleep,” said Jason Levine, executive director of the Center for Auto Safety. “No direction,” said Bryant Walker Smith, a professor and expert on autonomous vehicle law at the University of South Carolina.
The agency went the full Trump term without a Senate-confirmed administrator, leaving deputies in charge. It launched several safety investigations into Tesla and other companies, but most of them have not been completed. “A huge pile of backlogs” awaits the Biden administration, said Paul Eisenstein, publisher of The Detroit Bureau news website.
While the NHTSA was absent on a number of issues, the lack of oversight of autonomous driving is perhaps the most glaring. The Times calls Level 2 autonomy the biggest safety challenge since Ralph Nader’s Unsafe at Any Speed. Silly Nader references aside, the Times does have a point.
How to deal with emerging autonomous driving technologies is a long-term issue. But one thing is certain: the way Tesla uses its customers as beta testers is raising alarm bells among experts.
Whoever takes the lead must weigh the long-term potential for next-generation cars to reduce pollution, traffic and greenhouse gases against the short-term risks of deploying buggy technologies on a large scale before they are fully vetted. In Silicon Valley’s ‘move fast and break things’ style, Tesla Chief Executive Elon Musk has embraced those risks.
While other driverless car developers – from General Motors’ Cruise, to Ford’s Argo AI, to Amazon’s Zoox, to Alphabet’s Waymo, to independent Aurora and more – are all following a slow, step-by-step rollout with professional test drivers at the wheel, Tesla is “beta testing” its self-driving technology on public roads with its customers as test drivers.
Musk said last month that by the end of this year, Tesla cars will be able to drive completely on their own without human intervention on public roads. He has made similar promises since 2016. No auto expert or automotive industry leader outside of Tesla has said they think that’s possible.
While law professor Smith is impressed with Tesla’s “brilliant” ability to use Tesla drivers to collect millions of miles of sensor data to fine-tune the software, “that’s no excuse for the marketing, as this is in no way fully self-driving. There are so many things wrong with that term. It’s ridiculous. If we can’t trust a company when they tell us a product is fully self-driving, how can we trust them when they tell us a product is safe?”
The Detroit Bureau’s Eisenstein is even tougher. “Can I say this unofficially?” he said. “No, let me say it on the record. I am shocked by Tesla. They opt for the smartphone approach: put the technology out there and see if it works or not. It’s one thing to release a new iOS that causes speech dictation issues. It’s another thing to have a problem while moving at 60 miles per hour.”
A late 2016 NHTSA directive under the Obama administration considered “predictable abuse” a possible defect in the design of autonomous driving systems. Unfortunately, NHTSA did nothing with it under Trump. For context, the guideline came about a year after the software enabling Autopilot driver assistance in the Tesla Model S was released.
The NHTSA’s inaction drew ire from another federal safety agency, the National Transportation Safety Board. The NTSB – best known for its investigations of plane and train incidents – blamed predictable abuse for a 2018 crash in which a Tesla Model X slammed into a concrete barrier.
Part of the problem is Musk and Tesla’s lack of transparency about how safe the Autopilot driver assistance system actually is, as well as a lack of data in general. From the Times:
Musk regularly publishes statistics claiming that Autopilot and Full Self Driving are, on balance, safer than human-driven cars. That could be, but even if Musk’s analysis is valid – several statisticians have said it isn’t – the data is Tesla’s property, and Tesla has refused to make even anonymized data available to university researchers for independent confirmation. Tesla could not be reached – it closed its media relations department last year.
***
In 2019, after a series of Tesla battery fires, NHTSA launched an investigation into the company’s software and battery management systems. The agency later said that reportedly faulty cooling tubes that could cause leaks were also being investigated. At the time, the agency did not make public any information about leak-prone battery coolers installed in early versions of the Model S and Model X.
Since late 2016, many Tesla drivers have complained about “whompy wheels” on their cars – a tendency for the suspension system to fall apart, sometimes causing a wheel to collapse or fall off the car. Chinese drivers filed similar complaints, and last October, Chinese authorities issued a recall of 30,000 Model S and Model X cars. A Tesla attorney wrote a letter to NHTSA claiming a US recall was unnecessary and blaming driver abuse for the troubles in China. NHTSA said in October that it was “closely monitoring the situation.”
Four days before Biden’s inauguration, NHTSA announced that defects in Tesla touchscreen hardware could, among other things, blank the car’s rearview camera. Rather than ordering a recall, NHTSA said it had asked Tesla to voluntarily recall about 158,000 Model S and Model X cars for repair. On February 2, Tesla agreed to recall 135,000 of those cars.
Check out the full Los Angeles Times report; it is well worth reading!