This Alarming Video of Some Bad Tesla Autopilot Driving Can Really Help Make Autopilot Use Safer


Tesla, as usual, is generous about giving us plenty to talk about, especially when it comes to its Level 2 driver-assistance systems, confusingly known as Autopilot and/or Full Self-Driving (FSD). Yesterday there was a crash in which a Tesla on Autopilot hit a police car, and now a video of a mostly Autopilot-assisted drive through Oakland is making the rounds, drawing a lot of attention for the often baffling and/or just plain bad decisions the car makes. Strangely, though, it’s the system’s lousy performance that can help people use it more safely.

All of this comes on the heels of a letter from the National Transportation Safety Board (NTSB) to the United States Department of Transportation (USDOT) regarding the National Highway Traffic Safety Administration’s (NHTSA) Advance Notice of Proposed Rulemaking (ANPRM), in which the NTSB is basically asking what the hell (WTF) we should be doing about autonomous vehicle (AV) testing on public roads.

From that letter:

Because NHTSA has no requirements in place, manufacturers can operate and test vehicles virtually anywhere, even if the location exceeds the limitations of the AV control system. For example, Tesla recently released a beta version of its Level 2 Autopilot system, described as having full self-driving capability. By releasing the system, Tesla is testing highly automated AV technology on public roads, but with limited oversight or reporting requirements.

While Tesla includes a disclaimer that “currently enabled features require active driver supervision and do not make the vehicle autonomous,” NHTSA’s hands-off approach to AV testing supervision poses a potential risk to drivers and other road users.

At the moment, the NTSB/NHTSA/USDOT letter doesn’t propose solutions, but it does spell out something we’ve seen for years: Tesla and other companies are testing self-driving car software on the open road, surrounded by other drivers and pedestrians who never agreed to take part in any test, and in this beta software test, the crashes can be literal.

All of this provides good context for the Autopilot-in-Oakland video, the highlights of which can be seen in this tweet:

… and the full 13-and-a-half-minute video can be viewed here:

There is so much in this video worth watching if you’re interested in Tesla’s Autopilot/FSD system. The video appears to use the most recent version of the FSD beta, version 8.2, of which many other driving videos are available online.

There’s no question the system is technologically impressive; doing any of this at all is a huge achievement, and Tesla’s engineers should be proud.

At the same time, though, it’s nowhere near as good as a human driver, at least in many contexts, and yes, as a Level 2 semi-automated system, it requires a driver to be alert and ready to take over at any moment, a task people are notoriously bad at, and the reason I think every L2 system is inherently flawed.

While many FSD videos show the system being used on highways, where the overall driving environment is far more predictable and easier to navigate, this video is interesting because city driving comes with a much higher degree of difficulty.

It’s also interesting because the man in the passenger seat is such a constant and unflappable apologist, to the point that if the Tesla were to lunge and mow down a litter of kittens, he’d praise it for its excellent ability to track small targets.

Over the course of the Oakland drive, there are plenty of places where the Tesla performs just fine. There are also places where it makes some genuinely awful decisions: driving down the oncoming lane, turning the wrong way onto a one-way street, weaving around like a drunken robot, clipping curbs, or just stopping, for no apparent reason, right in the middle of the road.

In fact, the video is conveniently divided into chapters based on these, um, interesting events:

0:00 Introduction
0:42 Double-parked cars (1)
1:15 Pedestrian in crosswalk
1:47 Crosses solid lines
2:05 Disengagement
2:15 Chinatown
3:13 Dodges drivers
3:48 Unprotected left (1)
4:23 Right turn into wrong lane
5:02 Near head-on incident
5:37 Acting drunk
6:08 Unprotected left (2)
6:46 Disengagement
7:09 No turn on red
7:26 “Take over immediately”
8:09 Wrong lane; behind parked cars
8:41 Double-parked truck
9:08 Bus-only lane
9:39 Close call (curb)
10:04 Left turn; blocked lane
10:39 Wrong way!!!
10:49 Double-parked cars (2)
11:13 Stop-sign delay
11:36 Hesitant left turn
11:59 Near collision (1)
12:20 Near collision (2)
12:42 Close call (wall/fence)
12:59 Verbal assessment of drive/beta

That reads like the track list of a very weird concept album.

Nothing in this video, as objectively impressive as it is, makes the case that this machine drives better than a human. If a human did the things you see here, you’d be loudly asking what the hell was wrong with them, over and over.

Some situations are clearly things the software simply isn’t programmed to understand, such as the fact that parked cars with their hazard lights on are obstacles to be carefully driven around. Other situations result from the system misinterpreting camera data, overcompensating, or simply struggling to process the environment.

Some of the defenses offered in the video only help highlight the bigger issues:

The argument that many, many more crashes are caused by human drivers on any given day is deeply fallacious. Sure, there are many more, but there are also vastly more humans driving cars, and even if the numbers were equivalent, no automaker is trying to sell crappy human drivers.

Plus, the reminders that FSD is a beta only serve to bring us back to that acronym-laden NTSB letter: should we even be beta testing self-driving car software on public roads, unsupervised?

Tesla’s FSD is no safer than a normal human driver, which is why videos like this one, showing so many troubling FSD driving events, are so important and can save lives. These videos erode some confidence in FSD, which is exactly what needs to happen if this beta software is to be tested safely.

Blind faith in any L2 system is how you crash and maybe die. L2 systems give little to no warning when they need a human to take over, and a driver who doesn’t fully trust the system is far more ready to take control.

I’m not alone in suggesting this either:

The paradox here is that the better a Level 2 system gets, the more likely the people behind the wheel are to trust it, which means they pay less attention, which means they’re less able to take control when the system actually needs them.

That’s why most Autopilot crashes happen on highways, where a combination of generally good Autopilot performance and high speeds results in poor driver attention and reduced reaction time, which can lead to disaster.

All Level 2 systems, not just Autopilot, suffer from this, which is why they’re all rubbish.

While this video clearly shows that FSD’s basic driving skills still need work, that’s not what Tesla should be focusing on; it should be figuring out safe, manageable failover procedures, so that immediate driver attention isn’t required.

Until then, the best way to use Autopilot, FSD, Super Cruise, or any other Level 2 system safely is to watch all these videos of the systems screwing up, lose some faith in them, and stay a little tense and alert while the machine is driving.

I know nobody wants to hear this about autonomous vehicles, but the truth is they’re far from ready. It’s time to accept that and treat them accordingly if we ever want to make real progress.

Getting defensive and trying to explain away crappy driving doesn’t help anyone.

So, if you love your Tesla and Autopilot and FSD, watch that video carefully. Appreciate the good parts, but really accept the bad parts. Don’t try to make excuses. Watch, learn, and keep those screwups in mind when you’re behind the wheel and not actually steering.

It’s no fun, but at this stage of any technology like this, work is always needed, and work isn’t always fun.
