this post was submitted on 26 Apr 2024
97 points (96.2% liked)

RealTesla


In total, NHTSA investigated 956 crashes, starting in January 2018 and extending all the way until August 2023. Of those crashes, some of which involved other vehicles striking the Tesla vehicle, 29 people died. There were also 211 crashes in which “the frontal plane of the Tesla struck a vehicle or obstacle in its path.” These crashes, which were often the most severe, resulted in 14 deaths and 49 injuries.

[–] [email protected] 13 points 1 year ago (23 children)

The article does a good job breaking down the issues with Tesla's Autopilot, including the fact that the name itself is misleading and that the system has some pretty significant flaws that give people a false sense of confidence in its capabilities.

But raw crash statistics are absolutely meaningless to me without context. Are 956 crashes and 29 deaths more or less than you would expect from a similar number of cars with human drivers? How do they compare to other brands' semi-autonomous driving systems?

Driving is an inherently unsafe activity, and journalists suck at conveying relative risk, probably because the average reader sucks at understanding statistical risk. But there needs to be a better way of comparing systems than just "29 people died."

[–] [email protected] -5 points 1 year ago* (last edited 1 year ago) (21 children)

NHTSA acknowledges that its probe may be incomplete based on “gaps” in Tesla’s telemetry data. That could mean there are many more crashes involving Autopilot and FSD than what NHTSA was able to find.

You seem to be pretending that these numbers are an overestimate. But the article makes clear that this investigation is, if anything, a gross underestimate: there are many, many more dangerous situations that "Tesla Autopilot" has been in.

Driving is an inherently unsafe activity, and journalists suck at conveying relative risk, probably because the average reader sucks at understanding statistical risk. But there needs to be a better way of comparing systems than just "29 people died."

These are 29 people who died while provably under Autopilot. This isn't a statistic; this was an investigation. Treating this number as a "statistic" shows that you're not fully understanding what NHTSA accomplished here.

What you want, a statistical measure of how often Autopilot fails, is... well... depending on the test, as high as 100%.

https://www.youtube.com/watch?v=azdX_6L1SOA

100% of the time, Tesla Autopilot will fail this test. That's why Luminar Technologies used a Tesla for their live demonstration at CES in Las Vegas: Tesla failed the test so reliably that it was the best vehicle to pair with their LIDAR technology as a comparison point.

Tesla Autopilot is an automaton. When you put it in its failure conditions, it will fail 100% of the time. Like a machine.

[–] [email protected] 5 points 1 year ago

I think they're looking for the number of accidents/fatalities per 100k miles, or something similar, to compare against human accidents/fatalities. That's a better baseline for determining how it performs comparatively.
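As a rough illustration, that per-mile comparison could be sketched like this. NHTSA usually reports fatality rates per 100 million vehicle miles traveled; only the 29-death count below comes from the article, while the Autopilot mileage and the human baseline are placeholder assumptions, not real NHTSA or Tesla figures.

```python
# Back-of-the-envelope rate comparison: fatalities per 100 million vehicle miles.
# The 29 deaths come from the article; every other number here is an
# ILLUSTRATIVE assumption, not real NHTSA or Tesla data.

def fatality_rate_per_100m_miles(fatalities: int, miles_driven: float) -> float:
    """Normalize a raw fatality count by exposure (miles driven)."""
    return fatalities / miles_driven * 100_000_000

# Assumed total miles driven with Autopilot engaged over the study window (placeholder).
assumed_autopilot_miles = 3_000_000_000

autopilot_rate = fatality_rate_per_100m_miles(29, assumed_autopilot_miles)

# Assumed human-driver baseline, in deaths per 100 million vehicle miles (placeholder).
assumed_human_baseline = 1.3

print(f"Autopilot (assumed exposure): {autopilot_rate:.2f} deaths per 100M miles")
print(f"Human baseline (assumed):     {assumed_human_baseline:.2f} deaths per 100M miles")
```

Without the denominator, the total miles actually driven with the system engaged, the raw death count can't be compared in either direction.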
