this post was submitted on 05 Jul 2025
703 points (96.8% liked)


The car came to rest more than 70 metres away, on the opposite side of the road, leaving a trail of wreckage. According to witnesses, the Model S burst into flames while still airborne. Several passersby tried to open the doors and rescue the driver, but they couldn’t unlock the car. When they heard explosions and saw flames through the windows, they retreated. Even the firefighters, who arrived 20 minutes later, could do nothing but watch the Tesla burn.

At that moment, Rita Meier was unaware of the crash. She tried calling her husband, but he didn’t pick up. When he still hadn’t returned her call hours later – highly unusual for this devoted father – she attempted to track his car using Tesla’s app. It no longer worked. By the time police officers rang her doorbell late that night, Meier was already bracing for the worst.

top 50 comments
[–] [email protected] 171 points 1 week ago (11 children)

If we lived in any sort of reasonable or responsible world then these cars would be banned from public roads all over the globe.

[–] [email protected] 76 points 1 week ago (2 children)

And Tesla would be fined and sued into oblivion.

[–] [email protected] 53 points 1 week ago (1 children)

And the people who knowingly put profits before lives would individually serve time for manslaughter.

[–] [email protected] 36 points 1 week ago (14 children)

Call me a Luddite but I won't ride in a "self driving" car. I don't even trust lane assist although I've never had a car with that feature.

I think my sweet spot is 2014 for vehicles. It's about 50/50 with the tracking garbage and the "advanced features" on those models but anything past 2015 seems to be fully fly-by-wire and that doesn't sit right with me.

I'm old though and honestly if I bought a 2014 right now and babied it as my non commuter car I could probably keep it until I should give up my keys. You younger people are going to have to work around all this crap.

[–] [email protected] 14 points 1 week ago (11 children)

I have a Sprinter van with lane assist for cross-country travel. As obnoxious as it is 99% of the time, it has come in clutch a few times when I started to get drowsy and drifted out of my lane.

[–] [email protected] 13 points 1 week ago (4 children)

I hear you, but a 99% chance of being obnoxious isn't a great review.

I think I'll just stick to not driving when tired.

[–] [email protected] 14 points 1 week ago (3 children)

That's easier said than done. You can't judge your own behavior when impaired because you are impaired. By the time you are aware you are that tired, you've already been impaired for a long time.

[–] [email protected] 10 points 1 week ago (15 children)

I've never had any issue with the lane assist in my Mitsubishi. It's absolutely built as an "assist" and not something that will actually try to take control from you. It's trivial to "overpower" it manually and turn out of your lane without signaling if that's what you want to do, but it does a perfectly reasonable job of steering on its own when left to its own devices.

That said, I wouldn't be driving a vehicle new enough to have the feature yet either if I hadn't been rear ended a couple of years ago and had my 2012 Lancer written off. :(

[–] [email protected] 126 points 1 week ago* (last edited 1 week ago) (3 children)

The article does not actually answer why Tesla vehicles crash as much as they do or how their crash frequency compares to other vehicles. It's more about how scummy Tesla is as a company and how it withholds data from the public when it could incriminate them.

[–] [email protected] 65 points 1 week ago

In some ways that is the answer. Crashes keep happening because Tesla is not being held accountable by regulators: it is not reporting these incidents, and no one is exercising oversight to be sure the reporting matches reality.

I think that, over the years, manufacturers have reported accurately because they generally do not want to be known as the car company that killed a child when a 50-cent bolt could have prevented it. As a result, regulators have been less hawkish. Of course, there are probably political donations in the US to help keep the wheels turning.

[–] TimewornTraveler 18 points 1 week ago* (last edited 1 week ago) (2 children)

Just scanning the article, it seems to sum it up as: no one knows why yet, not even Tesla.

[–] [email protected] 18 points 1 week ago (1 children)

With a dash of - Tesla might know and be withholding information

[–] [email protected] 68 points 1 week ago (6 children)

Tesla tried to do it all at once instead of perfecting the electric tech first and then incrementally adding on advances. They also made change for change’s sake. There’s absolutely no reason mechanical door locks could not have been engineered to work on this car as the default method of opening and closing the door. It’s killing people.

[–] [email protected] 26 points 1 week ago (6 children)

There's absolutely a reason to not engineer something you're not required to. It's called capitalism. Tesla cut every corner they could.

[–] [email protected] 17 points 1 week ago (2 children)

No, the problem is that they engineered something they didn't need to, because Musk thinks everything should be electric because it's cool. They then had to engineer a mechanical release anyway, because it was required by law (for good reason).

Mechanical door locks would have been cheaper. The fly-by-wire in the Cybertruck is far more expensive, heavier, and far more dangerous than the very well-polished power steering systems every other car uses.

Maybe it's something like they wanted to make more money on repairs or something... But even that they could've done better by starting from very common, cheap technology.

Let's be clear... the real problem here is that Elon Musk, opinion-having idiot that he is, made decisions from on high with very little understanding of engineering.

[–] [email protected] 14 points 1 week ago (1 children)

Elon : some of you will die, but that is a sacrifice I'm willing to make.

[–] [email protected] 22 points 1 week ago (14 children)

Also, the fact that they removed lidar sensors and base their self-driving on cameras alone is plainly stupid.

[–] [email protected] 63 points 1 week ago (1 children)

This is the kind of shit that makes me worried even seeing someone else driving one of these deathtraps near me while I'm driving. They could explode or decide to turn into me on the highway or something. I think about this more than I think about Final Destination when I see a logging truck these days.

[–] [email protected] 29 points 1 week ago

It's one of those rules you make for yourself when you drive...

Like no driving next to people with dents...

Or

Stay away from trucks with random shit in the back not strapped down ...

No driving near new cars: either they're brand new or they're a replacement because someone got into an accident, so best just be safe...

So

No driving near a Tesla...

[–] [email protected] 41 points 1 week ago

Wait, I might know the answer. Is it because they don't use LIDAR and they're made by a company headed by some piece of shit who likes to cut costs? Haha, I was just guessing, but ok.

[–] [email protected] 39 points 1 week ago

You can choose not to drive bleeding edge technology, but sadly you have no choice in whether to share the road with it.

[–] [email protected] 35 points 1 week ago

I drive a BMW i4 and one of the reasons I prefer it is because it still uses a number of mechanical options like physical buttons and an actual door handle. I never trusted that flush handle from Tesla, even back when I liked Tesla.

[–] [email protected] 34 points 1 week ago (2 children)

I have never ridden in a Tesla, and from now on I plan on requesting a non-Tesla car when I have to take a taxi.

Cars in general, and Teslas in particular, should have a standardized black-box data recorder whose logs third parties can open and read; we have had this kind of tech on aircraft for many decades.

It is terrifying that Tesla can just say that there was no relevant data and the investigative agency will just accept that.

I remember watching an episode of Air Crash Investigation where a plane crashed and no immediate cause could be found. But the flight data recorder could be analysed far back, well before the accident flight, and investigators noticed that a mount for the APU turbine had broken many flights earlier; the APU had broken free during the flight, causing the crash.

It is not Tesla's job to tell the investigators what is relevant and what is not; it is Tesla's job to unlock all the data it has and send it to the investigators. If it can't or won't, then Tesla should lose the right to sell cars in Europe.
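To make this concrete, here is a rough sketch (Python; every field name is invented by me for illustration and not taken from any real standard) of the kind of append-only, plain-text record a mandated black box could write, readable by any third party without the manufacturer's cooperation:

```python
# Purely illustrative sketch of a standardized EDR (event data recorder) record.
# The field names are invented for this example; a real standard would define them.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class EdrRecord:
    timestamp_utc: float      # seconds since the Unix epoch
    speed_kmh: float
    throttle_pct: float
    brake_pct: float
    steering_deg: float
    autopilot_engaged: bool
    doors_locked: bool

def append_record(path: str, record: EdrRecord) -> None:
    # Append-only, plain-text JSON lines: any investigator can read the log
    # without proprietary tooling or the manufacturer's cooperation.
    with open(path, "a", encoding="utf-8") as log:
        log.write(json.dumps(asdict(record)) + "\n")

append_record("edr.log", EdrRecord(
    timestamp_utc=time.time(),
    speed_kmh=112.0, throttle_pct=0.0, brake_pct=85.0,
    steering_deg=-4.2, autopilot_engaged=True, doors_locked=True,
))
```

The point isn't the exact fields, it's that the format is boring and open on purpose, so nobody has to ask the manufacturer what counts as "relevant".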

[–] [email protected] 17 points 1 week ago

Cars do have that in what amounts to a TCU or Telematics Control Unit. The main problem here isn't whether or not cars have that technology. It's about the relevant government agency forcing companies like Tesla (and other automakers) to produce that data not just when there's a crash, but as a matter of course.

I have a lot of questions about why Teslas are allowed on public roads when some of the models haven't been crash tested. I have a lot of questions about why a company wouldn't hand over data in the event of a crash without the requirement of a court order. I don't necessarily agree that cars should be able to track us (if I buy it, I own it, and nobody should have that kind of data without my say-so). But since we already have cars that do phone this data home, local, state, and federal government should have access to it. Especially when insurance companies are happy to use it to assign blame in the event of a crash so they don't have to pay out an insurance policy.

[–] [email protected] 25 points 1 week ago* (last edited 1 week ago) (1 children)

Seems like a lot of this technology is very untested, and there are too many variables, to the point where it should not be out on the roads.

[–] [email protected] 17 points 1 week ago (1 children)

Move fast and break things, but it's a passenger vehicle on a public road.

[–] [email protected] 15 points 1 week ago

It's been a nightmare seeing tech companies move into the utility space and act like they're the smartest people in the room and the experts that have been doing it for 100 years are morons. Move fast and break things isn't viable when you're operating power infrastructure either. There's a reason why designs require the seal of a licensed engineer before they can be constructed. Applying a software development mentality to any kind of engineering is asking for fatalities

[–] [email protected] 23 points 1 week ago (2 children)

Bad code. Guinea pig owners. Cars not communicating with each other. Relying on just the car’s vision and location is stupid.

[–] [email protected] 20 points 1 week ago (2 children)

Also, not only do they rely on "just vision", crucially they rely on real-time processing without any memory or persistent mapping.

This, more than anything else is what bewilders me most.

They could map an area, and when a car observes a construction hazard, save that data and share it with other vehicles so they can account for it when setting a route, or at least anticipate the obstacle. But they don't. If a car drives past a hazard and goes around the block, it has to figure out how to navigate the same hazard again with no familiarity. That's so foolish.
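For what it's worth, persistent shared hazard maps aren't exotic. Here's a toy sketch (Python; everything in it is made up by me and is not a description of how any vendor's system actually works) of hazards being stored under coarse location cells so another car can query them later instead of re-detecting them:

```python
# Toy sketch of a shared, persistent hazard map: a hazard observed by one car
# is stored under a coarse grid cell so other cars can look it up later.
# Illustrative only; not how any real fleet system is implemented.
from collections import defaultdict

CELL_SIZE_DEG = 0.001  # roughly 100 m grid cells at mid-latitudes

def cell_key(lat: float, lon: float) -> tuple[int, int]:
    # Quantize coordinates into the grid cell used as the map key.
    return (int(lat / CELL_SIZE_DEG), int(lon / CELL_SIZE_DEG))

class HazardMap:
    def __init__(self) -> None:
        self._cells: dict[tuple[int, int], list[str]] = defaultdict(list)

    def report(self, lat: float, lon: float, description: str) -> None:
        # Called when a car observes a hazard (e.g. a construction barrier).
        self._cells[cell_key(lat, lon)].append(description)

    def lookup(self, lat: float, lon: float) -> list[str]:
        # Called by any car planning a route through this cell.
        return self._cells[cell_key(lat, lon)]

shared_map = HazardMap()
shared_map.report(48.1374, 11.5755, "construction barrier blocking right lane")
print(shared_map.lookup(48.1374, 11.5755))  # a second car sees it without re-detecting it
```

A real system obviously needs expiry, trust, and better positioning, but the basic idea is a lookup table, not rocket science.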

[–] [email protected] 20 points 1 week ago* (last edited 1 week ago)

The truth? Because Elon is the CEO errrr Technoking.

[–] [email protected] 16 points 1 week ago (2 children)

What kind of engineers work at Tesla? I feel like normal people get anxiety over deleting databases or deploying secrets to production. Accidentally taking a service down.

But there you have all kinds of terrible things happening and it's purely because your company knows how to work policy makers. A dad dies in a fireball and what, it's an emergency meeting? Something you look into first thing Monday morning?

[–] [email protected] 15 points 1 week ago

I first thought this article was about their self driving cars and I was like who tf gets in a self driving car with their baby. It's not. It's about Tesla cars in general. Scary stuff.

[–] [email protected] 13 points 1 week ago

News of malfunctioning Tesla cars and Musk going crazy is still not enough to crash Tesla stock to zero, which I am hoping will happen, not just to inflict sorrow on Musk and his wealth, but so that I could hedge against the stock 😂

[–] [email protected] 10 points 1 week ago* (last edited 1 week ago)

FYI, some numbers. The Guardian article is still definitely worth reading; it just has no statistics.

"Nationally (USA), Tesla drivers had 26.67 accidents per 1,000 drivers. This was up from 23.54 last year.

The Ram and Subaru brands were again among the most accident-prone. Ram had 23.15 per 1,000 drivers while Subaru had 22.89.

...

As of October 2024, there have been hundreds of documented nonfatal incidents involving Autopilot and fifty-one reported fatalities, forty-four of which NHTSA investigations or expert testimony later verified and two that NHTSA's Office of Defect Investigations verified as happening during the engagement of Full Self-Driving (FSD)."

https://www.forbes.com/sites/stevebanker/2025/02/11/tesla-again-has-the-highest-accident-rate-of-any-auto-brand/
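To put those figures in perspective, some quick arithmetic on the rates quoted above (my own calculation; the underlying numbers are from the Forbes piece):

```python
# Quick arithmetic on the per-1,000-driver accident rates quoted above.
tesla_now, tesla_prev = 26.67, 23.54
others = {"Ram": 23.15, "Subaru": 22.89}

yoy = (tesla_now - tesla_prev) / tesla_prev * 100
print(f"Tesla year-over-year change: {yoy:+.1f}%")  # roughly +13%

for brand, rate in others.items():
    gap = (tesla_now - rate) / rate * 100
    print(f"Tesla's rate is {gap:.1f}% higher than {brand}'s ({tesla_now} vs {rate})")
```

So even against the next-worst brands, Tesla sits roughly 15-16% higher, and its own rate grew by about 13% year over year.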
