Tesla loses Autopilot wrongful death case in $329 million verdict

https://lemmy.sdf.org/post/39667119


> A representative for Tesla sent Ars the following statement: “Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility.” So, you admit that the company’s marketing has continued to lie for the past six years?

Yes. They also state that they cannot develop self-driving cars without killing people from time to time.
I mean, that’s probably strictly true.
I don’t know - most experimental technologies aren’t allowed to be tested in public till they are good and well ready. This whole “move fast, break often” thing seems like a REALLY bad idea for something like cars on public roads.

Well, the Obama administration had published initial guidance on testing and safety for automated vehicles in September 2016, which was pre-regulatory but a prelude to potential regulation. Trump trashed it as one of the first things he did upon taking office for his first term. I was working in the AV industry at the time.

That turned everything into the Wild West for a couple of years, up until an automated Uber killed a pedestrian in Arizona in 2018. After that, most AV companies scaled public testing way back and deployed extremely conservative versions of their software. If you look at news articles from that time, there’s a lot of criticism of how, e.g., Waymos would just grind to a halt in the middle of intersections, as companies would rather take flak for blocking traffic than for running over people.

But not Tesla. While other companies dialed back their ambitions, Tesla was ripping lidar sensors off its vehicles and sending them back out on public roads in droves. They also continued to market the technology - first as “Autopilot” and later as “Full Self Driving” - in ways that vastly overstated its capabilities. To be clear, Full Self Driving, or Level 5 automation in the SAE framework, is science fiction at this point: the idea of a computer system functionally indistinguishable from a capable human driver.

Part of the blame probably also lies with Biden, whose DOT had the opportunity to address this during his term and didn’t. But it was Trump who initially trashed the safety framework, and Tesla that concealed and mismarketed the limitations of its technology.

You got me interested, so I searched around and found this:

So, if I understand this correctly, the only fundamental difference between level 4 and 5 is that 4 works on specific known road types with reliable quality (highways, city roads), while level 5 works literally everywhere, including rural dirt paths?

Yes, that’s it. A lot of AV systems depend on high-resolution 3D maps of an area so they can precisely locate themselves in space. So they may perform relatively well within that defined space, but would not be able to do so outside it.

Level 5 is functionally a human driver. You as a human could be driving off road, in an environment you’ve never been in before. Maybe it’s raining and muddy. Maybe there are unknown hazards within this novel geography, flooding, fallen trees, etc.

A Level 5 AV system would be able to perform equivalently to a human in those conditions. Again, it’s science fiction at this point, but essentially the end goal of vehicle automation is a system that can respond to novel and unpredictable circumstances the same way a human driver would. It’s really not defined much better than that end goal - because it’s not possible with current technology, it doesn’t correspond to a specific set of sensors or software. It’s a performance-based, long-term goal.

This is why it’s so irresponsible for Tesla to continue to market their system as “Full self driving.” It is nowhere near as adaptable or capable as a human driver. They pretend or insinuate that they have a system equivalent to SAE Level 5 when the entire industry is a decade minimum away from such a system.
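The Level 4 vs. Level 5 distinction above can be summed up in a few lines of code. This is only an illustrative sketch of the SAE J3016 levels (the descriptions are paraphrased, and the helper names `requires_human_fallback` and `bounded_by_odd` are invented for this example, not any official API):

```python
# Illustrative summary of the SAE J3016 driving-automation levels.
# Paraphrased descriptions; not official SAE text.
SAE_LEVELS = {
    0: "No automation: the human does everything",
    1: "Driver assistance: steering OR speed (e.g. lane keeping or ACC)",
    2: "Partial automation: steering AND speed, human must supervise",
    3: "Conditional automation: system drives, human takes over on request",
    4: "High automation: self-driving, but only inside a defined ODD",
    5: "Full automation: self-driving anywhere a human could drive",
}

def requires_human_fallback(level: int) -> bool:
    """Below Level 4, a human must be ready to intervene."""
    return level <= 3

def bounded_by_odd(level: int) -> bool:
    """Level 4 works only inside a mapped, well-characterized operational
    design domain (ODD); Level 5 has no such boundary, which is why it is
    still science fiction."""
    return level == 4
```

The key point the sketch captures: the jump from 4 to 5 isn’t a feature, it’s the removal of the domain boundary entirely.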

I think this chart overcomplicates it a bit. Almost a decade ago, I worked on a very short project that touched on this topic. One expert explained to me that the difference between Level 4 and 5 is that you don’t need a steering wheel or pedals anymore. L5 can drive anywhere, anytime, in all situations.

I was working in the AV industry at the time.

How is you working in the audio/video industry relevant? …or maybe you mean adult videos?

Or automotive vision.
Thank you. I seriously didn’t understand what the field was.
Not to defend Tesla here, but how does the technology become “good and well ready” for road testing if you’re not allowed to test it on the road? There are a million different driving environments in the US, so it’d be impossible to test all these scenarios without a real-world environment.

You are defending Tesla and being disingenuous about it.

The other car companies working on this are spending millions of dollars to test their vehicles in closed areas that simulate real-world conditions, in order not to kill people.

You sound like a psychopath.

The hyperbole is ridiculous here and it makes you sound like a psychopath.
How about fucking not claiming it’s FSD, just having ACC and lane keeping, and then collecting data and training on that? Also, build a closed circuit and test there.
Autopilot is ACC, which is what this case was about.

If they called fucking ACC “Autopilot,” they deserve to rot in hell. What the actual fuck.

That is such a misleading naming.

Cars with humans behind the wheel, paying attention to correct the machine - not this “let’s remove humans as quickly as possible” BS that we have now. I know they don’t like the cost.
I’m pretty sure millions of people have been killed by cars over the last 100 years.

And fatal injuries keep going down in developed countries (excluding the USA, if the statistics I’ve read are correct).

Tesla’s autopilot seems to be a step backwards with a future promise of being better than human drivers.

But they slimmed down their sensors to fucking simple 2D cams.
That’s just cheaping out at the expense of Tesla owners - but also of completely uninvolved people around a self-driving Tesla, people who never chose to trust tech that lives more on PR than on actual results.

Can’t comment specifically about Tesla, but self-driving is going to have to go through the same decades of iterative improvement that car safety went through. That’s just expected.

However, it’s not appropriate for this to be done at the risk of lives.

But somehow it needs the time and money to run through a decade of improvement.

Cars, yes, driven by humans. But not by AI bullshit.

It’s really not; we just have cowards who are afraid of the word “regulation” running the government.

en.wikipedia.org/wiki/DO-178C

DO-178C - Wikipedia

There will always be accidents with tech, or with anything. No matter how much planning and foresight goes into a product or service, humans cannot account for every scenario; death is inevitable to some degree. That being said:

Tesla point-blank launched a half-assed product that just did not fully operate as specified. I’m all for self-driving vehicles - even through the bad stuff, even if it happened to me, I’d still be for it. Given the early stage, though, they should have focused much more on their “rolling release” updates than they have.

Of course things will need to be updated, and of course accidents will happen. But it’s how they respond to them that makes them look evil vs. good, and their response has been lackluster. The market doesn’t seem to think it’s a major issue, though - there are more Teslas on the roads now than ever.

“Some of you will die, but that’s a risk I’m willing to take.”
Brannigan is way smarter than Mush.
Some of you will be forced through a fine mesh screen for your country. They will be the luckiest of all.
Farquaad said this, not Brannigan iirc
When I’m in command, son, every mission is a suicide mission.
I’m pretty sure it was both.
“Ya gotta break some eggs,” or some shit. /s
Listen, if we make it safe it could take an entire extra fiscal year! I have payments to make on my 3 vacation homes NOW!
All they really need to do is make self-driving cars safer than your average human driver.
That is a low bar. However, I have yet to see independent data. I know it exists, but the only ones who talk have reason to lie with statistics, so I can't trust them.
Which they have not done and won’t do. You have to do this in every condition. I wonder why they always test this shit out in Texas and California?
I guess they just didn’t want to admit that snow defeats both lidar and vision cameras. Plus the fact that snow covers lane markers, street signs, and car sensors. People can adjust to these conditions, especially when driving locally. No self-driving system can function without input.

Surprisingly great outcome, and what a spot-on summary from the lead attorney:

“Tesla designed autopilot only for controlled access highways yet deliberately chose not to restrict drivers from using it elsewhere, alongside Elon Musk telling the world Autopilot drove better than humans,” said Brett Schreiber, lead attorney for the plaintiffs. “Tesla’s lies turned our roads into test tracks for their fundamentally flawed technology, putting everyday Americans like Naibel Benavides and Dillon Angulo in harm’s way. Today’s verdict represents justice for Naibel’s tragic death and Dillon’s lifelong injuries, holding Tesla and Musk accountable for propping up the company’s trillion-dollar valuation with self-driving hype at the expense of human lives,” Schreiber said.

Holding them accountable would be jail time. I’m fine with even putting the salesman in jail for this. Who’s gonna sell your vehicles when they know there’s a decent chance of them taking the blame for your shitty tech?
Don’t you love how corporations can be people when it comes to bribing politicians but not when it comes to consequences for their criminal actions? Interestingly enough, the same is happening to AI…

You’d have to prove that the salesman said exactly that, and without a record it’s at best a he said / she said situation.

I’d be happy to see Musk jailed, though; he’s definitely touted self-driving as fully functional.

You understand that this is only happening because Elon lost the good graces of Trump, right? If they were still “bros,” this would have been swept under the rug, since Trump’s administration controls most, if not all, high judges in the US.

We need more people like him in the world.

The bullshit artists have had free rein over useful idiots for too long.

Don’t take my post as a defense of Tesla even if there is blame on both sides here. However, I lay the huge majority of it on Tesla marketing.

I had to find two other articles to figure out if the system being used here was Tesla’s free included AutoPilot, or the more advanced paid (one time fee/subscription) version called Full Self Drive (FSD). The answer for this case was: Autopilot.

There are many important distinctions between the two systems. However, Tesla frequently conflates the two when speaking about autonomous technology for its cars, so I blame Tesla. What was required here to avoid these deaths actually has very little to do with autonomous technology as most know it; it’s instead about Collision Avoidance Systems. Only in 2024 was there first talk of requiring Collision Avoidance Systems in new vehicles in the USA. source The cars that include them now (Teslas and some models from other brands) do so on their own, without a legal mandate.

Tesla claims that the Collision Avoidance Systems would have been overridden anyway because the driver was holding down the accelerator (which is not normal under Autopilot or FSD conditions). Even if that’s true, Tesla has positioned its cars as being highly autonomous, and oftentimes doesn’t call out that that skilled autonomy only comes with the Full Self Drive paid upgrade or subscription.

So I DO blame Tesla, even if the driver contributed to the accident.

Biden administration to require advanced safety tech on all new cars and trucks

The Biden administration plans to require that all new cars and trucks come with pedestrian-collision avoidance systems that include automatic emergency braking technology by the end of the decade.

NBC News

Did the car try to stop and fail to do so in time due to the speeding, or did the car not try despite expected collision detection behavior?

Going off of OP’s quote, the jury found the driver responsible but Tesla liable as well, which is pretty confusing. It might make some sense if the expected Autopilot functionality didn’t work despite the driver’s foot being on the pedal.

Did the car try to stop and fail to do so in time due to the speeding, or did the car not try despite expected collision detection behavior?

From the article, it looks like the car didn’t even try to stop, because braking was overridden by the driver having their foot pressed on the accelerator (which isn’t normal during Autopilot use).

This is correct. And when you do this, the car tells you it won’t brake.
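The override behavior described in this thread can be sketched as simple logic. This is a hypothetical simplification for illustration, not Tesla’s actual code; the function name and parameters are invented:

```python
# Hypothetical simplification of the accelerator-override behavior
# described above - not Tesla's actual implementation.
def should_auto_brake(autopilot_engaged: bool,
                      obstacle_detected: bool,
                      accelerator_pressed: bool) -> bool:
    """Decide whether automatic braking fires in this toy model."""
    if not autopilot_engaged or not obstacle_detected:
        return False
    # Driver input on the accelerator overrides automatic braking;
    # per the thread, the car warns that it won't brake, but doesn't stop.
    if accelerator_pressed:
        return False
    return True
```

The crux of the case, in these terms: one input from the driver silently disables the safety behavior the marketing implied was always on.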
I feel like calling it AutoPilot is already risking liability, Full Self Driving is just audacious. There’s a reason other companies with similar technology have gone with things like driving assistance. This has probably had lawyers at Tesla sweating bullets for years.

I feel like calling it AutoPilot is already risking liability,

From an aviation point of view, “Autopilot” is pretty accurate to the original aviation reference. The original autopilot, released in 1912, would simply hold an aircraft at a specified heading and altitude without human input, operating the aircraft’s control surfaces to keep it on its directed path. However, if you were at an altitude that would let you fly into a mountain, autopilot would do exactly that. So the current Tesla Autopilot is pretty close to that level of functionality, with the added feature of maintaining a set speed too. Note that modern aviation autopilot is much more functional; for specific models it can even take off and land the airplane.
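The 1912-era behavior described above is essentially a feedback loop with no awareness of terrain. A minimal sketch, assuming a bare proportional controller with an invented gain (not real avionics code):

```python
def heading_hold(target_heading: float, current_heading: float,
                 gain: float = 0.5) -> float:
    """Return a control-surface correction proportional to heading error.
    Illustrative proportional controller, not real avionics code."""
    # Wrap the error into [-180, 180) so the correction turns the short way.
    error = (target_heading - current_heading + 180.0) % 360.0 - 180.0
    return gain * error

# Note what's missing: the loop knows nothing about terrain. If the
# commanded altitude intersects a mountain, it flies into the mountain.
```

Headings are in degrees; a positive return value means “turn right.” The point of the sketch is what isn’t in it: no obstacle model, no terrain data, just error correction toward a setpoint.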

Full Self Driving is just audacious. There’s a reason other companies with similar technology have gone with things like driving assistance. This has probably had lawyers at Tesla sweating bullets for years.

I agree. I think Musk always intended FSD to live up to the name, and perhaps named it that aspirationally, which is all well and good. But most consumers don’t share that mindset: if you call it that right now, they assume it has that functionality when they buy it today, which it doesn’t. I agree with you that it was a legal liability waiting to happen.

So you’re comparing, let’s say, 2020 technology to the 1915 version of autopilot, and not the kind from the 2020s that is much more advanced. Yeah, what BS.

Because it still basically does what they said. The only new additions to the autopilot system besides maintaining speed, heading, and altitude are the ability to set a GPS heading and waypoints (for the purposes of this conversation). It will absolutely still fly into a mountain if not for other collision avoidance systems. Your average 737 or A320 is not going to spontaneously change course just because the elevation of the ground below it changed. But you can program other systems in the plane to avoid a specific flight path because there is a known hazard. I want you to understand that we know the mountain is there; mountains don’t move around much in short periods of time. Cars and pedestrians are another story entirely.

There’s a reason we still have air traffic controllers, and even then, pilots and air traffic control aren’t infallible - and they have way more systems to make flying safe than the average car (yes, even the average Tesla).

FSD wasn’t even available in 2019. It was a future purchase add on that only went into very limited beta in 2020.
“Today’s verdict is wrong.” I think a certain corporation needs to be reminded to have some humility toward the courts. Corporations should not expect the mercy to get away with saying the things a human would.

It’s all about giving something for useful idiots to latch on to.

These people know most of us can’t think for ourselves, so they take full advantage of it.

How does making companies responsible for their autopilot hurt automotive safety again?
There’s actually a backfire effect here: it could make companies too cautious about rolling out self-driving. The status quo is people driving poorly. If you delay the rollout of self-driving beyond the point when it’s better than people, then more people will die.

Even if self-driving cars kill fewer people, they’ll still destroy our quality of life.

youtu.be/040ejWnFkj0

How Self-Driving Cars will Destroy Cities (and what to do about it)

YouTube
Fuck that - I’m not a beta tester for a company. What happened to having a good product and then releasing it? Not “oh, let’s see what happens.”
It’s not that simple. Imagine you’re dying of a rare terminal disease. A pharma company is developing a new drug for it. Obviously you want it. But they tell you you can’t have it because “we’re not releasing it until we know it’s good.”
This is, or was (thanks, RFK, for handing the industry a blank check), how pharma development works. You don’t even get to do human trials until you’re pretty damn sure it’s not going to kill anyone. The “experimental medicine” you read about is still medicine that’s been in development for YEARS and has gone through animal, cellular, and various other trials.

Actually we have “right to try” laws for the scenario I described.

But the FDA could use some serious reform. Under the system we have, an FDA approval lumps together the determinations of whether a drug is safe, effective, and worth paying for. A more libertarian system would let people spend their own money on drugs that are safe, even if the FDA’s particular research didn’t find them effective. And it wouldn’t waste taxpayer money on drugs that are exorbitantly expensive relative to their minimal effectiveness. But if a wealthy person wants to spend their own money, thereby subsidizing pharmaceuticals for the rest of us, that’s great in my opinion.

It’s hard to prove that point, though. Rolling out self-driving may just make car usage go up and negate rate decreases by increasing overall usage.