I just do not understand how "beta" self-driving features are allowed to be released on public highways and roads. There are so many other drivers on the road who have NOT opted into the beta test but who are forced to take part. This footage from the Tesla pile-up that happened on the Bay Bridge in San Francisco/Oakland the day that the "Tesla Full Self-Driving Beta" was launched is maddening: https://theintercept.com/2023/01/10/tesla-crash-footage-autopilot/
Exclusive: Surveillance Footage of Tesla Crash on SF’s Bay Bridge Hours After Elon Musk Announces “Self-Driving” Feature

Elon Musk has said Tesla’s problematic autopilot features are “really the difference between Tesla being worth a lot of money or worth basically zero.”

The Intercept
@kashhill Yeah it's really screaming for regulation. I did not sign up to be Elon Musk's test dummy.
@kashhill
Tesla should be held accountable for this disaster.
@kashhill
I hope the insurance companies and uninsured individuals impacted by any #BetaTest incidents all claim from #Tesla.
@adminkirsty Insurance companies need to stop insuring Teslas.
@kashhill couldn’t agree more. We are all crash test dummies for multiple un- or poorly-regulated technologies.
@kashhill As the driver of a car with a different Level-2 system, the thing I monitor most when the system is engaged is the possibility of an event or change causing my car to decelerate rapidly for no actual reason. A common case for me: when a car I'm following turns off, my car still tends to slow down aggressively even after the other car is no longer in my path. The Bay Bridge crash feels like a very similar situation to me.
@jbqueru 100%. On the highway, I spend most of my time making sure no one is tailgating behind me, because a hard shadow or a misunderstood speed-limit sign (like here?!) can cause my car to decelerate and surprise other drivers. @kashhill
@doncruse @jbqueru @kashhill And the law doesn't place sufficient responsibility on drivers to avoid slowing unexpectedly; the duty is on the following driver to stop in time, which made sense when roads were mostly empty, but not now that efficient mass movement requires cars to travel closely together, almost like train cars.
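
To make that failure mode concrete, here is a toy Python sketch of a naive follow-distance controller. Every name, threshold, and number in it is an illustrative assumption, not any manufacturer's real logic; the point is just how a briefly stale target track can produce hard braking in an empty lane.

```python
# Toy sketch of why a Level-2 follow system can keep braking after the
# lead car has already left the lane. Every name, threshold, and number
# here is an illustrative assumption, not any manufacturer's real logic.

TRACK_TIMEOUT_S = 1.0  # how long a lost radar/camera target stays "trusted"

def desired_accel(ego_speed, target, time_since_seen):
    """Commanded acceleration in m/s^2 given the last tracked target.

    target is (gap_m, target_speed_mps) or None; time_since_seen is how
    stale that track is, in seconds.
    """
    if target is None or time_since_seen > TRACK_TIMEOUT_S:
        return 0.5  # no target: gently resume the set speed
    gap_m, target_speed = target
    safe_gap = 2.0 * ego_speed            # two-second following gap
    closing_speed = ego_speed - target_speed
    if gap_m < safe_gap or closing_speed > 0:
        # Brake harder the worse the geometry looks, capped at -6 m/s^2.
        return max(-6.0, min(0.0, -0.2 * (safe_gap - gap_m) - 0.5 * closing_speed))
    return 0.0

# The failure mode: the lead car turns off and the sensor loses it at a
# small gap, but for up to TRACK_TIMEOUT_S the controller still brakes
# against a ghost. The lane is empty; the car behind just sees brake lights.
print(desired_accel(30.0, (15.0, 10.0), 0.4))  # -6.0: hard braking
```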

@kashhill or how a flat, featureless touchscreen you have to look at to operate basic controls is just thrown at drivers.

There's no way it's 'safe'

@kashhill What? Humans get to be student drivers, why can’t a car? This is pure speciesism. I mean, corporations have freedom of speech, why can’t Teslas have true freedom of movement?
@kashhill I just saw something about them not stopping for stopped school buses, either. A dude I know has had TWO accidents in his Tesla while letting it run the show, once with a deer.
@kashhill I've tried to make the same point over the last few years - that test driving self-driving cars is making other drivers into involuntary test subjects. Google was perhaps the first at this but Tesla seems to have taken it to a new level. We find that our own Tesla is an ever-changing work-in-progress, often "fixing" a bug with something worse.

@karlauerbach @kashhill

Even on other software stuff (not "self-driving") I got fed up with the constant introduction of new bugs in software updates with the Model S. (They rarely fixed anything; they broke a lot.)

That's why I sold it and now drive a VW ID.4. All the stuff which didn't work right in the Tesla works in the VW.

& IMO, VW's driver assist is actually better than Tesla's fake self-driving or "autopilot".

Tesla also tried to deny valid warranty claims for years. Crooks.

@neroden @karlauerbach @kashhill

And yet, VW are also plagued by software bugs (to the point that it's making a dent in the company's profit and has cost them at least one further CEO since Dieselgate). They haven't totally fixed basic flaws such as a flaky microswitch in a gear selector, and they've put too many touch-sensitive "buttons" in newer cars. But at least the bugs are in things like infotainment/telemetry, and they don't skimp on the safety-critical parts of their software.

@vfrmedia @karlauerbach @kashhill

Yeah, it's interesting; I've read that too. The infotainment has worked SO much better on the VW than on the Tesla. If VW's software is considered buggy, Tesla's should be considered non-releasable.

@neroden @vfrmedia @kashhill Some Tesla stuff is downright dangerous. For example, to make a left turn you flip the turn stalk, which turns on a rear-facing camera view on the screen to the driver's right; that distracts the driver (who needs to look left) by putting flashing stuff in their right peripheral vision.

And their "automatic" windshield wipers don't notice for a second or longer that you've been splashed by a truck, during which time the driver is blind.

My Mazda has better sensors.

@kashhill

That is a great argument for punitive damages.

The only problem is that, except for the first crash, the other crashes occurred because the drivers were following too closely and didn't leave themselves room to brake in an emergency.

@Turkewitz @kashhill That's not how traffic works in the real world. If you leave that big of a gap someone will turn into it.

@fyzzlefry @kashhill

It may not be how "real world" traffic works but that is how the law works. There is a presumption that you are negligent if you rear-end someone. (You can rebut it if, for example, you say the other person was backing up...but that wouldn't be a defense here.)

@kashhill If it's on public roads, it's not Beta, it's Production.

Musk seems to struggle with the relationships between words and meanings.

@kashhill We're getting to the point (and may well have gone past it) where we need an IRB setup for this kind of stuff, since they're experimenting on people who have absolutely *not* opted into any kind of experiment or test environment.
@kashhill I want to start by saying that I fully agree there needs to be regulation of this, and full testing the same way a vehicle is tested before it is allowed on the road. However, we are beta testing drivers all the time; every year more than 3M new driver's licenses are issued in the US! The stats also show that self-driving cars are, per mile driven, safer than human drivers.
Now, that will all change as you add more and more unregulated self-driving cars to the road. We need to implement meaningful regulations that allow the innovation to continue, but not unchecked.
@ericgalis @kashhill "we are beta testing drivers all the time" feel like this statement needs a massive BUT understanding that computers make mistakes four-year-olds don't. We find AI hilarious because it's so completely unexpected given the input. THAT is not any kind of driver I want on the road and is not the same as a young/inexperienced/learner driver.
@smolbeaver @kashhill my experience in driver's ed was a little bit different than yours... :)
@kashhill Probably has something to do with self-certified SIL-4 compliance, proven-in-use vs proven by design, and paying off local regulators to look the other way to promote "innovation and growth". https://speedsys.io/the-difference-between-sil-certification-by-design-and-proven-in-use/

@kashhill I found this an insightful and very applicable comment from the cited article:

"Certification by design, for OEM’s, is always preferred over proven in use due to its wider applicability and longer test intervals. However, many products will never be able to receive such certification due to their design. In terms of SIL the only option is then to either use a non-certified product or strive towards a proven in use certification."

@kashhill @brooklynmarie
Presumably, to protect themselves from liability, they are pretending they are not liable.

Then, when they eventually do face liability, they will settle out of court or something and put a small sticker on their cars that says they are running a beta test.

@kashhill you have a good point: how many accidents, investigations, and fatalities before it gets stopped?

https://www.cnbc.com/2022/12/22/nhtsa-initiates-two-more-tesla-crash-investigations.html

Tesla under investigation by NHTSA for two more crashes that may have involved Autopilot or FSD

According to data obtained from NHTSA by CNBC, the agency is looking into at least 41 crashes involving Tesla vehicles where automated features were involved.

CNBC

@kashhill I think there is more to this story. For the car to signal and pull over like that and come to a complete stop most likely means the driver was asleep. If the car senses the driver is not in control of the vehicle, it is programmed to signal, pull over safely, and stop.

Since there is no shoulder, it did what it was supposed to do, and the pile-up after the second car happened because people like to haul ass and tailgate.
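
Assuming the behavior described above, a driver-inattention watchdog is easy to picture. This is a minimal Python sketch with invented states and thresholds, not Tesla's documented logic:

```python
# Minimal sketch of the "signal, pull over, stop" escalation described
# above. States, thresholds, and actions are invented for illustration;
# this is not Tesla's documented behavior.

from enum import Enum, auto

class Watchdog(Enum):
    NORMAL = auto()
    WARNING = auto()        # visual/audible nag to grab the wheel
    PULLING_OVER = auto()   # indicate, move toward the edge, decelerate
    STOPPED = auto()        # hazards on, halted

def step(state, seconds_without_driver_input):
    """Advance the inattention watchdog one tick."""
    if state is Watchdog.NORMAL and seconds_without_driver_input > 10:
        return Watchdog.WARNING
    if state is Watchdog.WARNING and seconds_without_driver_input > 30:
        return Watchdog.PULLING_OVER
    if state is Watchdog.PULLING_OVER:
        return Watchdog.STOPPED
    return state

# With no driver response the sequence runs NORMAL -> WARNING ->
# PULLING_OVER -> STOPPED, which from outside looks like what the video
# shows: indicator on, lane change, then a full stop.
```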

@kashhill I'm more concerned with the number of drivers who piled into the back of the Tesla. They were definitely at fault.
Not in control of their vehicles either. Sorry to point that out.
@kashhill if any other carmaker released a car with “beta” brakes that failed, they’d be sued out of existence. The Feds need to do their jobs here.
@kashhill this is a bad take. Nobody "opts in" to the "beta test" of human student drivers, either. Equating this with individuals' choice regarding what software they run on *their own computers* just looks silly. We need serious work on state-level regulation and liability laws, not bad analogies on social media.

@kashhill that so-called eight-car pile-up could have happened with a petrol car running out of petrol.

The result would've been exactly the same. It was not caused by the Tesla; it was caused by those behind, following too close and too fast.

@kashhill There is only one place "Full Self Driving" (SAE Level 4) exists as a beta...
A parking garage at an airport

@kashhill

If I were a psychopath,
I'd buy a social media platform
Pretend to have evidence proving there was bias against those in power
Reap the benefits of them not wanting to upset me & stop the reporting that keeps them in power over the ignorant.

@kashhill The level of negligence is utterly criminal.

When I worked in automotive embedded systems, the safety-critical nature of the system was reinforced through every single stage of engineering. Even for things only tangentially safety-critical (e.g. infotainment that could crash and distract a driver).

There's no way that "lol throw a machine learning at it and test on public roads" would have even passed the Shitpost At A Bar test. The regulators are asleep at the wheel.

@kashhill
I am no fan of Tesla, or Musk, but I am wondering...

If, in a conventional car, a driver indicates, changes lane, and then the car stops dead for some reason, mechanical or emergency, who's at fault when traffic piles into the back of it?

Surely following drivers should be able to stop their vehicles in the space they can see to be clear.
Failure to do so is driving without due care and attention, at the least.
The first car driver, being cut up, is an exception.
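
The "stop in the space you can see to be clear" rule is just arithmetic. Here is a rough Python sketch using assumed typical values for reaction time and dry-pavement braking, not measurements from this crash:

```python
# Rough stopping-distance arithmetic behind "stop in the space you can
# see to be clear". Reaction time and deceleration are assumed typical
# values, not measurements from the Bay Bridge incident.

reaction_time_s = 1.5          # perception + reaction
decel_mps2 = 7.0               # firm braking on dry pavement

def stopping_distance(speed_mph):
    v = speed_mph * 0.44704    # mph -> m/s
    thinking = v * reaction_time_s
    braking = v * v / (2 * decel_mps2)
    return thinking + braking  # metres

for mph in (30, 50, 70):
    print(f"{mph} mph: {stopping_distance(mph):.0f} m")
# 30 mph: ~33 m, 50 mph: ~69 m, 70 mph: ~117 m
```

Even under these fairly generous assumptions, the required gap grows much faster than speed, which is exactly why tailgating at highway speed leaves no room for a car ahead stopping unexpectedly.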

@kashhill What are the insurance implications of this?
@kashhill 1000% agree with this sentiment.
@kashhill When they test this stuff on real roads it needs to be done with a certified driver, paid by Tesla, behind the wheel, treating it as a job and documenting every anomaly they encounter, not regular people on their way to shop, etc.
@kashhill The Tesla is responsible, but there were far too many cars driving too fast with too little space in between them. That's what turned it from a crash into a pile-up.

@kashhill I’m not a Tesla fan, and I entirely agree with the point about self-driving putting other drivers at risk. But this video very clearly shows another problem as well: drivers are driving carelessly, following too closely and not paying enough attention, if a car braking in front of them can cause a crash.

Always keep a safe distance.

@kashhill @fsinn Did you watch the video? The car signaled and tried to pull over safely, as any vehicle might in a medical emergency, etc.

The following crashes are inattentive drivers who desperately NEED automation, because the human reactions were WORSE. A sea of red lights, yet the pile-up kept growing instead of people catching on!

And under well-lit conditions with no ice/snow/fog, come on! From my Los Angeles rush-hour experience, insurance would name everyone in the rear who failed to stop as majority at fault.

Nafnlaus 🇮🇸 🇺🇦 (@[email protected])

@[email protected] As someone who has a #Tesla and regularly uses #Autopilot, that looks more like someone taking the car out of AP and pulling over. Based on past history, where "It was the car, not me!" has become the new "My dog ate my homework", I'll withhold judgement until *after* the investigation. Also, what on Earth was up with those *human drivers* where a car stopping from moderate speeds over 10 seconds in broad daylight on dry pavement led to an 8-car pileup?

Fosstodon
@kashhill Sounds like fairly reasonable cause for a group lawsuit to me. “We did not opt into the beta; we were victims of it instead.”
@kashhill to hear the Muskites tell it, though, the only reason anyone survives a crash caused by FSD is that Teslas are super safe. If you want to save yourself from Tesla you have to drive one…apparently.
@kashhill Yeah it's scary. I'm glad I don't live in the SF Bay Area anymore
@kashhill can you imagine how absolutely insufferable Muskites will be if the government takes away their toy? More than a few are willing to let Musk implant microchips in their brains ffs.

@kashhill

it’s completely nuts. Beta testing of vehicles, airplanes, spaceships… All should be done in a controlled area under supervision. Way too dangerous otherwise.

@kashhill @ProgGrrl the Tesla autobrake function is still too glitchy to be on the car, let alone a fully self-driving solution. I’ve had several occasions when the car has slammed on the brakes reacting to cars waiting at T-junctions; I now keep it at the least sensitive option, as even medium is unreliable.
@kashhill I think human drivers are much worse overall. I see it every day, and the accident rate in the U.S. is horrible. In this example, isn't one of the main causes of the accident the drivers who rear-ended the Tesla (and the chain reaction), because they were following too closely (or not paying attention)? A responsible driver is supposed to leave enough room to stop. Also, the driver of the Tesla should have hit the accelerator instead of letting the car stop.
@garykrysztopik Yes, I also agree that humans should still be "in the loop" as they say. But the way Tesla is marketing it seems to suggest they don't need to be.
@kashhill Yes, the Tesla "FSD" (Full Self Driving) is supposed to be completely autonomous, with the goal of being able to send your car out as a basically driverless Uber. Everything so far has been a "beta" release that requires the driver to maintain situational awareness and control of the vehicle. It's way behind schedule but getting closer every day. Again, with the 2 million cars and billions of miles of data collection, they are probably closer than anyone else but there are infinite scenarios to account for so it's a lot harder than anyone imagined.
@kashhill Of course the source of the data is Tesla, but: “In the 2nd quarter, we recorded one crash for every 4.41 million miles driven in which drivers were using Autopilot technology (Autosteer and active safety features). For drivers who were not using Autopilot technology (no Autosteer and active safety features), we recorded one crash for every 1.2 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.” https://cleantechnica.com/2021/12/07/tesla-1-crash-per-4-41-million-miles-traveled-on-autopilot/
Tesla: 1 Crash Per 4.41 Million Miles Traveled On Autopilot - CleanTechnica

In light of Tesla CEO Elon Musk’s response to Kim Paquette about how safe Tesla vehicles are, I wanted to share a quick comparison between Tesla’s data and overall auto accident data from the National Highway Traffic Safety Administration. We have not reported on the latest safety update from the company. In response to Kim, […]

CleanTechnica
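
Taking those quoted figures at face value, the ratios work out as in this small Python check. (A general caveat, not from Tesla's release: Autopilot miles skew toward highway driving, so the populations aren't directly comparable; this only restates the arithmetic.)

```python
# Arithmetic on the crash rates quoted above, taken at face value.
# Caveat: Autopilot miles skew toward highways, so these populations
# are not directly comparable; this just restates the quoted figures.

miles_per_crash = {
    "Autopilot engaged": 4_410_000,
    "Tesla, no Autopilot": 1_200_000,
    "US average (NHTSA)": 484_000,
}

baseline = miles_per_crash["US average (NHTSA)"]
for label, miles in miles_per_crash.items():
    print(f"{label}: {miles / baseline:.1f}x the US-average miles per crash")
# Autopilot engaged: 9.1x, Tesla no Autopilot: 2.5x, US average: 1.0x
```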