@kashhill Or how a flat, featureless touchscreen you have to look at to operate basic controls is just thrown at drivers.
There's no way it's 'safe'.
Even on other software stuff (not "self-driving") I got fed up with the constant introduction of new bugs in software updates with the Model S. (They rarely fixed anything; they broke a lot.)
That's why I sold it and now drive a VW ID.4. All the stuff which didn't work right in the Tesla works in the VW.
& IMO, VW's driver assist is actually better than Tesla's fake self-driving or "autopilot".
Tesla also tried to deny valid warranty claims for years. Crooks.
@neroden @karlauerbach @kashhill
And yet, VW are also plagued by software bugs (to the point that it's making a dent in the company's profit and has cost them at least one further CEO since Dieselgate); they haven't totally fixed basic flaws, such as a flaky microswitch in the gear selector, and they've put too many touch-sensitive "buttons" in newer cars. But at least the bugs are in things like infotainment/telemetry, and they don't skimp on the safety-critical parts of their software.
@vfrmedia @karlauerbach @kashhill
Yeah, interesting; I've read that too. The infotainment has worked SO much better on the VW than on the Tesla. If VW's software is considered buggy, Tesla's should be considered non-releasable.
@neroden @vfrmedia @kashhill Some Tesla stuff is downright dangerous. For example, to make a left turn you flip the turn wand, which turns on a rear-facing camera on the screen to the driver's right. That distracts the driver (who needs to look left) by putting flashing stuff in their right peripheral vision.
And their "automatic" windshield wipers don't notice that you've been splashed by a truck for a second or longer, during which time the driver is blind.
My Mazda has better sensors.
That is a great argument for punitive damages.
The only problem is that, except for the first crash, the other crashes occurred because the drivers were following too closely and didn't leave themselves room to brake in an emergency.
It may not be how "real world" traffic works but that is how the law works. There is a presumption that you are negligent if you rear-end someone. (You can rebut it if, for example, you say the other person was backing up...but that wouldn't be a defense here.)
@kashhill If it's on the public roads, it's not Beta, it's Production.
Musk seems to struggle with the relationships between words and meanings.
@kashhill I found this an insightful comment from the cited article, very applicable:
"Certification by design, for OEM’s, is always preferred over proven in use due to its wider applicability and longer test intervals. However, many products will never be able to receive such certification due to their design. In terms of SIL the only option is then to either use a non-certified product or strive towards a proven in use certification."
@kashhill @brooklynmarie
Presumably, to protect themselves from liability, they are pretending they are not liable.
Then, when they eventually do face liability, they will settle out of court or something and put a small sticker on their cars saying they are running a beta test.
@kashhill You make a good point: how many accidents, investigations, and fatalities before it gets stopped?
https://www.cnbc.com/2022/12/22/nhtsa-initiates-two-more-tesla-crash-investigations.html
@kashhill I think there is more to this story. For the car to signal and pull over like that and come to a complete stop most likely means the driver was asleep. If the car senses the driver is not in control of the vehicle, it is programmed to signal and pull over safely and stop.
Since there is no shoulder, it did what it was supposed to do, and the pile-up after the second car happened because people like to haul ass and tailgate.
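If that's what happened, the behaviour amounts to a small fallback state machine. Here's a rough sketch of that idea in Python — my own invented illustration, with made-up names and thresholds, not anything from Tesla's actual software:

```python
# Hypothetical driver-monitoring fallback, invented for illustration only.
from enum import Enum, auto

class Fallback(Enum):
    NORMAL = auto()
    WARNING = auto()       # driver unresponsive: chime, flash the cluster
    PULLING_OVER = auto()  # indicate, move toward the edge, decelerate
    STOPPED = auto()       # hazard lights on, hold the brakes

def step(state: Fallback, ticks_in_state: int,
         driver_attentive: bool, speed_mps: float) -> Fallback:
    """One 100 ms control tick of the fallback logic (assumed rate)."""
    if state is Fallback.NORMAL and not driver_attentive:
        return Fallback.WARNING
    if state is Fallback.WARNING:
        if driver_attentive:
            return Fallback.NORMAL       # driver responded, resume normal
        if ticks_in_state > 30:          # ~3 s of ignored warnings (assumed)
            return Fallback.PULLING_OVER
    if state is Fallback.PULLING_OVER:
        # With no shoulder, "pulling over" can only end in a live lane --
        # which is exactly the situation in this crash.
        if speed_mps <= 0.1:
            return Fallback.STOPPED
    return state
```

The point being: even if the logic works as designed, with no shoulder available its only safe-ish option is to stop in traffic.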
@kashhill That so-called eight-car pile-up could have happened with a petrol car running out of petrol.
The result would've been exactly the same. It was not caused by the Tesla; it was caused by those behind it, following too close and too fast.
If I were a psychopath,
I'd buy a social media platform
Pretend to have evidence proving there was bias against those in power
Reap the benefits of them not wanting to upset me & stop the reporting that keeps them in power over the ignorant.
@kashhill The level of negligence is utterly criminal.
When I worked in automotive embedded systems, the safety-critical nature of the system was reinforced through every single stage of engineering. Even for things only tangentially safety-critical (e.g. infotainment that could crash and distract a driver).
There's no way that "lol throw a machine learning at it and test on public roads" would have even passed the Shitpost At A Bar test. The regulators are asleep at the wheel.
@kashhill
I am no fan of Tesla or Musk, but I am wondering...
If, in a conventional car, a driver indicates, changes lane, and then the car stops dead for some reason, mechanical or emergency, who's at fault when traffic piles into the back of it?
Surely following drivers should be able to stop their vehicles in the space they can see to be clear.
Failure to do so is driving without due care and attention, at least.
The first car driver, being cut up, is an exception.
@kashhill I'm not a Tesla fan, and I entirely agree with the point about self-driving and putting other drivers at risk. But this video very clearly shows another problem as well: drivers are driving carelessly if they're following so closely, and paying so little attention, that a car braking in front of them can cause a crash.
Always keep a safe distance.
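To put numbers on "safe distance", here's a back-of-envelope stopping-distance calculation. The 1.5 s reaction time and 7 m/s² braking figures below are generic textbook assumptions, not measurements from this crash:

```python
# Rough stopping distance = reaction distance + braking distance.
def stopping_distance_m(speed_kmh: float,
                        reaction_s: float = 1.5,   # assumed reaction time
                        decel_mps2: float = 7.0) -> float:  # dry asphalt
    v = speed_kmh / 3.6                    # km/h -> m/s
    reaction = v * reaction_s              # distance covered before braking
    braking = v * v / (2 * decel_mps2)     # kinematics: v^2 / (2a)
    return reaction + braking

# At 100 km/h (~62 mph): ~42 m reacting + ~55 m braking ≈ 97 m total.
print(f"{stopping_distance_m(100):.0f} m")   # -> 97 m
```

By that arithmetic, anyone following only a couple of car lengths behind at highway speed had no chance of stopping, whatever was driving the car in front.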
@kashhill @fsinn Did you watch the video? The car put its blinker on & tried to pull over safely – as any vehicle with a medical-emergency feature would.
The following crashes were inattentive drivers who desperately NEED automation, because the human reaction was WORSE. A sea of red lights, yet the pile-up kept growing instead of people catching on!
Under well-lit conditions with no ice/snow/fog, come on! Insurance would put everyone in the rear who failed to stop majority at fault, going by my Los Angeles rush-hour experience.
@[email protected] As someone who has a #Tesla and regularly uses #Autopilot, that looks more like someone taking the car out of AP and pulling over. Based on past history, where "It was the car, not me!" has become the new "My dog ate my homework", I'll withhold judgement until *after* the investigation. Also, what on Earth was up with those *human drivers* where a car stopping from moderate speeds over 10 seconds in broad daylight on dry pavement led to an 8-car pileup?
It's completely nuts. Beta testing of vehicles, airplanes, spaceships… all of it should be done in a controlled area under supervision. Way too dangerous otherwise.
In light of Tesla CEO Elon Musk’s response to Kim Paquette about how safe Tesla vehicles are, I wanted to share a quick comparison between Tesla’s data and overall auto accident data from the National Highway Traffic Safety Administration. We have not reported on the latest safety update from the company. In response to Kim, […]