Former Uber self-driving chief crashes his Tesla on FSD, exposes supervision problem
https://electrek.co/2026/03/17/former-uber-self-driving-chief-tesla-fsd-crash-supervision-problem/
"...What makes this account particularly striking is Krikorian’s background. At Uber’s Advanced Technologies Center, he ran the team building autonomous vehicles and trained human safety drivers on exactly when and how to intervene when a self-driving system fails...."
🤔
LOL this is the problem with relying on AI tools, as well...
"...His core argument: Tesla is asking humans to supervise a system that is specifically designed to make supervision feel pointless. As he puts it, an unreliable machine keeps you alert, and a perfect machine needs no oversight, but one that works almost perfectly creates a trap where drivers trust it just enough to stop paying attention.
The research backs this up. Psychologists call it the “vigilance decrement”, monitoring a nearly perfect system is boring, boredom leads to mind-wandering, and drivers need 5 to 8 seconds to mentally reengage after an automated system hands control back. But emergencies unfold faster than that...."
@ai6yr every time
This publication comes to mind:
https://how.complexsystems.fail
As does a Human Factors lecture I attended last century (ugh) on the amount of money spent on psychological research to make fighter plane cockpits human-goof-proof, ON TOP of the extended, intense, and repeated training pilots go through.
One of the points in the early 90's was cars were becoming too complex for mere untrained humans to cope with, with next to no thought about the human-tech interface required.
@johannab @ai6yr It’s also where standards help and “innovation” breaks muscle memory and consistency. Cars have always had quirks and differences, but increasingly their user interfaces diverge between makes, sometimes in ways that seem small until they cause a crash.
- I have two cars (a Volvo and a Kia) whose interfaces do some things exactly opposite of each other (on one you push the stalk up to control the windshield wipers; on the other you push it down). That's minor.
More serious: their safety systems behave differently.
@nazokiyoubinbou @ai6yr @Rycaut @johannab There is a "zero" point at about 1/4 of the pedal travel: that's coasting, no braking. Zero throttle input is full regen braking.
This also activates the brake lights even though your speed doesn't change much. Ever follow a Tesla whose brake lights flash constantly? One-pedal mode and poor footwork.
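The mapping described above can be sketched roughly as a single pedal axis with a coast point partway through its travel: below it the motor applies regen braking, above it drive torque. This is only an illustration; the 1/4 coast point is taken from the post, while the linear scaling, torque units, and the brake-light threshold are made-up assumptions, not Tesla's actual control logic.

```python
def one_pedal_torque(pedal: float, coast_point: float = 0.25) -> float:
    """Map accelerator pedal travel (0.0..1.0) to normalized motor torque.

    Below coast_point the motor regen-brakes, reaching full regen (-1.0)
    at zero travel; above it, drive torque ramps up to +1.0 at full travel.
    The linear ramps are illustrative guesses, not a real calibration.
    """
    pedal = max(0.0, min(1.0, pedal))          # clamp to valid travel
    if pedal < coast_point:
        # 0 travel -> -1.0 (full regen), coast_point -> 0.0 (coasting)
        return -1.0 * (coast_point - pedal) / coast_point
    # coast_point -> 0.0, full travel -> +1.0 (full drive torque)
    return (pedal - coast_point) / (1.0 - coast_point)

def brake_lights_on(torque: float, threshold: float = -0.2) -> bool:
    # Lights trigger once regen deceleration passes a threshold;
    # the threshold value here is purely hypothetical.
    return torque <= threshold
```

With this sketch, fully lifting off the pedal gives `one_pedal_torque(0.0) == -1.0` and lights the brake lights, while resting at the coast point gives zero torque and no lights, which is why sloppy footwork around that point would flash them constantly.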
@kajer @ai6yr @Rycaut @johannab I have never followed one, no. I only see them flying up in my rear view mirror and then missing me by 0.3cm as they swerve without a blinker at the last possible instant.
I just remembered I've done this before, at least in a game. I had a wheel for my computer, and something detected it wrongly: it refused to see the brake axis and treated the throttle as a combined brake+gas axis like this, with negative values braking and positive values throttle.
It was horrible. I felt like I was constantly threading a needle, and it actually made my foot hurt.
I started to say I can't imagine how they do that continuously, but I suppose they don't. I imagine they have their foot all the way down most of the time, then all the way up at the last possible instant the rest of the time.
@kajer @nazokiyoubinbou @ai6yr @Rycaut @johannab
Oh now THAT is bizarre.
And also rather horrible.
Although, on the other hand, I guess that ends an issue I have ALWAYS despised: automatic-transmission cars constantly trying to scoot forward when you take your foot off the brake. That has left me wondering what happens if somebody passes out and the car just keeps moving under power, and it also feeds the hugely annoying tendency for people to constantly creep forward at every damn red light and stop sign...