Former Uber self-driving chief crashes his Tesla on FSD, exposes supervision problem

https://electrek.co/2026/03/17/former-uber-self-driving-chief-tesla-fsd-crash-supervision-problem/

#tesla #crash

Raffi Krikorian, Mozilla’s CTO and the former head of Uber’s self-driving car division, totaled his Tesla Model X while using...

Electrek
VERY glad the guy and his kids are okay, but it would have been something else if the Uber self-driving chief had been killed or incinerated by a self-driving car. 🤔

"...What makes this account particularly striking is Krikorian’s background. At Uber’s Advanced Technologies Center, he ran the team building autonomous vehicles and trained human safety drivers on exactly when and how to intervene when a self-driving system fails...."

🤔

LOL this is the problem with relying on AI tools, as well...

"...His core argument: Tesla is asking humans to supervise a system that is specifically designed to make supervision feel pointless. As he puts it, an unreliable machine keeps you alert, and a perfect machine needs no oversight, but one that works almost perfectly creates a trap where drivers trust it just enough to stop paying attention.

The research backs this up. Psychologists call it the “vigilance decrement”, monitoring a nearly perfect system is boring, boredom leads to mind-wandering, and drivers need 5 to 8 seconds to mentally reengage after an automated system hands control back. But emergencies unfold faster than that...."

#AI

Supermoosie (@[email protected])
Mastodon Australia
@SuperMoosie They only give control back as you are about to die
@ai6yr @SuperMoosie ... so they can avoid liability by pretending that the crash is the driver's fault, not the car's.
@msbellows @SuperMoosie Yeah, funny how that system works.

@msbellows @ai6yr @SuperMoosie The NHTSA rules for collecting crash data include whether an automated system was active within 30 seconds of the time of impact.

"Reporting Requirements" https://www.nhtsa.gov/laws-regulations/standing-general-order-crash-reporting

Standing General Order on Crash Reporting | NHTSA

NHTSA has issued a General Order requiring the reporting of crashes involving automated driving systems or Level 2 advanced driver assistance systems.

NHTSA
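As a rough illustration of the 30-second reporting window mentioned above, here's a minimal sketch of the criterion. The class and field names are hypothetical for illustration; they are not NHTSA's actual reporting schema, and the real Standing General Order has additional criteria beyond this timing check.

```python
from dataclasses import dataclass
from typing import Optional

# Per the Standing General Order, a crash is reportable if the ADS/Level 2
# system was in use at any time within 30 seconds of the impact.
REPORTING_WINDOW_S = 30.0

@dataclass
class CrashEvent:
    impact_time_s: float                    # timestamp of impact (seconds)
    ads_last_active_s: Optional[float]      # when the automated system was last
                                            # active; None if it never engaged

def is_reportable(event: CrashEvent) -> bool:
    """True if the automated system was active within 30 s before impact."""
    if event.ads_last_active_s is None:
        return False
    gap = event.impact_time_s - event.ads_last_active_s
    return 0.0 <= gap <= REPORTING_WINDOW_S

# A system that hands control back 5 seconds before impact
# still falls inside the reporting window:
print(is_reportable(CrashEvent(impact_time_s=100.0, ads_last_active_s=95.0)))
```

The point of the window is exactly what the commenters describe: a disengagement moments before the crash does not move the crash outside the reporting requirement.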

@ai6yr someone has to take the fall and it won't be tech bros

@ai6yr @SuperMoosie

Call this the #JustinTrudeauManeuver
Take credit while things are smooth sailing, but pull the ripcord and bail when things look like they're about to get a little bit spicy.