I just do not understand how "beta" self-driving features are allowed to be released on public highways and roads. There are so many other drivers on the road who have NOT opted into the beta test but who are forced to take part. This footage from the Tesla pile-up that happened on the Bay Bridge in San Francisco/Oakland the day that the "Tesla Full Self-Driving Beta" was launched is maddening: https://theintercept.com/2023/01/10/tesla-crash-footage-autopilot/
Exclusive: Surveillance Footage of Tesla Crash on SF’s Bay Bridge Hours After Elon Musk Announces “Self-Driving” Feature

Elon Musk has said Tesla’s problematic autopilot features are “really the difference between Tesla being worth a lot of money or worth basically zero.”

The Intercept
@kashhill I think overall human drivers are much worse. I see it every day, and the accident rate in the U.S. is horrible. In this example, isn't one of the main causes of the accident the drivers who rear-ended the Tesla (and the chain reaction) because they were following too closely (or not paying attention)? A responsible driver is supposed to leave enough room to stop. Also, the driver of the Tesla should have hit the accelerator instead of letting the car stop.
@garykrysztopik Yes, I also agree that humans should still be "in the loop," as they say. But the way Tesla is marketing it seems to suggest that drivers don't need to be.
@kashhill Yes, Tesla's "FSD" (Full Self-Driving) is supposed to become completely autonomous, with the goal of being able to send your car out as a basically driverless Uber. Everything so far has been a "beta" release that requires the driver to maintain situational awareness and control of the vehicle. It's way behind schedule but getting closer every day. Again, with 2 million cars and billions of miles of data collection, they are probably closer than anyone else, but there are infinite scenarios to account for, so it's a lot harder than anyone imagined.