San Francisco protestors are disabling autonomous vehicles using traffic cones | "It's a great time"

https://lemmy.world/post/1347033

Safe Streets Rebel’s protest comes after autonomous vehicles were blamed for incidents including crashing into a bus and running over a dog. City officials in June said…

Thousands of accidents a year from human drivers. I sleep

90 accidents a year from autonomous vehicles. Lazer eyes

You make it sound like it’s a 50/50 split between human drivers and autonomous vehicles, which is definitely not the case.

There are way more human drivers than autonomous vehicles. So, when an autonomous vehicle runs your child or pet over or whatever, who do you blame? The company? The programmers? The DMV for even allowing them on the road in the first place?

What does an autonomous vehicle do if it gets a flat? Park in the middle of the interstate like an idiot and phone home for a mechanic instead of pulling over?

You first need to ask yourself whether it is more important to assign blame than to minimize risk.

“Autonomous vehicles could potentially reduce traffic fatalities by up to 90%.”

“Autonomous vehicle accidents have been recorded at a slightly lower rate compared with conventional cars, at 4.7 accidents per million miles driven.”

blog.gitnux.com/driverless-car-accident-statistic…

Driverless Car Accident Statistics And Trends in 2023 • GITNUX

That opinion puts a lot of blind faith in the companies developing self-driving and their infinitely altruistic motives.

That’s one way of strawmanning your way out of a discussion.

It’s not a strawman argument, it’s a fact. Without the ability to audit the entire codebase of self-driving cars, there’s no way to know whether a manufacturer has knowingly hidden something in the code that could cause accidents and fatalities linked to a fault in the self-driving technology.
Maneuvering Characteristics Augmentation System - Wikipedia

We can’t audit the code for humans, but we still let them drive.

If the accident rate for computer drivers is lower than for human drivers, and the companies designing them are held as financially liable for crashes as human drivers are, why shouldn’t we let computers drive?

I’m not fully in either camp in this debate, but fwiw, the humans we let drive generally suffer consequences if an accident is due to their own negligence.

And I’m not denying that. However, the bar for convicting someone of vehicular manslaughter is very high, and it usually requires evidence that the driver was grossly negligent.

If you can show that a computer can drive as well as a sober human, where is the gross negligence?

Also, we do audit them; it’s called a license. I know it’s super easy to get one in the US, but in other countries the requirements can be quite stringent.