San Francisco protestors are disabling autonomous vehicles using traffic cones | "It's a great time"

https://lemmy.world/post/1347033

Safe Streets Rebel’s protest comes after autonomous vehicles were blamed for incidents including crashing into a bus and running over a dog. City officials in June said…

Thousands of accidents a year from human drivers. I sleep

90 accidents a year from autonomous vehicles. Laser eyes

You make it sound like it’s a 50/50 split between human drivers and autonomous vehicles, which is definitely not the case.

There are way more human drivers than autonomous vehicles. So, when an autonomous vehicle runs your child or pet over or whatever, who do you blame? The company? The programmers? The DMV for even allowing them on the road in the first place?

What’s an autonomous vehicle supposed to do if it gets a flat? Park in the middle of the interstate like an idiot and phone home for a mechanic instead of pulling over?

You first need to ask yourself whether it is more important to assign blame or to minimize risk.

“Autonomous vehicles could potentially reduce traffic fatalities by up to 90%.”

“Autonomous vehicle accidents have been recorded at a slightly lower rate compared with conventional cars, at 4.7 accidents per million miles driven.”

blog.gitnux.com/driverless-car-accident-statistic…

Driverless Car Accident Statistics And Trends in 2023 • GITNUX

That opinion puts a lot of blind faith in the companies developing self-driving cars and their infinitely altruistic motives.

That’s one way of strawmanning your way out of a discussion.

It’s not a strawman argument, it is a fact. Without the ability to audit the entire codebase of self-driving cars, there’s no way to know whether a manufacturer has knowingly hidden something in the code that could cause accidents and fatalities linked to a fault in the self-driving technology.

Maneuvering Characteristics Augmentation System - Wikipedia

We can’t audit the code for humans, but we still let them drive.

If the accident rate for computer drivers is lower than for humans, and the computers’ designers are held as financially liable for car crashes as human drivers are, why shouldn’t we let computers drive?

Because there’s no valid excuse to prevent us from auditing their software, and doing so could save lives. Why the hell should we allow them to use the road if they won’t even let us inspect the engine?

Why the hell should we allow them to use the road if they won’t even let us inspect the engine?

How do you think a car gets approved right now? Do we take it apart? Do we ask for the design calculations of how they designed each piece?

That isn’t what happens. There is no “audit” of the parts or the whole. Instead, there is a series of tests to determine road worthiness that everything in a car has to pass. We’ve already accepted a black box for the electronics of a car. You don’t need to get approval of your code to show that pressing the brake pedal causes the brake lights to turn on; they just test it to make sure that it works.

We already don’t audit the code for life-critical software. It is all liability taken on by the manufacturers and verified via government testing of the finished product. What is an audit going to accomplish when we don’t do one already?