So now that things have quieted down a little bit from yesterday, here are my thoughts on the #Tesla recall of their #FSDBeta product...

First things first, Tesla does not sell any vehicle that is capable of "self driving" or "driving itself" at any time.

The human driver is *always* driving.

I do wish that media publications and editors (who generally write the article titles) would cease this practice. It is harmful to public safety.

https://www.vice.com/en/article/pkg7ek/us-gov-says-tesla-self-driving-may-cause-crashes-issues-recall

US Gov Says Tesla Self-Driving May Cause Crashes, Issues Recall

The safety regulators issued a recall affecting 362,000 cars, but it also highlights the shortcomings of the current recall system.

Unsurprisingly, Musk took to Twitter to state his displeasure with the term "recall" in the context of this action.

"Recall" is just a single, high-level term that captures the existence of a safety-related defect, *independent* of its corrective action.

From a consumer-confusion standpoint, it seems quite unnecessary (and with no clear public safety upside) to establish a host of terms that depend on the corrective action.

A "recall" in the context of a #NHTSA regulatory process is effectively a public notification that a defect exists or existed in a vehicle, a description of the defect and, potentially, a corrective action associated with the defect.

Automobile manufacturers in the US are allowed to offer a variety of corrective actions under a "recall" - some of which do not require the customer to return the product to the manufacturer or a service center.

So, that is the high-level overview on that.

Ok, so moving on.

A fundamental, structural problem with the #NHTSA is that they have zero ability to independently scrutinize internal automaker development and validation *processes*.

Instead, the NHTSA takes the easy way out... focusing on observed defects in the field or "endpoints".

The trouble with that approach is once a product is on the roadway, it is extremely difficult to quickly and efficiently "catch" defects before there is actual (typically outsized) death and injury.

The #NHTSA also almost entirely ignores the role that Human Factors play in the overall systems safety of the vehicle - that is, the non-trivial dynamics between human limitations and the engineered system.

From the NHTSA's perspective, it is far easier to blame the human driver through assumptions than it is to use evolving, hard science to ensure safe vehicle design.

That was always an issue, but even more so now, as partially automated driving systems are fundamentally a Human Factors issue.

Building on that, *the* structural systems safety issue with #FSDBeta is that it attempts to mesh the very real limitations of an enormous pool of possible human drivers with an opaque automated driving system that *appears* unlimited in its capabilities.

A hard mismatch therefore exists.

An extremely dangerous mismatch.

#Tesla does this in an effort to provide Tesla owners with a *sense* that their vehicle is capable of "driving itself".

But these concepts are foreign to the NHTSA...

The other core issue with the #FSDBeta program is that validation (which is very different from "QA") is not possible when the Operational Design Domain (ODD) of the automated vehicle is effectively unbounded.

Or, more accurately, Tesla has presented nothing in terms of what would undoubtedly be a revolutionary, ground-breaking safety case that describes such a validation process.

Instead, Tesla clumsily argues that "AI training" is equivalent to validation - which it is not.

Even internal #Tesla employees who reportedly "test" the FSD Beta product prior to release to external consumers are practically limited, because the ODD of the system is so large (now encompassing the entire US and most of Canada).

The only testing that can occur is over an extremely small part of this enormous, highly-complex, highly-dynamic ODD.

Tesla is not performing validation; they are essentially performing a "QA" process similar to those that consumer electronics undergo.

Such "QA" processes are extremely cheap compared to safety-critical system validation - which is why #Tesla embraces them.

Ultimately, #Tesla operates under a "safety case" in the #FSDBeta program that mimics the default position of the #NHTSA - that is, why is there a need for Human Factors considerations and robust validation scrutiny when the human driver can just be blamed by everyone?

This is where this particular NHTSA recall action falls terminally short.

The NHTSA investigators performed a series of assessments in a #FSDBeta-equipped vehicle in what was undoubtedly a small area, or a series of relatively small areas (relative to the enormous size of FSD Beta's ODD), and made some observations of automated vehicle behavior that deviated from established roadway regulations.

That misses the point entirely.

What is *really* needed is scrutiny of Tesla's systems safety lifecycle (should they actually have one) and Safety Management System.

Scrutiny of internal *processes* over arbitrary endpoints.

That is where automotive regulations really need to go.

That is where automated driving system scrutiny needs to go industry-wide.

Otherwise, it is just tail-chasing... all while the public is being unquantifiably harmed.

It really is broader than #Tesla - although Tesla has been one of the more *visibly* problematic cases in the automated driving system space.

That does not preclude wrongdoings by others that are less visible.

Those should be some of the real takeaways of this recall event in my view.

This is where the #NHTSA falls short in evolving to protect the public.

Lastly, consider reviewing Professor Phil Koopman's response to this NHTSA recall action (on LinkedIn) - as he touches on a few other aspects of this.

Be sure to review Professor Koopman's follow-up comments on his post as well.

Professor Koopman is one of the foremost experts in embedded systems, systems safety and automated driving systems.

https://www.linkedin.com/posts/philip-koopman-0631a4116_nhtsa-recall-23v-085-tesla-fsd-activity-7032047836225511425-PmDT

#FSDBeta #Tesla

Philip Koopman on LinkedIn: NHTSA Recall 23V-085 Tesla FSD

Tesla recall: Full Self-Driving Software May Cause Crash "FSD Beta software that allows a vehicle to exceed speed limits or travel through intersections in an…"

@adamjcook

Buy a Tesla, Become a Beta Tester!