Elektrek: "Tesla FSD Beta tried to kill me last night"

https://lemdro.id/post/1114588


Auto pilot beta? People are willing to test betas for cars? Are you insane? Insurance is going to have a field day.

Not Auto Pilot (AP). There’s a difference between FSD and AP. AP will just keep you between the lane lines and pace the car in front of you. It can also change lanes when told to. There’s also Enhanced Auto Pilot (EAP). EAP was supposed to bridge the gap between AP and FSD. It would go “on ramp to off ramp”. So it could switch lanes as needed and get to exit ramps. FSD is the mode where you shouldn’t need to touch it outside of answering the nag (the frequent nag to “apply force to the steering wheel” to tell it you are still alive and paying attention)*.

  • At least I think that’s the same for FSD. I’m only on AP with AP1 hardware. Never had an issue that I’d blame on a “bug” or the software doing something “wrong”.
I'm not getting what this reply has to do with the comment you appear to have replied to
It’s the beta part that scares me the most, the type of assistance isn’t really relevant. People shouldn’t be driving around in betas. These aren’t phones.
What bothers me is, I have to drive on the road with people running some braindead Elon Musk software?
Have you seen how humans drive? It's not a very high bar to do better.
I was taught to always drive defensively. You never know when someone’s going to get distracted, get stupid, have a stroke… add glitchy robots to the list, it doesn’t make a whole lot of difference.
And yet FSD is still worse than the one time I got in the car with an exchange student who had never driven a car before coming to the US and thought her learners permit was the same as a driver’s license.
From what I read, Auto Pilot (AP) just keeps you in your lane, while Full Self Driving (FSD) just switches lanes into oncoming traffic.

Funny how George Hotz of Comma.ai predicted this exact same issue years ago: “if I were Elon Musk I would not have shipped that lane change”.

This issue likely arises because the car’s sensors can not look “far enough ahead” in the lane it changes to, which can lead to crashes from behind due to much faster cars and, in this case, lane confusion as the car can not see oncoming traffic.

Even better, several people have died using it or killed someone else. It also has a long history of driving underneath semi truck trailers. Only Europe was smart enough to ban this garbage.

FSD has never driven under a truck, that was autopilot, which is an LKAS system. The incident happened 1 year prior to “Navigate on autopilot”, so the car in question was never even able to change lanes on its own. The driver deliberately instructed the car to drive into the trailer.

FSD beta is currently available in most of Europe and has been for several months.

FSD has never driven under a truck

Yes it has. Well, into the back of one so fast that it went under at least.

which is an LKAS system.

So is FSD. 🤣 It’s Level 2, bud. You’re really REALLY confused for someone pretending to own one.

The driver deliberately instructed the car to drive into the trailer.

Are you saying Josh Brown killed himself? Because if you are, that would be a new repulsive low even for you Elon simps.

The craziest part of the article is just how much effort the author put into collecting data, filing feedback, and really hoping that Tesla could pull the videos (they can), and then went on to actively try, and succeed, at recreating the problem at high speed next to another car.

Electrek has a long history of anti-Tesla clickbait. Take this with a grain of salt.

Teslas are factory equipped with a 360 degree dashcam yet we never see any footage of these alleged incidents.

I saw the videos of them running over infants in strollers. Does that count?
on FSD? link please
The ones from that guy who runs his own competing autonomous driving company, who also refused to allow anyone else to perform the test with the car (which was all proven to be bullshit later because he was hitting the accelerator pedal)? There’s a lot of misinformation and FUD floating around out there.

Dan O’Dowd of Green Hills Software. You should listen to the podcast with Whole Mars Catalog of him trying to explain himself. It’s really wild.

Tesla took him to court and won

Are you kidding me? Youtube is full of Tesla FSD/Autopilot doing batshit crazy things.
so can you provide a link to an accident caused by FSD?
Musk just did a 20 minute video that ended with it trying to drive into traffic.
this one? Where does it drive into traffic? youtu.be/aqsiWCLJ1ms?si=D9hbZtbC-XwtxjpX
UNCUT: Elon Musk’s MIND BLOWING Live Tesla FSD Demo

YouTube
The video ended when he made an “intervention” at a red light. I’m not watching whatever link that is because I’m not a masochist.

The video didn’t end there; that happened at the beginning. What you’re referring to is a regression specifically with the HW3 Model S that failed to recognize one of the red lights. Now I’m sure that sounds like a huge deal, but here’s the thing…

This was a demo of a very early alpha release of FSD 12 (current public release 11.4.7) representing a completely new and more efficient method of utilizing the neural network for driving and has already been fixed. It is not released to anyone outside of a select few Tesla employees. Other than that it performed flawlessly for over 40 minutes in a live demo.

It has to perform flawlessly 99.999999% of the time. The number of 9s matters. Otherwise, you are paying some moron to kill you and perhaps other people.

OK, so I'm totally in agreement, but 99.999999% is one accident per hundred million miles traveled. I don't think there should be any reasonable expectation that such a technology can ever possibly get that far without real-world testing. Which is precisely where we are now. Maybe at 4 or 5 9s currently.
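To make the "number of nines" concrete, here's a quick sketch (treating each mile driven as a trial with a fixed per-mile failure rate, which is obviously a simplification of real crash statistics):

```python
def miles_per_accident(nines: int) -> float:
    """Expected miles between accidents if the system succeeds with a
    per-mile reliability of `nines` nines (e.g. 4 nines = 99.99%)."""
    per_mile_failure_rate = 10.0 ** -nines  # e.g. 8 nines -> 1e-8 failures/mile
    return 1.0 / per_mile_failure_rate

# 99.999999% (8 nines) -> one accident per hundred million miles
print(miles_per_accident(8))  # 100000000.0

# 99.99% (4 nines, roughly where the comment above puts it today)
print(miles_per_accident(4))  # 10000.0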

If you do actually want that level of safety, which, let's be honest, we all do, or ideally 100% safety, how would you propose such a system be tested and deemed safe if not how it's currently being done?

Your posts here show you’re not interested in reality, but I’ll leave a link anyway

motortrend.com/…/tesla-fsd-autopilot-crashes-inve…

Excited to see your response about how this is all user error.

This Is Your Tesla FSD and Autopilot Crash Mega Thread

A reminder that Tesla’s Autopilot and Full Self Driving systems are anything but.

MotorTrend

I’m sure you’re just going to downvote this and move on without reading but I’m going to post it anyway for posterity.

First, a little about me. I am a software engineer by trade with expertise in cloud and AI technologies. I have been an FSD beta tester since late 2020 with tens of thousands of incident-free miles logged on it.

I’m familiar with all of these incidents. It’s great that they’re in chronological order; that will be important later.

I need to set some context and history because it confuses many people when they refer to the capabilities of autopilot and FSD. Autopilot and FSD (Full Self-Driving) are not the same thing. FSD is a $12,000 option on top of any Tesla, and no Tesla built prior to 2016 has the hardware capability to run FSD.

The second historical point is that FSD did not have any public release until mid-2022, with some waves of earlier releases going to the safest drivers starting in mid-2021. Prior to that it was exclusive to Tesla employees and a select few trusted beta testers in specific areas. Any of the issues in this article prior to mid-2021 are completely irrelevant to the topic.

Tesla’s autopilot system is an LKAS (lane keep assist system). This is the same as is offered by Honda (Honda Sensing), Nissan (Pro Pilot Assist), Subaru, Cadillac, etc. Its capabilities are limited to keeping you in your lane (via a front-facing camera) and maintaining distance to the car in front of you (via radar, or cameras in later models). It does not automatically change lanes. It does not navigate for you. It does not make turns or take exits. It does not understand most road signs. Until 2020 it did not even know what a red light was. It is a glorified cruise control and has always been presented as such. Tesla has never advertised this as any sort of “hands-off” system where the driver does not need to pay attention. They do not allow the driver to take their attention off the road in FSD either, requiring hands on the wheel applying constant torque as well as eyes on the road (via an interior camera) in order to work. If you are caught not paying attention enough times, the system will disengage, and with enough violations it will even kick you out of the program.

OK, now that being said, lets dig in:

November 24, 2022: FSD malfunction causes 8-car pile-up on Bay Bridge

  • I’m from the area and have driven this exact spot hundreds of times on FSD and have never experienced anything even remotely close to what is shown here
  • "Allegedly" with FSD engaged
  • Tesla FSD “phantom” braking does not behave like this, and never has in the past. Teslas have 360 degree vision and are aware of traffic in front of and behind them.
  • Notice at the beginning of the video that this car was in the process of a lane change, this introduces a couple of possibilities as to what happened here, namely:
  • Teslas do have a feature under autopilot/FSD where, after multiple warnings for the driver to pay attention and no engagement, the car will slow down, pull over to the shoulder and stop. This particular part of the Bay Bridge does not have a shoulder, so it would have stopped where it was. This seems unlikely, since neural networks are very capable of identifying what a shoulder is and recognizing that it’s in an active lane of traffic, and even with Tesla’s massive fleet of vehicles on FSD there are no other recorded instances of this happening anywhere else.
  • This particular spot on the bay bridge eastbound has a very sudden and sharp exit to Yerba Buena Island. What I think happened is that the driver was aiming for this exit, saw that they were about to miss it and tapped the brake and put on the turn signal not realizing that they just disengaged FSD. The car then engaged regen braking and came to a full stop.
  • When a tesla comes to a full stop automatically (an emergency stop) it puts the hazards on automatically. This has been a feature since the v1 autopilot days. This car’s hazards do not come on after the stop.
  • What seems especially weird to me is that the driver continued to let the car sit there at a full stop while traffic piled up behind them. In FSD you are always in control of your own car and all it would have taken is tapping the accelerator pedal to get moving again. FSD will always relinquish control over the car to you if you tap the brakes or grab and move the steering wheel hard enough. Unless there was some mechanical issue that brought the car to a stop and prevented it from moving, in which case this is not the fault of the FSD software.
  • Looking at how slowly the car decelerated, this seems to very clearly be the car using regen braking, not emergency braking. I’m almost positive this means that FSD was disengaged completely.
  • We don’t have all the facts on this case yet and I’ll be anxious to see how this plays out in court but there are definitely many red flags on this one that have me questioning what actually happened here, but I doubt if FSD has anything to do with it.
  • If my earlier point is true this is actually an instance of an accident being caused because the driver disengaged self-driving. The car would have been much safer if the driver wasn’t even there.

April 22, 2022: Model Y in “summon mode” tries to drive through a $2 million jet

  • This one is a favorite among the tesla hate community. Understandably so.
  • Smart summon has zero to do with FSD or even autopilot. It is a party trick to be used under very specific supervised conditions
  • Smart summon relies exclusively on the front camera and ultrasonic sensors
  • While smart summon is engaged, the user still has full control over their car via the phone app. If the car does anything unexpected you only need to release your finger from the button and the car stops immediately. The “driver” did not do this and was not supervising the car, the car did not see the jet because it was entirely above the ultrasonic sensors, and as I’m sure you can understand the object recognition isn’t exactly trained on parked airplanes.
  • The app and the car remind the driver each and every time it is engaged that they need to be within a certain range and within eyesight of the car to use it. If you remote control your car into an obstacle and it causes an accident, it’s your fault, period.
  • Tesla is working on a new version of smart summon which will make this feature more useful in the future.

February 8, 2022: FSD nearly takes out bicyclist as occupants brag about system’s safety

  • I suggest actually watching the video here. The headline is highly at odds with what is actually in the video, but the vid is just over an hour long, so I bet most people don’t bother watching it.
  • “It wouldn’t have hit them, it definitely wouldn’t have hit them. Do we need to cut that?” "No, you can keep it in"
  • If you look at what was happening on the car’s display, it detected someone entering the crosswalk and stepping out into traffic on the left side. The car hit the brake, sounded an alert and swerved to the right. There was a bicycle in front of where the car swerved but at no point was it about to “nearly take out a bicyclist”. It did definitely overreact here out of safety but at no point was anyone in danger.
  • Relatively speaking this is a very old version of FSD software, just after the first wave of semi-public release.

December 6, 2021: Tesla accused of faking 2016 Full Self Driving video

  • lol

March 17, 2021: Tesla on Autopilot slams into stationary Michigan cop car

  • Now we’re getting into pre-FSD autopilot. See above comments about the capabilities of autopilot. Feel free to compare these to other cars’ LKAS systems. You will see that there are still lots of accidents across the board even with LKAS. That is because it is an assist system and the driver is still fully responsible and in control of the car.

June 1, 2020: Tesla Model 3 on Autopilot crashes into overturned truck

  • Again, pre-FSD. If the driver didn’t see the overturned truck and disengaged to stop then I’m not sure how anyone expects a basic LKAS system to be able to do that for them.

March 1, 2019: NHTSA, NTSB investigating trio of fatal Tesla crashes

  • This one involves a fatality, unfortunately. However, the car was not self-driving. There is something else very important to point out here:
  • The feature that allows Teslas to change lanes automatically on the freeway (Navigate on Autopilot) was not released until a year after this accident happened. That means that if AP was engaged in this accident, the driver deliberately instructed the car, via engaging the turn signal, to merge into that truck.

May 7, 2016: First known fatality involving Tesla’s Autopilot system

  • Now we’re getting way back into the V1 autopilot systems, which weren’t even made by Tesla. This uses a system called Mobileye, made by a third party, and is even less capable than V2 autopilot.

So, there we go. FSD has been out to the public for a few years now to a massive fleet of vehicles, driving collectively millions upon millions of miles and this is the best we’ve got in terms of a list showing how “Dangerous” it is? That is pretty remarkable.

Excited to see your response.

Video shows 8-car pileup after Tesla stops in highway tunnel

The Tesla driver said at the time of the accident that he had been using the carmaker's Full Self-Driving software, The Intercept reported.

Insider
See my huge post about that very accident. Do you have any other “Many many examples”?

Here is more: motortrend.com/…/tesla-fsd-autopilot-crashes-inve…

How many do you want?

This Is Your Tesla FSD and Autopilot Crash Mega Thread

A reminder that Tesla’s Autopilot and Full Self Driving systems are anything but.

MotorTrend
ffs that is the exact same article again. Please read my other comment (the huge one) and let me know if anything doesn’t make sense or you find anything factually inaccurate.
Hilariously I’ve also seen them accused of a pro-Tesla bias. Personally I think they are pretty balanced.
They do whatever gets them clicks. Facts do not matter.
And this opinion is based on what?
The Dangerous Difference Between Electrek, Journalism, and Truth

Editor Frederic Lambert is to truth what Autopilot is to safety.

The Drive

Honestly the quality of journalism in this article is pretty low. Some of the points are valid but most are just nitpicks about little opinion pieces at the ends of the articles. I don’t find these particularly valuable, and they sometimes contain some bad takes as pointed out here, but that’s not an issue of factual reporting. So the worst they’ve identified is a few minor omissions which, sure, but if you write thousands of articles that’s going to happen.

And by the way, this article is making the case that Electrek is deliberately biased towards Tesla, not away from them. So if anything it undermines your point.

I think the scandal about car referrals was pretty suspicious, but again, when you look at their reporting it comes across as pretty balanced. Perhaps you could argue they talk too much about Tesla, but they cover the good and the bad. And I would say almost everyone in America has been talking about Tesla too much for quite some time.

They are for sure not balanced. Fred might have become more realistic about Elon and his bullshit once it was guaranteed he would never get his Roadster. That doesn’t mean he’s balanced.
Ah yes, there’s no readily available footage of the dead bodies flying into the street or being crushed under the wheels so it’s made up. Of course.
Not all accidents are that violent. I would even accept a simple fender bender. Those should be pretty common if FSD is as dangerous as a lot of people are implying, right?
Look man, I don’t like children either but wanting more child mowing cars out on the road is pretty twisted.
Wait, are you now suggesting you won’t accept that Teslas with FSD ever get into accidents without video evidence? FSD is perfect?
No, I would never suggest that. The overwhelming consensus here is “FSD is dangerous. More dangerous than humans.” I’m asking for any proof of that here. So far, nothing. If they were getting into accidents all the time there would be all kinds of footage, no?
You sure did suggest that when you said you would even accept a fender bender.
Do you have any footage to share of FSD fender benders? If not how can you even claim it’s dangerous? Every car equipped with FSD hardware is equipped with 360 dashcams. It should be really easy to find some footage where FSD is at-fault for an accident.
Again- you’re suggesting they’re perfect by implying they don’t even get into fender benders.
No, what I’m suggesting is that currently, as of today, they don’t. There will come a day when it does cause an accident. Any self-driving system will be at risk of that as long as other humans share the road. What is most important is that we don’t lose sight of the accidents it prevents. As it stands right now, hundreds are dying daily in auto accidents in the US. Any effort to dismiss or shut down self-driving car programs is an incredible disservice to road safety when there’s no evidence to suggest it’s as dangerous as or more dangerous than the average human driver.
So your contention is that FSD has not caused a single accident of any sort? And the reason for that is that you’ve never seen video of it?
No, I’m saying that if it was extraordinarily dangerous and causes “lots of accidents” as others have suggested, shouldn’t there at least be something to back that up? Some kind of footage? I mean, ffs, they’re recording all the time; it shouldn’t be hard.
Except you keep bringing up fender benders, which are not dangerous at all, and suggesting there hasn’t been a single one ever with Tesla’s FSD cars being at fault.
I said I would accept any form of proof of an accident. Any sort of accident. Major or minor. Are you just trolling?
Again, implying that there has never been one since you haven’t seen it. I’m not trolling, I’m showing you what you’re implying. Either you admit that it is likely that a FSD car has caused at least one accident, no matter how minor, or you say you won’t accept it until you see it for yourself. You are doing the latter, which is a silly position to take.
You’re missing the point. The whole thing is to prove that FSD is more dangerous than a human. I’m asking for dashcam proof because that should by far be the easiest thing to get. Given the number of clickbait, misinformed articles about the subject, I’m very confident an actual major accident caused by FSD would be all over the news. It would be front page everywhere. All aboard the Elon hate train. The fact that nobody can produce evidence of even a minor FSD-related accident, and people instead just call me a Tesla fanboy and downvote everything, is baffling to me. Is it at all possible, just maybe, that FSD is a safe technology? That’s what I’m suggesting. If you have evidence to the contrary please present it. Unlike everyone else here, I’m actually open to changing my mind.
You’re upset with people calling you a fanboy, yet you keep implying FSD is so perfect that there’s never been the mildest of accidents. That’s my point. It’s a ludicrous thing to suggest.

Given your posts and rampant Tesla fanboyism, I honestly wouldn’t be surprised if you’re Elon himself just anxiously trying to save face.

Then again, Elon would just publicly spout misinformation about it all, so it probably isn’t. Still, it’s surprising that people are so obsessed with Tesla they can’t take the bad with the good.

All I’m asking for is some evidence of the bad. Nobody can provide it. It really shouldn’t be that hard.
You got evidence you just dismissed it out of hand because you’re a worthless stupid muskrat.