Tesla Technologies

Seems like it served its purpose. It caused confusion around funding for public transportation and then made way for cars.

https://twitter.com/MikeSington/status/1591822506327158786

1 Like

Is it wrong that I want to hear “Yakety Sax” playing over this?

1 Like

Consider including TW (trigger warning) when sharing posts like this.

Is it just me who can’t open this post on Twitter?

Not just you; if I try to click through, it says the post is deleted.

Found the video on YouTube

Link

1 Like

https://twitter.com/kenklippenstein/status/1612848872061128704

At the start of the video I was wondering how there could be an 8-car pileup with such light traffic and the cars spread out so much. But in the end, I concluded from this video that Teslas keep a safe distance, but humans drive waaaay too close to the car ahead of them.

I’m not saying Tesla self-drive tech is safe, or should be allowed on public roads, but holy hell humans are incompetent.

1 Like

Yeah. I drove a car with adaptive cruise for the first time last summer, and it would start slowing down when the car in front of me was way off in the distance. Well, maybe the car is right and I’m wrong.

I also wonder what was going on with the driver of the Tesla. If the car suddenly brakes and changes to the left lane unexpectedly, what can the driver do to take back 100% control of the car? Can you stomp on the accelerator pedal, at least so you don’t come to such an abrupt stop? I’m thinking of two aspects of this: what driver inputs will completely disable the self-driving, and what was the driver doing and thinking as this occurred? On the latter point, I wouldn’t be surprised if they were not mentally engaged and ready to take over at any moment. But even if they were, would they assume that the car’s sensors were detecting a problem they could not identify themselves, and thus hesitate to override the car’s decision to stop?

That is the biggest problem I keep coming back to: no matter how good automated driving technology gets, how do you ensure the driver is ready and able to take control at any instant? We’ve long seen how people will work around every enforcement mechanism tried so far, such as requiring hands on the wheel at all times for the automated driving to remain engaged. And even if body position is perfectly enforced, you cannot control the mental state of the person behind the wheel while the car is driving itself.

But why would a driver take over at this moment? There isn’t a problem ahead, but how would the driver know that? There isn’t an issue with the car, but how would the driver know that?

If my car suddenly took evasive action, why on earth would I ever want to stamp on the accelerator? There could be a hole in the road I can’t see!

You want the driver to take over when there’s an obstruction the car didn’t see, but you don’t want drivers taking over when there might be an obstruction that they didn’t see. This is an almost impossible situation for any decision-making process.

The answer is, as far as I can tell, cars behind actually keeping proper distances, and treating any car ahead as something that might randomly stop for any reason, be it an obstruction, or just a perceived obstruction.

Because their car suddenly slammed on the brakes for little discernible reason, with no visible obstruction ahead, maybe? An automatic system controlling your vehicle behaving erratically like that seems a pretty good reason.

And yes, it was erratic, because we know there wasn’t an obstruction: we can see a reasonable distance ahead on the camera, and a Tesla (ideally, anyway; they do apparently have some issues) does not emergency brake like that for an obstruction that far away.

By looking out the windshield at the completely empty lane ahead of him, which is empty to more than twice the distance the vehicle would need to stop. The distance from where it stopped to the end of the frame is at least 110 meters per the video, and we can see on the camera that the lane is clear. If you’re curious, per Consumer Reports testing, the 60-0 mph stopping distance for most Tesla models comes out at about 46 meters, and the speed limit on that bridge/tunnel is 50 miles an hour, so we can figure the required distance is even shorter.
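If you want to put a rough number on that last point, here’s a back-of-the-envelope sketch. The 46 m figure is the one quoted above, and the square-of-speed scaling is just the standard braking approximation, so treat the result as an estimate rather than a measurement of this particular car:

```python
# Braking distance scales roughly with the square of speed, so a 46 m stop
# from 60 mph implies a shorter stop from the ~50 mph limit on that bridge/tunnel.
d_60mph = 46.0                   # quoted 60-0 mph stopping distance, metres
v_limit_mph = 50.0               # posted limit, mph
d_limit = d_60mph * (v_limit_mph / 60.0) ** 2
print(f"Estimated 50-0 mph stopping distance: {d_limit:.0f} m")  # ~32 m
```

Call it roughly 32 meters, against at least 110 meters of visibly clear lane.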

We can also see from the second angle, shown when you click through to the Intercept article (loath as I am to do so for unrelated reasons), which gives a good shot right down the tunnel section, that the road is clear for at least an additional 110 meters, with no obstructions. I can’t say how clear it is beyond that, as that’s the point where I start losing track of the vehicles in the other lanes I’m using to estimate the distance.

In fairness, you shouldn’t arbitrarily stomp on the accelerator anyway; trying to overcome braking power with engine power is not how to handle that sort of situation.

And that aside, while Tesla added a software update that supposedly scans for potholes ahead, it doesn’t stop the vehicle or avoid them; it merely adjusts the suspension to preserve comfort. And last of all, Teslas solely use cameras for their driver assist features these days; they got rid of the radar and the ultrasonic sensors (which wouldn’t be useful at much of a distance regardless). If you couldn’t see it due to a visual obstruction, then neither could the car.

On top of that, this is a moot point, because we can see a fair distance ahead of where the Tesla stopped (let alone from further back, when it started braking), and there is no obstruction and no damage to the road. The only thing there is smooth road, in basically as perfect condition as you could ask for.

All in all, this seems to just be another case of the well-known Phantom Braking issue, which has plagued Teslas for a few years at this point.

As more of a side note, too: I feel this is an odd level of confidence to have in the obstacle avoidance of a driver assist system that has a relatively well-known habit of plowing into stationary objects. In fact, just a month after this crash, the same driver assist system that is meant to be so perceptive at spotting these hazards drove directly into the back of a parked police car (with its flashers on) in Ohio without braking whatsoever. While I do have to ask what the actual fuck the driver was doing in that instance, it’s also far, far from the first time a Tesla has just rammed a stationary object.

3 Likes

Sure. You are taking my general point, that a car might brake for a reason the driver has not spotted, and explaining why, in one very specific case where we have a video, that isn’t what happened.

It serves as a better specific example than telling you the basics, like “Teslas are already well known for Phantom Braking incidents, with many complaints to the NHTSA about it, and hundreds more in virtually every online space for Tesla owners, so why are we making up scenarios where the car known for doing insane things is arbitrarily correct”, or “This driver assist system has a long and storied record of making absurd, pointless maneuvers, causing accidents, and slamming into stationary objects; overriding it is not only the smart thing to do, but virtually mandatory for using it safely.”

1 Like

Right. I get all that. I’m not trying to defend Tesla self-drive abilities here. The opposite, in fact.

So forget about self-driving cars for a moment. My point is that in MANY cases, when I’m in a car with two people up front, one of us will spot something that the other misses.

If a passenger spots something dangerous, should they reach over and yank the parking brake? Should the driver immediately try to floor it, or should they instead stop completely and check what the problem is? Or take a learner car with two sets of brake pedals: if the instructor stamps on the brakes, should the learner driver try to fight against that because they think the road is clear?

Once there are two inputs, and one input wants the car to slow, stop, or pull over to the side of the road, I think the safest outcome should always be to take the input that is trying to slow or stop. In almost EVERY driving situation I can think of, slowing or stopping should be the default, rather than pushing through and hoping/assuming it’s a false positive.

The only reason not to slow or stop when there is any doubt is that you are worried about cars behind following too closely at too high a speed. Which, as I said in my original post, seems much more like a human driver issue than a Tesla-in-front issue.
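To put rough numbers on “too close for their speed” (a sketch with assumed, illustrative figures; the 2-second rule and the 1.5-second reaction time are generic rules of thumb, not measurements from this crash):

```python
# If the car ahead brakes hard and you brake about as hard after a reaction delay,
# the gap you need is roughly the distance you travel during that delay,
# because the two braking distances approximately cancel out.
speed_mph = 50                                   # assumed speed, roughly the posted limit
speed_ms = speed_mph * 0.447                     # mph to m/s, about 22 m/s
reaction_time_s = 1.5                            # assumed typical driver reaction time
two_second_gap = 2 * speed_ms                    # "2-second rule" gap, ~45 m
reaction_distance = reaction_time_s * speed_ms   # ground covered before braking starts, ~34 m

print(f"2-second gap at {speed_mph} mph: {two_second_gap:.0f} m")
print(f"Distance covered during a {reaction_time_s} s reaction: {reaction_distance:.0f} m")
```

A proper 2-second gap leaves margin for the reaction delay; a tailgating 1-second gap (about 22 m) does not, no matter why the car ahead stopped.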

And again, I’m not defending Tesla self-drive here. I think it’s insane they are running “beta” systems on public roads.

I suppose the difference is that those are all separate, complete people who are in the car; it’s a fundamentally different proposition. As opposed to the car itself, using a system (and I’m not even just talking about Tesla; every single driver assist like this asks the same of you) that explicitly suggests you should be paying attention and ready to take control, because these systems don’t have the perceptive or predictive ability of your average person.

A passenger is not getting in the car with the explicit instruction that they must be ready to take over control if they think the driver is doing something odd, nor would they be resuming control, as they were not in control to begin with. And in the case of a driving instructor, where they DO have those inputs, they are there on the presumption that they are a far better driver than you are, which is not the case for driver assist systems at this time.

I mean, obviously I’m also going to agree that the student shouldn’t be stamping on the accelerator to override them, because as I said, that’s the wrong move regardless of the reason. Your brakes locking up outside your control isn’t solved by adding engine power, even in an entirely mechanical vehicle with no other passengers.

But back to what we’re talking about: it’s a fundamental difference in expectation and capability between, say, a passenger and a sub-system of the vehicle. Of course a passenger shouldn’t jump in and interrupt, because they’re generally not expected to; whereas with every driver assist system like this, you are explicitly asked to pay attention and intervene if it starts doing the wrong thing. (Where possible, anyway; one of the problems with Tesla phantom braking is that it sometimes doesn’t let you cancel the braking until the vehicle is already at a complete stop.)

1 Like

Right. I think this is where we are talking past each other. I don’t care whether any driver does the “wrong” thing, be it a human driver or a computer system. The only thing I’m concerned about is whether it does an unsafe thing.

Clearly, any driver should be able to put the brakes on and pull over to the side of the road, or into the slow lane if there is no hard shoulder. It’s what drivers do if their vehicle has a blowout (which once happened to me in a 3.5-ton truck at 70 mph on a three-lane motorway), or if there’s another similar emergency. Even if the driver only THINKS there might be an emergency, this has to be an option. It might be the wrong thing, but until everyone involved in the driving (human or AI) is sure, that’s not the “wrong” course of action but the right one.

I don’t think fighting against this behaviour is a bad thing, if done by a human or AI. It should be up to other drivers, especially those following another vehicle, to expect it at any time and keep the right distance.

Again, this is not to defend Tesla’s self-driving efforts in every case. But as per my original post, the Tesla (in this case) may have been “wrong” but did what any human driver could/should have done in many emergency circumstances (maybe they were having an epileptic episode). The car behind did a good job of almost stopping, but cars 3 to 9 were all obviously waaay too close for their speed.

That’s a fair shout; I made a judgement call there. To explain myself, I used “wrong” because I was also thinking about something that may be illegal but not necessarily unsafe in the situation, i.e. changing lanes where you’re not allowed, going too fast in a school zone, what have you. After all, you are still legally liable when your car breaks the law, even if you’re not in direct control; it would be a pisser to get stuck with an expensive traffic fine and any issues it might cause with your license because your car decided to be an asshole.

I admit that was unclear, and I was probably thinking more than needed about things not germane to the conversation.