As much as I like Mark, he's got some explaining to do.
At 15:42 the center console is shown, and Autopilot is disengaged before impact. It was also engaged at 39mph during the YouTube cut, and he struck the wall at 42mph (i.e. the car accelerated into the wall).
Mark then posted the 'raw footage' on Twitter. This also shows Autopilot disengaging before impact, but shows it was engaged at 42mph. This was a separate take.
/edit;
YouTube, the first frames showing Autopilot being enabled: 39mph
Twitter, the first frames showing Autopilot being enabled: 42mph
No. That’s by design. “Autopilot” is built to disengage when a likely collision is about to occur, to reduce the chance of anyone finding Tesla liable for the system being unsafe.
Not saying you’re wrong (I’ve always found it suspicious how Tesla always seems to report that Autopilot was disengaged in fatal accidents), but some people are probably asking themselves, “How could it detect the wall in order to disengage itself?”
The image on the wall has a perspective baked into it, so it only looks right from one particular position: the distance at which the lines of the real road line up perfectly with the lines of the road painted on the wall. As you get closer than that distance, the illusion starts to break down. The object-tracking software effectively says, “Things are moving in ways I can’t predict. Something is wrong here. I give up. Hand control to the driver.”
Autopilot disengaged.
(And it only noticed a fraction of a second before hitting it, yet Mark is very much aware of it. He’s screaming.)
Sidenote: the same is true as you move further from the wall than the ideal distance. The illusion will break down in that way too. However, the effect is far more subtle when you’re too far away. After all, the wall is just a tiny bit of your view when you’re a long way away, but it’s your whole view when you’re just about to hit it.
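To put some rough numbers on why the illusion falls apart up close, here’s a toy pinhole-projection sketch. All the distances and the focal length are made-up assumptions for illustration, nothing measured from the video. It compares where a painted lane-line point lands in the camera image with where the real lane line would be as the car closes on the wall; the two agree only at the design distance and drift apart quickly after that.

```python
# Toy pinhole-projection model (all numbers are assumptions, nothing is measured
# from the video). The painted road only lines up with the real road when viewed
# from the design distance D_DESIGN; this prints how far a painted lane-line
# point drifts (in image pixels) from where the real lane line would appear
# as the car gets closer than that distance.

FOCAL_PX = 1000.0   # assumed camera focal length, in pixels
D_DESIGN = 40.0     # assumed distance (m) the wall art was painted to look correct from
LANE_HALF = 1.8     # assumed lateral offset (m) of the lane line from the camera

def painted_vs_real(d_cam_to_wall, z_road_point):
    """Image x-coordinate of a road point z_road_point metres ahead of the design
    viewpoint: once as painted on the wall, once as the real road would project
    from the car's current position d_cam_to_wall metres from the wall."""
    # Where the artist painted that road point on the wall (projected from D_DESIGN).
    x_on_wall = LANE_HALF * D_DESIGN / z_road_point
    # Where that painted point now lands in the image.
    x_painted = FOCAL_PX * x_on_wall / d_cam_to_wall
    # Where a real lane line at the same spot would appear from the current position.
    dist_to_point = z_road_point - (D_DESIGN - d_cam_to_wall)
    x_real = FOCAL_PX * LANE_HALF / dist_to_point
    return x_painted, x_real

for d in (40.0, 30.0, 20.0, 10.0, 5.0):
    painted, real = painted_vs_real(d, z_road_point=60.0)
    print(f"{d:5.1f} m from wall: painted line at {painted:7.1f}px, "
          f"real line would be at {real:7.1f}px (error {painted - real:+7.1f}px)")
```

With these made-up numbers the painted line is dead-on at 40 m and well over a hundred pixels off by 5 m, which is exactly the “everything suddenly stops making sense” moment described above.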
If you watch the footage, it’s clearly the same take.
From the Twitter footage:
This is from the first couple of frames showing that Autopilot is enabled, just as the blue lines appeared on screen: 42mph displayed on the center console.
And another a couple of seconds later:
And from the YouTube footage:
Again, from the first couple of frames as Autopilot is enabled, just as the blue lines appear: 39mph displayed on the center console.
Here’s more from YouTube, taken several seconds apart:
They are very, very similar, but they do appear to be two different takes.
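If anyone wants to check the same-take question themselves, a quick and dirty way is to pull roughly corresponding frames from both uploads and difference them. A minimal sketch, assuming you’ve downloaded both clips; the file names and frame numbers below are placeholders, and you’d first line the clips up by hand on a common event (like the moment the blue lane lines appear):

```python
# Quick-and-dirty take comparison: pull roughly corresponding frames from the
# two uploads and measure the pixel difference. File names and frame numbers
# are placeholders; align the clips by hand first.
import cv2
import numpy as np

def grab_frame(path, frame_index):
    """Return one frame from a video file, as grayscale."""
    cap = cv2.VideoCapture(path)
    cap.set(cv2.CAP_PROP_POS_FRAMES, frame_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError(f"could not read frame {frame_index} from {path}")
    return cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

yt = grab_frame("youtube_clip.mp4", 100)   # placeholder path and frame offset
tw = grab_frame("twitter_clip.mp4", 100)   # placeholder path and frame offset

# Resize to a common size in case the two uploads differ in resolution.
tw = cv2.resize(tw, (yt.shape[1], yt.shape[0]))

# Mean absolute pixel difference: near zero (bar compression noise) for the same
# take, clearly larger if hands, steering, or background differ between takes.
diff = cv2.absdiff(yt, tw)
print("mean absolute difference:", float(np.mean(diff)))
```

A near-zero mean difference points to the same take; a clearly larger one, with visible structure in the diff image, points to different takes.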
He stated in the video that he tapped the brakes. Doesn’t that disengage Autopilot?
It would, but he explicitly says ‘without even a slight tap on the brakes’ in the YouTube video.
Then:
Here is the raw footage of my Tesla going through the wall. Not sure why it disengages 17 frames before hitting the wall but my feet weren’t touching the brake or gas.
- Mark Rober
Twitter.
He did state that he hit the brakes. Just on the fog one, not the wall. 🤷
But the fact that FSD disengages 17 frames before the crash also implies the software, along with the car, crashed 😂 I’d love to see the logs. I imagine the software got real confused real quick.
This is likely what happened. The software hit an invalid state that defaulted to disengaging the Autopilot feature. It likely hit an impossible state once the camera could no longer piece together the painted “road” on the wall with the actual road as it got too close.
Likely an error condition occurred, the same kind that would trigger if an object suddenly covered the camera while driving. It would be even more concerning if Autopilot DIDN’T disengage at that point.
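For scale on the 17 frames: assuming the footage is around 30fps (a guess, I haven’t checked), that’s roughly 0.57 seconds, or about 10 metres of travel at 42mph (~18.8 m/s). And here’s a hand-wavy sketch of the kind of fail-safe logic being described, purely illustrative and not Tesla’s actual code: stay engaged only while the perception output is self-consistent, and hand control back the moment it isn’t.

```python
# Purely illustrative fail-safe logic, not Tesla's actual code: stay engaged only
# while the perception output is self-consistent, and hand control back to the
# driver the moment it isn't. All fields and thresholds are hypothetical.
from dataclasses import dataclass
from enum import Enum, auto

class Mode(Enum):
    ENGAGED = auto()
    DISENGAGED = auto()

@dataclass
class Perception:
    lane_fit_error: float   # how badly the detected lane lines fit a plausible road model
    tracking_ok: bool       # whether tracked objects are moving in physically sensible ways

MAX_LANE_FIT_ERROR = 0.5    # hypothetical threshold

def step(mode: Mode, p: Perception) -> Mode:
    """One control tick: on anything the software can't reconcile, fail safe."""
    if mode is Mode.DISENGAGED:
        return mode
    if not p.tracking_ok or p.lane_fit_error > MAX_LANE_FIT_ERROR:
        print("perception inconsistent -> disengaging, driver has control")
        return Mode.DISENGAGED
    return Mode.ENGAGED

# Toy run: the painted wall reads as road from far away, then stops making sense up close.
frames = [
    Perception(lane_fit_error=0.1, tracking_ok=True),   # far away: painting matches the road
    Perception(lane_fit_error=0.2, tracking_ok=True),
    Perception(lane_fit_error=0.7, tracking_ok=True),   # closer: painted lines no longer fit
    Perception(lane_fit_error=0.9, tracking_ok=False),  # very close: "objects" move impossibly
]

mode = Mode.ENGAGED
for p in frames:
    mode = step(mode, p)
print("final mode:", mode.name)
```

The toy run just shows that this kind of defensive design disengages exactly when the painted wall stops looking like a road, which is consistent with what the comments above describe.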
https://www.motortrend.com/news/nhtsa-tesla-autopilot-investigation-shutoff-crash/