The U.S. government’s road safety agency is again investigating Tesla’s “Full Self-Driving” system, this time after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.
The National Highway Traffic Safety Administration says in documents that it opened the probe on Thursday after the company reported four crashes in which Teslas entered areas of low visibility, including sun glare, fog and airborne dust.
In addition to the pedestrian’s death, another crash involved an injury, the agency said.
Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”
How is it legal to label this “full self driving”?
“I freely admit that the refreshing sparkling water I sell is poisonous and should not be consumed.”
“But to be clear, although I most certainly know for a fact that the refreshing sparkling water I sell is exceedingly poisonous and should in absolutely no way be consumed by any living (and most dead*) beings, I will nevertheless very heartily encourage you to buy it. What you do with it after is entirely up to you.”
*Exceptions may apply. You might be one.
It drives your full self; it doesn’t break you into components and ship those separately.
That’s pretty clearly just a disclaimer meant to shield them from legal repercussions. They know people aren’t going to do that.
Last time I checked, that disclaimer was there because officially Teslas are SAE level 2, which lets them evade the regulations that apply to higher SAE levels, while in practice Tesla FSD beta is SAE level 4.
In theory this is pure bull, and in practice it is level 4 bull.
That’s what I read in an article, but I don’t think whether they’re level 4 or not really matters. The point is they officially claim to be level 2, but their cars clearly function beyond level 2.
You might want to reread what level 4 means.
The difference between the levels isn’t just about what the car can do; it’s about the range of circumstances it can handle and whether a human driver is still required as the fallback. Level 2 requires constant driver supervision, while level 4 means the vehicle handles the entire driving task on its own within a defined operating domain.
If customers can’t assume that boneless wings don’t have bones in them, then they shouldn’t assume that Full Self Driving can self-drive the car.
The courts made it clear that words don’t matter, and that the company can’t be liable for you assuming that words have meaning.
Right? It’s crazy that this is legal.
Now go after Oscar Mayer and Burger King. I’m not getting any ham in my burger or any dog in my hot dog. Buyers know full well before they complete the sale that the product does not, and is not lawfully allowed to, pilot itself around the country. The owner’s manuals give them a full breakdown as well, I’m sure. If you spend thousands of dollars on something and don’t know the basic rules and guidelines, you have much bigger issues. If anything, to register these vehicles to drive on the road, owners should have to be made aware.
If someone is dumb or ignorant enough to jump through all the hoops and still not know, let’s be honest: they shouldn’t be driving a car either.
I sometimes find a small seed in seedless watermelons.
It’s the same issue, although the seeds are unlikely to harm you.
It is a legal label; if it were safe, it would be “safe full self driving”.
Legal or not, it’s absolutely bonkers. Safety should be the legal assumption for marketing terms like this, not an optional extra.
Literally, it’s “partial self driving” or “drive assist”.