The real crime is marketing the driver-assist capability under the name Autopilot when it is anything but that.
Oh no, it’s even worse than that.
It’s the CEO and other staff repeatedly speaking of the system as if it’s basically fully capable, and that it’s only for legal reasons that a driver is even required. Even claiming that the car could drive from one side of the US to the other without driver interaction (only to never actually do that, of course).
It’s the company never correcting people when they call it a self-driving system.
It’s the company saying they’re ready for autonomous taxis, and that owners’ cars will make money for them while they aren’t driving them.
It’s calling their software subscription “Full Self-Driving.”
It’s honestly staggering to me that they’re able to get away with this shit.
I love my Model 3, but everything you said is spot on. Autopilot is a great driver assist, but it is nowhere near autonomous driving. I was using it on the highway and was passing a truck on the left. The road veered left and the truck did as well, keeping in its lane the entire time. The car interpreted this as the truck merging over into my lane and slammed on the brakes. Fortunately, I was able to figure out what went wrong and quickly accelerated myself so as not to become a hazard to the cars behind me.
Using Autopilot as anything more than a nice dynamic cruise control setting is putting your life, and other lives, in danger.
Holy shit. If my car did that even once, I’d be a nervous wreck just thinking about using it again.
I have had the adaptive cruise control brake on multiple Hondas and Subarus in similar situations. Not like slamming on the brakes, but firm enough to confuse the hell out of me.
It was confusing every time, and now I just don’t use it if the road is anything but open and clear.
Honda’s Sensing system will read shadows from bridges as obstructions in the road that it needs to brake for. It’s easy enough to accelerate out of the slowdown, but I was surprised to find that there is apparently no radar cross-check to see if the obstruction is real.
My current vehicle doesn’t have that issue, so either the programming has been improved or the vendor for the sensing system is a different one (it’s a different vehicle make, so that’s entirely possible).
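Conceptually, the cross-check I was expecting would just be a gate like this. (A made-up sketch, nothing to do with Honda’s actual code; every name and threshold here is invented.) A camera-only detection with no radar return, like a shadow, should never be allowed to trigger braking:

```python
# Hypothetical camera/radar cross-check for automatic braking. All names and
# thresholds are made up for illustration; a real AEB stack is far more
# involved (tracking, time-to-collision, timestamp alignment, and so on).

from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    distance_m: float   # estimated distance to the obstacle
    confidence: float   # detector confidence, 0..1

def should_brake(camera: Optional[Detection], radar: Optional[Detection]) -> bool:
    """Brake only when the camera sees an obstacle AND the radar corroborates it."""
    if camera is None or camera.confidence < 0.6:
        return False
    if radar is None:
        # camera-only "obstacle": a bridge shadow has no radar return
        return False
    # require the two sensors to roughly agree on where the obstacle is
    return abs(camera.distance_m - radar.distance_m) < 5.0
```

To be fair, requiring radar agreement means you also won’t brake for things radar is bad at seeing, so I get why the tradeoff isn’t trivial.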
i barely trust the lane-keeping assistant in my friend’s car. imagine going 70+ km/h and suddenly the car decides to jerk the steering to the left or right because you weren’t exactly in the middle of your lane.
fuck modern assistants IMO. i can use the steering wheel just fine, and people have been able to for a hundred years.
i can use the steering wheel just fine, and people have been able to for a hundred years.
People have been bad at it for a hundred years. I’m not saying that people should necessarily be using auto-steering that keeps them in the middle of their lanes, but they should at least be using systems that beep at them when they stray out of their lane.
The bar for self-driving technology isn’t some amazing perfect computer that never makes a mistake. It’s the average driver. The average driver is bad.
we can do two things (these are not mutually exclusive):
-take further control away from the drivers and make them dependent on a computer, which can always misunderstand a situation, while still holding the driver responsible for it.
-educate drivers properly, at least in the US. americans have historically been bad at driving and have also been known to be undereducated.
I’m all for more driver education, and for stricter licensing requirements like they have in Europe. Having said that, eventually computers are going to have to take over.
It’s pretty absurd that we’re handing control over multi-ton devices traveling at tens of meters per second to fallible, bored, easily distracted humans. The safer cars get, the safer drivers feel. The safer drivers feel, the less they feel they need to concentrate on driving.
Safe driving will just never be a skill that humans are good at. The tasks humans are good at that require concentration are tasks that are challenging and remain challenging; think of playing a sport where there’s always action and you have to react. Humans are bad at tasks that are mostly routine and boring but where a lapse in concentration can cause a catastrophe. Those are the kinds of tasks where people get bored, so they start glancing away, reading a book, looking at a smartphone, or whatever. For driving to be engaging, it has to be non-boring, which means non-safe. The safer it gets, the more boring it gets, so people stop paying the required attention. There’s just no winning.
That’s the bar that automated driving is held to. It messes up once, you never trust it again, and the news spreads the failure far and wide.
Your uncle doing the same thing just triggers you to yell at him: the guy behind flips him off, he apologizes, you’re nervous for a while, and you continue your road trip. Even if he killed someone, we would blame that one uncle, or at worst some might blame his entire driving class. But we would not say that no human should drive again until the problem is fixed, the way we do with automated cars.
I do get the difference between those, and I do think that they should try to make automated drivers better, but we can at least agree on the premise: automated cars are held to a seriously unreasonable bar. Maybe that’s fair, and we will never accept anything but perfect, but then we may never have automated cars. And as someone who shares the road with human drivers every day, that makes me very sad.
There is a big difference between Autopilot and that hypothetical uncle. If the uncle causes an accident or breaks shit, he or his insurance pays. Autopilot doesn’t.
By your analogy, it’s like putting a ton of learner drivers on the road with unqualified instructors, not telling the instructors that they are supposed to be instructors but instead that they are riding in a taxi, and then somehow making it their responsibility anyway. And of course pocketing both the instruction and taxi fees.
The bar is not incredibly high for self driving cars to be accepted. The only thing is that they should take the blame if they mess up, like all other drivers.
Yeah, for sure. Like I said, I get the difference. But ultimately we are talking about injury prevention. If automated cars caused even one fewer death per mile than human drivers, we would still think they are terrible, even though they were saving lives.
And even if they only caused one death per year we’d hear about it and we might still think they are terrible.
The difference is that Tesla calls it Autopilot when it’s really not. It’s also clearly not ready for primetime. And auto regulators have pretty strict requirements about reliability and safety.
While it’s true that autonomous cars kill FAR fewer people than human drivers, every human is different. If an autonomous driver is subpar and that AI is rolled out to millions of cars, we’ve vastly lowered the safety of the whole fleet. We need autonomous cars to be better than the best driver because, frankly, humans are shit drivers.
I’m 100% for autonomous cars taking over entirely. But Tesla isn’t really trying to get there. They are trying to sell cars and lying about their capabilities. And because of that, Tesla should be liable for the deaths. We already have them partially liable: this case caused a recall of this feature.
Your car’s actions could kill someone.
The auto cruise on the Priuses at work does this a lot. If the freeway curves to the left or something, it will panic and think I’m about to hit the cars in the lane next to me that are also going through the curve.
The road veered left and the truck did as well, keeping in its lane the entire time. The car interpreted this as the truck merging over into my lane and slammed on the brakes.
Even dynamic cruise control must never make such dangerous mistakes!
You should demand that they fix this under warranty, and they should prove that it is never going to happen again.
Almost all of them do it; the one freshest in my mind is the Prius because my work uses them as base cars, so I drive them a lot. If the highway curves kind of hard to either the left or the right, sometimes it will panic and think you’re about to hit the car in the lane next to you, because they’re technically in front of you, and so it will try to brake.
Thankfully there is an option to turn off the automatic braking; it will just start screaming at you instead.
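My guess at what’s going on, sketched in made-up Python (none of this is the actual Toyota logic): the radar reports where targets are relative to your heading, and if the system assumes you’ll keep going straight, a car in the adjacent lane on the outside of a curve sits almost directly ahead of you.

```python
# Made-up illustration of why adaptive cruise can "panic" on curves: if
# target selection assumes a straight path, a car in the next lane on the
# outside of a curve can land almost dead ahead in the ego car's frame.

LANE_HALF_WIDTH = 1.75  # meters; "in my lane" if |lateral offset| is below this

def ego_frame_y(arc_length_m: float, road_offset_m: float, curvature: float) -> float:
    """Approximate lateral position, in the ego car's frame, of a car that is
    arc_length_m ahead along the road and road_offset_m left of our lane.
    curvature is 1 / turn radius (left positive); a circular-arc road
    centerline bends laterally by roughly x^2 * curvature / 2."""
    return road_offset_m + 0.5 * curvature * arc_length_m ** 2

# Highway curving right (300 m radius); a car 40 m ahead in the lane
# 3.5 m to our left, staying in its lane the whole time:
k = -1 / 300
y = ego_frame_y(40.0, 3.5, k)                      # ~ +0.83 m in our frame

# Naive selection assumes we will keep driving straight:
print("naive in-path:", abs(y) < LANE_HALF_WIDTH)               # True -> brake!

# Curvature-aware selection bends the predicted path with the road:
path_y = 0.5 * k * 40.0 ** 2                       # ~ -2.67 m
print("curve-aware in-path:", abs(y - path_y) < LANE_HALF_WIDTH)  # False
```

Presumably the systems that don’t do this estimate the path curvature from yaw rate and speed and bend the in-lane check along it.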
Tesla should be paying wrongful-death suits every time Autopilot kills someone. Their excuses don’t excuse the blatant marketing that leads people to believe it’s a self-driving car.
But you see, that wasn’t the vehicle’s fault. It’s been programmed perfectly. What happened was the fault of the pedestrians and the driver for not properly predicting what the car would do.
maybe /s maybe not.
no you see, the issue is that the Autopilot disengaged right before the accident, so obviously it was entirely the driver’s fault. please don’t check how much time passed between it disengaging and the accident
Do we need to go through what autopilot in a plane or boat actually does again?
It doesn’t matter; Tesla’s cars are marketed to the general public, which isn’t expected to know these things. To probably 90% of people “autopilot” means “drive automatically”.
To probably 90% of people “autopilot” means “drive automatically”.
Based on what?
Based on my usage and understanding of the word being a lay person.
I’m an engineer myself, sometimes there are words that you have to be cognizant of the differences in meaning to other engineers vs lay people or even engineers in other fields. Some words are heavily overloaded, and “autopilot” is kinda one of them (others being “domain”, “node”, “artificial intelligence”, etc.).
Tesla markets this feature as “Full Self-Driving Capability.” Maybe I’m poorly informed, but to me that means that the car is fully capable of driving itself without human interaction.
FSD is an entirely separate thing. Autopilot is just an LKAS (lane-keeping assist) system plus adaptive cruise control.
Aha, today I learned that Autopilot is just lane-keeping and adaptive cruise control. I feel it must be a common misunderstanding to confuse the terms “Autopilot” and “Full Self-Driving” in the vernacular.
Many other manufacturers refer to lane-keeping systems as “driver assistance,” and I believe Tesla is intentionally misleading consumers with the impression that their system is more capable and allows the driver to pay less attention.
Until you drive it. You know the capabilities, you know when you can and cannot activate it, and you know how often it tells you to look at the road; if you don’t prove you’ve got your hands on the wheel, it disables itself for the rest of the drive (you need to park to reactivate it). No Tesla driver thinks Autopilot is more than a lane- and distance-keeping assist.
Autopilot is a marketing name, that’s it.
do we need to go through the differences in training, aptitude, and intelligence between pilots, ship captains, and your neighbor Greg again? Marketing it as “autopilot” to anyone who can sign a car loan is reckless. It has killed people and will continue to kill people until they stop.
Yep, just like “cruise control” made tons of people drive their car into the ocean thinking they could sail it to popular island destinations.
Depends entirely on the type of autopilot.
What does Full Self-Driving mean?
Wow, the value of a life, I guess. I don’t really know what could come close to the value of a life, but this doesn’t seem like it.
What would the value of a life be, then? I’ll save you the answer: no matter how big a number you name, someone else will name a bigger one, until it becomes priceless. Which is the answer.
However, death, accidental death included, isn’t always avoidable. And when we pin the fault on someone, we cannot say that “priceless” is what they owe the victim’s family. So we assign an amount of money or time that hurts, and call it good.
That doesn’t mean a life is worth that. And saying so doesn’t help anyone.
Sure, but even looking at only the financial contribution of one person to a family, it dwarfs the comical $23k here. And that’s not even touching the emotional side of things. $23k is straight-up insulting, IMHO.
Two people were killed, so you’re really talking $11.5k each.
Seeing as they were using the now-recalled Tesla Autopilot during the accident, it may not be entirely the driver’s fault. That may be part of the rationale behind the judgment.
Tesla should be out millions for this. The Autopilot feature is a gimmick and not at all transparent. They’re beta testing on the public, and people are dying because of it. This is a corporate decision that needs to have corporate consequences over and above legal ones. People shouldn’t just be getting minor fines; they should be going to prison and losing absolutely everything.
The U.S. uses the value of a statistical life (VSL). Here are the numbers from the Department of Transportation over the last 10 years or so.
So it is interesting, and egregious, that the driver needs to pay only $23K while Tesla pays nothing at all!
So according to this, the DoT values a life at $12.5M in 2022? I’m curious about their methodology.
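As far as I can tell from their published guidance, they take a base-year VSL and update it each year for price inflation and real income growth, with an assumed income elasticity of about 1.0. A rough sketch; the base value and ratios below are illustrative placeholders, not DOT’s published series:

```python
# DOT-style VSL update, as I understand it: scale a base-year value by price
# inflation (a CPI ratio) and real income growth, with income elasticity ~1.0.
# The inputs below are illustrative placeholders, not DOT's actual figures.

def update_vsl(base_vsl: float,
               cpi_ratio: float,           # CPI in target year / CPI in base year
               real_income_ratio: float,   # real earnings, target year / base year
               income_elasticity: float = 1.0) -> float:
    return base_vsl * cpi_ratio * real_income_ratio ** income_elasticity

# e.g. a $9.6M base VSL, ~20% cumulative inflation, ~8% real income growth:
print(f"${update_vsl(9.6e6, 1.20, 1.08):,.0f}")   # about $12.4M
```

Compounding like that is how you get from a mid-2010s base into the $12M+ range by 2022.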
True. But what if Tesla had to pay a billion for producing software that runs people over? They probably would not put beta software on the road then.
That was the penalty for the felony charge against the driver of the car that ran off the highway onto a surface street. It’s almost certain that the driver’s insurance also paid out its maximum.
In addition, Tesla is recalling all those cars to change the system that pretends to ensure a driver using autopilot is actually paying attention.
And a civil suit will likely follow from the two victims’ families.
The guy was going through a suburb at 75 mph, blowing through stop lights. Of course he has to pay; I’m surprised he’s not getting jail time. This has nothing to do with the car, that’s just gross negligence.
If you want to kill someone in the US with little consequences, run them over with a car.
Germany is the same. A small fine, three months without a license, and that’s it for killing a human being.
If we’re talking about an honest accident then how long do you think the jail term should be?
“honest accident” is the crux of the question. If the driver was doing everything perfectly and some other party was entirely responsible for the accident, not much (maybe none?).
But, at least in my corner of Canada, most drivers are not behaving responsibly or adhering to the law. Speeding, following too closely, illegally passing, and using phones while driving are common. If a driver kills someone while doing something overtly dangerous, they deserve jail time.
If it were an honest accident, then nothing. If it were due to neglect or lack of due diligence, then maybe a few months of weekend jail or a month of full-time jail.
For killing someone? Causing someone’s death due to negligence is only worth a month of jail to you?
If we’re talking about an honest accident
There is no such thing.
Anyone else tired of beta testing Tesla’s garbage just by being outside on the roads near these vehicles?
Human beings controlling cars are extremely dangerous: drunk driving, racing, running red lights and stop signs, speeding, not paying attention, etc. There’s no need for Autopilot to make the streets dangerous for pedestrians. Autopilot keeps the car in its lane, which is already way safer than a lot of 100% human-controlled driving.
And again, the driver is responsible to keep their eyes on the road, even when using cruise-control or any sort of driving assistance.
Civil suit. He’s already been proven guilty.
Your honor, I actually didn’t whack anyone with this self-actuating axe. I bought it and I told it to go chop wood. The people just happened to be too close to the axe. Yeah, I was holding the axe, but I wasn’t actually applying any pressure. The tail was wagging the dog, in other words.
OK, so a $10,000.00 fine? Oh, alright, I guess that’ll teach me not to buy autonomous axes.
Fines = legal for a price.
Fines only exist to punish the poor.
So $11,500 per person. Huh. I would have guessed that American lives would be more expensive.
uh is that it?
American taxpayers will pick up the rest of the bill. Nice subsidy for the rich.
Well, he didn’t do anything … /s
This is the best summary I could come up with:
A Tesla driver will pay more than $23,000 in restitution for the deaths of two people during a 2019 car crash in a Los Angeles suburb, a decision announced the same day that the automaker recalled nearly all vehicles sold in the U.S.
Wednesday’s court hearing wrapped up a case believed to be the first time in the U.S. that prosecutors brought felony charges against a motorist who was using a partially automated driving system.
The recall affects more than 2 million Tesla vehicles and will update software and fix a defective system that’s supposed to ensure drivers are paying attention when using Autopilot.
The Tesla driver in the Los Angeles case, Kevin Aziz Riad, pleaded no contest to two counts of vehicular manslaughter with gross negligence.
Authorities say Aziz Riad, a limousine service driver, was at the wheel of a Tesla Model S that was moving at 74 mph (119 kph) when it left a freeway and ran a red light on a local street in Gardena, California, on Dec. 29, 2019.
The Tesla, which was using Autopilot at the time, struck a Honda Civic at an intersection, and the car’s occupants, Gilberto Alcazar Lopez and Maria Guadalupe Nieves-Lopez, died at the scene.
Same as it ever was: [Kill a Pedestrian, Pay a $500 Fine](https://www.raisethehammer.org/article/1809/)
Come join the war on cars. !fuckcars@lemmy.world