Last year, two Waymo robotaxis in Phoenix “made contact” with the same pickup truck that was in the midst of being towed, which prompted the Alphabet subsidiary to issue a recall on its vehicles’ software. A “recall” in this case meant rolling out a software update after investigating the issue and determining its root cause.
In a blog post, Waymo revealed that on December 11, 2023, one of its robotaxis collided with a backwards-facing pickup truck being towed ahead of it. The company says the truck was being towed improperly and was angled across a center turn lane and a traffic lane. Apparently, the tow truck didn’t pull over after the incident, and another Waymo vehicle came into contact with the pickup truck a few minutes later. Waymo didn’t elaborate on what it meant by saying that its robotaxis “made contact” with the pickup truck, but it did say that the incidents resulted in no injuries and only minor vehicle damage. The self-driving vehicles involved in the collisions weren’t carrying any passengers.
After an investigation, Waymo found that its software had incorrectly predicted the future movements of the pickup truck due to “persistent orientation mismatch” between the towed vehicle and the one towing it. The company developed and validated a fix for its software to prevent similar incidents in the future and started deploying the update to its fleet on December 20.
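To see roughly what an “orientation mismatch” can do to a motion predictor, here is a toy sketch (purely illustrative; Waymo has not published its prediction code, and every name below is made up). A naive predictor that extrapolates an object along the direction its body is pointing will place a backwards-facing towed truck on the wrong side of where it is actually headed:

```python
import math

def predict_future_position(x, y, heading_rad, speed_mps, horizon_s):
    """Naive constant-velocity predictor: assume the object travels in the
    direction its body is pointing. Illustrative only, not Waymo's code."""
    return (
        x + speed_mps * horizon_s * math.cos(heading_rad),
        y + speed_mps * horizon_s * math.sin(heading_rad),
    )

# A pickup towed backwards: its body points 180 degrees away from its
# actual direction of travel, so the naive predictor places it far from
# where it will really be.
truck_heading = math.pi          # body points "backwards" (west)
actual_travel_heading = 0.0      # but the tow truck is dragging it east
predicted = predict_future_position(0, 0, truck_heading, 10, 3)
actual = predict_future_position(0, 0, actual_travel_heading, 10, 3)
print(predicted)  # roughly (-30.0, 0.0): predicted 30 m to the west
print(actual)     # (30.0, 0.0): really 30 m to the east
```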
I love the corpospeak. Why say “crashed into” when you can use “made contact”, which sounds futuristic and implies that your product belongs to an alien civilization?
By “made contact”, it means that they “smashed”.
it means that they “smashed”.
So are we gonna have some baby robotaxi trucks driving around in a few months’ time?
Now that’s how you get a true generative ai.
You smash, you make “babies”, babies are slightly different and maybe better (probably worse).
Make contact with that like button!
Hmm, so it’s only designed to handle expected scenarios?
That’s not how driving works… at all. 😐
Face it, that’s actually better than many drivers can do
The description of an unexpected (supposedly impossible) orientation for an on-road obstacle works as an excuse, right up to the point where you realize that the software should, explicitly, not run into anything at all. That’s got to be, like, the first law of (robotic) vehicle piloting.
It was just lucky that it happened twice; otherwise, Alphabet likely would have shrugged it off as some unimportant, random event.
Billionaires get to alpha test their software on public roads and everyone is at risk.
It’s great though - that’s how you get amazing services and technological advancement.
I wish we had that. In Europe you’re just stuck paying 50 euros for a taxi in major cities (who block the roads, etc. to maintain their monopolies).
Meanwhile in the USA you guys have VR headsets, bioluminescent houseplants and self-driving cars (not to mention the $100k+ salaries!), it’s incredible.
Lol I appreciate your enthusiasm for the USA but grass is always greener.
Bruh in the US of A the grass is greener because it’s made of polypropylene and spray painted green. Just don’t smell it, or look too hard.
Most of us are in poverty. I don’t know when it happened, but we’re in another gilded age, and just like the last one, underneath the gold is rusty iron.
Bioluminescent house plants are cool but as an American I can tell you right now that my luxury bones hurt.
I can tell you right now that my luxury bones hurt.
That’s the same in Europe though; dentistry isn’t covered by public insurance in the UK, Spain, Sweden, etc.
But we have even less net salary to cover it when there are problems.
True, but your savings on non-luxury bones helps with the fees associated with luxury ones, I’m sure. I can’t do anything for my bones with a $30 glowing petunia.
I appreciate/understand your envy. I’m not sure why everyone disagrees so much unless they have also lived under similar constraints.
Unless sarcasm.
I also agree it might be perception, or grass-is-greener thinking, like the other comment said 😉
We have something like that here too: MOIA in Hamburg.
I didn’t read it as them saying “therefore this isn’t a problem”; it was an explanation for why it happened. Think about human explanations for accidents: “they pulled out in front of me,” “they stopped abruptly.” Those don’t make it ok that an accident happened either.
It should of course not run into anything, but it does need to be able to identify obstacles at the very least for crash priority when crazy shit inevitably happens. For instance, maybe it hits a nice squishy Pomeranian that won’t cause any damage to itself instead of swerving to avoid it and possibly totalling itself by hitting a fire hydrant.
Or maybe it hits the fire hydrant instead of a toddler.
At any rate, being able to identify an obstacle and react to unexpected orientations of those obstacles is something I think a human driver does pretty well most of the time. Autonomous cars are irresponsible and frankly I can’t believe they’re legal to operate.
I can’t believe they’re legal to operate.
That’s the neat part. They aren’t always legal. It doesn’t stop them.
https://www.wsj.com/articles/california-dmv-calls-ubers-autonomous-autos-illegal-1481761533
It would have been a different article if two waymos decided to take a wrong turn off a cliff.
Why is an update called a recall?
Because Tesla was fixing significant safety issues without reporting them to the NHTSA in a way that let the agency track the problems and their source. The two of them got into a pissing match, and the result is that now all OTAs are recalls. After this, the media realized that “recall” generates more views than “OTA”, and here we are.
I think it’s slightly more nuanced: not all OTAs are recalls, and not all recalls are OTAs for Tesla. Depending on the issue, the solution may be pushed via an OTA, in which case they “issue a recall” with a software update. They’re actually going through this right now. For some other issues, though, it’s a hardware problem that an OTA won’t fix, so they issue a recall to repair the problem (e.g., when the wiring harness for their cameras was fraying the cables).
This is 100% from the NHTSA shenanigans, though.
What typically happens when a recall is issued for other vehicles? Don’t they either remove and replace the bad part or add extra parts to fix the issue?
How is removing bad code and replacing it with good code or just adding extra code to fix the issue any different?
Do you want to physically go somewhere?
Kinda, as the word implies. If it’s a software update, call it that; the car’s not going back to the shop/manufacturer.
It sounds like location is important for some reason.
Here’s an example of why I don’t like that they’re called recalls when it’s just a system update: if you have a recall on a food item, is there some way to fix it aside from taking it back (to be replaced) or throwing it away?
When there’s a security patch released for your phone, do we call it a recall on the phone? Or is that reserved for when there’s a major hardware defect (like the Samsung Note fiasco)?
I think the difference in the case you mentioned is that with a car they use recall because it could be dangerous to keep using it as is.
Fair, it just seems like there should maybe be a new word for this era where an OTA update is all that’s needed.
The fleet of cars is summoned back to HQ to have the update installed, so it causes a temporary service shutdown until cars are able to start leaving the garage with the new software. They can’t do major updates over the air due to the file size; pushing out a multi-gigabyte update to a few hundred cars at once isn’t great on the cellular network.
Actually, there have been several Tesla “recalls” that were simply OTA updates.
After an investigation, Waymo found that its software had incorrectly predicted the future movements of the pickup truck due to “persistent orientation mismatch” between the towed vehicle and the one towing it.
Having worked at Waymo for a year troubleshooting daily builds of the software, this sounds to me like they may be trying to test riskier, “human” behaviors. Normally, the cars won’t accelerate at all if the lidar detects an object in front of it, no matter what it thinks the object is or what direction it’s moving in. So the fact that this failsafe was overridden somehow makes me think they’re trying to add more “What would a human driver do in this situation?” options to the car’s decision-making process. I’m guessing somebody added something along the lines of “assume the object will have started moving by the time you’re closer to that position” and forgot to set a backup safety mechanism for the event that the object doesn’t start moving.
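To make that guess concrete, here’s a toy sketch of the kind of logic gap I’m describing (entirely hypothetical, not actual Waymo code; the class and function names are invented). The risky version trusts the “it’ll probably have moved by then” assumption unconditionally; the safer version re-checks and falls back to braking once the object is close and still hasn’t moved:

```python
class TrackedObject:
    """Minimal stand-in for a perception track (illustrative only)."""
    def __init__(self, kind, is_moving, distance_m):
        self.kind = kind
        self.is_moving = is_moving
        self.distance_m = distance_m


def should_accelerate(obj):
    """Hypothetical risky rule: assume a stopped vehicle ahead will have
    moved on by the time we reach it, with no backup check."""
    if obj is None:
        return True
    if obj.kind == "vehicle" and not obj.is_moving:
        return True  # "it'll probably move" overrides the old failsafe
    return False


def should_accelerate_safely(obj):
    """Same assumption, but with the missing safety net: stop trusting it
    once the object is close and still hasn't started moving."""
    if obj is None:
        return True
    if obj.kind == "vehicle" and not obj.is_moving:
        return obj.distance_m > 50.0  # keep rolling only while far away
    return False


stalled_truck = TrackedObject("vehicle", is_moving=False, distance_m=12.0)
print(should_accelerate(stalled_truck))         # True  -> drives into it
print(should_accelerate_safely(stalled_truck))  # False -> brakes instead
```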
I’m pretty sure the dev team also has safety checklists that they go through before pushing out any build, to make sure that every failsafe is accounted for, so that’s a pretty major fuckup to have slipped through the cracks (if my theory is even close to accurate). But luckily, a very easily-fixed fuckup. They’re lucky this situation was just “comically stupid” instead of “harrowing tragedy”.
aaaaand fuck this truck in particular.
That pickup truck was asking for it I tell ya. He was looking at me sideways, he was.
It said RAM on the side!
Brb gonna dazzle paint my car
I still don’t understand how these are allowed. One is not allowed to let a Tesla drive without being 100% in control and ready to take the wheel at all times, but these cars are allowed to drive around autonomously?
If I am driving my car and I hit a pedestrian, they have legal recourse against me. What happens when it’s an AI, or a company, or a car?
You have legal recourse against the owner of the car, presumably the company that is profiting from the taxi service.
You see these all the time in San Francisco. I’d imagine the vast majority of the time, there are no issues. It’s just going to be big headlines whenever some accident does happen.
Nobody seems to care about the nearly 50,000 people dying every year from human-caused car accidents
Nobody seems to care about the nearly 50,000 people dying every year from human-caused car accidents
I would actually wager that’s not true, it’s just that the people we elect tend to favor the corporations and look after their interests moreso than the people who elected them, so we end up being powerless to do anything about it.
sure, but why do these accidents caused by AI drivers get on the news consistently and yet we rarely see news about human-caused accidents? it’s because news reports what is most interesting - not exactly accurate or representative of the real problems of the country
The company is at fault. I don’t think there are laws currently in place that say a vehicle has to be manned on the street, just that it uses the correct signals and responds correctly to traffic, but I may be wrong. It may also depend on local laws.
Do we have a fuck you in particular group yet?
yay
“made contact” “towed improperly”. What a pathetic excuse. Wasn’t the entire point of self driving cars the ability to deal with unpredictable situations? The ones that happen all the time every day?
Considering that driving habits differ from town to town, the current approaches do not seem viable for the long term anyway.
It’s as if they are still in testing. This is many years away from being safe, but it will happen.
It’s a rare edge case that slipped through because the circumstances that cause it are obscure. From the description it was a minor bump, and the software was updated to try to ensure it doesn’t happen again, and it probably won’t.
Testing for things like this is difficult, but looking at the numbers from these projects, testing is going incredibly well, and we’re likely to see moves towards legal acceptance soon.
In a blog post, Waymo has revealed that on December 11, 2023, one of its robotaxis collided with a backwards-facing pickup truck being towed ahead of it. The company says the truck was being towed improperly and was angled across a center turn lane and a traffic lane.
See? Waymo robotaxis don’t just take you where you need to go, they also dispense swift road justice.
I’m cool with that. Maybe they can do tailgaters next.
they also dispense swift road justice.
They should launch shurikens out the front like a James Bond vehicle.
They thought the truck was being driven by Sarah Conner.
Maybe it was a Cybertruck and the super stealth design made its signature very small.
https://sfstandard.com/2023/10/17/tech-layoffs-waymo-san-francisco-robotaxi/
I bet they were all QA.
It was in an orientation our devs didn’t account for and we don’t want liability.
“Towed improperly”