A 2025 Tesla Model 3 in Full Self-Driving mode drives off a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well, and able to post about it on social media.

I just don’t see how this technology could possibly be ready to power an autonomous taxi service by the end of next week.

    • kambusha@sh.itjust.works · ↑20 ↓1 · 1 month ago

      Except for the last 0.05 seconds before the crash where the human was put in control. Therefore, the human caused the crash.

  • Buffalox@lemmy.world · ↑43 · 30 days ago

    The car made a fatal decision faster than any human could possibly correct it. Tesla’s idea that drivers can “supervise” these systems is, at this point, nothing more than a legal loophole.

    What I don’t get is how years of this false advertising haven’t bankrupted Tesla already.

    • Echo Dot@feddit.uk · ↑27 ↓1 · 29 days ago

      Because the US is an insane country where you can straight-up break the law, and as long as you’re rich enough you don’t even get a slap on the wrist. If some small startup had done the same thing, they’d have been shut down.

      What I don’t get is why Teslas aren’t banned all over the world for being so fundamentally unsafe.

      • Buffalox@lemmy.world · ↑8 · 29 days ago

        What I don’t get is why Teslas aren’t banned all over the world for being so fundamentally unsafe.

        I’ve argued this point for the past year; there are obvious safety problems with Teslas, even without considering FSD.
        Like the turn-signal controls on the steering wheel, manual door handles that are hard to find in emergencies, and common operations buried in on-screen menus instead of having directly accessible buttons, which is a distraction. With Autopilot they also tend to brake for no reason, even on the autobahn with a clear road ahead, which can also create dangerous situations.

      • ayyy@sh.itjust.works · ↑14 ↓1 · 29 days ago

        To put your number into perspective, if it only failed 1 time in every hundred miles, it would kill you multiple times a week with the average commute distance.
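Sketching that arithmetic (the 1-in-100-miles rate is the commenter’s hypothetical; the ~40-mile round-trip commute and five-day week are assumed round numbers):

```python
# Back-of-envelope: expected critical failures per week at a hypothetical
# per-mile failure rate. All inputs are illustrative assumptions, not
# measured FSD statistics.

failure_rate_per_mile = 1 / 100    # "failed 1 time in every hundred miles"
commute_miles_round_trip = 40      # assumed ~20 miles each way
commute_days_per_week = 5

expected_failures_per_week = (
    failure_rate_per_mile * commute_miles_round_trip * commute_days_per_week
)
print(f"{expected_failures_per_week:.1f} expected failures per week")
```

With these inputs the expectation works out to two failures per commuting week, i.e. the “multiple times a week” in the comment.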

        • KayLeadfoot@fedia.io (OP) · ↑11 ↓1 · 29 days ago

          Someone who doesn’t understand math downvoted you. This is the right framework for understanding autonomy: the failure rate needs to be astonishingly low for the product to have any non-negative value. So far, Tesla has not credibly demonstrated non-negative value.

          • bluewing@lemm.ee · ↑1 ↓6 · edited · 29 days ago

            You are trying to judge the self-driving feature in a vacuum, and you can’t do that. You need to compare it to the alternatives. And for automotive travel, the alternative to FSD is to continue to have everyone drive manually. Turns out, most clowns doing that are statistically worse at it than even FSD (as bad as it is). So FSD doesn’t need to be perfect; it just needs to be a bit better than what the average driver can do manually. And the last time I saw anything about that, FSD was that “bit better” than you, statistically.

            FSD isn’t perfect. No such system will ever be perfect. But, the goal isn’t perfect, it just needs to be better than you.

            • Echo Dot@feddit.uk · ↑8 · 29 days ago

              FSD isn’t perfect. No such system will ever be perfect. But, the goal isn’t perfect, it just needs to be better than you.

              Yeah, people keep bringing that up as a counter-argument, but I’m pretty certain humans don’t swerve off a perfectly straight road into a tree all that often.

              So unless you have numbers suggesting that humans are less safe than FSD, you’re being equally obtuse.

              • bluewing@lemm.ee · ↑2 · 28 days ago

                A simple Google search (which YOU could have done yourself) shows it’s about 1 accident per 1.5 million miles driven with FSD vs. 1 per 700,000 miles for manually driven cars. I’m no Tesla stan (I think they’re overpriced and deliberately for rich people only), but that’s an improvement, a noticeable improvement.

                And as an old retired medic who worked his share of car accidents over nearly 20 years: yes, humans do swerve off perfectly straight roads and hit trees and anything else in the way. And they do so at a higher rate.
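Taking the two figures above at face value (they are the commenter’s numbers, not verified here), the implied gap is a simple ratio:

```python
# Ratio of the two quoted miles-per-accident figures.
# Both values come from the comment above and are not verified here.

miles_per_accident_fsd = 1_500_000
miles_per_accident_manual = 700_000

improvement_factor = miles_per_accident_fsd / miles_per_accident_manual
print(f"FSD goes {improvement_factor:.2f}x as far per accident as manual driving")
```

Per these inputs, FSD logs a bit over twice as many miles per accident as manual driving.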

        • NιƙƙιDιɱҽʂ@lemmy.world · ↑4 · 29 days ago

          …It absolutely fails miserably fairly often, though, and would likely crash that often without human intervention. Not to the extent seen here, where there wasn’t even time for human intervention, but I frequently had to take over back when I used it (post v13).

        • Echo Dot@feddit.uk · ↑2 · 29 days ago

          Even with the distances I drive (and I barely drive my car anywhere since COVID), I’d probably only last about a month before the damn thing killed me.

          Even ignoring fatalities and injuries, I would still have to deal with the fact that my car randomly wrecked itself, which has to be a financial headache.

      • Echo Dot@feddit.uk · ↑6 ↓1 · 29 days ago

        That’s probably not the actual failure rate, but even a 1% failure rate is several thousand times higher than what NASA would consider an abort-risk condition.

        Let’s say it’s only a 0.01% risk; that’s still several thousand crashes per year. Even if we could guarantee that all of them were non-fatal and involved no bystanders such as pedestrians, the cost of replacing all of those vehicles every time they crashed, plus repairing the things they crashed into (lamp posts, shop windows, etc.), would be so high that it would exceed any benefit of the technology.

        It wouldn’t be as bad if this were prototype technology that was constantly improving, but Tesla has made it very clear they’re never going to add lidar scanners, so it’s literally never going to get any better; it’s always going to be this bad.
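To make the “several thousand crashes per year” figure concrete, here is one set of hypothetical round numbers that produces it (the 0.01% per-drive risk is from the comment; the fleet size and usage are assumptions for illustration, not Tesla data):

```python
# Fleet-scale effect of a small per-drive failure probability.
# Fleet size and usage are assumed round numbers, not real figures.

failure_prob_per_drive = 0.0001   # 0.01%
vehicles = 100_000                # assumed fleet using the feature
drives_per_day = 1
days_per_year = 365

expected_crashes_per_year = (
    failure_prob_per_drive * vehicles * drives_per_day * days_per_year
)
print(f"{expected_crashes_per_year:,.0f} expected crashes per year")
```

The point holds under many choices of inputs: a failure probability that sounds negligible multiplies out to thousands of incidents across a fleet-year.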

        • KayLeadfoot@fedia.io (OP) · ↑4 · 29 days ago

          …it’s literally never going to get any better; it’s always going to be this bad.

          Hey now! That’s unfair. It is constantly changing. Software updates introduce new regressions all the time. So it will be this bad, or significantly worse, and you won’t know which until it tries to kill you in new and unexpected ways :j

  • RandomStickman@fedia.io · ↑36 ↓1 · 1 month ago

    Anything outside of freshly painted and paved LA roads at high noon on a sunny day isn’t ready for self-driving, it seems.

      • Kusimulkku@lemm.ee · ↑2 ↓1 · 30 days ago

        I’m not sure about even the more advanced self-driving cars. Shit gets fucked with snow and all kinds of other stuff.

        Flummoxes many human drivers too tbh.

        • SpaceNoodle@lemmy.world · ↑5 · 30 days ago

          I’m confident that they all still need lots of work for advanced weather, but you’re not seeing a Waymo or a Zoox drive into a tree for no reason.

      • Zwuzelmaus@feddit.org · ↑12 · 1 month ago

        Tunnels are extra dangerous. Not because an accident is more likely, but because of the situation if one happens: it easily blocks the tunnel, fills it with smoke, and can kill hundreds.

        Except in newly built tunnels in rich countries.

  • Skyrmir@lemmy.world · ↑25 · 1 month ago

    I use autopilot all the time on my boat. No way in hell I’d trust it in a car. They all occasionally get suicidal. Mine likes to lull you into a false sense of security, then take a sharp turn into a channel marker or cargo ship at the last second.

    • Echo Dot@feddit.uk · ↑14 · edited · 1 month ago

      Isn’t there a plane whose autopilot famously keeps trying to crash into the ground? The general advice is to just not let it do that: whenever it looks like it’s about to crash into the ground, pull up instead.

      • kameecoding@lemmy.world · ↑14 · 1 month ago

        The Boeing 787 Max did that when the sensor got faulty and there was no redundancy for the sensors, because that was part of an optional add-on package.

        • mbtrhcs@feddit.org · ↑5 · edited · 30 days ago

          Even worse, the pilots and the airlines didn’t even know the sensor or associated software control existed and could do that.

      • GamingChairModel@lemmy.world · ↑13 · 30 days ago

        All the other answers here are wrong. It was the Boeing 737-Max.

        They fit bigger, more fuel efficient engines on it that changed the flight characteristics, compared to previous 737s. And so rather than have pilots recertify on this as a new model (lots of flight hours, can’t switch back), they designed software to basically make the aircraft seem to behave like the old model.

        And so a bug in the cheaper version of the software, combined with a faulty sensor, would cause the software to take over and try to override the pilots and dive downward instead of pulling up. Two crashes happened within 5 months, to aircraft that were pretty much brand new.

        It was grounded for a while as Boeing fixed the software and hardware issues, and, more importantly, updated all the training and reference materials for pilots so that they were aware of this basically secret setting that could kill everyone.

      • Skyrmir@lemmy.world · ↑2 ↓2 · 30 days ago

        Pretty sure that’s the Boeing 777 and they discovered that after a crash off Brazil.

    • dependencyinjection@discuss.tchncs.de · ↑13 · 1 month ago

      Exactly. My car doesn’t have AP, but it does have a shedload of sensors, and sometimes it just freaks out about stuff being too close to the car for no discernible reason. Really freaks me out; I’m like, what do you see, bro? We’re just driving down the motorway.

      • ayyy@sh.itjust.works · ↑4 · 29 days ago

        For mine, it’s the radar seeing the retro-reflective stripes on utility poles as brighter than it expects.

    • moving to lemme.zip. @lemm.ee · ↑7 · 30 days ago

      They have autopilot on boats? I never even thought about that existing. Makes sense; I’d just never heard of it until now!

      • JohnEdwa@sopuli.xyz · ↑6 · 30 days ago

        They’ve technically had autopilots for over a century; the first was the oil tanker J.A. Moffett in 1920. Though the main purpose is to keep the vessel going dead straight, as wind and currents would otherwise turn it, so in modern car terms I think it would be more accurate to say they have lane assist? Commercial ones can often do waypoint navigation, following a set route on a map, but I don’t think that’s very common on personal vessels.

  • sickofit@lemmy.today · ↑22 · 29 days ago

    This represents the danger of expecting driver override to avoid accidents. If the driver has to be prepared to take control in an accident like this AT ALL TIMES, then the driver is required to be more engaged than they would be if they were just driving manually, because they have to be constantly anticipating not just what other hazards (drivers, pedestrians, …) might be doing, but also in what ways their own vehicle may be trying to kill them.

    • Bytemeister@lemmy.world · ↑7 · 29 days ago

      Absolutely.

      I’ve got a car with level 2 automation, and after using it for a few months, I can say that it works really well, but you still need to be engaged to drive the car.

      What it is good at: maintaining lanes, even in tricky situations with poor paint/markings, and maintaining speed and distance from the car in front of you.

      What it is not good at: tricky traffic, congestion, or sudden stops; lane changes; accounting for cars coming up behind you; avoiding road hazards.

      I use it mostly like an autopilot. The car takes some of the monotonous workload out of driving, which allows me to move my focus from driving the car to observing traffic, other drivers, and road conditions.

    • KayLeadfoot@fedia.io (OP) · ↑15 ↓1 · 1 month ago

      You’re probably right about the future, but damn, I wish they would slow their roll and use LiDAR.

      • FaceDeer@fedia.io · ↑12 ↓8 · 1 month ago

        Elon Musk decided they absolutely would not use lidar, years ago when lidar was expensive enough that a decision like that made economic sense to at least try making work. Nowadays lidar is a lot cheaper but for whatever reason Musk has drawn a line in the sand and refuses to back down on it.

        Unlike many people online these days I don’t believe that Musk is some kind of sheer-luck bought-his-way-into-success grifter, he has been genuinely involved in many of the decisions that made his companies grow. But this is one of the downsides of that (Cybertruck is another). He’s forced through ideas that turned out to be amazing, but he’s also forced through ideas that sucked. He seems to be increasingly having trouble distinguishing them.

        • Echo Dot@feddit.uk · ↑11 · 1 month ago

          He’s forced through ideas that turned out to be amazing, but he’s also forced through ideas that sucked.

          He’s utterly incapable of admitting that one of his ideas is garbage.

          There is a reason he fawns all over Trump and that’s because both of them are of a type. Both of them have egos large enough to have their own gravitational fields but lack any real talent. Look his family up, they’re all like that.

        • Buffalox@lemmy.world · ↑3 · edited · 30 days ago

          Musk has drawn a line in the sand and refuses to back down on it.

          From what I’ve heard, the upcoming Tesla robotaxi test cars, based on the Model Y, are supposed to have LIDAR. But it’s ONLY the robotaxi version that has it.

          He seems to be increasingly having trouble distinguishing them.

          Absolutely, seems to me he has been delusional for years, and it’s getting worse.

    • madcaesar@lemmy.world · ↑2 · 29 days ago

      Self-driving via cameras IS NOT THE FUTURE!! Cameras are basically slightly better human eyes, and human eyes suck ass.

    • KayLeadfoot@fedia.io (OP) · ↑7 ↓1 · 1 month ago

      Ditto! They were about a foot from hitting the tree head-on rather than glancing off; it could easily have been fatal. Weirdly small axes of random chance that the world spins on.

      • Corkyskog@sh.itjust.works · ↑10 · 1 month ago

        I still don’t understand what made it happen. I kept watching shadows and expecting it to happen earlier.

        • Echo Dot@feddit.uk · ↑3 · 1 month ago

          They seriously need to pull FSD. If it were just a matter of people risking their own lives I wouldn’t mind, but they’re risking everyone else’s by driving this glitch machine around.

        • IllNess@infosec.pub · ↑3 · 1 month ago

          I thought it might be following the tire tracks but no. It just decided to veer completely off.

  • melsaskca@lemmy.ca · ↑13 · 30 days ago

    I have visions of Elon sitting in his lair, stroking his cat, and using his laptop to cause this crash. /s

  • atmorous@lemmy.world · ↑12 · edited · 28 days ago

    For no reason?

    They are running proprietary software in the car, and people don’t even know what’s happening in the background of it. Every electric car needs to be turned into an open-source car, so that the car can’t be tampered with, there’s no surveillance, etc.

    Everyone should advocate for that, because the alternative is this, with Tesla. And I know nobody wants this happening to other car manufacturers’ cars either.

  • itisileclerk@lemmy.world · ↑10 · 30 days ago

    Why would anyone be a passenger in a self-driving vehicle? Do they know they are test subjects, part of a “car trial” (or whatever it should be called)? Self-driving is not reliable and not necessary. Too much money is invested in something that is low priority to have. There are perfectly fast and safe self-driving solutions, like high-speed trains.

    • dan1101@lemm.ee · ↑2 · 30 days ago

      I have no idea, I guess they have a lot more confidence in self driving (ESPECIALLY Tesla) than I do.

  • Phoenixz@lemmy.ca · ↑11 ↓1 · 29 days ago

    I just don’t see how this technology could possibly be ready to power an autonomous taxi service by the end of next week

    That’s because it won’t, and that’s because Elmo Musk is, gasp, a liar. Always has been. That robotaxi is actually an older lie he used a couple of years prior, but he dusted it off and reused it.

    Anytime Elmo says he’s confident they can do it now, he means they’re nowhere near a real product. Anytime he says “next year”, it means it won’t ever happen. Anytime he says they already have a product and it just needs to be produced, it means it’ll never happen.

    He is a vaporware con man who has been cheating people (and mostly the US government) out of billions.

    Literally look at all of his promises over the last decade and you start seeing patterns. It’s always almost there.

    SpaceX, arguably his most successful company, the one he actually built with his leadership, is a shit show of lies. According to him we’d have colonies on Mars by now; it’s what he took 3 billion dollars in funding for, and he literally isn’t at 1% of that. Yet he keeps claiming: within a few years now! Three billion dollars, and he managed to blow up a banana over the Indian Ocean and obliterate a launch pad.

    If I commit fraud in the thousands, take thousands and then don’t deliver, I go to jail. He does it with countless billions and he’s still out there. But alas, his behavior is finally catching up with him; Tesla is going off a cliff now that nobody wants to drive a Nazi brick anymore.

    • atmorous@lemmy.world · ↑2 · 28 days ago

      If it were open-source tech, people could check it and see for themselves whether it’s really capable, but since it’s not, we don’t know what it’s missing to be way, way better.

      • Phoenixz@lemmy.ca · ↑3 · 28 days ago

        Nah, on the 5 levels of autonomous driving, Teslas are at level 2.

        Elmo isn’t even close, but that won’t stop him from just lying about it, because that is what Elmo does best.

  • otacon239@lemmy.world · ↑10 · 1 month ago

    I fear the day I’m on the receiving end of a “glitch.” It’s ridiculous that anyone can think these are safe after how many of these videos I’ve seen.