Current language models (misnomered as “AI”) are great at some things but fail at any real thinking.

Self-driving works decently in predictable environments, but anything outside those limits can make it literally crash and burn.

Driving on public roads just involves too many situations where real decision-making is required.

Despite all their resources, I think they’ve given up. All the brilliant engineers and scientists have given up, because they know what we’ve suspected for a long time.

  • Darkmoon_AU@lemmy.zip · edited · 19 hours ago

    You lost me at “misnomered as AI”. Artificial intelligence as a branch of Computer Science is a broad field covering not only LLMs - themselves a sub-branch of Deep Learning - but also Fuzzy Logic, Bayesian Reasoning and other statistical methods.

    By any canonical definition, LLMs very much are AI.

    Perhaps you meant ‘AGI’, as in Artificial General Intelligence, which means ‘human-level across every domain’. Or perhaps you know this and you’re just being snarky about their current capabilities.

    • scbasteve@lemmy.world · 18 hours ago

      They’re saying that people are now referring to AI as LLMs only.

      Misnomer: “an incorrect, inappropriate, or unsuitable name or designation given to a person, place, or thing”

      It’s like calling all squares rectangles. Yes, it’s technically correct; however, it’s an unsuitable name. It’s misleading.

      • keanu0396@sh.itjust.works · 14 hours ago

        I would also add that the average person does not differentiate between AI and AGI. I’d bet most haven’t even heard of AGI.