• Croquette@sh.itjust.works
    5 days ago

    LLMs are just sophisticated text-prediction engines. They don’t know anything, so they can’t produce an “I don’t know”: they can always generate a next-token prediction, and they can’t think.
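
    A toy sketch of what I mean (random logits standing in for a real network, so this is an illustration of the structure, not an actual LLM): the sampler always yields a valid probability distribution over the vocabulary, so *some* token always comes out. “I don’t know” only appears if those tokens happen to get probability mass, not because the model recognizes a gap in its knowledge.

    ```python
    import numpy as np

    # Toy "language model": for any context, produce a probability
    # distribution over a fixed vocabulary and sample the next token.
    vocab = ["the", "cat", "sat", "on", "mat", "I", "don't", "know"]
    rng = np.random.default_rng(0)

    def next_token(context):
        # A real LLM computes logits from the context with a neural net;
        # random logits stand in here. The point is structural: softmax
        # always yields a valid distribution, so a token always comes out.
        logits = rng.normal(size=len(vocab))
        probs = np.exp(logits) / np.exp(logits).sum()
        return rng.choice(vocab, p=probs)

    context = ["what", "is", "the", "answer"]
    for _ in range(5):
        context.append(next_token(context))
    print(" ".join(context))
    ```

    There is no code path that returns “no answer”; generation only ever stops when a stop token is sampled, and that token is predicted like any other.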

    • Cyberflunk@lemmy.world
      5 days ago

      Tool use, reasoning, and chain of thought are the things that set LLM systems apart. While you’re correct in the most basic sense, it’s like saying a car is only a platform with wheels: it’s reductive of the capabilities.
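
      A rough sketch of that layering, with a hypothetical llm() stub standing in for the actual model call (no real API is assumed here): the tool loop lives outside the model, but it changes what the overall system can do.

      ```python
      import re

      def llm(prompt):
          # Hypothetical stub for a model call. A real LLM would emit a
          # tool request as text; here we hard-code the two turns.
          if "[tool result:" in prompt:
              return "17 * 23 = 391."
          return "CALL calculator: 17 * 23"

      def calculator(expr):
          # Deliberately restricted eval for the demo.
          if re.fullmatch(r"[\d\s+\-*/().]+", expr):
              return str(eval(expr))
          return "error"

      def run(prompt):
          out = llm(prompt)
          match = re.match(r"CALL calculator: (.+)", out)
          if match:
              result = calculator(match.group(1))
              # Feed the tool result back so the model can finish its answer.
              out = llm(prompt + f"\n[tool result: {result}]")
          return out

      print(run("What is 17 * 23?"))
      ```

      The model itself is still only predicting text in both turns; the wrapper is what turns those predictions into a system that can actually compute.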

      • Croquette@sh.itjust.works
        5 days ago

        LLMs are prediction engines. They don’t have knowledge; they only chain together words related to your topic.

        They don’t know when they’re wrong because they just don’t know anything, period.