• kadu@lemmy.world
    22 hours ago

    LLMs don’t have any awareness of their internal state, so there’s no way for them to see something as a gap of knowledge.

    • Doorknob@lemmy.world
      22 hours ago

      Took me ages to understand this. I’d thought, "If an AI doesn’t know something, why not just say so?"

      The answer is: that wouldn’t make sense, because an LLM doesn’t know ANYTHING.
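The point both comments are making can be sketched with a toy next-token sampler (the logit table below is made up purely for illustration): the model always emits some token from its probability distribution. A question it "can't answer" just produces a flatter distribution — there is no code path that detects a knowledge gap and refuses.

```python
import math
import random

# Hypothetical per-context logits standing in for a trained model's output.
# These numbers are invented for illustration only.
LOGITS = {
    "The capital of France is": {"Paris": 5.0, "Lyon": 1.0, "blue": -2.0},
    # A question with no real answer: logits are just uniformly weak.
    "The capital of Wakanda is": {"Paris": 0.3, "Lyon": 0.2, "blue": 0.1},
}

def softmax(logits):
    """Turn logits into a probability distribution over tokens."""
    m = max(logits.values())
    exps = {t: math.exp(v - m) for t, v in logits.items()}
    total = sum(exps.values())
    return {t: e / total for t, e in exps.items()}

def next_token(context):
    """Sample one token. Note: this ALWAYS returns a token from the
    vocabulary -- there is no branch that checks 'do I know this?'"""
    probs = softmax(LOGITS[context])
    r = random.random()
    acc = 0.0
    tok = None
    for tok, p in sorted(probs.items()):
        acc += p
        if r <= acc:
            return tok
    return tok  # guard against floating-point rounding
```

Run `next_token("The capital of Wakanda is")` repeatedly and it will confidently hand back "Paris", "Lyon", or "blue" — uncertainty only flattens the distribution, it never surfaces as an "I don't know".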