People have always misused search engines by typing whole questions as a search…

With AI they can still do that and get, I think in their opinion, a better result.

  • Lembot_0006@programming.dev · 2 days ago

    An LLM can be used as a search engine for things you know absolutely zero terminology about. That’s convenient. You can’t ask Google for “tiny striped barrels with wires” and expect to get an explanation of resistor markings.
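    (The resistor example happens to be easy to make concrete: the color bands on those “tiny striped barrels” encode a number. A minimal illustrative sketch of the standard 4-band code, not anyone’s actual implementation:)

    ```python
    # Standard 4-band resistor color code: two digit bands, a multiplier
    # band, and a tolerance band (ignored here for simplicity).
    DIGITS = {"black": 0, "brown": 1, "red": 2, "orange": 3, "yellow": 4,
              "green": 5, "blue": 6, "violet": 7, "grey": 8, "white": 9}

    def resistor_ohms(band1, band2, multiplier):
        """Resistance in ohms from the first three color bands."""
        return (DIGITS[band1] * 10 + DIGITS[band2]) * 10 ** DIGITS[multiplier]

    print(resistor_ohms("brown", "black", "red"))     # 1000  (1 kΩ)
    print(resistor_ohms("yellow", "violet", "orange"))  # 47000 (47 kΩ)
    ```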

    • BenderRodriguez@lemmy.world · 2 days ago

      It sounds like you might be referring to miniature striped barrels used in crafts or model-making, often decorated or with wire elements for embellishment or functionality. These barrels can be used in various DIY projects, including model railroads, dioramas, or even as decorative items.

    • morto@piefed.social · edited · 2 days ago

      Reverse image search would let you find that answer more accurately than some LLM.

        • morto@piefed.social · 2 days ago

          When you see something and have no idea what it is, you just take a photo and do a reverse search, finding other similar photos and the name of the thing. You don’t even need to spend time describing what you see, and there’s no chance of getting a confidently wrong answer. Reverse image search has existed for more than a decade and doesn’t use LLMs.

          • Lembot_0006@programming.dev · 2 days ago

            ML is ML, no matter if it is an LLM or not. And the question “What is this thing?” covers a negligibly tiny percentage of search requests.

            • morto@piefed.social · 2 days ago

              It’s not all the same. Application-specific ML models tend to be much smaller and demand far fewer resources than LLMs. They also tend to be more precise.

              > And the question “What is this thing?” covers a negligibly tiny percent of search requests.

              I was just addressing the given example