Wouldn’t it cut down on search queries (and thus save resources) if I could search for “this is my phrase” rather than rawdogging it as an unbound series of words, each of which seems to be pulling up results unconnected to the other words in the phrase?

There are only 2 reasons I can think of why a website’s search engine lacks this incredibly basic functionality:

  1. The site wants you to spend more time there, seeing more ads and padding out their engagement stats.
  2. They’re just too stupid to know that these sorts of bare-bones search engines are close to useless, or they just don’t think it’s worth the effort. Apathetic incompetence, basically.

Is there a sound financial or programmatic reason for running a search engine which has all the intelligence of a turnip?

Cheers!

EDIT: I should have been a bit more specific: I’m mainly talking about search engines within websites (rather than DDG or Google). One good example is BitTorrent sites; they rarely let you define exact phrases. Most shopping websites, even the behemoth Amazon, don’t seem to respect quotation marks around phrases.

  • Thorry84@feddit.nl · 4 months ago

    Because search engines are much more complicated than you seem to think. The reason the operators worked back in the day (probably later than 1997, though) is that the search engines actually searched through the contents of the pages they indexed. They used a lot of tricks to make it work, but basically they were matching keywords directly against the index.
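
    A toy illustration of that “matching keywords directly to the index” step (made-up documents purely to show the idea, nothing like a real engine’s code):

```python
# Toy inverted index: map each word to the set of documents containing it,
# then answer a query by intersecting those sets (a bare AND of keywords).
docs = {
    1: "cheap used cars for sale",
    2: "used car parts",
    3: "cheap flights and hotels",
}

index = {}
for doc_id, text in docs.items():
    for word in text.split():
        index.setdefault(word, set()).add(doc_id)

def search(query):
    # Keep only documents that contain every query word.
    sets = [index.get(word, set()) for word in query.split()]
    return set.intersection(*sets) if sets else set()

print(search("cheap used"))  # {1}
```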

    Modern search engines are much more complex than that, using a lot more abstraction and interesting techniques to both index and search. The amount of data being indexed has exploded since then, the number of users has exploded, and the way people use the internet has changed. To keep costs down and search times low, search engines needed to change drastically. And because most people using search engines don’t know how to use those features, the features didn’t get preserved.

    I do wonder what kind of search engines you are talking about, though. I assume you mean the big ones like Google and Bing (or sites using those engines) and not a simple product search on a small webshop. Because as frustrating as using Google and Bing has gotten, they are still amazing tech and not bare-bones at all. The reasons for their failings are only partly within their control and not even really their fault. (Except for the AI thing Google tried; that was 100% their fault and just dumb.)

    • Optional@lemmy.world · 4 months ago

      Have you read the article “The Man Who Killed Google Search”? Google seems to have gone out of their way to screw it up and has roadmapped only more screwage.

    • fishos@lemmy.world · 4 months ago

      Bro, this is just a load of shit. Google removed them by choice, not because of some tech need. Better search engines still use them to great effect.

      You just posted a very long rambling justification for injecting ads and other shit into the results instead of giving you what you asked for.

    • conciselyverbose@sh.itjust.works · 4 months ago

      Here’s the thing though. You absolutely could still use operators reasonably well even if the results are fuzzier.

      You just use them to control how you leverage the algorithm. AND feeds the algorithm the two sides and filters to results that appear on both. OR joins the two result sets. “Filetype” filters the result set for results that are the relevant file type. Etc.

      If they’re not that common, they’re not going to have meaningful costs, especially when most power users don’t use them for most of their searches.
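
      A rough sketch of that layering (toy code with hard-coded stand-in results, not anyone’s actual engine): treat the existing fuzzy search as a black box and just combine or filter its result sets.

```python
# Toy sketch: fuzzy_search() stands in for whatever the engine already does
# and returns a set of (url, filetype) results. The operators only combine
# or filter those sets; the ranking internals are untouched.
def fuzzy_search(terms):
    fake = {  # hard-coded results, purely for illustration
        "budget laptop": {("a.com/review.html", "html"), ("b.org/guide.pdf", "pdf")},
        "linux": {("b.org/guide.pdf", "pdf"), ("c.net/wiki.html", "html")},
    }
    return fake.get(terms, set())

def and_op(left, right):
    return fuzzy_search(left) & fuzzy_search(right)  # results in both sets

def or_op(left, right):
    return fuzzy_search(left) | fuzzy_search(right)  # results in either set

def filetype(results, ext):
    return {r for r in results if r[1] == ext}  # keep only matching file types

print(filetype(and_op("budget laptop", "linux"), "pdf"))
# {('b.org/guide.pdf', 'pdf')}
```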

      • Thorry84@feddit.nl · 4 months ago

        Most search engines already have filters for stuff like filetype, for limiting the search to a specific site, and for time and location (when applicable).

        Like I said, search engines are way more complicated than one might think. Doing the kinds of things you mention would be hard, and very few people would need something like that. There are tools out there that do meta-searching for analysis, though, so you can use search results as data in your analysis. Most of those are highly specialized and often paid, but when you need them it’s worth the price.

        Remember, companies like Google invest millions (if not billions) into their search engines and have huge teams working on them. Anytime someone says “Why don’t they just…”, the answer is probably very long and complicated.

        • conciselyverbose@sh.itjust.works · 4 months ago

          No, it isn’t even a little hard. It’s super simple pre-parsing of the input that can trivially be done client side before the query even touches their server. Advanced users who use those tools are perfectly capable of taking the extra step to indicate to the engine that they’re doing a real search, and the worst case is still far less intensive per search than any of the LLM nonsense they added to every search, which is almost never useful in any way.
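
          For what it’s worth, that pre-parsing step really is tiny. A toy sketch (illustrative only, not any engine’s real code) of pulling quoted phrases and bare keywords out of a query before it’s ever sent anywhere:

```python
import re

def preparse(query):
    # Toy client-side pre-parse: extract "quoted phrases" and bare keywords.
    # Nothing here needs the index or the ranking algorithm at all.
    phrases = re.findall(r'"([^"]+)"', query)
    keywords = re.sub(r'"[^"]+"', " ", query).split()
    return {"phrases": phrases, "keywords": keywords}

print(preparse('"this is my phrase" cheap torrent'))
# {'phrases': ['this is my phrase'], 'keywords': ['cheap', 'torrent']}
```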

          They choose not to. It’s exactly that simple.