Wouldn’t it cut down on search queries (and thus save resources) if I could search for “this is my phrase” rather than rawdogging it as an unbound series of words, each of which seems to be pulling up results unconnected to the other words in the phrase?

I can think of only two reasons why a website’s search engine would lack this incredibly basic functionality:

  1. The site wants you to spend more time there, seeing more ads and padding out their engagement stats.
  2. They’re too stupid to realize that these sorts of bare-bones search engines are close to useless, or they just don’t think it’s worth the effort. Apathetic incompetence, basically.

Is there a sound financial or programmatic reason for running a search engine which has all the intelligence of a turnip?

Cheers!

EDIT: I should have been a bit more specific: I’m mainly talking about search engines within websites (rather than DDG or Google). One good example is BitTorrent sites; they rarely let you define exact phrases. Most shopping websites, even the behemoth Amazon, don’t seem to respect quotation marks around phrases.

  • HuntressHimbo@lemm.ee · 141 points · 4 days ago

    Because business majors decided a search engine’s primary job was actually to serve you ads rather than to help you search for things.

  • xia@lemmy.sdf.org · 19 points · 3 days ago

    I could FEEL when amazon removed the not and quote functions… now it’s nigh-unusable.

    • FauxPseudo@lemmy.world · 3 points · 3 days ago

      I hate trying to search for specific things on Amazon because negative operators don’t work. I’m frequently trying to find products that don’t contain specific words, like when I wanted a foam mattress cover that wasn’t cooling (I need all the heat I can get when sleeping). But trying to find one that wasn’t marketed as cooling? No such luck. I even tried outside search engines that honor negatives, but Amazon has thwarted every attempt to find what I want.

      • Modern_medicine_isnt@lemmy.world · 4 points · 3 days ago

        I use the browser’s find-on-page search to highlight the word I want to avoid, so I can quickly scroll through and skip those items. It sucks that I have to do that, but at least it helps a bit.

  • thirteene@lemmy.world · 3 points · 3 days ago (edited)

    It’s because websites interpret those characters differently, since code has to be written with the same characters you type on a regular keyboard. Essentially, “>” gets used as a comparison operator in programming languages, which means it’s used as an instruction telling the computer what to do. When we need to display the symbol instead, we write it as the “escaped” form &gt;, which basically means “treat it as the symbol, not as an instruction.” Often search engines will use a very powerful tool called a regular expression, which for phone numbers looks like this: ^\(\d{3}\)\s\d{3}-\d{4}

    Each piece means something: ^ means “start of the string”, \( and \) mean literal parentheses, \d means a digit, and {3} means “three of whatever comes right before me”. Breaking apart search parameters like this is pretty complex and it needs to happen FAST, so at a certain point the developers just throw away anything that could be a security concern, especially special characters like &^|`"'* since they can be used to maliciously attack the search engine.

    For other characters: https://www.w3schools.com/html/html_entities.asp
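
    To make that concrete, here’s a rough sketch (hypothetical Python, not any actual site’s backend) of both halves: the phone-number pattern above, and the blunt kind of character stripping that eats your quotes and minus signs before the query ever reaches the index:

    ```python
    import re

    # The phone-number pattern from above: \( and \) are escaped so they mean
    # literal parentheses, \d{3} means exactly three digits, \s is whitespace.
    phone = re.compile(r'^\(\d{3}\)\s\d{3}-\d{4}$')
    print(bool(phone.match('(555) 123-4567')))   # True
    print(bool(phone.match('555-123-4567')))     # False

    # Hypothetical example of the "throw it away" approach: any character that
    # could be a metacharacter somewhere is simply discarded, which is exactly
    # why quotes and negative operators never make it to the actual search.
    SPECIALS = set("&^|`'\"*<>()-")

    def naive_sanitize(query: str) -> str:
        return ''.join(ch for ch in query if ch not in SPECIALS)

    print(naive_sanitize('"foam mattress cover" -cooling'))
    # foam mattress cover cooling   <- operators silently gone
    ```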

  • RedStrider@lemmy.world · 13 points · 4 days ago

    I’m convinced there’s an AI in Google search now that reinterprets what you put in. It never seems to give me what I searched for, only what it thinks I mean.

    • AndrewZabar@lemmy.world · 12 points · 3 days ago

      I said so-fucking-long to Google long ago and switched to DuckDuckGo. If I ever get nowhere and think maybe googs might have a result for me, then on the Duck you can just put !g before your terms and it hands the search off to Google.

      Their focus shifted long ago from being the best to just figuring out new ways to get more out of users, no matter how deceitful and manipulative they need to be.

  • BenLeMan@lemmy.world · 8 points · 3 days ago

    Guys, please. The solution to Google reinterpreting your search queries has been around for years, and it is called VERBATIM SEARCH. (Search options: All results -> Verbatim). Voila, welcome back to 1997.

  • schnurrito@discuss.tchncs.de · 5 points · 3 days ago

    For the most part I think they do. I frequently use quoted strings in my search queries (on DDG and Google, I hardly ever use any other search engines) and it usually seems to show me more relevant results when I do.

    But in general the WWW is now so big that search engines have had to become more and more complex (and think for themselves instead of taking queries completely literally) in order to be useful at all.

  • NeatNit@discuss.tchncs.de · 9 points · 3 days ago

    I don’t know the answer but I can tell you two things:

    1. It has often been beneficial to me when the search query wasn’t taken literally; it’s not always a bad thing. Many searches are ones where the user doesn’t know exactly what they’re looking for. Granted, that’s definitely not always the case. That said, I don’t remember ever catching it outright ignoring stuff like quoted words/phrases.
    2. Regarding “save resources”, Google introduced Instant Search in 2010 which started showing results as you type, thus creating an ungodly amount of extra load on their servers since each user search now created multiple queries. They clearly have no trouble scaling up resources.

  • Thorry84@feddit.nl · 12 points · 4 days ago

    Because search engines are much more complicated than you seem to think. The reason the operators worked back in the day (probably later than 1997, though) was that the search engines actually searched through the contents of the pages they indexed. They used a lot of tricks to make it work, but basically they were matching your keywords directly against the index.
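
    As a toy sketch of that old-school approach (grossly simplified, hypothetical Python; real engines layered ranking, stemming and endless tricks on top): build an inverted index of word positions, and an exact-phrase query is just a check that the words appear at consecutive positions in the same document:

    ```python
    from collections import defaultdict

    docs = {
        1: "cheap foam mattress cover with cooling gel",
        2: "foam mattress cover extra warm no cooling",
    }

    # Inverted index: word -> {doc_id -> [positions where the word occurs]}
    index = defaultdict(lambda: defaultdict(list))
    for doc_id, text in docs.items():
        for pos, word in enumerate(text.split()):
            index[word][doc_id].append(pos)

    def phrase_search(phrase: str) -> set[int]:
        """Return doc ids where the words occur at consecutive positions."""
        words = phrase.split()
        first, rest = words[0], words[1:]
        hits = set()
        for doc_id, positions in index[first].items():
            for start in positions:
                if all(start + i + 1 in index[w].get(doc_id, [])
                       for i, w in enumerate(rest)):
                    hits.add(doc_id)
        return hits

    print(phrase_search("mattress cover"))   # {1, 2}
    print(phrase_search("cooling gel"))      # {1}
    ```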

    Modern search engines are much more complex than that, using a lot more abstraction and interesting techniques to both index and search. The amount of data being indexed has exploded since then, the number of users has exploded, and the way people use the internet has changed. To keep costs down and search times low, search engines needed to change drastically. And because most people using search engines don’t know how to use those features, they weren’t preserved.

    I do wonder what kind of search engines you are talking about, though. I assume you mean the big ones like Google and Bing (or sites using those engines) and not a simple product search on a small webshop. Because as frustrating as using Google and Bing has gotten, they are still amazing tech and not bare-bones at all. The reasons for their failings are only partly within their control and not even really their fault (except for the AI thing Google tried; that was 100% their fault and just dumb).

    • fishos@lemmy.world · 11 points · 4 days ago

      Bro, this is just a load of shit. Google removed them by choice, not because of some tech need. Better search engines still use them to great effect.

      You just posted a very long rambling justification for injecting ads and other shit into the results instead of giving you what you asked for.

    • Optional@lemmy.world · 8 points · 4 days ago

      Have you read the article “The Man Who Killed Google Search”? Google seems to have gone out of their way to screw it up and have roadmapped only more screwage.

  • corroded@lemmy.world · 5 points · 3 days ago

    I’m going to break with what most people are saying and offer the suggestion that search engines are actually doing a decent job. If my mother searches Google for the phrase “Can you please show me a recipe for apple pie?”, she’s probably going to get a recipe for apple pie. If I search Google for “c++20” “std::string” “constructors”, after I skip over the ads, I’m most likely going to get a web page that shows me the constructors for std::string in c++20.

    Ad-sponsored pages and AI bullshit aside, most search engines do still give decent results.

  • db2@lemmy.world · 7 points · 4 days ago

    It’s cheaper for them not to do it, and you’ll still search anyway, so they don’t care.

    • j4k3@lemmy.world · 5 points · 4 days ago (edited)

      Not any more. I use an offline open source LLM first quite a bit now because it is better than their junk. It may only be accurate 80% of the time, but that is a far higher percentage than any present search engine.

      People complain about web scrapers, but scraping is the only practical alternative for finding info and sources now that the web crawlers are worse than trash.

      • brygphilomena@lemmy.world · 2 points · 4 days ago

        No. The issue is websites are trash, not the crawlers. SEO has created a weird amalgamation of content, filler, and keywords. It’s why recipe sites have stories with every recipe.

        Google very much is responsible for the current web design though.

  • A_A@lemmy.world · 5 points · 4 days ago (edited)

    We need a browser extension that would post-filter the garbage provided by search engines… they are treating us like their product, so we should treat their garbage according to its real value.
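
    Something like this (a hypothetical sketch in Python rather than an actual extension) is all the post-filter would need to do: take whatever the engine returns and keep only the results that actually contain your quoted phrase:

    ```python
    def post_filter(results: list[dict], phrase: str) -> list[dict]:
        """Keep only results whose title or snippet contains the exact phrase."""
        needle = phrase.lower()
        return [r for r in results
                if needle in r.get("title", "").lower()
                or needle in r.get("snippet", "").lower()]

    # Hypothetical results as a search engine might hand them back
    results = [
        {"title": "Foam mattress cover, cooling gel", "snippet": "Stay cool all night"},
        {"title": "Classic foam mattress cover", "snippet": "This is my phrase, literally"},
    ]
    print(post_filter(results, "this is my phrase"))
    # [{'title': 'Classic foam mattress cover', 'snippet': 'This is my phrase, literally'}]
    ```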