• NutinButNet · 20 hours ago

    The ones I use do provide citations, though I sometimes find dead links, and I have called that out so it can provide another link for the information it is sharing.

    No one should blindly trust any LLM, or any single source for that matter. At the very least, there should be two unrelated sources saying the same thing, with the LLM not counting as one of the two since it is just regurgitating information.

    But it really depends on the usage and how serious it is to you. For health stuff? Should definitely seek multiple unrelated sources. Wanting to know some trivia about your favorite movie? Maybe check 1 of the sources if it’s that important to you. Definitely should get the source if writing a paper for school, etc.

    I use the LLM as a search engine of sorts, and in some of the work I do I rely on the sources it shares more than the information it provides. I sometimes find it easier than my usual methods, which no longer always work, or for situations like trying to describe something that takes a lot of words.

    • feddylemmy@lemmy.world · 18 hours ago

      I use it as a search engine too. You can ask most LLMs to cite a claim. Then you can evaluate the claim based on the credibility of the source. They’re somewhat decent at summarizing and the corollary is that they’re somewhat decent at going through a lot of material to find something that might be what you’re looking for. It’s all about the verification of the source though.

      • quacky@lemmy.worldOP · 15 hours ago

        They may work as a search engine, but they omit information, so the results are only the AI-approved ones. They are like horse blinders: limiting your field of vision while letting you see only the range of tolerable or preferred ideas. They may also lead you in the wrong direction or distract you.

            • feddylemmy@lemmy.world · 13 hours ago

              That source does not disagree with my statement that all information brokers have bias. It also does not address the fact that you can verify the source the LLMs give you.

            • Artisian@lemmy.world · 12 hours ago

              Yeah, LLMs have consistently broken information firewalls on initial release. We’re actually pretty bad at biasing them intentionally (so far; this is definitely solvable).

              The point of a search engine is to omit irrelevant information, but what is relevant is extraordinarily hard/subjective/nuanced. Compare with, say, YouTube demonetization.

    • quacky@lemmy.worldOP · 19 hours ago

      The ones that provide sources, like Perplexity or Gemini, seem to include blogs and Reddit posts. They can’t differentiate a good source from a bad one.

      Also, this relates back to the bias problem: even if they are citing credible sources, their interpretation of those sources is unreliable as well.