• gerryflap@feddit.nl · 3 months ago

    No. ChatGPT pulls information out of its ass, and as I read it, SearchGPT actually links to sources (while also summarizing them and presumably still pulling some information out of its ass). ChatGPT “knows” things and SearchGPT should actually look stuff up and present it to you.

    • kosmoz@lemm.ee · 3 months ago

      Kagi has supported this for a while. You can end your query with a question mark to request a “quick answer” generated by an LLM, complete with sources and citations. It’s surprisingly accurate and useful!
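
      Under the hood, a “quick answer” like that is basically search-then-summarize: run a normal search, hand the top results to the model, and have it cite them. A minimal sketch of the idea (search_web and ask_llm are made-up stand-ins, not Kagi’s actual API):

      ```python
      # Rough sketch of a search-then-summarize "quick answer".
      # search_web() and ask_llm() are hypothetical stand-ins, not any provider's real API.
      from dataclasses import dataclass

      @dataclass
      class Result:
          title: str
          url: str
          excerpt: str

      def search_web(query: str, limit: int = 5) -> list[Result]:
          ...  # query an ordinary search index and return ranked results

      def ask_llm(prompt: str) -> str:
          ...  # call whatever language model is available

      def quick_answer(query: str) -> str:
          results = search_web(query)
          numbered = [f"[{i}] {r.title}: {r.excerpt}" for i, r in enumerate(results, 1)]
          prompt = (
              "Answer the question using only the numbered sources below, "
              "and cite them like [1].\n\n" + "\n".join(numbered)
              + f"\n\nQuestion: {query}"
          )
          answer = ask_llm(prompt)  # the model only summarizes what was retrieved
          sources = "\n".join(f"[{i}] {r.url}" for i, r in enumerate(results, 1))
          return f"{answer}\n\nSources:\n{sources}"
      ```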

    • helenslunch@feddit.nl · 3 months ago

      ChatGPT “knows” things and SearchGPT should actually look stuff up and present it to you.

      …where do you think CGPT gets the information it “knows” from?

          • featured [he/him]@lemmygrad.ml · 3 months ago

            I mean, yeah, it does include data scraped from the web, but that data is all three years old at this point. Hardly a search engine by any metric.

          • Xavienth@lemmygrad.ml · 3 months ago

            This is like saying the library search engine and Bob the drunkard who looked at the shelf labels and swears up and down he knows where everything is are the same thing.

            Look, ChatGPT is an averaging machine. Yes, it has ingested a significant chunk of the text on the internet, but it does not reproduce text exactly as it found it; it produces an average of all the text it has seen, weighted towards what seems to make sense for the situation. For really common information this is fine. For niche information, it is bullshitting without any indication.
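
            A toy way to picture that “averaging”: count how often each continuation showed up in training text, and the generic phrasing swamps the rare-but-correct detail. The numbers below are made up, purely for illustration:

            ```python
            # Toy illustration: prevalence wins, no lookup happens.
            from collections import Counter

            # made-up counts of what followed "the library opens at" in training text
            continuations = Counter({
                "9 am": 12000,                  # generic, very common phrasing
                "8 am": 7000,
                "7:30 am on Thursdays": 3,      # the niche, actually-correct detail
            })

            total = sum(continuations.values())
            for text, count in continuations.most_common():
                print(f"{text!r}: p = {count / total:.4f}")

            # whichever library you meant, the model overwhelmingly says "9 am":
            # the niche fact is drowned out by more prevalent training data
            ```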

            • helenslunch@feddit.nl · 3 months ago

              This is like saying the library search engine and Bob the drunkard who looked at the shelf labels and swears up and down he knows where everything is are the same thing.

              It’s…not remotely the same thing?

              It’s like saying an engine that searches the web for answers to your query is a search engine…?

              but it does not reproduce text exactly as it found it

              Nor does SearchGPT.

              • Xavienth@lemmygrad.ml · 3 months ago

                ChatGPT is not a search engine; it generates predictions on what is the most likely text completion to your prompt. It does not pull information from a database. It is a mathematical model. Its weights do not contain the training data. It is not indexing anything. You will not find any page from the internet in the model. It is all averaged out, and any niche detail is lost, overpowered by more prevalent but less relevant training data. This is why it bullshits. When it bullshits, it is not because it searched for something and came up empty; it is because the training data simply did not contain enough occurrences of the answer to outweigh all the other, more prevalent training data. ChatGPT does not search anything.
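
                The difference shows up even in miniature: an index either returns the stored page or honestly comes up empty, while a generative model has no “not found” state and just emits the most plausible-sounding completion either way. Toy stand-ins below, obviously not how either system is actually built:

                ```python
                # Toy contrast between an index lookup and next-token prediction.

                index = {
                    "python gil": "https://docs.python.org/3/glossary.html#term-global-interpreter-lock",
                }

                def search(query: str):
                    # a real index returns the stored document or nothing at all
                    return index.get(query.lower())

                def generate(prompt: str) -> str:
                    # stand-in for a language model: no database, no concept of "not found",
                    # just the statistically likely continuation
                    return f"The answer to '{prompt}' is ... (fluent, confident, maybe wrong)"

                print(search("some obscure 2024 API change"))    # None - the index admits it has nothing
                print(generate("some obscure 2024 API change"))  # plausible text regardless
                ```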

                • helenslunch@feddit.nl · 3 months ago

                  ChatGPT is not a search engine

                  It is every bit as much of a search engine as SearchGPT, with the exception of more recent information, as I’ve already explained.

                  it generates predictions on what is the most likely text completion to your prompt.

                  …using information from the internet. I’m honestly baffled this needs to be explained. Once again, I ask: where do you think the information it generates comes from? It’s not just word salad; the words contain information. Were you unaware of the many, many OpenAI lawsuits based on this fact?

                  This is why it bullshits.

                  It bullshits because it’s trained on bullshit, and doesn’t actually know anything, and isn’t programmed to say “I don’t know”.

                  • Xavienth@lemmygrad.ml · 3 months ago

                    The information it generates comes from the model. The information in the model comes from the internet. The information it generates does not come from the internet. A to B to C, not A to C. I don’t know how to explain this more simply without crayons: the information from the internet does not exist within the model, but an average of that information can be recreated by the model. That is not what a fucking search engine does. A search engine doesn’t give you the average of the results for your query; it gives you the most relevant results. At least, they should, and they used to. I can understand the confusion if you’ve only used a search engine in the past 3 years.
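
                    Compare that with what even a crude search engine does: score every stored document against the query and return the best matches verbatim, never a blend of them. Keyword overlap below is just a stand-in for real ranking like BM25:

                    ```python
                    # Toy relevance ranking: return the most relevant stored documents as-is.

                    docs = [
                        "Rust ownership and the borrow checker explained",
                        "A beginner's guide to Python decorators",
                        "Fixing common borrow checker errors in Rust",
                    ]

                    def score(query: str, doc: str) -> int:
                        # crude keyword overlap; real engines use BM25, link analysis, etc.
                        return len(set(query.lower().split()) & set(doc.lower().split()))

                    query = "rust borrow checker"
                    ranked = sorted(docs, key=lambda d: score(query, d), reverse=True)
                    print(ranked[0])   # the single most relevant document, returned verbatim
                    ```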

      • gerryflap@feddit.nl · 3 months ago

        From the training dataset, which was frozen years ago. It’s like knowing something rather than looking it up. It doesn’t provide sources; it just makes shit up based on what was in the (old) dataset. That’s totally different from looking up information based on what you know and then using that new information to create an informed answer backed up by sources.