I'm assuming you're using ChatGPT to validate/generate technical solutions such as code and not necessarily searching for specific information? If its for information search, then how do you deal with the fact that at times it tends to make up a things that are factually incorrect or logically inconsistent?
By using Google to verify. It’s still okayish to find out whether a statement is true or not. The problem is that Google is becoming worse to find that statement. For example, I wanted to figure out how I should replace a given dependency. Google was terrible, so was Stackoverflow. ChatGPT gave me a guess. For such questions it’s quite right in about 3/4th of the queries, and when I search for the given guess Google finds it immediately whether it’s fine or not.
I was trying to figure out whether kerberos supports authorization and authentication. chatGPT sometimes said yes, sometimes said no.. Google had similar results. I had to dig through the content to assess which Google result was the truth.
As AI generated content spreads to the internet, the truth will be harder and harder to find.
Current plans for language-model-driven search is to have the answers driven by a normal search index. So the LLM can run ask to run a web search and the result gets copypasted back into its input prompt. Basically, it's Googling for you and summarizing the result.
Which makes me wonder how long until websites figure out how to convince the index to do prompt injections that wouldn't fool an actual human looking at results on a search engine results page.
There are other search engines, such as Marginalia. The bigger question is what happens if websites start disappearing, this is a bigger problem than you might think because even websites like Wikipedia, reference news websites, which are visitor- and therefore ad-driven.
I'm assuming you're using ChatGPT to validate/generate technical solutions such as code and not necessarily searching for specific information? If its for information search, then how do you deal with the fact that at times it tends to make up a things that are factually incorrect or logically inconsistent?