But it only works if you've already written all of the documentation manually and kept it up to date. It's basically a chatbot that knows all of your documentation.
The scenario is a customer opens a chat box on your website and asks the LLM some questions.
You wouldn't expect your customers to search your internal Confluence pages. The LLM would be trained on all of your internal documentation, which is not exposed publicly.
With the current generation of LLMs, hallucination is mostly a problem of insufficient training.
Edit: Maybe not "all" of your internal docs should be exposed via LLM. But the idea is this is an interactive support agent for customers.
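In practice, a support bot like this is usually built with retrieval over the docs rather than training the model on them. Here's a minimal sketch of that idea; the doc filenames, the keyword-overlap scoring, and the prompt wording are all hypothetical, and the actual LLM call is left out.

```python
# Hypothetical docs-backed support bot: retrieve the most relevant doc,
# then build a prompt that confines the LLM to that documentation.
# In a real system you'd use embeddings for retrieval and send the
# prompt to whatever model you use.

DOCS = {
    "billing.md": "Invoices are issued monthly. Refunds take 5-10 business days.",
    "sso.md": "SSO is configured under Settings > Security using SAML metadata.",
    "limits.md": "The free tier allows 1000 API requests per day.",
}

def retrieve(question, docs, k=1):
    """Rank docs by naive keyword overlap with the question."""
    q_words = set(question.lower().split())
    scored = sorted(
        docs.items(),
        key=lambda item: len(q_words & set(item[1].lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question, docs):
    context = "\n".join(text for _, text in retrieve(question, docs))
    return (
        "Answer using ONLY the documentation below. "
        "If the answer is not there, say so.\n\n"
        f"Documentation:\n{context}\n\n"
        f"Customer question: {question}"
    )

prompt = build_prompt("How long do refunds take?", DOCS)
print(prompt)
```

The "answer only from the documentation, otherwise say so" instruction is the usual (imperfect) mitigation for hallucination in this setup, which is part of why the retrieval corpus matters so much.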
That sounds like a dangerous scenario. If your docs are intentionally internal and not public, why would you let a publicly accessible LLM answer questions with info from them?
An LLM trained on public docs for the public could be a better interface for projects with lots of public documentation.
An LLM trained on internal docs only accessible to internal users might be similarly useful.
Even a private LLM on public docs for your support agents to use could increase their efficiency.
But I would never expose to the public an LLM that has been trained on data I don't want public.
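The distinctions above boil down to one rule: the corpus the bot can draw from is chosen by the audience, never the other way around. A toy sketch of that check (corpus names and audiences are made up for illustration):

```python
# Hypothetical audience -> corpus mapping. The key property: nothing
# internal is ever reachable unless the audience is explicitly internal.

CORPORA = {
    "public": ["public product docs", "FAQ"],
    "internal": ["public product docs", "FAQ",
                 "Confluence runbooks", "incident notes"],
}

def corpus_for(audience):
    # Fail closed: anyone not explicitly internal gets the public corpus.
    return CORPORA["internal"] if audience == "internal" else CORPORA["public"]

print(corpus_for("anonymous web visitor"))
```

Failing closed here is the whole point: an unrecognized or unauthenticated audience should degrade to the public corpus, not to the internal one.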
Actually even the free ChatGPT looks better than Google and even DuckDuckGo results. Unless you want to find the nearest pizza joint ofc.
Of course you have to verify, but at least it gives you results related to what you're searching for, while a traditional web search has become nearly useless for that because of all the spam pages that are clones of each other.
The memes of society are hallucinations. Worked ok so far.
If you want to live by raw logic, well, you're one of billions; idgaf what you want.
^^ There's social life under raw logic too, sort of like regular life where I have no obligation to your existence, except everyone reminds you of that explicitly instead of cordially hallucinating otherwise.
Hallucinations may not be all that bad unless they're hallucinations that lead to atrocity. Like the hallucination that we can keep burning resources to make AI bots.