
It’s not perfect; if you know of a better alternative, I would genuinely love to hear about it.



If I absolutely need to avoid hallucinations (e.g. when building marketing/sales assistants for businesses), then I let the LLM control and drive the search for the relevant documents.

At a high level:

(1) Give the LLM the ability and enough context to "expand" the user query into multiple search phrases. The search engine uses them to find the most relevant fragments via a form of embedding search.

(2) Take the highest-ranking document fragments and "show" them to the LLM, saying: "These are the results that were found in the document database using your search phrases via embedding similarity. Refine the search."

(3) Repeat that a couple of times, then rank the final documents and combine them for the final answer synthesis. (A rough sketch of this loop is below.)
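
For what it's worth, the loop can be sketched in a few lines of Python. This is only an illustration under assumed interfaces, not any particular library's API: llm(prompt) is assumed to return the model's text completion, and search_index(phrase, top_k) is assumed to return the top-k fragments as dicts with "text" and "score" keys, ranked by embedding similarity.

    def answer_with_llm_driven_search(question, llm, search_index, rounds=2, top_k=5):
        # (1) Expand the user query into multiple search phrases.
        phrases = llm(
            "Expand this question into 3-5 short search phrases, one per line:\n"
            + question
        ).splitlines()

        fragments = []
        for _ in range(rounds):
            # Run each phrase through the embedding search and pool the hits.
            hits = []
            for phrase in phrases:
                if phrase.strip():
                    hits.extend(search_index(phrase.strip(), top_k))
            fragments.extend(hits)

            # (2) Show the results to the LLM and ask it to refine the search.
            shown = "\n---\n".join(h["text"] for h in hits)
            phrases = llm(
                "These are the results found in the document database using your "
                "search phrases via embedding similarity:\n" + shown +
                "\nRefine the search: give 3-5 improved phrases, one per line."
            ).splitlines()

        # (3) Rank the accumulated fragments and synthesize the final answer.
        ranked = sorted(fragments, key=lambda f: f["score"], reverse=True)[:top_k]
        context = "\n---\n".join(f["text"] for f in ranked)
        return llm(
            "Using only the context below, answer the question.\n"
            "Question: " + question + "\nContext:\n" + context
        )

In practice you would also deduplicate fragments across rounds and cap the context length, but the control flow above is the core idea: the LLM proposes the queries, sees what came back, and decides how to search again.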



