barbarr | 8 months ago | on: Storm: LLM system that researches a topic and gene...
I guess this is a good thing for increasing coverage of neglected areas. But given how cleverly LLMs can hide hallucinations, I feel like at least a few different auditor bots should also sign off on edits to ensure everything is correct.
pksebben | 8 months ago
This method has actually been shown to be effective at increasing reliability and decreasing hallucinations [1].

[1] https://arxiv.org/abs/2402.05120
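
For the curious, a minimal sketch of the sign-off gate described above. All names here are hypothetical; in practice each auditor would be an independently prompted LLM call with a verification prompt, in the spirit of the sampling-and-voting approach evaluated in [1]:

    from collections import Counter
    from typing import Callable

    def audited_edit(edit: str,
                     auditors: list[Callable[[str], str]],
                     min_approvals: int) -> bool:
        # Each auditor is any callable returning "approve" or "reject".
        # Real auditors would be separate LLM calls asked to verify the edit.
        votes = Counter(auditor(edit) for auditor in auditors)
        return votes["approve"] >= min_approvals

    # Stub auditors standing in for real model calls.
    strict = lambda edit: "reject" if "citation needed" in edit else "approve"
    lenient = lambda edit: "approve"
    print(audited_edit("Paris is the capital of France.",
                       [strict, lenient, lenient], min_approvals=2))  # True

Requiring agreement from several independently prompted auditors trades extra inference cost for a lower chance that any single model's hallucination slips through.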