I guess this is a good thing for increasing coverage of neglected areas. But given how convincingly LLMs can present hallucinated content, I feel like at least a few different auditor bots should also sign off on each edit to make sure it's correct.

This approach has actually been shown to be effective at increasing reliability and reducing hallucinations [1].

[1] https://arxiv.org/abs/2402.05120
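
For anyone curious what that would look like in code, here's a minimal sketch of the sampling-and-voting idea applied to auditing an edit. The call_llm helper, the prompt wording, and the quorum threshold are all placeholders I made up, not anything specific from the linked paper:

    # Minimal sketch of a majority-vote auditor ensemble. call_llm is a
    # stand-in for whatever model client you actually use.
    from collections import Counter

    AUDIT_PROMPT = (
        "You are auditing a proposed edit for factual accuracy.\n"
        "Edit:\n{edit}\n\n"
        "Answer with exactly one word: APPROVE or REJECT."
    )

    def call_llm(prompt: str) -> str:
        """Placeholder: swap in your real model API call here."""
        raise NotImplementedError

    def audit_edit(edit: str, n_auditors: int = 5, quorum: float = 0.8) -> bool:
        """Accept the edit only if a supermajority of independently
        sampled auditors approve it."""
        votes = Counter(
            call_llm(AUDIT_PROMPT.format(edit=edit)).strip().upper()
            for _ in range(n_auditors)
        )
        return votes["APPROVE"] / n_auditors >= quorum

The independence of the samples is what does the work here; if every auditor shares the same context and biases, adding more of them doesn't buy you much.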