
It never bullshits intentionally, but like any application built with GPT it does occasionally hallucinate. We published a write-up on how we tackled that problem here: https://medium.com/p/f3bfcc10e4ec

We've also improved how we chunk and parse the content so the information it retrieves is actually useful, since we've noticed that hallucinations tend to happen when the context you give the model is irrelevant.
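
Roughly, that looks like the sketch below (a minimal Python illustration, not our actual pipeline; the chunk size, the overlap, and the toy keyword-overlap score standing in for embedding-based retrieval are all placeholder assumptions). The point is just that anything scoring below the relevance threshold never reaches the model, so it can't be asked to reason over context that has nothing to do with the question.

  def chunk_text(text, size=500, overlap=100):
      # Split the document into fixed-size chunks with some overlap so
      # sentences aren't cut off cleanly at chunk boundaries.
      chunks = []
      start = 0
      while start < len(text):
          chunks.append(text[start:start + size])
          start += size - overlap
      return chunks

  def relevance(chunk, question):
      # Toy token-overlap score; a real system would score chunks with
      # embeddings instead.
      q = set(question.lower().split())
      c = set(chunk.lower().split())
      return len(q & c) / max(len(q), 1)

  def select_context(text, question, threshold=0.2):
      # Keep only chunks that clear the relevance threshold before they
      # are handed to the model as context.
      return [c for c in chunk_text(text) if relevance(c, question) >= threshold]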



