There is no skepticism. LLMs are fundamentally lossy, and as a result they will always produce a wrong result or response somewhere. Connecting them to a data source can reduce the error rate, but it cannot eliminate it.
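
To make the "connected to a data source" point concrete, here is a minimal retrieval-grounded sketch. Everything in it is hypothetical: the keyword-overlap retrieval stands in for a real embedding or BM25 search, and generate() stands in for whatever model API you call. Even with the right context stuffed into the prompt, the model's output is still sampled text, so grounding narrows the error rate without driving it to zero.

    from typing import List

    # Toy knowledge base; a real system would index far more documents.
    DOCS = [
        "The 2.3 release dropped support for Python 3.8.",
        "Configuration lives in config.toml since version 2.0.",
    ]

    def retrieve(question: str, docs: List[str], k: int = 2) -> List[str]:
        # Naive keyword overlap as a stand-in for embeddings or BM25.
        q_words = set(question.lower().split())
        scored = sorted(docs, key=lambda d: -len(q_words & set(d.lower().split())))
        return scored[:k]

    def generate(prompt: str) -> str:
        # Placeholder for an actual model call; the answer is still
        # generated text, so it can misquote the retrieved context.
        return "(model output goes here)"

    def answer(question: str) -> str:
        context = "\n".join(retrieve(question, DOCS))
        prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
        return generate(prompt)

    print(answer("Which release dropped Python 3.8 support?"))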

I use LLMs but only for things that I have a good understanding of.


