
Determining trustworthiness of LLM responses is like determining who's the most trustworthy person in a room full of sociopaths.

I'd rather play "2 truths and a lie" with a human than with an LLM any day of the week. There are so many more cues to look for with humans.



The big problem with LLMs is that if you try to play 2 truths and a lie, you might just get 3 truths. Or 3 lies.





