These things are constructed in secret. I have no particular reason to grant them any benefit of the doubt. Controlling the output of LLMs in arbitrary ways is certainly worth a lot of money to a lot of parties with all kinds of motivations. Even if LLMs are free of hidden agendas now, that's not a stable situation in the current environment.


I am not saying that it does not "lie", but it does not lie because it wants to, or because it has the intent to lie or deceive. It does so because of its system prompt or other choices made by its developers.

What you said is exactly why I said, "If it 'deceives', we would have to answer this question." The developers made it so.



