
If ChatGPT keeps giving you wrong answers, wouldn't that make paying customers leave? It would effectively be "losing its job". But I guess you could say it acts more like the person at work who makes stuff up when they don't know, instead of saying they don't know.


There was an article here just a few days ago that discussed how firms can be ineffective and still remain competitive.

https://danluu.com/nothing-works/

The idea that competition is effective is often in spherical-cow territory.

There are tons of real-world conditions that can easily let a firm be terrible at its core competency and still survive.


> But I guess you could say it acts more like the person at work who makes stuff up when they don't know, instead of saying they don't know.

I have had language models tell me they don't know. Usually that's when using a RAG-based system like Perplexity, but they can say they don't know when prompted properly.
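A minimal sketch (not from the thread) of what "prompted properly" can look like in a RAG-style setup: the system prompt constrains the model to the retrieved context and tells it to admit when the answer isn't there. The OpenAI Python client is assumed, and the model name, prompt wording, and placeholder context are illustrative.

    # Sketch: instruct the model to say "I don't know" when the retrieved
    # context doesn't contain the answer. Assumes the OpenAI Python client
    # (openai >= 1.0); prompt wording and model name are illustrative.
    from openai import OpenAI

    client = OpenAI()

    retrieved_context = "..."  # chunks returned by your retriever

    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {
                "role": "system",
                "content": (
                    "Answer using only the provided context. "
                    "If the context does not contain the answer, "
                    "reply exactly: \"I don't know.\" Do not guess."
                ),
            },
            {
                "role": "user",
                "content": f"Context:\n{retrieved_context}\n\nQuestion: ...",
            },
        ],
    )
    print(response.choices[0].message.content)

Whether the model actually follows the instruction still varies by model and question, which is presumably why RAG systems like Perplexity lean on retrieved sources rather than the instruction alone.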


I've seen Perplexity misrepresent search results and also interpret them differently depending on whether GPT-4o or Claude 3.5 Sonnet is being used.



