Over the weekend, I was trying to add a feature to a largely unfamiliar codebase. ChatGPT appeared to have knowledge of this codebase, so I asked it many questions as I worked, in order to understand the big picture. In hindsight, I'm pretty sure almost every answer ChatGPT gave me was wrong.

However, even with this outcome—I've had cases where ChatGPT was significantly more accurate—I think ChatGPT was helpful on net. The ability to propose ideas and talk through problems with a robot was really helpful when I got stuck, in a Socratic-dialog sort of way. Perhaps I could have gotten the same result from some sort of question-formulation exercise—all of the useful insights ultimately came from me—but I think ChatGPT's responses helped me follow my own reasoning, if that makes sense.



It sounds like you used ChatGPT as a rubber duck—albeit a rubber duck that can talk back to you.



