> I wonder why people say stuff like this. Are people writing code then just blindly shipping it to production?
Yes, yes they are! HN is now inundated with examples and the situation is only going to get worse. People with zero understanding of code, who take hours to convert a single line from one language to another (and even then don’t care to understand the result), are shipping and selling software.
People are already cutting and pasting solutions off random webpages and Stack Overflow. That they would grab stuff right out of ChatGPT does not surprise me in the least. I would ask back: why would this be any different this time?
Crucially, those have context around them. And as you navigate them more and more, you develop an intuitive sense for what is trustworthy and what is not. Perhaps you even find a particularly good blog which you reference first every time you want to learn something about a particular language. Or in the case of Stack Overflow, your trust in an answer is enhanced by the discussion around it: the larger the conversation, the more confident you can be of its merits and limitations.
When you get all your information from the same source and have no idea what it referenced, you lose all the other cues regarding the validity, veracity, or usefulness of the information.
Often incorrect context, incredibly biased context (no, you don't know what you want; here's a complete misdirection), or just plain outdated. So basically the same thing as ChatGPT.
People who blindly copy and paste and ship are always going to do that. Everyone else isn't. It's really that simple.
> Often incorrect context, incredibly biased context
And as you learn more from different sources, you get better at identifying those flaws.
> or just plain outdated
Which you can plainly see by looking at the date of publication. In the case of Stack Overflow, it is common for popular questions to accumulate newer answers that invalidate the old ones and replace them at the top.
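As a hypothetical illustration of that churn (the file names here are invented, but the pattern is common in Python questions): an old accepted answer might build paths by hand with os.path, while the newer answer that has since replaced it at the top uses pathlib:

    # Older accepted answer: assemble the path manually with os.path
    import os
    config = os.path.join(os.path.expanduser("~"), ".myapp", "config.ini")

    # Newer top answer: pathlib (available since Python 3.4)
    from pathlib import Path
    config = Path.home() / ".myapp" / "config.ini"

Both still run; the point is that the vote ordering surfaces the current idiom without the reader having to check publication dates themselves.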
> People who blindly copy and paste and ship are always going to do that.
Yes, they will. You seem to agree that is bad. So wouldn’t it follow that it is also bad that the pool of people doing that is now growing at a much faster rate?
What I find a bit more 'scary' is: do they even need a deep understanding of it? ChatGPT will get it 'pretty close' and probably correct many times (as well as wrong sometimes). But if it is 'mostly right', then does it matter? Which is even more philosophical. As long as they can read the code and it is like 95% right, it may be just fine and they can fix it later? Heck, they could even ask ChatGPT what is wrong with it...
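To make '95% right' concrete, here is a hypothetical sketch (in Python, with an invented function name) of the failure mode: code that reads cleanly and passes a casual test, yet hides a classic subtle bug, the mutable default argument:

    # Reads fine and works on the first call...
    def log_event(event, history=[]):
        history.append(event)
        return history

    print(log_event("start"))  # ['start'] -- as expected
    # ...but the default list is shared across every call:
    print(log_event("stop"))   # ['start', 'stop'] -- not ['stop']

Asking ChatGPT what is wrong with such a snippet may well catch this one, but only if the person shipping it thinks to ask.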
https://news.ycombinator.com/item?id=35128502