Hacker News

I disagree. Not with the claim that it gets things severely wrong at times; I disagree that this gets glossed over.

It’s fucking obvious how wrong LLMs get things, and I think this part is exaggerated, while the parts they get right, like slowly replacing chunks of your average HNer’s day job as a programmer, are deliberately ignored and scoffed at.

The sentiment against AI from the overwhelming majority is hatred and negativity, especially on HN. It is a small minority (especially among entrepreneurs and founders) that is overly positive about AI. But make no mistake, the overwhelming sentiment is negative, to the point of delusion, like the LLMs themselves.

Like it’s unmistakable to me how LLMs can raise developer productivity to a much higher degree than ever before. Yet we have plenty of people who can’t even take the middle ground and say they kind of help. All kinds of developers everywhere are saying LLMs are fucking completely useless, which is mind-bogglingly irrational.

Most artists, for example, are decrying it because it produces soulless work. I agree the work is soulless, but it is indistinguishable from and often materially better than what a human can do. In fact, the concept of soul becomes utter bullshit in a double-blind test. They aren’t decrying it because it’s soulless; that’s bullshit. They decry it because it’s on the trajectory of becoming better than them. That’s the same fucking reason you see HNers siding with the best possible scaffolding of logic and reasoning that will support their day job. That’s why you see people claiming random shit like “LLMs don’t actually understand anything” when we in fact have no clue or ability to even properly answer that question.

> Like it’s unmistakable to me how LLMs can raise developer productivity to a much higher degree than ever before. Yet we have plenty of people who can’t even take the middle ground and say they kind of help. All kinds of developers everywhere are saying LLMs are fucking completely useless, which is mind-bogglingly irrational.

There's not a single proper study showing this increase in productivity, and just about every real developer I know finds very limited use for LLMs. They don't increase productivity "to a much higher degree". It's marginal, maybe 5-10%, if you use them strategically in situations that are particularly suitable.

I decided to waste some time "for science" and implemented a feature twice, once by myself and once with Cursor. A feature that took me 4 hours to implement myself took 1-1.5 hours of planning + 1.5-2 hours of iterative agentic coding just to get it to meet basic functional criteria, and it would've taken me at least 2 more hours to review and refactor if I hadn't quit in frustration.

If I didn't care about long-term maintainability, I could've finished it with AI in under 2 hours and claimed a 100% productivity boost. I imagine that's what people do: prioritizing short-term gains while taking on eye-watering amounts of technical debt. But trying to sell this as a productivity improvement is extremely naive.
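As a rough sanity check of the timings quoted above (a sketch; the hour figures are the ranges stated earlier, and using their midpoints is my own assumption):

```python
# Back-of-envelope comparison of the two implementation runs described
# above, using midpoints of the quoted time ranges (an assumption).
solo_hours = 4.0                    # implementing the feature by hand

planning_hours = (1.0 + 1.5) / 2    # "1-1.5 hours of planning"
agentic_hours = (1.5 + 2.0) / 2     # "1.5-2 hours of iterative agentic coding"
review_hours = 2.0                  # "at least 2 more hours" to review/refactor

ai_hours = planning_hours + agentic_hours + review_hours
print(f"solo: {solo_hours:.2f} h, with Cursor: {ai_hours:.2f} h")
print(f"relative change: {(ai_hours - solo_hours) / solo_hours:+.0%}")
```

On those midpoint assumptions, the AI-assisted path comes out around 25% slower than the solo one once review is counted; the apparent speedup only exists if the review and refactor step is skipped.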

LLMs are useful in very specific situations where the changes are trivial (small standalone snippets, straightforward changes in a larger codebase) OR long-term maintainability doesn't matter (one-off scripts). That's the middle ground.


> Not all content needs to be real. A huge portion of what humans appreciate is fiction. There's a huge amount of that content and hallucination is the name of the game in these contexts.

I didn’t realize this was you as I was pulling this up as an example. You literally said “we appreciate fiction” as a defense of LLMs giving us bad results. I don’t know how that isn’t glossing over it; if anything, it doesn’t accurately communicate how wild a take that is. You’re functionally saying false information is desirable.

Unless you were taking the piss there and I totally missed it, it was truly baffling to read.


Baffling? That’s just the HN way of calling someone a fucking idiot without violating the rules while pretending to be polite. Just say it to my face.

Is it not obvious human society likes fiction? Is it not obvious that creating fiction requires immense intelligence? That’s my point. If your genius mind can only bend that argument in the singular direction of a “wild take,” well, I hate to break it to you, but you’re baffling.

False information is desirable. Watch TV, read a book. The human race makes billions off of lies, not because we are being duped, but because we desire to be lied to. Only a genius savant like you needs to be told what the rest of the human race knows.

Look, it’s not a “defense” of LLMs, as if they were something that needs defending. It’s like saying I’m defending a ray of light or a gust of wind; it doesn’t make any sense. All I’m saying is that the LLM is a form of intelligence that has a use, versus your brain-dead argument that it slipped up when talking to you.


> Baffling? That’s just the HN way of calling someone a fucking idiot without violating the rules while pretending to be polite. Just say it to my face.

Maybe that has been your experience with other users, in which case I am sorry people have been so rude to you, but in my case it’s just a word I personally use a lot. If it’s too severe a term, then my apologies; reading back, I am coming in a bit hot, so I am sorry for the tone. I do not think you’re an idiot, and I am absolutely not personally attacking you. I tend to have a dramatic way of speaking, I can admit that. But again, this is not a personal attack.

The point I am trying to communicate is that it’s (to me) a very surprising and difficult-to-square take. Comparing a tool failing to do its job correctly to appreciating a work of written fiction just seems bizarre to me. That’s the truth. The people building LLMs do not want that result. I do not want that result. Nobody wants them to spit out inaccurate information disguised as correct information. I don’t want my calculator to spit out fiction, literally ever; the same goes for LLMs, outside of deliberately prompting them to do so. If I want fiction as you describe (art and such), I seek it out deliberately. I will grab a book off my shelf or watch a show (or prompt the LLM with intent).

Put another way: The difference between the fiction in a novel and what an LLM spits out is that I am seeking it out in the former, not the latter. When an LLM gives me incorrect information disguised as correct information (undesired fiction), it is failing to do its job correctly. It is a tool that is not functioning properly. I absolutely 100% never want fiction emerging in key instructions when I am cooking or am fixing my car. It is always an undesired result.

So to circle back to why I find this “baffling” (or another word, if you find that one too severe): I don’t understand how something so concretely undesirable can be described as a positive thing, comparable to creating works of literature for us to appreciate. You’re telling me it’s good that something does not function properly or as expected and gives me results I absolutely do not want. To get away from “baffling”: that is a very bold and unexpected take that I struggle to find any agreement with.


It’s not bizarre. Hallucination is just another word for invention, the same cognitive move that produces fiction. In one context that’s failure, in another it’s success. Calling that bizarre is like calling imagination itself an error. If that feels strange to you, you’re missing something fundamental about how creativity works. Everyone knows this. Any human being with a pulse understands the difference between making something up for art and making something up by mistake. So when you act like that’s an alien concept, I don’t think you’re confused. I think you’re pretending.

> The difference between the fiction in a great novel and what an LLM spits out is that I am seeking it out in the former, not the latter. When an LLM does that, it is failing to do its job correctly.

Sure, but thanks for explaining what everyone already understands. You’re not clarifying anything new, you’re just pretending not to get the point so you can keep arguing. The discussion wasn’t about LLMs fixing cars or following recipes. It was about any kind of work, and a huge portion of human work revolves around invention, storytelling, and creative synthesis. Fiction writing isn’t a corner case, it’s one of the most valued human expressions of intelligence. Everyone knows that too. It’s not an obscure philosophical insight. It’s basic cultural literacy. Which is exactly why I don’t buy your act. You’re too smart not to know what’s obvious to everyone else.

So when I say the “failure mode” of hallucination can be a “success mode” elsewhere, I’m not dodging the topic, I’m expanding it. Creativity is a massive part of human life. Pretending otherwise just to win a narrow argument is dishonest. You know exactly what I meant, you’re just acting like you don’t. No one with normal cognitive function finds that bizarre. It’s theater.

And you used the classic tells, the same ones that get used on HN all the time to dodge the rules while still getting in a jab. You drop words like “bizarre” and “baffled,” act like you’re confused, then follow up with a calm “apology” to sound like the reasonable one. It’s a well-known pattern here. You literally used the exact two words everyone does when they’re trying to provoke without crossing the line.

Then came the self-deprecation. The polished restraint. “If that was too severe, my apologies. I tend to be a little dramatic. I don’t think you’re an idiot. I’m just trying to communicate my point. I’m sorry for that.” It’s spotless. It hits every note. It reads like empathy but functions like control. It doesn’t defuse the conflict, it reclaims the moral high ground. It’s not humility, it’s stagecraft.

Look, maybe I was too sharp myself. I can be dramatic too, I admit that. It’s not a personal attack, I just have strong feelings about intellectual honesty. I’m sorry for that.

See what I did there?

No point in continuing this.


I’m not trying to dodge anything, and I’m not sure why there’s so much hostility here, but sure, we can go ahead and drop this. I made my point, and retreading it isn’t going to do any good. Have a good rest of your week.


