Hacker News




People are very tired of the "move fast and break things" approach of the last decade and a half, especially when the technology in question is being used to replace real jobs done by real people (in an already very tough job market). The only people benefiting from implementations of AI like this are the CEOs and executives, who I wouldn't call "hackers" either.

There is definitely an argument for having a human proofread the article, but for all we know the recap could have been proofread by someone with as much understanding of soccer drama as the AI itself.

These articles were mostly written by low-paid writers who already lacked interest in the field. Most of the recaps lacked nuance because of who was writing them.

And who are we to say that some disinterested writer would have produced a different article? After all, generative AI only generates articles from the body of data it was trained on... which leads me to believe that most articles fail to provide player profiles in their recaps anyway.


Well, we can risk having a disinterested writer write a boring recap of a game, or we can guarantee that a disinterested LLM will write a boring recap of a game. The only difference is that it's cheaper for the CEOs to call an OpenAI API than to pay a writer a salary. If corporations are going to justify adopting AI (and the mass layoffs that naturally come with it), then the AI has to produce work that's better than the worst human output.

Or... we can have one highly paid specialist review recaps and adjust the prompt to the LLM, instead of 100 low-paid, disinterested writers producing crap articles (like they already do today).

Why should we race to the bottom to just employ people? Why do we need another Buzzfeed?


I don't think people are upset about the technology. It's about overreliance and lack of quality control.

If you look at the comments, it's very much an anti-AI tirade rather than an anti-implementation one.

That's all fine, but especially for a news site with millions of visitors (owned by a company that also generates over $1B in profit a year), I expect Disney to screen AI content with a team of editors to ensure it at least meets a quality standard, even if it's basic information.

The problem here is not the use of AI/LLMs to generate content; it's that the content missed a big detail that basically any human soccer reporter would've made into a graf.


