Stating the thesis, then (repeatedly) re-stating it with (very) slightly different wording, was a red flag. Professional journalists don't write like that, but AI and schoolkids trying to pad out their essays do.
I'm guessing that a human did edit this AI output, just not very well.
I think you're giving too much credit to "professional" journalists. This is also an excellent way to hit a certain word count if you're a few sentences short: you bloat the piece up without having to write anything new. Listicles and other pre-LLM blogspam did this all the time.
We're all facing the need to build a mental list of which news sources are publishing AI-generated garbage, so that we can disregard them.
I'm still surprised that, as far as I can tell, no news outlet has made a public commitment to never use AI in its writing. It seems like an easy way to build the brand around a commitment to quality.
I've griped about this before, but here we still are. We now know, for instance, that MSN news has no credibility, due to its publishing of AI-generated misinformation: https://news.ycombinator.com/item?id=39043135