And many of those utilities and edge cases will have been wrong or inconsistent, too. That's what the new "100x engineers" don't realise, because they never check those 1,000 lines of code they generated for themselves in a few minutes.
I've had similar experiences to yours with some one-shot scripts, and once decided to actually look inside. The script did things like writing three different validators for the same data, each called only once, each validating slightly differently, and no doubt each with its own set of subtle bugs and gotchas.
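To give a concrete flavour of the pattern (a purely hypothetical sketch, in Python; the function names and the specific checks are invented, not from the actual script):

    def validate_user(record):
        # Validator 1: only checks that the keys exist.
        return all(k in record for k in ("name", "email", "age"))

    def check_user_record(record):
        # Validator 2: also checks types, but happily accepts age = -5.
        return (isinstance(record.get("name"), str)
                and isinstance(record.get("email"), str)
                and isinstance(record.get("age"), int))

    def is_valid_user(record):
        # Validator 3: checks the email has an "@" and age is non-negative,
        # but silently accepts a missing name.
        return "@" in record.get("email", "") and record.get("age", 0) >= 0

Each caller gets slightly different guarantees, so the same malformed record can sail through one code path and blow up in another.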
These tools are intrinsically incapable of creating clean architectures and adhering to consistent standards and best practices. They are not cutting costs or raising efficiency; they're simply very good at camouflaging the immense time costs they will cause down the line.