I don’t know if I’d agree just yet. In the past, I’ve generally found that the better I got at something, the less time I spent doing it. The better I got at coding, for example, the more time I had to spend (proportionally) on ops, design, marketing, etc., all the things that weren’t coding.
I think that the better AI gets at churning out the dumb stuff, the more time we’ll actually spend on interesting things. The real hazard is that people will indeed churn out lots of dumb stuff. AI can write lots of code, but can it accurately read lots of its own code? I think the difficulty of managing a large code base is what will keep people writing actual code.
We can consider the history of self-programming computers to see how this plays out. The first self-programming computer system was called an "assembler". Later ones were called "compilers". It played out about the way you’ve predicted.