I think this is simply because training an AGI-level AI currently requires almost grid-scale amounts of compute. The current limitation is purely physical hardware: no matter how intelligent GPT-5 is, it can't conjure extra compute out of thin air.
I think you'll see the prophesied exponential takeoff once AI can start training itself at reasonable scale. Right now it's not possible.