An implicit aspect of Moore's law has been that the cost per transistor goes down as density goes up. That doesn't seem to be the case anymore: the technology required to reach higher transistor densities is getting ridiculously expensive. We're not seeing the power benefit of scaling transistors down either, since leakage is getting too high. I guess there's one more trick in the pipeline with Gate-All-Around, but I don't see a path to better gate control after that.

And if we can't get power consumption per transistor down, then stacking transistors in layers to increase density isn't going to be very viable for compute chips, since you need to get the heat out of the chip. IIRC, Intel is working on moving the power delivery metal layers to the back side of the chip, which grows the chip vertically in the other direction, so to speak. That helps wick heat away as well, so it could open a path to a few layers of compute transistors. But all of this adds a huge amount of manufacturing complexity, so at some point it may no longer be worth the cost.
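To spell out the "power benefit" point, here's the textbook Dennard-scaling arithmetic (my addition for context, not something the comment above relies on):

```latex
% Background sketch: why leakage ended the "free" power benefit of shrinking.
% Dynamic switching power per transistor:
\[
  P_{\text{dyn}} = C\,V^2 f
\]
% Under ideal scaling by a factor $k > 1$: $C \to C/k$, $V \to V/k$,
% $f \to k f$, and density grows by $k^2$, so power per unit area is flat:
\[
  \underbrace{k^2}_{\text{density}} \cdot \frac{C}{k}\left(\frac{V}{k}\right)^2 (k f) = C V^2 f
\]
% But subthreshold leakage grows exponentially as the threshold voltage
% $V_t$ is lowered ($V_T$ is the thermal voltage, $n$ a process constant),
% which puts a floor under $V$ and kills the $V^2$ savings:
\[
  I_{\text{leak}} \propto e^{-V_t / (n\,V_T)}
\]
```

With V effectively pinned, each shrink raises power density instead of holding it constant, which is exactly the heat-removal problem that makes stacked compute layers hard.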
I thought the power benefits of shrinking still hold up rather well, in contrast to cost. E.g., new Nvidia gaming cards have smaller GPU dies at the same price as the respective previous generation, meaning the cost per unit of chip area doesn't stay constant across improved manufacturing nodes. So the price per transistor shrinks more slowly than the number of transistors per unit area grows. At some point the price per transistor would start going up rather than down. Then the value of shrinking structures could come, at best, from lower power draw per transistor, which mostly matters for mobile devices. But even power draw per transistor may stop decreasing at some point, and then further shrinking of process nodes would be pointless.
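A toy calculation of the same point, with numbers that are purely made up for illustration (real foundry pricing isn't public):

```python
# Illustrative only: the ratios below are hypothetical, chosen to show the
# direction of the effect, not real foundry data. The point: if wafer cost
# per mm^2 grows faster than transistor density, cost per transistor rises
# even though each node still packs more transistors per unit area.

nodes = [
    # (name, relative cost per mm^2, relative transistors per mm^2)
    ("node N",   1.0, 1.0),
    ("node N+1", 1.6, 1.5),  # assumed: cost/area up 60%, density up 50%
    ("node N+2", 2.6, 2.2),
]

for name, cost_per_area, density in nodes:
    cost_per_transistor = cost_per_area / density
    print(f"{name}: relative cost per transistor = {cost_per_transistor:.2f}")
```

With those assumed ratios the relative cost per transistor goes 1.00 → 1.07 → 1.18: density improves each node, but the chip gets more expensive per transistor because cost per area rises faster.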