Moore's law is only about the number of transistors per chip doubling every 24 months, not about performance. Since that trend is still holding, Moore's law is not dead, as so many have claimed.
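The 24-month doubling is just compound growth; a quick sketch of the arithmetic (the starting count and time horizon are made-up examples, not real chip data):

```python
def transistors(start_count: float, years: float, doubling_period_years: float = 2.0) -> float:
    """Project a transistor count assuming it doubles every `doubling_period_years`."""
    return start_count * 2 ** (years / doubling_period_years)

# e.g. a hypothetical 10-billion-transistor chip projected 10 years out:
# 10 years / 2-year doubling period = 5 doublings, i.e. a 32x increase.
print(f"{transistors(10e9, 10):.2e}")  # 3.20e+11
```

So even without per-transistor performance gains, the raw budget compounds fast, which is what the dark-silicon and specialization arguments below are really about.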
But what is it good for if it does not improve performance? For example, an increasingly large fraction of the transistors on a chip sits unused at any given time due to cooling issues.
And as long as there's something to gain from going smaller/denser/bigger, and as long as the cost-benefit ratio is favorable, we'll keep getting bigger chips with smaller and denser features.
Sure, cooling is a problem, but it's not as if we're seriously trying to solve it. Chips are still just air-cooled. Maybe we'll eventually integrate microfluidic heat-pump cooling into chips as well.
And there seems to be a clear need for ever more computing. The "cloud" is growing at an enormous rate. Eventually it might make sense to build a well-integrated, datacenter-oriented system.
It obviously does improve performance, otherwise why would people keep buying newer chips? :) That doesn't mean we'll see exponential performance increases, though. In specialized scenarios, like video encoding and machine learning, we do see large jumps in performance.