I disagree. I think debugging is the most expensive activity a programmer ever does. Refactoring is a luxury you have when you don't have bugs or time pressure to ship or fix something. Debugging potentially requires you to load the entire context of the (incorrect) program into your head, including irrelevant parts, as you grope around to figure out (a) what actually went wrong, (b) why it went wrong, and (c) how to modify the existing system in a way that doesn't make it worse.
Debugging is reverse-engineering under the gun. It carries a huge cognitive load, especially when you're debugging a production system with a difficult-to-reproduce bug in a deep, dark part of the code. It's a nightmare scenario.
Refactoring, on the other hand, often happens with incomplete knowledge, and can be quite local. I've seen zillions of refactorings done with incomplete knowledge that were local improvements (and many that were not global improvements).
When I say non-trivial, I don't mean local refactoring. I mean the kind of refactoring that requires you to load the entire system (or a large part of it) into your head, and figure out how to clarify and simplify it.
It is not a luxury. When done successfully, it is the only way to lower the cost of that expensive debugging. The slow debugging and the expensive refactoring are two sides of the same coin: they are both the cost of a system that is too difficult to understand and safely change. But the cost of a good refactoring need only be paid once, whereas the cost of debugging a system you refuse to fix is levied again and again.