Or puts in Ruby, or System.out.println() in Java, or echo in shell scripting and PHP, or printf() in C/C++, or console.log() (or, worse, alert()) in JS, or the equivalent in whatever language and environment you're in?
I'm primarily a JS developer, and the first debugging tool I reach for is usually alert() or console.log(). I generally only have to locate a single line that is past the point where something has gone wrong, and then I can quickly binary search between that line and the event handler to find the offending line and fix it. Of course, the binary search is guided by intuition, and if I don't find the problem after a few alerts I'll start setting breakpoints in the debugger and/or stepping through instead, but somehow I've always seen the debugger as this big, heavyweight sledgehammer I don't whip out unless I'm actually stuck.
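For instance (a contrived sketch with made-up function names, not code from any real project of mine), if a checkout total comes out wrong, the first log goes roughly halfway between the handler and the line where the bad value shows up:

    // The checkout total comes out wrong, so I binary search with
    // console.log between the "event handler" and the bad result.
    function parsePrice(text) {
      return parseInt(text, 10);            // bug: drops the cents, "2.50" -> 2
    }

    function computeTotal(prices) {
      return prices.reduce((sum, p) => sum + p, 0);
    }

    function onCheckout(priceTexts) {       // stands in for the real event handler
      const prices = priceTexts.map(parsePrice);
      console.log('prices:', prices);       // first probe, roughly halfway down
      return computeTotal(prices);
    }

    onCheckout(['2.50', '3.25']);           // total is 5 instead of 5.75

The first probe already prints [ 2, 3 ], so the bug has to be above it (in parsePrice); if the prices had looked fine, the next log would go further down instead, toward computeTotal().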
I do the same thing in all of the languages mentioned above, and in fact have never used a debugger at all for Ruby or Python, though I assume they exist (nor for shell scripting or PHP, but do those exist?). I consider myself at least a proficient coder, and in particular have done substantial work in all of the above languages (I've also used Scheme in a class, but nothing substantial). One of the best coders I know, who mainly codes in Ruby, also often reaches for puts debugging first. Yet I've also been ridiculed (in a nice way) by another friend for even suggesting that someone debug what seemed like a simple mistake with a few quick print statements.
In a way I agree with that friend: print statements are a crude way to debug. But they're quick and dirty, they get the job done, and they aren't hard at all to clean up. So I wonder: how many others debug with print statements first and a debugger second?
Using a real debugger is great, and it is FAR more powerful. It also lets me poke around and check the values of various things without having to re-compile, re-deploy, and re-run the application up to the point where the error occurs.
But getting the debugger to work takes a few hours of setup. If I'm at someone else's desk helping them debug a problem, it will take hours to get the debugger working on their machine, but only a few minutes to add a print statement.
If I'm working at my own desk and briefly need to make a small change to another application that I don't normally work on, it would likewise take me hours to get the debugger set up and only a few minutes to add a print statement.
If I'm fixing a problem that only occurs in QA (or, worse yet, in production), then I have to use logging or a print statement, because it would take days to get permission to attach my debugger to QA.
If I'm working on my own usual project, then I use the debugger, because it is many, many times more powerful. Unless it's Tuesday, or the environment gods hate me for some other random reason and my debugger isn't working today. Then, depending on how much of a rush I'm in, I either invest another few hours in fixing it, or I just add a print statement, get on with my work, and fix the environment later.
PS: I'm writing this while waiting for my IDE to recompile the world in my 4th attempt to get my debugger working again because my environment broke for no known reason.