Debuggers are just another programming tool, to be used when and as needed. It is my opinion that programmers starting out should avoid them like the plague, and here is why:
Debuggers are like a crutch when it comes to figuring out what is going on in code. Debuggers are amazingly detailed, allowing line-by-line inspection and manipulation of values in mid-flight while the code is being executed. This power, when you are new to programming, is enticing. However, having this ability is akin to the draw of the dark side of the Force: wild power with no control is a problem. (Note: this crutch can also be abused by people who have been programming for a long time - so please don't think I am singling anyone out here.) Debuggers as a tool are great; as a learning or teaching practice, I believe they teach the wrong things.
Debuggers bring about a certain complacency that I think hurts programmers and the people they work with and for. Some programmers get to thinking that they can solve any problem they run into by simply saying "Oh, you are having X problem? Well, we'll just attach the debugger and see what's going on..." when in fact there are a fair number of situations where that cannot be done. The big one: companies large enough to have a "production" environment are not inclined, in most cases, to just let a programmer "connect" in such a powerful way to a running production application. So what do you do in this instance? Let's break out the Swiss Army knife and see.
1) Utilize TDD up front - use your tests to ferret out as many bugs as you can find. In some cases this testing may actually help you locate things like race conditions and so on (assuming the test was written to do that).
2) Effective logging - NOW here is a topic that people so often skip over. So many times in my life I have said "I wish the person who wrote this had logged something in this location, where this very important state changed because of X" that I would be rich if I had a penny for every time I said it. This is also a topic that is rarely covered in school. In fact, I don't believe ANY class I have attended or heard of teaches it. IT SHOULD BE - even if not covered in depth, the topic should at least be discussed.
3) Cleaner code - This is hard to quantify, but one could make the argument that cleaner, simpler code would have fewer bugs simply because it would be more straightforward to understand.
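To make point 1 concrete, here is a minimal sketch of the TDD idea: a test that pins down an off-by-one bug before a debugger is ever needed. The function `parse_page_range` and its behavior are entirely hypothetical, invented for illustration.

```python
def parse_page_range(spec):
    """Parse a '3-5' style spec into an inclusive list of page numbers."""
    start, end = (int(part) for part in spec.split("-"))
    # The "+ 1" makes the range inclusive; forgetting it is the
    # classic off-by-one bug a test like the one below catches.
    return list(range(start, end + 1))

def test_inclusive_upper_bound():
    # If the upper bound were exclusive, this assertion would fail
    # immediately and pinpoint the bug - no debugger session required.
    assert parse_page_range("3-5") == [3, 4, 5]
    assert parse_page_range("1-1") == [1]

test_inclusive_upper_bound()
```

The failing assertion tells you exactly which behavior broke, which is often faster than stepping through the code by hand.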
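And for point 2, a sketch of what "logging the important state change" might look like, using Python's standard `logging` module. The `Order` class and its states are made-up names for illustration; the point is that the log line records the identifier plus the old and new state - exactly the breadcrumb you wish for when debugging production after the fact.

```python
import logging

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("orders")

class Order:
    def __init__(self, order_id):
        self.order_id = order_id
        self.state = "new"

    def ship(self):
        # Record the identifier, the old state, and the new state
        # at the moment the important transition happens.
        log.info("order %s: state %s -> shipped", self.order_id, self.state)
        self.state = "shipped"

order = Order("A-42")
order.ship()
```

A line like this costs almost nothing to write, but in a production environment where you cannot attach a debugger, it may be the only record of why the state changed.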
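Point 3 is hard to quantify, but a small (hypothetical) illustration: the same validation written with guard clauses instead of nested conditionals leaves fewer places for a bug to hide, and fewer reasons to reach for a debugger.

```python
def discount(price, is_member):
    """Apply a 10% member discount; names and rules are illustrative."""
    # Guard clauses surface the edge cases up front, so the happy
    # path at the bottom is a single, obvious line.
    if price < 0:
        raise ValueError("price must be non-negative")
    if not is_member:
        return price
    return round(price * 0.9, 2)
```

Each early return handles one concern, so a reader can verify the function by scanning it top to bottom rather than tracing nested branches.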
Is there a black and white in the use of debuggers? Probably not. Like I said above, it is just one more tool in the arsenal. Since we already know there is no one tool for all jobs, it should be safe to say that debuggers should be used when appropriate. Finding those appropriate times is hard, however, given the ultimate power debuggers can provide.
In the end, just because you write tests doesn't make you ignorant of debuggers, nor does using a debugger mean that your language of choice cannot run tests gracefully (although that may be true in some cases). Debuggers are certainly no silver bullet, but then again neither is TDD, and that doesn't mean I should abandon one for the other. Each person is going to use the tools that they think best fit the bill. In some cases they may also choose to sharpen their saw with a blunt instrument. Now, while using a debugger is not my first choice, I can recognize that it is a tool I can use to help solve the problems I have, point out where I need to add logging to legacy applications, or even write tests for code I know nothing about when I start my day. Debuggers are bad for learning, and can be bad when used improperly - but I would certainly shy away from a black-and-white statement that says ALL debuggers are bad in ALL situations.