Unit testing is essential in two situations: when I’m working on a problem that’s really hard, and when I’m working on something that’s going to be refactored *a lot*. Your average developer doesn’t do much of the latter and does the former only rarely.
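To make the second case concrete, here’s a minimal sketch of what that looks like in practice. The function and test names are hypothetical, invented for illustration: a test pins down the current behavior, so a later refactor can be as aggressive as you like while the test stands guard.

```python
import unittest


def normalize_discount(rate):
    """Clamp a discount rate to the range [0.0, 1.0]."""
    return max(0.0, min(1.0, rate))


class NormalizeDiscountTest(unittest.TestCase):
    # These tests capture the contract; the body of normalize_discount
    # can now be rewritten freely without fear of silent breakage.
    def test_in_range_value_passes_through(self):
        self.assertEqual(normalize_discount(0.25), 0.25)

    def test_out_of_range_values_are_clamped(self):
        self.assertEqual(normalize_discount(1.5), 1.0)
        self.assertEqual(normalize_discount(-0.1), 0.0)


if __name__ == "__main__":
    unittest.main()
```

The point isn’t the clamping logic; it’s that the refactoring the first paragraph describes stops being scary once behavior like this is nailed down.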
Coders tend to “compile-crap” things into existence and then drop into a fearful sort of maintenance mode after that where they improve code only if they have to. Developers simply do not have a vision for anything beyond that. They fear that unit testing is just more of the same sort of stifling bureaucracy that they’re used to in their suffocating work environment. They have no inkling that they can gain several orders of magnitude in effectiveness and productivity by applying themselves in a concentrated study of programming tools, principles, and concepts.
Their managers understand these things even less: they can’t even identify a good programmer when they see one, much less encourage things that “grow” good programmers over time.
The big fat ERP system my company just bought comes with absolutely no regression testing framework whatsoever, and yet the vendor is constantly rolling out patches and updates to clients that have made their own extensive in-house modifications. If ever there was a place where unit testing could offer massive benefits in terms of reducing developer exasperation, this is it. Yet the tribal practice is essentially to rely on the users to do the testing for IT.
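What such a regression framework could look like is simple enough to sketch. All names here are hypothetical stand-ins, not anything from the actual product: record known input/output pairs through an in-house modification before a vendor patch, then replay them afterward and flag any drift.

```python
def apply_customer_surcharge(order_total):
    """Hypothetical in-house modification: add a flat 2% surcharge."""
    return round(order_total * 1.02, 2)


# Input/output pairs recorded from the pre-patch system.
GOLDEN_CASES = [
    (100.00, 102.00),
    (0.00, 0.00),
    (19.99, 20.39),
]


def run_regression():
    """Replay recorded cases; return the list of mismatches."""
    return [
        (inp, expected, apply_customer_surcharge(inp))
        for inp, expected in GOLDEN_CASES
        if apply_customer_surcharge(inp) != expected
    ]


if __name__ == "__main__":
    # An empty failure list means this customization survived the patch.
    failures = run_regression()
    assert failures == [], failures
```

Run after every vendor patch, a battery of checks like this would tell IT what broke before the users do.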
If unit testing had gone from being a buzzword among the “elite” developers to being something that average managers *insist on* whenever they buy a large system like this, then it wouldn’t be on the wane. At the moment, though, there just doesn’t appear to be anyone on that scene willing to put up a fight for it.