Test-driven development, a unit testing technique borrowed from Extreme Programming, means spending up-front...
You mention that TDD was borrowed from Extreme Programming. I know I've often heard TDD used in the context of agile methodologies, yet unit testing is practiced with waterfall and other traditional methodologies as well. Do you find that TDD is "methodology-agnostic"? Is there any reason why it wouldn't be equally relevant regardless of methodology being used?
TDD came along with Extreme Programming, yes, but it is one of those practices that you can borrow no matter how your project is organized and still benefit from. As I've explained, TDD is a programming-level activity, and there is no reason why a programmer on a waterfall project should not be able to benefit from it. A reduced number of defects, increased confidence in the code and improved maintainability should be desirable goals regardless of project methodology.
It seems that the time to write, test and maintain the automated tests required for TDD would require almost the same amount of time it would take to write the application code. Is this true? Does anyone test the test cases?
Writing tests takes time, for sure. However, writing tests also saves time by removing or vastly reducing other activities, such as manual or formal debugging and ad hoc bug fixing. TDD can also reduce the total time spent writing tests. When retrofitting unit tests onto an existing system, a programmer will occasionally encounter tightly coupled code that is hard or impossible to test. Because TDD puts the unit test in the front seat, testability is never an issue.
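To illustrate the testability point, here is a minimal Python sketch. The class and names are hypothetical, not from any real project: a report generator that queries a database internally is hard to unit test, but injecting the data source as a parameter, which is the kind of design writing the test first pushes you toward, makes the same logic trivially testable with a stub.

```python
class ReportGenerator:
    """Hypothetical example of test-friendly design encouraged by TDD."""

    def __init__(self, fetch_sales):
        # fetch_sales is any callable returning a list of sale amounts.
        # Production code passes a real database query; a test passes a stub.
        self.fetch_sales = fetch_sales

    def total_sales(self):
        return sum(self.fetch_sales())


# In a test, no database is needed -- a plain lambda acts as the stub:
report = ReportGenerator(lambda: [100, 250, 50])
assert report.total_sales() == 400
```

Had `ReportGenerator` opened its own database connection inside `total_sales`, the test would have needed a live database; writing the test first makes that coupling painful enough that you avoid it.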
Do the tests required for TDD take almost as much time to write as the application code? Maybe not exactly as much, but not far from it. But you have to look at it from a wider angle: you spend more time writing tests, and probably a little less writing production code. All in all, initial development time will be slightly higher, but experience tells me you reap the benefits over time in a reduced number of defects and improved maintainability, even though I cannot provide you with hard numbers to back me up on that.
Does anyone test the test cases? This is the question I am asked most frequently when presenting unit testing to uninitiated programmers. The answer is no, but that does not mean we don't take measures to make sure tests do not contain bugs, and this is another area where TDD helps over traditional development (where tests are written after the fact).
The first redeeming fact is that unit tests should be very simple: no branching logic, just a set of simple linear statements that sets up an initial state, exercises a method or two, and finally makes assertions on the results. Keeping the test code simple reduces the chances of bugs. The second fact is that in TDD, tests should always be run immediately after they are written. At that point we can confirm our expectations as to how and why the test fails. If the outcome is not as expected, we are prompted to double-check the test, and in fact I find that most errors I make in tests are caught this way.
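The shape described above can be sketched in Python with the standard `unittest` module. The function under test, `word_count`, is hypothetical; in TDD this test would be written first and run (and seen to fail) before `word_count` exists, then made to pass.

```python
import unittest


def word_count(text):
    """Hypothetical code under test: count whitespace-separated words."""
    return len(text.split())


class WordCountTest(unittest.TestCase):
    def test_counts_words_in_a_simple_sentence(self):
        # Set up an initial state -- no branching, just linear statements.
        text = "test early and often"
        # Exercise the method under test.
        result = word_count(text)
        # Make assertions on the results.
        self.assertEqual(result, 4)


# Run the test programmatically rather than via unittest.main():
suite = unittest.defaultTestLoader.loadTestsFromTestCase(WordCountTest)
outcome = unittest.TextTestRunner(verbosity=0).run(suite)
```

Note the straight-line structure: if the assertion fires when the test is first run against an empty stub of `word_count`, the failure confirms the test is actually exercising the code.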
Any final words of advice for those interested in practicing TDD?