Testing your product is critical to its success. But if you've got a legacy product that doesn't have any automated testing facilities and is eyeball deep in critical change requests, how do you go about testing it effectively?
How do you convince team members who've been reacting to "emergency" change requests for years that it is a good idea to take extra time to make the code more maintainable and cover it with tests? If they think this is a good idea but don't believe there is time to do so, what then?
Following are some of the general reasons why automated testing is a great idea, along with strategies for introducing it. Each person who resists writing automated tests will need different evidence to be convinced, but I hope these provide a good starting point.
Start by testing only the code you are actually changing for the change request. That way we don't have to worry about taking the time to write tests for the entire program, and it may help alleviate some of the "where do I even start?" blues that come with facing such a monumental task.
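For example, suppose the change request only touches a single pricing function. A test file covering just that function might look something like this (the `pricing` module and `apply_discount` function are hypothetical names used purely for illustration):

```python
# test_discounts.py -- covers only the function touched by the change request.
# `apply_discount` and the `pricing` module are made-up names for this sketch.
import pytest

from pricing import apply_discount  # the one function this change request modifies


def test_standard_discount_applied():
    # A 10% discount on a 100.00 order should come out to 90.00.
    assert apply_discount(total=100.00, percent=10) == pytest.approx(90.00)


def test_zero_discount_leaves_total_unchanged():
    assert apply_discount(total=59.99, percent=0) == pytest.approx(59.99)
```

A handful of tests like this around the changed code is enough to get started; the suite grows one change request at a time.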
Refactor the code being changed along "seams", as described in "Working Effectively With Legacy Code" by Michael Feathers. Look for places where sections of logic can be pulled into smaller, easier-to-test modules (functions, classes, etc.). The process of doing this forces us to simplify the code as much as possible, which in turn lets us test it in a sane manner.
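As a rough sketch of what refactoring along a seam can look like, here is a hypothetical legacy routine whose pricing rule gets pulled out into a small pure function that can be tested without touching the filesystem (all names and the discount rule are invented for this example):

```python
# Hypothetical legacy routine: pricing logic tangled up with file I/O,
# which makes it hard to test directly.
#
#     def process_order(order_file):
#         order = read_order(order_file)
#         if order["quantity"] >= 10:
#             total = order["quantity"] * order["unit_price"] * 0.9
#         else:
#             total = order["quantity"] * order["unit_price"]
#         write_invoice(order["id"], total)

# After refactoring along the seam: the pricing rule becomes a small,
# pure function that can be exercised without any I/O at all.
def calculate_total(quantity, unit_price):
    """Apply a 10% bulk discount to orders of 10 or more items."""
    if quantity >= 10:
        return quantity * unit_price * 0.9
    return quantity * unit_price


def test_bulk_orders_get_ten_percent_off():
    assert calculate_total(quantity=10, unit_price=5.00) == 45.00


def test_small_orders_pay_full_price():
    assert calculate_total(quantity=2, unit_price=5.00) == 10.00
```

The original routine then just calls `calculate_total` between its read and write steps, and the logic that actually changes for the change request is the part under test.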
Set team expectations that testing can be skipped if the situation is truly an emergency (and define what an emergency is, so that it isn't defined by default as "any change request"). Set up the project configuration in the build system so that tests can be built or not built. This allows the flexibility to skip writing tests when there is a rush on the change request. Given that the team is used to working without taking the time to add tests, some team members will appreciate this flexibility as well. It can be used to ease them into testing: let them make changes to the project while we build up enough of a test suite for them to see how effective it is.
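How this looks depends on your build system. As one hedged analogue for a Python project using pytest, a `conftest.py` hook could approximate a "build the tests or not" switch by skipping the whole suite when an emergency flag is set (the `EMERGENCY_SKIP_TESTS` variable name is made up here):

```python
# conftest.py -- a rough analogue of a "build the tests or not" switch for a
# Python/pytest project. The EMERGENCY_SKIP_TESTS flag name is an invention
# for this sketch; pick whatever convention fits your project.
import os

import pytest


def pytest_collection_modifyitems(config, items):
    # When the emergency flag is set, skip the whole suite instead of failing
    # the change request on tests there was no time to update.
    if os.environ.get("EMERGENCY_SKIP_TESTS") == "1":
        skip = pytest.mark.skip(reason="emergency change: tests deferred")
        for item in items:
            item.add_marker(skip)
```

An emergency fix would then set `EMERGENCY_SKIP_TESTS=1` in its build or CI run, while normal change requests run the tests as usual.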
Now, having suggested implementing an option for not writing tests, let me also suggest setting team expectations that tests be written and run for emergency fixes the next time the code is open for changes. The goal is to make sure the tests don't become broken or stale through several rounds of "emergency" changes. It also helps ensure that you actually start writing tests instead of declaring every change an emergency that doesn't need them.
In order to write tests we have to understand the requirements of the code being changed. We have to understand what good input (or beginning conditions) looks like and what output it should produce. We have to understand what bad input is and what errors, if any, the code should produce in response. If we don't truly understand any of these things, even if we think we do, attempting to write a test will quickly make that blatantly apparent. Consequently the process of writing tests helps us understand why we're changing what we are, and that helps us write code that does what was asked the first time (as opposed to going through cycles of coding, failing QA on requirements issues, coding, and failing QA again).
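To make this concrete, here is a small hypothetical example: the tests pin down what good input should produce and what error bad input should raise (the `orders` module and `parse_quantity` function are assumptions made for illustration):

```python
# Writing these two tests forces us to answer the requirements questions:
# what is good input, what should it produce, and what should bad input do?
# `parse_quantity` and the `orders` module are hypothetical.
import pytest

from orders import parse_quantity  # hypothetical module under change


def test_good_input_produces_expected_output():
    assert parse_quantity("12") == 12


def test_bad_input_raises_a_clear_error():
    # If we can't say what should happen here, we don't yet understand the
    # requirement well enough to change the code safely.
    with pytest.raises(ValueError):
        parse_quantity("a dozen")
```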
There are many other reasons (and ways) to write automated tests, which can be found in a variety of books and blogs. There are also many tools that help with different aspects of testing, from mock generation to test generation. Please take some time to think about how automated testing can help your project. I promise that it will help you in the long run.
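As a small taste of what a mocking tool buys you, here is a hedged sketch using Python's standard-library `unittest.mock` to stand in for a collaborator the code under test talks to (the `bill_customer` function and its mailer are made up for illustration):

```python
# A mock object lets us verify how the code uses a collaborator without
# actually sending anything. `bill_customer` and `send_invoice` are
# hypothetical names for this sketch.
from unittest.mock import MagicMock


def bill_customer(order, mailer):
    total = order["quantity"] * order["unit_price"]
    mailer.send_invoice(order["id"], total)
    return total


def test_bill_customer_sends_one_invoice():
    mailer = MagicMock()
    total = bill_customer({"id": 7, "quantity": 3, "unit_price": 2.50}, mailer)
    assert total == 7.50
    mailer.send_invoice.assert_called_once_with(7, 7.50)
```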