What better way to begin 2014 than with a new set of guiding principles to improve the software development process? I pulled together these guiding principles after thinking about a conversation I had with Stephen Wilson, director of product specialists for Detroit-based Compuware, which sells software for managing application performance.
We began by talking about DevOps. I asked him what role software test professionals play in a DevOps world, which appears, by its very definition, to leave them out.
Wilson offered his thoughts on the role of testers in a DevOps-oriented organization. But what emerged from our discussion was a broader, big-picture view of how the software development process should work—or could work if organizations were willing to rethink it and make needed changes.
Here are my guiding principles for improving the software development process in 2014, based on ideas shared by Compuware's Wilson.
Guiding principle #1: Continual testing is the norm.
Testing takes place all the time. In a perfect world, testers are essentially scientists who rely on analytics to assess the state of the code after each build. Each time a build is completed, testers validate the code. Did anything break? Glitches are fixed along the way. Thus, as the codebase grows larger and more complex, the quality of the application also improves.
In this arrangement, testers are part of development and part of operations. While the term DevOps suggests otherwise, testers play an active role throughout the development process. "They are the more important, albeit silent, partner," Wilson said.
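To make the idea concrete, here is a minimal sketch of per-build validation. This is my own illustration, not Compuware's tooling: a build is "validated" only if every check passes, and the tester gets back exactly which checks broke.

```python
def validate_build(results):
    """Validate a completed build.

    `results` maps a check name (e.g. a test suite or a key transaction)
    to True/False for pass/fail. Returns the list of failing checks;
    an empty list means the build is validated.
    """
    return [name for name, passed in results.items() if not passed]

# After each build, the tester sees precisely what broke, if anything:
failures = validate_build({"login": True, "search": False, "checkout": True})
print(failures)  # ['search']
```

In practice the booleans would come from automated test runs and analytics after every build, so the feedback loop runs continually rather than at the end of a release cycle.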
Guiding principle #2: Constant communication among all players.
In a perfect world, the right hand knows what the left is doing. When, for example, the operations people opt to move an on-premises application into the cloud, testers spring into action automatically. Here is the state of the application before the move; here's what it looks like on the cloud.
Is it just the application that has been moved, or have its required resources also been relocated? The idea is to take all of the changes into account and quickly compare and contrast the quality and performance of each setup. Another important concern: What impact will cloud hosting have on mobile users, who access the application under constantly changing, and often suboptimal, connectivity conditions?
Guiding principle #3: Testers deliver highly specific feedback to developers.
In a perfect world, testers direct "user-context" feedback to developers. "User context" is Wilson's preferred term for describing the precise circumstances under which an error or performance problem occurs. Here's the example he offered: "The user did A, B, C, D, and E. When he got to the product search page, the page took 3.5 seconds to load."
The key here is that you are not simply telling the developer that the application called the database 100 times at a certain point. You are offering that information in the context of the precise task the user was performing when the delayed response occurred. Now the developer can devise a solution that improves the transaction from the user's point of view.
In Wilson's example, it turned out that the application wasted 2.5 seconds retrieving information the user neither needed nor requested. With an understanding of the user context, developers can optimize the code with the specific user transaction in mind.
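A user-context trace along these lines might look like the following sketch. The class and thresholds are my own illustration of the concept, not Wilson's implementation: record each step the user took and how long it took, then surface the slow ones with their full context attached.

```python
class UserContextTrace:
    """Record the steps a user performs and the time each one took,
    so a tester can report: 'the user did A, B, C, D, and E, and then
    the product search page took 3.5 seconds to load.'"""

    def __init__(self):
        self.steps = []  # list of (step_name, seconds) in user order

    def record(self, step, seconds):
        self.steps.append((step, seconds))

    def slow_steps(self, threshold=1.0):
        """Return steps at or above `threshold` seconds, in context."""
        return [(s, t) for s, t in self.steps if t >= threshold]

trace = UserContextTrace()
for step, secs in [("A", 0.2), ("B", 0.3), ("C", 0.4),
                   ("product search", 3.5)]:
    trace.record(step, secs)

print(trace.slow_steps())  # [('product search', 3.5)]
```

Because the full ordered list of steps is preserved, the developer sees not just the slow page but everything the user did to get there.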
Guiding principle #4: Analyze which features users care about.
Social media is often singled out as a way to improve products by incorporating user feedback in meaningful ways. But in a perfect world, user feedback is based on analytics, not comments posted on the Web.
The idea is to monitor an application in production to see what users are actually doing with it. You might find, for example, that of the twelve recently released features, only two are being used. Is that because they were hidden and difficult to get at? Or is it simply because no one likes or needs them?
Wilson called this "process monitoring from the user's perspective," and he sees it as a way of providing meaningful feedback that developers can act on. "You're building use cases based on user behavior, and you're testing these things all the time," he told me.
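Monitoring feature usage from production data can be reduced to a simple tally. Again, this is a hypothetical sketch of the analytics idea, not a real monitoring product: count how often each feature appears in the usage logs and flag the released features nobody touches.

```python
from collections import Counter

def feature_usage(events, released_features):
    """Tally which released features users actually exercise.

    `events` is a flat list of feature names drawn from production
    usage logs. Returns the per-feature counts and the list of
    released features with zero recorded uses.
    """
    counts = Counter(events)
    unused = [f for f in released_features if counts[f] == 0]
    return counts, unused

counts, unused = feature_usage(
    ["search", "export", "search", "search"],
    ["search", "export", "bulk-edit", "dark-mode"],
)
print(unused)  # ['bulk-edit', 'dark-mode']
```

Whether an unused feature is buried in the UI or simply unwanted still takes human judgment, but the counts tell you which questions to ask.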
How will your software development process change in 2014? Will you incorporate any of these ideas and suggestions in your shop? Let me know and I'll share your ideas in a future column. Have a top-quality New Year!
This was first published in January 2014