
Best practices for Scrum and when to apply them

Truth be told, there are many situations where a Scrum or Agile "best practice" is not the appropriate practice to use. In this expert tip, software pros use project examples to illustrate when a methodology approach is the right one.

There are no silver bullets in software development. As Mike Herrick, program manager at Collaborative Software Initiative, likes to remind us, "Software is hard. Jeez." This makes an article about best practices for Scrum difficult to write: there are no "best practices" that work for every situation, and some best practices can be harmful in the wrong situation. That's why it is important to use good judgment as you decide which "best practices" apply to you and which don't.

That said, here are a few practices that address some of the common weaknesses I've observed in a lot of Scrum-based projects.

No points until the story is properly tested

At the end of each iteration, the team adds up the estimates for the stories it has completed (often called points) and publishes its velocity for that iteration. Velocity is important for predicting how many stories you'll be able to complete in any given iteration, usually based on a three-iteration average, so it's important that the stories you count are actually finished. That's why I promote the idea that you can't get points for a story unless it's "properly tested." Making this a hard-and-fast rule creates an incentive to treat testing a completed story as an official step in the development process.
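To make the arithmetic concrete, here is a minimal Python sketch of how an iteration's velocity and a three-iteration forecast could be computed. The story fields and the numbers are made up for illustration; they aren't from the article.

```python
# Minimal sketch: count points only for stories that are done AND properly tested,
# then forecast the next iteration from a three-iteration average.

def iteration_velocity(stories):
    """Sum the points of stories that are both complete and properly tested."""
    return sum(s["points"] for s in stories if s["done"] and s["tested"])

def forecast_velocity(recent_velocities, window=3):
    """Average the last `window` iteration velocities to forecast the next one."""
    recent = recent_velocities[-window:]
    return sum(recent) / len(recent)

# Example usage with made-up numbers:
stories = [
    {"points": 5, "done": True, "tested": True},
    {"points": 3, "done": True, "tested": False},  # no points until it's tested
    {"points": 8, "done": True, "tested": True},
]
print(iteration_velocity(stories))      # 13 -- the untested story doesn't count
print(forecast_velocity([11, 13, 15]))  # 13.0
```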

Properly testing a story means more than just covering the acceptance tests; if that's all you're doing, you're not adding any value that your customer wouldn't add anyway. You need to, at a minimum, examine the story's output and look for bugs using the HICCUPPS mnemonic for testing oracles. To use this mnemonic, ask yourself whether the story you are testing is inconsistent with any of the following:

  • History of the product
  • Image of the product
  • Comparable products
  • Claims made about the product
  • User expectations
  • The product itself (internal consistency)
  • The purpose of the product
  • Statutes (including internal standards) that apply to the product

If you find inconsistencies, you may have found a bug and you should investigate further.
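As one way to make the mnemonic actionable, here is a small sketch of my own (not something prescribed by the article) that turns the oracles into a per-story checklist a tester can walk through; the story name is a hypothetical example.

```python
# Sketch: a HICCUPPS review checklist a tester can print and annotate
# with any inconsistencies observed while exercising a completed story.
HICCUPPS_ORACLES = {
    "History": "Is the behavior consistent with past versions of the product?",
    "Image": "Is it consistent with the image the company wants to project?",
    "Comparable products": "Is it consistent with similar products?",
    "Claims": "Is it consistent with what we claim the product does?",
    "User expectations": "Is it consistent with what users reasonably expect?",
    "Product": "Is it internally consistent with the rest of the product?",
    "Purpose": "Is it consistent with the product's purpose?",
    "Statutes": "Is it consistent with laws and internal standards that apply?",
}

def checklist_for(story_name):
    """Print a checklist the tester can mark up with observed inconsistencies."""
    print(f"HICCUPPS review for: {story_name}")
    for oracle, question in HICCUPPS_ORACLES.items():
        print(f"  [ ] {oracle}: {question}")

checklist_for("Export invoices to CSV")  # hypothetical story name
```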

If you are testing a web-based application, your testing should also include some basic security testing, such as a simple check for XSS (cross-site scripting) vulnerabilities. For a simple test you can use, check out this blog post: Information Technology Dark Side.
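To illustrate what a basic reflected-XSS check can look like (this is my own sketch, not the test from the linked blog post), the script below submits a script-tag payload to a hypothetical search endpoint and flags the page if the payload comes back unescaped. The URL and parameter name are placeholders.

```python
# Sketch of a reflected-XSS smoke test: send a harmless script payload and
# see whether the application echoes it back without HTML-escaping it.
import requests

PAYLOAD = "<script>alert('xss-probe')</script>"

def reflects_unescaped_payload(url, param="q"):
    """Return True if the page echoes the payload back without escaping it."""
    response = requests.get(url, params={param: PAYLOAD}, timeout=10)
    return PAYLOAD in response.text  # an unescaped echo suggests an XSS risk

if __name__ == "__main__":
    # Only probe applications you are authorized to test.
    if reflects_unescaped_payload("http://localhost:8080/search"):
        print("Possible reflected XSS: payload came back unescaped")
    else:
        print("Payload was escaped or not reflected")
```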

Don't deliver a story until it passes all automated tests

One of the observations I've made about Scrum training is that it frequently gives automated testing short shrift. If this happens on your project, you'll find that over time refactoring your code base to push it in new directions becomes more and more difficult. As your application becomes increasingly complicated, test cycles grow longer and longer, and before you know it the cost of regression testing your application is almost as high as the cost of new development.

To avoid this scenario, you need to make sure automated tests are in place for your application. The best way to make this happen is to create a project culture that doesn't consider a story delivered until the appropriate set of tests has been written and the code passes them. Unfortunately, if your development team has never written automated tests before, your velocity is going to take a hit as you get up to speed. It's better to pay the price early in the project than late, so if you aren't using automated testing, start now.
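For teams that haven't done this before, the shape of such a test can be as simple as the pytest sketch below. The shopping-cart story, the `cart` module, and its methods are hypothetical stand-ins; the point is only that a story isn't considered delivered until tests like these exist and pass.

```python
# Sketch of story-level automated tests that gate delivery of a (hypothetical)
# shopping-cart story. Run with `pytest`; delivery requires a passing run.
import pytest

from cart import Cart  # hypothetical module implementing the story


def test_adding_an_item_updates_the_total():
    cart = Cart()
    cart.add_item(sku="ABC-123", price=9.99, quantity=2)
    assert cart.total() == pytest.approx(19.98)


def test_removing_an_item_restores_the_previous_total():
    cart = Cart()
    cart.add_item(sku="ABC-123", price=9.99, quantity=1)
    cart.remove_item(sku="ABC-123")
    assert cart.total() == 0
```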

Execute all automated tests every time code is delivered

What's the point of having a lot of automated tests if they don't find bugs for you? To take advantage of these valuable tests, you need to run them every time someone checks in a code change. Using tools like Hudson or CruiseControl, you can automate this process with a little extra effort. Once you have a continuous integration server set up, the whole team will get an email every time someone "breaks the build." When that happens, whoever has checked in code recently needs to drop what they're doing to investigate and fix the bug they've introduced.
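To show what the CI server is doing on every check-in (without reproducing an actual Hudson or CruiseControl configuration), here is a rough Python sketch of the run-the-suite-and-notify-the-team loop. The test command, email addresses, and SMTP host are placeholder assumptions, not values from the article.

```python
# Rough sketch of a CI step: run the full automated test suite and email the
# whole team when the build breaks, so recent committers can investigate.
import smtplib
import subprocess
from email.message import EmailMessage

TEAM_ADDRESS = "team@example.com"  # placeholder

def run_build():
    """Run the full automated test suite; return (passed, combined output)."""
    result = subprocess.run(["pytest", "--maxfail=1"],
                            capture_output=True, text=True)
    return result.returncode == 0, result.stdout + result.stderr

def notify_broken_build(output):
    """Email the whole team so whoever checked in recently can fix the break."""
    msg = EmailMessage()
    msg["Subject"] = "Build broken: automated tests failed"
    msg["From"] = "ci@example.com"       # placeholder
    msg["To"] = TEAM_ADDRESS
    msg.set_content(output[-2000:])      # last chunk of test output
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)

if __name__ == "__main__":
    passed, output = run_build()
    if not passed:
        notify_broken_build(output)
```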

If your team is hesitant to use continuous integration, find a way to make it fun. One of my colleagues used an X10 appliance to turn on a lava lamp whenever the build was broken, and the team had a goal to get the build back to normal before the first bubble broke the surface of the lamp. Other teams require the developer who breaks the build to bring donuts the next day.

Our team is completely remote, but when one of us breaks the build it's not uncommon for the offending developer to proclaim his guilt with this message: "Looks like donuts are on me."

The origins of this phrase are not entirely clear. Variations of it have probably been uttered many times over the decades, but we attribute it to Patrick D. Logan.

This was last published in November 2009
