Let me answer this question with two short stories. I'll outline two projects, along with the teams and products behind them.
The product being tested is software that communicates with a medical device. The software is not embedded in the device; instead, it communicates with the device via serial cables, instructing the device how to perform a particular operation. At the end of the project, the product and its documentation will be audited by an internal audit team as well as by the FDA. If the product passes the audits, it goes to market. The market in this case means medical labs both inside and outside of hospitals.
The team includes a testing team as well as a quality team. The testing team focuses on testing; the quality team focuses on creating and managing documentation. The release documentation is prepared for the team and also for the internal audit and the FDA. The project team includes developers, testers, quality staff, field implementation/support, a project manager, and business analysts. There are also specialists in the specific area in which the medical device operates. The team is large, product-specific (vs. centralized and reallocated to various products) and on-site.
The culture is formal: the process for every aspect of project work is defined in standard operating procedures (SOPs), and every team member is trained in the process. Variance from the process is not tolerated.
The SDLC: classic waterfall.
Practices that work in this context:
Some of the practices that challenge the team:
What makes the team work well in this environment is individual commitment: the knowledge that the product could potentially kill a patient drives behavior throughout the team on a daily basis, behavior best summarized as responsible and thorough. Overall atmosphere: conservative, focused.
The product is a content management system built as a Web application. The team is virtual; the product team is sprawled across the United States, with users in multiple countries.
At the end of most iterations, the release is posted live as soon as possible. The features being added reflect the current state of the technology world. Examples: social bookmarking, integration with Google's custom search engine, and RSS feeds.
The team consists of the client, a development team, a tester, and a project manager. A handful of end users participate on occasion. Requirements are written by a lead developer with heavy input from the client. Every team member questions the requirements, the testing, and when the build is ready for production use; generally, everyone is open to questioning everything all the time. The rapport is rapid and effective. When issues erupt in production, the emails and response times across the team are impressively fast.
The culture is informal; the process is open to change. The ability to quickly learn new technology -- and in some cases, many cases actually, to learn on one's own -- is needed. The skill needed to work on the team: be able to Google something, learn it, and run with it.
The SDLC: modified agile; perhaps better described as rapid.
Practices that work in this context:
What challenges the team?
The team is one of the most nimble I've ever worked with. Emails and IMs take place up to seven days a week, night and day. I've had exchanges with team members in the middle of the night. Overall atmosphere: technical, fast-paced.
So, now how can I answer your question? I hope that outlining two different products, built for different needs by radically different teams, illustrates why "it depends" is the only logical answer I can provide.
Imagine taking on the long and heavy process of an FMEA (failure mode and effects analysis) for a fast-moving website. Or thinking that a virtual team connected through IM chats and a wiki would work for a medical device (or pass an FDA audit)?
I outlined the two teams instead of replying "it depends" to help illustrate the variety you might experience if you work in different industries. If you've only worked in one environment throughout your career, you might be led to believe that there is one set of best practices. I've had the fantastic opportunity to be exposed to different industries and have learned that there are many practices, and that for each situation, you and the team you work with should strive to find what works. And once you believe you've found the appropriate set of best practices, don't stay attached to those practices past the point of their usefulness. Be ready to adapt. (See: www.context-driven-testing.com)
Where to start if you're determined to implement best practices and you don't have any? Look at the team makeup. What's working? What's not? Have you held post-project meetings to review successes and failures? Esther Derby has written a book on retrospectives. You might consider hosting team discussions to brainstorm improvements. Even with a consistent team and the same product, there can be different best practices. You might experience a short-notice patch release to production (whatever production means for the product), which could call for different practices than a full release. You might find people on other products within your company who can help you find practices that work for your company's unique atmosphere. You might find that talking with people in the same field helps you find ideas. You might try one approach and then, hopefully, be willing to adapt the approach as needed.
Your question is a good one, although complex to answer well. Perhaps now you see why "it depends" is the most accurate response of all.
This was first published in July 2008