Let me answer this question with two short stories. I'll outline two projects, and the teams and products behind the projects.
The product being tested is software that communicates with a medical device. The software is not embedded in the device; instead, it communicates with the device over serial cables, instructing the device how to perform a particular operation. At the end of the project, the product and its documentation will be audited by an internal audit team as well as by the FDA. If the product passes the audits, it goes to market -- in this case, medical labs both inside and outside of hospitals.
The team includes a testing team as well as a quality team. The testing team focuses on testing and the quality team focuses on creating and managing documentation. The release documentation is prepared for the team and also for the internal audit and the FDA. The project team includes: developers, testers, quality staff, field implementation/support, a project manager, and business analysts. There are also specialists in the specific area that the medical device operates in. The team is large, product specific (vs. centralized and reallocated to various products) and on-site.
The culture is formal, the process for every aspect of project work is defined in standard operating procedures (SOPs) and every team member is trained in the process. Variance to the process is not tolerated.
The SDLC: classic waterfall.
Practices that work in this context:
- Constant communication throughout the team. Team members typically eat lunch together and discuss the project.
- Intentional pairing of SMEs and technical staff.
- Generally well-documented information.
- Continual risk assessment formally through FMEAs, MMEAs and informally through discussions that are followed up with documented assessments.
- A lessons-learned session conducted at the end of each project cycle that includes how we'll apply what we learned in the next cycle.
Some of the practices that challenge the team:
- A steep learning curve.
- A long training cycle in the process and all related procedures.
- Turnover; while not high, any change in the team is difficult as the process to bring someone up to speed is long.
- Massive quantities of documentation keep the team from being more nimble.
- Chronic concerns about obtaining proof of what is done to fulfill FDA requirements. For example, the use of utility software -- say, software to configure and maintain the test environment -- has to be proven. Every tool anyone on the team uses to do their work has to be proven and documented in addition to the core product work. This requirement slows down efforts, increases documentation and discourages the use of utilities because of the associated overhead.
What makes the team work well in this environment is individual commitment: the knowledge that the product could potentially kill a patient drives behavior throughout the team on a daily basis, behavior best summarized as responsible and thorough. Overall atmosphere: conservative, focused.
The second product is a content management system built as a Web application. The team is virtual; the product team is sprawled across the United States, with users in multiple countries.
At the end of most iterations, the release is posted live as soon as possible. The features being added track the current technology world -- for example, social bookmarking, integration with Google's custom search engine, and RSS feeds.
The team consists of the client, a development team, a tester and a project manager. A handful of end users participate on occasion. Requirements are written by a lead developer with heavy input from the client. Every team member questions the requirements, the testing, and when the build is ready for production use; generally, everyone is open to questioning everything all the time. The rapport is rapid and effective. When issues erupt in production, the emails and response times across the team are impressively fast.
The culture is informal; the process is open to change. The ability to quickly learn new technology -- in many cases, on one's own -- is essential. The skill needed to work on this team: be able to Google something, learn it and run with it.
The SDLC: modified agile; perhaps better described as rapid.
Practices that work in this context:
- A team made up of people who embrace technology and stay on top of current trends and technology.
- Email, IM, Web conferencing.
- Detailed information in the defect tracking system that includes screen shots and people's detailed comments.
- The project wiki is mostly effective.
What challenges the team?
- Occasional miscommunications.
- Lack of documentation leading to "Is it a bug or is it a feature?" conversations.
- Lack of oracles to draw upon when integrating features.
- The leanness of the team: when an individual contributor is unavailable, there isn't another person who can provide the same information.
The team is one of the most nimble I've ever worked with. Emails and IMs take place up to seven days a week, night and day. I've had exchanges with team members in the middle of the night. Overall atmosphere: technical, fast-paced.
So, now how can I answer your question? I hope that outlining two different products, built for different needs by radically different teams, illustrates why "it depends" is the only logical answer I can provide.
Imagine taking on the long and heavy process of an FMEA for a fast-moving website. Or imagine thinking that a virtual team connected through IM chats and a wiki would work for a medical device -- or pass an FDA audit.
I outlined the two teams instead of replying "it depends" to help illustrate the variety you might experience if you work in different industries. If you've only worked in one environment throughout your career, you might be led to believe that there is one set of best practices. I've had the fantastic opportunity to be exposed to different industries and have learned that there are many practices, and that for each situation, you and the team you work with should strive to find what works. And once you believe you've found the appropriate set of practices, don't stay attached to those practices past the point of their usefulness. Be ready to adapt. (See www.context-driven-testing.com.)
Where do you start if you're determined to implement best practices and you don't have any? Look at the team makeup. What's working? What's not? Have you held post-project meetings to review successes and failures? Esther Derby has written a book on retrospectives. You might consider hosting team discussions to brainstorm improvements. Even with a consistent team and the same product, there can be different best practices: a short-notice patch release to production (whatever production means for the product) could call for different practices than a full release. You might find people on other products within your company who can help you find practices that fit your company's unique atmosphere, or you might find that talking with people in the same field sparks ideas. Try one approach, and then be willing to adapt it as needed.
Your question is a good one, although it's complex to answer well. Perhaps now you see why "it depends" is the most accurate response of all.
Related Q&A from Karen N. Johnson