
Why DevOps underscores the importance of software testing


There's no debating the importance of software testing. But QA should be everyone's responsibility. In this podcast, learn how to follow a team-wide approach to quality.

Software testing faces drastic procedural changes as organizations strive for iterative and incremental development. Testers are no longer the sole arbiters of software quality. DevOps, Agile and continuous testing all present opportunities, but they also upend testers' and developers' routines. So how do their roles change, and how much QA work can developers expect to take on when an organization shifts its software culture?

This episode of the Test & Release podcast features Jeff Payne, CEO of Coveros, a software consultancy based in Fairfax, Va. Payne believes organizations need to mentor -- not dictate -- their way into philosophy changes and encourage professionals to step out of their comfort zones so they can become more versatile. In the development industry, those changes frequently revolve around the importance of software testing, and when and how it's done in the application lifecycle.

Payne defines continuous testing as the practice of testing across the entire lifecycle, at the right time and in the right place, to improve software quality. Developers need to get involved for it to work, even if only at the unit test level, but Payne warns that the great experiment of pushing all testing onto developers did not pan out for many organizations. "They're all backing away from that really fast, because that is not going to work, for a variety of reasons," he said. Payne also digs into the trend to run tests in production -- with real customers -- and how and when that succeeds.

While shift left and shift right shake up software culture, Payne thinks the movements have benefited quality assurance and software testing in the long run.

"The good news is, in the last couple of years -- at least from a testing perspective -- I think the software testing product community has gotten a lot more focused on solving particular problems," Payne said, "whether it's helping to create better tests, or ... to automate the test execution process, or ... fixing or correcting automated tests that have broken, either by the code or by the infrastructure automatically self-healing."

In addition, Payne explains why security is too often neglected in acceptance criteria. And he reveals what transformed his thinking when it comes to AI in testing. Payne's moved from being a skeptic of buzzy product pitches to a believer in the practical application of AI testing technology.

Payne's company Coveros also owns TechWell, which provides learning conferences and communities for developers and testers.

Editor's note: Payne spoke with site editor David Carty and assistant site editor Ryan Black. The transcript has been lightly edited for clarity and brevity.

To start off, Jeff, I wanted to ask you about continuous testing, and some of the things you see as necessary for the practice. So maybe just as a first question, could you talk about the different testing roles, and how they should come together to enable continuous testing?

Jeff Payne: Yeah, sure. Continuous testing is definitely a hot area right now. We hear a lot about it when we're out and about speaking with customers and all. Obviously, you've got a variety of roles if you're going to make continuous testing work. And I know there's a lot of definitions out there, so maybe I should start with my definition, which is that continuous testing is really the practice of testing across the entire lifecycle. The goal there is to place testing at the right time and in the right place, where you're going to uncover any defects or unexpected behaviors quickly and resolve them, obviously, but most importantly help the business make good decisions.

So, I think there's a lot of roles in that definition. You've obviously got your traditional software testers -- they might be manual, they might be automated; we can talk about manual versus automated -- and they're obviously playing a critical role in continuous testing, but so are your developers. Because we really do expect and want developers to be involved in the testing process, at least at the unit test level. And then you've got the business: If you are doing any kind of what we now often call shift right in continuous testing, and trying to test with your customers on real data, then obviously the business, along with the customer, is going to be involved in that. So, you've got a lot of roles wrapped up in there, particularly if you have dedicated automation specialists in your organization that might be helping out as well.
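To make the unit test point Payne describes concrete, here is a minimal sketch of developer-level testing in Python with pytest; the apply_discount function and its behavior are invented purely for illustration.

```python
import pytest

def apply_discount(price: float, percent: float) -> float:
    """Hypothetical production code: price after a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# Developer-owned unit tests; run with `pytest`.
def test_apply_discount_happy_path():
    assert apply_discount(100.0, 25) == 75.0

def test_apply_discount_rejects_bad_percent():
    with pytest.raises(ValueError):
        apply_discount(100.0, 150)
```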

Is there any sort of personnel in particular that you view as especially necessary?

Payne: Well, you gotta have testers, right? I'm biased, but [the presence of] professional testers is a critical component. It's kind of funny: A couple years ago, you heard a lot of rumblings about how, with the changes, and DevOps coming down the pipeline, and testing moving more toward something that would be done in sprint, [people would say], 'Gee, do we really even need dedicated testers? Maybe we could live without them.'

And I actually know quite a few larger organizations -- they certainly didn't ask us first -- that went ahead and decided they were going to basically lay off or fire or reallocate all their QA and testing personnel, and have the software developers be in charge of all testing. And here we are a couple years later, and they're all backing away from that really fast, because that is not going to work, for a variety of reasons we could talk about. So, the testing role continues to be and will be critical in this process.

It's funny you say that, because I actually have a friend who's a software engineer here in [Boston], and he mentioned to me that over the past year they've reduced the size of their QA team to two people.

Payne: Okay. See how that works for him.

From what he's said, it hasn't worked well.

Payne: No, it's not going to work.

But you mentioned shift right testing before, and that's another thing I wanted to ask you about. What are your expectations when it comes to widespread adoption of shift left and shift right testing? Do you expect one to be more widely and quickly embraced than the other?

Payne: It's kind of funny, because we've been talking about shifting left for so long, right? I mean, I remember decades ago, when I got involved in testing, we were always trying to figure out, 'How do we get testing earlier in the process' and, 'How do we get dev and test working together,' and all that is really trying to shift left. And then out of nowhere this concept popped up of actually shifting some testing right, which when I first heard of that, I was like, 'Well that's not good -- we're trying to shift left!'

But then you understand what is meant by shifting right: For instance, there are some kinds of testing where you really can't get good results unless your end customer is involved. The nice thing about DevOps, as it's structured, and some of the metrics around trying to reduce cycle time when things are failing in production, [is that these measurements] actually encourage you and provide a mechanism for you to shift right and do some types of testing, like usability and A/B tests and other things we could talk about. Because if you find problems, you can, with DevOps, rapidly fix those problems and get them into production; I think that was probably the most difficult thing with the shift right idea. [Before], we knew that when things got to production it was very costly and time intensive to fix them. Now, if you've got DevOps in place, that might not necessarily be the case.

I would say that it's kind of funny: Shifting left has always been a challenge, and I don't know [if] that's going to change, even with DevOps. I would think that shifting right is probably going to take hold a lot quicker and easier than shifting left.
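As a concrete example of the shift-right testing Payne mentions, an A/B test run against real customers needs stable assignment of users to variants. Here is a minimal sketch in Python; the experiment name and 50/50 split are assumptions made for illustration.

```python
import hashlib

def assign_variant(user_id: str, experiment: str, split: float = 0.5) -> str:
    """Deterministically bucket a user into variant 'A' or 'B'.

    Hashing the user ID together with the experiment name keeps each
    user's assignment stable across sessions while decorrelating
    separate experiments from one another.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # map the hash to [0, 1]
    return "A" if bucket < split else "B"

# The same user always lands in the same variant of this experiment.
print(assign_variant("user-42", "checkout-redesign"))  # e.g., 'A'
```

As Payne notes, it's the DevOps pipeline that makes this tolerable: a variant that misbehaves in production can be fixed or rolled back quickly.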

Yeah, and both of those ideas are really designed, Jeff, to remove this idea of testing as a fixed stage, right? Like we talk about in DevOps, it's got to happen throughout the lifecycle, and that kind of flips the traditional thinking on its head, right?

Payne: Yeah, absolutely. I define continuous testing -- I've given a couple talks on it -- as having four key components. One is testing continuously; duh, it's in the name, right? But that means testing everything continuously, not just your code, but your requirements, your system, your production system, your infrastructure -- everything in that whole process. Then shifting left to try to find defects earlier in the process that got introduced by the development process, [and] shifting right to get more feedback and information from your customers on the product. And [next] sharing all of that with the business through metrics, dashboards and reports, so you can make business decisions around what you're finding out about the quality of your software. That's kind of how I look at continuous testing and its pieces.

Right, and you talked about the challenge of shift left testing -- there's a lot that goes into that. How can an organization better invest their developers in the idea of testing software and scanning their code, particularly as things like cloud and serverless abstract more and more away from ops and just make it more challenging to perform those tasks? It's partially a cultural challenge and a tools challenge, right?

Payne: Yes, definitely both. I'd say more cultural than tools. You know, one of the things I've learned being in our business -- helping organizations adopt Agile, DevOps, better testing practices, etc. -- is that it usually comes down to culture. I mean, we can solve the tool problem -- most organizations can solve the tool problem. The big challenge that we see in shifting left is that, for a long time, most organizations did not ask or expect software developers to be involved at all in testing. And we all know that [lack of involvement] is not a good idea, but that's just reality. And now we're asking them to be involved, and that can be a big change for them.

It also, though, is a management change, because if you're going to ask software developers to test at some level, inherently that means they're going to produce less code. And so your productivity from a new feature perspective -- take quality out of the equation -- is going to go down, and management has to be okay with that, right? And I think that's the big struggle. Management, they want quality, but sometimes they don't want to [necessarily reduce] code productivity for that quality, and they don't understand that you can't have it both ways.

I actually had a question based on something you brought up. You mentioned, of course, it's tough to bring about change in organizations, especially in a cultural sense. But I was curious whether, in your experience, [there are] any particular methods that work effectively in easing these cultural transitions, or even just in training team members on new technologies. [For instance], what's the best way to spread knowledge throughout an organization? Because I can't imagine most people respond well to a change in culture that feels forcibly imposed on them.

Payne: You're exactly right. If it's forced upon them, that makes it more difficult. But, yet, if you don't push change, it's probably not going to happen at all. So, you've got to kind of balance the two together. I do think training can help; it's not going to solve your cultural problems, but education in general -- [I'm a] big fan of education -- can help. But it also needs to be hands-on, I would say. One of the things we [at Coveros] really push is: It's one thing to sit in the classroom, as we've all done, and listen to somebody talk about something. It's another thing to go try to apply what you just heard. And so a lot of what we try to do is more what I would call "mentoring" and "coaching" than just straight-up training. So, introduce ideas and concepts, but then work with people to use those concepts. Because most technical people learn best by doing. And they learn even better if they're doing it with people who have done it before. So an apprenticeship model, if you will, can work very well to get people to learn new concepts and actually see the value and benefit in those concepts.

And then you have to educate management [and] leadership, because a lot of times, culturally, one of the biggest challenges is what I mentioned about expecting developers to be as productive while testing. Management has some very misinformed understandings of some of these things that you've got to set straight. [Otherwise], they're going to continue to expect what's not possible and get frustrated when they see this change also having some impact -- maybe some lost productivity in the short term or whatever -- that they don't expect or don't understand. And they might [have a] knee-jerk reaction [and] say, 'Oh, this isn't working,' and move away from it when you just haven't gotten the organization through the knothole yet.

Going back to what you were talking about before, Jeff, [with] developers, instilling that idea of quality code into what they're doing, is there any way an organization can spend its way out of that challenge a little bit? Can they turn to some more full-featured IDEs that might have some more handy features that make programming easier? Or, can they turn to low-code tools that might be positioned a little bit more toward developers? I mean, certainly, that market seems to be going a little bit in that direction. What sorts of ways can they make that programming burden easier on developers, especially as we hear so much about developer burnout on a regular basis?

Payne: Yeah, well, it's funny you should say that. Starting with developer burnout: It's real simple. Our industry is not sustainable, [the way] we work in most organizations. And even those that have embraced Agile seem to just kind of forget the sustainability principle of Agile, right? I'm a big believer that either you're following the principles of Agile or you're not; it's not something where you can pick and choose the ones that you want to follow. Because, in my experience … they're kind of like pieces of a puzzle: You have to have them all for the puzzle to make sense. And sustainability is one of those things that our industry still struggles with.

You can invest in some tools that make your developers' and your testers' lives more efficient and more productive. There are definitely tools out there that can help you do that. Better IDEs [and a lot of] other tools that we have on our desktops now do integrate better testing concepts and CI [continuous integration] concepts, and are almost doing what we'll call grammar checking or spell checking for security issues and quality issues. So, you're seeing them right in the environment you're working in; I think those can definitely help developers. They say a good spell checker teaches you how to spell better -- I do believe that. And so providing those kinds of capabilities for quality and security right in the code can, I think, help teach developers over time how to write better code.
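The "spell checking" for security that Payne describes is in-editor static analysis. As a hedged illustration of the kind of pattern such tools flag, the first function below builds SQL by string formatting, a classic injection risk that security linters typically warn about; the second shows the parameterized form they steer you toward. The table and schema are invented for the example.

```python
import sqlite3

def find_user_unsafe(conn: sqlite3.Connection, name: str):
    # Typically flagged by security linters: the query is built by string
    # formatting, so a crafted `name` can inject arbitrary SQL.
    return conn.execute(f"SELECT * FROM users WHERE name = '{name}'").fetchall()

def find_user_safe(conn: sqlite3.Connection, name: str):
    # The parameterized form such tools suggest instead: the driver
    # escapes `name`, closing the injection hole.
    return conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()
```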

I wanted to come back to the question of developer burnout for a second, because one of the other things we were talking about, of course, is when and how software professionals are asked to essentially perform more functions. There's almost a competing notion [that] any professional should be expected to do any sort of task at any moment, but then there are also people who still favor the dedicated specialist. But, of course, [then] people on the other side will say, 'It's antithetical to the point of Agile or Scrum for there to be a dedicated specialist.' So, I'm wondering how you would go about striking that balance?

Payne: Yeah, so I actually was just at a conference; I was keynoting and talking about this exact issue, and it is an interesting thing. In one sense of Agile, you'll hear people say, 'Everybody on the team should be able to do everybody else's job.' And I wish I could find those people. If I could staff with all those people, I [would] get a lot done, I guarantee you that. But the reality is, most people aren't Renaissance technologists, right? I mean, they can't do it all and do it all well. And so … while that might be a great pie-in-the-sky idea, the reality is it doesn't scale and it doesn't work for large organizations. And so that model, to me -- I get the idea behind it, but I don't think it's practical.

I do though think that in Agile, and DevOps in particular, to be productive as a team we have to teach people how to step a little bit out of their comfort zone, and be able to at least help others. Some people call that being a generalized specialist -- a T shape, right? You have a specialty, [yet] you know enough about the areas around you that you can help when you don't have something critical to do today, but one or two of your teammates have something that, if it doesn't get done, is unfortunately going to become a bottleneck to the team. You've got to be able to pick some of that up and help. And the more, of course, you can do that, the better the team functions.

And I do think a lot of people put the tester in the specialty role; I don't actually agree with that. I actually wrote a paper, about a year ago now, called 'Dev-test pairings,' and it talked about how developers and testers can help each other every day and learn from each other as they go, so they can be more helpful to each other over time. I really think that kind of a model can get people out of straight specialty roles, but never expect them to be able to do every role on the team.

So, you see it as more like organizations should pursue this T-shaped specialist, I believe that's what you called it, as opposed to like a dedicated specialist.

Payne: Yes, definitely. Some people call them Es and Fs and whatever -- now you're getting a little crazy -- but yeah, I do think we need that. Because in a sprint, when you're talking about a very short amount of time, just using Scrum as an example, if you're doing a Waterfall, and it's a 12-month project and you're blocked for a week, eh, it hurts the schedule a bit. [But] if you're in a sprint and you're blocked for a week, the sprint's cooked. It's not going to get done. If you're blocked for a day it might not get done.

And so organizations have to realize that, in that model, you have to figure out a way to keep everyone productive. And that means that people are going to have to be able to step out of their roles on occasion to help others, because they won't always have something to do right this minute. A lot of times, when we go into an organization that hasn't fully embraced Agile, what you'll see is that the developers and testers are still kind of in silos, and the testers will complain because, invariably, they get to the last day of the sprint -- [developers] deliver code for the first time, [then] take off for the weekend and expect the testers to hang around and test now, right? And the testers have been sitting around doing some planning and some other things, but for the sprint [they haven't] been able to be fully productive testing yet. That's just not going to work, right?

You've got to figure out a way to get testers engaged from day one. Being productive and getting them to figure out how to work together and leverage each other is the way you have to do that.

Jeff, we're seeing AI infused into testing more and more as time goes on, which is a scary idea for some testers. But some smaller tool vendors are even basing their entire business model on AI-based testing. So, I'm curious: Where do you see AI in testing? Where are we at with that? And what do you think is in store for the future there?

Payne: Well, I was pretty skeptical about it when I first heard about it a couple years ago -- not that I don't believe that AI can help. And, ironically, you've probably heard of the AI effect. It's kind of funny: AI has been helping us and has been used practically for decades now; it's just that, ironically, whenever artificial intelligence becomes practical, we don't consider it AI anymore, right? That's because, fundamentally, AI is looked at as something that [you only find] in the future. It's something you haven't yet figured out how to achieve.

It's always the stuff of sci-fi.

Payne: Exactly, yeah. So, optical character recognition, speech recognition, fuzzy logic, using artificial neural [networks] or various things -- those things are all at this point pretty proven technologies that work. And they're embedded into our daily lives, whether we realize it or not, to the point where none of that is considered AI anymore, right? It's, 'Oh, that's just you know, that's just how we work.' So, I was kind of skeptical, only because AI has this kind of … you mentioned this sci-fi view, where it's going to come in and solve all of our problems, or cause all of our problems and put us all out of business or whatever.

Unfortunately, that was kind of how I felt the test tool and the developer tool product companies were approaching it. When you saw demos and whatnot, they were talking about how their software was going to let you -- basically, you just pointed it at an application and it would figure out how to do your job for you. And that was scary for testers, right? Because they thought, 'Well, gosh, that means they won't need me anymore.'

Good news is, people have been talking about that for 50 years and it hasn't happened yet. It's not going to happen anytime soon. I just heard an AI specialist talk, and his projection was that it will be at least another 50 years before a machine could think enough to even identify a novel solution to something without being prodded to look at it -- just cognitively to say, 'Oh, that's an interesting thing I should pursue.' That says we're a long way away.

But the good news is, in the last couple of years -- at least from a testing perspective -- I think the software testing product community has gotten a lot more focused on solving particular problems, which do show promise, whether it's helping to create better tests, or it's helping to automate the test execution process, or it's fixing or correcting automated tests that have broken either by the code or by the infrastructure automatically self-healing, if you will.

And those are all interesting ideas that can at least make our testers more productive, which I think is a good track for AI to follow: How do we make our jobs more productive? There [are] some promising ideas and thoughts out there, and some of the demos I've seen look much more interesting and much more useful than they did a couple of years ago.
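The "self-healing" Payne refers to usually means a tool that falls back to alternate element locators when the primary one stops matching, instead of failing the test outright. Here is a simplified, framework-agnostic sketch in Python; the find callback and locator strings stand in for whatever your UI driver provides and are assumptions for illustration.

```python
from typing import Callable, Optional, Sequence

def find_with_healing(find: Callable[[str], Optional[object]],
                      locators: Sequence[str]) -> object:
    """Try a ranked list of locators, 'healing' the test by promoting
    the first fallback that still matches the element.

    `find` is supplied by your UI driver (e.g., a CSS selector lookup)
    and should return None when a locator no longer matches anything.
    """
    for locator in locators:
        element = find(locator)
        if element is not None:
            if locator != locators[0]:
                print(f"healed: fell back to locator {locator!r}")
            return element
    raise LookupError("no locator matched; the test cannot self-heal")

# Usage sketch: try the brittle ID first, then more stable fallbacks.
# element = find_with_healing(css_lookup, ["#submit-btn", "button[type=submit]"])
```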

I did want to pivot to something, Jeff, something I know you've got a background in and that you've continued to write about, and it's security. And I know, this might sound like a basic question, but I still think it might be useful to lay out for our listeners: How would you write the acceptance criteria for a security-oriented software requirement?

Payne: Yeah. Boy, that's a fun one.

Well, I mean, obviously, first, it depends on whether it's a functional or non-functional security requirement, right? The way we do security [user] stories, or integrate security into stories, is -- obviously, there are particular stories that have security components. Oftentimes, we are looking to put what we in the security community call a security control in place. These are really features of our application that provide security -- logins or authentication or encryption, things like that -- which, in Agile, would be [a] story. They would be specific features or functions that you are going to create. So, that's pretty straightforward.

So, the more difficult security requirements are the non-functional requirements, right? 'How do I inherently make the entire application secure? How do I make it reliable and not susceptible to attack?' Those, to me, tend to be more -- we write them as non-functional security stories, which apply to all your stories. And [they] really kind of guide your security testing or your penetration testing toward the areas of risk that you've identified in your application.

So, I teach a security testing class. The purpose of that class is to teach software testers how to better test for security, because I think it's just one more bullet -- something else in your arsenal -- that you can use as a tester. I do believe that security needs to be integrated into the entire continuous testing process for it to be effective.
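For the functional security stories Payne describes, acceptance criteria can often be written as executable checks. Below is a minimal, hypothetical sketch in Python of one such criterion -- lock an account after three consecutive failed logins -- with a toy Account model invented purely for illustration.

```python
class Account:
    """Toy model of an account-lockout security control (illustrative only)."""
    MAX_FAILURES = 3

    def __init__(self) -> None:
        self.failures = 0
        self.locked = False

    def log_in(self, password_ok: bool) -> bool:
        if self.locked:
            return False          # locked accounts reject all attempts
        if password_ok:
            self.failures = 0     # success resets the failure counter
            return True
        self.failures += 1
        self.locked = self.failures >= self.MAX_FAILURES
        return False

# The acceptance criterion as a test, runnable with pytest: three bad
# passwords lock the account, after which even the right one fails.
def test_account_locks_after_three_failures():
    account = Account()
    for _ in range(3):
        assert not account.log_in(password_ok=False)
    assert account.locked
    assert not account.log_in(password_ok=True)
```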

One question to that effect: [Do] you currently think security is often the most overlooked thing when it comes to acceptance criteria for testers?

Payne: Well, for testers, yes, I do. And [that's] mainly because … they're worried that they can't or won't be able to somehow help. And that's one of the goals of this [security testing] class. I tell the students right up front, 'I don't expect you to know anything about security. That's why you're here. I expect you know a lot about testing. What I'm going to show you is how you can use that testing knowledge you already have with a little dose of what security is all about to start to become an effective security tester. And you ought to be able to leave this class and go back and start to do damage.'

In fact, ironically, even though I tell them not to do [this], some of them decide to go home after the first night and see if they can break into their website. Not a good strategy. Actually, last time I taught that class, two people in the class got calls from their IT organization during the class asking them what they were doing. And I said, 'I told you all don't do that!' But people are curious and they're testers, right? So, they want to break something. But no harm, no foul, nothing came of it other than IT [saying] the same thing I did, which was, 'Ok, that's fine. Tell us next time you're going to do that, please. You gave us all a heart attack, we thought that somebody was attacking us.'

As long as they got out of it with a slap on the wrist, I think that's okay.

Payne: Yeah, exactly. I'm gonna have to maybe make them sign a waiver or something. I don't know, it's like, 'Gosh dang these testers, they want to dig in and start breaking stuff!' And this is the wrong kind of stuff to be breaking at night. Because irrespective of how well security is integrated into the software process, every organization has it well integrated into the operational process. And so you're not going to be hacking around and doing malicious things very long in production in most organizations.

You told me that you expect STARWEST to be particularly well attended at the end of September, potentially even a record crowd for a TechWell event. [I'm] curious what sorts of conversations you're looking forward to having with attendees, and what's particularly interesting on the agenda?

Payne: Yeah, so we hit on a lot of the topics [in this discussion]. I went through and looked at the submissions, and also what was accepted. For anyone who's interested, we usually take anywhere between 15% and 20% of submissions, so it is pretty competitive to get into the STAR conferences. But AI is all over the conference, which is always interesting to hear different perspectives [about]. The great thing is now practitioners are starting to show up and talk about how they're applying artificial intelligence to testing in their organizations, which is a great next step for some of these tools and technologies.

A huge theme is, 'How do I integrate testing into DevOps and continuous delivery,' and all that fun stuff. And then the third one is effectively testing what I would historically call highly distributed systems; really, it's IoT and, to some extent, microservices and service-oriented architectures -- lots of stuff around effective ways to test those kinds of systems as well.

So, I think we've got a good set of speakers and talks around all of the cutting-edge topics that are out there, as well as just great fundamental software testing techniques -- from design, to execution, to planning, and whatnot. So, we're pretty excited about this one. Should be fun.
