On May 20 and 21 I hosted the first ever Writing About Testing conference in Durango, Colorado. In the room were fifteen of the most prolific and respected software testers in the industry. In the course of our discussions over those two days, I was deeply impressed by the sheer diversity of experience and expertise shown by everyone at the conference.
In the room were experts in databases, framework programming, exploratory testing, API testing and development, User Interface (UI) and User Experience (UX) work, development process, system administration, scripting, application development, management, software architecture, security, performance, data visualization, collaborative communities, and even Quality Assurance (QA).
The past decade or so has seen big changes in software development. We have access to new languages, new architectures, and new databases that did not exist ten years ago. The internet has created new business models, from Software as a Service (SaaS) to boutique development shops to totally distributed teams to cloud computing.
Each of these new technical advances or business opportunities demands new skills and new approaches from software teams. Yet application development stays the same; someone has to write the application code, and we know pretty much the best way to get that done: with unit tests, code review, and continuous integration.
What I learned at the Writing About Testing conference is that much of the new work demanded by new technical advances and new business models falls to people who call themselves "testers."
Beyond the technical tester
There is an ongoing discussion among people in the testing community about the increasingly visible tendency of large companies to value testers with programming ability over testers without programming ability. The usual name for such a role is "SDET," or Software Development Engineer in Test, after Microsoft's job title for such people.
What the discussion fails to note is the enormous number of other technical skills that testers employ besides programming for test automation. System administration, database expertise, shell scripting, architecture work, security, and performance are all technical aspects of software development that demand particular skill, and testers certainly employ all of them in order to do testing.
The discussion also fails to note the enormous number of "soft skills" testers employ. While it is true that many testers work more closely with the development staff, many other testers work more closely with project management staff, product development (requirements) staff, or even executive management. One popular definition of a tester is someone who provides information to the software team; what we call testing is capable of supplying an enormous amount of information to any role on any team.
Make your set of skills solid while staying interested in new skills
As of right now, I would say of myself that I am highly skilled in the following areas: automated UI test design; UI/UX testing; API testing; and test architecture and environment work. And I'm pretty good at influencing software development process, also; I take my nominal title of "QA" quite seriously.
There are some things I used to be good at, but either my skills have atrophied or the state of the practice has overtaken them: mainframe/COBOL systems testing; coding test frameworks; database hacking; scripting. I no longer care about testing mainframe apps, and today most of the testers I work with are better than I am at the rest of the list; the state of the practice in test framework development, database hacking, and scripting for testing has outstripped what skill I once had, and I have not found it valuable (so far) to keep up.
Besides my skills, I have a set of interests that come in handy as well. I follow the literature on these things and practice some techniques, even though I am not (yet!) employed to use any of them. Software security has always interested me greatly, and I am familiar with a number of grey-hat network security tools. Software engineering theory fascinates me, and I can hold my own in a detailed discussion of software architecture.
Here is advice from my Writing About Testing experience: identify your own set of skills related to software testing, and spend enough time to get really good at each one. Also identify a set of interests, things related to software for which you might not be paid directly, but which might come in handy someday.
Seek out auditions
Before I ever worked in software, I was a professional musician. As a musician I learned that being able to play your instrument is only a small part of being in a successful musical organization. I might be hired to play bass, but on the record I might also play acoustic guitar or tambourine; I might have to know how to back a trailer while touring; and ultimately I have to get along well with my colleagues.
The same is true for software teams. I might be hired to do testing, but along the way I might design a test automation framework, define API interfaces for test data, and work on UX issues with designers and managers. In my most recent job that is exactly what I have done. Luckily for me, I am working with people who are much better programmers than me. My colleague in the testing department implemented the automation framework; my colleague in the development department implemented the API interface.
With a set of skills and interests in hand, it falls to the tester to find a team that needs those skills and can benefit from those interests. What we call software testing is a field both deep and broad, as demonstrated by the people who attended the Writing About Testing conference. Each of those people is using a unique set of skills, interests, and experiences in a unique situation. There is a persistent myth that software testers are interchangeable units, but such an idea is easily refuted by the very existence of the people at the Writing About Testing conference.
As a tester seeking a team, look everywhere for a team needing your skills and interests. It might be a team within your own company, or it might not. You might consider telecommuting in order to be part of a great team, or you might even consider starting out on your own as a consultant or an entrepreneur. The world of software testing is a remarkably big place, with a lot of roles to be filled.
The disappearing tester
Another theme that emerged from the Writing About Testing conference is that software testing is so deep and so broad that the term "software tester" might even be irrelevant and might actually disappear. A number of us think of ourselves not explicitly as testers, but as members of a software development team able to answer certain specific needs of the team.
That is certainly the case in my current position. The team that hired me decided that they needed people to fill roles to: validate UI/UX implementation; design and implement a UI test automation framework; and work with the rest of the team to build an agile development process. My colleague and I have those skills and share those duties. She is a better programmer and a better database hacker than I am, so I spend less time on those activities. If it came to it, I could step in and do that work also, but at a much lower level of quality.
The most productive approach to a software testing career in the future might be not to think of the work as software testing. Instead, think of the work you want to do on a software development team, whether "technical" or not, and then find a team that will value that work.
About the author: Chris McMahon is a software tester and former professional bass player. His background in software testing is both deep and wide, having tested systems from mainframes to web apps, from the deepest telecom layers and life-critical software to the frothiest eye candy. Chris has been part of the greater public software testing community since about 2004, both writing about the industry and contributing to open source projects like Watir, Selenium, and FreeBSD. His recent work has been to start the process of prying software development from the cold, dead hands of manufacturing and engineering into the warm light of artistic performance. A dedicated agile telecommuter on distributed teams, Chris lives deep in the remote Four Corners area of the U.S. Luckily, he has email: email@example.com.