Over 30,000 software testers participate in uTest's quarterly Bug Battle competitions, searching for flaws in various types of applications. In this interview, Matt Johnston describes uTest's global crowdsourced test efforts, in which testers vie for prize money and reputation. In part two of this Q&A, Johnston explains how software testers can enter future Bug Battles.
uTest just completed one of its quarterly bug battles. So Matt, do you want to tell us about that?
Sure, so every quarter we do what we call a bug battle competition, and it's a contest that's open to our entire community of 30,000 testers from around the world. For Q3 of 2010, we picked leading job search sites: Monster.com, CareerBuilder, Indeed, and Simply Hired. We turned our testers loose and let them go through and do exploratory testing and report any defects they discovered. Then, after the testers had completed their work, we did a usability survey where they were able to evaluate the various applications and compare them to one another on things like ease of use and job search results. So it's a pretty comprehensive report that we publish each quarter.
So when you do these surveys, do you get answers from actual users, too, or is it just the usability that the testers see when testing the applications?
It's limited to people who actually participated in the bug battle, so yes, they are professional testers, and yes, they have also recently tested those applications. We could issue it publicly, but we want to make sure the bug battle stays focused on software testing as opposed to general user feedback.
Do you ever get responses from the different companies who are surprised by the results and want to explain why the results came out as they did or explain what they're doing to try and fix the issues that were found?
We absolutely hear from companies every quarter. More than half of the companies that have been part of a bug battle have contacted us, and we have an absolutely open-door policy with them: we will share the feedback, the usability survey, and the list of bugs, all free of charge. Our goal is to engage our community with these quarterly competitions, but it's also to have some fun and hopefully help out some of these leading software brands. So we would always share that material with any company that reaches out, and more than half of them have taken us up on the offer to get a list of bugs. Not only will we give them the whole list of bugs, we'll actually filter it and scrub it for them so it's a little more useful. It generally does start a dialogue, and I will say, though this is not the intent of bug battle, some past participants have actually gone on to become recurring uTest customers.
Do the people that accept the feedback and the bugs typically share that with their customers and with the public or does it vary?
Well, much like any QA, they don't necessarily publish their list of known bugs, but they will share it internally with their engineering organization, their QA organization, their product group -- whoever is appropriate. Then they prioritize it as they see fit in terms of fixing it. In the past we have run into privacy or security issues that we would share directly with the QA department of an organization, which they'd prioritize and fix very quickly. But in this particular bug battle there were no critical privacy- or security-related issues uncovered.
Do testers engage in specialized testing -- performance testing, security testing -- something that goes beyond functional testing? Were those types of testing done as well?
Sure, we do ask our testers not to do any kind of synthetic load testing or anything like that, because they're testing these companies' production environments, and our goal is not to knock them out in any way. So it's all manual testing. But absolutely, some testers will go in and check for common vulnerabilities like cross-site scripting, those sorts of things, so that is part of it, and in this case all four sites came up clean, so that's great news.
Yeah, so how many crowdsourced bug battle competitions have you hosted?
This is our eighth. So we've done two years' worth of these, and at this point we have given away nearly $40,000 in prize money to various members of the community who have won awards in these eight competitions.
And are they growing in popularity? Do you get more testers every time or are some more popular than others?
Yeah, they are growing in popularity overall, but you definitely have some that are more popular than others. We try to make sure we pick the leading players in the categories. We don't want to have, you know, a 2,000-person company stacked up against a 20-person company. So we try to make sure it's a fair battle, and we also try to make sure that whatever sites we pick are globally available, because our community of testers is present in 168 countries. We want it to be a level playing field for each bug battle, which is why we picked these four. And yeah, there have been some that are more popular than others. I can tell you that when we did the battle of the search engines in Q3 of 2009, that was wildly popular. When we did the battle of the e-tailers last fall, in Q4 of 2009, that was a very popular one. And then we've had others that weren't as popular, like TV networks or Twitter applications, that didn't capture the attention or interest of our community in quite the same way.
Can an organization nominate themselves or volunteer and say, 'I want to be a bug battle participant?'
Yeah, they could. In fact, we always open it up to the public, and whether it's a journalist, an analyst, or a company, we're always on the lookout for bug battle ideas. But we do host them only once per quarter, and we're very protective. We do it to engage the community, not to expose anyone or embarrass any companies. So we're very discreet: we will promote the aggregate results, but we would never publicly share the list of bugs or anything like that. We think that's part of being a good Web citizen, and that's really the goal of these things -- for people to have fun and engage, but also to hopefully improve the quality of software among the leaders in a category. Anyone can volunteer, and once a quarter we sit down among the marketing and community teams here at uTest, and we have a blank whiteboard in front of us and talk about all the different possibilities for categories that could be interesting.
And there are a lot of criteria that go into selecting the next bug battle, including what the media will be interested in, what the community will be interested in, whether there are three or four good players that are fairly evenly matched, and whether it's globally available. So do you have something in mind yet for Q4?
We have some ideas. I won't share what they are yet, because we haven't made that final decision; we haven't had that final fight among the marketing teams, because it's a topic we end up caring a great deal about. My team spends hundreds of hours putting the bug battle together and promoting it, then weighing and judging all of the bugs and feedback and preparing the reports. So it's something they care about a great deal. We haven't finalized the Q4 one yet, but we've narrowed it down to a few finalists.
When do you plan to announce what the Q4 bug battle is about?
The bug battle for Q4 will actually run during November and we'll be announcing the results and publishing the report in early December.
Well, what about when you're going to announce what the battle is?
That would probably be early November.