
Crowdsource specialist uTest launching new performance, load test offerings

uTest, known for their crowdsourced approach to functional testing, is adding load and performance testing to their offerings. SearchSoftwareQuality got wind of uTest’s news, to be formally announced on Wednesday, February 24, and spoke with Vice President of Marketing and Community Matt Johnston. “We’ll be offering three flavors to our customers: Live Load, Simulated Load, and Hybrid Load,” said Johnston.

Johnston explained that Live Load will entail coordinating with their global team of testers to load the system simultaneously. Simulated Load will use test tools designed to simulate load on a system. Hybrid Load will do both: tools generate simulated load while testers simultaneously perform functional tests.
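The "Simulated Load" idea — many virtual users issuing concurrent requests against a system — can be illustrated with a minimal Python sketch. This is not uTest's tooling; the `hit_endpoint` function below is a hypothetical stand-in for a real HTTP call to the system under test.

```python
import concurrent.futures
import time

def hit_endpoint(user_id):
    # Stand-in for a real request; a real load test would call the
    # system under test here and measure its response time.
    time.sleep(0.01)  # simulated network/server latency
    return user_id

def run_load(virtual_users=50, requests_per_user=3):
    """Fan out concurrent 'virtual users', each issuing several requests,
    and collect per-request results -- the core idea behind simulated load."""
    results = []
    with concurrent.futures.ThreadPoolExecutor(max_workers=virtual_users) as pool:
        futures = [pool.submit(hit_endpoint, u)
                   for u in range(virtual_users)
                   for _ in range(requests_per_user)]
        for f in concurrent.futures.as_completed(futures):
            results.append(f.result())
    return results

results = run_load(virtual_users=10, requests_per_user=2)
print(len(results))  # 20 simulated requests completed
```

Dedicated tools such as LoadRunner or JMeter add pacing, ramp-up schedules, and detailed measurement on top of this basic fan-out pattern.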

“There are certain bugs that only reveal themselves while your application is under load,” Johnston explained. Applications that use Flash or streaming video, for example, need to be checked for quality of audio and video while the application is experiencing heavy traffic.

Currently these performance test offerings are primarily for Web-based applications because that’s where there is greatest demand, but uTest is willing to dig in and customize performance test efforts for customers with other needs. At some point, Johnston thinks there might be additional interest in the mobile market as it continues to mature, but right now Web-based performance test is their biggest market.

The competitors in the performance test arena are not other groups that offer crowdsource services, but the vendors and consultants that specialize in performance test tools, such as HP’s LoadRunner, according to Johnston. What sets uTest apart from consultants who specialize in certain tools is that, thanks to the uTest crowdsource model, they have access to a vast array of test tools and performance test experts.

“uTest has over 23,000 testers spread across 163 countries,” Johnston told us. Being a uTest member myself, I reminded Johnston that many of the 23,000 testers were inactive, and Johnston agreed. In any online community the typical makeup is 90 percent inactive (“lurkers”), nine percent active and one percent hyperactive. With uTest, Johnston said the spread is more like 70 percent inactive, 27 percent active and three percent hyperactive, so uTest is a more active community than most.

All uTest testers fill out a personal profile with information about their skill sets, locations and the technologies and tools to which they have access. This information helps uTest match people with the right skillset to the clients. Again, being a member myself — albeit a self-proclaimed “lurker” — I can well attest to uTest’s active community. Even though I rarely sign up to test, I have found the site one that actively encourages networking and professional development.

I asked Johnston about the pricing model for performance test, knowing that when I was a performance test manager at Sun, LoadRunner consultants were very expensive. Johnston said that the price will vary depending on the client’s needs. Though they follow the market and will charge more for expertise in competitively priced tools, overall, using uTest will give the client a cost advantage. Due to the wide array of testers, skills, and tools available, there is flexibility in what can be done and the client isn’t locked into any high-priced contracts.

uTest CEO Doron Reuveni will be presenting at the upcoming StarEast conference on April 29th. My SSQ colleague, Dan Mondello, and I will be at the conference and plan to talk with Reuveni and speak further with Johnston there. So, stay tuned for more news and information on crowdsource testing.

Join the conversation




Once upon a time we believed that it was impossible to conduct concurrent performance testing of real-world conditions using REAL USERS. This is exciting news for customers who need REAL WORLD testing - because it just doesn't get any more real than this. If you need to execute performance test scenarios with real-world conditions like random transaction arrival rates, variation of use/humanization, and non-repeatable traffic patterns and scenarios, this may be for you, despite limited test results and logging and higher risks of false positives/negatives.

There are two items that concern me the most about this: 1) Can this service work for negative performance tests? It seems to me this crowdsourced approach for perf doesn't fit the majority of performance testing done on-premises, inside isolated environments where the conditions of the load test are intentionally harmful to the application, component infrastructure and perhaps the entire system under test. I don't think so. As such, your comparison to solutions like HP LoadRunner is not appropriate here. 2) What is the long-term impact of this service on the quality of life/work for testers? I know when I configure Virtual Users in LoadRunner, I push them for hours and hours, never ending, then stopping and re-starting at my whim, with excessive concurrency running 100 times faster than humans are capable of running, and I require them to have maximum logging and output. Can real human beings actually do that? Once? Maybe. Consistently or repeatedly? I don't think so. Good luck! (DISCLOSURE: comment submitted from test automation vendor, HP)
Mtomlins: Thanks for your comment. Johnston points out that there are times when it makes more sense to use performance test automation, which is why they have three offerings, with two of those including test automation. I think we all agree that for very large performance test efforts, use of tools is both cost-effective and the most efficient. However, combining the tools with live testers, as is done with the Hybrid approach, will provide the best of both worlds.
Well, the crowd reading this article will notice 'crowd' spelt as 'croud' in the article headline ;)
Hey Phil, how are you doing! Darn it, you caught that right as I was changing it! (I guess that's why you're such a good QA guy, right?) Good to see you out here! I just planted that misspelled word to see who would be the first one to catch it and U are the one in the middle of the croUd that gets the prize! ;-)