The rapid rise of mobile devices and the growing sophistication of the applications that run on them are putting a strain on mobile device testing professionals and disrupting test organizations, said Matt Johnston of testing services firm uTest Inc.
How a mobile application performs in the testing lab provides little indication of how that mobile application will fare in the real world. "Nothing you learned about testing desktop applications in the lab applies to testing mobile applications in the wild," said Johnston, uTest chief marketing officer. "In the wild" refers to the multitude of handsets, tablets, operating systems, browsers, carriers, languages and locations that test professionals must take into account to assure the quality and security of mobile applications. Compounding the challenge for test professionals is the growing complexity of the applications that run on mobile devices, Johnston said. Increasingly, these apps incorporate social media features -- such as "share this with your friends on Facebook and Twitter" -- as well as location-based intelligence, and this is wreaking havoc on test organizations.
Those are some of the issues Johnston's colleague John Montgomery, uTest vice president of product delivery, will address in "SoLoMo Is Shaking Up the Testing Landscape: How to Adapt, Keep Up and Thrive," a session at STARWEST 2012, taking place Sept. 30 to Oct. 5 in Anaheim. The term "SoLoMo" refers to the collision of three factors -- social, local and mobile. (Note: Johnston was scheduled to speak; Montgomery is a last-minute replacement.)
Johnston offered examples of how mobile apps with social media components and location-based intelligence pose additional obstacles for mobile device testing professionals.
A media site that publishes content embeds a video from a third-party website. Over the weekend, the video site makes changes to its application programming interface, and the video on the media site fails. "The QA [quality assurance] manager is left scratching his head," Johnston said. "He is responsible for someone else's code -- and yet he has no visibility into the changes being made." What you have here is a clash of cultures, he added. "Social is all about free sharing. QA is all about command and control. It's a tough dilemma."
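One pragmatic response to this scenario is a contract check: a small script that verifies the third-party response still has the shape your page's embed code depends on, so an over-the-weekend API change surfaces as an alert rather than a broken page. The sketch below is a minimal, hypothetical illustration -- the field names follow the oEmbed convention, but the exact contract and the `validate_embed_response` helper are assumptions, not uTest's method.

```python
# Hedged sketch: check that a third-party embed API response still has
# the fields our page depends on. Field names here follow the oEmbed
# convention ("type", "html", "provider_name") and are an assumption.

REQUIRED_FIELDS = {"type": str, "html": str, "provider_name": str}

def validate_embed_response(payload):
    """Return a list of contract violations (empty means the response
    still matches the shape the embed code expects)."""
    problems = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in payload:
            problems.append("missing field: " + field)
        elif not isinstance(payload[field], expected_type):
            problems.append("wrong type for " + field)
    return problems

if __name__ == "__main__":
    ok = {"type": "video", "html": "<iframe></iframe>", "provider_name": "YouTube"}
    renamed = {"type": "video", "embed_html": "<iframe></iframe>"}  # field renamed upstream
    print(validate_embed_response(ok))       # []
    print(validate_embed_response(renamed))  # reports the missing fields
```

Run on a schedule against the live API, a check like this gives the QA manager at least some visibility into someone else's code, even without control over it.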
Location-aware mobile apps present an even bigger challenge for test professionals, because it's virtually impossible to test every use scenario. "In a QA lab, you can simulate browsers and hardware, but there is no way to accurately replicate location," Johnston said. "Even if you spend millions of dollars to test your mobile app in Bellevue, Wash., for example, all you know at the end of the day is that the app works in Bellevue. What about Highway 101 in the valley? What about Europe?" For test professionals working to assure the quality of mobile apps, location-based intelligence is the straw that broke the camel's back, Johnston said. "It is simply impossible to test every location."
He offered test professionals advice for dealing with the challenges of mobile device testing:
- Recognize that never again will in-the-lab testing be enough. Apps deployed on mobile devices will be the rule, not the exception. In fact, according to market research firm Canalys, sales of smartphones have already overtaken sales of client PCs: vendors shipped 488 million smartphones in 2011, compared with 415 million client PCs. Getting your arms around what this means is tough for most testing organizations, which still view mobile device testing as an afterthought, a subset of desktop testing, Johnston said.
- Place a portion of your testers around the globe. "When apps and users are distributed around the world, a portion of your testers should be, too," Johnston said. Of course, it's impossible to cover every location, but analyzing usage statistics carefully helps situate testers strategically. It isn't an exact science, but analytics can identify, for example, the top 25 devices, the top carriers and the top operating systems for Android phones. Johnston advised leaders of testing organizations to look at what the data tells them and use it to determine "what you have to test; what's nice to test."
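The "top devices, top carriers" analysis Johnston describes is essentially frequency counting over a usage log. As a minimal sketch -- the session data below is invented for illustration, and real analytics exports will look different -- ranking each column of the log gives the "must test" versus "nice to test" split:

```python
from collections import Counter

# Hedged sketch: rank devices/carriers/OS versions by session count.
# The (device, carrier, os) log rows here are made-up sample data.
sessions = [
    ("Galaxy S III", "Verizon", "Android 4.0"),
    ("iPhone 4S", "AT&T", "iOS 5.1"),
    ("Galaxy S III", "AT&T", "Android 4.0"),
    ("iPhone 4S", "Verizon", "iOS 5.1"),
    ("Droid RAZR", "Verizon", "Android 2.3"),
    ("Galaxy S III", "T-Mobile", "Android 4.0"),
]

def top_n(rows, column, n):
    """Return the n most common values in one column of the session log."""
    return Counter(row[column] for row in rows).most_common(n)

print(top_n(sessions, 0, 2))  # top devices:  [('Galaxy S III', 3), ('iPhone 4S', 2)]
print(top_n(sessions, 1, 2))  # top carriers: [('Verizon', 3), ('AT&T', 2)]
```

The same tally, extended to locations, is what lets a test lead place distributed testers where the users actually are rather than guessing.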
- Keep doing ongoing testing. "If YouTube changes, does it break my stuff? This is the kind of question you need to look at on a continual basis," Johnston said. Again, it's challenging to foresee every scenario that could come up, so focus testing on the key processes for the sites your app is interacting with. On Amazon.com, that's the checkout process; on eBay, it's listing a new product and bidding; on Google, it's searching and seeing results, he said. "If these things don't work, something is on fire," he added.
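Johnston's "does it break my stuff?" question maps naturally onto a recurring smoke-test runner over the key flows he names. The sketch below is an assumption about how such a runner might be structured, not uTest's tooling; the flow bodies are stand-ins for real interactions with a site's checkout, listing or search process.

```python
# Hedged sketch: a recurring smoke-test runner over critical flows.
# Each flow is a callable returning True on success; the bodies here
# are placeholders, not real site interactions.

def checkout_flow():   # e.g. add to cart -> pay, on a retail site
    return True

def search_flow():     # e.g. submit query -> results page renders
    return True

CRITICAL_FLOWS = {"checkout": checkout_flow, "search": search_flow}

def run_smoke_tests(flows):
    """Run every critical flow; return the names of any that failed."""
    failures = []
    for name, flow in flows.items():
        try:
            if not flow():
                failures.append(name)
        except Exception:
            failures.append(name)
    return failures

# In practice this would be scheduled (cron, CI job) so that a change
# on a site you integrate with surfaces within hours, not at release.
print(run_smoke_tests(CRITICAL_FLOWS))  # [] means nothing is "on fire"
```

The point is less the code than the cadence: the checks run continually, because the third parties your app depends on change on their schedule, not yours.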
- Educate top management. "Let them know we are headed toward a future where mobile device testing won't be an afterthought, it will be the main thing," Johnston said. "Test and QA leadership has to come in and say, 'Here's how my team's world will change in 5 years.'" Johnston emphasized that it's crucial to be proactive with top management. "You don't want to be a victim," he said. The changes are happening rapidly, and keeping up is no longer just about new smartphones, new apps and new users, he added. "It's not just an education; it's a call to arms."