Over the past five years, use of the term internet of things has grown at a rapid pace, and that growth reflects just how many devices now fall under the category. IoT devices share a few characteristics: network connectivity; a small operating system footprint; and a way to interact with the physical world via sensors, actuators, cameras and more. Those traits can make IoT testing and verification very tricky.
Due to the nature of how these devices are designed and operate, IoT testing isn't always as straightforward as testing standard software or even a typical hardware product might be. There are a few challenges to consider when establishing an IoT testing process:
- IoT devices have embedded systems.
- They may not have a user interface on the device itself.
- They may be running a custom OS, or a highly customized version of an existing OS -- typically some flavor of Linux.
- And they may rely on external physical or virtual interactions to operate.
Given these challenges, how can we effectively create an IoT testing process? Let's look at a well-known IoT device, the Nest Thermostat, for example. One way to test it would be to create simple representations of the HVAC devices it's designed to work with, along with some simple, software-based emulators that allow for more advanced control over the state of the mechanical systems. These items take up space and resources and aren't standard in a tester's toolkit; as a result, the testing process can be slow and mostly manual. The other way to test is where things get interesting: by using containers and microservices, we can create an IoT testing process that's more flexible and efficient.
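As a sketch of the software-emulator idea, here's what a minimal, hypothetical HVAC emulator might look like in Python. All names and the crude thermal model are invented for illustration; the point is that a controller can be exercised against mechanical state held entirely in software:

```python
# Hypothetical sketch of a software HVAC emulator: it tracks the mechanical
# state a thermostat-style controller would drive, so tests can manipulate
# and inspect that state without any physical hardware.
class HvacEmulator:
    def __init__(self, ambient_temp=18.0):
        self.ambient_temp = ambient_temp
        self.heating = False
        self.cooling = False

    def call_for_heat(self):
        self.heating, self.cooling = True, False

    def call_for_cool(self):
        self.heating, self.cooling = False, True

    def idle(self):
        self.heating = self.cooling = False

    def tick(self, minutes=1):
        # Crude thermal model: heating warms, cooling cools, idle drifts nowhere.
        if self.heating:
            self.ambient_temp += 0.1 * minutes
        elif self.cooling:
            self.ambient_temp -= 0.1 * minutes


# A trivial controller under test: hold the setpoint within a deadband.
def control_step(emulator, setpoint, deadband=0.5):
    if emulator.ambient_temp < setpoint - deadband:
        emulator.call_for_heat()
    elif emulator.ambient_temp > setpoint + deadband:
        emulator.call_for_cool()
    else:
        emulator.idle()


hvac = HvacEmulator(ambient_temp=18.0)
for _ in range(60):  # simulate one hour in one-minute ticks
    control_step(hvac, setpoint=21.0)
    hvac.tick()
print(round(hvac.ambient_temp, 1))
```

A test can then assert that the room settled near the setpoint, something no physical rig is needed for.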
Simply defined, a virtual machine takes an entire system image, stores it as a file and runs it virtually on another OS. Containers take this concept even further by stripping the system down to just what is needed, therefore reducing the necessary requirements. With the same resources you would use for a single virtual machine, you can run several containers. Beyond an IoT testing process, containers open up a lot more possibilities with other testing:
- Model-driven tests. By combining the software under test (SUT), a random-input program and a prediction program, you can create new prediction models with a minimal amount of overhead. Run the randomizer to provide input to the SUT, compare the results with the prediction and explore any differences.
- Automated end-to-end acceptance tests. The technical team can devise sample usage scenarios, simulate them and run them on every build, automatically, without manual intervention such as burning a chip or running a simulation. Tying these tests to your version-control system provides a way to identify issues and defects in seconds -- not minutes or hours -- and fix the offending code more quickly.
- Long-term memory and soak tests. Even though the physical devices may be cheap to build -- perhaps just a couple of dollars each -- running on a desktop, laptop or server lets you scale far beyond what physical devices would allow. And since the hardware itself is virtual, you can scale it, too: if your physical chip runs at 10 MHz, you can run the simulated clock at 100 times that rate, allowing you to complete a multiyear soak test in about a week.
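The model-driven pattern in the list above can be sketched in a few lines of Python. Everything here is hypothetical: the SUT is stood in for by a simple setpoint-clamping function, and the prediction program is an independent reimplementation written from the same spec:

```python
import random

# Hypothetical software under test: clamp a requested setpoint to a safe range.
def sut_clamp_setpoint(requested, low=10.0, high=32.0):
    return max(low, min(high, requested))

# Independent prediction program, written from the spec, not the implementation.
def predicted_clamp(requested, low=10.0, high=32.0):
    if requested < low:
        return low
    if requested > high:
        return high
    return requested

# The randomizer feeds both programs; any divergence is a defect to explore.
rng = random.Random(42)  # seeded for reproducible runs
mismatches = []
for _ in range(10_000):
    value = rng.uniform(-50.0, 100.0)
    if sut_clamp_setpoint(value) != predicted_clamp(value):
        mismatches.append(value)

print(len(mismatches))  # → 0
```

Because the inputs are seeded, a failing run can be replayed exactly, which makes exploring a divergence between SUT and model far easier.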
The role of microservices
Microservices, like containers, are small stand-alone instances. But containers simulate the whole system, while microservices are highly targeted packages of interactions -- full APIs, specific endpoints and more. The key is that they're entirely stand-alone and don't require any external resource to perform their duties. Once an interaction is packaged as a microservice, testing gets even easier, since there are a host of testing tools and strategies for microservices that the team can reuse. Combining microservices and containers opens up entirely new possibilities for an IoT testing process. You can mock, stub or simulate physical input and output, which allows you to use more traditional API testing strategies and makes everything much simpler.
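As a sketch of that mock-and-stub approach, consider a hypothetical driver that normally reads a physical temperature sensor. By stubbing the sensor reading with Python's `unittest.mock`, the decision logic becomes an ordinary unit test -- no hardware required. The function names and threshold here are invented for illustration:

```python
from unittest import mock

# Hypothetical hardware-facing reader; on the real device this would talk
# to an I2C or SPI temperature sensor.
def read_temperature_sensor():
    raise RuntimeError("no physical sensor in this environment")

# Decision logic under test: should the (hypothetical) fan relay engage?
def fan_should_run(read_sensor=read_temperature_sensor, threshold=27.0):
    return read_sensor() > threshold

# Stub the physical input and exercise the logic as a plain unit test.
hot_sensor = mock.Mock(return_value=30.1)
cool_sensor = mock.Mock(return_value=21.4)

print(fan_should_run(read_sensor=hot_sensor))   # → True
print(fan_should_run(read_sensor=cool_sensor))  # → False
```

The same pattern extends to actuators and network endpoints: replace the physical boundary with a stub, then apply standard API testing strategies to everything behind it.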
Putting it all together
By taking advantage of containers and microservices, we now have a way to simulate the entire ecosystem of an IoT device. No longer constrained by physical requirements, we can increase the scope of our IoT testing process without impacting other areas. Thus, the IoT testing team can deliver more often with higher quality and less risk. Human testers can focus on usability, user experience and "black swan" defects, including a true end-to-end experience, additional use cases and other neglected scenarios, while the computer handles the myriad combinations it can predict.
What used to be a room full of testing equipment and wires can now be shrunk to a single machine, whether it's a laptop or a Raspberry Pi. Physical location is no longer an issue, either; you can use those portable options to test wherever it makes the most sense -- a client site, a conference room or even the beach -- all of which would be much more difficult without containers and microservices.
Using these tools can improve testing efficiency, coverage and quality. By any metric, these are positives and should be considered if you're charged with creating an IoT testing process. Tools such as Docker or Kubernetes can help you get up and going with containers, and adopting a microservices architecture will have many more benefits beyond just testing.
As always, before jumping into a new platform or architecture, do a solid analysis of the pros and cons. Containers and microservices will save a lot of time, but be sure to factor in startup costs as well.