How a Robot That Plays Angry Birds Can Aid Development and Quality Assurance

This story starts in an interesting place: the quest to build a robot that plays Angry Birds. But it ends in a place very familiar to anyone who has ever worked in product testing: a robot that does the tedious job of pressing every button combination on a physical UI. Recently, I spoke with Jason Huggins, co-founder of application testing cloud provider Sauce Labs, about a robot that will no doubt transform the way quality assurance for mobile apps works in the future.

In the world of mobile apps, most DevOps teams hang their hats on the tried-and-true practice of using virtual machines to test their applications across numerous devices. When that doesn’t go quite far enough, they load the app onto multiple physical devices (or soon-to-be-released developer prototypes) to watch it in action before it is released into the market.

Huggins retells the story of the evolution of this robot, named Appibot, with a bit of hacker-culture nostalgia about how a great idea sometimes starts with “wouldn’t that be cool” and then snowballs into a project with a momentum all its own. The quest started with a script that could play Google’s Pac-Man anniversary doodle in the browser, but it eventually grew into a larger question about how testers interact with mobile devices.

It crested when Huggins prototyped a robotic finger that could track the colorful birds in Angry Birds and play the game—albeit with a script running the show and not the robot itself—but this gave rise to the idea that a robotic finger could go through the motions on any mobile application and operate the software like a person would. As a result, a robot finger could test not just the software but its current state, capable of doing the same task ad infinitum without tiring or getting bored.
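The article doesn’t describe Huggins’s script, but the script-plus-finger approach can be sketched in miniature: a “brain” scans a screenshot for the bird’s color and tells the finger where to tap. Everything below is a hypothetical illustration—`find_bird` and the color threshold are my own stand-ins, not Sauce Labs code.

```python
def find_bird(pixels, is_bird_color):
    """Return the (x, y) centroid of pixels matching the target color.

    `pixels` is a 2D list of (r, g, b) tuples representing a screenshot;
    `is_bird_color` is a predicate deciding if a pixel belongs to the bird.
    """
    xs, ys = [], []
    for y, row in enumerate(pixels):
        for x, rgb in enumerate(row):
            if is_bird_color(rgb):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # bird not on screen
    return (sum(xs) // len(xs), sum(ys) // len(ys))

def is_red(rgb):
    # Crude threshold for a red bird; a real tracker would be more robust.
    r, g, b = rgb
    return r > 200 and g < 80 and b < 80

# A tiny 3x3 "screenshot" with a single red pixel in the center.
frame = [[(0, 0, 0)] * 3 for _ in range(3)]
frame[1][1] = (255, 30, 30)
print(find_bird(frame, is_red))  # -> (1, 1): where the finger should tap
```

In the real prototype the resulting coordinate would be handed to the robotic finger’s motion controller; here it is simply printed.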

Robot testing mobile software? This isn’t a Roomba or a robotic welder

Appibot functions by acting as an actual finger, triggering UI events on screen by touch, so the test runs through the device itself rather than through a virtualized instance. There are several immediate uses for this in pre-production, and a big one for DevOps in post-production.

In pre-production, there’s always the chance that a particular mobile device’s hardware has some sort of glitch or oddity (perhaps one relegated to a particular model), and a finger-robot executing every possible permutation of actions on an app might just suss that out by tediously going through the motions until it triggers the issue where testers can see it happen.
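Enumerating “every possible permutation” is straightforward to script. As a minimal, hypothetical sketch (the button names and `tap_sequences` helper are illustrative only), the standard library’s `itertools.product` can generate every ordered tap sequence up to a chosen depth for the robot to replay:

```python
from itertools import product

def tap_sequences(buttons, depth):
    """Yield every ordered sequence of button taps, from 1 up to `depth` taps."""
    for n in range(1, depth + 1):
        yield from product(buttons, repeat=n)

buttons = ["play", "pause", "menu"]
seqs = list(tap_sequences(buttons, 2))
# 3 single taps + 9 two-tap combinations = 12 sequences
print(len(seqs))  # -> 12
```

The catch, of course, is combinatorial growth: sequences multiply by the number of buttons at each added depth, which is exactly why the job suits a tireless robot rather than a human tester.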

At the post-production level, there’s already a giant market for virtualized devices that run tests on the product every few minutes, 24/7, to act as a bellwether when something has gone wrong in the latest build or in one of its services. Huggins explains that as mobile devices become more and more connected to external devices (an iPhone unlocking a Zipcar, say, or an Android device remote-controlling a smart TV), it becomes much harder to simulate that physical activity in a virtual space. For this, manual testers are usually employed, but a robot could exhaustively and tediously run through the same motions a manual tester would—and without the risk of carpal tunnel syndrome.
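The bellwether idea reduces to a simple comparison: run the same scripted checks against each build and flag the first build where a previously passing check starts failing. This is a hypothetical sketch of that logic—`find_regression` and the build data are my own illustration, not any vendor’s API:

```python
def find_regression(build_results):
    """Flag the first check that goes from passing to failing across builds.

    `build_results` is a chronologically ordered list of
    (build_id, {check_name: passed}) pairs. Returns (build_id, check_name)
    for the first newly failing check, or None if nothing regressed.
    """
    last = {}
    for build_id, results in build_results:
        for name, passed in results.items():
            if not passed and last.get(name, False):
                return (build_id, name)
        last = results
    return None

builds = [
    ("build-41", {"login": True, "checkout": True}),
    ("build-42", {"login": False, "checkout": True}),
]
print(find_regression(builds))  # -> ('build-42', 'login')
```

In the robot’s case, each `passed` value would come from Appibot physically driving the device and checking the on-screen result, rather than from a virtualized test run.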

Currently, the cloud and virtualization are the best pre- and post-production testing methods DevOps has for completing the testing cycle—but as with everything, virtual can only go so far in an analog world where the rubber meets the road; or, in this case, where the finger meets the screen.

Huggins sees this robot as a necessary element of the development and production continuum, one that brings that last-mile, on-device testing solution much closer to developers and gives operations a better understanding of how their app behaves under real-world conditions. He also hopes that this robot might become the newest desktop killer app—sitting in the QA department, next to the computers, robotic finger poised to test the newest app on the newest hardware.

Maker and hacker-culture at its best

From my discussion with Huggins, I saw the best of hacker culture in how Sauce Labs came to see the Angry Birds-playing robot as a potential killer app in the DevOps ecosystem. Here is a literal “thinking outside the box” solution to something that is becoming one of mobile software’s biggest concerns: how do consumer apps function in the real world?

Most mobile developers have drawers stacked high with devices—Samsung models, Kindles, iPads, and the list stretches on and on—but they rely heavily on virtual machine testing and on data coming in from operations telling them what’s happening to the app once it goes out the front door. Bringing on-device testing back in house usually means employing beta testers, who can get a lot of work done simply by being people who use the device all day; but it would hardly be humane to have them sit in a cubicle for eight hours pressing the same sequences over and over.

“I’ve spent a long time wondering: What is it about humans that make them better at mobile testing?” Huggins wrote in a blog post about this subject. “It’s three things, really: brains, fingers, and eyes. With a brain, a finger, and an eye ball, you can pretty much test anything, anywhere, on any device – they are the universal testing API.”

Seeing this, Huggins threw together two common allies—software scripting and robotics; i.e., a brain coupled with a finger and an eyeball—and produced the prototype he is now using to demonstrate the product. It’s still primitive and has a long way to go, but something small enough to sit on a desk, capable of being scripted to perform a particular operation and verify the results, could be the next stage in product testing.

Huggins says that Sauce Labs intends to go to conferences and even provide workshops on how to program and apply the robot. I can almost see excited developers walking away from a workshop, carrying a box with a robot and a DVD in it, prepared to take the first step in adding a new robot to our ever-expanding culture of automation—and perhaps change thinking about mobile app testing along the way.

About Kyt Dotson

Kyt Dotson is a Senior Editor at SiliconAngle, covering beats that include DevOps, security, gaming, and cutting-edge technology. Before joining SiliconAngle, Kyt worked as a software engineer, starting at Motorola in QA and eventually settling at Pets911.com, where he helped build a vast database for pet adoption and a lost-and-found system. Kyt is a published author who writes science fiction and fantasy works that incorporate ideas from modern-day technological innovation and explore the outcome of living with those technologies.