Software Testing

Let's talk about robots... by Taylor Stapleton

Ok. So, I can't tell you how many times I have walked into a Walmart or a McDonald's just knowing that the caliber of employee about to help me is going to be terrible. In fact, I often wish that a lot of these service jobs would just be replaced by robots. Sleek, friendly, reliable robots. Well, maybe some of you out there would love to have a robot as a helping hand around the office, or for testing as well. It turns out this dream can actually be a reality (for an enormous sum of money, I'm sure). This week I got to see a presentation about one such company that makes some very specialized robots just for software testing.

OptoFidelity is a company that makes robots for the purpose of testing mobile phone applications. Essentially, short of an actual human, an OptoFidelity robot is the best you can get for running end-to-end manual tests on a device. In fact, the presentation showed several ways these robots totally exceed a human's capabilities. They come in many different form factors with different capabilities, but among the things their robots can do are:

  • Mimic any kind of gesture, including different pressure levels and multiple fingers for pinching and zooming.
  • Hover a finger at a very precise distance from the screen.
  • Perform perfect stylus testing, getting the stylus angle and pressure exactly right.
  • Measure the device's output (Bluetooth, NFC, Wi-Fi) with terrific accuracy using an array of sensors.
  • Capture the screen with high-accuracy cameras.
  • Physically reboot your device, something almost no other form of testing can do.

One of the things that really impressed me is that they have found things to test on a phone that were way outside the scope of what I had imagined. They have a high-speed camera watching the phone screen that is accurate enough to precisely measure its frame rate at all times. So, for instance, the robot finger can touch the screen, and the high-speed camera will measure exactly how fast and smooth the triggered animation is. As part of your continuous integration build, you could have a step that ships the build off to the robot-controlled phone. Once it's installed, the robot would do a quick check to see whether your application is still as visually fast as it used to be, and if it's too slow, your code submission fails. WOW! I WANT THAT!
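Just for fun, here is a minimal sketch of what that CI step could look like. Everything in it is an assumption on my part: the `RobotPhone` class, its methods, and the 55 fps threshold are names and numbers I invented for illustration, not OptoFidelity's actual API.

```python
# Hypothetical CI gate: fail the build when the robot-measured frame
# rate of an animation drops below a threshold. All names are invented.
import sys

FPS_THRESHOLD = 55.0  # assumed minimum acceptable average frame rate


class RobotPhone:
    """Placeholder for a robot-controlled test device."""

    def install_build(self, build_path: str) -> None:
        ...  # flash the build under test onto the phone

    def tap(self, x: int, y: int) -> None:
        ...  # robot finger touches the screen to trigger the animation

    def measure_fps(self, seconds: float) -> float:
        # The high-speed camera would watch the screen and report the
        # observed frame rate; stubbed here so the sketch runs end to end.
        return 60.0


def ci_gate(build_path: str) -> int:
    phone = RobotPhone()
    phone.install_build(build_path)
    phone.tap(x=540, y=1600)
    fps = phone.measure_fps(seconds=2.0)
    print(f"Measured animation frame rate: {fps:.1f} fps")
    return 0 if fps >= FPS_THRESHOLD else 1  # non-zero exit fails the CI step


if __name__ == "__main__":
    sys.exit(ci_gate(sys.argv[1]))
```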

Apparently, as one of their features, they also have the ability to do screen diff-ing and optical character recognition through the camera. Screen diff-ing through a camera, though, sounds like an insanely hard computer science problem to overcome. So the questions I developed whilst watching this presentation are these:

  • How does one define a test suite for a robot? Can it be abstracted to the point where I can write a script for the robot that just calls methods like "SwipeUpFromBottom" or "PressTheGreenButton"? (I sketch a guess right after this list.)
  • Can you provide an interface for importing the video from the camera into my own software space, so I can run my own image recognition software on the stream? (A second sketch below takes a stab at this one.)
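On the first question, here is my guess at how such an abstraction could work: high-level gesture verbs compiled down to raw finger motions. The `Robot` driver below is a stand-in I invented, not anything from the actual product.

```python
# A guess at an abstracted robot test script: high-level gesture verbs
# built on top of low-level finger motions. All names here are hypothetical.


class Robot:
    """Placeholder low-level driver: positions the physical finger."""

    def move_finger(self, x: int, y: int) -> None:
        ...  # move the fingertip to screen coordinates (x, y)

    def touch_down(self) -> None:
        ...  # lower the finger until it contacts the screen

    def touch_up(self) -> None:
        ...  # lift the finger off the screen


class GestureScript:
    """High-level verbs like the ones in my question, built from motions."""

    def __init__(self, robot: Robot, width: int, height: int) -> None:
        self.robot, self.width, self.height = robot, width, height

    def swipe_up_from_bottom(self) -> None:
        # Start just above the bottom edge, drag to mid-screen, release.
        self.robot.move_finger(self.width // 2, self.height - 10)
        self.robot.touch_down()
        self.robot.move_finger(self.width // 2, self.height // 2)
        self.robot.touch_up()


# A test case would then read almost like the question imagines:
script = GestureScript(Robot(), width=1080, height=1920)
script.swipe_up_from_bottom()
```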
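And on the second question, if the answer were simply "here is a video stream," consuming it could be as easy as this OpenCV sketch: grab frames and flag how much the screen changed between them. The stream URL is made up, and a real system would need to handle alignment, glare, and perspective before doing any serious diff-ing; swap in whatever image recognition you like.

```python
# Minimal sketch: pull frames from a (hypothetical) camera stream and
# report when consecutive frames differ noticeably.
import cv2  # pip install opencv-python

cap = cv2.VideoCapture("rtsp://robot-lab.example.com/phone-camera")  # made-up URL
ok, prev = cap.read()
while ok:
    ok, frame = cap.read()
    if not ok:
        break
    # Naive screen diff: mean absolute pixel difference in grayscale.
    diff = cv2.absdiff(cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    if diff.mean() > 10:  # arbitrary threshold for "something changed"
        print("Screen changed noticeably in this frame")
    prev = frame
cap.release()
```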