The FedEx Tour
October 20th, 2009 | Published in Google Testing
By Rajat Dewan
I appreciate James' offer to talk about how I have used the FedEx tour in Mobile Ads. Good timing too as I just found two more priority 0 bugs with the automation that the FedEx tour inspired! It was fun presenting this at STAR and I am pleased so many people attended.
Mobile has been a hard problem space for testing: a humongous browser/phone/capability combination that changes fast as the underlying technology evolves. Add to this poor tool support for the mobile platform and rapid device churn, and you'll understand why I am so interested in advice on how to do better test design. We've literally tried everything, from checking screenshots of Google's properties on mobile phones to treating the phone like a collection of client apps and automating them in the traditional UI button-clicking way.
Soon after James joined Google in May 2009, he started introducing the concept of tours, essentially making a case for "structured" exploratory testing. Tours presented a radically new way for me to look at the testing problem. Traditionally, the strategy is simple: focus on end-user interaction and verify the expected outputs from the system under test. Tours (at least for me) change this formula. They force the tester to focus on what the software does, isolating the different moving parts of the software in execution and isolating the different parts of the software at the component (and composition) level. Tours tell me to focus on testing the parts that drive the car, rather than on whether or not the car drives. This is somewhat counterintuitive, I admit; that's why it is so important. The real value of the tours comes from the fact that they guide me in testing those different parts and help me analyze how different capabilities interoperate. Cars will always drive you off the lot; which part will break first is the real question.
I think testing a car is a good analogy. As a system it's devilishly complicated, hard to automate, and hard to find the right combination of factors to make fail. However, testing the dashboard can be automated; so can testing the flow of gasoline from the fuel tank to the engine and from there to the exhaust; so can lots of other capabilities. These automated point solutions can also be combined to test a bigger piece of the whole system. It's exactly what a mechanic does when trying to diagnose a problem: he employs different strategies for testing and checking each mechanical subsystem.
At STAR West, I spoke about evolving a good test strategy with the help of tours, specifically the FedEx tour. Briefly, the FedEx tour talks about tracking the movement of data and how it gets consumed and transformed by the system. It focuses on a very specific moving part, and as it turns out a crucial one for mobile.
James' FedEx tour tells me to identify and track data through my system. Identifying it is the easy part: the data comes from the Ads Database and is basically the information a user sees when the ad is rendered. When I followed it through the system, I noted three (and only three) places where the data is used (either manipulated or rendered for display). I found this to be true for all 10 local versions of the Mobile Ads application. The eureka moment for me was realizing that if I validated the data at those three points, I had little else to do in order to verify any specific localized version of an ad. Add all the languages you want, I'll be ready!
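The idea above can be sketched in code. This is a minimal, hypothetical illustration of the FedEx-tour pattern — tag a piece of data and verify it at every point where the system touches it; the function names, ad fields, and three-stage pipeline are invented for illustration, not the real Mobile Ads internals:

```python
# Hedged sketch of the FedEx tour: follow one ad record through three
# (invented) inflection points and verify it at each one. All names and
# data shapes here are illustrative assumptions.

def fetch_from_ads_db(ad_id):
    """Stand-in for the Ads Database lookup (inflection point 1)."""
    return {"ad_id": ad_id,
            "headline": "Pizza near you",
            "phone": "+1-650-555-0100"}

def localize(ad, locale):
    """Stand-in for the locale-specific transformation (inflection point 2)."""
    ad = dict(ad)          # never mutate upstream data
    ad["locale"] = locale
    return ad

def render(ad):
    """Stand-in for the final render-for-display step (inflection point 3)."""
    return f'{ad["headline"]} ({ad["phone"]})'

def verify(ad, stage):
    """Verification module hooked at an inflection point; a failure here
    pinpoints the stage that corrupted the data."""
    assert ad.get("ad_id"), f"{stage}: lost ad id"
    assert ad.get("headline"), f"{stage}: empty headline"
    assert ad.get("phone"), f"{stage}: empty phone number"

def run(ad_id, locale):
    ad = fetch_from_ads_db(ad_id)
    verify(ad, "db")
    ad = localize(ad, locale)
    verify(ad, "localize")
    output = render(ad)
    assert ad["headline"] in output, "render: headline dropped"
    return output
```

Because the checks live at the inflection points rather than at the end, a new language version only needs its locale-specific data to flow through the same three gates.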
I was able to hook verification modules at each one of these three data inflection points. This basically meant validating data for the new Click-to-Call ad parameters and the locale-specific phone number format. I was tracking how the code affects the data at each stage, which also localizes a bug better than conventional means: I knew exactly where the failure was! To overcome the location dependency, I mocked the GPS location parameters of the phone. As soon as I finished the automation, I ran each ad in our database through each of the language versions, verifying the integrity of the data. The only thing left was to visually verify the rendering of the ads on the three platforms, reducing the manual tests to three (one each for Android, iPhone, and Palm Pre).
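Two pieces of that setup can be sketched concretely: a locale-specific phone format check and a mocked GPS location. The patterns and class names below are invented for illustration; the real Click-to-Call validation rules are not public:

```python
# Hedged sketch: (1) validate phone number format per locale, (2) mock the
# phone's GPS so tests don't depend on where they physically run.
# Patterns here are illustrative assumptions, not the real locale rules.

import re

# Illustrative per-locale phone patterns (two locales as examples).
PHONE_PATTERNS = {
    "en_US": re.compile(r"^\+1-\d{3}-\d{3}-\d{4}$"),
    "de_DE": re.compile(r"^\+49-\d{2,4}-\d{5,8}$"),
}

def valid_phone(number, locale):
    """True if the number matches the locale's expected format."""
    pattern = PHONE_PATTERNS.get(locale)
    return bool(pattern and pattern.match(number))

class MockGps:
    """Stand-in for the device's location service, so ad targeting can be
    tested against any fixed location."""
    def __init__(self, lat, lng):
        self.lat, self.lng = lat, lng

    def current_location(self):
        return (self.lat, self.lng)
```

With the location mocked, the same automated run can exercise every locale's ads from a single test machine.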
The FedEx tour guided me to build a succinct piece of automation, turning what could have been a huge and error-prone manual test into a reusable piece of automation that will find and localize bugs quickly. We're now looking at applying the FedEx tour across ads and in other client and cloud areas of the company. Hopefully there will be more experience reports from others who have found it useful.
Exploratory Testing ... it's not just for manual testers anymore!