We demystified the automation puzzle. Relentless validation, tamed!

A large global provider of BI solutions has a product suite that runs on five platforms and supports thirteen languages, with each platform variant requiring multiple machines to deliver the BI solution. The entire multi-platform suite is released on a single CD multiple times a year.

The problem that stumped them was “how to automate the final-install validation of a multi-platform, distributed product”. They had automated the testing of the individual components using SilkTest, but they were challenged with “how to unify this and run it off a central console on various platforms at the same time”.

Considering each platform combination took about a day, this required approximately two months of final-installation build validation, and by the time they were done with one release, the next release was waiting! This was a relentless exercise that consumed significant QA bandwidth and time and left the team no room for more interesting or important work.

Senior management wanted single-push-button automation: identify which platform combination to schedule next, allocate machines automatically from the server farm, install and configure automatically, fire the appropriate Silk scripts, and monitor progress, thereby significantly reducing time and cost by lowering the QA bandwidth involved in this effort. After deep analysis, the in-house QA team decided this was a fairly complex automation puzzle that required a specialist! This is when we were brought in.
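
To make that push-button flow concrete, here is a minimal sketch of the loop in Java (the language the eventual console was built in). Every name in it — PlatformCombo, ServerFarm, and the install/fire/monitor helpers — is hypothetical, since the client's actual interfaces were never published:

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Minimal sketch of the "single push button" flow described above.
// All class and method names are hypothetical illustrations.
public class PushButtonRunner {

    // One platform/language combination awaiting validation.
    record PlatformCombo(String platform, String language) {}

    // Stand-in for the server farm: hands out and reclaims machines.
    static class ServerFarm {
        private final Queue<String> free = new ArrayDeque<>();
        ServerFarm(String... machines) { for (String m : machines) free.add(m); }
        String allocate() { return free.poll(); }   // null if farm exhausted
        void release(String machine) { free.add(machine); }
    }

    public static void main(String[] args) {
        Queue<PlatformCombo> schedule = new ArrayDeque<>();
        schedule.add(new PlatformCombo("Windows", "en"));
        schedule.add(new PlatformCombo("Solaris", "ja"));

        ServerFarm farm = new ServerFarm("qa-box-01", "qa-box-02");

        // The push button: drain the schedule, one combination per machine.
        while (!schedule.isEmpty()) {
            PlatformCombo combo = schedule.poll();
            String machine = farm.allocate();
            if (machine == null) break;             // a real system would wait and retry
            installAndConfigure(machine, combo);
            fireSilkScripts(machine, combo);
            monitor(machine);
            farm.release(machine);
        }
    }

    // Placeholders for the install/configure, Silk-script, and monitoring steps.
    static void installAndConfigure(String machine, PlatformCombo c) {
        System.out.printf("Installing %s/%s build on %s%n", c.platform(), c.language(), machine);
    }
    static void fireSilkScripts(String machine, PlatformCombo c) {
        System.out.printf("Firing Silk scripts for %s/%s on %s%n", c.platform(), c.language(), machine);
    }
    static void monitor(String machine) {
        System.out.printf("Monitoring progress on %s%n", machine);
    }
}
```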

After an intense deep-dive lasting about four weeks, we came up with a custom master-slave test infrastructure architecture that allowed a central console to schedule various jobs onto the slaves, using a custom-developed control & monitoring protocol. The solution was built using Java Swing, Perl, Expect, and adapters to handle Silk scripts. Some parts of the solution ran on Windows while others ran on UNIX. This custom infrastructure allowed for scheduling parallel test runs, automatically allocating machines from a server farm, installing the appropriate components on the appropriate machines, configuring them, and finally monitoring the progress of validation through a web console.
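
For flavor, here is a minimal sketch of what a line-based control & monitoring exchange between the console (master) and a slave might look like, again in Java. The wire verbs (RUN, STATUS, DONE) are assumptions for illustration; the actual custom protocol was proprietary:

```java
import java.io.*;
import java.net.*;

// Sketch of a master/slave control-and-monitoring exchange over a plain
// line-based TCP protocol. Message verbs (RUN, STATUS, DONE) are hypothetical.
public class ControlProtocolDemo {

    public static void main(String[] args) throws Exception {
        try (ServerSocket slave = new ServerSocket(0)) {   // slave-side listener
            int port = slave.getLocalPort();
            Thread slaveThread = new Thread(() -> serveOneJob(slave));
            slaveThread.start();

            // Master side: dispatch a job, then read status lines until DONE.
            try (Socket s = new Socket("localhost", port);
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(s.getInputStream()));
                 PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                out.println("RUN silk-suite-42 platform=Solaris lang=ja");
                String line;
                while ((line = in.readLine()) != null && !line.startsWith("DONE")) {
                    System.out.println("[master] " + line); // progress feeds the web console
                }
            }
            slaveThread.join();
        }
    }

    // Slave side: accept one connection, acknowledge the job, stream status.
    static void serveOneJob(ServerSocket listener) {
        try (Socket s = listener.accept();
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(s.getInputStream()));
             PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
            String job = in.readLine();                    // e.g. "RUN silk-suite-42 ..."
            out.println("STATUS accepted " + job);
            out.println("STATUS running install");
            out.println("STATUS running silk-scripts");
            out.println("DONE exit=0");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```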

This test infrastructure enabled a significant reduction in multi-platform configuration validation: the effort dropped from eight weeks to three. We enjoyed this work simply because it was boutique work fraught with quite a few challenges. We believe it was possible because we analyzed the challenging problem wearing a development hat rather than a functional test automation hat.
