Success with System Verification Test (SVT) Automation

Back when the System Verification Test (SVT) team started thinking about Collaborative Lifecycle Management (CLM) 6.0, they put some automation goals in place.  Having identified the long time spent setting up configurations for scenario testing as a key bottleneck, the first goal was to build automated scripts that speed up configuration, so that manual testers could start testing sooner and the test cycle time would shrink.  The second automation goal was aimed more conventionally at uncovering defects in code changes and increasing automated test coverage.  This work focused on adding automation to cover PLE scenarios and on enhancing existing scenarios to include new features like configuration management.

Prior to 2014, the System Test focus was on deploying configurations in an automated way.  At the beginning of 2014, System Test had one and a half automation engineers who were automating the out-of-the-box customer experience, the Money that Matters scenario.  While the goal of mostly automating that scenario was achieved by the end of the year, the process was painful and slow.  In light of the DevOps transformation, at the end of 2014 SVT decided to invest more in automation and added three more engineers to the effort.  That wasn’t an easy decision, but it was a necessary one, and it meant taking on risk, since the move reduced the manual test coverage the team could provide.

2015 has been a great year: the five engineers working on automation have already commissioned seven automated scenarios.  In addition to the automated configurations, these have focused on data generation, to help manual testers and shorten the test cycle.  There is good communication with the development teams.  An automation team is of limited use without the right support from development: concerns and defects that are raised need to be followed up quickly.  The close relationships with development allow the System Test team to cover the cutting-edge use cases and respond quickly to defects found.  Turnaround time is now typically within a day.

The teams have come from a time when there was little support for automated testing to today, with a fully automated set of tests.  They didn’t have a BVT (Build Verification Test), or rather it was manual; each team would blindly take the build and hope it installed.  They moved to automation where the build is created and picked up, JUnit tests are run, and the build is then deployed to the golden topologies, using different app servers and different tests, all designed according to how customers use the products.  The team is also using its own tools: Rational Quality Manager (RQM) for test execution and reporting, and Rational Team Concert (RTC) for test development.  They have gone from no automation to a large number of automated tests executed daily against both maintenance and new release streams.
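The build-to-test flow described above can be sketched as a simple orchestration script.  This is only an illustrative outline: the stage names, topology names, and helper functions here are hypothetical assumptions, not the team's actual Rational tooling.

```python
# Hypothetical sketch of the pipeline described above: build picked up,
# unit tests run, then deployment and scenario tests on each golden topology.
# All names below are illustrative assumptions, not the real tooling.

GOLDEN_TOPOLOGIES = ["distributed-appserver", "single-server"]  # assumed names

def run_unit_tests(build_id):
    # Placeholder for running the JUnit suites against the new build.
    return {"stage": "unit-tests", "build": build_id, "passed": True}

def deploy(build_id, topology):
    # Placeholder for deploying the build to one golden topology.
    return {"stage": "deploy", "build": build_id, "topology": topology}

def run_scenario_tests(build_id, topology):
    # Placeholder for customer-style scenario tests on a deployed topology.
    return {"stage": "scenarios", "build": build_id,
            "topology": topology, "passed": True}

def pipeline(build_id):
    """Unit tests first; only a passing build reaches the topologies."""
    results = [run_unit_tests(build_id)]
    if not results[0]["passed"]:
        return results  # fail fast: do not deploy a broken build
    for topology in GOLDEN_TOPOLOGIES:
        deploy(build_id, topology)
        results.append(run_scenario_tests(build_id, topology))
    return results

if __name__ == "__main__":
    for r in pipeline("build-20150601"):
        print(r["stage"], r.get("topology", "-"),
              "passed" if r.get("passed", True) else "failed")
```

The fail-fast check mirrors the point of a BVT: teams no longer "blindly take the build", because nothing is deployed to a topology until the build has passed its unit-test gate.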

This has been worth the investment, but it has not been easy.  For one thing, there have been plan changes along the way: trying to develop new automation while product plans are changing makes it difficult to ensure the teams are covering the right user experience.  Also, new members of the automation team need to be trained not only on developing the automation, but also on the framework for the automated tests.

There is a process in place to get the automation engineers up to speed quickly, but probably the most important part is working together as a team and starting simple before moving to the more complex areas, like the framework.  There is a review process so that everyone learns from one another; if there is a problem, the reviewer identifies the area in need of improvement and also suggests how to solve it.

Given the good mix of manual testers and, now, test automation developers, it’s important to work together for the least risk and the most coverage.  This has paid off: the team is building automation that helps manual testers find defects quickly, catches regressions in existing code, and finds defects in new features.  A significant number of defects have been found prior to GA so far, and the majority of them have been of critical or higher severity.  Just knowing that those defects won’t be released is a huge confidence builder for the teams.

The System Test team will continue to invest in automation, making use of the automation community that has been built up out of these efforts.  The community is geographically distributed, works under different leadership, and collaborates closely with development.  That openness and transparency allow the development teams to execute newly created automated tests against their code changes prior to delivery.  The increased confidence in quality has been, and continues to be, worth the investment.

Beth Zukowsky
Program Director and Rational DevOps Protagonist