(This is part of a series of blog posts where we describe enhancements and changes planned for the upcoming release of our Collaborative Lifecycle Management solution, comprising Rational Team Concert, Rational Quality Manager, and Rational Requirements Composer.)
One of the major focus areas for our upcoming CLM solution is reporting. We’re working on a solution that enables you to report across the three Collaborative Lifecycle Management (CLM) applications: Change and Configuration Management (CCM), Quality Management (QM), and Requirements Management (RM). As we’ve talked about in earlier posts on jazz.net, for our next release we’re working on additional capabilities to link data between applications. Our new reporting feature lets you see this data in different ways from each of the applications.
Below are two cross-product reporting examples using data from our new sample application: “Money That Matters”. Learn more about CLM setup and installation, including how to create the sample application, by reading our Beta 3 announcement.
First, when a tester records execution results in QM, defects are created in CCM and linked to the execution result. The Execution and defects by owner report shows this cross-application relationship. This report is available in the QM application. To try it out yourself, click Reports > View Reports, then choose the Execution and defects by owner report (under Execution).
This second report shows requirements that are impacted by failed test cases, along with the associated defects. It shows traceability across all three CLM applications: requirements stored in RM are linked to test cases in QM, and defects stored in CCM are linked to the failed tests (more about the CLM link relationships here). To run this report from QM, click Reports > View Reports, then select Plan Requirements Defect Impact, grouped under Requirements.
This new cross-application reporting capability is implemented with a common data warehouse shared by the CLM applications. Data from each application is loaded into this shared data warehouse periodically through a process called Extract, Transform, and Load (ETL). ETLs are part of each CLM application and run automatically. The out-of-the-box reports in CCM and QM now use this common data warehouse, which is what enables cross-application reporting. The optional Rational Reporting for Development Intelligence (RRDI) server also uses this common data warehouse to provide additional sample reports and IBM Cognos-based tools for authoring custom reports. We'll talk more about RRDI in a future blog post.
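To make the ETL idea concrete, here is a minimal sketch of an extract-transform-load cycle in Python. Everything here is invented for illustration: the table name, columns, and sample records are hypothetical stand-ins, and an in-memory SQLite database plays the role of the shared warehouse (the real CLM ETLs are built into the applications and load a Derby or enterprise database).

```python
import sqlite3

# Hypothetical stand-in for the shared data warehouse (illustration only;
# the actual CLM warehouse schema and database are different).
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE dw_defects (id INTEGER, owner TEXT, state TEXT)")

def extract():
    # Extract: pull raw records from an application's repository.
    # Faked here with an in-memory list of CCM-like work items.
    return [{"id": 1, "owner": "deb", "state": "Open"},
            {"id": 2, "owner": "tanuj", "state": "Resolved"}]

def transform(records):
    # Transform: normalize values into the warehouse's shape.
    return [(r["id"], r["owner"], r["state"].lower()) for r in records]

def load(rows):
    # Load: insert the transformed rows into the shared warehouse.
    warehouse.executemany("INSERT INTO dw_defects VALUES (?, ?, ?)", rows)
    warehouse.commit()

# One ETL run; each CLM application runs a job like this periodically.
load(transform(extract()))
print(warehouse.execute("SELECT COUNT(*) FROM dw_defects").fetchone()[0])  # 2
```

Because every application loads into the same warehouse, a report can join data that originated in CCM, QM, and RM without talking to those applications directly.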
Rational Insight provides the same cross-application capability we've been discussing, built on the same common data warehouse. This enables an easy upgrade from CLM common reporting to Insight, which supports more complex deployment topologies, including multiple JTS servers and integration with other applications such as Rational ClearQuest, Rational RequisitePro, and Rational DOORS. The Insight team is developing a new version of Insight to support this upcoming CLM solution.
Getting started with CLM reporting is straightforward and integrated into the setup wizard: just follow the installation and setup instructions. The common data warehouse does need a separate database; by default, Derby is used, which lets you quickly kick the tires.
As part of our efforts to improve the quality of the CLM solution, we have two self-hosting efforts underway: jazz.net and SVT self-host. The teams developing the CLM solution use the latest milestone drivers of the CLM applications to develop and test. On jazz.net, we host all three CLM applications on a single server integrated with an RRDI server. On SVT self-host, we have a more complex topology with multiple JTS, CCM, QM, and RRC instances integrated with Insight. Below is an example of a custom report we've used during this release to identify defects impacting failed test points.
Reporting configuration details.
There are three wizard pages for setting up the data warehouse for reporting: one in the JTS section and one each for QM and CCM. The JTS page configures the common data warehouse connection. The image below shows the configuration with the default Derby database.
Then, at the bottom of that page, you configure the user that will run the ETLs. Be sure to note the username and password you enter here.
The CCM and QM applications each have a page in the setup wizard where you enter the same user ID and password you configured above on the JTS page. The database information is automatically set to the values you entered on the JTS page. I've included the CCM page as an example; the QM page looks exactly the same.
After you finish the setup wizard, you can go to the /jts/admin page to run the ETLs. Navigate to the reports menu on the admin page.
You'll notice that the user you entered in the setup wizard has been set for the ETL jobs on this page. To run the ETLs, click the Run all data warehouse collection jobs for all applications link. To see the status of the running jobs, click the Data Collection Job Status link in the menu on the left of the page. You'll see something like this:
Once the ETLs complete, all the data is available in the data warehouse for reports. By default, these ETLs run every day at midnight.
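Once the data is in the warehouse, a cross-application report like the requirements-impact report above essentially boils down to a join across the loaded tables. Here is a minimal sketch of that idea, again using an in-memory SQLite database and an invented mini-schema; the real warehouse schema and report definitions are different, and the actual reports are run from the CLM web UI, not SQL:

```python
import sqlite3

db = sqlite3.connect(":memory:")
# Invented mini-schema: one table per CLM application's loaded data,
# with link columns standing in for the CLM cross-application links.
db.executescript("""
CREATE TABLE dw_requirements (req_id INTEGER, title TEXT);
CREATE TABLE dw_testcases (tc_id INTEGER, req_id INTEGER, result TEXT);
CREATE TABLE dw_defects (defect_id INTEGER, tc_id INTEGER, summary TEXT);
INSERT INTO dw_requirements VALUES (10, 'Transfer funds');
INSERT INTO dw_testcases VALUES (20, 10, 'failed');
INSERT INTO dw_defects VALUES (30, 20, 'Transfer rejects valid amount');
""")

# Requirements impacted by failed test cases, plus the linked defects:
# the same RM -> QM -> CCM traceability the report walkthrough describes.
rows = db.execute("""
    SELECT r.title, t.tc_id, d.summary
    FROM dw_requirements r
    JOIN dw_testcases t ON t.req_id = r.req_id AND t.result = 'failed'
    JOIN dw_defects d ON d.tc_id = t.tc_id
""").fetchall()
print(rows)  # [('Transfer funds', 20, 'Transfer rejects valid amount')]
```

This is why the ETLs matter: once each application has loaded its data, traceability questions that span RM, QM, and CCM become simple joins inside one database.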