

Jazz Reporting Service 6.0.2

Product Release / Trial | April 25, 2016
This is not the most recent version. We recommend Jazz Reporting Service 7.0.2SR1. This version is made available for archival purposes and may contain bugs and/or security vulnerabilities. If you do download this version, it is provided AS IS, without warranties of any kind, including the implied warranties of merchantability and fitness for a particular purpose. We strongly advise you to review the support pages for this version and update the product or take the actions recommended there. Security bulletins contain instructions for the security vulnerabilities they address and may require upgrading to a newer version. For security vulnerability announcements, see the IBM PSIRT blog.

Jazz Reporting Service 6.0.2 New & Noteworthy

Jazz Reporting Service is a new offering that provides an alternative to Rational Insight. It is not based on Cognos, and it includes the components described in the sections below.

Jazz Reporting Service is an optional program for Collaborative Lifecycle Management.

Jazz Reporting Service is an integral part of the Rational solution for Collaborative Lifecycle Management (CLM). For new and noteworthy information about other CLM applications, see these pages:

Report Builder

Report Builder is a key component of the Jazz Reporting Service, which you can use to quickly and easily consolidate data from across your tools and project areas. Depending on the applications that you use, and the type of information that you want to report on, you can choose to report on data that is collected in a data warehouse or that is indexed by Lifecycle Query Engine (LQE).

You can use the ready-to-use and ready-to-copy reports to quickly generate and share information about your projects, or you can create new reports. Your reports can show trends in project or team progress and quality throughout a release, sprint, or a specific time range. Traceability reports can show how artifacts are linked across lifecycle tools. If you use the configuration management features in the RM and QM applications, you can use Report Builder and Lifecycle Query Engine (LQE) to report on data in the configurations. You can export the report data in different formats, including Microsoft Excel spreadsheets or a template that you can use with Rational Publishing Engine to generate document-style reports. You can also export graphs as images for use in documents.

With Report Builder, it is easy to set up reports and integrate them with the CLM dashboards through the widget catalog. The Report Builder widget catalog includes the ready-to-use reports that cut across testing, requirements management, and change management. You can also share reports that you create.

Report creation

General user interface enhancements

Selecting a data source when importing reports

When report administrators import a set of Report Builder reports from a different server, they can now select which data source to import to: the data warehouse or LQE (Lifecycle Query Engine). If the reports are not compatible with the selected data source, an error is shown. This function is useful when several data sources of the same type are available. By default, LQE offers two data sources: one for regular projects and another for projects that use configurations.

Duplicating a report while editing

When editing an existing report, you can now copy the report contents into a separate report, in the same way as when viewing a report or starting one from the Use page. If the report has already been saved, a Duplicate button appears beside the Save button in the upper right.

Learn page

Report Builder now has a central hub for documentation, articles, and videos about the Jazz Reporting Service. Click the new Learn button on the left sidebar to see useful links for common areas of concern. The page also provides quick links to the forum and other library articles. It's a quick way to get started learning about the product.

Report creation (all data sources)

Creating "My" reports for the logged-in user

You can now add a condition to user-based fields (owner, creator, and others) and specify that it should apply to the current user. When adding a condition to a user field, use the new Current User radio button. Then, when the report is shown on the dashboard, the user who is logged in is used for the condition. The result is that you can now create reports such as "My Open Defects" that show different results, depending on who is viewing the report.
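Conceptually, the Current User option stores a placeholder instead of a fixed name, and the placeholder is resolved against whoever is viewing the dashboard when the report runs. A minimal Python sketch of that idea (hypothetical names, not Report Builder's actual implementation):

```python
from dataclasses import dataclass

@dataclass
class WorkItem:
    summary: str
    owner: str
    status: str

# Sentinel meaning "resolve the owner condition at view time",
# standing in for the Current User radio button.
CURRENT_USER = object()

def run_report(items, owner_condition, viewing_user):
    # Substitute the viewer for the placeholder before filtering.
    owner = viewing_user if owner_condition is CURRENT_USER else owner_condition
    return [i for i in items if i.owner == owner and i.status == "Open"]

items = [
    WorkItem("Fix login", "alice", "Open"),
    WorkItem("Update docs", "bob", "Open"),
    WorkItem("Refactor parser", "alice", "Closed"),
]

# The same saved "My Open Defects" report shows different results per viewer.
print([i.summary for i in run_report(items, CURRENT_USER, "alice")])  # ['Fix login']
print([i.summary for i in run_report(items, CURRENT_USER, "bob")])    # ['Update docs']
```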

Adding your own axis labels for graphs

When creating a graph visualization, you might want to change the description of the X and Y axes to clarify what the data means. You can now do this in the graph configuration pane of the Format Results tab. Enter your text beside the X and Y axis selections to create your own heading for each axis.

Reporting on timesheet information for work items

You can easily create timesheet reports by creating a traceability link from Work Item to Time Entry.

Then add conditions for Time Entry in the Conditions pane for a particular date range, day of the week, or month.

Tracing multiple relationships between the same source and target

Now you can report on multiple relationships between the same source and target. For instance, to report on all "Contributes to" and "Duplicate of" relationships and exclude other relationships, multi-select these relationships to create a traceability report. The only caveat is that a multiple-relationship path is still not possible when the relationships point to different target artifacts.

Then the traceability path lists the relationships in the link between the source and the target, separated by commas.
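The comma-separated grouping can be illustrated with a small sketch, assuming links are (source, relationship, target) tuples (identifiers and relationship names are illustrative only):

```python
# Each link between two artifacts carries one relationship type.
links = [
    ("WI-1", "Contributes To", "WI-9"),
    ("WI-1", "Duplicate Of", "WI-9"),
    ("WI-1", "Blocks", "WI-9"),
    ("WI-2", "Contributes To", "WI-9"),
]

# The relationship types the user multi-selected for the report.
selected = {"Contributes To", "Duplicate Of"}

def traceability_rows(links, selected):
    rows = {}
    for src, rel, tgt in links:
        if rel in selected:
            rows.setdefault((src, tgt), []).append(rel)
    # Multiple relationships between the same source and target
    # collapse into one row, separated by commas.
    return {pair: ", ".join(rels) for pair, rels in rows.items()}

print(traceability_rows(links, selected))
```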

Report creation (data warehouse data source)

Displaying current data in work item trend totals and burndown and burnup reports

Historical trend reports in the Report Builder are created from metrics data gathered by the data mart fact collection jobs in the Data Collection Component (DCC). These jobs are typically run once a day, usually overnight when server activity is low. When looking at trends for work item totals, the trends for the current day are based on the fact collection job that ran overnight. Consequently, data for the current day can be stale if many updates (creations, resolutions, and so on) occurred in the live repository since the totals were gathered.

Now, for all burn-down and burn-up ready-to-use reports and the Work Item Totals (live current data) historical trend, the results include the latest current data collected by DCC for the current day (from the operational data store, ODS). Although the data is not exactly live, it is current as of the last synchronization of the DCC ODS data collection jobs. For some deployments you can tune the synchronization to within the last 5 or 10 minutes.

You'll see two Totals trends in the list. The other trend, named simply Totals, behaves as before: its data comes from the last overnight data mart job. Totals was retained so that you can migrate old reports manually, and in case you need to keep all the trend data consistent at a given point in time. Also, the current-data option might incur a small performance penalty, because it requires an extra query to retrieve the latest metric information.
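The idea behind the live-current-data trend, appending the latest ODS total as today's data point on top of the overnight series, can be sketched as follows (dates and totals are made up, not actual DCC behavior):

```python
from datetime import date

# Overnight data mart trend: one total per day, gathered by the nightly job.
overnight_trend = {
    date(2016, 4, 20): 40,
    date(2016, 4, 21): 38,
}

# Latest total from the operational data store (ODS), current as of
# the last DCC synchronization rather than the last nightly job.
ods_current_total = 35
today = date(2016, 4, 22)

def trend_with_current(overnight, current_total, today):
    merged = dict(overnight)
    merged[today] = current_total  # append (or overwrite) today's point
    return sorted(merged.items())

print(trend_with_current(overnight_trend, ods_current_total, today))
```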

Including test environment information for test execution

When creating test execution records during a test cycle, a test lead assigns someone to create a set of test environments that represent supported deployment scenarios and then run the same test case against each of the test environments. For instance, you might require your web application to support the Firefox, Chrome, and Internet Explorer browsers. Each browser configuration could be represented by a different test environment. Now you can report on test execution records with conditions for the associated test environment. For example, you might want to create a report of all test results that were run against one or more of the supported browsers.

Then you can view the particular test environment attribute in the report in one of the columns.

Reporting on requirement collections and modules

You can now report on requirement collections or modules and create a traceability path (such as Collection --> Requirement).

Using this capability, you can add conditions to a collection or module to isolate particular artifact types, such as a Use Case Specification.

Report creation (Lifecycle Query Engine data source)

External URIs for custom properties when using LQE

Report Builder now supports external URIs for custom properties in LQE-based reports. In tools such as Rational DOORS Next Generation (the RM application), users can specify an external URI for custom properties. Without an external URI, if DNG has multiple project areas and an identical custom property is created in each project, the relationships list and the attribute list display multiple copies of that property, each followed by the name of its project area. An LQE-based report that uses such custom properties is effectively limited to artifacts in the corresponding project area.

But with the same external URI specified in every DNG project for an equivalent custom property, Report Builder now shows a single property name. When you select that property, the resulting report includes artifacts from all of the projects.

Creating timesheet reports

Timesheet information is a core part of the Rational Team Concert information available for work items. Users can track the time they spend each day on each work item. Using the LQE data source, you can now create a traceability report on time spent.

After selecting Work Item as the primary artifact, you can navigate to the Traceability links section and then pick Time entries.

This choice creates a traceability path from work items to time entries, and then you can add conditions or columns based on the attributes of the Time Entry field.

Creating work item approval reports

The LQE data source now supports creation of work item approval reports, similar to what you create using the data warehouse. Use the same Traceability links approach.

Using calculated columns in summary reports

The LQE data source now supports calculated columns, just like the Data Warehouse data source. Use the Format results tab when creating your report.

By using calculated columns, you can create more complex reports that aggregate values based on a count of all artifacts in a group, and you can create a sum or average of numeric values.
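As an illustration of what a calculated column computes, here is a small sketch that groups report rows and derives a count, sum, and average per group (the sample data and field names are hypothetical):

```python
from collections import defaultdict

rows = [
    {"project": "A", "severity": "High", "estimate": 3},
    {"project": "A", "severity": "Low",  "estimate": 5},
    {"project": "B", "severity": "High", "estimate": 2},
]

def summarize(rows, group_by, numeric):
    """Group rows by one field and aggregate a numeric field per group."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[group_by]].append(row[numeric])
    return {
        key: {"count": len(vals), "sum": sum(vals), "avg": sum(vals) / len(vals)}
        for key, vals in groups.items()
    }

print(summarize(rows, "project", "estimate"))
# Project A has two artifacts totaling 8 estimate points, averaging 4.0.
```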

Reporting on global configuration and components

You can now easily report on components and global configurations.

Performance improvements

Query caching has been improved. The set of projects a user has access to is now cached and updated when the data model is refreshed: every 12 hours, or whenever a report manager explicitly refreshes the data source. The query to determine which projects a user belongs to, which is fundamental to enforcing access control at run time, no longer needs to be re-run each time.

Also, some of the general query caching heuristics were improved to minimize having to re-run the query in multi-user scenarios. This should make common dashboards perform faster when different users with the same project access control are using the same dashboard.
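The project-access caching described above amounts to a time-based cache with an explicit refresh hook. A simplified sketch, assuming a fetch function that runs the expensive access-control query (names are illustrative, not the actual implementation):

```python
import time

REFRESH_INTERVAL = 12 * 60 * 60  # seconds; mirrors the 12-hour data model refresh

class ProjectAccessCache:
    """Caches the set of projects each user can access (illustrative only)."""

    def __init__(self, fetch, ttl=REFRESH_INTERVAL, clock=time.monotonic):
        self._fetch = fetch      # runs the (expensive) access-control query
        self._ttl = ttl
        self._clock = clock
        self._entries = {}       # user -> (expires_at, projects)

    def projects_for(self, user):
        now = self._clock()
        entry = self._entries.get(user)
        if entry is None or now >= entry[0]:
            # Cache miss or expired: run the query once and remember the result.
            self._entries[user] = (now + self._ttl, self._fetch(user))
        return self._entries[user][1]

    def refresh(self):
        """Explicit refresh, as when a report manager refreshes the data source."""
        self._entries.clear()

# Usage: repeated dashboard renders for the same user hit the cache.
queries_run = []
def fetch_projects(user):
    queries_run.append(user)
    return {"Project Alpha", "Project Beta"}

cache = ProjectAccessCache(fetch_projects)
cache.projects_for("alice")
cache.projects_for("alice")  # served from cache; no second query
```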

Lifecycle Query Engine

Lifecycle Query Engine (LQE) implements a linked lifecycle data index over data provided by one or more lifecycle tools. A lifecycle tool makes its data available for indexing by exposing its data by using a tracked resource set, whose members MUST be retrievable resources with RDF representations, called index resources. An LQE index built from one or more tracked resource sets allows SPARQL queries to be run against the RDF dataset that aggregates the RDF graphs of the index resources. Thus, data from multiple lifecycle tools can be queried together, including cross-tool links between resources.
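The aggregation model can be illustrated with a toy sketch: each tool contributes a graph of triples, LQE unions them into one dataset, and a query can then match links that cross tool boundaries (plain tuples stand in for RDF and SPARQL here; the identifiers are made up):

```python
# Each tool's tracked resource set contributes triples
# (subject, predicate, object) from its index resources.
rm_graph = [
    ("req/101", "rdf:type", "Requirement"),
    ("req/101", "validatedBy", "test/7"),   # a cross-tool link into QM
]
qm_graph = [
    ("test/7", "rdf:type", "TestCase"),
    ("test/7", "title", "Login test"),
]

# LQE aggregates the per-tool graphs into a single queryable dataset.
dataset = rm_graph + qm_graph

def query(dataset, predicate):
    """A toy stand-in for a SPARQL pattern like: SELECT ?s ?o WHERE { ?s <p> ?o }."""
    return [(s, o) for s, p, o in dataset if p == predicate]

# Because both graphs are in one dataset, the requirement-to-test-case
# link can be resolved even though it spans two tools.
print(query(dataset, "validatedBy"))
```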

This release contains only quality improvements.

Data Collection Component

The CLM Java ETLs have been replaced with the Data Collection Component. Install and configure the Data Collection Component to report on data warehouse data. Designed to address key performance and deployment concerns with the previous ETL solutions, the Data Collection Component uses parallel processing and available system resources to process data efficiently. Overall execution time of the ETL process for CLM data is reduced, whether there is a single instance or multiple instances to process.

  • The ETL for DCC can now be extended. New ETL files can be dropped into the DCC mapping folder and loaded by using the new Load Jobs button on the Data Collection Jobs page. You can also delete jobs.
  • The ETL files that are shipped with DCC are no longer located in the DCC mapping folder. Instead, they are bundled inside one of the server plugins. This has two advantages: the out-of-the-box ETL is better protected and the ETL files can be delivered as part of an iFix in a Patch Service compliant way.
  • The initial DCC setup now schedules the out-of-the-box jobs to run automatically without user intervention. If the CLM server is using the Rational Insight Data Manager ETL, you must manually disable DCC jobs or uninstall DCC.
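The parallel processing that DCC applies to independent data collection jobs can be sketched with Python's standard thread pool (the job names are hypothetical, and the sketch is not DCC's actual scheduler):

```python
from concurrent.futures import ThreadPoolExecutor
import time

def run_etl_job(name):
    """Stand-in for one extract-transform-load job."""
    time.sleep(0.01)  # simulate I/O-bound extract and load work
    return f"{name}: done"

jobs = ["requirements", "work_items", "test_results", "builds"]

# Independent jobs run concurrently instead of one after another,
# reducing overall execution time when jobs are I/O bound.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_etl_job, jobs))

print(results)
```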

Adding custom ETLs

DCC now supports custom ETL (Extract, Transform, and Load) files added to the mapping folder. Use them mainly to add custom metric tables and to load extra information into the ODS (operational data store) tables. New tables or fields are accessible only through Report Builder advanced queries or through a customized Cognos Framework Manager data model.

A new wiki page describes the process of adding custom ETLs to DCC.

Using fact details jobs instead of history jobs

The History out-of-the-box DCC jobs are now called Fact Details. The new name avoids any misunderstanding of what those jobs do. The system issues a warning when those jobs are enabled. Additionally, you no longer see the Select All button that enables all jobs, so you can't inadvertently enable all jobs.

ALM Cognos Connector

ALM Cognos Connector enables a Jazz server to connect to an existing Cognos BI server already deployed in a WebSphere Application Server environment. This component provides the configuration for connecting the Jazz and Cognos BI servers. When it is configured, users can create and manage Cognos-based reports based on the artifacts from the Jazz server, similar to the capability provided by Rational Reporting for Development Intelligence in previous versions.

For more information about how to use the ALM Cognos Connector with a Cognos BI server and move existing reports to the new Cognos BI server, see this topic: Migrating Rational Reporting for Development Intelligence reports to Cognos Business Intelligence using the ALM Cognos Connector.

This release contains only quality improvements.

New in previous versions of Jazz Reporting Service 6.0.2

For details about new features and functionality in previous releases, see these pages: