Reports for the Quality Management (QM) application

Engineering Test Management includes several predefined BIRT reports, such as defect, requirement, and test case reports, that you can use to analyze your test runs and identify trends in your project.

Each report has a Parameters section that lists the criteria that you can use to run that report. If you repeatedly use the same report and parameters, you can make a copy of the report and save the parameters for easy access. Many of the reports are interactive, which means that you can drill down into the information in the report: click links, bar graphs, and pie charts to see specific information.

If you run a report without configuring specific values for the parameters, the report runs against all values for each unspecified parameter. For example, if you run Execution Status by Owner using TER count and do not select values for the test case parameter, the report selects all test cases in the value range for that parameter. Running a report that includes many test cases can affect performance.

By default, reports are grouped in folders by report type, as shown in Table 1. After you run a report, you can print the results or export and save them in a specific format.
Table 1. Report types

Report type  | Description
Defects      | See Defect reports.
Execution    | See Execution reports.
Requirements | See Requirements reports.
Scorecard    | The Scorecard report provides an overview of test case status, test execution counts and result points, and defect counts.
Summary      | See Summary reports.
Test Case    | Test Case reports list a range of test case details, filtered by test case, team, configuration, creation state, and defect impact. Use the Test Case Review report to see side-by-side charts that show test case details, requirements work items, and other work items by test case.

You can also use the Report Builder component of Jazz™ Reporting Service to access a collection of ready-made reports and run them with your project data. You can build your own reports to view artifacts from across your projects, including projects that use configurations. For more information, see Authoring reports with Report Builder.

You can generate reports that display the different properties of test artifacts; each property represents a category type with values. These categories are displayed in the final report as separate columns. You can also use Report Builder to filter or sort by category. When you add category attributes to the report, each artifact is displayed in a single row; if a category has multiple values for an artifact, that artifact is displayed as a separate entry for each value.
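
For example, the following sketch is not part of Report Builder; it is only an illustration, under assumed data, of the row layout that is described above. The test case names and the Phase and Platform categories are hypothetical, with Platform acting as the multi-valued category.

  # Illustration only: how a multi-valued category expands into a separate
  # report entry per value. The test cases and category names below are
  # hypothetical; they are not taken from Report Builder.
  test_cases = [
      {"name": "Verify login", "Phase": "System Test", "Platform": ["Windows", "Linux"]},
      {"name": "Verify logout", "Phase": "System Test", "Platform": ["Windows"]},
  ]

  report_rows = []
  for tc in test_cases:
      # The single-valued category (Phase) is repeated in each entry, and each
      # value of the multi-valued category (Platform) produces its own entry.
      for platform in tc["Platform"]:
          report_rows.append(
              {"Test case": tc["name"], "Phase": tc["Phase"], "Platform": platform}
          )

  # Prints three entries: two for "Verify login" (Windows, Linux) and one for
  # "Verify logout" (Windows).
  for row in report_rows:
      print(row)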

Important: After an administrator enables configuration management, project data is sent only to Lifecycle Query Engine (LQE) data sources. Business Intelligence and Reporting Tools (BIRT) reports and historical data reports that use the Engineering Lifecycle Management data warehouse or an external data warehouse as a data source do not work and are not shown on the Reports menu.
You can report on configuration-enabled projects in these ways:
  • Report Builder: Choose an LQE-based data source, create reports, and export them to spreadsheets, IBM® Engineering Lifecycle Optimization - Publishing, Microsoft Word, PDF documents, and HTML files. You can export graphs to image files or documents. For details, see Authoring reports with Report Builder. A minimal sketch of processing an exported spreadsheet follows this list.
  • Template-based reporting in ELM applications: Generate simple, document-style reports.
  • Engineering Publishing: Create document-style reports that include a cover page, table of contents, and copyright statement. For details, see Authoring document-style reports.
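
As a minimal sketch of what you can do with a report after you export it to a spreadsheet, the following example summarizes an export by status. It assumes the exported data was saved as (or converted to) a CSV file; the file name my_report_export.csv and the Status column heading are assumptions for illustration, not names defined by Report Builder, so substitute the names from your own export.

  # Minimal sketch: summarize a report that was exported to a spreadsheet and
  # saved in CSV format. The file name "my_report_export.csv" and the "Status"
  # column are hypothetical; use the names from your own export.
  import csv
  from collections import Counter

  status_counts = Counter()
  with open("my_report_export.csv", newline="", encoding="utf-8") as export_file:
      for row in csv.DictReader(export_file):
          status_counts[row.get("Status", "Unknown")] += 1

  for status, count in sorted(status_counts.items()):
      print(f"{status}: {count}")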
