This information is for planning purposes only. The information herein is subject to change or removal without notice before the products described may become available.
The CE/CLM products were renamed in version 7.0. As the help is updated to reflect the changes, the topics might contain inconsistencies. For details on the name change, see Renaming the IBM Continuous Engineering Portfolio.


Example: Using quality objectives and entry and exit criteria

Your team can set overall quality objectives and manage both entry and exit criteria. These objectives are defined at the project level and implemented in individual test plans, where you can track whether each objective has been met.

About this task

Quality objectives typically track metrics such as the number of open defects and the number of blocked or failed execution records. Quality objectives that involve defects also take into account defects that are managed in OSLC-linked Change and Configuration Management (CCM) providers, such as Engineering Workflow Management and Rational® ClearQuest®.

Note: To prevent inaccuracies when quality objectives are evaluated with Rational ClearQuest as your defect provider:
  • Verify that the Rational ClearQuest server is correctly configured for Engineering Lifecycle Management integrations.
  • Verify that the State Predicates correctly define the "Closed" status.

Here is an example of how a test team can use quality objectives:

Procedure

  1. During the planning process, the test team defines the quality objectives.
    1. A test manager examines the predefined quality objectives in the project properties and evaluates whether they are suitable for the test team.
      Figure: Manage Quality Objectives

      Each quality objective includes a Name and Description, as well as a Condition and a Target.

    2. The test manager decides to modify the settings of some of the predefined quality objectives and also create some new, user-defined ones.

      For example, one of the predefined quality objectives states that the Percentage of Failed Execution Records must be less than 10%. The test manager replaces less than 10% with less than 5% by double-clicking and replacing 10 with 5.

      Both predefined and user-defined quality objectives can be used in any test plan within that project area. However, only predefined quality objectives, which are known to the Quality Management application, can be used in computations. User-defined quality objectives are informational only and do not contain computed values.
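The distinction between computed, predefined objectives and informational, user-defined ones can be sketched in generic terms. The following is a minimal, hypothetical Python model (not the product's API; all names here are assumptions) showing how an objective pairs a Condition with a Target, and how an objective without a computable metric simply carries no evaluated value:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class QualityObjective:
    """Hypothetical model of a quality objective; not the product's API."""
    name: str
    condition: str                               # e.g. "less than"
    target: float                                # e.g. 5 (percent)
    # Predefined objectives have a metric the tool can compute;
    # user-defined objectives are informational only (metric is None).
    metric: Optional[Callable[[], float]] = None

    def evaluate(self) -> Optional[bool]:
        """True/False for computed objectives; None for informational ones."""
        if self.metric is None:
            return None
        actual = self.metric()
        if self.condition == "less than":
            return actual < self.target
        raise ValueError(f"unsupported condition: {self.condition}")

# A predefined objective: Percentage of Failed Execution Records less than 5%
failed_pct = QualityObjective(
    name="Percentage of Failed Execution Records",
    condition="less than",
    target=5.0,
    metric=lambda: 100.0 * 3 / 80,   # 3 failed of 80 records = 3.75%
)

# A user-defined, informational objective: no computed value
concurrent_users = QualityObjective(
    name="Concurrent users supported",
    condition="at least",
    target=500,
)

print(failed_pct.evaluate())         # True: 3.75% is less than 5%
print(concurrent_users.evaluate())   # None: informational only
```

In this sketch, lowering the target from 10 to 5 (as the test manager does above) is just a change to the objective's `target` field; the condition and metric are unchanged.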

    3. A test lead, who is responsible for a particular test plan, defines the overall quality objectives for that test plan by using the quality objectives that the test manager defined and, if necessary, creates additional quality objectives in the test plan.
      Note: Any new quality objectives that are added in the test plan are available to all test plans in the project area.

      For example, the test lead might want to add some overall quality objectives that have not been defined in the project properties, such as the number of concurrent users that the application under test must support or the maximum time allowed for the application under test to open.

      When a quality objective is added to a test plan, the Condition and Target are merged in the Expected column, as shown in the following figure:

      Figure: Quality Objectives

      The test lead sets the status to Not Started.

    4. The test lead opens the Entry Criteria section of the test plan to define the prerequisites that must be met before testing can begin.

      For example, a System Verification Test team might want to require that all functional verification tests have been attempted and that 95% have been completed; a Functional Verification Test team might want to require that the user interface is frozen.

    5. The test lead opens the Exit Criteria section of the test plan to define the conditions that must be met before testing can be concluded.

      For example, a System Verification Test team might want to require that all System Verification Tests have been attempted and that 95% have been completed.

  2. As the development effort progresses, the test lead determines if the test entry criteria are being met.
    1. The test lead refreshes the view with the most recent values by clicking the Evaluate Quality Objectives icon. This causes the Quality Management application to gather the actual data associated with each objective and measure it against the expected value.
    2. The test lead compares these values with the expected values, sets the Status, and comments on each quality objective.
      Figure: Entry Criteria
    3. The test lead meets with other team members to determine whether the entry criteria have been met.

      The test lead might decide to keep the original entry criteria or to adjust them.

  3. As the testing effort moves forward, the test lead determines whether the exit criteria are being met, following a similar process to that of the entry criteria.
  4. At the end of the testing effort, the team evaluates whether the overall quality objectives have been met.
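The evaluate-and-review loop in steps 2 through 4 — gather actual values, compare them to the expected values, and set a status — can be sketched as follows. This is an illustrative Python sketch only, assuming "at least"-style criteria; the data shapes and names are assumptions, not the product's API:

```python
def evaluate_criteria(criteria):
    """criteria: list of (name, expected, actual) tuples, where each
    criterion is met when actual >= expected (an assumed convention).
    Returns a per-criterion status dict and an overall verdict."""
    statuses = {}
    for name, expected, actual in criteria:
        statuses[name] = "Met" if actual >= expected else "Not Met"
    overall = all(status == "Met" for status in statuses.values())
    return statuses, overall

# Example entry criteria, expressed as percentages (hypothetical values)
entry_criteria = [
    ("Functional tests attempted", 100.0, 100.0),
    ("Functional tests completed", 95.0, 92.5),
]

statuses, all_met = evaluate_criteria(entry_criteria)
print(statuses)  # second criterion falls short of its 95% target
print(all_met)   # False: testing should not begin yet
```

In the product, this comparison is what the Evaluate Quality Objectives action performs for computed objectives; the team review in step 2.3 then decides whether to accept the results or adjust the criteria.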
