Test Management

The Test Management practice provides a good starting point for practitioners relatively new to test management, but it also offers a structured reference model for more seasoned professionals.

For any sizable software engineering effort, test management is an important concern. Like any management process, test management covers the organization, planning, team preparation, monitoring, and reporting of the test process to be followed. The target audience for this practice is the more mature test organization that needs to plan and track the test effort independently while still interfacing and collaborating with the rest of the development team.

Very few people will argue against test management, because the need to organize and control the testing effort is evident. The more important questions are how to do it and how much extra effort it will require.

It is important to consider the tools that assist in planning, designing, implementing, executing, evaluating, and managing test tasks or work products, and how well they align with your current practices. The Test Management practice is tool-enabled: its tasks and artifacts can readily be realized by specific tools in this domain.

Identifying test motivators

You can manage the information related to a test effort by using a test-plan-centric approach: the motivators for a specific test effort are captured and maintained within the associated test plan.

This guidance is similar to information that was previously found in Tool Mentors in the IBM® Practice Library.

Before you begin

Open an existing test plan or create a new one by clicking Planning in the main menu and then clicking either Browse Test Plans or Create Test Plan. For details about locating existing test plans, see Searching, filtering, and querying test artifacts. For details about creating test plans, see Creating test plans.
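
This task describes the web client. If you need to locate test plans from a script instead, the product also exposes a Reportable REST API. The following is a minimal sketch in Python with the requests library, not a definitive recipe: the host name, project alias, credentials, and certificate handling are assumptions about your deployment.

  import requests

  BASE = "https://jazz.example.com:9443/qm"   # assumption: your server URL
  PROJECT = "Sample+Project"                  # assumption: URL-encoded project alias

  session = requests.Session()

  # Jazz Team Server applications typically use form-based authentication:
  # request a protected page first, then post the credentials.
  session.get(BASE + "/authenticated/identity", verify=False)
  session.post(BASE + "/j_security_check",
               data={"j_username": "tester", "j_password": "secret"},
               verify=False)   # verify=False only for self-signed certificates

  # The Reportable REST API returns an Atom feed of test plan entries.
  feed = session.get(
      BASE + "/service/com.ibm.rqm.integration.service.IIntegrationService"
      "/resources/" + PROJECT + "/testplan",
      verify=False)
  print(feed.status_code)
  print(feed.text[:500])   # inspect the beginning of the feed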

Step 1: Identify candidate motivators

Document the factors that will motivate the test effort in the Motivators section of the test plan. Because the Motivators section is not included in any of the supplied test plan templates, you will need to create it.

To create the Motivators section:

  1. Click the Manage Sections button to open the Manage Sections window.
  2. Click the Create Section icon and type a Section Name, such as Motivators, and an optional description.
  3. In the Section Type field, select Grid.
  4. Type Motivators as the name for the first column, and then click the Add column icon to add two more columns.
  5. Type Priority as the name for the second column and Description as the name for the third column, and then click OK.
  6. In the test plan table of contents, select the Motivators section.
  7. Click the Add Row icon and type the name of the motivator and a description, as in the sample grid after these steps. Repeat for each motivator that you identify.
  8. Click Save to save your test plan.
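
For illustration only, a populated Motivators grid might look like the following; the motivator names, priorities, and descriptions are sample values, not supplied content.

  Motivators                  Priority  Description
  New payment workflow        High      Revenue-critical function added in this release
  Regulatory compliance       High      Audit findings must be retested before release
  Legacy reporting module     Medium    High defect density in the previous two releases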

Step 2: Identify and document risks

List the risks that drive the current test effort in the Risk Assessment section of the test plan. Maintain the list of risks and associated information (impact, relative importance, mitigation plan, and so on) within this section, or, if your risk planning is part of a larger risk management effort, link to the external sources.
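
For example (an illustrative entry, not supplied content), a risk might read "Third-party payment service contract is not final," with a High impact rating and a mitigation plan to test against a stubbed service in early cycles.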

Each test plan comes with a default risk profile that includes a predefined set of risks. If the existing risk profile does not meet your needs, you can switch to a different risk profile, add new risks, or remove risks.

After you create the initial risk assessment, other team members can assign a risk ranking of their own in the My Risk section of the Risk Assessment. The software then calculates a Community Risk assessment by averaging the My Risk selections of each user.
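
The calculation itself is a simple arithmetic mean. The following minimal sketch illustrates it; the 1-to-5 rank scale and the function name are assumptions for illustration, since the product performs this calculation internally.

  # Sketch of the Community Risk calculation described above.
  # Assumption: each reviewer ranks a risk on a numeric scale (for example, 1-5).

  def community_risk(my_risk_rankings):
      """Average the per-user My Risk rankings for one risk."""
      if not my_risk_rankings:
          return None  # no one has ranked this risk yet
      return sum(my_risk_rankings) / len(my_risk_rankings)

  # Three team members ranked the same risk 4, 5, and 3:
  print(community_risk([4, 5, 3]))  # prints 4.0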

For details about listing risks in a test plan and working with risk profiles, see Managing risk.

Step 3: Prioritize the list of test motivators

Return to the Motivators section of the test plan that you created in Step 1. Prioritize the list by adding a priority value to each row.

Create specialized test plans driven by test level and type, and create master-child relationships between the plans. Use a naming convention to maintain the relationships, as in the example that follows. For details about creating master and child test plans, see Creating master and child test plans.
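
For example (an illustrative convention, not a product requirement), a master plan named Release 3.0 Master Test Plan might have child plans named Release 3.0 System Test Plan and Release 3.0 Performance Test Plan, so that the shared prefix keeps the relationship visible at a glance.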

Step 4: Maintain traceability relationships

Based on the motivators list, create traceability relationships between the requirements and test cases in your test plans. You can list requirements in the Requirements section of the test plan. You can list test cases in the Test Cases section of the test plan. For details about working with requirements in the test plan, see Requirements management.
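
For example (illustrative names only), a Regulatory compliance motivator might trace to a requirement such as "Transactions must be logged for audit" and onward to the test cases that verify the logging, so that a change to the requirement flags the affected test cases for review.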

Step 5: Evaluate and verify motivators

Verify your list of motivators with the team by distributing your test plans for review.

  1. Create a list of reviewers in the Formal Review section of the test plan. Each reviewer will receive a notification to complete the review. The status of each person's review will be listed as Pending until that person completes the review.
  2. Make sure that each reviewer has access to the test plan. This requires that each reviewer be a member of the project area that contains the test plan and have the appropriate license.
  3. Optional: Set up authentication for the review process so that reviewers are required to authenticate with the Jazz™ Team Server each time they complete a review. Reviewers can authenticate by providing their electronic signature. By default, this option is disabled.
  4. Track the status of the review in the My Tasks widget of your Home page on the dashboard.
  5. Optional: To copy the link for the current editor page, including the selected section, click Copy link for this page. This option also copies the title of the page, which consists of the ID and title fields.
    Note: For test artifacts that support OSLC, there is an option to copy the editor page link in the Concept and Versioned OSLC formats. The OSLC-formatted URL is used for debugging, but it navigates to the editor page if it is pasted into a browser window. For a configuration-management-enabled project area, the configuration of the copied link is preserved for the Concept OSLC URL but not for the Versioned OSLC URL. Navigating with the Versioned OSLC URL from a browser can display the artifact in the local configuration, or in a global configuration that includes that local configuration as a contribution. To work with artifacts in the web client, use the Copy Link URL, not the Copy OSLC URL.

For more information about test plan reviews, see Setting up a review process.

Results

By completing this task, you have identified and prioritized a list of test motivators and have added them to a Motivators section of the test plan. You have identified a set of risks in the Risk Assessment section of the test plan. You have established traceability between the requirements and test cases in the test plan, and you have set up and completed a test plan review process with the appropriate reviewers.
