Requirement Tracing to Test Script


Bob Zurn (61162) | asked Apr 28 '09, 12:48 p.m.
In RQM, I can trace a requirement to a Test Plan and to a Test Case, but not to the actual Test Script. I would like to be able to trace to a step in a Test Script, i.e., insert the Test Case's requirement into the step. We may have 20 requirements in a test case, and we need to verify that all 20 are incorporated, and to automatically update (or flag) the requirement text when it changes. I would also need metrics on the number of requirements actually incorporated in a Test Script. This would provide full tracking of the requirement. Are there any plans to provide this capability? Thanks.

13 answers



Ashish Mathur (12176) | answered Apr 30 '09, 11:37 a.m.
JAZZ DEVELOPER
A Test Case is viewed as the artifact where the requirements are manifested. Test Scripts are in a way transient, since you might start out by implementing the script as a manual script and at some point turn to automation; the requirements coverage would still be valid then. The Test Scripts are various implementations of the Test Case, which contains the data about what is being tested.

Is there a specific use case, situation, or domain where this is not sufficient?

Bob Zurn (61162) | answered May 06 '09, 8:01 a.m.
We need the traceability to the Test Script. We need to verify that a requirement, with its text, is incorporated, tested, and verified. We've had cases where a requirement was assigned to a Test Case but was not tested in the script. This would also provide metrics on how many requirements are actually incorporated in a script as Verification Points.

Daniel Chirillo (1801823) | answered May 06 '09, 4:47 p.m.
You don't need to trace requirements to scripts to get what you need. Execution is tracked not through the script, but through the Test Execution Record (TER). When you execute a Test Case, you select a TER, which is associated with a script.

Bob Zurn (61162) | answered May 07 '09, 7:40 a.m.
I should have said requirement(s) that were not incorporated into the Script. Executing the script would therefore have left those requirement(s) out. If 10 requirements are assigned to the Test Case but only 5 are incorporated in the Script, execution would report 5 of 5. I have to manually compare the requirements in the Test Case to those actually in the script and tested. If there were a link from the Test Case to the Test Script, I'd know whether all ten were actually incorporated or only five.

Daniel Chirillo (1801823) | answered May 07 '09, 6:04 p.m.
There is a link from the Test Case to the script. Maybe I'm missing something?

Bob Zurn (61162) | answered May 08 '09, 7:58 a.m.
Could you supply the steps, or point me to where to look? I've not found how to do this. I'll check Help again. Thanks.

Daniel Chirillo (1801823) | answered May 08 '09, 3:07 p.m.
1. Create a Test Plan.
2. Import requirements into your Test Plan.
3. In the Test Plan, click on the Test Cases link and add a Test Case to the plan (you can add a Test Case you've already created or create a brand new one).
4. In the Test Case, click on the Requirements link and add requirements. Here's your traceability to the Test Case.
5. In the Test Case, click on the Test Scripts link and add a Test Script (you can add a script you've already created or create a brand new one).

Bob Zurn (61162) | answered May 08 '09, 3:37 p.m.
I've done that. It works fine if there is only one requirement per test script. My problem is with 10 requirements and only 5 actually incorporated in the script.

Think of it this way: I assign 100 requirements to a Test Plan, and these are assigned across multiple Test Cases. Reports will show how many are actually incorporated in Test Cases, which is good. But I need to go to the next level: incorporation into the Test Script. Joe Tester may have incorporated only 5 of the 10 in the Script (50%), due to incomplete software and script steps, but the Test Case shows 100% because the requirements are assigned to it. This would provide better metrics on Test Script progress.
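The coverage gap described here is essentially a set comparison. A minimal sketch, assuming the requirement IDs assigned to a Test Case and the IDs actually referenced in the script can both be exported somehow (neither export, nor the `REQ-n` naming, is an RQM feature in this thread; they are assumptions for illustration):

```python
def incorporation_coverage(assigned_ids, script_ids):
    """Return (missing requirement IDs, percent incorporated) for one script."""
    assigned = set(assigned_ids)
    incorporated = assigned & set(script_ids)
    missing = sorted(assigned - incorporated)
    # An empty assignment counts as fully covered to avoid division by zero.
    percent = 100.0 * len(incorporated) / len(assigned) if assigned else 100.0
    return missing, percent

# 10 requirements assigned to the Test Case, only 5 present in the Script.
assigned = ["REQ-%d" % n for n in range(1, 11)]
in_script = ["REQ-1", "REQ-2", "REQ-3", "REQ-4", "REQ-5"]
missing, pct = incorporation_coverage(assigned, in_script)
print(len(missing), pct)  # 5 50.0
```

This is the "5 of 10 = 50%" metric above, as opposed to the 100% the Test Case itself would report.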

Hope this clarifies my issue.

Daniel Chirillo (1801823) | answered May 08 '09, 3:45 p.m.
As I understand your use case, you want a human process that verifies that the requirements that have been associated with the test case have actually been tested by the script. If that's the case, it doesn't matter whether 1 or 5 requirements have been associated with the test case. RQM will tell you what requirements have been associated with what test cases and a reviewer will determine if the script associated with those test cases tests all the requirements associated with the test case.

Even if you could associate a requirement with a script, you'd have the exact same problem: you need a reviewer to verify that the implementation is correct (that the script actually tests the requirements that have been associated with it).

Bob Zurn (61162) | answered May 08 '09, 4:15 p.m.
Currently the process is a human checking that 1) the requirement is in the script and 2) the script tests the requirement. #2 can't be automated. #1 can, at least to the extent of showing that the requirement appears in the script (and I can tell which Test Cases are Draft/Reviewed/Approved...). If #1 were automated to show which requirements are incorporated in the Script, life would be much better. We used to run VBA scripts against MS Word documents to check that the requirement was actually in the Word document, and not missing or in another document.
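The VBA-over-Word check amounts to scanning exported script text for requirement IDs. A rough equivalent sketch, assuming a `REQ-<number>` ID convention (the pattern and sample text are illustrative, not from any RQM export format):

```python
import re

def find_missing_requirements(script_text, assigned_ids):
    """Report assigned requirement IDs that never appear in the script text."""
    present = set(re.findall(r"REQ-\d+", script_text))
    return sorted(set(assigned_ids) - present)

script_text = """
Step 1: Verify the login screen (REQ-101).
Step 2: Verify password reset (REQ-103).
"""
print(find_missing_requirements(script_text, ["REQ-101", "REQ-102", "REQ-103"]))
# ['REQ-102']
```

As noted above, this only automates check #1 (the requirement text is present); whether the step actually tests the requirement still needs a human reviewer.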

Hope this helps- thanks,
