Does anybody talk to the test team?
Which of the following two scenarios is closest to your truth?
We sort out the requirements and get everything approved and agreed, involving 'all' stakeholders, including design. Changes are discussed with the design and development teams. When we are ready we hand over to test, who seem to complain a lot and invariably make the program late by finding too many issues.
OR
We discuss the requirements with all the stakeholders, including design, development and test, at the start of the project. Test planning is fully integrated into the development lifecycle, and verification activities start as soon as feasible. The test department finds issues in the high-risk areas of the project early on, and we normally deliver on time.
The first scenario is just throwing the product over the wall to test, and it is where many organizations have their roots, even if they have since evolved into something more mature. The second scenario is 'requirements driven quality' or 'requirements driven development'. Pure software organizations are often good at involving the test team early, but are often less good at the requirements activities, leading to 'development driven quality' or 'development driven development' - the problem with that is obvious from the name.
Too often the first is what happens, and the bearer of bad news (the test department) is blamed for the poor quality. Fully integrating test, and indeed all of the validation and verification activities, into the development lifecycle means changing the organizational culture. That cultural change benefits from tool support, but no tool can substitute for it.
So... where does your organization fit? Throw over the wall, or fully integrated? Somewhere in between is quite likely, but would you benefit from tighter integration? What is utopia?
3 answers
I can't say this about my current organization, but a former employer was great at developing its product software in a test-free environment. At the last possible moment (usually a few days before the delivery date), the test team was given the code and asked to shake it out as thoroughly as possible, but without impacting delivery.
Needless to say, we missed quite a few delivery deadlines and suffered more than a little embarrassment over defective products in the field. The shame of it was that the company had some very talented developers, but limited money for test staff. Had they the benefit of some in-process testing, I have no doubt that this company would still be going strong (as opposed to being a "former employer").
Some projects that are subject to certification by regulatory bodies have no choice in the matter but to test thoroughly and rigorously; for avionics software projects that follow the DO-178B safety-critical standard, test effort is typically three to four times the development effort.
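That level of rigour rests on requirements-based testing with traceability: every requirement must be linked to the tests that verify it. As a rough illustration only, here is a minimal Python sketch of that kind of traceability check, assuming requirement IDs are tagged in test names; the IDs and test names shown are hypothetical, not from any real project.

    # Minimal sketch of a requirements-to-test traceability check.
    # Assumes requirement IDs (e.g. "REQ-101") appear in test names;
    # all IDs and names here are hypothetical.

    requirements = {"REQ-101", "REQ-102", "REQ-103"}

    tests = [
        "test_REQ-101_login_rejects_bad_password",
        "test_REQ-102_session_times_out",
    ]

    # A requirement counts as covered if at least one test references it.
    covered = {req for req in requirements
               if any(req in name for name in tests)}

    # Report requirements with no associated test.
    for req in sorted(requirements - covered):
        print(f"UNCOVERED: {req} has no associated test")

Real traceability tools work against a requirements database rather than naming conventions, but the principle is the same: uncovered requirements surface early in the project, not a few days before delivery.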