Should “testers” be involved in requirements development?



On more than one occasion, I have been asked about early involvement of the test team.  To paraphrase the most recent question: “I have studied several articles about the trend of involving testers earlier in the project, namely during requirements analysis.  This early involvement means that the tester gives input as soon as a requirement is defined.  In this way, the tester develops a stronger relationship with the business scope.  What’s your opinion about that?”

Note: I don’t really like referring to the people doing system verification as “testers” or the “test team”; I prefer “verification team”.  There are four major methods of accomplishing system verification: test, demonstration, inspection, and analysis.  In addition, tests are conducted for reasons other than system verification or system validation.  Developers run tests to validate their design and code.  Tests are also needed for complex systems to understand emergent properties and to characterize what the system can actually do versus what its requirements said it must do.

A mandatory characteristic of every requirement is that it is verifiable.  This means the requirement is written such that it is possible to obtain objective evidence that the designed and built system meets it.  Anyone writing a requirement should therefore be asking whether that requirement is verifiable.  For more details on determining whether your requirement is verifiable, see my blog “Is the Requirement Verifiable?“.

At a minimum, I recommend that when writing a requirement, you include, as a requirement attribute, the recommended verification method (VM) (test, demonstration, inspection, or analysis) that will be used to demonstrate the system meets its requirements.  You can go into more detail and provide other attributes, such as a high-level description of what the VM needs to consist of, the activities you expect to be accomplished as part of system verification, and the success criteria.  For example: the recommended VM is Test; the Test will consist of …; the requirement is successfully verified when the VM shows ….
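As a rough illustration only, the sketch below shows what a requirement record carrying these verification attributes might look like.  The field names, identifier, and values are hypothetical assumptions for the example, not a prescribed schema or tool format.

```python
from dataclasses import dataclass

# Hypothetical sketch of a requirement record with verification attributes.
# Field names and values are illustrative, not a prescribed schema.
@dataclass
class Requirement:
    req_id: str
    text: str
    verification_method: str    # "Test", "Demonstration", "Inspection", or "Analysis"
    verification_approach: str  # high-level description of what the VM will consist of
    success_criteria: str       # what the VM must show for the requirement to be verified

example = Requirement(
    req_id="SYS-042",
    text="The pump shall deliver at least 5 L/min at 20 degrees C.",
    verification_method="Test",
    verification_approach="Flow-bench test at nominal operating conditions.",
    success_criteria="Measured flow rate is greater than or equal to 5 L/min.",
)
```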

Thinking ahead to verification and including this information as you write your requirements is key to ensuring your requirements are verifiable.

Those involved in planning for and conducting system verification are key stakeholders in your project.  Involving the verification team early is a best practice because these are the people most qualified to determine whether a requirement is written so it can be verified, and whether the recommended method, approach, and success criteria make sense.

The verification team can be a source of requirements.  This is another important reason the verification team needs to be involved early.  There may be a need for requirements that enable the stated verification activities to be performed.  These could be requirements on the system being developed, as well as on other items (test equipment, test stands, emulators, simulators, models, facilities, etc.) that may need to be modified or created to enable the system verification activities to take place.

System verification has a direct impact on project cost and schedule.  Another reason for thinking ahead to system verification is project management.  Agreeing on the VM is really a decision about cost, schedule, and risk.  We would like to test everything, but the costs involved may be prohibitive.  To reduce cost, you may settle for a less expensive VM; note that when you do this you are adding risk.  I would combine risk and priority: I would want to test (if possible) all the highest-priority requirements, while lower-priority requirements can be verified using less expensive methods.  From a cost and schedule standpoint, integration, verification, and validation activities can be a significant portion of a project’s budget and schedule: roughly 50% if you have well-defined requirements, and 70%-80% if you don’t!  So get the system verification team involved at the early stages of the project, or you will be setting yourself up for failure with massive cost overruns and schedule slips.  To understand what resources (cost, schedule, facilities, and equipment) will be needed for integration, system verification, and system validation, those responsible for these activities must be involved from the beginning of the project.
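A minimal sketch of that priority-driven tradeoff is shown below.  The priority levels and the method assigned to each are illustrative assumptions, not a fixed rule: higher-priority (higher-risk) requirements get the more expensive method (Test), while lower-priority requirements get less expensive methods.

```python
# Sketch of mapping requirement priority to a recommended verification method.
# Priority names and method choices are assumptions for illustration only.
def recommended_method(priority: str) -> str:
    mapping = {
        "critical": "Test",
        "high": "Test",
        "medium": "Demonstration",
        "low": "Analysis or Inspection",
    }
    # Default to the most rigorous method when the priority is unknown.
    return mapping.get(priority.lower(), "Test")

for p in ["critical", "high", "medium", "low"]:
    print(p, "->", recommended_method(p))
```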

Managing risk.  Managing risk during your upfront planning and providing realistic budgets is key to delivering a winning product.  Note: ArgonDigital defines a winning product as one that delivers what is needed, within cost and schedule, with the desired quality.  If the customer doesn’t budget for the real cost of a project, which includes requirements development, integration, system verification, and system validation, they are setting the project up for failure, with near-certain cost overruns and schedule slips.  Another mistake is assuming everything will work and no problems will be exposed during system verification.  This “green light” approach is dangerous: problems happen!  The cost and time to resolve these problems, and the rework involved, can have a significant impact on your budget and schedule.  To mitigate these risks, include money and time in your budget and schedule to resolve problems that may occur during system verification.

There are plenty of case studies that show the risks of not adequately planning ahead for integration, system verification, and system validation.  I presented a paper to INCOSE several years ago on this topic, “Thinking Ahead to Verification and Validation”.  You can send me an email at info@argondigital.com and I will send you a copy.

Another great source on the importance of planning ahead for testing is Donald Firesmith of the Software Engineering Institute (SEI) at Carnegie Mellon.  His book “Common System and Software Testing Pitfalls” has just been published.  See my blog “Common Testing Problems” for more details about the book and Donald’s website.

Comments to this blog are welcome.

If you have any other topics you would like addressed in our blog, feel free to let us know via our “Ask the Experts” page and we will do our best to provide a timely response.
