ArgonDigital Requirements Management Tool Evaluation – Part 2

9/22/11 IMPORTANT NOTE AND UPDATE: We have updated the criteria and tool scores — please go to this post for the most current material.

Phase 2 of our requirements management tool research is complete! We have posted the criteria and tool scores in a downloadable Excel document, and a whitepaper is coming out shortly that will share some of the more interesting observations about the tools, as well as some of the challenges we faced in this phase of the research.

For a reminder about our research process, see our post on phase 1. We did a similar (though less rigorous) study back in 2007, and one thing I will share is that I’m much happier with the requirements management tools on the market now than I was four years ago. At ArgonDigital, we are particularly interested in advanced modeling functionality within the tools to support RML®, and many more tools support this now than in our original study. There is also better support in general for working offline. You can see the results for yourself.

This results list is meant for you to use, so please download it and do what you want with it! The key thing to remember is that there are both raw scores and weighted scores, and the weighted scores are based on ArgonDigital’s prioritization of the criteria – which may not be right for your organization, so feel free to change the weights. It’s also worth skimming the comments for the criteria you care most about. Please download and use the results however you want; we’d just like credit where appropriate.
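If it helps to see the arithmetic, here is a minimal sketch in Python of how a weighted total falls out of the raw scores and your weights; the criteria names, scores, and weights below are invented for illustration and are not the actual spreadsheet values.

# A minimal sketch of the weighted-score arithmetic; the criteria,
# raw scores, and weights here are invented for illustration.
raw_scores = {"modeling": 5, "offline_support": 2, "traceability": 4}
weights    = {"modeling": 2, "offline_support": 1, "traceability": 3}  # your org's priorities

# Each criterion's raw score is multiplied by its weight, then summed.
weighted_total = sum(raw_scores[c] * weights[c] for c in raw_scores)
print(weighted_total)  # 5*2 + 2*1 + 4*3 = 24

Changing the weights dictionary to reflect your own priorities is all it takes to re-rank the tools for your organization.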

Within the spreadsheet, you will find a few tabs that may be useful to you. The Phase 2 evaluation is on the first tab, and the resulting scores are summarized on the second. We kept the Phase 1 evaluation results in as well, along with a summary of those scores. And while we didn’t write full Use Cases, we did think about them by title/concept; you can find those on the fifth tab, and the last tab shows a pivot table of features by use case – this is part of what we did to ensure we weren’t missing major features that we cared about.

Now, a few disclaimers seem appropriate as I’m publishing this for the first time. First of all, the results are probably not perfect, but we think they will be useful. By that I mean there are going to be some criteria we unknowingly mis-scored. Some vendors were more helpful than others in providing evaluation copies or demos of their tools, and with those that were not, we had to be creative about our evaluation. In the cases where we only got to see demos, the results are at risk of being less reliable simply because we didn’t get our hands on the features. There will be more on this in the full paper.

We are now beginning phase 3 of our research, in which we will use a select few of the tools on our actual projects. We are hoping to select the tool that works best for us, but it is also exciting because it allows us to really see how the features hold up beyond just demoing them. More will come out on that later in the year!
