I love numbers, and so I’m always interested in what metrics we can pull from projects. Sometimes I pull them just because I’m curious, sometimes they help support opinions and gut feelings, and most often they are useful for making decisions on a project.
On a recent project, I pulled metrics from our issue tracking system to look at the types of issues that were logged. I will first mention that we actually start tracking issues much earlier than most projects do – in the requirements phase.
For context, this snapshot was taken about 1 week before the initial deployment of the project.
The issue categories included:
- Requirements questions – for any open questions about the requirements
- Configuration – these are most relevant for out-of-the-box solutions that are heavily configured instead of developed
- Test case – when QA test cases need to be updated based on a misunderstanding of the requirements or design, or on changes to either
- Functional – the functionality developed doesn’t work as expected
- Data – migrated or created data has errors, is missing fields, or the wrong set was extracted from an existing system
- Training – when defects are logged because a user or tester did not understand how the system should work
- Change requests – every time a new enhancement is requested, it is logged
- Development ignored requirements – an interesting category that came out of the fact that some developers did not read the requirements before developing
- Test data – we had a lot of errors in the test data that was created, which made functionality look like it didn’t behave as expected
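If you want to produce the same kind of breakdown on your own project, the tallying itself is simple. Here is a minimal sketch, assuming your tracker can export issues to a CSV file with `category` and `status` columns – those column names and the file name are my assumptions for illustration, not any particular tool’s schema:

```python
import csv
from collections import Counter

def tally_issues(path):
    """Count issues per category, split by status (hypothetical CSV schema)."""
    overall = Counter()   # every issue ever logged, per category
    open_now = Counter()  # still open at the snapshot
    closed = Counter()    # fixed and closed
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            category = row["category"]
            overall[category] += 1
            if row["status"].lower() == "closed":
                closed[category] += 1
            else:
                open_now[category] += 1
    return overall, open_now, closed

if __name__ == "__main__":
    overall, open_now, closed = tally_issues("issues.csv")
    for category, total in overall.most_common():
        print(f"{category}: {total} total, "
              f"{open_now[category]} open, {closed[category]} closed")
```

Counting overall, open, and closed separately is what gives the three series (blue, yellow, and green) discussed below.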
A few interesting observations about the chart:
- You will never close all your issues by deployment time, so it makes sense that there are still open issues (yellow).
- The good news is that relatively few requirements questions, test case issues, and test data issues remain open at deployment (yellow).
- The closed issues (green) are also good news: very few change requests were fixed and closed by deployment, relative to functional, data, and configuration issues.
- While a number of user experience issues are open at deployment (yellow), a large number were fixed as well (green). It is good to see this – it is (hopefully) an indication that the most critical ones were closed and only less critical ones remain open.
- Of the overall issues (blue), the high level of requirements questions is because we started tracking them early in the project.
- The overall number of test case issues is actually concerning (blue). That’s a sign either that they were poorly written from the beginning or that the system changed significantly after they were developed – and since closed change requests were low here (green), the former seems more likely.
- Also among the overall issues (blue), the number of change requests is extremely high relative to functional issues.
- In this project, the number of user experience improvements is high, but that is not surprising, since we didn’t put much of the UI design in place until late in the project (blue).
- Finally, the number of data defects open at deployment is probably too high (yellow), even though it is not a huge percentage of the total defects (blue). The issue on this project was that the data-code merge wasn’t done until very late, so data issues spiked just before deployment.
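For anyone who wants to recreate this kind of chart from their own tallies, here is a minimal matplotlib sketch of the three series per category – overall (blue), open at deployment (yellow), and closed (green). The counts below are purely illustrative placeholders, not the real numbers from this project:

```python
import matplotlib.pyplot as plt

# Illustrative numbers only -- the real counts from the project
# are not reproduced here.
categories = ["Req. questions", "Functional", "Data", "Change requests"]
overall  = [40, 25, 20, 30]   # blue: all issues logged
open_now = [3, 5, 8, 12]      # yellow: open at deployment
closed   = [37, 20, 12, 18]   # green: fixed and closed

x = range(len(categories))
width = 0.28
plt.bar([i - width for i in x], overall, width, color="tab:blue", label="Overall")
plt.bar(list(x), open_now, width, color="gold", label="Open at deployment")
plt.bar([i + width for i in x], closed, width, color="tab:green", label="Closed")
plt.xticks(list(x), categories, rotation=20)
plt.ylabel("Issue count")
plt.legend()
plt.tight_layout()
plt.show()
```

A grouped bar chart like this makes the relative comparisons above – for example, change requests versus functional issues – easy to eyeball per category.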