I am of the opinion that there is a clear difference between overall project success and the performance of the IT team supporting the project. You can find my detailed discussion of this viewpoint here.
Given that the related and symbiotic efforts of the Business and IT teams working toward a common objective can have fundamentally different outcomes, how can we measure the performance of the IT team independently of the overall project outcome it is supporting? What artifacts and models (or derivations therefrom) from the ArgonDigital methodology can be used to assist in this effort?
1. Business Objectives Model (BOM)
The success metrics defined to measure the objectives identified in the BOM must be defined for BOTH the Business and IT teams. For example, if a project has a stated objective of “increasing online sales by 1 Million dollars per annum,” then we can define two sets of success metrics.
a. For Business – Measure annualized sales before and after the implementation of the new web storefront. This will measure whether or not the business objectives of the project were realized.
This is a simple measurement that will immediately let us know whether or not we are on track to hit the 1 Million dollar per annum sales increase target.
b. For IT – Measure the time and cost to deliver the feature set improvements needed to increase online sales by 1 Million dollars per annum.
This will measure the ability of the IT team to deliver the needed functionality within the time and cost parameters agreed to at the beginning of the project. Cost and/or time overruns will negatively impact the ability to hit the 1 Million dollar increase while simultaneously reducing the returns realized from any increase in sales.
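The dual measurement above can be sketched as two small calculations. This is a minimal illustration only; all figures (baseline sales, budget, schedule) are hypothetical placeholders, and the function names are my own, not part of the ArgonDigital methodology.

```python
def business_metric(sales_before: float, sales_after: float) -> float:
    """Business view: annualized sales increase after the new storefront."""
    return sales_after - sales_before

def it_metrics(planned_cost: float, actual_cost: float,
               planned_weeks: float, actual_weeks: float) -> tuple[float, float]:
    """IT view: cost and schedule variance as percentages of the agreed baseline."""
    cost_variance = (actual_cost - planned_cost) / planned_cost * 100
    schedule_variance = (actual_weeks - planned_weeks) / planned_weeks * 100
    return cost_variance, schedule_variance

# Business view: did we hit the 1 Million dollar per annum target?
increase = business_metric(sales_before=4_000_000, sales_after=5_100_000)
print(f"Sales increase: ${increase:,.0f}")  # target: >= $1,000,000

# IT view: did we deliver within the agreed cost and time?
cv, sv = it_metrics(planned_cost=500_000, actual_cost=550_000,
                    planned_weeks=26, actual_weeks=30)
print(f"Cost overrun: {cv:.0f}%, schedule overrun: {sv:.1f}%")
```

Keeping the two views as separate calculations is the point: the Business metric can succeed while the IT metrics fail, and vice versa.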
The tacit assumption made in this kind of dual measurement is that the feature set created to meet the business objective of increasing sales will actually support and contribute to the targeted improvement. It is obvious that a poorly defined feature set or a botched implementation can cause the failure of the overall effort even if the IT team meets the cost and time constraints. So, while measuring the performance metrics of the IT team is a good start, it is clear that we need additional measurements that address the actual definition and implementation of features.
2. Key Performance Indicators (KPI)
While KPIs are typically associated with IT projects involving process improvements or system replacement, there is no reason why they cannot be used in practically any type of process-heavy project, including Greenfield projects. KPIs can be defined for any and all parts of a process, including individual steps.
Consider, for example, that one of the planned features for the 1 Million dollar annual sales increase project is “Single Click Order Creation.” A KPI can easily be defined in terms of the maximum number of steps or clicks that a user will be required to perform under certain conditions to place an order. This KPI then becomes one of the easily measurable success metrics that the IT team will need to satisfy. Similar KPIs can be defined for key portions of the new functionality, all of which can be measured at project completion.
Taken collectively, these KPIs can be a very objective measure of the quality of the final product delivered by IT to meet the objectives. The advantage of this method is that we are measuring not just the financial and budgetary performance of an IT team on a project but also the quality of functionality that will directly impact the user experience and sales. KPIs serve as a common understanding of quality that both Business and IT teams can define and agree on well before any code is written, let alone delivered.
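The “Single Click Order Creation” KPI lends itself to an automated acceptance check, sketched below. The recorded flow, the step naming scheme, and the limit of one click are all hypothetical values that the Business and IT teams would agree on; nothing here comes from a specific tool.

```python
def count_user_clicks(flow: list[str]) -> int:
    """Count the clicks a user must perform in a recorded UI flow."""
    return sum(1 for step in flow if step.startswith("click:"))

# Agreed KPI: a returning user with a saved payment method needs at most 1 click.
KPI_MAX_CLICKS = 1

# Hypothetical recorded flow for a returning user placing a repeat order.
recorded_flow = ["view:product_page", "click:buy_now", "view:confirmation"]

clicks = count_user_clicks(recorded_flow)
print(f"Clicks to order: {clicks} (KPI max: {KPI_MAX_CLICKS})")
assert clicks <= KPI_MAX_CLICKS, "KPI not met: too many clicks to place an order"
```

Because the KPI is a hard number, the check either passes or fails at project completion; there is no room for interpretation.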
3. Non-Functional Requirements
These are perhaps the most overlooked requirement artifacts, invaluable both as a means of conveying the qualitative aspects of a product AND as a tool to measure IT success. My experience has been that teams spend almost all their time exclusively on defining the functional requirements. Non-functional requirements are usually a hasty afterthought, assuming they are addressed in any meaningful way at all.
Most IT project failures I have seen have their roots in the non-functional aspects of the delivered product. Products are rejected for poor usability, poor performance, unintuitive interfaces, inability to scale, weak security, and other reasons that are usually not related to the actual functionality or features provided. Most teams are pretty good when it comes to defining WHAT a product must do (functional requirements) but pretty bad at specifying HOW it must do these things (non-functional requirements).
I will stick my neck out and go so far as to say that a well-defined set of non-functional requirements can be used directly to measure IT success on a project. If we assume that the feature set is fundamentally sound, then the success of the delivered product centers directly on the user experience.
For example, does the system slow to a crawl as the number of users accessing it increases during peak traffic hours? – Performance
For example, is sensitive user data exposed? – Security
For example, do many users abandon their carts midway through checkout because it is too cumbersome? – Intuitiveness
For example, do many users stop visiting the site because of an onerous login process? – Usability
All of these issues and more can easily be addressed with good non-functional requirements. Clearly written non-functional requirements can easily be tested and accepted by internal users before the product is actually released to the general public. If these “soft” requirements are satisfied, then the IT team has been successful; if not, it has failed. It is almost as simple as that, at least when it comes to the qualitative aspects of a product.
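The pass/fail character of well-written non-functional requirements can be sketched as measurable checks. The thresholds and sample measurements below are hypothetical; in practice the values would come from the agreed NFRs and from load-test or analytics tooling.

```python
def percentile(values: list[float], pct: float) -> float:
    """Nearest-rank percentile of a list of measurements."""
    ordered = sorted(values)
    rank = max(0, round(pct / 100 * len(ordered)) - 1)
    return ordered[rank]

# Performance NFR (hypothetical): 95% of page loads under 2.0 s at peak load.
peak_load_times_s = [0.8, 1.1, 0.9, 1.6, 1.4, 1.2, 1.9, 1.0, 1.3, 1.7]
assert percentile(peak_load_times_s, 95) <= 2.0, "Performance NFR failed"

# Intuitiveness NFR (hypothetical): cart abandonment during checkout under 25%.
checkouts_started, checkouts_completed = 400, 330
abandonment = 1 - checkouts_completed / checkouts_started
assert abandonment < 0.25, "Intuitiveness NFR failed"

print("All measured NFRs satisfied")
```

Each NFR reduced to a number like this can be tested with internal users before release, which is exactly what makes it usable as an IT success measurement.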
As this post shows, there is no need for significant extra effort to measure IT success independently of the overall Business outcome. By using the same models, and the requirements (functional and, most importantly, non-functional) derived from them, it is simple to establish clearly defined IT success measurements that the IT team can deliver on and be evaluated against.