Selecting a Content Management System
It may not be sexy, but selecting the right software tools can have positive effects on your business for years to come. On the other hand, a poor choice can ripple through your organization for years, leaving devastation in its wake. Overdramatic? Perhaps, but those forced to live with poorly vetted enterprise-level software might beg to differ. The right fit can be like a dance partner who knows how to move with you, anticipates your next steps, and ultimately helps deliver a winning performance. The alternative makes you feel like you have two left feet and leaves you questioning your own abilities.

Unfortunately, many companies still spend more time arguing over the fonts and colors on their website than they do performing real due diligence in selecting a new marketing platform or content management system. The assumption is either that “they’re all pretty much the same” or “the team can make it work”. I’m fairly confident that we’ve all been there at one time or another and can say without hesitation that they are not all the same, and “making it work” is very different from “making it work for us”.
This raises the question – why does this happen? See if any of these sound familiar:
- “We used XYZ for our content management system at my last company and it worked out really well. Besides, I know a guy over there who can hook us up”
- “I know our competitor uses ABC for their file management software, and if it works so well for them it should be good enough for us”
- “I saw this comparison chart that listed the top 5 CRMs on the market. 4 of them looked really expensive and I’m sure our IT group can make the other one work just fine. They all look like overkill anyway”
- “I don’t use this stuff anyway, but Marketing/IT/Sales/HR says that this is what we need”
The reality is that software selection is a specific skill that few have the opportunity to develop. There are established processes and methodologies to follow, although most of us rely instead on the promises and presentations of sales reps who are suddenly difficult to reach as soon as the virtual ink on the contract dries. There’s a better way than throwing darts at a wall. Let’s take a look at a project run by our team to help a state agency select a file management tool and see what insights can be learned from their experiences.
A little backstory –
The agency (we’ll call it Little State Agency or LSA for short) is tasked with managing the investment of state funds under the auspices of a much larger parent agency, a job it has been doing for over 150 years. This investment portfolio is currently worth in the neighborhood of $22 billion. As these are public funds, the LSA has a fiduciary responsibility to both the residents and the state government. Not surprisingly, there is a huge amount of financial documentation, both current and historic, that is required to remain in compliance with state auditors, as $22 billion comes with more than a little oversight. Historically, these have been paper files stored on site – challenging to organize and retrieve, and lacking the physical security that modern document management demands. (If you are picturing the final scene of “Raiders of the Lost Ark,” you’re not too far off.)
There have been attempts made over the years to digitize the files, most recently with the parent agency selecting and implementing an ECM (Enterprise Content Management) system. Unfortunately the selection process didn’t give much weight to the unique needs of the LSA, although the project did increase awareness of the issues facing any new implementation. Very quickly, the “something is better than nothing” motto gave way to a heightened set of expectations related to ease of use, integration with other systems, maintenance, and workflow management. The LSA was able to marshal support for another solution that was better suited to their needs and turned to ArgonDigital to facilitate the selection process.
One of the advantages of bringing ArgonDigital in is that we offer a fresh, outsider perspective. We can ask all those questions about the organization, its past experiences, and its expectations that an insider might feel uncomfortable, or even dumb, asking.
We began the engagement by having everyone at the agency participate in a prioritization exercise to identify the drivers and tradeoffs for the project. We learned very quickly that user adoption was the number one priority. Not only did all future work documents need to be digitized and stored, but those mountains of existing files would also have to be processed. The team wisely realized that unless everyone was on board with the new system, there could be pushback against the effort to convert all of the historical records.
The next step for ArgonDigital was to review more than fifty vendors to narrow the list to three top choices for deep analysis. To arrive at a recommendation, eight factors were prioritized through the lens of User Adoption: Functional Requirements, Integrations, Workflows, Records Management, Use Case Requirements, System Requirements, Pricing, and Usability, each containing between five and thirty-five separate requirements to be evaluated. The output of this exercise was a blind, weighted scorecard that would be the driver behind the selection process for the three finalists.
The concept of this type of scorecard might be novel to many. Imagine a long list of requirements down one side of a spreadsheet, with columns for descriptions, notes, and a numerical rating – in this case from 1 to 5. Each requirement should be judged independently for each of the three competitors, as we are looking for absolute, not relative, scores, so a separate sheet for each is necessary. If solution A scores a 4 on “Accessible via an API,” that means it is very good, albeit not perfect, in this area. It doesn’t matter how it ranks against the others – we’re not grading on a curve. If the scorecard were the entire process, we would simply total up all the scores at the bottom of the page and declare a winner.
Here’s the secret sauce: remember the prioritization exercise we did earlier? Each requirement maps to one of the eight factors we ranked during the elicitation sessions, so some of those 4s we scored could be worth 12 points, while others would be worth far less. These weighted totals would only be shown on a separate tally sheet, where all three finalists’ scores would be broken down by category and a winner crowned. Done properly, the reveal can add a little fun and drama to the process, as everyone is keeping a running (imperfect) tally in their head!
The importance of the scorecard cannot be overstated. People often enter these evaluations with both implicit and explicit biases that can make the process less than impartial. They might have prior experience with one or more of the finalists that clouds their judgment. Sometimes a trusted friend will weigh in with a strong opinion that lacks context. People may forget the organizational priorities and substitute their own. On more than one occasion, we’ve discovered the initial prioritization turned out not to be as accurate as it needed to be. (Sometimes Pricing suddenly becomes much more important after the scores are tabulated, for example.)
Imagine how much fun it would be to redo the analysis from the beginning. Fortunately, with our impartial, weighted scorecard, the in-depth analysis can remain intact, and the final tally can be adjusted automatically to reflect the newly discovered priorities.
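For the spreadsheet-averse, the mechanics of the weighted tally can be sketched in a few lines of Python. The eight factor names come from the process described above; every weight and raw score below is an invented illustration, not the LSA’s actual data:

```python
# Relative importance of each factor, as ranked in the prioritization
# exercise. (Weights here are made up for illustration.)
weights = {
    "Functional Requirements": 3,
    "Integrations": 2,
    "Workflows": 2,
    "Records Management": 3,
    "Use Case Requirements": 2,
    "System Requirements": 1,
    "Pricing": 1,
    "Usability": 3,
}

# Each finalist's raw score (1-5) per factor, judged on an absolute
# scale -- no grading on a curve. (Scores are also invented.)
raw_scores = {
    "Vendor A": {"Functional Requirements": 4, "Integrations": 3,
                 "Workflows": 4, "Records Management": 5,
                 "Use Case Requirements": 4, "System Requirements": 3,
                 "Pricing": 2, "Usability": 5},
    "Vendor B": {"Functional Requirements": 3, "Integrations": 4,
                 "Workflows": 3, "Records Management": 4,
                 "Use Case Requirements": 3, "System Requirements": 4,
                 "Pricing": 5, "Usability": 3},
    "Vendor C": {"Functional Requirements": 5, "Integrations": 2,
                 "Workflows": 3, "Records Management": 3,
                 "Use Case Requirements": 4, "System Requirements": 3,
                 "Pricing": 4, "Usability": 4},
}

def weighted_total(scores, weights):
    """Multiply each factor's raw score by its weight and sum."""
    return sum(scores[factor] * w for factor, w in weights.items())

def winner(raw_scores, weights):
    """The finalist with the highest weighted total."""
    return max(raw_scores, key=lambda v: weighted_total(raw_scores[v], weights))

first_pick = winner(raw_scores, weights)

# If priorities shift after the fact -- say Pricing suddenly matters
# far more -- only the weight changes. No requirement is re-scored;
# the tally adjusts automatically.
weights["Pricing"] = 5
revised_pick = winner(raw_scores, weights)
```

With these made-up numbers, bumping the Pricing weight flips the result toward the finalist with the strongest pricing score, which is exactly why keeping the raw scores separate from the weights pays off: a late change in priorities means editing one number, not redoing the analysis.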
Not to spoil the ending, as that will be reserved for a future case study, but a clear winner emerged based on the priorities defined by the LSA. Did they end up with the best ECM on the market? Perhaps. Did they end up with the best ECM system for them? Most definitely.