Peter Harrad discusses some of the caveats involved in relying solely on tool certifications for standards compliance when evaluating a tool.
I’ve answered plenty of Requests for Proposal and Requests for Information in my time, and one thing that was drilled into me was that the people who review the responses start by looking at one thing – the proportion of answers that say ‘compliant’, ‘partially compliant’, ‘non-compliant’ and so on. It can take control of your mindset. It does. One time I got to bed at 5am after finishing a 700-page RFP response and had to go into the office the next day to get a specification released. Someone asked me if I fancied a drink at lunchtime (it was Friday) and I instinctively replied “Fully Compliant”. The thing is, most tool certifications reflect the same thinking.
Summary: Blindly relying on tool certifications for standards compliance is dangerous. The requirements are often open to interpretation, so every vendor ends up with a high number of compliant responses… and the requirements themselves are susceptible to pressure and lobbying. The best way to use certifications is during a tool evaluation: where an area matters to you, the responses a vendor gives in that area can provide talking points.
The first and most important point to be aware of is where the certification requirements come from. In general, they come from the very vendors that fill out the questionnaires: the vendors are often members of the standards body that owns the standard, and they are the natural candidates to form the working party that writes the requirements. By itself, this is not sinister, but I am aware of at least one tool certification that was held up for a year by attempts from one or two vendors to insert requirements that only their own product could fulfill.
The other side of the coin to skewed requirements is vague ones. Certification questionnaires are often worded vaguely so as not to favor one tool over another. To take the TOGAF questionnaire as one representative example, “Tool facilitates the development and review of work packages” is one of 650 compliance points. But *how* does the tool do this? Even the most conscientious vendor is not going to write a full explanation for each of 650 points, so the answer becomes ‘yep, we do that’.
The final reason not to blindly trust tool certification questionnaire responses is that the vendors themselves are the ones who fill them out – it’s a self-certification system. As mentioned above, vendors are used to the RFP response game, where what counts most is how many times you say ‘compliant’. So when you combine this impetus with the vagueness described above, you end up with a uniform sea of ‘compliant’ responses.
So where does this leave the outsider considering tools? Are the questionnaires useless? No. First of all, participation does at least show a commitment to the standard – there is invariably a fee for submitting a certification questionnaire. More than that, the questions and responses can provide talking points during an evaluation. The requirements in these questionnaires are not plucked out of thin air – they do relate to the standard. So scanning the list of requirements in a questionnaire can be a better source of requirements for your own evaluation, or simply of questions to ask in a demo, than paging through the standard itself.