Why you have to do your own competitor comparison
People evaluating our software often ask us to send them an analysis document showing how we stack up against our competitors. It’s a very reasonable request, but it’s also one I always turn down, so I thought it would be worth sharing why I don’t think these documents are meaningful.
- They’re selective
During elections, politicians spend a lot of effort not on debating issues, but on trying to shift the focus of conversation towards topics where they’re likely to win – there are some policy areas where candidates and parties can expect to come out better.
If you’re trying to persuade a company to buy your product instead of a competitor’s, it’s a bit like running for election. Competitor comparisons are all about demonstrating why you’re better, and the easiest way to do that is making people think about the areas where you come out top.
To take an example, here’s a screenshot from a document published by Idera (April 2016) comparing their monitoring tool to several others. It continues this way through 16 items, and it’s a miracle – they’ve managed to build a product which is better than everything else on the market on every conceivable axis!
Companies do this all the time, with the occasional nice touch of losing in a single category to prove their impartiality. Everybody not on medication for clinical gullibility knows this, but we still waste time reading them. There’s certainly a place for companies talking about the general areas where they think they have an advantage, or why they benefit certain groups of people, but this kind of feature-by-feature tick-box comparison isn’t it.
- The information is unreliable
Looking at that same comparison above, there are some suspicious gaps. SQL Monitor’s support for TempDB is just as capable as its support for other databases, and for what it’s worth, so is each of the other vendors’. SQL Monitor also has great support for AlwaysOn Availability Groups. And those certainly aren’t the only errors on the page.
These kinds of comparisons are usually put together by people with limited product knowledge, and certainly limited knowledge of their competitors’ products. It’s hard to know whether they misunderstand product documentation, or just lie for simplicity, but either way, information is often wrong (usually in a convenient way).
Two tricks I find especially fun are when companies list ludicrously niche features which their competitors lack, or list common features under special names. Here for example, a company called LogoMaker boasts about being the only company to offer their specific (“love your logo”) guarantee package. Who would have guessed?
- Features aren’t comparable
Reducing product capabilities to boolean feature comparisons misses much of the richness of what makes one preferable over another for YOU. In the LogoMaker example, they say “All icons designed by paid professionals”. That comparison misses important detail, because their paid professionals might suck, while another company might have great professionals (or indeed great unpaid student interns).
For all but the most commoditised products, feature implementations are different, so the useful question isn’t “which products have that feature?” but “which is best in that category for my needs?” If you were doing your own comparison of LogoMaker vs LogoGarden, you wouldn’t want to know which ones have more than 50 fonts, you’d want to know which one’s fonts you prefer.
- They miss what actually matters
It’s rare that one product is unequivocally better than all others in all ways for all users. Different people have different needs, personal preferences, and experience, which makes decisions subjective. If you want to compare two products, you have to try them both yourself.
There’s a huge amount you won’t find on these comparison charts, like whether you trusted the people you dealt with at each company, or how fast and reliable the products seemed. Above all, don’t underestimate the importance of how much you just naturally like a product – something which is hard to quantify but makes a big difference to how much you’ll enjoy using it.
Don’t get me wrong, it’s definitely a good idea to talk to vendors about their own product, including a conversation about the ways they think they have an edge over their competitors. But if you let Vendor A give you a simplistic tick-box view of competing Vendor B, you’re only going to get burned.