Bob Lewis
Columnist

Why IT surveys can’t be trusted for strategic decisions

Opinion
Apr 18, 2023 | 6 mins
IT Leadership | IT Strategy

Whatever you’re researching, you have to decide: Do you want evidence, or just company? Because IT surveys offer insights that shed little light.


Information, according to the mathematical theory that bears its name, reduces uncertainty. If, for example, I tell you I tossed a coin twice, you’ll know there were four equally probable outcomes. But if I then tell you the first toss came up tails, the number of possible outcomes cuts in half: tails/heads or tails/tails.

In this way, the information I have given you has cut your uncertainty in half. Everything we do in IT starts here, with the definition of a “bit.”
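Since this column leans on that definition, here is a minimal sketch of the arithmetic in Python (the function name is mine, purely for illustration): the information a message conveys, in bits, is the base-2 logarithm of how much it shrinks the space of equally likely outcomes.

```python
import math

def bits_of_information(outcomes_before: int, outcomes_after: int) -> float:
    """Information conveyed, in bits: log2 of the reduction in
    equally likely outcomes."""
    return math.log2(outcomes_before / outcomes_after)

# Two coin tosses: four equally probable outcomes. Learning that the
# first toss came up tails leaves two (tails/heads and tails/tails).
print(bits_of_information(4, 2))  # 1.0 -- "the first toss was tails" is worth exactly one bit
```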

And yet when it comes to reading about our industry, we encounter content that too often fails to reduce our uncertainty about a subject in any useful way.

Why do I say that, you ask? One reason is that surveys dominate research into IT practices, and their results generally follow the well-worn template: X percent of Y does or is planning to do Z.

Surveys, that is, only reduce our uncertainty about how many people or organizations are doing something we care about (or are supposed to). And even that is clouded by our lack of certainty as to how truthful the respondents are.

You can’t trust the answers

Let’s take a random example in which a CIO’s survey response indicates they’re planning to rationalize their applications portfolio. That doesn’t mean they’ll get the budget to actually rationalize it. Often their “yes” answer to a question is wistful yearning — something they’d like to do, if only they could.

Or, as they’re being surveyed by a prestigious analyst firm, they don’t want to admit they have no idea what the question means. Or, if they do understand it, they’re embarrassed to admit that even though the analysts tell them they’ll be out of business if they don’t follow this latest industry trend, following it just isn’t in the cards this year.

For the most part, survey value comes down to this: You think your company should be doing something. Someone’s survey associates a big bar with that subject. A big bar looks important. But really, using a survey to justify a course of action is little more than playing follow-the-leader.

Whom you measure matters more than what they say

Surveys also fail to reduce our uncertainty when they aren’t accompanied by an account of who responded — not only which companies or types of company, but also the specific job title or titles. After all, ask a CIO what they plan to spend on next year, compare that to what information technology the CEO or chief marketing officer plans to pay for, and it’s far from guaranteed their responses will sync up.

Survey results offer little more than false precision

Yes, survey perpetrators are getting better about letting us know their survey’s sample size. But does anyone have the time and energy to use this information to compute error bars?

Even if you did, error bars — speaking of uncertainty and the reduction thereof — have an interesting property: They reduce our uncertainty about how certain survey results are.

Error bars are a useful remedy for how so many surveyors indulge themselves in the sin of false precision. They might, that is, “inform” their audience that 53.7% of respondents say they’re doing something or other. This is a bit like the officials at a football game unpiling the stack of players who are all trying to shove the football in a favorable direction, placing the ball in what seems to be a fair spot, then bringing out the chains and measuring, with micrometer precision, whether the team on offense earned a first down.
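In fairness, the arithmetic the surveyors skip isn’t hard. Here is a minimal sketch in Python of the standard margin-of-error calculation for a reported proportion (the sample size of 200 is hypothetical, chosen only for illustration):

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Margin of error for a survey proportion p with sample size n,
    using the normal approximation; z = 1.96 gives 95% confidence."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical survey: 53.7% of 200 respondents say they're doing something or other.
p, n = 0.537, 200
moe = margin_of_error(p, n)
print(f"Reported: {p:.1%}; 95% CI: {p - moe:.1%} to {p + moe:.1%} (+/-{moe:.1%})")
# Reported: 53.7%; 95% CI: 46.8% to 60.6% (+/-6.9%)
```

At that sample size the decimal place is pure theater: the honest statement is “somewhere between the high 40s and the low 60s,” which is exactly the uncertainty error bars would have disclosed.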

Those who survey skew results

Which brings me to one final issue: Here in the world of IT, many of the most prominent firms that conduct surveys and report on the results also pride themselves on being industry thought leaders, oblivious to the logical circularity of their branding.

You might think this carping is less than fair to the community of researchers into IT management practices. After all, just getting a decent value of n for their survey is hard enough.

And it is. But.

The point of a typical survey is to inform its audience that something or other, whether it’s a specific product, class of technology, management practice, workforce preference, or what-have-you, is important enough to pay attention to.

Surveys might accomplish this, if your definition of “important” is alotta, as in “Alotta folks say so.”

But the history of the world is filled with examples of the majority opinion being wrong. Here in IT-land many of CIO.com’s readers will remember, with varying degrees of fondness, IBM’s ill-fated OS/2 operating system, whose success was, according to the surveys of the era, assured.

A possible antidote

On principle, I try not to violate the rule that nobody should identify a problem without proposing a solution. So if surveys aren’t as useful as they purport to be for helping decide what new technologies are likely to matter, what IT services the enterprise should invest in, and what IT management practices should change and how they should change, the question is: What would be more helpful?

My best answer isn’t particularly empirical. It follows this template:

  1. What problem or opportunity are we trying to address? Is it important enough to need addressing?
  2. How does the solution we’re researching try to address the problem or opportunity, and is the solution plausible?
  3. What’s the plan for overcoming all the forms of inertia the solution will have to overcome?
  4. Imagine some strange publisher wants you to write a piece of fiction that tells the story of the solution’s success. Could you imagine authoring a short story or novelette whose average Amazon score would exceed 3 stars, and whose typical review wouldn’t contain the phrase, “Even for fiction this is too stupid to waste time reading”?

No, it isn’t an evidence-based approach. But then, alotta surveys are about the future. And there’s a funny thing about the future — there just aren’t any facts about it to be had.

Yet.

Bob Lewis
Columnist

Bob Lewis is a senior management and IT consultant, focusing on IT and business organizational effectiveness, strategy-to-action planning, and business/IT integration. And yes, of course, he is Digital. He can also be found on his blog, Keep the Joint Running.