
Pitfalls When Measuring The Success Of Analytics Programs

There are many factors that go into making an enterprise analytics and data science program a success. At IIA, the application of our Analytics Maturity Assessment methodology to hundreds of companies over the past several years has allowed us to identify some important and intriguing patterns. Recently, my colleague David Alles wrote about some of the challenges faced when advancing in maturity. Here, I’ll walk through a few of the patterns IIA has identified that can appear counter-intuitive at first but make perfect sense upon reflection.

Capability Without Awareness & Adoption = Failure!

Many analytics and data science organizations have made the mistake of focusing purely on the technical aspects of progress. What is often neglected is the ongoing communication and internal marketing of that progress to stakeholders throughout the enterprise. Those involved in a major project will be acutely aware of its potential and progress, but without expanding that awareness to the eventual users and beneficiaries, progress will not be subjectively perceived and recognized.

It would be terrific if successfully implementing world class analytical platforms, tools, and talent led directly to success. Unfortunately, such efforts are necessary but not sufficient. Success is not only dependent on objective increases in capability, but also on subjective awareness of and excitement about that progress. Further, real success requires the business community to actually adopt and drive value from the improved capabilities. Unless value is actually created, the capability enhancements are not a success, but an expensive, theoretical opportunity waiting to be taken advantage of. Bottom line: capabilities do not make success, results do! Don’t fall into what David Alles calls the “checkbox” trap.

This can be immensely frustrating for an analytics organization. The team diligently focuses on making technical progress, but that progress isn’t recognized. IIA’s surveys have uncovered this pattern many times. However, if you’re not helping stakeholders to see and understand the progress being made, how can they be expected to recognize it, let alone make use of it? It is absolutely critical to develop and implement a program of frequent internal communication covering:

1) What is planned for the future?

2) What progress is being made currently?

3) What are the newest capabilities stakeholders have access to today?

In an ideal world, stakeholders would seek out and keep up with progress on their own. In reality, it won’t work that way.

Technical Progress Necessarily Precedes True Success

Some of the analytics and data science organizations IIA has worked with have been frustrated because they made true, objective progress in the capabilities they delivered, yet survey scores did not reflect that progress. One important point to keep in mind in these situations is that objective capability enhancements necessarily precede both subjective recognition of that progress and objectively realized results. This is because without the enhancements in place, stakeholders can’t take advantage of them. The result is a lag between when a new capability becomes available and the adoption and usage of that capability. A further lag then exists between adoption and usage and the end goal of value generation.
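The two lags described above can be sketched numerically. This is purely an illustrative model, not IIA methodology: the linear adoption ramp, the lag lengths, and the value figure are all assumptions chosen to make the shape of the pattern visible.

```python
# Illustrative sketch (assumptions, not IIA data): a capability becomes
# available at month 0, adoption ramps up over several months, and value
# realization lags adoption by a further delay.

def adoption_rate(month, ramp_months=6):
    """Fraction of stakeholders using the capability (assumed linear ramp)."""
    return min(max(month, 0) / ramp_months, 1.0)

def value_realized(month, value_lag=3, max_value=100.0):
    """Value generated per month, lagging adoption by `value_lag` months (assumed)."""
    effective = month - value_lag
    return max_value * adoption_rate(effective) if effective > 0 else 0.0

for m in range(0, 13, 3):
    print(f"month {m:2d}: adoption {adoption_rate(m):.0%}, value {value_realized(m):.0f}")
```

Under these assumed parameters, adoption reads 0% at launch and value stays at zero for the first few months even though the capability is fully delivered — which is exactly why survey scores taken early can look flat despite real progress.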

It is important to anticipate and plan for this lag so as not to get discouraged. Turning back to the communication plans discussed previously, it is critical to prepare stakeholders for the delay between capabilities and results. Be sure stakeholders are aware of the progress being made, as well as when they will actually be able to see and make use of that progress. Unless you proactively manage people’s expectations to keep those expectations realistic, you can be certain that the expectations will become unrealistic. Once unrealistic expectations take hold, it can be very difficult to recover from the disappointment that follows.

A Raised Awareness Level Can Hurt In The Short Term

Another counter-intuitive scenario is one where an analytics organization does a good job of raising awareness within the stakeholder community of what is possible and what capabilities are coming. Ironically, stakeholder sentiment can drop in the short term as a result of this new level of transparency. This is because while the intent of the communication is to focus people on all of the positive change coming, the reality is that a lot of eyes are also being opened to what has been missing all along. Once stakeholders better understand what capabilities they don’t have access to today, their subjective sentiment can turn more negative because they now know what they are missing!

In the long run, of course, raising stakeholder expectations and meeting those expectations with greater capabilities will be good for everyone and stakeholder sentiment will rise. However, it can be quite demoralizing to see short-term drops in sentiment as a result of what was meant to be a very positive and motivating view of the future. The way to soften the blow of this counter-intuitive pattern is by explaining up front that such a pattern has been seen in other organizations – and why. Prepare stakeholders to recognize when they enter this cycle.

Navigating The Short Term Bumps

Evolving an enterprise analytics and data science program requires tracking your progress objectively with a formal survey that includes stakeholder sentiment along with technical progress. At the same time, as IIA has seen with our Analytics Maturity Assessment work, those surveys can yield results that might appear both counter-intuitive and negative without taking into account the larger context of progress within which the surveys are collected.

To sum up, there are a few recommendations that will help your team avoid becoming dejected and/or taking a public beating if objective progress and subjective sentiment are not initially fully correlated:

1) Never forget that technical progress will not matter if stakeholders are not made aware of the progress and enabled to take advantage of the new capabilities.

2) Remember that it takes time for a new capability to move from availability, to adoption, to realization of value. Being “done” rolling out a new capability is not the end of the story.

3) As you make stakeholders aware of the exciting new capabilities coming, you’re also making them more aware of what they are missing today. This can lead their sentiment to become more negative in the short term. Don’t let this be a surprise.

By planning for both the objective and subjective aspects of progress, you’ll be much more likely to succeed in your efforts to improve your organization’s analytics and data science capabilities and the impact those efforts have on the enterprise. The best long-term outcomes will occur if you successfully raise expectations while meeting or exceeding those raised expectations. Just remember that in the short term, things can be perceived as worse even as the groundwork is being laid for a much better future.

Bill Franks, Chief Analytics Officer, helps drive IIA's strategy and thought leadership, as well as heading up IIA's advisory services. IIA's advisory services help clients navigate common challenges that analytics organizations face throughout each annual cycle. Bill is also the author of Taming The Big Data Tidal Wave and The Analytics Revolution. His work has spanned clients in a variety of industries for companies ranging in size from Fortune 100 companies to small non-profit organizations. You can learn more at http://www.bill-franks.com.


Originally published by the International Institute for Analytics