
Why Predictions Are Not Enough

By Bill Franks, Jan 12, 2017

Analytics Matters

In recent times, I have read a number of articles lamenting the frequent lack of value resulting from large-scale analytics and data science initiatives. While I have seen substantial value driven by many efforts, I have also seen examples where the results were very poor. My belief is that the problems can often be boiled down to one basic mistake: thinking that generating predictions, forecasts, or simulations is enough. It is not.

Predictions Are The Starting Point…

Almost by definition, advanced analytics or data science initiatives involve applying some type of algorithm to data in order to find patterns. These algorithms are typically then used to generate one or more of the following:

  • Predictions about future events. For example, who is most likely to respond to a given offer?
  • Forecasts of future results. For example, what sales can we expect from the upcoming promotion?
  • Simulations of various scenarios. For example, what will happen if I shift some of my budget from paid search to television advertising?

There are other uses of algorithms, and nuances between different types of predictions, but for our purposes these three examples suffice to illustrate the point.

In each case, the output is information about what might be expected in the future given one or more specified conditions. Such information is critical. However, by itself, it doesn’t solve anything, as no value is actually achieved. Simply creating and storing the output of the algorithm provides the potential for value, but leaves you short of the finish line.
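To make the gap concrete, here is a minimal sketch of the "prediction" step for the first example above. The file names, column names, and model choice are hypothetical placeholders rather than a prescribed implementation; the point is simply that this step ends with scores sitting in a table, nothing more.

```python
# A minimal sketch of generating and storing response predictions.
# File names, column names, and the model choice are illustrative assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Historical data: customer features plus whether each customer responded to a past offer.
history = pd.read_csv("past_campaign.csv")          # hypothetical file
features = ["tenure_months", "avg_monthly_spend", "prior_purchases"]

model = LogisticRegression(max_iter=1000)
model.fit(history[features], history["responded"])

# Score the current customer base and persist the predictions.
customers = pd.read_csv("current_customers.csv")    # hypothetical file
customers["response_probability"] = model.predict_proba(customers[features])[:, 1]
customers[["customer_id", "response_probability"]].to_csv("scores.csv", index=False)
```

At this point nothing has changed for the business: the scores are exactly the kind of stored output described above, still waiting for someone, or something, to act on them.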

…But Not The Ending Point

At a minimum, you must enable business people to access, interact with, and act upon the results of an analysis. In other words, what decisions will be improved as a result of the findings of the analysis? Going back to our examples from the last section:

  • Given that we know how likely each customer is to respond, a marketing executive can determine an appropriate pool of people to receive an offer.
  • Given that the sales forecast is above normal, an executive can request some additional inventory to be delivered.
  • Given the lift possible if some funding is shifted from paid search to TV, an executive can suggest some rebalancing.

The point is that without some action, the results do nothing. They are simply potential results waiting to be realized. This isn’t too dissimilar from a refrigerator stocked with everything you need for a gourmet meal. It is easy to get the ingredients, but the real investment is cooking the meal. Unless you actually spend the time to cook it, you will either go hungry or live off peanut butter sandwiches.

A big part of any analytics initiative, therefore, has to be giving consideration to how the relevant business people will be able to work with the results, determine what actions they’d like to take, and then make those actions happen. This means planning for and delivering dashboards and interactive interfaces along with the detailed data output.

Taking Results A Step Further

Note that in the prior section, the onus was placed upon an executive to interact with, interpret, and act on results. Sometimes this is necessary, but often it is a burden that an executive has neither the experience nor the time to deal with. To really drive value, you need to make your analytics prescriptive. This was the central premise of my book The Analytics Revolution, and it is something that many organizations are still failing to do.

To get there, it is necessary to automate the interpretation of the results, and the resulting actions, as much as possible. It simply isn’t possible to scale processes that require intensive manual intervention. Even if a person must still approve the actions, having a process that provides recommendations, along with supporting data that explains them, will greatly streamline the process.
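As a rough illustration of that automation, a sketch along these lines turns the stored scores into a recommended action together with its supporting rationale. The thresholds, economics, and field names are assumptions made for the sake of the example, not a prescribed design.

```python
# A sketch of the prescriptive step: translate a raw score into a recommended
# action plus the supporting data an approver would need to see.
# The business rules and numbers here are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Recommendation:
    customer_id: str
    action: str
    rationale: str

def recommend(customer_id: str, response_probability: float,
              expected_margin: float, offer_cost: float) -> Recommendation:
    # Simple expected-value rule: send the offer only if it is expected to pay for itself.
    expected_value = response_probability * expected_margin - offer_cost
    if expected_value > 0:
        action = "send offer"
        rationale = (f"response probability {response_probability:.0%}, "
                     f"expected value ${expected_value:.2f} after ${offer_cost:.2f} offer cost")
    else:
        action = "hold"
        rationale = (f"expected value ${expected_value:.2f} is negative at "
                     f"{response_probability:.0%} response probability")
    return Recommendation(customer_id, action, rationale)

# A person can approve this output, or a campaign system can execute it directly.
print(recommend("C-1042", response_probability=0.31, expected_margin=40.0, offer_cost=2.5))
```

Whether that output feeds an approval dashboard or flows straight into the campaign system, the interpretation work has been moved out of the executive’s head and into a repeatable process.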

While I can’t disagree with the concern that some analytics initiatives fail to deliver, I do think that if you examine the failures closely, you’ll see that they often failed to go beyond the generation of predictions, forecasts, or simulations. Without enabling action to be taken based upon those results, whether via manual interaction or automation, it is hard to achieve success.

Originally published by the International Institute for Analytics

About the author


Bill Franks is IIA’s Chief Analytics Officer, where he provides perspective on trends in the analytics and big data space and helps clients understand how IIA can support their efforts and improve analytics performance. His focus is on translating complex analytics into terms that business users can understand and working with organizations to implement their analytics effectively. His work has spanned many industries for companies ranging from Fortune 100 companies to small non-profits.

Franks is the author of the book Taming The Big Data Tidal Wave (John Wiley & Sons, Inc., April, 2012). In the book, he applies his two decades of experience working with clients on large-scale analytics initiatives to outline what it takes to succeed in today’s world of big data and analytics. Franks’ second book The Analytics Revolution (John Wiley & Sons, Inc., September, 2014) lays out how to move beyond using analytics to find important insights in data (both big and small) and into operationalizing those insights at scale to truly impact a business. He is an active speaker who has presented at dozens of events in recent years. His blog, Analytics Matters, addresses the transformation required to make analytics a core component of business decisions.

Franks earned a Bachelor’s degree in Applied Statistics from Virginia Tech and a Master’s degree in Applied Statistics from North Carolina State University. More information is available at www.bill-franks.com.

