A Brief History of Big Data Analytics

I had the opportunity back in June to attend IIA’s CAO Summit in Chicago, where I sat on a panel discussing the topic “Creating an Analytics Culture.”

The session was kicked off by Tom Davenport, the recognized ‘guru’ of analytics who has written several books and many articles on the subject. Tom founded the International Institute of Analytics, and I work with him and his team as a lead faculty specializing in Manufacturing for IIA.

Tom began the discussion with a simple question: What do people think about ‘Big Data’?  In general, he noted, not all comments were positive.  Many people don’t like the term, and he pointed out that analytics is not a new concept.  In fact, Big Data practices have been around for a while, with companies such as UPS doing these things for some time. There is definitely something going on, and we have been heading in a new direction for a while, but it is relatively unstructured at the moment.  Many new kinds of data are being analyzed, and new ways of addressing them are coming to the forefront; there is a new development afoot, called Analytics 3.0.

So what are companies doing with their data? In general, the theme of our panel’s discussion was that companies are using data to make the same decisions, but are making them a lot FASTER! For example, Macy’s can re-price all of its merchandise, every SKU, in 20 minutes instead of 24 hours.  This might allow them, say, to re-price shorts on a hot day in Chicago, and Intel is making digital signage that could enable exactly that.

The gaming industry has been at the forefront of analytics, and it has revolutionized the business.  For example, Caesars Palace is good at detecting when people are unhappy after losing at slot machines – customers who would otherwise walk out the door with a bad gaming experience on their minds.  Unfortunately, they blame Caesars, not the probabilistic nature of casinos.  So the thinking goes: intercept them before they leave with an offer such as, “Here is a free meal at the buffet; maybe when you come back things will be better.”  In such cases, the likelihood that they return rises to 40% (as opposed to 20%).

Companies can also make better decisions with better types of data. For example, United Healthcare wants to know when customers are very upset with them. Its call centers register dissatisfaction in customer conversations, and the company is working to rapidly convert that voice data to text for analysis. This enables quicker intervention and is a better predictor of attrition.

LinkedIn, which had a strong presence at the CAO conference, has been among the most forward in developing a culture of analytics. For example, it is very good at taking the data collected on its social media website and converting it into new data products – such as its “People You May Know” feature.

Tom Davenport also discussed the evolution of analytics, beginning with Analytics 1.0.  This era started in the mid-1950s and consisted primarily of descriptive analytics and reporting, based on analysis of structured internal data.  Small teams were secreted away in backrooms where they did their work, and the purpose was internal decision support. Many versions of 1.0 still exist – as many people still use spreadsheets for analytics.

Then people started to hear about software packages, and a company called SAS came out with some advanced techniques. In reality, though, only 6 to 7 percent of the masses could use these tools, and most of the tools relied on historical data. Tom joked:  “Look backwards – because that is where the threats are!” Most of the effort was spent on getting data ready as opposed to analyzing it.  Analytics, for the most part, stayed inside the sheltered confines of the IT organization, not nearly as widespread or as accessible to the masses as it is today.

Analytics 2.0 started about a decade ago in Silicon Valley. In this environment, things started to look unruly, and the application of unstructured information began to emerge.  The majority of information in big offline companies is unstructured, and there is a whole world of external information, including the Internet, generating this data.  Given genomic data and other large, complex, unstructured data sources, a new role, the Data Scientist, began to emerge.  The term was coined by people at Facebook and LinkedIn, and the online firms started to go crazy as new database products and services emerged.  These companies are in the data economy, and the idea was to put everything possible into a data processing paradigm. But this was not yet big analytics; it was still small math.  Davenport recalled being at a major CPG company and noting that the presentations built on the data consisted primarily of bar charts and pie charts.  His colleague joked, “Welcome to the World of Big Analytics – and Small Math.”

In my next blog post, I will share my thoughts on IIA’s new concept of Analytics 3.0 – and what I think this means for the supply chain.