Data-Driven Decision Making - Part 1: A Single Source of the Truth?
By Geoffrey Moore, Nov 07, 2017
Data-driven decision-making: who doesn’t think it is a good idea? But it typically has a rough go in the real world of enterprise management, in part because the data itself often proves unreliable. For much of my business life, IT has been tasked with building systems that could represent a single source of the truth. Unfortunately, that quest proved to be right up there with the holy grail and the fountain of youth—at best aspirational, at worst delusional.
Today we have an opportunity to make a great leap forward, however, because for the first time in history we have broad access to high-volume data from a variety of sources that, when matched against each other, dramatically increase the probability of something like truth, and do so in a time window that is actionable. Not everyone, of course, has access to all the sources, so to kick things off let me present a framework of the possible, within which each organization can determine what its actual will be.
Systems of record occupy the bottom step in this maturity model, having been prevalent for closing in on three decades. They are the foundational business systems that consolidate around ERP and explicitly capture all the material transactions of the enterprise in a relational database from which they extract and present data to help manage internal operations. We know them well and could not operate without them.
Systems of engagement, by contrast, are relatively new to the scene, be they in service to marketing automation, customer service, or sales enablement. Here again the transactions are explicit, but the data now reflects external market signals to help managers better align their operations with stakeholders outside the enterprise. Most enterprise IT organizations today are focusing the bulk of their attention at this level, the goal being to “get digital.”
The next level up, systems of intelligence, is just emerging onto the scene, having become highly visible in the leading disruptive enterprises that are currently eviscerating their competition. These are analytical systems that operate primarily on tacit rather than explicit data, extracted from the log files of websites, mobile phones, sensor-enabled devices, network traffic, and the like, supplemented with public data streams from Facebook, news broadcasts, and their ilk. Log data is tacit because it is typically being collected for syntactic rather than semantic reasons—meaning it helps with the technical operation of the digital system and is not engaged with its content. What systems of intelligence are able to do, however, is infer from these syntactic signals conclusions that have great semantic value—for example, a person’s propensity to buy, to click on an ad, to cancel a subscription, or to perpetrate a network attack or fraudulent transaction.
Finally, at the highest level, just coming out of the labs in most cases, are systems of autonomy, be they drones, self-driving cars, computer-navigated tractors, self-adjusting thermostats, autonomic factory equipment, or the like. These are real-time operational systems that are also being guided primarily by tacit data coming from a wide variety of sources and sensors, including GPS, lidar, radar, sonar, accelerometers, barometers, maps, video, and the like. From all these signals these systems are able to infer position, attitude, and current condition relative to a given mission, and prosecute a path through the physical world accordingly.
These four very different kinds of systems generate four very different kinds of data in service to four very different purposes, as summarized below:
Now, given these four very different types of data sources, how should we be thinking about data-driven decision making? Generically it is clearly desirable, but what it entails varies considerably.
Originally published on LinkedIn Pulse.
About the author
Geoffrey Moore is an author, speaker, and advisor who splits his consulting time between start-up companies in the Mohr Davidow portfolio and established high-tech enterprises, most recently including Salesforce, Microsoft, Intel, Box, Aruba, Cognizant, and Rackspace.
Moore’s life’s work has focused on the market dynamics surrounding disruptive innovations. His first book, Crossing the Chasm, focuses on the challenges start-up companies face in transitioning from early adopters to mainstream customers. It has sold more than a million copies, and its third edition has been revised such that the majority of its examples and case studies reference companies that have come to prominence in the past decade. Moore’s most recent work, Escape Velocity, addresses the challenge large enterprises face when they seek to add a new line of business to their established portfolio. It has been the basis of much of his recent consulting. Irish by heritage, Moore has yet to meet a microphone he didn’t like and gives between 50 and 80 speeches a year. One theme that has received a lot of attention recently is the transition in enterprise IT investment focus from Systems of Record to Systems of Engagement. This shift is driving the deployment of a new cloud infrastructure to complement the legacy client-server stack, creating massive markets for a next generation of tech industry leaders.
Moore has a bachelor’s degree in American literature from Stanford University and a PhD in English literature from the University of Washington. After teaching English for four years at Olivet College, he came back to the Bay Area with his wife and family and began a career in high tech as a training specialist. Over time he transitioned first into sales and then into marketing, finally finding his niche in marketing consulting, working first at Regis McKenna Inc., then with the three firms he helped found: The Chasm Group, Chasm Institute, and TCG Advisors. Today he is chairman emeritus of all three.