Data-Driven Decision Making - Part 3: Systems of Autonomy
By Geoffrey Moore, Nov 16, 2017
Data-driven decision-making: who doesn’t think it is a good idea? But it typically has a rough go in the real world of enterprise management, in part because the data itself often proves unreliable. For much of my business life, IT has been tasked with building systems that could represent a single source of the truth. Unfortunately, that quest proved to be right up there with the holy grail and the fountain of youth—at best, aspirational, at worst, delusional.
Today we have an opportunity to make a great leap forward, however, because for the first time in history we have broad access to high-volume data from a variety of sources that, when matched against each other, dramatically increase the probability of something like truth, and do so in a time window that is actionable. Not everyone, of course, has access to all the sources, so to kick things off let me present a framework of the possible, within which each organization can determine what its actual will be.
Data-driven decision-making with Systems of Autonomy
I’m not going to say a lot about these systems except that, as recently as ten or so years ago, they were simply unimaginable. Now they look instead like the natural extension of systems of intelligence integrating with robotics. Who knew?
The data is tacit, and combines both internal and external signals to determine a machine’s location, velocity, and attitude relative to whatever environment it is in. The algorithms combine machine learning for machine vision and AI for navigational decision-making. This is still a wickedly hard computing problem, but with the wickedly smart people at work on it, it seems the truly persistent challenges will again be social, this time around safety, liability, and consequential ethics. For now the workaround is simply to sequester the bulk of the use cases to private property with restricted access.
Well, that little digression didn’t take long, did it? Whew! At any rate, now we have a framework for digging into how our own organization might become more data driven.
Let me begin by suggesting we have already extracted most of the value we can from the traditional systems of record approach. As with all highly valuable legacy capabilities, this means we are over-staffed in the function as historically conceived and understaffed everywhere else. How can we move our forces closer to the point of attack?
To start with, the increased deployment of systems of engagement gives us ready access to two options that were not really available before. First, we can get explicit data simply by asking users for it, and we can do so at scale, across any size population, in any time window, at a marginal cost of near zero. This is a spectacular gift, yet most organizations still lag far behind in taking advantage of it. Second, we can, if need be, take a “guess and then check” approach. This is the fast-fail lean start-up approach of agile development and minimum viable products, the one that puts an end to long planning cycles that do little to reduce actual risks and replaces them with low-cost experiments that can course-correct their way to innovative, differentiating outcomes. In both cases we are supplementing imagination with data on a fast-cycle-time basis. That is the big difference between the old approach to data-driven and the new—it’s no longer just about quarterly reviews of an annual plan; now it also includes weekly commits around strategic intents. The old mantra was “Measure twice, cut once,” a good approach whenever you are making a big bet. The new mantra is “No big bets!” Rather, we want tons and tons of tiny iterative bets that add up to one big outcome. Time is the scarce resource, and every second spent hesitating is time lost.
Until your organization truly embraces agile decision-making, it is too early to transition your focus to systems of intelligence, so let us suppose in your case that it has, and therefore that it is time to do so. Now a new rule arises, one that can be socially disturbing: If it can be automated, it should be automated. That is, wherever they can be made to work, algorithmic systems are cheaper, faster, and better than human systems—full stop. They don’t start out that way, to be sure, but in the end IBM’s Deep Blue defeats Garry Kasparov, IBM’s Watson knocks off Ken Jennings, Google DeepMind’s AlphaGo defeats Fan Hui, and Amazon’s Alexa knocks your socks off (and then orders you a replacement pair). More immediately for the rest of us, any enterprise that has successfully deployed systems of intelligence at scale, regardless of industry, has dominated its competition in record time. If one of these barbarians is at your gate, you have no option but to move, and move fast.
The key here is to target your most critical moments of engagement with customers and prospects, determine the tacit data signals that give you your best insight into how that moment is unfolding, do whatever it takes to get access to that data in real or near-real time, and set hot shot data scientists to work at translating those signals into actionable responses. You are not replacing humans here by any means, but you are promoting them to a new status. They are no longer your primary source of data; now they are your first or second wave of response, one that can bring imagination, experience, empathy, and pattern recognition to situations that cannot be resolved algorithmically. Without data, however, these folks are flying blind, doing their best to pick up the pieces on the fly. Armed with data, they can look like water-walking wonders. But only after you have deployed the systems of engagement necessary to collect that data and connect them to the customer.
So, to wrap: four types of data, four types of data-driven decision making, aligned around a digital systems maturity model. Your assignment, should you choose to accept it, is to identify your current state, target your future state, and leverage this framework to build and inspire a coalition of the willing.
Originally published on LinkedIn Pulse.
About the author
Geoffrey Moore is an author, speaker, and advisor who splits his consulting time between start-up companies in the Mohr Davidow portfolio and established high-tech enterprises, most recently including Salesforce, Microsoft, Intel, Box, Aruba, Cognizant, and Rackspace.
Moore’s life’s work has focused on the market dynamics surrounding disruptive innovations. His first book, Crossing the Chasm, focuses on the challenges start-up companies face in transitioning from early adopters to mainstream customers. It has sold more than a million copies, and its third edition has been revised such that the majority of its examples and case studies reference companies that have come to prominence within the past decade. Moore’s most recent work, Escape Velocity, addresses the challenge large enterprises face when they seek to add a new line of business to their established portfolio. It has been the basis of much of his recent consulting. Irish by heritage, Moore has yet to meet a microphone he didn’t like and gives between 50 and 80 speeches a year. One theme that has received a lot of attention recently is the transition in enterprise IT investment focus from Systems of Record to Systems of Engagement. This is driving the deployment of a new cloud infrastructure to complement the legacy client-server stack, creating massive markets for a next generation of tech industry leaders.
Moore has a bachelor’s degree in American literature from Stanford University and a PhD in English literature from the University of Washington. After teaching English for four years at Olivet College, he came back to the Bay Area with his wife and family and began a career in high tech as a training specialist. Over time he transitioned first into sales and then into marketing, finally finding his niche in marketing consulting, working first at Regis McKenna Inc., then with the three firms he helped found: The Chasm Group, Chasm Institute, and TCG Advisors. Today he is chairman emeritus of all three.