Big Data And The Analytics Arms Race

You think big data is big today? Just wait until next year…or the year after that…or the year after that. It is growing exponentially, and whatever seems big now will likely seem relatively small just a short time from now. Several times per week, my inner analytics geek is thrilled by an article or a discussion that calls my attention to yet another company or industry with yet another source of data growing explosively. With that explosive growth comes an explosion of analytic potential as well.

On October 29, I spoke at the Rock Stars of Big Data show in Silicon Valley. There were several hundred people in attendance and a solid lineup of speakers, who provided perspectives from Google, Netflix, Cray, IBM, General Electric, Kaiser Permanente, and Intel. It was amazing to hear so many well-known companies hit upon the same general point: there is so much data coming at us, and so much analysis to do against that data, that organizations are having to rethink their entire analytic strategy, from how to collect data, to how to set up systems to manage it, to how to apply analytics on top of it.

Turbines, windmills, and similar industrial products have always been sophisticated. But to hear GE talk about how the sensor data their equipment provides is enabling active adjustments to operating environments is fascinating. Did you know that under certain conditions, heating fuel before it is fed into a generator can increase fuel efficiency by several percentage points, while under other conditions it can decrease fuel efficiency? I didn’t. But it was fun to learn that analytics are helping the operators of such equipment save money by guiding them to the right call on a minute-by-minute basis.
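To make the idea of a minute-by-minute recommendation concrete, here is a minimal sketch of what such a decision loop might look like. Everything in it is a simplifying assumption for illustration: the sensor fields, thresholds, and efficiency deltas are invented, and this is not GE's actual model or data.

```python
# Hypothetical sketch of a minute-by-minute fuel pre-heating recommendation.
# Sensor names, thresholds, and efficiency deltas are invented for illustration.

from dataclasses import dataclass


@dataclass
class OperatingConditions:
    ambient_temp_c: float   # ambient temperature, degrees Celsius
    load_pct: float         # generator load as a percent of capacity
    fuel_temp_c: float      # current fuel temperature, degrees Celsius


def estimated_efficiency_delta(cond: OperatingConditions) -> float:
    """Return a made-up estimate of the efficiency change, in percentage
    points, from pre-heating the fuel under the given conditions."""
    # Toy rule: pre-heating helps when fuel is cold and the unit is near
    # full load; it hurts when the fuel is already warm.
    if cond.fuel_temp_c < 10 and cond.load_pct > 80:
        return 2.0
    if cond.fuel_temp_c > 30:
        return -1.5
    return 0.0


def recommend_preheat(cond: OperatingConditions) -> bool:
    """Recommend pre-heating only when the estimated gain is positive."""
    return estimated_efficiency_delta(cond) > 0


if __name__ == "__main__":
    reading = OperatingConditions(ambient_temp_c=-5, load_pct=92, fuel_temp_c=4)
    print("Pre-heat fuel this minute?", recommend_preheat(reading))
```

In practice the "estimate" step would be a model trained on historical sensor data rather than a pair of hard-coded thresholds, but the shape of the loop, read conditions, score the options, recommend an action, is the same.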

As the medical field sees an explosion in data, it isn’t just from electronic medical charts and notes. Virtually all diagnostics, from x-rays to MRIs to EKGs, are now digitized and stored as data. The petabytes being generated by large health organizations are intimidating, but the promise of how that data can help improve outcomes is exciting. One research study sponsored by Kaiser Permanente has 100,000 people who have allowed their DNA sequences to be captured for research purposes. This can only help researchers looking for ways to improve our health, and I am sure there are other similar studies out there as well.

In my speech I talked about telemetry data captured by video game manufacturers. This data includes information on all the controller movements and button presses that players make. It gets very big very fast. With new systems that allow players to control games with their bodies, we get into a whole new level of big. With full body control, it will be necessary to capture, at millisecond intervals, where in 3D space each of a player's body parts is, what direction each is moving, and at what velocity. This data provides an entirely new level of detail to analyze as video game producers try to understand how people interact with their games and attempt to improve the gaming experience.
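To give a feel for why this gets big so fast, here is a rough sketch of what a single full-body telemetry sample could look like. The field names, joint list, and one-millisecond sampling interval are assumptions for illustration only, not any console vendor's actual format.

```python
# Rough sketch of a single full-body motion-capture telemetry sample.
# Field names, the joint list, and the 1 ms sampling interval are assumptions
# for illustration, not any console vendor's actual format.

from dataclasses import dataclass
from typing import Dict, Tuple

Vector3 = Tuple[float, float, float]  # (x, y, z) position or velocity components


@dataclass
class JointSample:
    position: Vector3   # where the joint is in 3D space
    velocity: Vector3   # direction and speed of movement


@dataclass
class TelemetryFrame:
    player_id: str
    session_id: str
    timestamp_ms: int                   # one frame per millisecond adds up quickly
    joints: Dict[str, JointSample]      # e.g. "left_hand", "right_knee", ...
    buttons_pressed: Tuple[str, ...]    # traditional controller input, if any


# Back-of-the-envelope estimate under these assumptions: roughly 20 joints
# * 6 floats * 4 bytes * 1,000 samples per second is about 480 KB per player
# per second of play, before compression. That is how "big" arrives fast.
```

Even under generous compression, multiplying that per-player rate across millions of players and hours of play is what pushes game telemetry into big data territory.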

It was also interesting to hear some of the history and future of processing massive data volumes at Google. One idea I found intriguing is allowing selected researchers to leverage excess server capacity at Google to perform analytics that require a massive amount of processing. By loaning this capacity to researchers who couldn’t otherwise afford it, Google will enable research to actually occur that would otherwise have remained a theoretical idea.

The takeaway point that I want to stress is that if you don’t focus on how to effectively capture and analyze the data coming at you today, you’ll be even further behind tomorrow. There is an arms race of sorts going on in the world of big data and analytics. Those organizations that can build the biggest, fastest, most accurate analytics processes are going to have a competitive advantage that is almost impossible for competitors to overcome without undertaking a similar evolution.

My guess is that you’re either going to make that evolution at your organization or you’ll be in trouble a few years down the road. Better to start evolving sooner rather than later, wouldn’t you say? After all, big data is only getting bigger with each passing day.