How Thinking Small Can Make Big Data Better

Enterprises need no introduction these days to Big Data, the emerging field of information analytics that helps companies translate the motivations and behaviors of large numbers of consumers into actionable intelligence. But before you run out and invest in a multi-petabyte server rack and a Ph.D.-level statistician to run correlations, spend some time with working data analysts. You’ll learn that most of the value sits closer to the periphery, in smaller clusters of information that take far less effort to digest.

In an April 2013 white paper, the analysts at Nucleus Research warned about what can happen when companies focus too heavily on the “big” end of the Big Data equation and not enough on the “data” side.

“Companies have raced towards the goal of increasingly large volumes and sources of data that are curated and analyzed by data gurus,” they write. “However, for all the investment that vendors and Fortune 500 companies have placed into Big Data, it has been difficult for many companies to articulate the value of these investments because they are poorly aligned to measurable business outcomes.”

The problem isn’t “the technology, the skill sets, or the business goals” of these companies, but their belief in a “Big Data approach focused on increasing the size and velocity of data analysis.”

In other words, bigger is not always better. In fact, as commentator Christopher Mims noted recently, even at major Internet companies like Facebook and Yahoo, the typical analysis job is measured in megabytes and gigabytes, not terabytes and petabytes. When you bite off more than you can chew, Mims warns, “big data is as likely to confuse as it is to enlighten.”

The lesson here is simple: drawing real value from data analytics means identifying viable business objectives and tailoring a knowledge-gathering strategy to meet them. It’s not about the amount of data; it’s about having the right data.

So What Is the Right Data?

The bulk of it is sitting on end-user devices and computers, ripe for the plucking. According to the research firm CSC, while 80% of all data is housed on enterprise servers, more than 70% of it is created by individuals. And the volume of that digital information is poised to explode.

Finding the sweet spot between the PC and the data center means learning to tap smaller, richer morsels of information in ways that leverage their power and engage consumers as partners in the action.

Experts say this comes from augmenting a top-down “Big Data” approach with user-centric “Little Data” solutions that can open new opportunities for customer engagement.

For example, Mark Bonchek, Chief Catalyst of Orbit & Co., writes that in the utilities sector, end-user analytics — such as applications that let consumers track their own energy usage — combined with Big Data apps that let them compare their consumption with that of their neighbors, can forge a powerful partnership between companies and their clients and open new revenue streams.
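To make that dynamic concrete, here is a minimal sketch in Python of the kind of comparison Bonchek describes: it pairs one customer’s usage history (the “Little Data”) with an aggregate for nearby homes (the “Big Data”). All names and numbers are hypothetical, since the article describes no specific implementation.

    from statistics import mean

    def usage_report(customer_kwh, neighborhood_kwh):
        """Compare one customer's monthly energy use ("Little Data")
        against an aggregate of nearby homes ("Big Data")."""
        customer_avg = mean(customer_kwh)
        neighborhood_avg = mean(neighborhood_kwh)
        delta_pct = 100 * (customer_avg - neighborhood_avg) / neighborhood_avg
        direction = "more" if delta_pct > 0 else "less"
        return (f"You averaged {customer_avg:.0f} kWh/month, "
                f"{abs(delta_pct):.0f}% {direction} than the "
                f"{neighborhood_avg:.0f} kWh/month typical of your neighborhood.")

    # Illustrative numbers only: one household's monthly readings
    # versus readings aggregated across its neighborhood.
    customer = [620, 580, 640, 710]
    neighborhood = [540, 530, 560, 600]
    print(usage_report(customer, neighborhood))

The point of the sketch is the division of labor: the per-customer readings are small enough to live on the end-user side, while only the neighborhood aggregate requires the utility’s larger data infrastructure.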

Once companies place data quality ahead of quantity, the same dynamic can be applied to retail, financial services, health care, government and other sectors of the economy.

Big Data is not one tool, but a collection of tools. Don’t use a hatchet when a scalpel will do.
