Does Big Data Have to Be Big?

One of my hot button issues these days is Big Data.

By many media and vendor accounts, Big Data is simply that: big data. Large volumes of structured or unstructured data, the likes of which are generally associated with companies in data-crunching industries like oil and gas, seismology, genomics and finance.

Even Wikipedia defines Big Data as “the collection of data sets so large and complex that it becomes difficult to process using on-hand database management tools or traditional data processing.”

The question is, Does Big Data have to be big? And if it does, how big is “big” exactly? Are we talking petabytes? Exabytes? Zettabytes? Bigger? And if size is the determining factor, is Big Data then only a big-company, select-industry concern?

My answer: no, no, and no.

As I was wrestling with this topic for this post, two great articles came across my desk: one from the folks at SearchCIO; the other from The Harvard Business Review (HBR).

In the SearchCIO article, editor Harvey Koeppel says “thinking about big data as a bigger version of little data puts us at a significant disadvantage when it comes to understanding what it is; how it can be harnessed; and the enormous value that it has created already and will continue to create with respect to how we live, work and play.”

I couldn’t agree more.

This type of thinking limits the discussion and application of Big Data to a world that existed two, three or more years ago. It doesn’t account for the huge technology changes that are transforming our lives and businesses today, or the ones that our kids’ kids and generations thereafter will surely encounter. It doesn’t have the Second Machine Age, the Social Network of Machines, or the Third Platform in sight. It isn’t forward-minded.

As Andrew McAfee and Erik Brynjolfsson point out in the HBR piece Big Data: The Management Revolution, velocity (i.e., the speed at which data is created) and variety (i.e., the range of sources the data comes from) have as much to do with Big Data as, if not more than, the volume of data that’s actually created. In this way, we’re all Big Data generators; some of us just create more of it, faster, from more sources and for greater business benefit than others.

Therefore, it isn’t how big Big Data is but how it is leveraged for business benefit that really matters. It’s all about “using Big Data intelligently,” explain McAfee and Brynjolfsson. And this is relevant to big and small businesses alike, regardless of industry.

As a proof point of this line of thinking, McAfee and Brynjolfsson, along with a research team from the MIT Center for Digital Business, McKinsey’s business technology office, and Wharton, surveyed 330 North American companies about their Big Data practices. Their research found that companies that were data-driven performed better.

How much better? The survey further found that they were 5% more productive and 6% more profitable than those that weren’t.* Enough said.

Don’t let semantics hold you back.

As I was wrapping up this post, our seventh annual EMC Digital Universe study was published. In my next blog, I’ll take a look at some of the study’s key findings, but, in the meantime, I encourage you to peruse the survey results and related resources. Oh, and while you’re at it, be sure to check out our recent webcast event: Redefining data protection for a software-defined world. It may just give your business the jumpstart you need.

*For more information, see Is Your Company Ready for Big Data?

About the Author: Heidi Biggar