What’s Been Around for a While Is New Again
Did you ever notice how in business we’re sometimes just as imprisoned by the latest trend of the month as we were back in high school? This month, the trend is big data. You can’t open a business magazine or attend an industry conference without a half dozen people ponderously weighing in on how big data will change everything.
What is genuinely newer, however, is the ease with which even a modestly sized organization can slice and dice large data sets to arrive at actionable intelligence. That’s what people are calling data analytics, and you can mostly thank Moore’s Law for it: the observation that computing power roughly doubles every 18 to 24 months. Compounded over decades, that dynamic has put more computing power at the fingertips of the average person than entire Fortune 500 companies once commanded with their vast mainframes.
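To put rough numbers on that compounding, here is a minimal back-of-the-envelope sketch in Python; the 18- and 24-month doubling periods and the 20-year horizon are illustrative assumptions rather than measured figures.

```python
# Rough illustration of Moore's Law compounding: if computing power
# doubles every 18 to 24 months, how much more is available after 20 years?
# The doubling periods and the 20-year horizon are illustrative assumptions.

def growth_factor(years: float, doubling_period_years: float) -> float:
    """Multiplier after `years` of doubling every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

for doubling_period in (1.5, 2.0):  # 18 months and 24 months
    factor = growth_factor(20, doubling_period)
    print(f"Doubling every {doubling_period} years -> roughly {factor:,.0f}x in 20 years")
```

Even at the slower rate, that works out to roughly a thousand-fold increase, which is why a laptop today can chew through data sets that once demanded a mainframe.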
I’m more aware than most of how all this has played out over the last couple of decades. I spent a number of years at Capital One, a bank that grew largely on the strength of its innovative use of big data. Our founder and CEO, Richard Fairbank, was a true pioneer in this area. He understood before most that by leveraging vast amounts of consumer data, smart players in the financial services industry could deliver highly customized products and services, the kind people might pay a little more for.
But as this understanding filters ever lower down the business chain of command, all the way to the operational level, let’s not forget that the promise of big data will only be fulfilled when quality data becomes the norm, with plenty of checks and balances built into the process. Remember: you can’t manage what you can’t measure, and you manage poorly when you manage with bad data. Whether the data is little, big, or somewhere in between, if it’s incorrect or otherwise lacking in quality, the sheer quantity of it won’t matter.
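As a purely illustrative example of the kind of checks and balances a pipeline might build in, here is a minimal sketch of automated quality checks on a hypothetical customer table; the column names, the toy data, and the pass/fail rules are assumptions for illustration, not a prescription.

```python
# Minimal sketch of automated data-quality checks on a hypothetical
# customer table; column names and rules are illustrative assumptions.
import pandas as pd

def quality_report(df: pd.DataFrame) -> dict:
    """Return a few simple quality metrics for the given table."""
    return {
        "rows": len(df),
        "duplicate_ids": int(df["customer_id"].duplicated().sum()),
        "missing_income_pct": round(float(df["annual_income"].isna().mean() * 100), 1),
        "negative_incomes": int((df["annual_income"] < 0).sum()),
    }

# Toy data containing a duplicate ID, a missing value, and a negative income.
sample = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "annual_income": [52000, None, 48000, -100],
})

report = quality_report(sample)
print(report)  # {'rows': 4, 'duplicate_ids': 1, 'missing_income_pct': 25.0, 'negative_incomes': 1}

# A pipeline might refuse to load the data until these counts are zero.
if report["duplicate_ids"] or report["negative_incomes"]:
    print("Data fails quality checks; hold it out of the analysis.")
```

The point of a report like this isn’t the particular metrics; it’s that the quality bar is checked automatically every time the data moves, rather than trusted on faith.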