It seems that everywhere you turn in business today people are talking about “Big Data”. For most of us, a definition is in order. The term “Big Data” refers to “data sets with sizes beyond the ability of commonly used software tools to capture, curate, manage, and process the data within a tolerable elapsed time”. And more and more data is becoming available. The world’s technological capacity to store information has roughly doubled every 40 months since the 1980s, and as of 2012 some 2.5 quintillion (2.5 × 10^18) bytes of data were created every day. That’s BIG.
Today it is not uncommon for a single data set at a retail or pharmaceutical company, say on store shopping preferences between 3pm and 5pm, or on hospital usage of specific pharmaceutical products, to exceed 1 petabyte; that is 1,000,000,000,000,000 bytes = 10^15 bytes = 1,000 terabytes. Is your head spinning yet?
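For readers who like to check the arithmetic, the unit conversions and growth figures above can be sketched in a few lines (a minimal illustration added for this piece, not part of the original quotes):

```python
# Sanity-checking the storage figures quoted above.

PETABYTE = 10**15   # bytes
TERABYTE = 10**12   # bytes

# 1 petabyte expressed in terabytes
print(PETABYTE // TERABYTE)        # 1000

# 2.5 quintillion bytes created per day (the 2012 figure)
daily_bytes = 2.5 * 10**18
print(daily_bytes / PETABYTE)      # 2500.0 petabytes per day

# Capacity doubling roughly every 40 months: over 30 years
# (360 months) that compounds to 2^9, about a 500-fold increase.
months = 360
print(2 ** (months / 40))          # 512.0
```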
And according to a recent McKinsey & Co report, “the amount of data in our world has been exploding, and analyzing large data sets—so-called big data—will become a key basis of competition, underpinning new waves of productivity, growth and innovation”.
Data now stands alongside labor and capital as a pillar of business productivity. The potential is huge. With integrated storage, analytics, and applications, Big Data can help drive efficiency, quality, and personalised products and services, producing higher levels of customer satisfaction and enhancing the customer experience.
While the issues of personal security, ownership and privacy are obvious, what is not so obvious about the accumulation and use of Big Data is how to find the “insights of information” in mountains of data. Data without information is pretty much a bunch of 1’s and 0’s.
Because we best understand our world through stories (stories and pictures seem to be hard-wired into our DNA as pathways to understanding and learning), unless scientists can find ways to present Big Data visually and convert that data into an insightful story, it will be difficult to derive real value from all this accumulated and processed data.
Manufacturing efficiency and process improvement took a giant leap forward with the use of Visual Management tools (a product of Lean) and I suggest that Big Data will become much more useful when analysts are able to use it to tell a story, not just show trends or deviations.
In a way, with the use of Big Data and new ways to visualize and interpret Big Data, we are going back to our storytelling roots.
Tight Lines . . .
John R Childress
john@johnrchildress.com
John, couldn’t agree more.
The June edition of Foreign Affairs carried a piece on data. It highlighted our growing tendency to use data to correlate multiple events at the cost of not investigating the root cause of individual events. While correlation is a wonderful tool for predicting certain occurrences, once we have established a link we abandon tackling root causes.
The state I live in (Uttarakhand) witnessed some of the worst flash floods last week. Thousands have perished and many more have been stranded for days in the Himalayas. While rescue efforts are still in full swing, there are already calls for a better weather monitoring system to predict cloudbursts and flash floods. The government will satisfy the people by installing a great predictive tool, but it will conveniently neglect the main issue: preventing the illegal construction, mining and environmental damage that make people vulnerable to these floods.
I have also observed an increase in the number of stock market analysts moving from fundamental analysis to technical analysis. People have become so captivated by data that past trends are becoming the sole basis for future estimates. No wonder financial markets no longer follow old-school theories.