A Blog by Jonathan Low


Apr 13, 2017

We All Thought Having More Data Was Better. We Were Wrong

Big is not necessarily better. Especially when it comes to data. JL

Bob O'Donnell reports in Re/code:

Analytics is hard, and there’s no guarantee that analyzing huge chunks of data is going to translate into meaningful insights. Sometimes the goals are too grandiose, sometimes the datasets are too large, and sometimes the valuable insights are buried beneath a mound of numbers or other data that just really isn’t that useful. Implicit in the phrase “big data,” as well as the concept of data as gold, is that more is better.
For years, the mantra in the world of business software and enterprise IT has been “data is the new gold.” The idea was that companies of nearly every shape and size, across every industry imaginable, were essentially sitting on top of buried treasure that was just waiting to be tapped into. All they needed to do was to dig into the correct vein of their business data trove and they would be able to unleash valuable insights that could unlock hidden business opportunities, new sources of revenue, better efficiencies and much more.
Big software companies like IBM, Oracle, SAP and many more all touted these visions of data grandeur, and turned the concept of big data analytics, or just Big Data, into everyday business nomenclature.
Even now, analytics is also playing an important role in the Internet of Things, on both the commercial and industrial side and on the consumer side. On the industrial side, companies are working to mine various data streams for insights into how to improve their processes, while consumer-focused analytics show up in things like health and fitness data linked to wearables, and will soon be part of assisted and autonomous driving systems in our cars.
Of course, the everyday reality of these grand ideas hasn’t always lived up to the hype. While there certainly have been many great success stories of companies reducing their costs or figuring out new business models, there are probably an equal (though unreported) number of companies that tried to find the gold in their data — and spent a lot of money doing so — but came up relatively empty.
The truth is, analytics is hard, and there’s no guarantee that analyzing huge chunks of data is going to translate into meaningful insights. Challenges may arise from applying the wrong tools to a given job, not analyzing the right data, or not even really knowing exactly what to look for in the first place. Regardless, it’s becoming clear to many organizations that a decade or more into the “big data” revolution, not everyone is hitting it rich.
Part of the problem is that some of the efforts are simply too big, at several different levels. Sometimes the goals are too grandiose, sometimes the datasets are too large, and sometimes the valuable insights are buried beneath a mound of numbers or other data that just really isn’t that useful. Implicit in the phrase “big data,” as well as in the concept of data as gold, is the assumption that more is better. But in the case of analytics, there’s a legitimate question worth considering: Is more data really better?
In the world of IoT, for example, many organizations are realizing that doing what I call “little data analytics” is actually much more useful. Instead of trying to mine through large datasets, these organizations are focusing their efforts on a simple stream of sensor-based data or other straightforward data collection work. For the untold number of situations across a range of industries where these kinds of efforts haven’t been done before, the results can be surprisingly useful. In some instances, these projects produce nothing more than a single insight into a given process that a company can quickly act on (a “one and done” type of effort), but ongoing monitoring of these processes can ensure that they continue to run efficiently after the adjustment.
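To make the idea concrete, here is a minimal sketch (in Python, not taken from the article) of what a “little data” check on a single sensor stream might look like; the file name, column names and threshold values are hypothetical, chosen only for illustration.

# A minimal "little data" check: scan one sensor log and flag readings
# outside an expected operating range. File name, column names and the
# range below are illustrative assumptions, not from the article.

import csv

LOW, HIGH = 18.0, 75.0  # assumed acceptable range for the reading


def flag_out_of_range(path):
    """Return (timestamp, value) pairs that fall outside the expected range."""
    flagged = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            value = float(row["temperature_c"])
            if not LOW <= value <= HIGH:
                flagged.append((row["timestamp"], value))
    return flagged


if __name__ == "__main__":
    for ts, value in flag_out_of_range("sensor_log.csv"):
        print(f"{ts}: reading {value} is outside the expected range")

Something this simple, run on a schedule, is often all the “ongoing monitoring” a process needs.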
Of course, it’s easy to understand why nobody really wants to talk about little data. It’s not exactly a sexy, attention-grabbing topic, and working with it requires much less sophisticated tools: think an Excel spreadsheet (or the equivalent) on a PC, for example. The analytical insights from these “little data” efforts are also likely to be relatively simple. However, that doesn’t mean they are any less practical or valuable to an organization. In fact, building up a collection of these little data analytics could prove to be exactly what many organizations need. Plus, they’re the kind of results that can help justify the expenses necessary for companies to start investing in IoT efforts.
To be fair, not all applications are really suited for little data analytics. Monitoring the real-time performance of a jet engine or even a moving car involves a staggering amount of data that’s going to continue to require the most advanced computing and big data analytics tools available.
But to get more real-world traction for IoT-based efforts, companies may want to change their approach to data analytics efforts and start thinking small.
