A Blog by Jonathan Low


Aug 18, 2014

Will Smart Devices and Their Data Be As Transformative and Economically Stimulating as the Automobile?

We think the present is a tad overwhelming. Smartphones, laptops, iPads, home security systems, personal athletic performance monitors - to say nothing of interactive billboards, cars and refrigerators.

But if the projections are correct, this is just a beginning about to be dwarfed by the scale of the digitization to follow.

In 10 years, 17 billion devices will have sensors, and all of them will generate data. In fact, that is their primary purpose. Sure, remembering to turn off the office computer or home TV is a convenience, but the real source of value is the information generated about habits and usage, and the insights extrapolated from it, because that knowledge is what enables enterprises to figure out what to sell with a greater expectation of achieving their goals.

There are, of course, a few challenges. People are becoming leery of how much personal information they are being strongly encouraged to surrender in order to gain the benefits the associated technology promises. The data suggest this wariness has not yet had a noticeable impact on the metrics that measure acceptance versus resistance, but the fact that anyone is monitoring these trends at all suggests a concern that, eventually, it will.

Of greater significance is that the infrastructure created to collect, store, move and interpret this data cannot keep pace with the predicted growth in new device sensors and the petabits of data they will generate. This is a serious challenge because the purchase and installation of this equipment is predicated on the assumption that the information it produces will be immediately and conveniently available to all who wish to use it, whether consumer or producer, and however much they paid for the privilege.

This is where the question of economic stimulation becomes crucial. The automobile was transformative not just because of how it increased mobility but because of the enterprises that sprang up to service the core product and its owners. The impact in terms of new businesses, jobs, government oversight, roadbuilding and public investment equaled or exceeded that stimulated by the core product.

The same is hoped for in the device/data matrix. But that vision will only be realized if society and the institutions that both support and benefit from it are willing to invest to capture that opportunity. JL

Quentin Hardy comments in the New York Times:

“In 10 years, 17 billion pieces of equipment will have sensors. We’re only one-tenth of the way there.”
New technology products head at us constantly. There’s the latest smartphone, the shiny new app, the hot social network, even the smarter thermostat.
As great (or not) as all these may be, each thing is a small part of a much bigger process that’s rarely admired. They all belong inside a world-changing ecosystem of digital hardware and software, spreading into every area of our lives.
Thinking about what is going on behind the scenes is easier if we consider the automobile, also known as “the machine that changed the world.” Cars succeeded through the widespread construction of highways and gas stations. Those things created a global supply chain of steel plants and refineries. Seemingly unrelated things, including suburbs, fast food and drive-time talk radio, arose from that success.
Today’s dominant industrial ecosystem is relentlessly acquiring and processing digital information. It demands newer and better ways of collecting, shipping, and processing data, much the way cars needed better road building. And it’s spinning out its own unseen businesses.
A few recent developments illustrate the new ecosystem. General Electric plans to announce Monday that it has created a “data lake” method of analyzing sensor information from industrial machinery in places like railroads, airlines, hospitals and utilities. G.E. has been putting sensors on everything it can for a couple of years, and now it is out to read all that information quickly.
The company, working with an outfit called Pivotal, said that in the last three months it has looked at information from 3.4 million miles of flights by 24 airlines using G.E. jet engines. G.E. said it figured out things like possible defects 2,000 times as fast as it could before.
The company has to, since it’s getting so much more data. “In 10 years, 17 billion pieces of equipment will have sensors,” said William Ruh, vice president of G.E. software. “We’re only one-tenth of the way there.”
It hardly matters if Mr. Ruh is off by five billion or so. Billions of humans are already augmenting that number with their own packages of sensors, called smartphones, fitness bands and wearable computers. Almost all of that will get uploaded someplace too.
Shipping that data creates challenges. In June, researchers at the University of California, San Diego announced a method of engineering fiber optic cable that could make digital networks run 10 times faster. The idea is to get more parts of the system working closer to the speed of light, without involving the “slow” processing of electronic semiconductors.
“We’re going from millions of personal computers and billions of smartphones to tens of billions of devices, with and without people, and that is the early phase of all this,” said Larry Smarr, director of the California Institute for Telecommunications and Information Technology, located inside U.C.S.D. “A gigabit a second was fast in commercial networks; now we’re at 100 gigabits a second. A terabit a second will come and go. A petabit a second will come and go.”
In other words, Mr. Smarr thinks commercial networks will eventually be 10,000 times as fast as today’s best systems. “It will have to grow, if we’re going to continue what has become our primary basis of wealth creation,” he said.
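The 10,000-fold figure follows directly from the units Mr. Smarr cites: a petabit is a million gigabits, so moving from 100-gigabit-per-second networks to a petabit per second is a factor of 10,000. A quick sketch of that arithmetic (the variable names here are illustrative, not from the article):

```python
# Check the projection quoted in the article: how much faster is a
# petabit-per-second network than today's 100-gigabit-per-second links?
current_gbps = 100            # today's fast commercial networks, in gigabits/s
petabit_in_gbps = 1_000_000   # 1 petabit = 10^6 gigabits (SI decimal prefixes)

speedup = petabit_in_gbps / current_gbps
print(speedup)  # 10000.0 -- the "10,000 times as fast" figure
```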
Add computation to collection and transport. Last month, U.C. Berkeley’s AMP Lab, created two years ago for research into new kinds of large-scale computing, spun out a company called Databricks, which uses new kinds of software for fast data analysis on a rental basis. Databricks plugs into the one million-plus computer servers inside the global system of Amazon Web Services, and will soon work inside similar-size megacomputing systems from Google and Microsoft.
It was the second company out of the AMP Lab this year. The first, called Mesosphere, enables a kind of pooling of computing services, increasing the efficiency of even million-computer systems.
“What is driving all this is the ability to collect, store and process data at a speed and granularity never seen before, over wide areas,” said Michael Franklin, director of the AMP Lab. “When you do this, you can see patterns you never saw before.”
If this growing ecosystem of digital collection, shipment and processing is the new version of cars and highways, what are the unexpected things, the suburbs and fast-food joints that grew from cars and roads?
In these early days, businesses like Uber and Airbnb look like challengers to taxi fleets and hotels. They do it without assets like cars and rooms, instead coordinating data streams about the location of people, cars, and bedrooms. G.E. makes engines, but increasingly it coordinates data about the performance of engines and the location of ground crews. Facebook uses sensor data like location information from smartphones.
Of course, it’s impossible to know if these are the big changes. We’re only one-tenth — or is it one ten-thousandth? — of the way there.
