A Blog by Jonathan Low

 

Jan 14, 2013

Prediction and the End of the Productivity Paradox

The big knock on computerization initially was that businesses were investing tremendous sums in new machinery or devices but had difficulty demonstrating what they were getting from it. Productivity. Profitability. Full spectrum dominance. All proved elusive. At first.

No one questions the importance of computers anymore, though most managers would probably be hard-pressed to describe exactly what benefits they have wrought. Funny how little some things change.

We now look at mobile and social and big data and The Cloud and we sense that we cannot live without them - but we are having trouble articulating what, exactly, the benefits will be. And perhaps more to the point, when they kick in.

The history of technology adoption cycles tells us that benefits are often hard to identify at first. Water power, electricity, automobiles, telephones, computers. All changed the world around them, but few could have predicted how it would all work out.

The secret, as it were, turns out to be in the process. As the following article explains, what technology does is speed up the process of evolution by enabling simulation. This gives organizations the ability to more quickly - and less expensively - experiment with alternatives, eliminating those that do not appear to offer optimal solutions to the problems the enterprises face.

Knowledge and speed encourage simulation which, in turn, stimulates more accurate predictions. The resultant process shortcuts save time and money by reducing wasted resources and misplaced effort. In effect, the ability to simulate and therefore predict winning solutions is the answer to the need for heightened productivity.

Every new development comes wrapped in a web of paradoxes. We can guess at the value of the innovation, but history teaches us that we are frequently wrong. Not that the latest and greatest don't have any value, but that their optimal application may not become apparent until we have had more experience with them. And until the products and services that emerge in co-evolutionary response have had time to better frame the opportunity they present.

Every new development - technological and organizational - can and should now be viewed through this prism: how it enhances our ability to more accurately predict the future in a shorter time frame. JL

Greg Satell comments in Digital Tonto:
In 1982, Steve Jobs first made the cover of Time magazine, where he was celebrated as the 26-year-old college-dropout wunderkind who created the personal computer industry and made a fortune in the process. It seemed like a new age had dawned.

Unfortunately, tangible results were frustratingly hard to find. By 1987, the economist Robert Solow complained that “You can see the computer age everywhere but in the productivity statistics,” a phenomenon which came to be known as the productivity paradox.

Today, nobody questions that computers have fundamentally changed the way we create, deliver and capture value. Erik Brynjolfsson, who coined the term “productivity paradox,” even has a new book out touting technology’s impressive contributions. What’s changed? I would argue that a big part of it is our ability to enjoy success while simulating failure.

A Universal Computer
Computers, in the broad sense, have been around a long time. The abacus was used as early as 2700 BC in Mesopotamia. Blaise Pascal invented the mechanical calculator in 1642 and Charles Babbage designed his analytical engine in 1837. Analog computers were used during World War II to crack the supremely important German Enigma codes.

However, those machines were limited to specific tasks. The machines we know today as computers have their roots in Alan Turing’s legendary 1936 paper describing a universal computer which could be programmed to do any task. He would later write in 1948:

"We do not need to have an infinity of different machines doing different jobs. A single one will suffice. The engineering problem of producing various machines for various jobs is replaced by the office work of ‘programming’ the universal machine to do these jobs."

Claude Shannon (the father of information theory) then followed up by showing how Boolean logic could be represented electronically, thereby incorporating not just calculations, but statements including terms such as “and,” “if, then,” “not,” “or” and so on. These were engineered into logic gates that would allow machines to simulate thought.
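To make that insight concrete, here is a minimal sketch of my own (not from the article) showing, in Python, how those Boolean operations compose into a logic circuit - in this case a half-adder that adds two bits using nothing but AND, OR and NOT:

# Boolean operations expressed as tiny functions.
def AND(a, b): return a and b
def OR(a, b): return a or b
def NOT(a): return not a

# A half-adder built only from the gates above: it adds two bits and
# returns (sum, carry). XOR is expressed as (a OR b) AND NOT (a AND b).
def half_adder(a, b):
    bit_sum = AND(OR(a, b), NOT(AND(a, b)))
    carry = AND(a, b)
    return bit_sum, carry

for a in (False, True):
    for b in (False, True):
        s, c = half_adder(a, b)
        print(f"{a!r:>5} {b!r:>5} -> sum={s!r}, carry={c!r}")

Chain enough of these gates together and you get arithmetic, memory and, eventually, a machine that can run any program you care to write.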

This subtle shift, from specialized to universal machines, created enormous ripple effects which are still reverberating today. As Turing himself would later write:

"The survival of the fittest is a slow method for measuring advantages. The experimenter, by the exercise of intelligence, should he able to speed it up."

And that’s what Steve Jobs’ appearance on the cover of Time signified. The process of evolution was about to speed up, exponentially.

An Office in a Box
The early universal machines, such as the ENIAC, were unwieldy, both physically and financially, so they were used exclusively by large institutions. That’s why Thomas Watson of IBM thought that there was only a market for five of them in the world. In time though, computers became small and cheap enough for personal use.

I remember those early personal computers. They were mostly for goofing around. Our parents bought them for us as high-priced toys that they hoped would be more educational than TV. In truth, they didn’t really know what we were doing on them, but there was a sense that computers were the future, so we were given free rein.

Things really changed when Dan Bricklin created VisiCalc, the first spreadsheet program. By the 1990s, computers made many basic tasks easier to do by yourself, which put a lot of clerical staff out of work, but you can see why Solow was skeptical. Automating office tasks did not drive significant productivity gains.

The real value, looking back, is that we began to simulate. We would rework documents endlessly before printing them out. Finance and budgeting became an exercise in scenario planning. With a universal machine at your fingertips, you could spot problems before they entered the real world and could do actual damage.
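As a purely hypothetical illustration of that kind of scenario planning - the figures below are my own assumptions, not numbers from the article - here is the sort of what-if calculation a spreadsheet lets you run before any real money is spent, sketched in Python:

# Hypothetical what-if scenario: test a simple business plan before it meets reality.
# All figures below are illustrative assumptions.
unit_price = 25.0       # assumed selling price per unit
unit_cost = 14.0        # assumed variable cost per unit
fixed_costs = 40_000.0  # assumed annual overhead

for units_sold in (2_000, 4_000, 6_000, 8_000):
    profit = units_sold * (unit_price - unit_cost) - fixed_costs
    print(f"{units_sold:>6} units -> profit {profit:>10,.0f}")

The point is not the arithmetic but the loop: a plan that fails on screen fails where failure costs nothing.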

Turing Goes Mobile
It is not just our computers that have become universal machines, but our phones as well. What used to be a highly specialized device, used only for communication, has become a universal one that can do a variety of tasks, from e-mail to navigation, and can even serve as a level for hanging pictures or a monitor for our vital signs.

A smartphone is just the kind of universal machine that Turing envisioned. Its function at any given time isn’t dependent on how it was engineered, but on what software we download for it. My smartphone is not only different from yours, but probably much different than it will be in six months’ time, without altering a single molecule of hardware.

Yet for all of the convenience of being able to stay in touch on the go, the real boost in productivity we get from smartphones is in their ability to simulate. Much as personal computers allowed us to test out documents and spreadsheets before they became real, our smartphones allow us to simulate the real world.

When we want to navigate through town, pick a restaurant or a movie or do just about anything else, we can simulate the experience in our smartphones. If it looks crappy, we don’t do it, which saves time and money. As augmented reality navigates the hype cycle and eventually gives way to holographic technology, our power to simulate will expand.

The Third Industrial Revolution
In his new book, Makers, Wired editor turned entrepreneur Chris Anderson explains that we’re now in the midst of a third industrial revolution. The first began in the late 18th century, with the invention of the steam engine, while the second got started in the early 20th century, with the assembly line and the creation of the modern factory.

The seminal technology of this new industrial revolution is computer-aided design (CAD) and, again, simulation is at its core. Designers can experiment in the virtual world before trying things out in the real one. Then they can build rapid prototypes cheaply with 3D printers, CNC routers and laser cutters.

Much like desktop computers and smartphones, manufacturing technology is becoming universal. It can be programmed to make anything, including airplane parts. Even the assembly line is being replaced by a new breed of industrial robots which enable factories to retool in minutes rather than in months, further reducing the cost of failure.

And it’s not just the product that can be simulated, but the market too. Anyone with an idea has a variety of crowdfunding options, such as Kickstarter and Indiegogo, where they can not only receive financing, but also gauge demand. If it is successful there, they not only get money for their venture, but a built-in market to sell to.

If it’s not successful, nothing gets built and little is lost. That’s the beauty of simulation. It can’t tell us that we’re surely right, but it can tell us when we’re wildly off the mark.

Finding 10,000 Things that Don’t Work
Thomas Edison was probably the most successful inventor in history, creating modern-day staples such as the electric light, sound recording and motion pictures. He also failed a lot, which didn’t bother him in the least. He said of his many false starts:

"If I find 10,000 ways something won’t work, I haven’t failed. I am not discouraged, because every wrong attempt discarded is another step forward."

Therein lies the secret to the simulation economy and the dissolution of the productivity paradox. While failing 10,000 times in the age of Edison required superhuman fortitude, today it’s relatively easy because we have the opportunity to fail in the virtual world as many times as we like at minimal cost in blood and treasure.

We can experiment with business models, tweak designs, rapidly prototype, present to investors and test the market, all during our morning coffee. As our technology advances further, these simulations will become more realistic through holographic technology and agent-based models.
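For a sense of what an agent-based model is, here is a toy sketch of my own - every parameter below is invented for the illustration - simulating, in Python, a product spreading through a population of agents, each of whom adopts once enough of their contacts have:

# A toy agent-based model (illustrative only): agents adopt a product once
# enough of their randomly chosen contacts have adopted it.
import random

random.seed(42)

NUM_AGENTS = 200          # size of the simulated population
CONTACTS_PER_AGENT = 5    # how many other agents each one pays attention to
ADOPTION_THRESHOLD = 2    # adopt after this many contacts have adopted
SEED_ADOPTERS = 10        # agents who adopt at the start
STEPS = 20                # simulated time steps

adopted = [False] * NUM_AGENTS
for i in random.sample(range(NUM_AGENTS), SEED_ADOPTERS):
    adopted[i] = True

contacts = [random.sample(range(NUM_AGENTS), CONTACTS_PER_AGENT)
            for _ in range(NUM_AGENTS)]

for step in range(STEPS):
    new_adopted = list(adopted)
    for i in range(NUM_AGENTS):
        if not adopted[i] and sum(adopted[c] for c in contacts[i]) >= ADOPTION_THRESHOLD:
            new_adopted[i] = True
    adopted = new_adopted
    print(f"step {step + 1:>2}: {sum(adopted)} of {NUM_AGENTS} agents have adopted")

Change the threshold or the number of seed adopters and you can watch adoption take off or fizzle on screen, which is precisely the kind of question a crowdfunding campaign answers in the real world.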

The more we continue to improve our ability to experiment in the virtual world, the more we will succeed in the real one.
