A Blog by Jonathan Low

 

Jun 24, 2016

Why Technology Demands A More Collaborative Future

In the 11th century, King Canute of England set his throne beside the sea and commanded the tide to recede. You can imagine how well that went.

There are those who believe Canute's gesture was a sign of royal arrogance. Others think he was trying to teach his courtiers a lesson about their relative lack of power. Either way, British citizens' vote to leave the European Union is a similarly futile gesture.

Technology is the driving force of our socio-economic reality today - the tidal surge, in metaphorical terms. It demands more collaboration and cooperation, not less. Capital markets may well crash today and in the coming weeks but the reality is that global commerce and the scale it requires will determine the future, not the votes of the disenfranchised searching for scapegoats. There will be further dislocations and some, perhaps most in some countries, will suffer for it. But that will not change the outcome and prescient enterprises will act accordingly. JL

Greg Satell reports in Digital Tonto:

Innovations such as electricity and the internal combustion engine had broad implications, (while) the impact of digital technology has been narrow. Instead of replacing manual labor, technology will automate routine cognitive work. In the 20th century, firms could achieve competitive advantage by optimizing value chains; the future belongs to those who can widen and deepen connections. It doesn’t matter how fast chips can process if they must wait too long for data to arrive from memory.

For the past fifty years or so, technology has followed a fairly predictable path. We squeeze more transistors onto silicon wafers, which makes chips more powerful and devices smaller. Manual processes become automated, productivity increases and life gets better. Rinse and repeat.
Today, we’re at an inflection point, and that predictable path to progress will soon be closed off. What lies ahead is a period of extreme disruption in which much of what we’ve come to expect from technology comes undone. What replaces it will be truly new and different.
Over the next decade, Moore’s Law will end. Instead of replacing manual labor, technology will automate routine cognitive work. As information technology fades into the background, second-order technologies such as genomics, nanotechnology and robotics will take center stage. Here are the four major paradigm shifts that we need to watch and prepare for.
From The Chip To The System
In 1965, Intel cofounder Gordon Moore published a remarkably prescient paper which observed that the number of transistors on an integrated circuit was doubling every two years. He also predicted that this pace would lead to computers becoming embedded in homes, cars and communication systems.
That simple idea, known today as Moore’s Law, has helped power the digital revolution. As computing performance has become exponentially cheaper and more robust, we have been able to do a lot more with it. Even a basic smartphone today is more powerful than the supercomputers of past generations.
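As a back-of-envelope illustration of the compounding Moore described, the short Python sketch below projects a transistor count forward at a doubling every two years; the 1971 starting figure of roughly 2,300 transistors (the Intel 4004) is used only as an illustrative baseline.

```python
# Rough sketch of Moore's Law as pure arithmetic: a count that doubles
# every two years. The 1971 baseline (~2,300 transistors, Intel 4004)
# is illustrative, not a claim about any particular product line.

def projected_transistors(start_count, start_year, end_year, doubling_period=2):
    """Project a transistor count forward assuming a fixed doubling period."""
    doublings = (end_year - start_year) / doubling_period
    return start_count * 2 ** doublings

for year in (1971, 1991, 2011, 2016):
    print(year, f"{projected_transistors(2_300, 1971, year):,.0f}")

# Compounding alone takes ~2,300 transistors in 1971 into the billions by
# the 2010s, the exponential curve the article says is running out of room.
```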
Yet Moore’s law is now nearing its end. The problem is twofold. First, there are only so many transistors you can squeeze onto a chip before quantum effects cause them to malfunction. Second is the problem known as the von Neumann bottleneck: simply put, it doesn’t matter how fast a chip can process if it has to wait too long for data to move back and forth to memory.
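To make the bottleneck concrete, here is a purely illustrative sketch of how a simple memory-bound operation spends its time; the bandwidth and compute figures are assumptions chosen for the arithmetic, not measurements of any real chip.

```python
# Illustrative von Neumann bottleneck arithmetic: adding two large arrays
# element by element. The peak compute rate and memory bandwidth below are
# assumed round numbers, not specs of any actual processor.

N = 100_000_000                      # elements per input array
bytes_moved = 3 * N * 8              # read a, read b, write c (8-byte floats)
flops = N                            # one addition per element

peak_flops_per_s = 500e9             # assumed peak compute rate
memory_bandwidth_b_per_s = 50e9      # assumed memory bandwidth

compute_time = flops / peak_flops_per_s
memory_time = bytes_moved / memory_bandwidth_b_per_s

print(f"compute: {compute_time*1e3:.1f} ms, waiting on memory: {memory_time*1e3:.1f} ms")
# Under these assumptions the processor spends roughly 240x longer moving
# data than computing, which is why faster chips alone stop helping.
```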
So we have to shift our approach from the chip to the system. One approach, called 3D stacking, would simply combine integrated circuits into a single three dimensional chip. This is harder than it sounds, because entirely new chip designs have to be devised, but it could increase speeds significantly and allow progress to continue.
From Applications To Architectures
Since the 1960s, when Moore wrote his article, the ever expanding power of computers has made new applications possible. For example, after the relational database model was introduced in 1970, it became possible to store and retrieve massive amounts of information quickly and easily. That, in turn, dramatically changed how organizations could be managed.
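As a minimal illustration of what the relational model made routine, a few lines of Python and its built-in sqlite3 module are enough to store structured records and query them declaratively; the orders table and its columns here are hypothetical, chosen only for the example.

```python
# Minimal sketch of the relational idea: data lives in tables and is
# retrieved with declarative queries rather than hand-written scans.
# The "orders" table and its columns are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [("acme", 120.00), ("globex", 75.50), ("acme", 42.25)],
)

# One query answers "how much has each customer spent?" without any
# application-level bookkeeping.
for customer, spend in conn.execute(
    "SELECT customer, SUM(total) FROM orders GROUP BY customer ORDER BY customer"
):
    print(customer, spend)
conn.close()
```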
Later innovations, like graphic displays, word processors and spreadsheets, set the stage for personal computers to be widely deployed. The Internet led to email, e-commerce and, eventually, mobile computing. In essence, the modern world is built on these applications.
Until now, all of these applications have run on von Neumann machines—devices in which a central processing unit is paired with programs and data stored separately in memory. So far, that’s worked well enough, but for the things we’ve begun asking computers to do, like power self-driving cars, the von Neumann bottleneck is proving to be a major constraint.
So the emphasis is moving from developing new applications to developing new architectures that can handle them better. Neuromorphic chips, modeled on the architecture of the brain, promise to be thousands of times more efficient than conventional chips. Quantum computers, which IBM has recently made available in the cloud, are far better suited to certain workloads, such as those in security and cryptography. FPGAs (field-programmable gate arrays) can be reconfigured and optimized for still other applications.
Soon, when we choose to use a specific application, our devices will automatically be switched to the architecture—often, but not always, made available through the cloud—that can run it best.
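A purely conceptual sketch of that dispatch idea, with hypothetical workload names and backend labels rather than any real API, might look like this:

```python
# Conceptual sketch only: route each kind of workload to the architecture
# assumed to suit it best, falling back to a conventional CPU otherwise.
# Workload names and backend labels are hypothetical.

ARCHITECTURE_FOR_WORKLOAD = {
    "pattern_recognition": "neuromorphic accelerator",
    "cryptography": "quantum backend (via cloud)",
    "signal_processing": "FPGA",
}

def pick_architecture(workload: str) -> str:
    """Return the registered backend for a workload, or the CPU default."""
    return ARCHITECTURE_FOR_WORKLOAD.get(workload, "conventional CPU")

for task in ("pattern_recognition", "cryptography", "spreadsheet"):
    print(task, "->", pick_architecture(task))
```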
From Products To Platforms
It used to be that firms looked to launch hit products. If you look at the great companies of the last century, they often rode to prominence on the back of a single great product, like IBM’s System/360, the Apple II or Sony’s Walkman. Those first successes could then lead to follow ups—like the PC and the Macintosh—and lead to further dominance.
Yet look at successful companies today and they make their money off of platforms. Amazon earns the bulk of its profits from third party sellers, Amazon Prime and cloud computing, all of which are platforms. And what would Apple’s iPhone be without the App Store, where so much of its functionality comes from?
Platforms are important because they allow us to access ecosystems. Amazon’s platform connects ecosystems of retailers to ecosystems of consumers. The App Store connects ecosystems of developers to ecosystems of end users. IBM has learned to embrace open technology platforms, because they give it access to capabilities far beyond its own engineers.
The rise of platforms makes it imperative that managers learn to think differently about their businesses. While in the 20th century, firms could achieve competitive advantage by optimizing their value chains, the future belongs to those who can widen and deepen connections.
From Bits To Atoms
In The Rise and Fall of American Growth, economist Robert Gordon argues that the rapid productivity growth the US experienced from 1920 to 1970 is largely a thing of the past. While there may be short spurts of growth, as there were in the late 1990s, we’re not likely to see a sustained period of progress anytime soon.
Among the reasons he gives is that, while earlier innovations such as electricity and the internal combustion engine had broad implications, the impact of digital technology has been fairly narrow. The evidence bears this out. We see, to paraphrase Robert Solow, digital technology just about everywhere except in the productivity statistics.
Still, there are indications that the future will look very different from the past. Digital technology is beginning to power new areas in the physical world, such as genomics, nanotechnology and robotics, that are already having a profound impact on such high-potential fields as renewable energy, medical research and logistics.
It is all too easy to get caught up in old paradigms. When progress is powered by chip performance and the increased capabilities of computer software, we tend to judge the future by those same standards. What we often miss is that paradigms shift and the challenges—and opportunities—of the future are likely to be vastly different.
In an age of disruption, the only viable strategy is to adapt.
