A Blog by Jonathan Low


Dec 8, 2017

Why Ideas Alone Can't Drive Transformation, Behaviors Also Have To Change

There are a lot of great ideas, but the reason that only a select few become transformational is that getting people and organizations to change is hard work: processes, structures and routines are disrupted; unintended consequences occur; unexpected costs arise; relationships suffer.

Leaders are required to encourage progress and, where necessary, keep the process on track. Customers, employees and investors have to be convinced that the outcome is worth the challenge.

Companies successfully transforming themselves recognize that it takes not just individuals or ideas, but teams, communities and economic ecosystems to prevail. JL

Greg Satell reports in Digital Tonto:

Innovation is never a single event, but a process of discovery, engineering and transformation. It is often the transformation phase that is most difficult and takes the longest. Just because you have a breakthrough idea doesn’t mean anyone knows what to do with it, because every meaningful innovation requires us to change practices and behaviors. Technology alone cannot drive change. Scientists and engineers can only provide tools. The rest of us will need to figure out what to do with them.
Stories of innovation usually follow a simple and familiar narrative. Someone gets an idea, figures out how to make it work and changes the world. Yet that is rarely how it actually happens. Far more often, someone comes up with a great idea and it never gets off the ground because no one is willing to accept it.
Ignaz Semmelweis came up with the idea that washing hands could drastically reduce infections, but was considered a quack and died in a mental hospital. Chester Carlson shopped his idea for the Xerox machine to 20 different firms but had no takers. It took Jim Allison three years to get anyone to invest in cancer immunotherapy.
The truth is that innovation is never a single event, but a process of discovery, engineering and transformation, and it is often the transformation phase that is most difficult and takes the longest. Just because you have a breakthrough idea doesn’t mean anyone knows what to do with it, because every meaningful innovation requires us to change practices and behaviors.

The Making of The 20th Century

As Robert Gordon explains in The Rise and Fall of American Growth, prosperity in the 20th century was largely driven by two technologies, electricity and the internal combustion engine. Neither was linear or obvious. Both took decades to go from their initial invention in the 1880s to the transformative impact they would have in the 1920s.
In both cases, the technologies were ill-suited to their times. Electricity did little to improve the productivity of factories designed for steam. So new factories had to be built based on very different concepts in order to leverage the advantages of the new technology. Even more importantly, management and work practices had to adapt.
Automobiles required both new manufacturing practices and new infrastructure. Henry Ford’s assembly line provided the former in the early 20th century, but it took much longer to build out roads and gas stations. Eventually, the automobile reshaped society around it, leading to suburbs, shopping malls and a retail revolution that vastly improved efficiency.
In both cases, the transformation was driven by both technical and human factors. Complementary technologies, such as highways and electrical appliances, needed to be developed to make the general purpose technologies useful, but the much larger impact came from people learning to incorporate new capabilities into their daily lives and work.

The 3rd Digital Revolution

While the first half of the 20th century was shaped by electricity and automobiles, the second half was driven by two digital revolutions. The first, born out of Alan Turing’s universal computer breakthrough, began to gain traction in the late 1950s. It was nearly invisible to the general public, but helped to automate back office processes and scientific calculations.
The second digital revolution began in 1968, with Douglas Engelbart’s Mother of All Demos, incubated at Xerox throughout the 1970s and burst into the public consciousness with the Macintosh in 1984. However, it took more than a decade for personal computers to combine with applications and the Internet to create a productivity boom in the late-90s.
We are now at the dawn of a third digital revolution. As Moore’s law begins to peter out, we’re developing new architectures, such as quantum and neuromorphic computing. These new technologies will combine with machine learning to extend our current capabilities by several orders of magnitude.
In a sense, the 2020s may turn out to be a replay of the 1920s, with a number of technologies, such as solar power, electric cars and autonomous machines, overtaking their old economy predecessors. Yet what will truly drive the next century will be the new computing architectures interacting with complementary technologies to transform the way we live and work.

The Next Big Thing: Atoms

Electricity and the internal combustion engine extended our capacity to do physical work. The first two digital revolutions extended the power of our minds. The third digital revolution will allow us to manipulate things at a molecular scale. Much like the first digital revolution, it will be mostly invisible, but will touch just about every facet of our lives.
The first shot across the bow was the Human Genome Project, completed in 2003. Using massive computing power, it created a map of our genetic code and allowed us to begin to catalogue abnormalities in databases, where they could be studied to develop new cures. More recently, scientists have begun to build materials genomes to transform how we make things.
In the years to come, the new, more powerful computing architectures will combine with machine learning to detect patterns in these massive data hoards far more efficiently than humans ever could. Even at this early stage, this new revolution is already transforming the scientific method in ways that would have seemed unlikely a decade ago.
Innovation, at its core, thrives on combination. It’s hard to predict in advance what will combine and to what effect, but the rough outlines are already beginning to take shape.

The Great Transformation

The basic elements of the next era of technology are already in place. A number of companies already have working prototypes of quantum computers and neuromorphic chips. Commercial versions can be expected in the next five years. These will add to the massive databases already in place, and machine learning algorithms will allow us to make sense of it all.
Yet technology alone cannot drive change. Scientists and engineers can only provide tools. The rest of us will need to figure out what to do with them. Going from an initial discovery to commercial impact takes decades because we need to adopt new capabilities into our life and work styles and apply them to new problems.
In the early 20th century, no one knew what a factory driven by electricity would look like or how a highway system would function. These were problems with no precedent in earlier paradigms. It wasn’t until the 1950s that electric appliances became common in every home and transportation infrastructure reshaped how we live and shop.
Today, all we can say is that the future will be profoundly different. The change that will happen in the next few decades will likely dwarf anything we have seen before. So there is little use in trying to predict with any accuracy what it will look like. That’s why innovation needs exploration. We need to be open to new possibilities to solve new problems.
