A Blog by Jonathan Low

 

Dec 1, 2017

The Next Great Transformation Will Be From Bits To Atoms

Our knowledge, and our capacity to act on it, are becoming more granular, more immediate - and more impactful. JL


Greg Satell reports in Digital Tonto:

It takes 30 years to go from initial discovery to market impact. We are about a decade into the next great transformation. That puts us where Xerox executives were in 1977 (when) they had no idea what personal computers would unleash. (They) saw a tool to automate secretarial work. The way we interface with the physical world is changing. The revolution underway is reshaping the scientific method.
When engineers from Xerox PARC showed off their revolutionary new personal computer, the Alto, at the company’s global conference in 1977, senior executives weren’t particularly impressed. It just didn’t seem to be relevant to their jobs or their business. Their wives, however, were transfixed.
The reason for the disparity was that the executives saw a tool to automate secretarial work, which they considered to be a low value activity. The wives — many of whom had been secretaries — saw an entirely new world of possibility and, when Steve Jobs built the Macintosh based on the Alto, everyone else saw it too.
It’s easy to shake our heads and laugh at those shortsighted executives of the past, but we’d do ourselves a much greater service by realizing that we are not that different. The truth is that the next big thing always starts out looking like nothing at all, so it’s hard to grasp its implications early on. That’s essentially where we are today with the shift from bits to atoms.

Anatomy Of A Revolution

These days we consider personal computers to be revolutionary, but as a stand-alone technology they were fairly limited. The original Macintosh was incredibly slow by today’s standards and only had 400 KB of storage. It wasn’t easily connected to other computers, which made it useless for sharing information.
Over time, that would change. Complementary technologies made the information age possible: the relational database, which led to ERP software; Ethernet, which connected computers to one another; and eventually the Internet. As all of these technologies became vastly more powerful, the world was significantly transformed.
One of the most overlooked aspects of computing technology is that it made simulation possible. Once computers were hooked up to massive databases, information could be downloaded and analyzed in spreadsheets. Executives could use that information to model different scenarios based on real-world data and apply those insights to make decisions.
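To make that concrete, here is a minimal sketch, in Python, of the kind of spreadsheet-style scenario analysis described above: take a baseline figure drawn from real-world data and project it under different assumptions. Every number here is hypothetical.

```python
# Spreadsheet-style what-if analysis: project a baseline figure under
# several growth assumptions. All values are hypothetical.
baseline_revenue = 1_000_000  # e.g. last year's revenue, pulled from a database

scenarios = {
    "pessimistic": -0.05,  # 5% annual decline
    "expected": 0.03,      # 3% annual growth
    "optimistic": 0.10,    # 10% annual growth
}

for name, growth in scenarios.items():
    # Five-year projection under this scenario's growth assumption.
    projection = [baseline_revenue * (1 + growth) ** year for year in range(1, 6)]
    print(f"{name:>11}: " + ", ".join(f"{p:,.0f}" for p in projection))
```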
None of this was obvious to anyone in 1977. In fact, these aspects of the technology wouldn’t become clear until the late 90s — a full two decades later. What the Xerox executives saw at the conference couldn’t have significantly helped them do their jobs, so it shouldn’t be surprising that they didn’t see what the big deal was.

The End Of Moore’s Law And The Rise Of New Computing Architectures

Computers have become so ubiquitous in the world today that it’s easy to miss something extraordinary going on. After decades of continuous improvement, our machines aren’t getting any better. Buy a laptop today and it’s likely to have nearly identical specifications to one you bought five years ago.
There are two reasons for this. First, the chip technology itself is nearing theoretical limits, so basic advancement is slowing down. Second, because computationally intensive tasks can be done more cheaply and conveniently in the cloud, we don’t have any great need for vastly more computing power on our desks or in our pockets.
Amid this slowdown of legacy technology, revolutionary new computing architectures are emerging. The first, quantum computing, can handle almost unimaginable complexity. The second, neuromorphic chips, can recognize patterns far more efficiently than conventional architectures while using far less power.
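To make “almost unimaginable complexity” concrete: an n-qubit quantum computer occupies a state described by 2^n complex amplitudes, so the cost of simulating it classically doubles with every qubit added. A toy state-vector sketch in plain NumPy (no quantum library or hardware assumed) shows both the exponential blow-up and a simple entangling circuit:

```python
import numpy as np

# An n-qubit state is a vector of 2**n complex amplitudes, so the memory
# needed to simulate it classically doubles with every qubit added;
# around 50 qubits the state vector alone runs to petabytes.
for n in (10, 20, 30, 40, 50):
    print(f"{n} qubits -> 2**{n} = {2**n:.2e} amplitudes "
          f"(~{2**n * 16 / 1e9:.2e} GB as complex128)")

# A two-qubit circuit: Hadamard then CNOT turns |00> into the entangled
# Bell state (|00> + |11>)/sqrt(2), a correlation with no classical analogue.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=complex)  # |00>
state = np.kron(H, I) @ state                  # superpose the first qubit
state = CNOT @ state                           # entangle the second with it
print(state.round(3))                          # [0.707, 0, 0, 0.707]
```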
One indication of what’s at stake is how many top firms are investing in these technologies. Google and IBM have very advanced quantum programs, but others such as Microsoft and Intel, as well as startups like Rigetti and D-Wave, are also progressing fast. IBM, Intel, Qualcomm and Nvidia all have advanced neuromorphic programs.

An Emerging Physical Stack

When most people think about digital technology, they usually think only about the top layer, the device and the user interface, but that is just a small fraction of the whole. There is an entire stack of technologies, from databases to middleware to applications, that goes into making it all work.
Today, a similar stack is being built for the physical world. New databases, such as The Cancer Genome Atlas and the Materials Genome Initiative, catalogue specific aspects of the physical world. These, in turn, are analyzed by powerful machine learning algorithms. The revolution underway is so profound that it’s reshaping the scientific method.
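As a purely illustrative sketch of that pattern, the snippet below trains a standard machine learning model to predict a material property from catalogued descriptors. The features and target values are synthetic stand-ins, not actual records from the Materials Genome Initiative or The Cancer Genome Atlas:

```python
# Illustrative only: fit a machine learning model to a catalogued
# database of physical measurements to predict a property of interest.
# The data below is synthetic; real work would pull curated records
# from a resource like the Materials Genome Initiative.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples = 500

# Hypothetical per-material descriptors (e.g. composition ratio,
# density, band gap), each scaled to [0, 1].
X = rng.uniform(size=(n_samples, 3))

# Hypothetical target property: some underlying structure plus noise.
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] ** 2 + 0.5 * X[:, 2] + rng.normal(0, 0.1, n_samples)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(f"held-out R^2: {model.score(X_test, y_test):.2f}")
```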
In the years to come, the new, more powerful computing architectures will drive the physical stack. Simulating chemistry is one of the first applications being explored for quantum computers, which will help us build larger and more detailed databases. Neuromorphic technology will allow us to analyze complex patterns and derive new insights.
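Neuromorphic chips compute with spiking neurons rather than clocked logic. A toy leaky integrate-and-fire neuron, the basic building block such chips implement in silicon, gives the flavor; the constants below are illustrative, not taken from any particular chip:

```python
import numpy as np

# Toy leaky integrate-and-fire neuron: it integrates incoming current,
# leaks charge over time, and emits a spike (then resets) whenever its
# membrane potential crosses a threshold. All constants are illustrative.
dt, tau, threshold, v_reset = 1.0, 20.0, 1.0, 0.0

current = np.where(np.arange(200) > 50, 0.06, 0.0)  # step input after t=50
v = 0.0
spikes = []

for t, i_in in enumerate(current):
    v += dt * (-v / tau + i_in)  # leak toward zero, integrate input
    if v >= threshold:           # fire and reset
        spikes.append(t)
        v = v_reset

print(f"spike times: {spikes}")
```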
The way we interface with the physical world is changing as well. Nanotechnology allows us to manipulate materials on a molecular scale, while new techniques such as CRISPR help us edit genes at will. Virtual reality will help us internalize insights, and advanced manufacturing techniques, such as 3D printing, will bring these visions into reality.

The Great Transformation

Innovation is never a single event, but a process of discovery, engineering and transformation and we almost always underestimate the complexity and duration of the transformation stage. Douglas Engelbart presented the basic features of personal computers in 1968, but the economic impact didn’t hit until the late 1990s. Edison completed the first power station in 1882, but electricity didn’t begin transforming our lives until the 1920s.
There are two reasons transformation takes so long. The first is that complementary technologies need to emerge. We get little out of computers without applications, and electricity is of little use without machines designed to use it. Second, we need to redesign our organizations, work practices and lifestyles in order to get the most out of new technology.
On average, it takes about 30 years to go from initial discovery to significant market impact, and we are about a decade into the next great transformation. That puts us almost exactly where those Xerox executives were in 1977. They had no idea what personal computers would unleash and, if we’re honest, we need to admit that we are in the same boat.
What we can do is recognize that there is a great transformation underway, one that will unlock possibilities and opportunities that are impossible to see clearly right now. It’s more important to explore than to predict, and that’s what we need to do today. We don’t need to understand the future to be open to it.

2 comments:

Stan Vines said...

Very informative with clarity. Understanding where we have come from is paramount to grasping where we are going and what we will be doing there. Stan Vines info@i4c.ch

Unknown said...

The Enlightenment Period of Technology is upon us!

