A Blog by Jonathan Low

 

Sep 29, 2021

Why Incremental Physics Is the Real Engine Propelling Innovation

It turns out that the rate of technological innovation is not due to individual genius or one spectacular breakthrough. It is due to the underlying physics of that technology and the decades of research and development on which it has been built. And it is predictable.

This does not mean that entrepreneurs, inventors and venture capitalists are not valuable. In fact, quite the opposite. But it does mean that to be successful, they need to focus on history, data and incremental improvement. JL 

Christopher Mims reports in the Wall Street Journal:

Across decades, individual technologies change at a steady rate. This rate is due to the underlying physics that any given technology is built on, and not to any particular genius or single breakthrough to which we usually attribute technological advances. The number of patents in a given technological field is only weakly correlated with its rate of improvement. The technologies behind even the most fantastical advances are the product of many decades of research and steady improvement. A prediction algorithm can determine the pace of progress for all technologies mentioned by the USPTO that have at least 100 patents. “There are flashes of insight, but you’re building on something that existed previously. It isn’t how good the managers are, or how wisely the capitalists spend.”

Should the U.S. invest in a generation of new intercontinental ballistic missiles? What has really propelled decades of consistently rising computer performance? Is research into new forms of nuclear power a dead end? And should we credit Elon Musk with revolutionizing the automobile industry, or is he just riding the coattails of history?

These are the sorts of questions that researchers who study the history of innovation and what it says about the future say we can now answer—thanks, of course, to innovation. Using both previously untapped pools of data and new analytical methods, along with the usual tools of modern-day forecasting—namely, the predictive algorithms often described as “artificial intelligence”—they are taking a quantitative approach to examining how quickly technologies improve.

The result isn’t a crystal ball for what’s next. Indeed, one of the conclusions of this group of academics is that attempts to predict the exact nature of the next technological advance are doomed to fail. But their research could help us understand how quickly existing technologies are getting better.

If they work, these innovation-prediction algorithms stand to benefit investors, company leaders and government planners. The goal is to help us make better-informed decisions about where to direct money, time and attention. Whether that means deciding what’s in our portfolios, or which line of R&D to pursue to solve a pressing problem, these systems can help, says Christopher Magee, an emeritus professor of engineering at the Massachusetts Institute of Technology and one of the authors of a forthcoming paper describing such a system.

A Little Better All the Time

Out of 1,757 different technologies, more than three quarters improve at a rate less than 20% a year, but a handful are improving very quickly.

[Chart: histogram of annual improvement rates across the 1,757 technologies studied. X-axis: annual rate of improvement, from 1% (slower) to 200% (faster); y-axis: percentage of technologies, from 0 to 8%. Annotation: microchips’ performance improves 37% a year.]

Source: ‘Technological Improvement Rate Predictions for All Technologies: Use of Patent Data and an Extended Domain Description,’ Anuraag Singh, Giorgio Triulzi and Christopher L. Magee

Dr. Magee, who spent 35 years at Ford Motor in areas including technology strategy, says one of his motivations for conducting this research is that he always felt as though he and other analysts were just guessing when they tried to predict what technology a company should invest in next. He also felt their guesses were fraught with personal bias.

Anuraag Singh, a former MIT fellow in system design and management and the lead researcher on the innovation project, had a similarly frustrating experience as an engineer working on what were supposed to be breakthrough technologies at Honda’s R&D division.

“When I was at Honda, we wanted the answer to the question of, what do we work on next?” says Mr. Singh. “A lot of people were saying robotics and AI are going to change the world, but there was no way to actually figure out if robotics was improving as fast as people thought it was—and it turned out it wasn’t.”

Now Mr. Singh and Dr. Magee can answer in a fraction of a second the question of how quickly any given technology is advancing. And anyone else can too, by typing the name of that technology into a Google-like search engine the researchers created. Robotics, for example, is improving at the rate of 18.5% a year, which sounds like a lot, except that the average rate of improvement for the more than 1,700 technologies the researchers studied is 19% a year.
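
To see what those percentages mean in practice, here is a minimal sketch of how fixed annual improvement rates compound over a decade. The rates are the figures quoted in the article; the function is illustrative, not the researchers’ actual search tool.

```python
# Sketch: what a fixed annual improvement rate implies over time.
def improvement_factor(annual_rate: float, years: int) -> float:
    """Total improvement after compounding a fixed annual rate."""
    return (1 + annual_rate) ** years

print(f"Robotics (18.5%/yr), 10 years:   {improvement_factor(0.185, 10):.1f}x")  # ~5.5x
print(f"Average tech (19%/yr), 10 years: {improvement_factor(0.19, 10):.1f}x")   # ~5.7x
print(f"Microchips (37%/yr), 10 years:   {improvement_factor(0.37, 10):.1f}x")   # ~23.3x
```

The near-identical decade-long gains for robotics and the average technology show why 18.5% a year sounds impressive yet is, by this measure, merely ordinary.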

The underlying driver of this improvement is that all technologies, even software, are ultimately governed by the laws of physics, which over the long run determine just how far, and how quickly, we can get them to evolve.

A lot of factors go into those percentages, including analysis of patents. The database they’ve made available includes more than 97% of all U.S. patents from 1976 to 2015. Their work builds on decades of previous research on how certain characteristics of patents can predict the rate at which a technology advances.

It turns out that the number of patents in a given technological field is only weakly correlated with its rate of improvement. A far better predictor is a measure of how a patented technology borrows from seemingly unrelated technologies. Innovation, it turns out, can come from anywhere, and breakthroughs are driven by the incorporation of technologies into one another.
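
The article doesn’t give the exact formula, but the underlying idea, how much a technology’s patents cite work from outside their own field, can be sketched as a simple citation-diversity measure. The records and class names below are hypothetical stand-ins for the USPTO data.

```python
# Hypothetical patent records: each has a technology class and the
# classes of the patents it cites (invented for illustration).
patents = [
    {"cls": "chip_lithography", "cites": ["lasers", "optics", "lasers"]},
    {"cls": "chip_lithography", "cites": ["chip_lithography", "chemistry"]},
]

def cross_domain_share(records, domain):
    """Fraction of citations by patents in `domain` that point outside
    the domain: a crude proxy for borrowing from unrelated fields."""
    outside = total = 0
    for p in records:
        if p["cls"] != domain:
            continue
        for cited in p["cites"]:
            total += 1
            outside += (cited != domain)
    return outside / total if total else 0.0

print(cross_domain_share(patents, "chip_lithography"))  # 0.8
```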

‘If we get rid of the hero worship and look at the actual process of innovation, we find that it is learnable, just like the piano is learnable.’

— Bill Buxton

Using these insights, as well as an empirical data set painstakingly gathered on the rates of improvement of 30 of the technologies in their database, the researchers trained a prediction algorithm that can determine the pace of progress for all technologies currently mentioned by the U.S. Patent and Trademark Office that have at least 100 patents related to them.
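
The article doesn’t describe the model itself, so what follows is only a minimal sketch of that training setup: patent-derived features for 30 technologies regressed against their measured improvement rates, with every number and feature invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((30, 2))  # invented patent features (e.g., cross-domain citation share)
y = 0.05 + 0.4 * X[:, 0] + 0.02 * rng.random(30)  # invented measured rates, all positive

# Fit log-rates by ordinary least squares; improvement rates are strictly
# positive and spread over orders of magnitude, so a log target is natural.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, np.log(y), rcond=None)

def predict_rate(features: np.ndarray) -> float:
    """Predicted annual improvement rate for a new technology domain."""
    return float(np.exp(coef[0] + features @ coef[1:]))

print(predict_rate(np.array([0.8, 0.5])))  # e.g., a high-borrowing domain
```

Any technology with at least 100 patents then gets a prediction simply by computing its patent features and feeding them to the fitted model.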

“I think their work is really good, and they are filling a vacuum in the literature,” says J. Doyne Farmer, a professor of mathematics at the University of Oxford who wasn’t affiliated with the work.

The MIT team’s research shows that “it is the physics of the technology that really matters,” he adds.

For example, the MIT researchers have found through the patent literature that a principal driver of the steady shrinking of microchip circuitry has been improvements in laser technology. This in some ways answers the question of whether “Moore’s Law” was a self-fulfilling prophecy made by Intel co-founder Gordon Moore, or something that would have happened even without his famous prediction, as lasers got better independent of chip manufacturing, says Dr. Magee.

Research done by Dr. Farmer’s group at Oxford backs up one main finding of this and previous research: Viewed across decades, individual technologies change at a surprisingly steady rate. This rate is due to the underlying physics that any given technology is built on, and not to any particular genius or single breakthrough to which we might usually attribute technological advances.

“It isn’t how good the managers are, or how wisely the capitalists spend,” says Dr. Farmer. “It’s a matter of picking the right horse and riding that horse.”

Looking at innovation this way, as an almost mechanistic, deterministic process, might not be as romantic as our countless hero-worshipping stories of daring inventors and entrepreneurs taking huge risks to bring us the next breakthrough, from the lightbulb to the rocket ship. But it is a much more reliable way to get the results that most people actually need and want out of investments in technology, argue these researchers.

‘It isn’t how good the managers are, or how wisely the capitalists spend. It’s a matter of picking the right horse and riding that horse.’

— J. Doyne Farmer

Here’s an example. The team at MIT says that in conversation with parts of the U.S. Defense Department, including the Air Force Research Laboratory, a question came up related to the military’s nearly $100 billion effort to revamp America’s stockpile of nuclear missiles: As the U.S. debates investing in a next generation of these missiles, will the rate of improvement in the power of lasers soon mean that such missiles could be shot down, should they lack defenses against such weapons?

It’s a question as old as the controversy over the Reagan-era Star Wars program, only now, say the MIT researchers, we have an answer. If we look at the historical rate of improvement in both lasers and missiles, it is likely we’ll be able to shoot down unprotected nuclear missiles with lasers at some point in the next 15 to 25 years, say the researchers.
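
The extrapolation behind a claim like that is straightforward: if a capability improves at a fixed annual rate, the time to reach any threshold follows from solving an exponential. The laser-power numbers below are invented, since the article doesn’t give them; only the formula is the point.

```python
import math

def years_to_reach(current: float, target: float, annual_rate: float) -> float:
    """Years for a capability growing at a fixed annual rate to go from
    `current` to `target`: solve current * (1 + r)**t = target for t."""
    return math.log(target / current) / math.log(1 + annual_rate)

# Invented example: effective laser power at 1 unit, 20 units needed to
# destroy an unprotected missile, improving 16% a year.
print(round(years_to_reach(1, 20, 0.16)))  # ~20 years
```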

The Defense Department didn’t respond to a request for comment.

Another question with huge implications for the future of humanity is which technologies we should bet on to produce electricity in a way that won’t render our planet uninhabitable. In research yet to be published, Dr. Farmer and other members of his group compared the rates of improvement in solar photovoltaic technology and nuclear power, and found that while the cost per watt of solar power is now 0.1% what it was 70 years ago, the cost of nuclear power actually went up.
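
That 0.1% figure pins down the long-run trend: a cost that falls to one-thousandth of its starting level over 70 years implies an average decline of about 9.4% a year, as a quick back-of-the-envelope check shows.

```python
import math

# Solar cost per watt is now 0.1% (a factor of 0.001) of its level 70
# years ago. Solve (1 - r)**70 = 0.001 for the average annual decline r.
r = 1 - 0.001 ** (1 / 70)
print(f"average annual decline: {r:.1%}")                            # ~9.4%
print(f"halving time: {math.log(0.5) / math.log(1 - r):.1f} years")  # ~7.0
```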

“So if you’re talking about the future, it isn’t nuclear; and if you’re an investor, you should know that, and if you’re a student, becoming a nuclear engineer isn’t something I would recommend to anybody,” says Dr. Farmer.

This isn’t to say that breakthroughs don’t happen. Occasionally an industry will shift to an entirely new technology, as batteries did in the early 2000s with the move to lithium-ion cells, which hold much more energy for the same size and weight.

“I would never argue that teams and people don’t matter,” says Dr. Magee. “Great scientists are just a little bit ahead of the curve—they aren’t unbelievable heroes who do what no one else can do, and neither is Elon Musk.” Individuals and firms might speed up the overall rate of improvement of a technology by a few percentage points a year, but the technologies behind even the most fantastical-seeming advances, from smartphones to reusable rockets, are inevitably the product of many decades of research and steady improvement, says Dr. Singh.

Bill Buxton, a researcher at Microsoft Research and one of the creators of the interface on which modern touch computing is based, articulated in 2008 a theory that distills some of the insights of this research into a simple concept. He calls it the “long nose of innovation,” and it describes a graph plotting the rate of improvement, and often adoption, of a technology: a period of apparently negligible gains, followed by exponential growth.


“This work is valuable because it shows there are flashes of insight, and people do make changes incrementally, but in general you’re building on something that existed previously,” says Mr. Buxton, referring to the MIT research. “If we get rid of the hero worship and look at the actual process of innovation, we find that it is learnable, just like the piano is learnable.”

What really matters about his team’s research, says Dr. Magee, is that for the first time people have the ability to ask, and answer, questions about how quickly a technology is improving, without resorting to anecdotes or broad theories of how innovation works.

That is good news for anyone who wants to get into almost any kind of software, by far the fastest-improving set of technologies the team uncovered. But it is bad news if you’re looking for improvements in the field of “mechanical hair removal”—the slowest-improving technology out of all 1,757.
