A Blog by Jonathan Low


Jul 31, 2018

Why Do the Biggest Companies Keep Getting Bigger? It's How They Spend On Tech

Proprietary tech contributes more to firms' competitive economic advantage than other types of technology.  JL

Christopher Mims reports in the Wall Street Journal:

The secret of the success of the Amazons, Googles and Facebooks of the world—not to mention the Walmarts, CVSes and UPSes before them—is how much they invest in their own technology. IT spending that goes into hiring developers and creating software owned and used exclusively by a firm is the key competitive advantage. The productivity gap correlates with the increase in spending on proprietary IT. In 1985, firms spent 7% of their net investment (software, new buildings, R&D and the like) on proprietary IT. In 2016, 24% of U.S. firms’ net investment went into proprietary IT.
Your suspicions are correct: The biggest companies in every field are pulling away from their peers faster than ever, sucking up the lion’s share of revenue, profits and productivity gains.
Economists have proposed many possible explanations: top managers flocking to top firms, automation creating an imbalance in productivity, merger-and-acquisition mania, lack of antitrust regulation and more.
But new data suggests that the secret of the success of the Amazons, Googles and Facebooks of the world—not to mention the Walmarts, CVSes and UPSes before them—is how much they invest in their own technology.
There are different kinds of IT spending. For the first few decades of the PC revolution, most companies would buy off-the-shelf hardware and software. Then, with the advent of the cloud, they switched to services offered by the likes of Amazon, Google and Microsoft. Like the difference between a tailored suit and a bespoke one, these systems can be customized, but they aren’t custom.
IT spending that goes into hiring developers and creating software owned and used exclusively by a firm is the key competitive advantage. It’s different from our standard understanding of R&D in that this software is used solely by the company, and isn’t part of products developed for its customers.
Today’s big winners went all in, says James Bessen, an economist who teaches at Boston University School of Law and who recently wrote a paper on the policy challenges of automation and artificial intelligence. Tech companies such as Google, Facebook, Amazon and Apple—as well as other giants including General Motors and Nissan in the automotive sector, and Pfizer and Roche in pharmaceuticals—built their own software and even their own hardware, inventing and perfecting their own processes instead of aligning their business model with some outside developer’s idea of it.

The result is our modern economy, and the problem with such an economy is that income inequality between firms is similar to income inequality between individuals: A select few monopolize the gains, while many fall increasingly behind. Might it eventually be the case that the biggest firms aren’t just dominant, but all-encompassing?
The measure of how firms spend, which Mr. Bessen calls “IT intensity,” is relevant not just in the U.S. but across 25 other countries as well, says Sara Calligaris, an economist at the Organization for Economic Cooperation and Development. When you compare the top-performing firms in any sector to their lesser competition, there’s a gap in productivity growth that continues to widen, she says. The result is, if not quite a “winner take all” economy, then at least a “winner take most” one.
That productivity gap correlates with the increase in spending on proprietary IT, says Mr. Bessen. In 1985, firms spent on average 7% of their net investment (which includes software, new buildings, R&D and the like) on proprietary IT, according to data from the Bureau of Economic Analysis. In 2016, about 24% of U.S. firms’ net investment went into proprietary IT. That’s nearly $250 billion in a single year, and almost matches their outlay for R&D and capital expenditures.
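As a rough back-of-the-envelope check on those figures (using only the rounded numbers quoted above, so the result is approximate and the variable names below are illustrative), roughly $250 billion at a 24% share implies a total net-investment base of about $1 trillion, with the proprietary-IT share more than three times its 1985 level:

```python
# Back-of-the-envelope check of the rounded figures quoted above
# (shares and dollar amount as reported; variable names are illustrative).

share_2016 = 0.24      # ~24% of U.S. firms' net investment in 2016
share_1985 = 0.07      # ~7% in 1985
spend_2016 = 250e9     # "nearly $250 billion in a single year"

# Implied total net-investment base (software, new buildings, R&D and the like)
implied_base = spend_2016 / share_2016
print(f"Implied net investment base: ~${implied_base / 1e12:.2f} trillion")  # ~$1.04 trillion

# Growth of the proprietary-IT share between 1985 and 2016
print(f"Share grew ~{share_2016 / share_1985:.1f}x since 1985")              # ~3.4x
```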
This also has implications for wages—the rise in the wage gap since 1978 is almost entirely attributed to an increase at more-productive firms that occurred as pay at less-productive firms remained relatively static, according to the National Bureau of Economic Research.
When new technologies were developed in the past, they would diffuse to other firms fast enough so that productivity rose across entire industries. Samuel Slater, the “father of America’s industrial revolution,” was able to more or less single-handedly bring England’s pioneering cotton-mill technology to the U.S. by apprenticing in an English spinning mill and memorizing the design of its machinery. And 20 years ago, firms could adopt Microsoft Office or Adobe’s desktop publishing software and instantly disrupt larger firms that were slower to adopt this new technology.
But imagine that, instead of textile machinery, someone is trying to copy and reproduce Google’s cloud infrastructure itself. What if Excel had never been consumer software, but instead was, say, a closely guarded piece of Ernst & Young’s internal infrastructure?
What we see now is “a slowdown in what we call the ‘diffusion machine,’” says Dr. Calligaris.
One explanation for how this came to be is that things have just gotten too complicated. The technologies we rely on now are massive and inextricably linked to the engineers, workers, systems and business models built around them, says Mr. Bessen. While in the past it might have been possible to license, steal or copy someone else’s technology, these days that technology can’t be separated from the systems of which it’s a part.
Think of Facebook’s artificial-intelligence engine, which it developed at great cost for its namesake social network, but then was able to migrate with relative ease to Instagram. Could Instagram have developed something equivalent on its own? Snap and Twitter may try to copy aspects of it, but they can’t see enough under the hood to truly clone it.
And what about Amazon? Sure, you can start a business that uses Amazon’s cloud-computing services and taps into its logistics platform by selling on its site, but the software Amazon developed to enable Amazon Web Services and its retail marketplace is not itself available to other firms.
Walmart built an elaborate logistics system around bar code scanners, which allowed it to beat out smaller retail rivals. Notably, it never sold this technology to any competitors.
Just spending money on technology doesn’t cut it, however. “In retail, Sears in the ’80s was IBM’s biggest customer,” says Mr. Bessen. “They were a big investor in IT but they just proved incapable of competing effectively with Walmart and its systems.” Part of the problem with Sears’s approach could be that it hired an outside technology firm instead of doing the work—and building the infrastructure of talent, systems and institutional knowledge—itself.
This seemingly insurmountable competitive advantage that comes with big companies’ IT intensity may explain the present-day mania for mergers and acquisitions, says Mr. Bessen. It may be difficult or impossible to obtain critical technologies any other way.
And Mr. Bessen doesn’t think the advantage is due to differences in regulation, as the biggest firms are becoming more productive across many countries—in both the U.S. and Europe. It might, in fact, explain why recent efforts by the European Union to penalize Google and other tech companies with massive fines could come to naught.
It’s not clear just how long this phenomenon will drive the biggest firms in each sector to grow faster than their competitors. But as tech’s giants tiptoe toward monopoly, it’s worth asking whether modern information technology has built in a kind of natural law that says we’re destined to buy all our goods and services from just a handful of ultra-giants, once they’re done buying or out-competing everyone else in their markets.
