A Blog by Jonathan Low

 

Sep 1, 2014

The Real Reason Companies Are Spending Less on Tech

Tech and finance have enjoyed a spectacular run of co-evolutionary success. Tech benefited from the financial markets' ability - eagerness is more like it - to monetize its innovations and then sell them to a grateful world.

The financial services industry did even better: it scored a two-fer (two for the price of one), making a pile on commissions and fees as it took tech to market, employing some of that very technology to revolutionize the way it identified, managed, and traded the newfound wealth, and coming up with tech-inspired innovations of its own that increased both the speed and the profitability of its efforts. What a deal.

But as they say in wedding vows, if not in SEC filings, such partnerships are often forced to endure for better or for worse, in sickness and in health.

The reality is that enterprises may have gotten more out of tech than tech has gotten out of them. All the productivity gains, new data, access to new markets and their ilk have delivered spectacular returns. Greater, in fact, than many institutions are equipped or organized to incorporate into their operations.

So, as most human endeavors are wont to do eventually, they got a little greedy and a little lazy. And then some of those cyclical things happened that everyone reads about in their introductory economics courses and promptly assumes are meant for their less intelligent competitors rather than for themselves. The result is that the financial industry and all those dependent on it suddenly find themselves under greater pressure to perform. Investment bankers with great pedigrees get fired. Brilliant technological innovations go unfunded. Once-sturdy cash cows begin to look anemic.

The point is that sources and uses, needs and wants, tactics and strategies all change. What once looked like a great partnership suddenly becomes an impediment, even a liability.

Business and government will undoubtedly reinvest in tech at higher rates in the future. But as we seem to need to be regularly reminded, nothing, not even our appetite for innovative new devices, is forever. JL

Justin Fox comments in Harvard Business Review:

Financial pressures are pushing them to focus on efficiency and performance improvement rather than investing in innovations that might create new markets.
After the dot-com bubble, investment in software and information processing equipment in the U.S. tumbled, and stayed down. As a percentage of GDP, it’s now back to mid-1990s levels:
[Chart: U.S. investment in software and information processing equipment as a percentage of GDP]
There’s a version of the chart above in the much-discussed paper that MIT economist David Autor presented last week at the Federal Reserve’s annual Jackson Hole meeting. As part of a thoughtful and generally sanguine look at whether machines are going to take all of our jobs, Autor wrote that whatever might happen in the future, computers and their robot friends didn’t seem to be taking our jobs now:
As documented in [the chart] the onset of the weak U.S. labor market of the 2000s coincided with a sharp deceleration in computer investment — a fact that appears first-order inconsistent with the onset of a new era of capital-labor substitution.
Autor suggested that financial-market troubles (first the dot-com bust, then the global financial crisis) coupled with “China’s rapid rise to a premier manufacturing exporter” probably played much bigger roles in U.S. job market troubles of the past decade than new technology had. That seems reasonable enough.
But I couldn’t help but fixate on that information-technology chart, which seemed to show corporate America giving up on IT. Maybe it was Nick Carr’s famous May 2003 HBR article “IT Doesn’t Matter” that did it. Or maybe corporate executives found that all that money they were pouring into computers wasn’t really paying off, or that even if it did, stock buybacks were an easier and safer path to keeping their paychecks big.
Or maybe modern information technology just keeps getting cheaper.
The software and devices of today can do vastly more than those of a decade ago, usually for the same or lower prices. The Bureau of Labor Statistics and the Bureau of Economic Analysis try to adjust for such quality changes in calculating inflation and real GDP. They catch some flak for this from those who think they are understating inflation, but it seems like a necessary exercise, especially in IT.
So I set about redoing the above chart using the “chained-dollar” inflation-and-deflation-adjusted versions of both GDP and investment in information technology equipment and software. I could only easily access data back to 1999, and I should note that the BEA explicitly cautions users of its data against doing what I did, “because the prices used as weights in the chained-dollar calculations usually differ from the prices in the reference period, and the resulting chained-dollar values for detailed GDP components usually do not sum to the chained-dollar estimate of GDP or to any intermediate aggregate.”
Got that? Anyway, here’s the chart:
[Chart: chained-dollar (real) investment in information technology equipment and software relative to real GDP, 1999 onward]
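In code, the calculation behind this chart amounts to something like the following minimal sketch. The file names and column labels are hypothetical stand-ins for downloaded BEA series, and the BEA caveat quoted above applies: the resulting ratio is a rough gauge of the trend, not an official statistic.

import pandas as pd

# Hypothetical CSV exports of two BEA chained-dollar series:
# real GDP and real private investment in IT equipment and software.
gdp = pd.read_csv("real_gdp_chained.csv", index_col="year")
it = pd.read_csv("real_it_investment_chained.csv", index_col="year")

# Real IT investment as a percentage of real GDP, year by year.
# Chained-dollar components don't sum cleanly to chained-dollar GDP,
# so treat the level with caution and read the trend instead.
it_share = 100 * it["value"] / gdp["value"]
print(it_share.round(2))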
So despite small drops amid the dot-com bust and the financial crisis, real investment in information technology has continued to rise. Well, sort of. The amount of money corporations have been putting into IT, relative to the size of the overall economy, dropped sharply in the early 2000s and has stayed down (that’s what the first chart shows). But the estimated value that they’ve been getting out of those investments has continued to rise.
I don’t think this chart helps a lot in answering whether the machines will take our jobs. The two charts together, though, do illuminate much about the strange economy of the past decade-plus.
Corporations have been spending relatively less on IT and getting dramatically more for the money. Their biggest area of capital investment, thus, has been something of a free lunch. The result: big profits, low capital spending, and big piles of cash that executives and boards don’t quite know what to do with.
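To see the mechanics of that free lunch with made-up numbers (not BEA data): if the IT price index halves over a decade while nominal IT spending stays flat and nominal GDP grows, real IT investment doubles even as IT's share of GDP falls.

# Stylized, hypothetical figures - not BEA data.
nominal_it_2000, nominal_it_2010 = 100.0, 100.0       # IT spending, $bn, flat
it_deflator_2000, it_deflator_2010 = 1.0, 0.5         # IT prices halve
nominal_gdp_2000, nominal_gdp_2010 = 1000.0, 1500.0   # economy grows

# Real (quality-adjusted) IT investment doubles...
real_it_2000 = nominal_it_2000 / it_deflator_2000     # 100.0
real_it_2010 = nominal_it_2010 / it_deflator_2010     # 200.0

# ...while IT's share of nominal GDP falls from 10% to about 6.7%.
share_2000 = 100 * nominal_it_2000 / nominal_gdp_2000
share_2010 = 100 * nominal_it_2010 / nominal_gdp_2010
print(real_it_2000, real_it_2010, share_2000, round(share_2010, 1))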
You might think that the ever-bigger bang for the buck in IT would lead corporate managers to double down and invest even more, but as Clayton Christensen and Derek van Bever argued in the recent HBR article “The Capitalist’s Dilemma,” a variety of financial pressures are pushing them to focus on efficiency and performance improvement rather than investing in innovations that might create new markets.
The pace of improvement in IT is a giant gift that, since the early 2000s, only a few have been rushing to open.
