A Blog by Jonathan Low

 

Jan 19, 2018

What If the So-Called Skills Shortage Is Actually An Employer Shortage?

Wages have started to rise as the economy heats up.

Which raises questions about whether there was ever really a skills shortage, or merely the realization that people are capable of making informed decisions about the relative value of their time and services that may not be in accord with employers' assumptions. JL

Jordan Weissmann reports in Slate:

If good workers were really in short supply, you’d expect pay to rise as companies outbid each other for talent. Instead, employers carp(ed) about a lack of good job applicants while letting pay stagnate. What the firm loses in reduced output and revenue, it more than makes up in reduced costs by paying lower wages. In reality, there’s only a shortage of people willing to work at the artificially low wage you’re paying. The problem isn’t a skills shortage, it’s that you aren’t offering market wages.
Much has been written about America’s alleged skills shortage. Articles in which executives moan about their inability to find qualified workers for job openings are business press perennials, typically focusing on “middle skill” industries like manufacturing and construction that don’t require a bachelor’s degree. In the years immediately following the Great Recession, there seemed to be an entire cottage industry devoted to blaming America’s stubbornly high unemployment rate on the notion that workers just lacked the specific talents employers needed, rather than, say, the hangover from a housing bust and financial crisis that had crippled the economy.
One of the reasons these stories never really added up was that, outside of a select few industries, American wages were relatively flat. If good workers were really in short supply, you’d expect pay to rise quickly as companies tried to outbid each other for talent. Instead, employers spent years carping about a lack of good job applicants while letting pay stagnate.
Why would that happen? One answer may lie in the recent economics paper that I wrote about. In the years following the Great Recession, the U.S. labor market was incredibly concentrated, with a relatively small number of businesses posting help-wanted ads across different industries and cities. That appeared to put downward pressure on wages; the more concentrated the local market, the lower pay tended to be, the study’s authors found. This, they argued, was a sign that U.S. employers had an enormous amount of monopsony power, meaning they were essentially free to set low wages, because few other businesses were around or hiring.
There are a lot of reasons why labor market monopsony is a problem. (First and foremost: Workers have zero leverage to demand a raise.) But one of the more subtle issues is that a lack of competition between employers can, in theory at least, actually lead to lower overall levels of employment while creating the illusion of a labor shortage. In a functioning, competitive labor market where employers are all jockeying to hire the best staff, workers should be paid based on the value they add to a business. If you add $25 every hour to your employer’s revenue by soldering engine parts, then you should earn about $25 an hour. Otherwise, another car parts manufacturer will swoop in and offer you that much to work at its factory. (The company’s cut of revenue, for reference, is supposed to come from the value added by its capital investments in things like machinery.)
That’s not how things work when competition breaks down and companies can exercise monopsony power. If that happens, businesses may find it’s more profitable to pay workers less than they’re worth. (Shocking, I know.) But this creates a dilemma for bosses who can’t find any more workers willing to work for low pay. Management can either advertise higher wages and risk having to bump up its current workers’ earnings as well, or it can keep advertising the same cruddy wage and end up not hiring anybody. In the textbook models, employers choose the latter: more profits, less staff. As President Obama’s Council of Economic Advisers explained in its brief on monopsony back in 2016, “Economic theory shows that firms with monopsony power have an incentive to employ fewer workers at a lower wage than they would in a competitive labor market. What the monopsonistic firm loses in reduced output and revenue, it more than makes up in reduced costs by paying lower wages.”
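To make that textbook result concrete, here is a minimal numeric sketch in Python. The numbers and the linear labor supply curve are invented for illustration (they come from neither the concentration study nor the CEA brief); the only point is the direction of the result: a profit-maximizing monopsonist stops hiring while workers still add more than they cost, ending up with fewer workers at a lower wage.

```python
# A toy version of the textbook monopsony model quoted above.
# Assumptions (all made up for illustration): each worker adds a constant
# $25/hour of revenue, and attracting L workers requires posting a wage of
# w(L) = 10 + 0.5 * L dollars/hour for everyone employed.

MRP = 25.0        # revenue each worker adds per hour (marginal revenue product)
a, b = 10.0, 0.5  # assumed linear labor supply curve: w(L) = a + b * L

def profit(L):
    """Hourly profit from employing L workers at the wage needed to attract them."""
    wage = a + b * L
    return MRP * L - wage * L

# Competitive benchmark: rival firms bid the wage up to the $25 workers add,
# so employment expands until w(L) = MRP.
L_comp = (MRP - a) / b
w_comp = MRP

# Monopsonist: raising the wage to attract one more worker means raising it for
# everyone already on staff, so hiring stops earlier. Setting d(profit)/dL = 0
# for this linear case gives MRP = a + 2 * b * L.
L_mono = (MRP - a) / (2 * b)
w_mono = a + b * L_mono

print(f"Competitive: {L_comp:.0f} workers at ${w_comp:.2f}/hr, profit ${profit(L_comp):.2f}/hr")
print(f"Monopsony:   {L_mono:.0f} workers at ${w_mono:.2f}/hr, profit ${profit(L_mono):.2f}/hr")
# Competitive: 30 workers at $25.00/hr, profit $0.00/hr
# Monopsony:   15 workers at $17.50/hr, profit $112.50/hr  (fewer workers, lower wage, more profit)
```

Under these assumed numbers the monopsonist employs half as many people at a wage well below what they produce, which is exactly the "fewer workers at a lower wage" outcome the CEA describes.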
Here’s a hypothetical example of how the theory might play out in the real world. Let’s say you manage a small construction company, and you’ve been getting away with paying your crew relatively little because there aren’t that many other contractors posting help-wanted ads in your town. You need a new carpenter. But you don’t want to tick off the rest of your men by offering this new potential employee a more generous wage. So you post the job with the same mediocre hourly rate you’ve offered for the past three years. Nobody good responds, and to you, this looks like there aren’t enough talented carpenters out there. But in reality, there’s only a shortage of people willing to work at the artificially low wage you’ve set your heart on paying. The real problem isn’t a skills shortage, it’s that you aren’t offering market wages, because the market isn’t functioning.
It’s easy to imagine how this all played out immediately post-recession, when employers got used to being able to dictate wages in an anemic job market. But while the paper on monopsony I covered only tracks data between 2010 and 2013, it seems plausible that the labor market is still suffering from severe concentration. So remember, the next time you hear about a skills shortage, the real problem may actually be an employer shortage.
