When the Minimum Wage Makes Economists Smile

Chairman of the President's Council of Economic Advisers is a grand title, and it's been held by some pretty impressive academic economists over the years (Arthur Okun, Marty Feldstein, Joe Stiglitz, Ben Bernanke, Christina Romer — to name a not-entirely-randomly chosen few). But it's usually hard to detect the Chairman's fingerprints in administration economic policy. The big decisions are made in the West Wing of the White House, not over in the Executive Office Building where the Council is housed.

So when a chairman does have a clear impact, it gets noticed. Then-chairman Glenn Hubbard, for example, pushed for and got a reduction in taxes on dividends in 2003. And, in the State of the Union Address Tuesday night, current Chairman Alan Krueger got a kind of shout-out from the President, in the form of a proposal to raise the federal minimum wage all the way from $7.25 to $9 and index it to inflation after that.

In 1992, when New Jersey raised the state minimum wage from $4.25 to $5.05, Krueger and his then-Princeton colleague David Card surveyed 410 fast-food restaurants in New Jersey and eastern Pennsylvania before and after the wage hike. The idea was to compare changes in fast-food employment in New Jersey, where the minimum wage had risen, with those in Pennsylvania, where it stayed constant at $4.25. The surprising result: fast-food employment went up in New Jersey relative to Pennsylvania.
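Card and Krueger's comparison is a classic difference-in-differences design: the change over time in the control state nets out whatever regional trends would have hit both states anyway. A minimal sketch of the arithmetic, using made-up illustrative employment figures rather than their actual survey results:

```python
# Difference-in-differences sketch of the Card-Krueger design.
# The employment figures below are hypothetical placeholders,
# NOT the actual 1992 survey numbers.

# Average full-time-equivalent employment per restaurant,
# before and after New Jersey's minimum-wage increase.
nj_before, nj_after = 20.4, 21.0   # treatment group (wage rose)
pa_before, pa_after = 23.3, 21.2   # control group (wage unchanged)

nj_change = nj_after - nj_before   # change in the treated state
pa_change = pa_after - pa_before   # change in the control state

# The diff-in-diff estimate: how much more (or less) employment
# changed in New Jersey than in Pennsylvania, netting out trends
# common to both states.
did_estimate = nj_change - pa_change
print(round(did_estimate, 2))  # → 2.7
```

With these placeholder numbers, employment rose slightly in New Jersey while falling in Pennsylvania, so the estimated effect of the wage hike on employment is positive — the same qualitative pattern the paper reported.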

This was surprising because the basic supply-and-demand model of economics teaches that when you raise the price of something (in this case, low-skilled labor), demand for it will go down. Philosophical objections to this approach had been raised over the years — among them the argument that employers possess more power and information than individual workers in most labor markets, allowing them to push wages below the optimal level in the absence of collective bargaining or government intervention. But Card and Krueger now had empirical evidence that the "textbook model," as they put it, didn't work. The New Jersey fast-food restaurants did pass their increased wage costs on to customers in the form of higher prices — but the increases weren't big enough to hurt business.

This research was a sensation, as economic research goes. It got lots of media attention back in the early 1990s, and has continued to inspire economist after economist to attempt to refute or back up its conclusions (Google Scholar lists 8,780 citations, and Wikipedia summarizes some of the major work). It's probably accurate to say that most economists still don't believe that raising minimum wages is a reliable way to increase employment (one hopes that Brian Barry and Anil Kashyap of the University of Chicago will ask their Economic Experts Panel about this soon) — but I also get the sense that the percentage of economists who think it has a substantial negative effect on employment has declined since the initial Card-Krueger research was published. Economists in general have become a bit less trusting of "textbook models" than they were in the 1970s through 1990s. And while it's dangerous to equate changing fashions in the economics profession with truth, I'll go ahead and say that the business groups making dire claims about the negative economic impact of a minimum wage increase are mostly blowing smoke.

What ails the U.S. economy, and in particular its workers, though, goes well beyond the minimum wage. According to the Bureau of Labor Statistics, 3.8 million people, or about 3% of the country's wage and salary workers, made the minimum wage or less in 2011. Yet workers across the income spectrum, except for those at the very top, have been stuck in neutral for a while. Except for a brief uptick during the dot-com era, labor's share of income has been on a steady decline since the early 1970s. Through the years, this has been mostly attributed to globalization (capital can go anywhere on the globe in search of the highest return, while workers are generally stuck in the country they came from) and technological change (machines are replacing workers). Lately there's increasing sentiment, perhaps not so much among economists as among others who care about economic policy, that maybe political decisions and changing social mores have played a big role, too.

As Jonathan Schlefer wrote on hbr.org in November, early economists like Adam Smith and David Ricardo believed wages were set by "habits and customs of the people," to use Ricardo's words, as much as by economic forces. That's where something like the minimum wage — or collective bargaining by labor unions — comes in. If, in a free market, the wages and salaries paid closely approximate the actual value of the work done, then minimum-wage laws and unions can only get in the way. But if labor markets are naturally riddled with inefficiency and affected by custom and habit, then laws and unions can conceivably bring a healthier economy — and higher profits for business — by raising wages.

Studies of retailers seem to indicate that this might be the case. So does the current example of the Northern European countries, which combine strong unions and high wages with higher competitiveness rankings than the U.S.

Of course, it's really hard to imagine at this point that unions will ever regain much of a foothold in the U.S. private sector. The minimum wage is irrelevant to most of the workforce, too. And it may still be that the most reliable way to increase the wages of American workers is simply to upgrade their skills. But I get the sense that the conversation about pay in the United States is just getting started — and that it's not going to be dominated by the models out of economics textbooks.
