Epistemic status: Super casual
Raising minimum wages is supposed to reduce employment, right? If you make something more expensive, people buy less of it.
So, if you ran a study, you’d think you’d actually see that. If one state or city raises the minimum wage, employment should drop.
But the evidence actually seems pretty equivocal.
In 1994, the famous Card and Krueger study came out. New Jersey’s minimum wage rose; neighboring Pennsylvania’s didn’t. Yet, over the eight months after the wage hike, full-time employment increased in New Jersey relative to Pennsylvania. Instead of cutting jobs, fast-food employers appear to have passed the extra costs on to customers in the form of higher meal prices.
This study is controversial and its results have been directly challenged, though even that challenge is itself controversial:
Because of concerns about the Card and Krueger data, the Employment Policies Institute examined payroll records for 71 fast-food restaurants and found significant discrepancies between the Card and Krueger data and payroll records for these firms. They found significantly different results when their revised data was used for estimation purposes. Critics of the EPI study argue that the selection process used to generate the Employment Policies Institute sample appears not to be random (all Pennsylvania observations are Burger King restaurants owned by a single franchise owner).
There are more recent studies finding that minimum wage increases do reduce employment, like a study of Seattle’s 2015 minimum-wage hike, which found that (compared to economically similar counties without the wage increase) Seattle saw low-wage employment drop slightly and wages rise slightly, for a net decrease in low-wage workers’ earnings.
The results seem to depend a lot on how the studies are conducted. The Economic Policy Institute observes that fixed-effects regression studies tend to show that minimum wage increases have negative employment effects, while studies that match locations with minimum wage increases to similar locations without them find that minimum wages don’t harm employment. The latter is arguably the more rigorous methodology, since minimum wage hikes may be correlated with local economic conditions in ways that a simple fixed-effects regression doesn’t control for.
However, economist David Neumark, writing in the WSJ, argues that it matters how you match treatment and control jurisdictions: if you match geographically nearby locations, you find that minimum wage increases don’t reduce employment, but if you match locations subject to the same economic shocks, you do find a negative employment effect of minimum wage hikes.
This is all very confusing. Even if minimum wages do reduce employment, a huge and unequivocal effect wouldn’t be this sensitive to the choice of statistical methodology.
What could be going on?
- Employment is sticky. When workers get more expensive, employers don’t fire them immediately, because high fixed capital costs mean you still need about the same number of people to run your store/restaurant. Instead, what might happen is:
- Firms eat the cost and go out of business (some evidence shows that this happens)
- Firms invest in labor-saving equipment and don’t hire anyone later (some studies show that minimum wage hikes hurt long-term job growth)
- Firms cut back on non-monetary compensation for workers (like AC or benefits)
- People’s demand for goods produced by low-wage workers isn’t very elastic; we’ll still eat a sandwich even if it’s more expensive, so most of the cost of minimum wages gets passed on to consumers.
- There’s massive data falsification in some direction
- Economics is fake: people don’t really always try to get more stuff for less money
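The inelastic-demand point above can be made concrete with a toy pass-through calculation. All the numbers below are invented for illustration; in particular, the demand elasticity of −0.3 is an assumption, not a figure from any study.

```python
# Toy illustration: with inelastic demand, a seller can pass higher labor
# costs on to customers via higher prices without losing much business.
# All numbers are invented for illustration.

price = 5.00          # current sandwich price, in dollars
quantity = 1000       # sandwiches sold per week
elasticity = -0.3     # assumed: % change in quantity per 1% change in price

price_increase = 0.10                            # shop raises prices 10%
quantity_change = elasticity * price_increase    # quantity falls only 3%

new_price = price * (1 + price_increase)         # 5.50
new_quantity = quantity * (1 + quantity_change)  # 970

old_revenue = price * quantity                   # 5000.0
new_revenue = new_price * new_quantity           # 5335.0

# Revenue *rises* despite selling fewer sandwiches, so the shop can cover
# higher wage costs largely out of customers' pockets.
print(old_revenue, new_revenue)
```

If demand were elastic instead (say −1.5), the same price hike would shrink revenue, and the cost couldn’t be passed on so easily.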
I don’t have a great way to find out what’s actually going on. I’d really appreciate more information if anyone has it!
13 thoughts on “What’s Up With Minimum Wage?”
One thing generally worth noting: Most current proposals for increasing the minimum wage are about raising it from ~$7.25 to ~$15 (http://fightfor15.org/). This is a much, much, much larger shift than those that have been studied before. The shift from $9 to $10 a decade ago should look like the shift from $10 to $11 now, but from $10 to $15 is a very different animal. A small enough change can easily be lost in the noise of economic activity.
Epistemic status: Not an economist. I have no idea, I’m just brainstorming.
Everyone seems to assume that the product being purchased (labor) is independent of the price being paid (the wage). It is easy to imagine that people who are paid more (within some reasonable range) are more productive on average (happier, more energetic, etc.), and that businesses that pay more attract higher-quality labor (less turnover, etc.), such that the actual cost to the business might be significantly less than the raw cost of the price increase (or even negative, meaning the business gains from a higher wage). If the actual cost is less than the paper cost, employment might not decrease as much as the paper cost would imply, making the effect hard to measure.
The question becomes, why would the free market not price labor efficiently in low wage markets?
The case that I think might be correct is as follows:
Most low-wage businesses are large multinational conglomerates (Wal-Mart, fast food, etc.) that are highly centralized in organization. This yields large economies of scale; however, it means that those in control of the business operate under strong legibility constraints (see https://www.ribbonfarm.com/2010/07/26/a-big-little-idea-called-legibility/). The cost-benefit analysis of wage work, especially service work, is asymmetric in legibility. The cost is a very legible number, whereas the benefit might require a lot of statistics and large studies to understand (average turnover, cost of retention, etc.). This puts strong pressure on decision makers at the top to take the obvious, legible reward (lower wages) at the cost of an illegible benefit that might not even be measurable on the timescales they care about (quarterly reports).
This would also imply that although the effects of the minimum wage might be small on large businesses, they could be large on small businesses that have fine-grained information about the conditions of their workers. I remember reading about a small bookstore going out of business because it could not afford to pay the newly increased minimum wage. The workers had accepted a lower wage because working conditions were good, but once that became illegal they were forced out.
The minimum wage is also a very strong Schelling point. There is a lot of work that people mentally categorize as “minimum wage work” and would not want to pay more than that for. The real minimum wage has slowly decreased over time due to inflation. If businesses peg their nominal entry-level wage to the minimum wage, then they might have fallen below the optimal wage over time, even if the minimum wage was close to the optimal wage originally.
Do you know of any literature that would confirm or rebut this line of thinking?
It is a little-remarked development of the economy over the past generation that large chains have become highly centralized. Franchises used to be much more popular, especially in fast food. But now the stores are company owned. Newer chains are more centralized but also older chains are more centralized than they used to be. This may be due to improved centralization technology, but I think it is probably also due to cheap debt being a better choice than equity.
Franchises are effectively small businesses. They have the incentive and autonomy to price labor properly. Most of the stores in the paper are franchises, and it considers them separately.
I think this line of reasoning is a good one. Specifically, it makes me think there is a generalized pattern of asymmetry in the diffusion or concentration of costs and benefits which could act as a persistent force. Imagine two axes, one for cost and one for benefit, each varying from diffuse to concentrated. If a 45-degree line represents perfectly balanced investment ideas, then real-life investments will tend to fall toward the concentrated-benefits, diffuse-costs side, due to the cost of information. We should expect a larger discrepancy the larger the cost of information, which helps explain some other distortions as well. Easier to see with a crude drawing, maybe: http://imgur.com/a/K5uCI
What does it say about the field of economics that so much attention is paid to this one study? Card and Krueger did a meta-analysis shortly afterwards and found no effect. So there were earlier studies that found no effect, but no one paid them any attention. (Also, the meta-analysis showed serious publication bias, so the median study showed the expected effect. But if one study was sufficient for people to freak out, why this one?)
Link to the meta-analysis? And do you mean publication bias *against* minimum wages reducing employment?
1995, just a year later
No, a publication bias in favor of studies that (1) match the theory and (2) have statistically significant effects. That is the direction I expect publication bias to go, although I don’t think I’ve seen a systematic study of the topic. (I expect papers against theory to be easier to publish in top journals and to make people’s careers, but I expect papers that match theory to be easier to publish overall.)
I’m not saying that you should read a 20 year old meta-analysis, just that it is a source for the shape of the literature before Card-Krueger, in particular against the common claim that their result was unique.
There is another pattern that I think is common in social science. People have a theory that makes a qualitative prediction, but not a quantitative prediction. If you don’t have a quantitative prediction, you can’t make an adequately powered study, but no one is willing to take a null result as concluding a small effect, rather than falsifying the theory. So there is strong pressure to massage the data to be statistically significant. The result is a sequence of papers all just barely statistically significant. These are all taken to confirm the qualitative theory, but no one notices that they contradict each other’s effect sizes.
I think the Schelling-point effect may be significant, particularly if minimum wage is slightly below the wage that would otherwise be the equilibrium.
Be careful to distinguish the unemployment rate from employment levels.
I can imagine a scenario in which the minimum wage increases unemployment by putting marginal quality workers out of work, yet increases employment levels by drawing higher quality workers into the labor market.
The theory tells us that – all else being equal – making low-wage hourly employees more expensive should lead to fewer work-hours and lower-quality jobs, but it doesn’t say precisely *when* that will happen. Figuring out exactly when employment will be affected is an empirical question.
I think everybody knows intuitively that a LARGE, SUDDEN, UNEXPECTED increase would hurt employment and kill businesses, so only a mild, gradual, well-telegraphed increase is even politically possible. (And even that is MORE politically possible during economic booms, when the natural reservation wage has already increased.) So the kind of increases we get tend to be the kind that’s hard to spot.
Suppose the legislature debates over raising the minimum wage for a year or two, then passes a law that first takes effect the following year. Smart well-connected businesspeople might start figuring out how to cut hours (and put expansion plans on hold, and increase automation…) well *before* the law takes effect. Then when a study is done that compares low-wage employment numbers, say, right before versus 6 months after the law *takes effect* it sees no employment loss because the major employment loss (and/or loss of job growth) was already baked into the “before” numbers that the “after” numbers are being compared to. That is the sort of mistake that would be easy to make either accidentally or (in the case of partisans who WANT to find no effect) deliberately. You could also err in the other direction – maybe businesses are optimistic and willing to take losses for a while, so “6 months after” or even “one year after” numbers don’t YET capture all the inevitable job loss.
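A toy simulation of the measurement-window pitfall described above (every employment number below is invented purely for illustration):

```python
# Toy illustration of the measurement-window pitfall: employers start
# cutting jobs when the law is *announced*, well before it takes effect,
# so a "right before vs. after the effective date" comparison misses it.
# All numbers are invented.

# Monthly low-wage employment index, keyed by months relative to the
# law taking effect at t=0; assume the hike was announced at t=-12.
employment = {
    -18: 100, -15: 100, -12: 100,   # pre-announcement baseline
    -9: 98, -6: 96, -3: 95,         # anticipatory cuts after announcement
    0: 95, 3: 95, 6: 95,            # little further change after t=0
}

# A study comparing "right before the effective date" vs. "6 months after":
naive_change = employment[6] - employment[0]    # sees no job loss at all

# A study using a pre-announcement baseline instead:
full_change = employment[6] - employment[-18]   # sees the real decline

print(naive_change, full_change)
```

In this made-up scenario the naive comparison reports zero change while the full comparison reports a five-point drop, which is exactly the mistake the comment describes.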
One thing we might try to do is pay attention to the dose-response curve. *Larger percentage* wage increases should produce *larger percentage* employment effects. So rather than looking at the impact of just one specific increase at a time, look at a bunch of countries/states/cities at once and make a dot plot.
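The dose-response idea above can be sketched as a regression slope across jurisdictions. The data below is entirely hypothetical (invented jurisdiction names and numbers), just to show the shape of the exercise: plot percentage wage increase against percentage employment change and fit a line.

```python
# Hypothetical dose-response check: do *larger* percentage minimum-wage
# increases predict *larger* percentage employment changes?
# All jurisdictions and numbers below are invented for illustration.

jurisdictions = {
    # name: (% wage increase, % employment change)
    "A": (5.0, -0.2),
    "B": (10.0, -0.6),
    "C": (15.0, -0.9),
    "D": (25.0, -1.8),
    "E": (40.0, -3.1),
}

def fit_slope(points):
    """Ordinary least-squares slope (with intercept) through (x, y) pairs."""
    n = len(points)
    mean_x = sum(x for x, _ in points) / n
    mean_y = sum(y for _, y in points) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in points)
    var = sum((x - mean_x) ** 2 for x, _ in points)
    return cov / var

slope = fit_slope(list(jurisdictions.values()))
print(f"Estimated response: {slope:.3f}% employment per 1% wage increase")
```

A clearly negative, roughly linear slope across many hikes would be harder to explain away than any single before-and-after study; a cloud with no slope would suggest the effect is small relative to the noise.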
You missed out this one:
increased minimum wage -> increased demand for goods and services -> disemployment effect is counteracted.
Minimum wage workers will spend all the money they have, in their local area – the same is not necessarily true of store/restaurant owners. You can probably find the relevant paper with the model in it if you search around a bit, it’s quite well known but it’s been a while so I can’t give you the cite!