This commentary is part of the American Worker series, which explores a range of critical topics that affect the American workforce. The series is sponsored by RAND Labor and Population.
In 1938, President Franklin Roosevelt signed the Fair Labor Standards Act into law. Among its many provisions, the act established a national minimum wage of 25 cents per hour ($4.27 in 2016 dollars). By 1968, the minimum wage had reached its peak purchasing power of $1.60 per hour ($11.08 in 2016 dollars). Today, the minimum wage stands at $7.25 per hour, nearly $4 less than the inflation-adjusted 1968 peak.
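The inflation adjustment behind these figures is straightforward to sketch. The snippet below uses approximate annual-average CPI-U levels (the exact index values are illustrative assumptions, and small differences in the index used explain why the results land a few cents from the figures cited above):

```python
# Approximate annual-average CPI-U levels (illustrative assumptions,
# not the exact deflators used in the commentary above).
CPI = {1938: 14.1, 1968: 34.8, 2016: 240.0}

def to_2016_dollars(amount, year):
    """Convert a nominal dollar amount from `year` into 2016 dollars."""
    return amount * CPI[2016] / CPI[year]

# 25 cents in 1938 comes out near the $4.27 cited above.
print(round(to_2016_dollars(0.25, 1938), 2))
# $1.60 in 1968 comes out near the $11.08 cited above,
# well above today's $7.25.
print(round(to_2016_dollars(1.60, 1968), 2))
```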
Critics of raising the minimum wage have long argued that proponents are ignoring the laws of supply and demand. Their argument runs along the lines of “if you raise the price of steel, firms will purchase less steel, and so if you raise the price of labor, firms will hire fewer workers.” Consequently, the perverse effect of raising minimum wages—these critics argue—is to generate unemployment.
Analyses of minimum wages often rest on this simple supply-and-demand argument, in which a wage change affects only how much labor firms buy, not the labor itself. But the traditional argument breaks down once the idea of what labor supply means is expanded; steel and human labor are not the same. Raising the price of steel does not fundamentally change what steel is. The price of labor (wages), however, is the price of people's time, and people are likely to change their behavior, and thus the very nature of the labor they supply, when the price of their time goes up.
Accounting for these behavioral responses is the more realistic approach: analyses that focus only on the quantity of labor have predicted much larger employment losses from minimum wage hikes than actually occurred. More recent economic theory reflects this worker-behavior thinking, detailing how an increase in the minimum wage leads to a cascade of effects: higher-quality workers enter the labor market, and workers search more vigilantly for work. This improves the quality of the employee-employer match, raising productivity and lowering turnover.
Researchers have also noted that as wages increase, worker effort and output could increase. This could occur because higher relative wages reduce shirking: shirking becomes costly when a shirker could lose a well-paid job, be unemployed, and struggle to find another job at a comparably high wage.
Crucially, these conceptual advances produce predictions far more consistent with what actually happens after a minimum wage increase than the simple supply-and-demand model does. For example, Oxford University economist Andreas Georgiadis' study of the 1999 minimum wage increase in the UK found that relatively large wage increases for caregivers resulted in only modest reductions in employment. Importantly, nursing homes cut the number of supervisory workers and increased the number of caregivers, reducing supervisory costs.
Similarly, managers and business owners do not sit idly by, accepting increases in wages without changing behavior. The most common employer response is to raise prices, passing the increased labor costs on to customers. Employers may also seek novel ways to raise worker productivity. For example, Walmart adopted both a $10 per hour minimum wage and an improved employee scheduling system in an effort to reduce employee turnover and increase morale. In the first quarter of its fiscal 2017, total year-over-year revenue rose 4 percent, and customer traffic in U.S. stores rose 1.5 percent.
Given that raising the minimum wage is likely to change employee and employer behavior in ways that raise efficiency and offset increased costs, does that mean the minimum wage can be increased without bound? Obviously, no: At high levels of the minimum wage, employers must raise their prices to cover higher wages, leading to significantly reduced sales. Ultimately this raises the incentive for employers to find cheaper substitutes for expensive labor, such as outsourcing, offshoring, or substituting machines for people.
So how high can the minimum wage go without these negative effects? The high-water mark for the minimum wage was reached in 1968 without economic catastrophe: U.S. unemployment was 3.6 percent in 1968 and fell to 3.5 percent in 1969. The 1968 minimum wage was approximately 53 percent of the average wage for production and non-supervisory workers. Under reasonable assumptions about wage growth, and with a gradual phase-in, a national minimum wage of $12 per hour by 2020 would be similar in size to the 1968 benchmark and would take a similar “bite” out of the wage distribution. Wage increases beyond $12, however, are without historical precedent in the United States, so there is no evidence to either condemn or commend increases above that level.
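The “bite” comparison can be sketched as simple arithmetic. In the snippet below, the 1968 average production-worker wage (about $3.00 per hour), the 2016 average wage (about $21.70 per hour), and the 2.5 percent annual wage-growth rate are illustrative assumptions, not figures from this commentary:

```python
def bite(minimum_wage, average_wage):
    """Minimum wage as a share of the average production-worker wage."""
    return minimum_wage / average_wage

# 1968: $1.60 minimum against an assumed ~$3.00 average wage,
# matching the roughly 53 percent cited above.
bite_1968 = bite(1.60, 3.00)

# Project an assumed ~$21.70 average wage in 2016 forward four years
# at an assumed 2.5 percent annual growth, then compute the bite of $12.
avg_wage_2020 = 21.70 * (1.025 ** 4)
bite_2020 = bite(12.00, avg_wage_2020)

print(round(bite_1968, 2), round(bite_2020, 2))
```

Under these assumptions the 2020 bite of a $12 minimum comes out near 50 percent, in the same range as the 1968 benchmark; the point is the method, not the exact inputs.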
Given the much higher historical levels of the minimum wage and the current national momentum for increases, what can American workers expect in the coming years? The next president is likely to increase the federal minimum wage. However, many states and municipalities are not waiting for Congress and the president to act. Instead, states as different as California and Arkansas, and cities across the country such as New York, Seattle, and Los Angeles, are raising minimum wages independently. Consequently, the minimum wage landscape is becoming ever more complex and varied, in many ways reflecting the underlying complexity of the U.S. labor market.
Jeffrey Wenger is a senior policy researcher at the nonprofit, nonpartisan RAND Corporation.
Commentary gives RAND researchers a platform to convey insights based on their professional expertise and often on their peer-reviewed research and analysis.