Robert Pollin makes a compelling case for the centrality of full employment to the creation of a decent society, to the ability of individuals and families to live with dignity rather than despair, and to the overall health of an economy in which consumer spending is key to sustained growth. His capsule history of economic thinking on the causes of unemployment and the tradeoff between employment and inflation—from Marx to Milton Friedman to Gösta Rehn—is informative, and his main policy recommendations are difficult to argue with: increase employment in the United States by shifting $330 billion in annual spending from the military and fossil-fuel sectors to public and private investments in education and clean energy for a net gain of 4.8 million jobs.

I do have one quarrel with the analysis. Pollin observes that the U.S. economy achieved low unemployment in the late 1990s despite globalization and accepts this as evidence that the United States does not have to address its trade deficit to achieve full employment. But that achievement was possible only because of a bubble. When the trade deficit is high, either the public or the private sector (the latter, in the 1990s case) must take on debt in order to keep demand, and with it employment, high. Reducing the trade deficit is therefore essential to sustaining full employment without a repeat of bubble-driven boom and bust.
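The accounting behind this point is worth making explicit. By the standard national-income identity (a textbook relationship, and my gloss rather than anything Pollin invokes):

(S - I) + (T - G) = NX,

where S is private saving, I private investment, T tax revenue, G government spending, and NX net exports. A trade deficit makes NX negative, so the private balance (S - I) and the public balance (T - G) cannot both be in surplus: at least one domestic sector must spend beyond its income, that is, go into debt, if demand is to remain at full-employment levels.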

Pollin’s central argument, however, is sound, though it might benefit from further elaboration. I take as my starting point his definition of full employment, with which I am in full agreement: not simply workers scratching out a living somehow, but an abundance of jobs with decent wages and working conditions. This definition raises two issues that need to be confronted: first, the implications of employers’ increased power over workers in wage setting; and second, the implicit willingness of policymakers to count as employment care-work jobs that pay poverty wages. Set aside that fiction, and achieving full employment becomes a far more difficult proposition. If full employment means jobs for all at decent wages, then we need to be concerned both about re-employing the millions of men who lost jobs in manufacturing and construction and about wages and job quality in the rapidly expanding care-work sectors in which millions of women labor.

On the wage front, the decline in unionization means that older ideas of wages as the result of a grand bargain (or great struggle) over the division of productivity gains are no longer relevant. Unions are not the countervailing force they were in the quarter century from 1948 to 1973, able to compel employers—through direct negotiations and the “union threat effect” at non-union companies—to agree to a reasonable division of a growing economic pie.

From the point of view of today’s employers, the notion of wages as a means of securing a decent standard of living for Americans is so last century. At a meeting of the Philadelphia chapter of the National Association of Business Economists, I asked the owner of a medium-sized business whether his employees’ wages were rising along with increases in productivity. “I make it. I take it,” he answered. Like most employers, and like any manager who has taken a course in human-resource management, he believed that the wage functions to give workers an incentive to show up for work and do what managers expect. His workers do show up, which to his mind is proof enough that he is paying them fairly.

This represents a corruption of what economists call “efficiency wage” theory. The theory holds that when companies pay employees more than they can earn elsewhere, workers respond with extra effort and greater productivity. Whether they realize it or not, employers use this concept as a cudgel. A manager at Motorola explained that his company pays workers at the 70th percentile of wages for their occupation. That way, workers know that if they don’t perform and lose their jobs, there is a 70 percent chance their next job will pay less. In other words, wages are de-linked from productivity: pegged to a fixed percentile of the occupation’s wage distribution, a worker’s pay increases only when wages rise generally. But with innovative companies playing a lagging rather than a leading role in wage setting, prospects for a general rise in wages are dim.

In the large and growing care-work sector, the increased ability of employers to dictate wages intersects with traditional views of the day-to-day care of the most vulnerable members of society. Aid to the young, the old, the disabled, the frail, and the sick is seen as work that requires only the intrinsic abilities of the women who do it. Workers typically receive minimal on-the-job training, and median pay falls in the bottom quartile of the income distribution: just over $21,000 a year in 2008 for workers with full-time hours. More than 4 percent of Americans employed in 2008 were home-health workers, personal- and home-care aides, nursing aides and orderlies, medical assistants, child-care workers, or teaching assistants, slightly more than the combined share of all jobs in the mathematical, computer, biological, physical, and social sciences and in law. The Bureau of Labor Statistics projects that 10 percent of total U.S. job growth through 2018 will come from the care sector. Wage increases such as those of the late 1990s, driven not only by full employment but also by the minimum-wage increases of 1996 and 1997, would raise median wages in these care-work jobs to just above the poverty line.

As Pollin remarks, getting to full employment at decent wages “will require nothing less than an epoch-defining reallocation of political power away from the interests of big business and Wall Street.” What constellation of social forces can accomplish this transformation is unclear. That it is essential to creating a decent society, however, could not be more certain.