Thus far, my equity and eligibility posts have reviewed public policies that direct, attempt to direct, or pretend to direct public benefits to the less well off, or to those most in need. Yet government eligibility rules have often directed public benefits, services, and protections to the better off rather than the worst off, with not only "conservative" Republicans but also "egalitarian" Democrats in support. And the policy of providing benefits to the better off hasn't been limited to obscure, limited-cost programs; it has been characteristic of the history of the most extensive and expensive public programs in the country. Over the past century and a half, many now universal or near-universal benefits and protections, such as public education, Social Security, unemployment insurance, and the minimum wage, were offered first to the organized and influential, then to the population at large, and finally to the least well off: the poor, the sick, the disabled, and minorities. The better off, in short, were the "thin end of the wedge," those with the political clout to get a public benefit created, who later felt a duty to offer the same benefit to others. More recently, however, the government, and the tax burden, reached some kind of maximum or equilibrium, and the march to universal benefits stopped, leaving inequities in its wake. No one feels much duty to others anymore, not even those who are receiving benefits themselves.
Take public education, for example. Universal public elementary and secondary education developed in fits and starts over the centuries, a process now being repeated in developing countries. According to Gotham, the Pulitzer Prize-winning history of New York City, in 1840 public schools were opened only in middle-income neighborhoods, not in neighborhoods populated by the immigrant poor. As late as 1940, two-thirds of adult Americans had not attended high school, according to data in the Census Bureau's Historical Statistics of the United States. At that time only 42 percent of all five- and six-year-olds were in school. Eventually, however, public education through 8th grade, then through high school, and then beginning with kindergarten, became a social expectation for almost all children in the United States. The better off in better-off communities, however, had received it first.
Social Security underwent a similar evolution, as a quick read through the Social Security Administration's history webpage shows. Today Social Security is thought of as universal, with the Social Security number the equivalent of a universal national identity system. Initially, however, Social Security payments were available only to workers, not to their widows after the workers died (widows were expected to go live with the kids). Also, the 1935 Social Security Act based retirement benefits on wages earned in employment, but the act's definition of "employment" included only those working in substantial, regular, urban businesses. It excluded agricultural laborers, "casual laborers," those laboring on vessels, domestic servants, those working in the non-profit sector, and those working for the government. Public employees were presumed to have their own pensions, but those laboring on farms, as sailors, as casual laborers, and as domestic servants made up a substantial share of the poor, especially the minority poor, of the time.
The history of the minimum wage, as recounted on the history page of the U.S. Department of Labor's website, shows a similar evolution. The 1938 Fair Labor Standards Act applied generally to employees engaged in interstate commerce or in the production of goods for interstate commerce; that is, to employees of large, generally unionized manufacturing corporations. Amendments passed in 1961 extended coverage to employees in large retail and service enterprises, as well as to local transit, construction, and gasoline service station employees. Additional amendments in 1966 extended coverage to state and local government employees of hospitals, nursing homes, and schools, and to laundries, drycleaners, and large hotels, motels, restaurants, and farms. Subsequent amendments extended coverage to the remaining federal, state, and local government employees who were not protected in 1966, to certain workers in retail and service trades previously exempted, and to certain domestic workers in private household employment. In each case, coverage was extended to lower- and lower-paying industries and occupations. Today virtually every employee is covered by minimum wage laws, though enforcement against "off the books" employment for less than the minimum wage is sporadic.
For 200 years, as the United States became more affluent and developed, the "thin end of the wedge" strategy generally led to universal public benefits. In the last three decades, however, it has failed to do so.
The best example of this is the advance toward, and then retreat from, universal health care. The first publicly funded health benefits were public health programs, public hospitals, and sanatoriums. At a time when few people, especially few poor people, lived long enough to suffer from chronic conditions such as heart disease and cancer, these organizations and agencies addressed the greatest threat of the time: infectious disease. An infectious disease has the potential not only to weaken or kill the person who has it now, but also to spread and affect others, including the well off, later. The publicly funded treatment of the infectious diseases of the poor, at first by quarantine and then by vaccination, was therefore as much a benefit for wealthy non-beneficiaries as for the poor beneficiaries themselves. The poor seldom had enough money to see physicians and obtain treatment for non-infectious conditions that affected only themselves.
The next wave of public health finance was the growth of private health insurance among employees of large corporations during the Second World War, and the exemption of employer payments for that insurance from taxable income. At a time when the highest federal income tax bracket was 70 percent, that exemption was worth a great deal to the wealthiest; the government was indirectly footing almost their entire health insurance bill. The middle class, meanwhile, received a back-door public subsidy worth one-quarter to one-third the cost of insurance, based on its tax rates. Medicare and Medicaid were introduced in the mid-1960s, providing health care to the elderly and the dependent poor. This left only the working poor, and self-employed people who could not afford health insurance, outside the public health finance system. Both groups were shrinking as the nation became wealthier and as large corporations replaced small businesses in more and more of the American economy.
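The arithmetic behind that back-door subsidy is simple: the value of excluding an employer-paid premium from taxable income is just the worker's marginal tax rate times the premium. Here is a minimal sketch, using illustrative premium and rate figures that are assumptions for demonstration, not historical data:

```python
def exclusion_subsidy(premium: float, marginal_rate: float) -> float:
    """Value of excluding an employer-paid premium from taxable income.

    The worker avoids income tax on the premium, so the implicit
    public subsidy is the marginal tax rate times the premium.
    """
    return premium * marginal_rate

# Illustrative figures only (assumptions, not historical data).
premium = 1_000.0  # annual employer-paid premium, in dollars

# A top-bracket taxpayer at a 70 percent marginal rate:
print(exclusion_subsidy(premium, 0.70))  # 700.0 -- most of the bill

# Middle-class taxpayers at 25 and 33 percent marginal rates:
print(exclusion_subsidy(premium, 0.25))  # 250.0 -- about one-quarter
print(exclusion_subsidy(premium, 0.33))  # 330.0 -- about one-third
```

The same arithmetic explains why the subsidy keeps growing: a fixed marginal rate applied to a rising premium yields a rising subsidy, and the higher the tax bracket, the larger the share the government picks up.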
Then things began to change. In the late 1970s and early 1980s, large industrial companies downsized, pushing millions of people out of positions in which health insurance, along with similarly tax-advantaged pensions, was assured. Many such companies cut costs by purchasing goods and services from new, smaller companies whose chief cost advantage was the absence of such benefits. As the large baby boom generation flooded the labor market, positions in large companies, and positions with health insurance, became scarce, and the trend away from self-employment reversed. Today more and more working people lack health insurance; only half of all private-sector employees had employer-provided health insurance in 2000, according to the Bureau of Labor Statistics. With more and more influential and organized people feeling threatened, there was a move, in the early 1990s, toward some form of universal health care. The specific program proposed by the Clinton Administration was rejected. Some alternatives were briefly discussed. And then nothing happened.
A de facto political settlement has occurred, one that has left many of the most powerless and vulnerable without publicly financed or subsidized health care, even as they are forced to pay sales taxes for state Medicaid programs and payroll taxes for Medicare, for the benefit of those who are often better off than themselves. And more and more public subsidies and benefits are being provided to those who already have benefits, because the tax subsidy grows as the cost of health insurance rises and as more high-tech health care is made available to the insured. A drug benefit for the elderly under Medicare widens the inequity further.
In many states, a similar settlement has been reached for unemployment insurance, with a substantial share of the poorest workers left out of the system even as their employers continue to pay into it on their behalf. The greatest inequity is the denial of benefits to part-time workers. According to a 2002 overview by the National Employment Law Project, part-time workers are ineligible for unemployment benefits when laid off in 29 states, even though taxes are collected on their wages. Part-time work is typical of the employment opportunities in low-wage, low-skill retail and food service businesses. Such workers are thus subsidizing those in better-paid, full-time jobs (assuming the unemployment taxes their employers pay on their behalf reduce the wages they receive).
The NELP is a liberal advocacy organization that favors allowing workers to collect unemployment insurance even if they quit their jobs, for a variety of seemingly good reasons that would be impossible to prove or disprove. Such policies would run into all the administrative and fraud issues of similar "means"- and "need"-restricted benefits. There is, however, no doubt about whether a person has been working part time, since there is a record of unemployment taxes paid on his or her behalf. Equal treatment for part-timers, however, would cost money. Having the less well off subsidize better-paid jobs is part of a package of benefits and subsidies that inequitable states, most of them in the South, have used to attract such jobs away from places like New York City. In fact, unemployment insurance is one of the few remaining ways in which southern states are more inequitable than New York.
Pre-kindergarten is a final example. According to the Census Bureau, by 2001 the share of women with children under age six who were in the labor force had reached 63 percent for those who were married and 70 percent for those who were single. Given that many of today's young parents must work, as their mothers did not, in order to pay the higher taxes needed to fund higher benefits for the elderly, a publicly financed place for their four-year-olds to be cared for during the day could be considered a fair benefit in turn. Unlike three-year-olds, whose toilet training often has yet to be completed, four-year-olds can be managed in groups, so pre-school works for them. And in the 1990s, when many members of the large and influential baby boom generation had young children, it became a public priority.
In 1995, when our oldest child was three, we began to see flyers in neighborhood stores advertising public pre-school in the cash-starved, staggering New York City schools. We looked into it and signed our daughter up. During the school year, we found that we were required to have a home visit from a social worker as part of the pre-school program. It turned out that a special grant program had been set up to provide a pilot pre-school program for disadvantaged children, but the only schools that received the grants were in neighborhoods where few such children lived. When few disadvantaged children applied, other children were permitted into the program. The pre-kindergarten program didn't work for us; it was just a half day, and individual child care for the other half of the day cost more than private full-day pre-kindergarten programs, including the one where we sent our second child. But we had served our purpose. We had received a benefit that the poor did not get, and had served, with others, as the "thin end of the wedge."
The State of New York funded a "universal pre-kindergarten" program at the end of the 1990s, when money was flush. This was a great benefit for poor mothers, many of whom were being required to work under welfare reform "workfare" and could not afford private nursery schools. The program was opened up, on a voucher basis, to such nursery schools, and the competition forced the public schools to actually provide services. And an additional year of school is a special benefit for immigrant and poor children, whose parents cannot be counted upon to teach them letters, numbers, colors, shapes, and basic life skills before they arrive at kindergarten. But "universal" benefits are not popular in our "one deal at a time, one interest at a time" political culture. It was much cheaper to serve the affluent while pretending to serve the poor, in the finest "liberal" tradition of New York. When New York State's most recent fiscal crisis was finally acknowledged, the Governor proposed eliminating "universal pre-kindergarten" in fiscal 2004. Pre-kindergarten still isn't universal, and it is unclear when, or if, it will become so.
At one time, hooking affluent beneficiaries into a public benefit was a sure way to make it universal. Only the passage of a little time, and an appeal to people's consciences, was required. Surely the better off would feel uncomfortable receiving a public benefit denied to the less well off who pay for it. Surely senior citizens, receiving health care paid for by the young, would be unhappy to see their children and grandchildren lack health insurance. Then it stopped working. The thin end of the wedge is now at the fat end of public spending, and of political support for public programs and benefits. Perhaps there is no longer a conscience to appeal to.