As venture capitalists raided healthy corporations and sent them into bankruptcy, creating widespread unemployment especially in manufacturing, Clinton neo-liberals were pushing NAFTA and global market policies. Reagan deregulated and dismantled oversight and accountability so corporations could grow into monopolies and earn billions in profit instead of millions, and Clinton continued this while installing everything needed to send corporations global. The 1990s under Clinton saw unemployment soar as US corporations fled overseas under these new NAFTA and global trade policies. Employment charts always show a bump in employment during Reagan/Clinton: Reagan grew the defense industry using our Social Security Trust, and Clinton grew the tech industry, building the technology network needed for global corporations overseas. Most of the same long-term unemployed missed that job market, and the 1990s saw those people removed from unemployment rate figures. That was A LOT OF AMERICAN CITIZENS. This is what made our social safety net programs grow, and they became a poverty trap instead of a hand up. Republicans are global corporate empire pols; it is the far right that loves corporate fascism, corporations controlling everything.
This was the time when the American people, and especially parents, were shouting FUND SCHOOLS NOT BOMBS. Reagan and Clinton defunded public education to start the march toward corporatization of our school system, and now teachers and school employees were being culled from the employment sector. University professors became part-time adjuncts, and schools lost valuable support staff, leaving remaining teachers stressed and often unable to handle all the functions in a classroom, especially in underserved schools where classroom aides and guidance/social-worker staffing were critical. The rise in employment during Reagan/Clinton drove the push from blue-collar manufacturing to high-skill manufacturing, and blue-collar workers became the long-term unemployed. Reagan/Clinton had plenty of high-skilled US grads to send to these industries because US UNIVERSITIES HAD NOT YET BEEN ATTACKED BY CORPORATE DOWNSIZING.
GROWING AMERICAN NUCLEAR CAPACITY TO SUPER-SIZE WAS A STRATEGY TOWARD ENDING NUCLEAR WEAPONS. HOW DID THAT WORK? ONLY A REPUBLICAN WOULD USE THAT LOGIC.
President Reagan's Legacy and U.S. Nuclear Weapons Policy
By Paul Lettow
(Delivered February 6, 2006)
I have been asked to speak about President Ronald Reagan's efforts to eliminate the possibility of nuclear war. That topic is long overdue for serious study.
A substantial amount of primary material is now available to those who wish to study the Reagan presidency. National Security Directives, memos between Reagan and his national security advisers, talking points for meetings, speech drafts, and transcripts of the Reagan-Gorbachev summits, among other documents, have been declassified and released. There is also much to be gained by examining public documents relating to Reagan, including his speeches and writings over the years (especially from before he entered the White House), which scholars have not often explored in detail. This material, together with evidence such as interviews, makes clear that Reagan was not, in Clark Clifford's memorable words, an "amiable dunce." Nor was he a cipher through which his advisers enacted their own agendas.
Reagan as Strategist
Reagan had a specific and unique strategic vision, and worked assiduously as President to see that vision realized. He was an original and often wildly unorthodox thinker, with little regard for the conventional wisdom of either the left or the right. He thought and read and wrote and spoke about nuclear weapons, and about Cold War policy, long before he ran even for the governorship of California.
Reagan was also a skillful wielder of power. As President he constantly pursued his own goals, whether his advisers approved or not, and even when they could not see what he was doing. He combined an idealism that bordered on utopianism with mental acuity and hardheadedness. He was much more complex than is generally known, and his personal influence on his administration was direct and extensive. Reagan's ideas served as the foundation for his administration's approach to the Cold War and to nuclear weapons. It is crucial for us to explore not just what Reagan did, but why.
Reagan as Visionary
Reagan, contrary to his image as a champion of the bomb, was a nuclear abolitionist. This is not a mere historical curiosity. Abolishing nuclear weapons was one of Reagan's fundamental goals for his presidency. His desire to rid the world of nuclear weapons underpinned much of what he did as President in terms of his Cold War policy. In many ways it is difficult to understand Reagan's presidency without taking into account his anti-nuclearism. But thus far that aspect of Reagan has been largely overlooked.
Reagan's anti-nuclearism was part and parcel of his larger vision for U.S. Cold War policy, one that he developed years before taking office as President and that differed from past U.S. policy. Reagan believed that the Soviet Union's economy and technological base represented key weaknesses in its Cold War competition with the United States, because of both the intrinsic flaws of the Soviet system and the exorbitant devotion of Soviet resources to the military. He thought that the United States should lead an expansive competition with the Soviets (politically, economically, and militarily) and that the Soviets could be compelled to change not just their behavior but even the nature of their system. He also believed that in the face of such a competition, the Soviets would be forced to negotiate deep cuts in nuclear weapons. Reagan sought not to manage the Cold War, but to prosecute and win it.
Clinton placed global expansion on steroids, and he did as Reagan had done: defunding and dismantling all Federal agencies of oversight and accountability, defunding public education, all while building overseas the global technology infrastructure for global Wall Street and corporations. This activity may have given a boost to employment, but these jobs were going more and more to foreign workers. This is when the movement of Americans into long-term unemployment grew larger as immigrant labor grew. Had Reagan/Clinton wanted to rebuild local economies as they sent corporations overseas, they would have built small businesses to compete with the monopolies they created, but they WANTED high US unemployment. The effects of downsized government at all levels now moved Federal, state, and local government workers into long-term unemployment, with corporate fraud and government corruption soaring. Trillions of dollars were stolen through corporate fraud during Clinton/Bush, all in the name of redistributing wealth back to the richest.
Clinton gained support from black citizens in cities because, as he privatized and downsized government, he encouraged black small businesses and the idea of minority contractors. The base that would have wanted to protect the government sector was now shouting to outsource and privatize so as to have businesses, not knowing that the goal of Reagan/Clinton and Republicans was a global corporate monopoly that would come back and hand all of that small-business outsourcing of government to global corporations, as has happened these several years under Obama. This also created the heavy pay-to-play in US city government that funneled city revenue to the rich and lost it to fraud and corruption. Fast forward to today: small businesses are being killed, and global corporations are now doing the corporate fraud and corruption.
If you do not know that the same corporate non-profits filling our US cities are the same global corporate non-profits working overseas in developing nations BUILDING A GLOBAL SOCIAL SYSTEM, and if you do not know that COMMON CORE and RACE TO THE TOP are global education policies being installed in developing nations around the world, BUILDING A GLOBAL EDUCATION SYSTEM, then you do not see the structures for NEW WORLD ORDER and global corporate tribunal rule, which was always the goal of Reagan/Clinton neo-liberalism.
Under Reagan/Clinton neo-liberalism, corporations went from earning profit with quality products and services to outright fraud and corruption against our US Treasury and state and local government revenue.
Nayan Chanda, editor of YaleGlobal Online, interviewed former US President William J. Clinton on October 31, 2003. The full text of the interview is presented here.
'We must build a global social system' – Bill Clinton
In an interview, former US President Bill Clinton offers ideas for the Middle East and other issues
YaleGlobal, 19 November 2003
Nayan Chanda: You once likened globalization to weather. Why are a lot of people now angry about globalization?
Bill Clinton: Lot of bad weather. First of all, the system is not working for about half the people on earth. There are lots of reasons for that. While the last twenty years have lifted more people out of poverty than ever before, there are more poor people because all the population growth in the world is in poor countries. The second problem is that globalization will not work in the end unless it spurs more internal economic growth, unrelated to trade, in poor countries. In Japan's heyday as a trading power in the eighties, over eighty percent of its GDP was internally generated. In Germany's heyday (Germany was the most trade-dependent rich country in the 20th century) two-thirds of its GDP was internally generated. In America's heyday of trade in the nineties, almost ninety percent of our GDP was internally generated. The first thing we need is to build systems for the developing countries that enable them to do a better job of building sustainable economies within their borders. Then they can take maximum advantage of trade and investment. Second, we have to recognize that we cannot have a global economic system without building a global social system.
That's why we need to have more labor and environmental provisions in the trade agreements, in my opinion. That's why we need to get rid of child labor and put children in school. That's why we need to have a developmental agenda that includes much higher levels of aid and debt relief and other efforts to support the developing world. Until we do those things it is going to be very difficult to sustain support in the developing world for globalization. There is also a lot of opposition to globalization in the advanced world, in countries where the social safety net is not strong, in countries where people lose their job because of trade and they are not immediately retrained and set up for something else. Basically, what happens is that information technology changes, predictably, have outpaced internal development in poor countries and the development of global social systems to follow the economic system. But you see it moving now, you see it moving in the global fund for HIV-AIDS, TB, and malaria, and the work of the Gates Foundation, and the fact that even Christian evangelicals of America support spending more money on AIDS. It is moving in the right direction, but we have got a good ways to go.
NC: You have pulled off a coup by doing a deal to reduce the price of drugs for HIV-AIDS. How did you do that?
BC: Well, we just started working at it. Ira Magaziner, who runs the AIDS project for me and did health care and information technology for me in the White House, has for the last thirty years or so had a business consulting firm. His specialty in the tough years in the eighties was going into firms all over the world and breaking down their processes to figure out how they could cut costs and increase productivity. He enlisted a lot of retired business executives, and we went to these companies and asked them if we could work with them to cut costs. If we could increase their profit margin by cutting costs and increasing their volume, then we asked them, "If those two things happen, would you cut the price?" That's essentially what happened. With the promise of higher volume and more productive manufacturing they can sell these drugs at $139 per person a year and still make money.
NC: What was the earlier price?
BC: There was one small bit of drugs being sold for $255 a year, but most of the generic drugs in the world were selling for between $350 and $500 a year. For example, in the Bahamas, when I went to work there, we cut their cost from $3,500 a year down to $493. The more typical reduction is from $400 or $500 a year down to around $140. It's going to make a huge difference.
NC: How are you going to fund the program?
BC: For one thing, no matter what the price is, if nobody is funding it, it won't matter. The number of people who are getting the medicine is so small it is disgraceful. In the countries where we are working, we are attempting to get wealthy countries to sponsor them. For example, Ireland and Canada are working in Tanzania and Mozambique. We are trying to get the Belgians to help, the Norwegians and Swedes to help. I hope the British, the French, and the Japanese and others will participate. We also have agreed to work with the World Health Organization, which has a very ambitious goal of adding 3 million more people to treatment in the next couple of years. The contracts that I have with all these companies include the ability of our foundation to help buy these drugs for nations in which we are not working but where the World Health Organization and others, like the Global Fund, want to provide medicine.
NC: The gulf between Europe and the US has grown so much in the past year – over the environment, over GMO, and finally over Iraq. Is the transatlantic alliance doomed?
BC: No, it isn't. Because we have too much in common in terms of values and interests. We also have supported the expansion of the EU and the expansion of NATO, the ending of ethnic cleansing in Bosnia and Kosovo, and the work we have done together in Northern Ireland. So there have been a lot of positive things. GMO is a particularly difficult issue because it is hard to sort out the science, and the fear of environmentally dangerous food, from the desire to preserve the present structure of agriculture in some European countries, which I sympathize with but which may not have anything to do with GMO.
And of course there is a difficult problem we had with the Continent over Iraq when the United States decided not to let the UN inspectors finish before starting the conflict, which I think was a mistake. And then the French and the Germans said they would never support deposing Saddam Hussein as long as the inspectors were there even if he didn't cooperate, which I think was a mistake. I think everybody in the whole mix except for Tony Blair basically mishandled that. But we are where we are. I still believe that on balance we and Europe will be working more closely than ever before because we have no choice.
NC: Were you surprised that no WMD was found in Iraq?
BC: I have been a little surprised. I was not surprised that no weapons were found, but what I expected them to find was some of the unaccounted-for stocks. Let me be very specific here. I knew nothing about any of this nuclear business: Niger, the yellowcake and all that. But for eight years I monitored the UN's and our own intelligence and what the Iraqis had at the beginning of the first Gulf War, what was destroyed during the Gulf War, and what was destroyed in the inspection process. When two members of Saddam Hussein's family defected to Jordan and told us what he had, we confronted the Iraqis. They basically admitted that they had had it all along, and they gave up massive volumes of chemical and biological stocks and other related laboratory facilities. Then the boys went back home and were killed a month after they got back, which was a terrible mistake. We still continued to do these inspections.
Then in 1998 Saddam Hussein threw the inspectors out. At that point we knew that there were unaccounted-for stocks of at least two biological agents, botulinum and aflatoxin, and two chemical agents, VX and ricin. Then the US and Britain bombed for four days at the suspected sites of collection. We obviously had no idea whether we destroyed all the stuff, none of it, or something in between. I just didn't believe that we possibly had destroyed all of it, or that the record-keeping at that point was wrong, because at that time we were relying on the UN's numbers for what they thought he had. So when the conflict was over I assumed that Saddam Hussein still had something; otherwise he could have easily proved he didn't to Hans Blix, and there would have been no war. I assume that if he didn't do it, he either thought that war wouldn't happen anyway after the British-French-Russian positions had been announced, or he was afraid to acknowledge to his neighbors that he didn't have it, because he thought that somehow it made him more powerful to the Israelis, to the Iranians, and to anybody else. I still don't know what the truth is. Maybe it's still there, maybe it is buried, maybe he sent it to Syria. No one really knows. Because of the eight years of intelligence I saw, and because I didn't think that we could have gotten all of it in the bombing in 1998, I did assume that at some point we would find something.
NC: So you think it can still be found?
BC: Yes. Still might be. Look, they are still finding canisters and pipes and weapons in underground caves. So they still might find something. It's a very large country. I never knew anything about nuclear materials or weapons. I didn't know that the biological or chemical agents had been weaponized – that is, turned into weapons. But I did believe that there were some biological and chemical agents based on eight years of intelligence.
NC: What can one do to bring the international community back to support the US Iraq policy?
BC: I think the US took a big step by supporting the UN Secretary-General Kofi Annan's donor conference, which had two funds – one of which will be solely administered by the United Nations. That's a big step in the right direction. I would still like to see the security force internationalized under the UN. But I recognize that the United States has a special responsibility now that we have done what has been done.
It seems to me the best thing to do would be to try to get NATO designated by the United Nations as the security force acting on behalf of the UN. The US could still have a dominant but not necessarily controlling position if the commander of NATO is an American. That would give the Germans, the French, and the Canadians a way to come into Iraq and not feel that they were part of a unilateral enterprise. That would show the Iraqis that there were people in there now who did not support our policy, just trying to keep law and order and helping to move them toward self-government and restore the economy of the country. It would reduce the temptation to see our forces simply as targets, not only for elements of the Baathists who may still be hanging around but for any terrorist group that might come across the large borders of Iraq just to cause mischief, because we are in an unsympathetic position in that part of the world now. I like what has been done by Secretary Powell at the donor conference in supporting the UN having its own program in Iraq, but I would like to see him continue to do that. We don't have to control Iraq to save it. Whether you think we did the right or wrong thing, we now have a vested interest in spending money and investing in Iraq, trying to get people's lives back to normal and helping them to become self-governing. We will have much more credibility if we internationalize more of both the nation-building and the security.
NC: How does one get back to the Middle East roadmap?
BC: First we need a Palestinian partner. We've got to have a functioning government headed by Abu Ala, the new prime minister designate. We need to let him deal with Chairman Arafat however he decides is best. As long as we believe Abu Ala is dealing with us in good faith, we should do that. I think the Israelis and Americans were expecting Abu Mazen, the previous prime minister, to do something that could not be done, which is to keep Arafat in a figurehead position, stripping him of all authority, and not involve him in negotiations. That way, Arafat has no incentive to have the security forces help maintain peace. If we are going to deal with the PLO, then we'll have to deal with it as we find it. President Bush did a good thing in insisting that the Palestinians get a prime minister so that we have somebody we can negotiate with who is capable of saying 'yes'.
On the other hand, we cannot expect our negotiating partner to do something that no Palestinian can do, which is to represent the PLO and the Palestinian elements and in effect turn Arafat into a totally powerless figurehead. Nobody has got the power to do that. In the end you have to negotiate with your partners as you find them. Then I think it is important – if the roadmap is going to mean anything – that the Israelis see a more calm security environment and show some substantive movement, and that the Americans get some concrete financial aid to the Palestinians. President Bush has talked about doing that, and other Republicans have said they would support it, so I think that's what ought to be done. Meanwhile, this new comprehensive peace proposal by the Israelis and Palestinians who are in the moderate pro-peace camp could get some legs into this conference in late November. If a lot of people would come and endorse it and push it, it could increase the parameters of what is considered possible for both the Palestinians and the Israelis.
Right now this looks like it's the pro-peace Israelis in conflict with the Sharon government because they're ready to give up more land more quickly. But that's not exactly true, or that's not the whole truth. These Palestinians have made it okay for Abu Ala to make a compromise on the right of return, which is the fundamental thing they have to do. They have got to get there. The idea behind the Oslo Accord was that there would be two states to share the future. Israel will be majority but not exclusively Jewish. The new state of Palestine will be majority but not exclusively Palestinian Muslim. There'll be Christians there; there can even be some Jews there that stayed behind in the settlements. But they can't have an unlimited right of return both in the new state of Palestine and in the old Israel, which will now be about half the size it was. So I think that these Palestinians have said, "We at least are willing to compromise on our part." I think we need to stoke this process and leave a little bit just for public debate, while recognizing that for most Israelis, the Sharon government, and a lot of American Jews, the trauma of the last three years has made it difficult for them to think about giving that much really quickly. They want to see a little capacity to maintain peace and fight terror on the part of the Palestinian government.
Clinton's neo-liberal global market policies were enacted in the 1990s, but the brunt of employment losses from these policies came in the 2000s. Bush certainly worked to make unemployment worse, but Clinton's global market policies, NAFTA et al., made unemployment soar. Through the 2000s, long-term unemployment again took more and more US citizens out of the Federal unemployment rate figures, which look only at how many citizens are receiving UNEMPLOYMENT BENEFITS.
This is when the Federal unemployment figures diverged from coming close to real unemployment to being totally disconnected from real unemployment. Before Reagan/Clinton pushed global markets and corporate monopolies, if Americans lost a job they could usually find another within weeks, all while receiving unemployment benefits, so these numbers were close to real unemployment. Remember, unemployment benefits came with FDR and the New Deal, and figures have been kept since then. Everyone asks:
WHY DID WE THINK BUREAU OF LABOR STATISTICS UNEMPLOYMENT FIGURES WERE GOOD BACK THEN AND NOW THEY ARE NOT?
It is because of the huge rise of the long-term unemployed over these few decades; these citizens are no longer counted in the unemployment rate figures.
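The mechanics of that undercount can be shown with a small sketch. The numbers below are made up for illustration, and the formulas follow the standard labor-force convention (in which people who stop actively searching drop out of both the numerator and the denominator of the headline rate), rather than any benefit-claims count:

```python
# Illustration with hypothetical numbers: how a headline unemployment rate
# shrinks when long-term jobless people are no longer counted.

def headline_rate(employed, searching):
    """Rate counting only active job-seekers as unemployed."""
    labor_force = employed + searching
    return 100.0 * searching / labor_force

def broader_rate(employed, searching, gave_up):
    """Rate that adds back people who stopped searching entirely."""
    labor_force = employed + searching + gave_up
    return 100.0 * (searching + gave_up) / labor_force

employed = 140_000_000   # hypothetical
searching = 7_000_000    # counted as unemployed
gave_up = 7_000_000      # long-term jobless who stopped searching

print(f"headline: {headline_rate(employed, searching):.1f}%")            # 4.8%
print(f"broader:  {broader_rate(employed, searching, gave_up):.1f}%")    # 9.1%
```

With these invented figures, an equal number of counted and uncounted jobless nearly doubles the rate, which is the gap the text is describing between the headline figure and real joblessness.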
Global market unemployment soared under Bush, as did manufacturing bankruptcies, because when US manufacturing went overseas to cheap labor, regional manufacturing could not compete. Actually, it could have, with help from government that would not happen under a Republican Clinton/Bush economic policy. FDR and the New Deal created the Small Business Administration and sent funding to subsidize the rebuilding of manufacturing, and the same can be done now in rebuilding our local US city economies with small manufacturing factories.
Aughts were a lost decade for U.S. economy, workers
By Neil Irwin
Washington Post Staff Writer
Saturday, January 2, 2010
For most of the past 70 years, the U.S. economy has grown at a steady clip, generating perpetually higher incomes and wealth for American households. But since 2000, the story is starkly different.
The past decade was the worst for the U.S. economy in modern times, a sharp reversal from a long period of prosperity that is leading economists and policymakers to fundamentally rethink the underpinnings of the nation's growth.
It was, according to a wide range of data, a lost decade for American workers. The decade began in a moment of triumphalism -- there was a current of thought among economists in 1999 that recessions were a thing of the past. By the end, there were two, bookends to a debt-driven expansion that was neither robust nor sustainable.
There has been zero net job creation since December 1999. No previous decade going back to the 1940s had job growth of less than 20 percent. Economic output rose at its slowest rate of any decade since the 1930s as well.
Middle-income households made less in 2008, when adjusted for inflation, than they did in 1999 -- and the number is sure to have declined further during a difficult 2009. The Aughts were the first decade of falling median incomes since figures were first compiled in the 1960s.
And the net worth of American households -- the value of their houses, retirement funds and other assets minus debts -- has also declined when adjusted for inflation, compared with sharp gains in every previous decade since data were initially collected in the 1950s.
"This was the first business cycle where a working-age household ended up worse at the end of it than the beginning, and this in spite of substantial growth in productivity, which should have been able to improve everyone's well-being," said Lawrence Mishel, president of the Economic Policy Institute, a liberal think tank.
Question of timing
The miserable economic track record is, in part, a quirk of timing. The 1990s ended near the top of a stock market and investment bubble. Three months after champagne corks popped to celebrate the dawn of the year 2000, the market turned south, a recession soon following. The decade finished near the trough of a severe recession.
But beyond these dramatic ups and downs lies an even more sobering reality: long-term economic stagnation. The trillions of dollars that poured into housing investment and consumer spending in the first part of the decade distorted economic activity.
Capital was funneled to build mini-mansions in Sun Belt suburbs, many of which now sit empty, rather than toward industrial machines or other business investment that might generate economic output and jobs for years to come.
"The problem is that we mismanaged the macroeconomy, and that got us in big trouble," said Nariman Behravesh, chief economist at IHS Global Insight. "The big bad thing that happened was that, in the U.S. and parts of Europe, we let housing bubbles get out of control. That came back to haunt us big-time."
The housing bubble both caused, and was enabled by, a boom in indebtedness. Total household debt rose 117 percent from 1999 to its peak in early 2008, according to Federal Reserve data, as Americans borrowed to buy ever more expensive homes and to support consumption more generally.
As Republican wealth and global corporate empire power drove all of this (corporate monopoly, steep small-business decline, high unemployment, great wealth inequity), Republicans have continuously shouted that everything else created the problem.
MAKE NO MISTAKE----ALL OF THESE REAGAN/CLINTON/AND NOW OBAMA ECONOMIC POLICIES WERE WRITTEN IN REPUBLICAN THINK TANKS---THEY OWN THIS MESS.
So now it becomes the fault of labor for demanding middle-class wages and benefits, which corporations paid just fine for decades while earning millions in profit; Republicans wanted billions in corporate profits. Then it was equal-protection laws, which required that people of color, women, and the disabled be given equal opportunity and access to education and jobs: THAT IS WHAT CREATED JOB LOSS, said Republicans, along with all those pesky government jobs filled with workers providing the oversight, accountability, and regulation keeping corporations from being MONOPOLIES. Then it was all those lazy welfare citizens wanting only to sit around and collect social benefits that became the problem, NOT THE GUTTING OF THE US ECONOMY AND JOBS BY HOSTILE AND ILLEGAL CORPORATE RAIDING, MONOPOLY STAGNATION, AND GLOBAL MARKETS. This move to global markets came with the move to end public subsidy and shift all that revenue to CORPORATE SUBSIDY, making global corporations even wealthier. Poverty deepened, making it harder for US citizens to be ready to work even if there were jobs.
If monopoly and anti-trust laws were enforced by the Federal government, and Congress shouted for that, we would be humming along with a strong social democratic real free-market economy! If healthy corporations were not illegally allowed to be torpedoed by venture capitalists pretending 'efficiency' was the problem when they simply wanted to create industry monopolies (which is what Reagan/Clinton/Bush/Obama as Presidents and our Congress were supposed to STOP), we would have a humming, strong social democratic real free-market economy!
Below is the Republican excuse for all this stagnation and job loss.
THE FORTY YEAR SLUMP
by Harold Meyerson, American Prospect
The steady stream of Watergate revelations, President Richard Nixon’s twists and turns to fend off disclosures, the impeachment hearings, and finally an unprecedented resignation—all these riveted the nation’s attention in 1974. Hardly anyone paid attention to a story that seemed no more than a statistical oddity: That year, for the first time since the end of World War II, Americans’ wages declined.
Since 1947, Americans at all points on the economic spectrum had become a little better off with each passing year. The economy’s rising tide, as President John F. Kennedy had famously said, was lifting all boats. Productivity had risen by 97 percent in the preceding quarter-century, and median wages had risen by 95 percent. As economist John Kenneth Galbraith noted in The Affluent Society, this newly middle-class nation had become more egalitarian. The poorest fifth had seen their incomes increase by 42 percent since the end of the war, while the wealthiest fifth had seen their incomes rise by just 8 percent. Economists have dubbed the period the “Great Compression.”
This egalitarianism, of course, was severely circumscribed. African Americans had only recently won civil equality, and economic equality remained a distant dream. Women entered the workforce in record numbers during the early 1970s to find a profoundly discriminatory labor market. A new generation of workers rebelled at the regimentation of factory life, staging strikes across the Midwest to slow down and humanize the assembly line. But no one could deny that Americans in 1974 lived lives of greater comfort and security than they had a quarter-century earlier. During that time, median family income more than doubled.
Then, it all stopped. In 1974, wages fell by 2.1 percent and median household income shrank by $1,500. To be sure, it was a year of mild recession, but the nation had experienced five previous downturns during its 25-year run of prosperity without seeing wages come down.
What no one grasped at the time was that this wasn’t a one-year anomaly, that 1974 would mark a fundamental breakpoint in American economic history. In the years since, the tide has continued to rise, but a growing number of boats have been chained to the bottom. Productivity has increased by 80 percent, but median compensation (that’s wages plus benefits) has risen by just 11 percent during that time. The middle-income jobs of the nation’s postwar boom years have disproportionately vanished. Low-wage jobs have disproportionately burgeoned. Employment has become less secure. Benefits have been cut. The dictionary definition of “layoff” has changed, from denoting a temporary severance from one’s job to denoting a permanent severance.
As their incomes flat-lined, Americans struggled to maintain their standard of living. In most families, both adults entered the workforce. They worked longer hours. When paychecks stopped increasing, they tried to keep up by incurring an enormous amount of debt. The combination of skyrocketing debt and stagnating income proved predictably calamitous (though few predicted it). Since the crash of 2008, that debt has been called in.
All the factors that had slowly been eroding Americans’ economic lives over the preceding three decades—globalization, deunionization, financialization, Wal-Martization, robotization, the whole megillah of nefarious -izations—have now descended en masse on the American people. Since 2000, even as the economy has grown by 18 percent, the median income of households headed by people under 65 has declined by 12.4 percent. Since 2001, employment in low-wage occupations has increased by 8.7 percent while employment in middle-wage occupations has decreased by 7.3 percent. Since 2003, the median wage has not grown at all.
The middle has fallen out of the American economy—precipitously since 2008, but it’s been falling out slowly and cumulatively for the past 40 years. Far from a statistical oddity, 1974 marked an epochal turn. The age of economic security ended. The age of anxiety began.
The economic landscape of the quarter-century following World War II has become not just unfamiliar but almost unimaginable today. It constitutes what historian Vaclav Smil has termed “a remarkable singularity”: The United States came out of World War II dominating the world’s production and markets, and its unprecedented wealth was shared broadly among its citizens.
The defining practice of the day was Fordism (named after Henry Ford), under which employers paid their workers enough that they could afford to buy the goods they mass-produced. The course of Fordism never ran as smoothly as it may seem in retrospect. Winning pay increases in halcyon postwar America required a continual succession of strikes.
At the commanding heights of the U.S. economy, the largest American company, General Motors, and the most militant and powerful American union, the United Auto Workers, had fought an epochal battle in the winter of 1945–1946, the UAW’s members staying off the job for nearly four months in what proved to be a vain attempt to win a co-equal say in the company’s management. In 1948, with GM fearing another massive disruption and the UAW willing to give up on co-management, the two sides reached a pattern-setting agreement: In return for a two-year no-strike pledge from the union, GM signed a contract granting its workers not only a sizable raise but an annual cost-of-living adjustment that matched the rate of inflation, and an “annual improvement factor” that raised pay in tandem with the increase in the nation’s productivity. In 1950, after a brief strike, the two sides signed a five-year contract—dubbed the Treaty of Detroit—that extended the no-strike pledge, the raise, the cost-of-living adjustment, and the annual improvement factor and added health coverage and more generous pensions. As the economy grew, so would the autoworkers’ paychecks.
Within a few years, the increases that GM had agreed to became standard in half the union contracts in America, though workers still had to strike to win these gains. In 1952, 2.7 million workers participated in work stoppages. Throughout the 1950s, the yearly number of major strikes averaged more than 300. The largest strike in American history, in terms of work hours lost, occurred in 1959, when 500,000 steelworkers walked off the job for 116 days to secure increased wages and improved health and pension coverage.
Management was no fan of these disruptions, but they were regarded as the normal ebb and flow of labor relations. Indeed, throughout the 1940s, ’50s, and ’60s, many corporate executives believed that their workers’ well-being mattered. “The job of management is to maintain an equitable and working balance among the claims of the various directly affected interest groups: stockholders, employees, customers, and the public at large,” the chairman of Standard Oil of New Jersey (later Exxon) said in 1951. Once hired, a good worker became part of the family, which entitled him to certain rewards. “Maximizing employee security is a prime company goal,” Earl Willis, General Electric’s manager of employee benefits, wrote in 1962.
During these years, the GI Bill enabled far more Americans to attend college than ever had before. The ranks of America’s professionals swelled, and America’s income swelled with them. But the contracts enjoyed by the nation’s union members—who then made up a third of the nation’s workforce—boosted personal income in the U.S. even more. Indeed, these contracts covered so many workers that their gains spilled over to nonmembers as well. Princeton economist Henry Farber calculated that the wages of workers in nonunion firms in industries that were at least 25 percent unionized were 7.5 percent higher than the wages of comparable workers in industries with no union presence.
In the three decades following World War II, the United States experienced both high levels of growth and rising levels of equality, a combination that confounded historical precedent and the theories of conservative economists. By 1973, the share of Americans living in poverty bottomed out at 11.1 percent. It has never been that low since.
By the early 1980s, the Treaty of Detroit had been unilaterally repealed. Three signal events—Federal Reserve Chairman Paul Volcker’s deliberately induced recession, President Ronald Reagan’s firing of striking air-traffic controllers, and General Electric CEO Jack Welch’s declaration that his company would reward its shareholders at the expense of its workers—made clear that the age of broadly shared prosperity was over.
The abrogation didn’t arrive unheralded. Beginning in 1974, inflation had begun to plague the American economy. The 1970s were framed by two “oil shocks”: the OPEC embargo of 1973 and the U.S. boycott of Iranian oil after the mullahs swept to power in 1979. During the decade, the price of a barrel of oil rose from $3 to $31. Productivity, which had been rising at nearly a 3 percent annual clip in the postwar decades, slowed to a 1 percent yearly increase during the 1970s. Europe and Japan had recovered from the devastation of World War II, and Japanese imports, chiefly autos, doubled during the late ’60s. In 1971, the U.S. experienced its first trade deficit since the late 1800s. It has run a trade deficit every year since 1976.
Profits of America’s still largely domestic corporations suffered. The Dow Jones Industrial Average, which had inched past 1,000 in 1972, tanked with the oil embargo the following year and didn’t climb back to that level for another decade. Although the biggest contributor to inflation was the increase in energy prices, a growing number of executives and commentators laid the blame for the economy’s troubles on the wages of American workers. “Some people will have to do with less,” Business Week editorialized. “Yet it will be a hard pill for many Americans to swallow—the idea of doing with less so that big business can have more.”
With the second oil shock, inflation surged to 13.5 percent. Volcker responded by inducing a recession. “The standard of living of the average American,” he said, “has to decline.” Raising the federal funds interest rate to nearly 20 percent throughout 1981, the Fed chairman brought much of American business—particularly the auto industry, where sales collapsed in the face of high borrowing costs—to a standstill. By 1982, unemployment had risen to a postwar high of 10.8 percent.
The industrial Midwest never recovered. Between 1979 and 1983, 2.4 million manufacturing jobs vanished. The number of U.S. steelworkers went from 450,000 at the start of the 1980s to 170,000 at decade’s end, even as the wages of those who remained shrank by 17 percent. The decline in auto was even more precipitous, from 760,000 employees in 1978 to 490,000 three years later. In 1979, with Chrysler on the verge of bankruptcy, the UAW agreed to give up more than $650 million in wages and benefits to keep the company in business. General Motors and Ford were not facing bankruptcy but demanded and received similar concessions. In return for GM pledging not to close several U.S. factories, the UAW agreed to defer its cost-of-living adjustment and eliminate its annual improvement increases. Henceforth, as the productivity of the American economy increased, the wages of American workers would not increase with it. Tide and boats parted company.
Democrats as well as Republicans responded to the inflation of the late 1970s with policies that significantly reduced workers’ incomes. The Democrats’ solution of choice, promoted by both President Jimmy Carter and his liberal rival Senator Edward Kennedy, was deregulation. At their initiative, both trucking and airlines were deregulated, lowering prices and wages in both industries. In the quarter-century following 1975, drivers’ pay fell by 30 percent. Wage declines followed in other deregulated industries, such as telecommunications.
If Volcker’s and Carter’s attacks on unions were indirect, Reagan’s was altogether frontal. In the 1980 election, the union of air-traffic controllers was one of a handful of labor organizations that endorsed Reagan’s candidacy. Nevertheless, they could not reach an accord with the government, and when they opted to strike in violation of federal law, Reagan fired them all. (His actions contrasted sharply with those of President Nixon, who responded to an illegal wildcat strike of postal workers in 1970 by negotiating a settlement and letting them return to their jobs.)
Reagan’s union busting was quickly emulated by many private-sector employers. In 1983, the nation’s second-largest copper-mining company, Phelps Dodge, ended its cost-of-living adjustment, provoking a walkout of its workers, whom it replaced with new hires who then decertified the union. The same year, Greyhound Bus cut wages, pushing its workers out on strike, then hired replacements at lower wages. Also in 1983, Louisiana Pacific, the second-largest timber company, reduced its starting hourly wage, forcing a strike that culminated in the same kind of worker defeats seen at Phelps Dodge and Greyhound. Eastern Airlines, Boise Cascade, International Paper, Hormel meatpacking—all went down the path of forcing strikes to weaken or destroy their unions.
In the topsy-turvy world of the 1980s, the strike had become a tool for management to break unions. Save in the most exceptional circumstances, unions abandoned the strike. The number of major strikes plummeted from 286 a year in the 1960s and 1970s, to 83 a year in the 1980s, to 34 a year in the 1990s, to 20 a year in the 2000s. The end of the strike transformed the American economy. From the 1820s through the 1970s, workers had two ways to bid up their wages: threatening to take their services elsewhere in a full-employment economy and walking off the job with their fellow workers until managers met their demands. Since the early 1980s, only the full-employment-economy option has been available—and just barely. Save for the late 1990s, the economy has been nowhere near full employment.
The loss of workers’ leverage was compounded by a radical shift in corporations’ view of their mission. In August 1981, at New York’s Pierre Hotel, Jack Welch, General Electric’s new CEO, delivered a kind of inaugural address, which he titled “Growing Fast in a Slow-Growth Economy.” GE, Welch proclaimed, would henceforth shed all its divisions that weren’t No. 1 or No. 2 in their markets. If that meant shedding workers, so be it. All that mattered was pushing the company to pre-eminence, and the measure of a company’s pre-eminence was its stock price.
Between late 1980 and 1985, Welch reduced the number of GE employees from 411,000 to 299,000. He cut basic research. The company’s stock price soared. So much for balancing the interests of employees, stockholders, consumers, and the public. The new model company was answerable solely to its stockholders.
In the decade preceding Welch’s speech, a number of conservative economists, chiefly from the University of Chicago, had argued that the midcentury U.S. corporation had to contend with a mishmash of competing demands. Boosting the company’s share price, they contended, gave corporate executives a clear purpose—even clearer if those executives were incentivized by receiving their payments in stock. After Welch’s speech, the goal of America’s corporate executives became the elevation of the company’s—and their own—stock. If revenues weren’t rising, and even if they were, that goal could be accomplished by reducing wages, curtailing pensions, making employees pay more for their health coverage, cutting research, eliminating worker training, and offshoring production.
(After CEO Louis Gerstner announced in 1999 that IBM, long considered a model employer, would no longer pay its workers defined benefits and would switch to 401(k)s, corporate America largely abandoned paying for its employees’ secure retirements.)
By the end of the century, corporations acknowledged that they had downgraded workers in their calculus of concerns. In the 1980s, a Conference Board survey of corporate executives found that 56 percent agreed that “employees who are loyal to the company and further its business goals deserve an assurance of continued employment.” When the Conference Board asked the same question in the 1990s, 6 percent of executives agreed. “Loyalty to a company,” Welch once said, “it’s nonsense.”
In 1938, while campaigning, successfully, to persuade Congress to establish a federal minimum wage, President Franklin D. Roosevelt told a crowd in Fort Worth, “You need more industries in Texas, but I know you know the importance of not trying to get industries by the route of cheap wages for industrial workers.” In fact, Southern business and political leaders knew nothing of the sort. What prevented most American corporations from establishing facilities in the South was its oppressive weather and even more oppressive racial discrimination.
By the 1970s, the South was both air-conditioned and moderately desegregated. The decade was the first in the 20th century that saw more Americans moving into the region than leaving it. Indeed, during the 1970s, just 14 percent of newly created jobs were located in the Northeast and Midwest, while 86 percent were located in the Sunbelt.
The definitive Southern company, and the company that has done the most to subject the American job to the substandard standards of the South, has been Wal-Mart, which began as a single store in Rogers, Arkansas, in 1962. That year, the federal minimum wage, set at $1.15 an hour, was extended to retail workers, much to the dismay of Sam Walton, who was paying the employees at his fast-growing chain half that amount. Since the law initially applied to businesses with 50 or more employees, Walton argued that each of his stores was a separate entity, a claim that the Department of Labor rejected, fining Walton for his evasion of federal law.
Undaunted, Wal-Mart has carried its commitment to low wages through a subsequent half-century of relentless expansion. In 1990, it became the country’s largest retailer, and today the chain is the world’s largest private-sector employer, with 1.3 million employees in the United States and just under a million abroad. As Wal-Mart grew beyond its Ozark base, it brought Walton’s Southern standards north. In retail marketing, payroll generally constitutes between 8 percent and 12 percent of sales, but at Wal-Mart, managers are directed to keep payroll expenses between 5.5 percent and 8 percent of sales. Managers who fail at this don’t remain managers for long. While Wal-Mart claims the average hourly wage of its workers is $12.67, employees contend it is several dollars lower.
When a Wal-Mart opens in a new territory, it either drives out the higher-wage competition or compels that competition to lower its pay. David Neumark, an economist at the University of California, Irvine, has shown that eight years after Wal-Mart comes to a county, it drives down wages for all (not just retail) workers until they’re 2.5 percent to 4.8 percent below wages in comparable counties with no Wal-Mart outlets.
By controlling a huge share of the U.S. retail market, including an estimated 20 percent of the grocery trade, Wal-Mart has also been able to mandate reduced prices all along its worldwide supply chain. In response, manufacturers have slashed the wages of their employees and gone abroad in search of cheaper labor. The warehouse workers who unload the containers in which the company’s goods are shipped from China to the U.S. and repackage them for sale are retained by low-wage temporary employment agencies, though many of those workers have held the same job for years. Shunting its workers off to temp agencies is just one of the many ways Wal-Mart diminishes what it sees as the risk of unionization. When the employees in one Canadian store voted to unionize, Wal-Mart closed the store. When butchers in one Texas outlet voted to go union, Wal-Mart eliminated the meat department in that store and in every other store in Texas and the six surrounding states. But Wal-Mart’s antipathy to unions and affinity for low wages merely reflects the South’s historic opposition to worker autonomy and employee rights. By coming north, though, Wal-Mart has lowered retail-sector wages throughout the U.S.
The more recent influx of European- and Japanese-owned nonunion factories to the South has had a similar effect. In their homelands, Mercedes, Volkswagen, and Toyota work closely with unions, and the German companies pay their workers as much as or more than the most highly paid American autoworkers. When such companies move into the American South, however, they go native, not only paying their workers far less than they do in Europe or Japan but also opposing their efforts to form a union. (Under pressure from the German autoworkers union, however, Volkswagen has recently committed itself to establishing a consultative works council at its Tennessee plant. Such councils are standard at Volkswagen plants in Germany and other nations; in the U.S., the particulars of American labor law require that the company recognize the UAW as the workers’ representative.)
One way these factories reduce workers’ wages is not to employ them directly. By the estimate of one former manager, roughly 70 percent of the workers at Nissan’s plant in Smyrna, Tennessee, aren’t Nissan employees but rather are under contract to temporary employment-service companies that pay them roughly half the hourly wage of Nissan’s own employees. One academic survey found that while just 2.3 percent of manufacturing workers in 1989 were temps, by 2004 the number had risen to 8.7 percent.
Southern competition is one reason newer hires at the Detroit Three’s auto plants have hourly wages that top out between $16 and $19, while workers hired before the institution of the two-tier system can see their base pay rise to between $29 and $33 an hour. A cumulative effect of Wal-Martization is that incomes in the industrial Midwest have been dropping toward levels set in Alabama and Tennessee. According to Moody’s Analytics, the wage-and-benefit gap between Midwestern and Southern workers, which was $7 in 2008, had since shrunk to just $3.34.
As corporate executives came under pressure to reward share-holders by cutting labor costs, the revolution in transportation and communication enabled them to move production facilities to the developing world where workers came cheap. The flight of jobs to low-wage nations was accelerated by a series of trade accords, most prominently the North American Free Trade Agreement in 1993 and the extension of Permanent Normal Trade Relations to China in 2000.
The textile and apparel industry lost more than 900,000 jobs in the 1990s and 2000s. High-tech manufacturing was not spared, either. The computer and electronics-manufacturing sector lost an estimated 760,000 jobs during that time. By offshoring the production of its iPhone to the Chinese labor contractor Foxconn, Apple has realized a profit margin of 64 percent on each device, one of many reasons its stock price soared. From 2000 to 2010, the number of Americans employed in manufacturing shrank from 17.1 million to just 11.3 million. In 2011, the number of workers in the low-paying retail sector surpassed the number in manufacturing for the first time.
The decimation of manufacturing wasn’t due to a sharp acceleration of manufacturing productivity—indeed, productivity increases were higher in the previous decade, which saw less job loss. What made the difference was trade policy. Economist Rob Scott has calculated that the United States lost 2.4 million jobs just to China in the eight years following the passage of normalized trade relations.
Offshoring has had an even broader effect on the jobs that have remained behind. Alan Blinder, the Princeton economist who was vice chairman of the Federal Reserve in the 1990s, has estimated that roughly 25 percent of all American jobs are potentially offshorable, from producing steel to writing software to drafting contracts. This has placed a ceiling on wages in these and myriad other occupations that can be sent overseas.
Economists have long thought that labor’s share of the national income varied so little that it could be considered a constant. The immovability of labor’s share was called “Bowley’s Law,” after the British economic historian Arthur Bowley, who first identified it nearly a century ago.
In the wake of the economic collapse of 2008, Bowley’s Law was swept away—along with many of the economic standards that had characterized American life. Today, the share of the nation’s income going to wages, which for decades was more than 50 percent, is at a record low of 43 percent, while the share of the nation’s income going to corporate profits is at a record high. The economic lives of Americans today paint a picture of mass downward mobility. According to a National Employment Law Project study in 2012, low-wage jobs (paying less than $13.83 an hour) made up 21 percent of the jobs lost during the recession but more than half of the jobs created since the recession ended. Middle-income jobs (paying between $13.84 and $21.13 hourly) made up three-fifths of the jobs lost during the recession but just 22 percent of the jobs created since.
In 2013, America’s three largest private-sector employers are all low-wage retailers: Wal-Mart, Yum! Brands (which owns Taco Bell, Pizza Hut, and Kentucky Fried Chicken), and McDonald’s. In 1960, the three largest employers were high-wage unionized manufacturers or utilities: General Motors, AT&T, and Ford.
The most telling illustration of the decline of Americans’ work life may be that drawn by economists John Schmitt and Janelle Jones of the Center for Economic and Policy Research. They calculated the share of good jobs Americans held in 1979 and in 2010. If only because workers in 2010 were, on average, seven years older and more educated than their 1979 counterparts, they should have been doing better. The two economists devised three indices of a good job: that it paid at least the 1979 male median wage ($37,000 in 2010 dollars), provided health benefits, and came with a 401(k) or pension. By those standards, 27.4 percent of American workers had good jobs in 1979. Three decades later, that figure had dropped to 24.6 percent.
The decline of the American job is ultimately the consequence of the decline of worker power. Beginning in the 1970s, corporate management was increasingly determined to block unions’ expansion to any regions of the country (the South and Southwest) or sectors of the economy (such as retail and restaurants) that were growing. An entire new industry—consultants who helped companies defeat workers’ efforts to unionize—sprang up. Although the National Labor Relations Act prohibits the firing of a worker involved in a union-organizing campaign, the penalties are negligible. Firings became routine. Four efforts by unions to strengthen workers’ protections during the Johnson, Carter, Clinton, and Obama presidencies came up short. By 2013, the share of private-sector workers in unions declined to just 6.6 percent, and collective bargaining had been effectively eliminated from the private-sector economy.
The collapse of workers’ power to bargain helps explain one of the primary paradoxes of the current American economy: why productivity gains are not passed on to employees. “The average U.S. factory worker is responsible today for more than $180,000 of annual output, triple the $60,000 in 1972,” University of Michigan economist Mark Perry has written. “We’re able to produce twice as much manufacturing output today as in the 1970s, with about seven million fewer workers.” In many industries, the increase in productivity has exceeded Perry’s estimates. “Thirty years ago, it took ten hours per worker to produce one ton of steel,” said U.S. Steel CEO John Surma in 2011. “Today, it takes two hours.”
In conventional economic theory, those productivity increases should have resulted in sizable pay increases for workers. Where conventional economic theory flounders is its failure to factor in the power of management and stockholders and the weakness of labor. Sociologist Tali Kristal has documented that the share of revenues going to wages and benefits in manufacturing has declined by 14 percent since 1970, while the share going to profits has correspondingly increased. She found similar shifts in transportation, where labor’s share has been reduced by 10 percent, and construction, where it has been cut by 5 percent. What these three sectors have in common is that their rate of unionization has been cut in half during the past four decades. All of which is to say, gains in productivity have been apportioned by the simple arithmetic of power.
Only if the suppression of labor’s power is made part of the equation can the overall decline in good jobs over the past 35 years be explained. Only by considering the waning of worker power can we understand why American corporations, sitting on more than $1.5 trillion in unexpended cash, have used those funds to buy back stock and increase dividends but almost universally failed even to consider raising their workers’ wages.
So was the America of 1947–1974—the America of the boomers’ youth—the great exception in the nation’s economic history, a golden age that came and went and can never come again? Were the conditions that led to the postwar boom and its egalitarian prosperity so anomalous that the American economic success story will continue to recede in our rearview mirrors? Are the forces of globalization and robotization inevitably going to raise the incomes of the few and depress the incomes of the many?
That the American supremacy over the global economy in the three decades after World War II was a one-time phenomenon is a given. That globalization and automation have made and will continue to make massive changes in America’s economy is obvious. But it’s worth noting that one high-wage advanced manufacturing nation has seen its workers thrive in the past 40 years: Germany. Like American multinationals, all the iconic German manufacturers—Daimler, Siemens, BASF, and others—have factories scattered across the globe. Unlike the American multinationals, however, they have kept their most remunerative and highest-value-added production jobs at home. Nineteen percent of the German workforce is employed in manufacturing, well above the 8 percent of the American workforce. German industrial workers’ wages and benefits are about one-third higher than Americans’. While the U.S. runs the world’s largest trade deficit, Germany runs a surplus second only to China’s and occasionally surpasses it.
To be sure, Germany’s identity is more wrapped up in manufacturing than America’s is, but that’s because of national arrangements that not just bolster manufacturing through such policies as excellent vocational education but also give workers more power. By law, all German companies with more than 1,000 employees must have equal numbers of worker and management representatives on their corporate boards. For the most part, German companies don’t get their funding from issuing stocks and bonds but rather by generating investment either internally or by borrowing from banks; the role of the shareholder is insignificant. By practicing a brand of capitalism in which employees and communities still matter, Germany has been able to subject itself to the same forces of globalization that the United States has without substantially diminishing its workers’ power and income.
What has vanished over the past 40 years isn’t just Americans’ rising incomes. It’s their sense of control over their lives. The young college graduates working in jobs requiring no more than a high-school degree, the middle-aged unemployed who have permanently opted out of a labor market that has no place for them, the 45- to 60-year-olds who say they will have to delay their retirement because they have insufficient savings—all these and more are leading lives that have diverged from the aspirations that Americans until recently believed they could fulfill. This May, a Pew poll asked respondents if they thought that today’s children would be better or worse off than their parents. Sixty-two percent said worse off, while 33 percent said better. Studies that document the decline of intergenerational mobility suggest that this newfound pessimism is well grounded.
The extinction of a large and vibrant American middle class isn’t ordained by the laws of either economics or physics. Many of the impediments to creating anew a broadly prosperous America are ultimately political creations that are susceptible to political remedy. Amassing the power to secure those remedies will require an extraordinary, sustained, and heroic political mobilization. Americans will have to transform their anxiety into indignation and direct that indignation to the task of reclaiming their stake in the nation’s future.