It’s hardly surprising once you think about it, but you probably don’t think about it, so it’s worth noting the role slavery had in building so many of our older institutions of higher education. Finally, some institutions are becoming serious about remembering that history and commemorating the unfree laborers who worked on these campuses. Brown University is probably the most famous case, but the University of South Carolina has also done good work on this. It’s important stuff and we need more of it.
The debate over inequality often focuses on income, or how much money people earn. But perhaps even more important is wealth, or how much money people have. A person’s income can vary significantly from one year to the next, but wealth tends to be more durable, not just from year to year but from generation to generation. In this paper, the authors construct a new time series on wealth from Federal Reserve, Internal Revenue Service and other data. They find that wealth inequality, like income inequality, has increased significantly in recent decades. In 1978, the richest 0.1 percent of households held about 7 percent of all household wealth; in 2012, they controlled 22 percent. (The top 0.1 percent in 2012 comprised about 160,000 families with net assets above $20 million.) But wealth inequality still isn’t as high as it was in 1929, when the top 0.1 percent had 25 percent of all wealth. Overall, the authors find that the bottom 90 percent haven’t seen any increase in wealth since the mid-1980s after adjusting for inflation.
Daddy Warbucks for President!
On October 28, 1793, Eli Whitney submitted a patent for his invention known as the cotton gin. Perhaps more than any other technology in American history, this invention profoundly revolutionized American labor. Creating the modern cotton industry meant the transition from agricultural to industrial labor in the North with the rise of the factory system, and the rapid expansion and intensification of slavery in the South to produce the cotton. The cotton gin went far toward creating the 19th century American economy and sharpened the divides in work and labor between regions of the United States, problems that would eventually lead to the Civil War.
People had long known of the versatile uses of cotton. The plant produced fibers that could be used for many things, but most usefully clothing, which in the 18th century was often scratchy and uncomfortable for everyday people who could not afford finer fabrics, including cotton. The problem was the seed inside the cotton boll, to which the plant’s fibers stuck. The labor it took to separate fiber from seed by hand made cotton a luxury good. The cotton gin solved that problem by mechanically separating the fibers from the seeds. This made cotton a universal product and the production of it an international business that would radically change the entire United States and transform work.
Whitney, from Massachusetts, became interested in the problems of cotton production while visiting a plantation in Georgia. Helping out the plantation’s owner (the widow of Revolutionary War general Nathanael Greene), he created the cotton gin. On October 28, he sent his patent application to Secretary of State Thomas Jefferson. He hoped to make a lot of money on it, but American patent law was weak at the time and others copied his design. Quickly the invention spread around the South.
The cotton gin immediately transformed the South. By 1815, cotton became the nation’s leading export, tying the Southern elite to the factory owners and investors of Great Britain. By 1840, it was worth more than all other American exports combined. The system of chattel slavery that had undergirded the colonial tobacco economy had become heavily strained during the 18th century. Declining soil fertility and the expansion of tobacco production around the British empire meant that the plantation owners were not making the money off of slavery that they had 100 years earlier. The lack of an economic imperative for the institution went far toward the abolition of slavery in the North after the American Revolution. In the South, it combined with Enlightenment ideals to at least make plantation owners question the institution. Thomas Jefferson and Patrick Henry both admitted the institution was bad but could not imagine freeing their slaves because of the lives of luxury the system provided them. Others were slightly less selfish and either freed their slaves in the 1780s or, like George Washington, provided for their freedom upon the master’s death. The general assumption, though, was that slavery was going to disappear, even if Georgia and South Carolina wouldn’t like it much. As Oliver Ellsworth said at the Constitutional Convention, “Slavery in time will not be a speck in our country.”
The cotton gin ended this equivocation on slavery among the plantation elite and destroyed the myth, cherished in the North, that slavery would disappear on its own. Combined with the conquest of rich land in the hot climates of Alabama, Mississippi, Georgia, and Louisiana over the next few decades, it gave planters new ways to make money using slaves. The southern discussion of slavery transformed from a “necessary evil” to a “positive good.” Thus began the “classic” period of American chattel slavery, replete with the large-scale plantation agriculture you probably think of when envisioning slavery. The lives of slaves were terrible under this system, with rape, beatings, whippings, murder, and the breaking up of families normal parts of life. Further advances in cotton farming created breeds that incentivized working slaves as close to death as possible while keeping them just alive enough to keep picking. As the nation moved toward the Civil War, the southern labor system wrought by the cotton gin was becoming only more entrenched and more brutal for its laborers. Slaves resisted in any number of ways: breaking tools, running away from masters, even revolt, such as Nat Turner’s uprising or Denmark Vesey’s supposed conspiracy. But by and large the system of racialized violence that kept the labor force in place doomed slaves to miserable lives. In 1787, there were 700,000 slaves in the United States. In 1860, there were 4 million and rising. Around 70 percent of those slaves were involved in cotton production.
In the North, the revolution caused by the cotton gin was just as profound. Samuel Slater had opened the United States’ first modern factory, in Pawtucket, Rhode Island, a couple of years earlier. The textile industry would explode in the next several decades with all the newly available cotton. By the 1820s, New England had already undergone a massive economic shift toward textile mills that moved the region from rural to urban, with courts and politicians serving the interests of the industrialists over workers, farmers, and fishers. At first, this transformation clustered along the region’s copious waterways: at Pawtucket, Lowell, and Manchester. But further technological advances in steam power meant owners could build factories anywhere, and they dotted the region after the Civil War.
The impact upon northern workers was truly revolutionary. The agricultural economy certainly did not disappear, but it soon became secondary to the textile factories in much of the region. The wealth spawned by textiles created other industries and new transportation technologies like the steamship, canal, and railroad, and by 1860, the growing northern industrial might had reshaped the nation. It took workers out of the farms and small shops that defined 18th century work and into giant factories. Eventually, the Industrial Revolution that the cotton gin brought to the U.S. meant that workers would lose control over their own labor, the ability to set their own hours of work, the possibility of drinking on the job, and the artisanship of American craft labor. Replacing it would be the factory floor, the time clock, and the foreman. Much of this lay decades in the future in 1793, but the transformations began soon after.
It also brought women into the economy in new ways. Supposedly because of their nimble fingers but really because employers could pay them less, women became desirable workers in the cotton factories. This upended gender roles and, when American women resisted the treatment they faced in the factories, spurred the migration of immigrants from Ireland and then eastern and southern Europe to fill these low-paid jobs. In the early factories, work was hot, stuffy, and exhausting, with 14-16 hour days not uncommon. The creation of textile work as women’s work, and thus highly exploitative, never ended and continues today in the sweatshops of Bangladesh, Honduras, and many other nations. The fight to tame the conditions of industrial labor wrought, in part, by the cotton gin remains underway today.
This is the 123rd post in this series. Previous posts are archived here.
Ah, Saint Ronnie (with an assist from everybody’s favorite genteel segregationist, William F. Buckley Jr.):
Barack Obama displayed inspiring leadership on Friday. He also promoted public health, fought bigotry, and helped calm raging paranoia. His heroic act? He hugged somebody.
Nina Pham, the first person to be infected with Ebola within the United States, had just been declared disease-free and discharged from the National Institutes of Health. Obama is a rational, science-friendly guy, so he knew she wasn’t any danger to him. It didn’t take courage to hug her.
And yet, another modern president failed a similar test. Facing the greatest public health crisis of his administration, Ronald Reagan was not heroic. He was a dithering coward.
The hateful, homophobic, racist response to the AIDS crisis is one of the most shameful episodes in recent American history. Within a few years after the first AIDS cases were reported in 1981, scientists knew the disease was transmitted primarily by sex, blood transfusions, and shared needles.
That knowledge didn’t stop the prejudice and fear mongering. HIV-positive people were fired from their jobs, forbidden from entering the country, kicked out of the military. Jerry Falwell claimed AIDS was “God’s punishment not just for homosexuals” but for a “society that tolerates homosexuals.” William F. Buckley Jr. wrote in a widely syndicated column that people diagnosed with AIDS should have that fact tattooed on their buttocks. Schools refused to enroll children with HIV. When a judge ordered a Florida school to admit young brothers Ricky, Randy, and Robert Ray, their neighbors burned their house down.
But Ronald Reagan? He didn’t do a goddamn thing. He was president when the first cases were reported. He was president when Congress, the National Academy of Sciences, and anybody with a sick loved one or a conscience called for the federal government to do more to fight the medical and social crisis. He let his reprehensible press secretary Larry Speakes, Reagan’s face to the media, repeatedly joke about AIDS.
Read the whole etc.
Discuss: The end of American empire.
Ah, I never get tired of Krugman arguing in print with a pundit who shall go nameless. Let’s call him “D. Brooks.” No, wait, that’s too obvious…”David B.”
The NYT policy of not allowing explicit disagreement with other people on the op-ed masthead doesn’t really make sense, and yet I hope they keep it — it’s always entertaining in this context.
On a recent afternoon, Hampus Elofsson ended his 40-hour workweek at a Burger King and prepared for a movie and beer with friends. He had paid his rent and all his bills, stashed away some savings, yet still had money for nights out.
That is because he earns the equivalent of $20 an hour — the base wage for fast-food workers throughout Denmark and two and a half times what many fast-food workers earn in the United States.
“You can make a decent living here working in fast food,” said Mr. Elofsson, 24. “You don’t have to struggle to get by.”
With an eye to workers like Mr. Elofsson, some American labor activists and liberal scholars are posing a provocative question: If Danish chains can pay $20 an hour, why can’t those in the United States pay the $15 an hour that many fast-food workers have been clamoring for?
“We see from Denmark that it’s possible to run a profitable fast-food business while paying workers these kinds of wages,” said John Schmitt, an economist at the Center for Economic Policy Research, a liberal think tank in Washington.
And if those fast food companies are less profitable in Denmark than the U.S., well, good! Companies should have lower profits if that money is going into the hands of workers. This seems self-evident to me, but I know even many liberal Americans have so internalized the logic of modern profit ideology that the idea of lower profits in exchange for better lives for low-paid workers makes many people uncomfortable.
This is as good a time as any to announce what I think my next book project is going to be, or at least what I am starting to work on. I want to write a new history of the Industrial Workers of the World that evaluates the union’s successes and failures in terms of what might be useful for modern activists in their own struggles. I am interested in this because the IWW basically exists in leftist memory as a romantic alternative to bureaucratic unionism, the promise of the revolution never achieved thanks to state repression, the AFL, corporate media, name your reason. Especially in an era where activists often don’t see the state as part of the solution, nor 20th century versions of socialism and certainly not the AFL-CIO, the free-flowing, culture-producing, decentralized IWW seems an ideal. That the IWW promoted worker participation, bottom-up organizing, democratic unionism, and all the other things that modern left critics of labor want to see makes their vision of it, however accurate or not, powerful.
Theoretically, that should be fine. People are going to use whatever pasts they choose to inform their present. But there are problems. First, the IWW couldn’t actually win anything. Part of that has to do with the conditions in which it organized, facing a hostile state. But no small part of it was due to problems in the IWW organizing model that made long-lasting victory almost impossible. The modern left stance toward the union also leads to cheerleading for a past movement, sometimes at the expense of analyzing it. Even professional labor historians are guilty of this, sometimes worse than anyone. When the 2013 Labor and Working Class History Association meeting was coming up, I noted to one of the organizers that it was taking place in New York on the 100th anniversary of the Paterson Strike Pageant. So I was lucky enough to then moderate the panel remembering the event. Before the panel, one of the participants, a major labor historian, was openly talking about how this event should be a celebration.
Well, why? Should any historians be rooting for our protagonists? Does that help? Or is hard-headed analysis pulling no punches about both failures and successes more useful? I’d say the latter. The Paterson Strike Pageant was a complete disaster. The IWW’s cultural production may be appealing to modern leftists, but in this case, it actually split the workers with jealousy since only some workers could participate. It also drew workers away from the actual strike, allowing the factory owners to bring in strikebreakers. It was a horrible decision that doomed that strike (which probably wouldn’t have succeeded anyway). It also basically killed the IWW in the east. After 1913, the Wobblies focused almost exclusively on western resource extractive labor for its campaigns.
But the modern left loves the pageant. Why? Because it brought together workers and culture in fun and radical ways that seem to portend a bread and roses culture that is a dream today. Take this essay, which led me to write this post. It’s well researched and well written, yet seems to present a really heroic view of the strike. I haven’t read the book where this is excerpted, but while a People’s Art History of the United States is cool and all, don’t we have to talk about all the ways the Paterson pageant failed miserably? In this case, isn’t the people’s history of the Paterson pageant that it turned workers against one another? If we want to learn lessons from the IWW, shouldn’t they be the right ones? Isn’t the goal to organize workers and win? And if that is the goal, shouldn’t we think about how the IWW did that well and how it did that poorly, without sweeping the latter under the rug in favor of vague notions of solidarity?
So basically what I want to do is write a decidedly unromantic history of the IWW that considers their actions in the context of thinking about usefulness for modern activists. What should we learn and is there anything they did a century ago that might give us pause today? Moreover, I want this to focus more on the rank and file and less on ideology and leadership. Unfortunately, for all the left loves to talk about “the people,” leftists love their Great Man history more than anyone. Joe Hill. Frank Little. Big Bill Haywood. Elizabeth Gurley Flynn. But what about everyday loggers, miners, textile mill workers? Did the IWW work for them? How did they respond to the Wobblies? What did they want and how did the IWW succeed or fail in providing that to them? By exploring these questions, I hope to peel away some of the romance and provide people a more useful past than I think most writers on the IWW give. Even if some people will be mad that I am far from a partisan for the organization.
A This Day in Labor History post next week will expand upon these ideas in the context of a single incident.
Yglesias makes a very good point here about the suggestions that Obama lacks “passion”:
January 20, 2010 was one of the very most memorable days of my eleven years in Washington. The previous evening Scott Brown had defeated Martha Coakley in a special election to fill the US Senate vacancy left by Ted Kennedy’s death. Genuinely surprising electoral outcomes are rare, so it was natural that the political community was electrified by Brown’s triumph. But to most observers the stunner also had a very concrete significance — the drive to pass an Obamacare bill through the United States Congress was dead.
Of course Republicans spun it that way. But many Democrats — including senior figures on and off Capitol Hill as well the President’s own chief of staff — agreed as well.
The message of the election was clear. Obamacare was finished. The only question was what, if anything, could be salvaged from the wreckage. At the Center for American Progress, where I was working at the time, the halls were buzzing with scenarios. Maybe a bill to cover all kids? Some kind of Medicaid expansion? Having come so far toward universal coverage, nobody wanted to give up. But the crisis clearly required some dramatic turnabout. Some grand gesture to make it clear that the President “got it.”
That’s the day that came to mind as I was reading Josh Green’s Businessweek story detailing Obama’s alleged failings as a crisis manager.
From Deepwater Horizon to Ebola to ISIS, Green alleges, Obama’s cool cerebral technocratic approach denies “the public’s emotional needs.” The president “disdains the performative aspects of his job.” Consequently, he “struggles to strike the right tone.” He is, in Green’s view, a perpetual under-reactor who has “an excess of faith in government’s ability to solve problems.”
So about Scott Brown.
It turned out that a version of Obamacare had already passed the US Senate with a filibuster-proof 60-vote majority. If House Democrats were willing to abandon their own version of health reform and pass the Senate bill, then the Senate could use the budget reconciliation process to enact some limited changes. Nothing about Brown’s victory changed the fundamental reality. Democrats could have Obamacare if they wanted it, and if they didn’t pass the law it would be because they decided not to, not because Brown’s victory forced them out of it.
Nancy Pelosi was a strong advocate of this view, but it struck many party leaders as insane. Rep. Barney Frank, for instance, declared Obamacare dead on the night of Brown’s win. Ultimately, however, Obama became the hero of his own administration by coming down on her side. He refused to give in to the panic gripping the Democrats and focused in on the math of the situation — and there, it turned out, Democrats had more than enough votes to pass the bill. As a response to the Brown win, it made no emotional sense. But it did make sense. And today Obamacare is law.
A lot of people were urging Obama to betray most of his supporters by abandoning comprehensive health care reform, including not only his chief of staff but a number of actual liberals. He didn’t, and this choice mattered.
There’s another narrative that comes up sometimes, conflating Emanuel’s timorousness with Obama’s views, and hence arguing that Pelosi had to persuade a reluctant Obama to keep pressing with the ACA. At least according to the definitive PBS documentary on the subject, this isn’t true:
CECI CONNOLLY: January 19th, 6:30 PM, about an hour-and-a-half before the polls close in Massachusetts, Obama calls for Pelosi, Reid, Biden and Rahm Emanuel to come to the Oval Office.
NARRATOR: They immediately convened an emergency meeting.
DAN PFEIFFER: From the very moment that it was clear that Scott Brown was going to win that seat, he began thinking through what the next steps would be to be able to right the ship and get health care done.
NARRATOR: The president asked Speaker of the House Nancy Pelosi if she could get the House to pass the Senate bill.
CECI CONNOLLY: Pelosi is annoyed and quite adamant that there’s no way she can sell that to her House members, almost kind of lecturing, saying, “You don’t understand the realities in the House. This won’t work.” And Obama finally snaps, uncharacteristically for him, and he says, “I understand that, Nancy. What’s your suggestion?” And there is no suggestion.
My point here is not to deny Pelosi the enormous credit she deserves for getting the Senate bill through the House — her analysis that it would be a very heavy lift at best wasn’t wrong. I still don’t like the term “Obamacare” because it contributes to the pernicious myth that presidents unilaterally impose major legislative choices from the top down. But the point here is that Obama and Pelosi were allies against the Emanuels and Franks. They didn’t start out on opposite sides. (And as djw observed earlier today in comments, the evidence-free assertions that Obama was the puppet of his chief of staff are not merely wrong but offensive for obvious reasons.)
The key point here is that the interesting counterfactual question isn’t why the ACA wasn’t better — as the pathetic quality of the counterfactuals trying to argue that the Senate could have passed something significantly better indicates, that’s not a difficult or particularly interesting question. The real contingency cuts the other way — with different presidential and congressional leadership it absolutely could have failed entirely once again. (Although I still think that Clinton/Pelosi/Reid also would have gotten it done; I suspect Clinton might have been more receptive to this kind of what-about-the-next-midterm caution in general, but not on health care specifically.) But as to suggestions that I should be nostalgic for the Golden Age of the Democratic Party when the White House and congressional leadership were 1)more conservative than Obama/Pelosi/Reid and 2)could screw up a gin-and-tonic, I’m going to continue to take a pass.
Admittedly, the allusion in my title oversells the story a bit, but this isn’t good.
It seems to me there are two issues here worth disentangling.
1) Is it ethical for researchers to interact with voters in a way that might change the outcome of an election (but with no intent to do so, and no partisan valence)?
2) Is it ethical for researchers to do (1) while misrepresenting themselves as agents of the state?
I’m inclined to answer the first question with a ‘no’ but am willing to listen to arguments to the contrary. An anonymous political scientist in the TPM piece makes the case for a yes answer:
“I would say, just looking at the country at large, is the great threat to the integrity of our process good social science or is it the Koch brothers?” the source, who was not authorized to discuss the situation publicly, said. “You’ve got to be courageous about this. We need to know how to improve our politics and how to renovate it. We can’t just be playing Mickey Mouse games in the classrooms. We’ve got to be out there in the political world trying stuff.”
The Koch brothers thing is risible misdirection; whatever one thinks of the Koch brothers’ political activity in a post-Citizens United world, this is an entirely separate issue. The implied argument here is that the current trend in cutting-edge political science research to eschew observational studies in favor of ‘experiments’ is inevitably going to lead to manipulation of actual political events if it’s to be done well. There’s some truth to this; ‘experiments’ conducted in artificial scenarios with a bunch of undergraduate volunteers will replicate the WEIRD problem found in a good deal of psychological research (and can only answer a very limited range of research questions). That said, as someone with no professional attachment to (and a skeptical attitude toward) the experiments trend, my inclination is to say ‘so much the worse for experiments’; as with all social science methodologies, it has real limits, in this case ethical ones, that need to be acknowledged. But I suppose it’s possible that there’s a conversation worth having about minor, random influence as potentially acceptable in some circumstances.
As for (2), though, I don’t see how there can be any debate at all. I can’t imagine what they were thinking if this was their intention (and it’s hard to account for their use of the state seal in the mailing in any other way). I would guess it was an effort to get more people to take their flyer seriously, but I can’t fathom what would cause someone to think that misrepresentation is in any way ethical.
On October 27, 1948, an air inversion trapped the pollution spewed out by U.S. Steel-owned factories in Donora, Pennsylvania. The Donora Fog killed 20 people and sickened 6,000 others. It was one of the most important toxic episodes of the postwar period, sparking the rise of the environmental movement and groundbreaking legislation to protect Americans from the worst impacts of industrialization.
Donora was a town dominated by U.S. Steel. Southeast of Pittsburgh, the town had both the Donora Zinc Works and the American Steel and Wire plant, both owned by U.S. Steel. The pollution throughout southwest Pennsylvania was legendary, as the combination of the steel industry and the region’s hills and valleys meant incredible smoke. While Pittsburgh was nationally famous for its pollution, surrounding towns had similar problems. For the 19th century and the first half of the 20th, this pollution was seen as a sign of progress. But after World War II, with the struggles for mere survival that had marked the previous century of American labor history behind them, workers began demanding more of their employers and government when it came to the environment.
The factories routinely released hydrogen fluoride, sulfur dioxide, sulfuric acid, nitrogen dioxide, fluorine, and other poisons into the air. Nearly all the vegetation within a half mile of the Zinc Works was already dead. Donora already suffered from high rates of respiratory deaths, a fact noted at the time, which is significant because people didn’t much talk about that in 1948. The people who had to deal with these problems were the workers themselves. The companies poisoned their bodies inside the factories through toxic exposure on the job and they poisoned their bodies outside the factories through air, water, and ground pollution. Being an industrial worker in mid-twentieth century America was to be under a constant barrage of toxicity.
In Donora, people had been complaining about the air quality for decades. U.S. Steel opened the American Steel and Wire plant in 1915. By 1918, it was already paying people off for the air pollution and it faced lawsuits from residents, especially farmers, through the Great Depression. But in a climate of weak legal repercussions or regulation, this was merely a nuisance for U.S. Steel.
Pollution in Donora (credit here)
The air inversion started on October 27 and continued until November 2. This meant that the pollution spewing from the smokestacks simply sat in the valley, turning the air into a toxic stew. By October 29, the police closed the town to traffic because no one could see well enough to drive. By that time, people were getting very sick. Some 6,000 people became ill in a town of 13,000. Almost all of these people were workers and their families who relied upon U.S. Steel for survival. Yet that reliance could also kill them. About 800 pets died as well.
The Donora Fog. This picture was taken around noon.
The smog could easily have been worse. An assessment released in December estimated that thousands more could have died had it lasted a couple of extra days. Notably, the weather inversion was region-wide (in fact, there were fogs for hundreds of miles during this larger event), but Pittsburgh, long the famed home of American smoke pollution, avoided serious health problems like Donora’s because it had recently passed new ordinances against burning bituminous coal, thus lowering pollution levels and saving its citizens’ lives. Alas, Donora had not passed such regulations.
U.S. Steel of course called the Donora Fog “an act of God,” as if only a higher power could have led to a factory without pollution controls. This is a standard strategy for corporations when their environmental policies kill people. The Donora Fog put U.S. Steel workers, organized with the United Steelworkers of America, into a difficult situation. Six of the seven members of the Donora city council were USWA members. And they were sick too. But what if U.S. Steel closed the factories? Even in 1948, this was already on workers’ minds. Yet they also wanted real reform. Workers did not trust federal and state regulators. The U.S. Public Health Service originally rejected any investigation of Donora, calling it an “atmospheric freak.” When investigations finally did happen a few days later, there were no air samples from the pollution event itself and the government recommended the factories reopen.
So the USWA and the city council filled with its own members conducted their own investigation. CIO president Phil Murray offered the locals $10,000 to start this process. Working with a medical school professor from the University of Cincinnati, the USWA hired six housewives to conduct a health effects survey to create the basis for a lawsuit. This continued pressure finally forced a government response. When the Zinc Works decided to reopen in order to “prove” that the plant could not possibly cause smog, locals pressured the Public Health Service to make the test public. When it did, the health complaints started rolling in, with parents keeping their children home from school. Ultimately, the Public Health Service had no interest in holding U.S. Steel accountable for its subsidiary plants, and the company itself wanted to avoid liability while heading off a new regulatory structure that would limit emissions. U.S. Steel openly claimed it would close the plants if forced to make major reforms. And in the end, the Public Health Service report, released in October 1949, did not pin culpability on the factories.
The people of Donora sued the plants in response. The company returned to its “act of God” legal defense. The Zinc Works lawsuit paid 80 families $235,000 when it was settled, but that barely covered their legal fees. The American Steel and Wire suit was more successful, leading to a $4.6 million payout. But this was still a pittance considering the damage done to the people of Donora by the steel industry. Yet in the end, this was an industry the town also needed to survive. U.S. Steel closed both plants by 1966, leading to the long-term decline of Donora, a scenario repeated across the region as steel production moved overseas. Today, Donora’s population is less than half what it was in 1948.
The Donora Fog helped lead to laws cleaning up the air. The first meaningful air pollution legislation in the nation’s history passed Congress and was signed by President Eisenhower in 1955. 1963 saw the first Clean Air Act and 1970 the most significant Clean Air Act. Supporters of all these laws cited Donora as evidence of the need for air pollution legislation.
I drew from Lynn Page Snyder, “Revisiting Donora, Pennsylvania’s 1948 Air Pollution Disaster,” in Joel Tarr, ed., Devastation and Renewal: An Environmental History of Pittsburgh and Its Region, for this post.
This is the 122nd post in this series. Previous posts are archived here.