
England…

[ 107 ] July 2, 2015 |
[Image: “Soccer ball”. Licensed under GFDL via Wikipedia.]

The end of last night’s England-Japan WWC match was about as devastating as anything I can imagine. I’m thinking through the classic catastrophic sports moments; Chris Webber’s timeout is the only thing that seems comparably decisive on such a large stage. I hope that the next thing is better for Laura Bassett; watching the team comfort her in the aftermath was the only thing that partially redeemed the moment.

[SL] A commenter beat me to it, but although its effectively one-country big stage is smaller than Webber’s, an otherwise even better comparison is Steve Smith, the rookie defenseman who celebrated his birthday by eliminating his own team, the greatest regular-season version of the Gretzky/Messier/Kurri/Coffey/Fuhr Oilers:

Was this moment of sheer misery one of the best days of my life? Well, yes. But Laura Bassett, take note, and heart: the also-UK-born Smith played 14 more years in the NHL and is now an assistant coach in Carolina.

Today in Racist History

[ 76 ] July 2, 2015 |


This month is the 50th anniversary of the Moynihan Report. Stephen Steinberg:

A few weeks after Moynihan’s report was leaked to the press, the Watts neighborhood in Los Angeles exploded in violence, triggered by an incident with police that rapidly escalated into five days of disorder and left thirty-four people dead. Pundits and politicians seized upon the report to cast blame for the “riot” on the deterioration of “the Negro family.” The report warned, “The family structure of lower class Negroes is highly unstable, and in many urban centers is approaching complete breakdown.”

Critics condemned the report for pathologizing female-headed households and black families in particular. The most trenchant criticism, however, was that the preoccupation with black families shifted blame away from institutionalized inequalities and heaped it on the very groups that were victims of those inequalities. As James Farmer, cofounder and national director of the Congress of Racial Equality, wrote with blunt eloquence, “We are sick unto death of being analyzed, mesmerized, bought, sold, and slobbered over while the same evils that are the ingredients of our oppression go unattended.”

Today, in the wake of Ferguson and Baltimore, family dysfunction is again cited by politicians, pundits, and scholars as the root of the problem. Rand Paul publicly twaddles about “the breakdown of the family structure, the lack of fathers, the lack of sort of a moral code in our society.” David Brooks opines in the New York Times, “The real barriers to mobility are matters of social psychology, the quality of relationships in a home and a neighborhood that either encourage or discourage responsibility, future-oriented thinking, and practical ambition.” And sociologist Orlando Patterson asserts that “fundamental change” can come only from “within the black community: a reduction in the number of kids born to single, usually poor, women.”

Steinberg goes on to break down the intellectual sources for the Moynihan Report, particularly Nathan Glazer. Intellectual racism that blames people of color for their own poverty has not diminished in the last half-century. Any number of racist sites refer back to Moynihan today; meanwhile, this paragon of institutionalized racism became a respected Democratic senator without ever questioning his blaming of black people for their own poverty, and he ended his career as a big supporter of slashing welfare. Among the other great things in this man’s life: as UN Ambassador during the Ford administration he ensured the UN did nothing to stop the Indonesian slaughter in East Timor, and he later opposed the Clinton health care plan.

This Day in Labor History: July 2, 1980

[ 10 ] July 2, 2015 |


On July 2, 1980, the Supreme Court ruled in Industrial Union Department AFL-CIO v. American Petroleum Institute that the Occupational Safety and Health Administration must take economic considerations into account when issuing regulations. This 5-4 decision severely impacted the ability of the government to take an aggressive and preemptive stand against workplace health problems.

One thing that often gets left behind in discussions of OSHA is the health part of the agency’s mission. We focus on safety. That’s because those issues are easier to take care of. You put proper protection around a saw and it becomes a lot less dangerous. But health is a whole other issue, for a couple of reasons. First is the long-term impact of work on health, which means that occupational illness can take decades to become apparent. Second is that remaking worksites so that workers aren’t exposed is a lot more expensive than the saw guard. Protecting workers from benzene, toxic gases, or dust poses real challenges, and the solutions can be expensive.

The Occupational Safety and Health Act of 1970 charged the federal government with protecting workers on the job from industrial hazards. OSHAct stated, “no employee will suffer material impairment of health or functional capacity even if such employee has regular exposure to the hazard dealt with by such standard for the period of his working life.” It built on the “Precautionary Principle” then in favor for dealing with workplace safety and health issues: addressing environmental uncertainties in the regulatory process before they became problems. In the case of workplace health, that means trying to figure out what substances might cause health problems and preemptively eliminating them. It requires action even when scientific data don’t yet show a problem, only that one could exist in theory. This principle drove the move toward environmental and workplace regulation during the 1970s in both the United States and Europe. But the political implications of this were not worked out in the legislation, and Congress gave OSHA a lot of leeway in figuring out how the agency would actually operate.

OSHAct bound the Secretary of Labor to set out rules for substances like benzene, even if only one worker might become unhealthy due to exposure. Benzene was the substance at issue in Industrial Union Department. OSHA sought to regulate benzene, a carcinogen, but without really nailing down how many workers’ lives would be saved in doing so.

The American Petroleum Institute decided to fight this, even though the petroleum industry clearly had the money to protect its workers from benzene exposure (it didn’t even bother arguing otherwise). Industry had engaged in a court campaign to slow down OSHA from its beginning, challenging the agency at every turn. On the other hand, the AFL-CIO led the charge to save the Precautionary Principle, building on its significant progress in fighting for workplace health in the 1970s. OSHA finally was up and running at full capacity by the late 1970s with Jimmy Carter naming Eula Bingham as the agency’s head. Bingham, the first OSHA director who really supported the agency’s mission, sought to remake workplace environments around the nation, often with the active support of those unions who saw the agency as a way to empower workers on the shop floor to protect themselves and express workplace power at the same time. So defending the Precautionary Principle became a top OSHA priority after 1977. Bingham’s OSHA created standards for acrylonitrile, cotton dust, lead, arsenic, and benzene.

Yet for organized labor, this was very slow progress. By 1981, the National Institute for Occupational Safety and Health (NIOSH) had recommended 250 standards, but OSHA had implemented only 21 of them. Only 4 of those standards dealt with cancer-causing agents. In my forthcoming book on timber unions, I discuss in some detail how the International Woodworkers of America (IWA) was frustrated that its concerns about a wood dust standard were not taken seriously enough by OSHA. So for corporations these standards were outrageous, while for workers they were too little and usually too late. The Precautionary Principle was a great idea, but workers in the 1970s were impatient and wanted immediate remediation of the problems of work.

In the case itself, more popularly known as the benzene case, the Court faced two primary questions. The first was the benzene standard itself, specifically the reduction of permissible benzene exposure at the workplace from 10 parts per million to 1 ppm. The second was whether OSHA needed to show a “reasonable relationship” between the costs and benefits of new standards. The Court’s majority (John Paul Stevens wrote the opinion with Burger and Stewart in the majority, while Rehnquist and Powell wrote concurring opinions) decided to read Congress’s mind in interpreting the Occupational Safety and Health Act, assuming Congress couldn’t have meant to protect all workers from all health risks without cost consideration. Effectively, the Court rejected the Precautionary Principle as an unreasonable standard to which to hold business. A plurality tried to create a standard for workplace health risk that would trigger OSHA action, rather unhelpfully noting that it should lie somewhere between a 1 in 1,000 chance of illness and a 1 in 1,000,000 chance. What this did was allow the Reagan administration, after it took power in 1981, to effectively avoid new workplace health regulations altogether by adhering to the 1-in-a-million standard. Thurgood Marshall wrote a blistering dissent (Brennan, White, and Blackmun making up the rest of the minority) saying the decision placed “the burden of medical uncertainty squarely on the shoulders of the American worker.”

Despite Industrial Union Department, American work is much safer and healthier today than it was decades ago. Unfortunately, a lot of the reason for that is the outsourcing of such work to Latin American and Asian nations where workers labor in health-destroying conditions making products for American consumption.

While researching this case, I ran across a celebratory essay about the decision by one Antonin Scalia in an American Enterprise Institute publication.

The roots of this week’s decision in Michigan v. Environmental Protection Agency can be seen in Industrial Union Department, as Scalia’s opinion relied heavily on the same cost-benefit analysis as that case.

I don’t think there is a single book that really deals with this case effectively, but it is mentioned in Gerald Markowitz and David Rosner’s Deceit and Denial: The Deadly Politics of Industrial Pollution, which is a very good book on the larger issue of workplace health. I also consulted Albert Matheny and Bruce Williams, “Regulation, Risk Assessment, and the Supreme Court: The Case of OSHA’s Cancer Policy,” in Law and Policy, October 1984.

This is the 149th post in this series. Previous posts are archived here.

The Neo-Confederate Response

[ 49 ] July 1, 2015 |


The racists have burned 8 black churches in 10 days.

Movie scene bleg

[ 25 ] July 1, 2015 |

Help out here all-knowing LGM collective consciousness.

I have a vague memory of a fairly recent film (like in the last 10-12 years) in which police interrogators try to intimidate a suspect they’re interviewing by pulling their guns and laying them on the table in front of the suspect. I think this might have been a Ben Affleck movie (The Town? Gone Baby Gone?).

Does this ring a bell? Also, extra kudos to anyone who can find a YouTube clip.

. . . actually I’m interested in any film (or TV show episode) that features this scenario, not just the one I sort of remember.

The Subway

[ 77 ] July 1, 2015 |


It’s amazing the New York subway system works at all.

But the fundamental reason the MTA is so hard to fix, say transit experts both inside and outside the authority, goes back to those antediluvian switches. The MTA runs one of the largest transit systems in the world on a budget that’s dependent on the whims of elected officials in City Hall and Albany. It’s the equivalent of trying to change the engine and tires on a 1930 Studebaker while driving cross-country at top speed and hoping you can find enough spare change between the seat cushions to buy parts.

“We’re trying to address three or four decades’ worth of disrepair and disinvestment,” says MTA planning director Bill Wheeler. “The last time people sunk money seriously into the subway system was before World War II. It’s taken us a long, long time to come back, and that’s why much of the capital program is about rebuilding.”

“New York started off behind a lot of other places, because most other places haven’t let their physical plant deteriorate to the extent that New York has,” agrees Richard Barone, director of transportation programs for the Regional Plan Association (RPA), one of the local groups that has pushed hardest for improved transit infrastructure. It’s a problem that started in the 1950s and 1960s, when local budgets got tight and subway service for a shrinking (and increasingly nonwhite) city populace no longer seemed like a priority.

“New York really just ignored investing in its infrastructure,” says Barone. “So it took decades to rebuild what we had lost because of neglect.” And while the MTA has spent more than $100 billion on improvements since its first capital plan in 1982 — almost every subway car has been replaced in that time, for starters — Barone says the agency remains in “catch-up” mode.

And of course there are huge parts of the city the system does not touch. Yet it’s still reasonably reliable. In my limited experience, it seems more functional than Washington’s. I’ll find out more about that in the next few weeks, as I’ll be in the nation’s capital for most of July researching a new project and enjoying that sweet, sweet DC weather.

Why Honoring Jefferson Davis Is Unacceptable

[ 109 ] July 1, 2015 |


The discussion that starts here raises a very important point. There’s one defense of monuments to Confederates that runs something like “sure, Davis was a slaveholder, but we have slaveholders on the $1 and $2, a white supremacist on the $5, a slaveholder and ethnic cleanser on the $20, and so on. Why is Davis different?”

I think the answer to this should be clear. There’s a difference between honoring a slaveholder or white supremacist from the 18th or 19th century and honoring them for their support for slavery and white supremacy. Washington isn’t on the $1 because he was a slaveholder, but because he was the first (and still one of the best) presidents and also a major leader in the Revolutionary War. Lincoln is widely honored because of his crucial role in preserving the Union and smashing the slave power, not because of the belief he held for most of his life that a multiracial democracy was impossible. The Constitution protected slavery, but its sole purpose was not the protection of slavery. (And we should also remember that the options the framers had in 1787 were a Constitution that provided some protection for slavery, or no deal. The idea that Virginia or Georgia or South Carolina would have agreed to an antislavery constitution with better bargaining is Green Lanternism that makes “Obama could have made Joe Lieberman vote to nationalize the American health care industry” look plausible.) The Revolutionary War and the Constitution were both the product of a combination of admirable motives, immoral motives, self-interest, and practical politics. One can admire the sentiments of the Declaration of Independence while also being mindful that the “all men are created equal” part was observed in the breach to disastrous effect. Evaluating these things involves complicated judgments.

The Confederacy is a different story. Protecting slavery was its sole reason for being. Confederate leaders aren’t honored in spite of their commitment to treason in defense of slavery; in 99% of cases they’re being honored because of it. (Nobody would be naming highways in Washington state after Davis because he was Pierce’s Secretary of War.) As I said in the previous post, the idea that people like Robert E. Lee are being honored because they were fine gentlemen or fathers (except for, you know, the slaves) is absurd even if you take the assertions at face value, which you shouldn’t. I have great parents and you probably do too, but nobody’s building statues of them or naming schools after them. Confederate leaders are honored because of their role in the Confederacy. And the purpose of secession was 1) protecting slavery, and 2) that’s it.

To be clear, I’m not arguing that tributes to non-Confederate leaders shouldn’t be assessed critically. (Personally, I’m OK with Washington and Lincoln on the currency, but would remove Jackson with all non-deliberate speed.) A norm may emerge that honoring slaveholders in any way and no matter what else they did is unacceptable, and that would be OK with me. Norms could develop against naming things after political leaders in general. But those are complicated questions. Confederate leaders are an easy case.

The Foolishness of Post-Work Utopianism

[ 176 ] July 1, 2015 |


Every now and again, you see some essay about the utopia of a post-work society, suggesting that the disappearance of traditional paid labor (a lot of which is not much fun) will allow people’s real passions to flourish. Derek Thompson wrote a very long Atlantic piece exploring these ideas in a very positive way. I was not pleased. There is no utopian end of work. What follows the end of work is poverty. And such articles undermine what we actually need: motivating people to political action for economic justice and good jobs. The threat of automation creating mass unemployment is real enough, as I have discussed here repeatedly. But there’s nothing positive at the end of that process. Moreover, it seemed to me (though I can’t know) that all the people Thompson talked to as examples of people already engaging in a post-(traditional)-work economy are relatively well-educated white people: the PhD who decided to start a foundry where people like mixed-media artists and engineers come to labor/leisure, the bartender in Youngstown who is also a PhD student at the University of Chicago, the writer with two master’s degrees working in a cafe. Where are the African-Americans in Youngstown or Native Americans on the reservations already suffering from long-term unemployment? Do they have a place in this post-work future? They sure don’t seem to in Thompson’s article.

Luckily, I’m not the only person rolling their eyes at this sort of thing. Mike Konczal:

There’s been a consistent trend of these stories going back decades, with a huge wave of them coming after the Great Recession. Thompson’s piece is likely to be the best of the bunch. It’s empathetic, well reported, and imaginative. I also hope it’s the last of these end-of-work stories for the time being.

At this point, the preponderance of stories about work ending is itself doing a certain kind of labor, one that distracts us and leads us away from questions we need to answer. These stories, beyond being untethered to the current economy, distract from current problems in the workforce, push laborers to identify with capitalists while ignoring deeper transitional matters, and don’t even challenge what a serious, radical story of ownership this would bring into question.

But what is the impact of these stories? In the short term, the most important is that they allow us to dream about a world where the current problems of labor don’t exist, because they’ve been magically solved. This is a problem, because the conditions and compensation of work are some of our biggest challenges. In these future scenarios, there’s no need to organize, seek full employment, or otherwise balance the relationship between labor and capital, because the former doesn’t exist anymore.

This is especially a problem when it leaves the “what if” fiction writings of op-eds, or provocative calls to reexamine the nature of work in our daily lives, and melds into organizational politics. I certainly see a “why does this matter, the robots are coming” mentality among the type of liberal infrastructure groups that are meant to mobilize resources and planning to build a more just economy. The more this comforting fiction takes hold, the more problematic it becomes and easier it is for liberals to become resigned to low wages.

Because even if these scenarios pan out, work is around for a while. Let’s be aggressive with a scenario here: Let’s say the need for hours worked in the economy caps right now. This is it; this is the most we’ll ever work in the United States. (It won’t be.) In addition, the amount of hours worked decreases rapidly by 4 percent a year so that it is cut to around 25 percent of the current total in 34 years. (This won’t happen.)

Back of the envelope, during this time period people in the United States will work a total of around 2 billion work years. Or roughly 10,000 times as long as human beings have existed. What kinds of lives and experiences will those workers have?

Worker power matters, ironically, because it’s difficult to imagine the productivity growth necessary to get to this world without some sense that labor is strong. If wages are stagnant or even falling, what incentive is there to build the robots to replace those workers? Nothing is certain here, but you can see periods where low unemployment is correlated with faster productivity gains. The best way forward to a post-work atmosphere will probably be to embrace labor, not hope it goes away.
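
If you want to check the back-of-the-envelope numbers in Konczal’s quote above, they hold up. Here’s a minimal sketch of the arithmetic in Python; the 150 million workforce figure is my own rough assumption, since the excerpt doesn’t give one:

    # Sanity check of Konczal's hypothetical: hours worked fall 4 percent a year
    # for 34 years. The starting workforce size is an assumption, not his number.
    workers_now = 150_000_000      # assumed current US workforce (rough)
    annual_decline = 0.04
    years = 34

    remaining_share = (1 - annual_decline) ** years
    # Geometric series: cumulative worker-years over the whole period
    total_worker_years = workers_now * (1 - remaining_share) / annual_decline

    print(f"hours left after {years} years: {remaining_share:.0%}")        # ~25%
    print(f"cumulative worker-years: {total_worker_years / 1e9:.1f}B")     # ~2.8 billion
    print(f"vs. ~200,000 years of human existence: {total_worker_years / 200_000:,.0f}x")

With that assumed workforce the total comes out a bit higher than Konczal’s “around 2 billion,” but the shape of the argument is the same: even under an implausibly aggressive decline, an enormous amount of work is left to be done by actual people.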

And if you actually were going to promote a post-work utopianism, you’d think you would go so far as to endorse the one policy that might alleviate a few of these problems, which is a universal basic income. But nope, not a word about that. Just a vague sense of fulfillment and belonging through artisanship and a sort of government-funded, online WPA-type proposal. So the policy recommendations here really fall short of even beginning to think about how to deal with unemployment in the present or in the future.

Finally, Thompson’s story ends with a 60-year-old going back to get a master’s degree so he can become a teacher. He writes, “It took the loss of so many jobs to force him to pursue the work he always wanted to do.” Except: where are the jobs for 60-year-old teachers? Thompson just leaves this there as if personal fulfillment somehow leads to economic stability. And anyone who knows anything about the current state of education and employment knows that even if you do love teaching, the reality of being in a classroom in a Rheeist society of extreme testing and attacks on teachers’ unions is not some glorious result. Rather, Thompson is engaging in a sort of romanticizing of teaching (a long tradition) to avoid drawing real conclusions grounded in the realities of work and labor policy in the United States.

In conclusion, I really have to wonder how many of the people who write about a post-work society in a hopeful way have ever actually experienced poverty or even basic working-class life. Not having employment is a terrible thing. And even if everyone else isn’t working either, it’s not like that leads to some universal acceptance of the situation and everyone getting over their Protestant work ethic. Rather, we can see what a bit of a post-work society looks like: it looks like Youngstown, or it looks like southern West Virginia. And that’s not a vision anyone remotely progressive should want to replicate. If Youngstown is somehow our national future because all the jobs are gone, there’s nothing to celebrate. There’s no positive endgame to that scenario.

The Confederacy Won the Peace

[ 143 ] July 1, 2015 |


The statistic at the end of the second paragraph says it all:

The Confederates won with the pen (and the noose) what they could not win on the battlefield: the cause of white supremacy and the dominant understanding of what the war was all about. We are still digging ourselves out from under the misinformation that they spread, which has manifested in both our history books and our public monuments.

Take Kentucky. Kentucky’s legislature voted not to secede, and early in the war, Confederate Gen. Albert Sidney Johnston ventured through the western part of the state and found “no enthusiasm as we imagined and hoped but hostility … in Kentucky.” Eventually, 90,000 Kentuckians would fight for the United States, while 35,000 fought for the Confederate States. Nevertheless, according to historian Thomas Clark, the state now has 72 Confederate monuments and only two Union ones.

Another excellent example is the fact that if you drive from Seattle to Vancouver, you do so in part on the Jefferson Davis Highway. Given that Washington not only didn’t secede but didn’t even exist during Davis’s brief period heading the treasonous slave state, I think we can safely chalk this up to 100% hate, 0% heritage. A bill was proposed to get rid of the name in 2002, but it generated intense Republican opposition and was ultimately killed in the Senate:

The opponents describe the highway change as a needless affront to Davis, who remains revered in some quarters and for whom plenty of schools are named in the South.

Now Representative Thomas M. Mielke, a Republican from Battle Ground, has taken up their cause and is opposing the bill, expected to come up for a vote on Thursday.

Mr. Mielke circulated an e-mail message to his colleagues on Tuesday night, attaching a biography of Davis and calling him “an outgoing, friendly man, a great family man who loved his wife and children and had an infinite store of compassion.”

“Sure, he was a traitor who believed that slavery was a cause worth dying for and supported the establishment of apartheid police states in the South after the Civil War, but he was a nice guy.” Hey, maybe Mohamed Atta remembered to call his mother every birthday; we could start naming roads after him too! I’m afraid when it comes to public monuments I’m in the “Nice guy? I don’t give a shit. Good father? Fuck you, go home and play with your kids” school. The fact that Republican legislators in states that had nothing to do with the Confederacy are willing to make such transparently silly arguments to preserve the monuments to the slave power is highly instructive.

Returning to Loewen:

Perhaps most perniciously, neo-Confederates now claim that the South seceded for states’ rights. When each state left the Union, its leaders made clear that they were seceding because they were for slavery and against states’ rights. In its “Declaration Of The Causes Which Impel The State Of Texas To Secede From The Federal Union,” for example, the secession convention of Texas listed the states that had offended them: Maine, Vermont, New Hampshire, Connecticut, Rhode Island, Massachusetts, New York, Pennsylvania, Ohio, Wisconsin, Michigan and Iowa. These states had in fact exercised states’ rights by passing laws that interfered with the federal government’s attempts to enforce the Fugitive Slave Act. Some also no longer let slaveowners “transit” through their states with their slaves. “States’ rights” were what Texas was seceding against. Texas also made clear what it was seceding for: white supremacy.

And there are plenty of other illustrations. Uniform support of the Fugitive Slave Act by the slave power in itself reveals the “states’ rights” argument as a con. Any “strict constructionist” would look at the wording of the Fugitive Slave clause and its placement in Article IV and construe the return of fugitive slaves as a state, not federal, responsibility. And perhaps the single most important issue in the dissolution of the Democratic Party was the unwillingness of Congress to impose a proslavery constitution on Kansas that its citizens didn’t want. The Confederate Constitution did not permit states to abolish slavery. 99% of arguments about “federalism” are really arguments about policy substance, and attempts by Confederates and their apologists to claim they were motivated by “states’ rights” are particularly fraudulent.

Always for Pleasure

[ 27 ] July 1, 2015 |

I had a whole bunch of stuff to write about today and then it didn’t happen for a number of reasons. But I still found time to watch Les Blank’s 1978 film about the culture of New Orleans, Always for Pleasure. It’s not available as a whole film on YouTube; I watched it on Fandor. But there are a couple clips available. It’s pretty great. I know the New Orleans of 1978 is not the New Orleans of 2015 in many ways. But it still made me want to go to New Orleans again.

The only thing to say after that second clip is NOT ENOUGH CAYENNE!!!!

Lefty Purity For Thee…

[ 93 ] June 30, 2015 |


What does Harper’s publisher Rick MacArthur do when he’s not publishing poorly reasoned and anti-factual screeds about the perfidy of the Democrat Party? Why, bust unions, of course:

MacArthur may have once defended U.A.W. as “the country’s best and traditionally most honest mass labor organization,” but he contested his staff’s right to unionize, contending that the literary editor and senior editors served as supervisors and hence failed to qualify for protection under the National Labor Relations Act. He hired veteran employment lawyer Bert Pogrebin to advocate on his behalf before the National Labor Relations Board, but the federal agency denied his appeal. The day before staffers held elections and formally joined UAW Local 2110 on Oct. 14, MacArthur wrote a letter assuring them the union would neither give them a voice in the selection of the next editor in chief—he believed Metcalf was angling for the position—nor “solve the financial problems of the magazine or get us more subscribers, newsstand buyers or advertisers.”

Added MacArthur, with a touch of irony: “It will, of course, be able to collect initiation fees and dues from you.”

In January 2011, the magazine laid off union instigator Metcalf and pro-union ally associate editor Theodore Ross, a move that the union interpreted as retaliation and that MacArthur defended as an effort to “cut expenses.”

Of course, one way you can ensure you have the money to pay anti-labor lawyers is to pay your interns a big fat goose egg to work full time in Manhattan.

While MacArthur’s magazine has been unreadable for a while, I was wondering if perhaps there was a commercial justification for what has been intellectually ruinous. Maybe there’s a large market out there that really wants to read the same terrible leftier-than-thou article, with a nominally different byline, about how Barack Obama betrayed his campaign promises by failing to unilaterally turn the American political economy into Denmark’s every month? Nope: in fact, its circulation is cratering. It’s really a shame what’s happened to what was not that long ago a terrific magazine, but at this point it’s probably never coming back.

On the Search for the Clubhouse Guy

[ 60 ] June 30, 2015 |
[Image: Roy Thomas. Licensed under PD-US via Wikipedia.]

Some interesting thoughts from Russ Carleton on how you would go about searching for clubhouse “chemistry”: (subscription)

I have a feeling that if I surveyed even the most hardcore sabermetricians out there, they would all acknowledge that ideas of chemistry and clubhouse presence aren’t silly. They’d probably push back against the common narrative that Team X won the World Series based on the shining light of justice that came from Smith’s locker. (After all, there were probably veteran guys on all the other 29 teams who did not win the World Series.) They’d probably say that it’s hard to measure. (It is.) But if Smith sits down with Jones, shows him a trick he’s learned over the years on how to hit a curveball and Jones turns from a one-win player to a three-win player, don’t we have to give some of that credit to Smith?

I’m going to start with the assumption that chemistry and clubhouse presence exist and that they can have real, tangible effects on players, making them either better or worse. We don’t know how it works. We don’t know who’s who. We don’t know what the effects are. But what if we could at least make some reasonable assumptions about what those effects might be? Actual data-driven ones. For example, we know that some managers seem to have a special talent for keeping their players from burning out over the course of a year, and that the effect might be as big as 30 runs from the best to the worst.

So, how much could these soft factors actually be worth?

I’d be interested in coming up with a list of things that we assumed-away-because-we-couldn’t-measure, then realized-had-an-impact-when-we-developed-better-tools. I’m guessing that the list would be longer in football and basketball than in baseball, but of course it would also be interesting to track down some examples from politics.

Thoughts?
