I guess my prediction today will be…Clinton by 13.
Further proof that the Bush administration thinks it is above the law, or that the law is just not worth following: according to the Government Accountability Office (GAO), the administration’s push to restrict the use of SCHIP funds to cover people above the poverty line was in violation of federal law.
The legal opinion, requested by a bipartisan pair of senators, lambasted the president for vetoing Congress’s twice-passed expansion of the SCHIP health care program, which provides health insurance for kids whose parents earn too much to qualify for Medicaid but too little to afford private health insurance. Congress twice approved more money for SCHIP, and Bush twice vetoed it, mongering fears about socialized medicine.
So there we have it: 70,000 fewer kids insured than would have been possible, plus a violation of federal law for good measure.
(via Bitch PhD)
“Elitism” is thus a crime not of society’s actual elite, but of its intellectuals. Mr. Obama has “a dash of Harvard disease,” proclaims the Weekly Standard. Mr. Obama reminds columnist George Will of Adlai Stevenson, rolled together with the sinister historian Richard Hofstadter and the diabolical economist J.K. Galbraith, contemptuous eggheads all. Mr. Obama strikes Bill Kristol as some kind of “supercilious” Marxist. Mr. Obama reminds Maureen Dowd of an . . . anthropologist.
Ah, but Hillary Clinton: Here’s a woman who drinks shots of Crown Royal, a luxury brand that at least one confused pundit believes to be another name for Old Prole Rotgut Rye. And when the former first lady talks about her marksmanship as a youth, who cares about the cool hundred million she and her husband have mysteriously piled up since he left office? Or her years of loyal service to Sam Walton, that crusher of small towns and enemy of workers’ organizations? And who really cares about Sam Walton’s own sins, when these are our standards? Didn’t he have a funky Southern accent of some kind? Surely such a mellifluous drawl cancels any possibility of elitism.
I was also amused by the Crown Royal mistake; I hate to tell this to wealthy pundits pretending to be populists, but very few dive bars have Crown Royal in the well…
I don’t always agree with him, but Frank will make a much better token lefty at the WSJ than Al Hunt or Alex Cockburn…
An appropriately Ruthless Review, pointing out that the problem with Mamet’s Voice position paper was not its conservatism but its jaw-dropping banality and many strawman burnings. I’m glad they reminded me about this part:
The Constitution, written by men with some experience of actual government, assumes that the chief executive will work to be king, the Parliament will scheme to sell off the silverware, and the judiciary will consider itself Olympian and do everything it can to much improve (destroy) the work of the other two branches. So the Constitution pits them against each other, in the attempt not to achieve stasis, but rather to allow for the constant corrections necessary to prevent one branch from getting too much power for too long.
Leaving aside that any remotely knowledgeable person would know that the Constitution hasn’t actually worked that way in practice (the separation of powers often leads to the evasion and delegation of responsibility rather than power maximization by all branches), it’s pretty depressing to see a great playwright deciding instead to write summaries of bad sixth-grade civics textbooks and then triumphantly announcing these insights as producing a political transformation so earth-shaking it requires a cover story to elucidate. It is instructive about the intellectual shallowness likely to produce a “since 9/11, I’m outraged that some unnamed people still allegedly believe in crude reductionist readings of Rousseau” conservative.
Apparently, the effect of eating four-day-old spinach when you think you might have some kind of stomach flu is to remove all doubt. Hopefully blogging will resume shortly. In the meantime, since between the onset of illness and a deluge of Real Work I neglected to blog for Equal Pay Day, allow me to delegate to Kay, who explains why it’s important to override Ledbetter. (Much more good stuff can be found here.)
Drew Gilpin Faust’s Republic of Suffering is about memory and the Civil War, but not in the conventionally understood fashion. Although Faust writes a bit about the memory of the war in the national narrative, she’s more interested in how the raw butchery of the war affected American culture on a micro level. The understanding of death in the family and in literature, she suggests, was transformed by the immense human cost of the war and the distance of major battlefields from the homes of many soldiers. Death, as it were, was conducted differently after the war than before.
Faust suggests that a particular understanding of the “Good Death” predominated in the United States before the war. The Good Death involved dying at home, with one’s family, and with the presence of mind to understand and accept the process. The United States had thus far missed out on the opening stages of industrial war, participating only on the periphery of the Napoleonic Wars and defeating Mexico without substantial loss. The Civil War represented a demographic event, so to speak, that made the previous appreciation of death difficult. Death came suddenly, often with great pain, and sometimes left no identifiable remains. Even when remains could be identified, the state lacked the bureaucratic and physical infrastructure necessary to transfer the bodies home. Technology also presented a problem, although the use of embalming expanded exponentially during the war.
The Civil War represented a unique expansion in the capacity of the state in nineteenth century America, including growth in its capability to manage death. The raising of large armies, their operation in war, and the management of their demobilization all stressed and expanded the bureaucratic apparatus of the state. Faust details how the managing of Union war dead during and after the war required the state to act in previously unimagined ways. There was substantial difference between the North and the South, of course, in part because the war was fought mostly on Southern soil but also because of the poverty of the South after the war.
The war transformed death bureaucratically, but it also changed how Americans understood mourning at the family and community level. Belief in the literal Resurrection of the body, for example, ran up against the difficulty of missing or scattered remains. The demographic impact of the death of over 600,000 military-age men left a common set of holes in families and communities. The war also taxed what were widely believed to be pacifist Christian commitments. Christians in the North and the South justified the war in their own ways, but as the United States had not previously experienced a large mobilization for war and had substantially smaller military forces than its European counterparts, pacifist resistance to the idea of killing remained a factor. Faust writes a bit about the problem of killing, but doesn’t really add much to the literature on the creation of the citizen-soldier-killer.
It’s an interesting book, and it included quite a few interesting stories, but in the end the effort left me cold. From a social science point of view I would have liked some comparison; the entire nineteenth century was an era of social transformation, and in particular of the bureaucratic expansion of the state, so I’m skeptical that the Civil War played a singular role in the transformation of the management of death. In fairness, Faust doesn’t explicitly argue that it did, although I think she heavily implies it. A less social science-y way of approaching the book is to think of it as a story about the reaction to a social shock in early modernity, without judgment about any particular cause or effect. That’s OK, but I guess I want a little bit more analysis. As I suggested, the story that Faust tells is interesting, but perhaps not quite interesting enough that, sans analysis, it can carry a full book.
Early in their graduate careers, most political scientists learn the value of specificity and clarity in the definition of terms. Ken Pollack was apparently sick that day…
In longer discussions on the subject, Mr. McCain often goes into greater specificity about the entities jockeying for control in Iraq. Some other analysts do not object to Mr. McCain’s portraying the insurgency (or multiple insurgencies) in Iraq as that of Al Qaeda. They say he is using a “perfectly reasonable catchall phrase” that, although it may be out of place in an academic setting, is acceptable on the campaign trail, a place that “does not lend itself to long-winded explanations of what we really are facing,” said Kenneth M. Pollack, research director at the Saban Center for Middle East Policy at the Brookings Institution.
Right… because “Al Qaeda” is a term of mainly academic usage, unknown to the greater public and certainly not relevant in a policy context. Indeed, I’m inclined to think that in issues of war, peace, life, and death, we should be extra cautious with our definitions. But then I guess I’m not serious.
Also see Matt.
I just realized that I, um, passed over writing a post telling you all that it’s Passover and I’m out of commission for the weekend. Having eaten enough matzo stuffing tonight to sink a ship (yes, it’s that heavy), I am calling it a night. My posting will resume as normal on Monday.
Happy Passover to those of you celebrating. To the rest of you…happy bread eating.
Late last month, as students returned from spring break, the University of Chicago Law School announced that Internet access would be blocked from classrooms. While individual professors at law schools have created policies banning laptops or allowing them only for specific uses — and while some colleges don’t even have classroom Internet access, or mandate classroom-only use without any enforcement — the move by Chicago appears to be the first institution-wide directive of its kind. Already, there’s been an uproar among students and even senior administrators, while some law professors have stepped up to defend the policy.
Over the last couple of years, I’ve toyed with the idea of prohibiting laptops in my classes; it’s no mystery that nine out of ten laptop users are (by my scientific calculations) instant-messaging, checking e-mail, or surfing the intertubes for hard core man-on-box-turtle pornography. I didn’t fall off the turnip truck just yesterday. I’ve survived my share of faculty senate meetings by watching baseball games on my laptop, and I’ve even thrown up a couple of blog posts in medias tedium. So when students are softly chuckling to themselves, it’s a good bet that they aren’t actually listening to that part of my lecture that covers the Ludlow Massacre; conversely, when they’re typing feverishly I tend to assume they aren’t trying to document my 5-minute tangents on Jimmy Carter’s fight with a rabbit, or the history of early 19th century flatware, or my rundown of the greatest facial hair in American political history.
But in the spirit of fairness, I figure that until I’m actually motivated to ask students what the fuck they’re doing while I’m talking about Chester A. Arthur’s mustache, I can only get so irritated with them. Besides, I’ve never actually seen any evidence that laptop-users do worse in my courses than the folks who stare blankly into space or squander class time drawing pictures of their favorite Star Wars characters. I’ve even allowed students to do beading or knitting in class, on the theory that “kinetic learning” isn’t simply a load of crap invented by people who prefer not to pay attention to what teachers are saying.
Moreover, every now and then we actually need to look up some important piece of historical data — such as the date when marshmallows were invented — and we really need the Google-ator.
But there’s another problem with the opening sentence of the Dowd column. “I’m not bitter.” Oh Maureen — who the hell do you think you’re kidding? The woman positively soaks in bitterness. Marinates in it. It oozes out of her pen and pours into just about every damn word she writes. Her bitterness has utterly corroded her soul. It’s turned her into a twisted freak whose chief pleasure in life seems to lie in vicious, barking-mad attacks on the only people capable of ending our long national nightmare — the Democrats. Seriously, if there is any other single person in the media who’s been a more powerful enabler of Republican high crimes and misdemeanors than MoDo, I don’t know who it is.
There’s always been a weirdly gendered quality to Dowd’s bitterness. The main, and indeed often the only, point of nearly every column she writes is that male Democrats are girly men and female Democrats are castrating bee-yotches. It’s antifeminist, to be sure, but it goes waaaay beyond that into some warped, dark psychosexual realm of its own. Somerby calls her a “gender nut,” which is as good a term for it as any, I suppose.
Make sure to keep reading for the funny setup-with-MoDo anecdote, which will hopefully give pause to people who accept assertions that the media is obsessed with Bill Clinton’s penis because it’s what the public demands…
From August through the following July, there is a steady decline in the likelihood that a child born in the United States will become a major leaguer. Meanwhile, among players born outside the 50 states, there are some hints of a pattern but nothing significant enough to reach any conclusions. An analysis of the birth dates of players in baseball’s minor leagues between 1984 and 2000 finds similar patterns, with American-born players far more likely to have been born in August than July. The birth-month pattern among Latin American minor leaguers is very different—if anything, they’re more likely to be born toward the end of the year, in October, November, and December.
The magical date of Aug. 1 gives a strong hint as to the explanation for this phenomenon. For more than 55 years, July 31 has been the age-cutoff date used by virtually all nonschool-affiliated baseball leagues in the United States. Youth baseball organizations including Little League, Cal Ripken/Babe Ruth, PONY, Dixie Youth, Hap Dumont, Dizzy Dean, American Legion, and more have long used that date to determine which players are eligible for which levels of play. (There is no such commonly used cutoff date in Latin America.) The result: In almost every American youth league, the oldest players are the ones born in August, and the youngest are those with July birthdays. For example, someone born on July 31, 1990, would almost certainly have been the youngest player on his youth team in 2001, his first year playing in the 11-and-12-year-olds league, and of average age in 2002, his second year in the same league. Someone born on Aug. 1, 1989, by contrast, would have been of average age in 2001, his first year playing in the 11-and-12-year-olds division, and would almost certainly be the oldest player in the league in 2002.
The older players are slightly better, receiving more attention from coaches and more encouragement, and are thus more likely to stay in baseball.
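For the arithmetically inclined, the cutoff logic in the excerpt above is easy to check yourself. Here’s a quick sketch (mine, not Slate’s) that computes a player’s “league age” — his age as of the old July 31 cutoff — for the two hypothetical boys in the example:

```python
from datetime import date

def league_age(birthdate: date, season_year: int) -> int:
    """Age as of July 31 of the season year -- under the old cutoff,
    this is the age that determines which level a player plays at."""
    cutoff = date(season_year, 7, 31)
    age = cutoff.year - birthdate.year
    # Born after the cutoff date? Then the birthday hasn't happened yet
    # in the league year, so knock a year off.
    if (birthdate.month, birthdate.day) > (cutoff.month, cutoff.day):
        age -= 1
    return age

# The July 31, 1990 kid and the Aug. 1, 1989 kid from the example:
print(league_age(date(1990, 7, 31), 2001))  # 11 -- youngest possible 11-year-old
print(league_age(date(1989, 8, 1), 2001))   # also 11 -- but nearly a year older
print(league_age(date(1989, 8, 1), 2002))   # 12 -- oldest player in the league
```

Both boys count as 11-year-olds in 2001, even though one is 364 days older than the other — which is the whole relative-age effect in a nutshell.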
I am forced to revisit my own Little League career, in which I played center field (yes!) for the Coast League Royals in Rancho Cordova, California. Another young man from my class played in the same league, but he was much, much better than I was; even at the age of nine, he could hit, catch, and run for more than a minute without getting tired. That young man’s name was Geoff Jenkins, and I’ve always wanted to find some arbitrary reason why he has a Major League career and I don’t. Now, it turns out that Geoff was born on July 21, which would seem to put him on the wrong side of the line, but I am almost certain that I remember that he started playing a year earlier than I did.
Did connections allow Mr. Jenkins to start Little League early? Were there bribes? I think that an investigation is in order, and I’m certain that I somehow deserve a portion of the $42 million that Geoff has thus far earned in his career.