Author Page for Paul Campos
Paul Krugman points out yet again why, as the annual deficit continues to shrink, “deficit hawks” remain undeterred by the spectacular inaccuracy of their predictions:
But what about people who pay a lot of attention to the budget, the self-proclaimed deficit hawks? (Some of us prefer to call them deficit scolds.) They’ve spent the past few years telling us that budget shortfalls are the most important issue facing the nation, that terrible things will happen unless we act to stem the flow of red ink. Are they expressing satisfaction over the fading of that threat?
Not a chance. Far from celebrating the deficit’s decline, the usual suspects — fiscal-scold think tanks, inside-the-Beltway pundits — seem annoyed by the news. It’s a “false victory,” they declare. “Trillion dollar deficits are coming back,” they warn. And they’re furious with President Obama for saying that it’s time to get past “mindless austerity” and “manufactured crises.” He’s declaring mission accomplished, they say, when he should be making another push for entitlement reform.
All of which demonstrates a truth that has been apparent for a while, if you have been paying close attention: Deficit scolds actually love big budget deficits, and hate it when those deficits get smaller. Why? Because fears of a fiscal crisis — fears that they feed assiduously — are their best hope of getting what they really want: big cuts in social programs. A few years ago they almost managed to bully the nation into cutting Social Security and/or raising the Medicare eligibility age; they even had hopes of turning Medicare into an underfinanced voucher program. Now that window of opportunity is closing fast.
A few days ago I noted that, despite the enormous growth of the American economy, median household income has barely increased over the past 40 years, and has actually declined among younger households. There is, however, one group (other than, of course, the upper class) whose real income has increased substantially over that time: the elderly.
Median household income for households headed by Americans 65 and older increased from $16,831 in 1967 to $35,611 in 2013 (both figures in 2013 dollars). In the late 1960s, a large majority of elderly Americans either lived in poverty or close to it. (The current poverty line for a two-person household is $15,730.) Today that bleak state of affairs has been altered drastically, largely if not exclusively as a consequence of Social Security and Medicare. These programs, born of the New Deal and the Great Society respectively, have been nothing less than fabulous successes, which is why they’re so popular.
Obviously both programs require some changes going forward, with Social Security needing some fairly modest tweaks to remain fully funded, and Medicare calling for more challenging reforms (the ACA is a good start in regard to the latter).
Progressives have been living in Nixonland for so long that it’s often easy to forget that most Americans actually like the results of Big Government (sic) just fine, at least as it’s manifested in our most expensive and important social programs.
It’s well known that having more educational credentials correlates strongly with higher income. This correlation has led lots of people to make the common sense assumption that increasing the educational credentials of the population as a whole will in turn produce higher incomes. Common sense assumes, as it so often does in a naive pre-theoretical way, that correlation equals causation.
At a more sophisticated theoretical level, the assumption at work here is that enhanced credentials signal enhanced human capital. In other words, more education (or in any case more educational credentials — a distinction which is usually ignored) creates or enhances abilities in its recipients they would not otherwise have, and these abilities allow them to perform work they would not otherwise be able to do.
If we then further assume that this work would not be performed, or at least not be performed as profitably, in the absence of the enhanced abilities signaled by the credentials, then enhanced human capital increases income by ameliorating structural un- and underemployment.
That’s why almost all of Tom Friedman’s conversations with garrulous cab drivers invariably end with him concluding that everybody needs to get an advanced degree in bio-mechanical statistics, because in a globalized flat world we can no longer afford for the average person to be average.
There is, however, a very different account of why more educational credentials correlate with higher income. In this alternate world, that correlation exists not, or at least not primarily, because the credentials signal that human capital has been enhanced, but rather because those credentials signal that their possessors have certain valuable preexisting abilities, and/or enjoy higher class status, relative to those without them. To the extent this alternative account is correct, educational credentials are positional goods, which have a realizable pecuniary value precisely to the extent that they are scarce. (I’m not going to address the non-pecuniary value of education here, other than to note again that the value, pecuniary or otherwise, of actual education is quite a different thing from the value of educational credentials.)
One way of testing these dueling theories is to look at what happens to incomes across time, when the percentage of a population that holds various credentials changes significantly. Of course any such comparison is going to be incomplete in all sorts of important ways. Still, it would seem that, all other things being equal, increasing educational credentials in a population should correlate strongly with increasing incomes in that population, if the human capital theory is valid.
Consider then the following (all dollar figures are expressed in constant 2013 dollars):
US GDP in 1973: $5,889,810,000,000
US GDP in 2013: $16,768,100,000,000
GDP per capita 1973: $27,790
GDP per capita 2013: $52,986
As I’ve noted before, it has been one of the curious features of cultural and political rhetoric in America for more than a generation now that even many highly educated (or in any event credentialed) people assume that the economy as a whole is stagnant, not growing, weak in comparison to the post-World War II boom times, etc. In fact, in terms of annual economic output alone (a figure which doesn’t include accumulated wealth), the country is $11 trillion richer than it was 40 years ago: real GDP has nearly tripled, and even after taking into account population growth, it has almost doubled.
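For readers who want to check that arithmetic, here is a minimal back-of-the-envelope sketch in Python, using nothing beyond the constant-2013-dollar figures listed above (the variable names are mine):

    # Growth factors implied by the GDP figures above (constant 2013 dollars).
    gdp_1973 = 5_889_810_000_000
    gdp_2013 = 16_768_100_000_000
    per_capita_1973 = 27_790
    per_capita_2013 = 52_986

    print(f"Real GDP: {gdp_2013 / gdp_1973:.2f}x")                           # ~2.85x, i.e. "nearly tripled"
    print(f"Real GDP per capita: {per_capita_2013 / per_capita_1973:.2f}x")  # ~1.91x, i.e. "almost doubled"
    print(f"Added annual output: ${(gdp_2013 - gdp_1973) / 1e12:.1f} trillion")  # ~$10.9 trillion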
During this same time, the educational credentials of the population have improved almost as dramatically as the nation’s measurable economic output. Per the enhanced human capital theory, we’re getting richer because we’re getting smarter, and all that’s necessary to extend this virtuous — or at least profitable — circle more or less indefinitely is for various forms of education to become increasingly universal, until we finally inhabit a Friedmanesque Lake Wobegon, in which all the cab drivers can quote Wittgenstein, while writing ever-more elaborate computer programs in their off hours.
A look at the latest census data on household income appears to tell a very different story. Consider the following cohorts:
(a) Households headed in 1973 by people 45-54 years of age.
(b) Households headed in 1973 by people 25-34 years of age.
(c) Households headed in 2013 by people 45-54 years of age.
(d) Households headed in 2013 by people 25-34 years of age.
What sorts of educational credentials did these different cohorts possess? Here, we’ll look at the two most crucial credentials for the purpose of a population-wide analysis: high school and college degrees.
Approximate percentage of 45-54 year old adults who possessed a high school diploma or more in 1973: 50
Approximate percentage of 45-54 year old adults who possessed a bachelor’s degree or more in 1973: 8
Approximate percentage of 25-34 year old adults who possessed a high school diploma or more in 1973: 70
Approximate percentage of 25-34 year old adults who possessed a bachelor’s degree or more in 1973: 19
Approximate percentage of 45-54 year old adults who possessed a high school diploma or more in 2013: 81
Approximate percentage of 45-54 year old adults who possessed a bachelor’s degree or more in 2013: 23
Approximate percentage of 25-34 year old adults who possessed a high school diploma or more in 2013: 83
Approximate percentage of 25-34 year old adults who possessed a bachelor’s degree or more in 2013: 30
Note that most discussions of improving educational attainment focus on increasing the percentage of college graduates. Yet if more education increases income by enhancing human capital, then increasing the percentage of high school graduates should have an even stronger effect. This is because any improvement in abilities due to more education ought to be subject to diminishing marginal returns. For example, someone who goes from running 10 miles per week to running 20 will see far more improvement in aerobic capacity per extra mile run than someone who moves from running 20 to running 30. If we assume the validity of the enhanced human capital theory of education, it would be very peculiar if someone who received 13 years of formal education rather than nine did not get a greater benefit in terms of the resultant enhancement of human capital from each extra year of education, in comparison to someone who received 17 rather than 13.
With these things in mind, let’s now look at the median household income (see Table H-10) for people in these demographic cohorts.
Median household income, 45-54 year olds, 1973: $65,988
Median household income, 25-34 year olds, 1973: $55,458
Median household income, 45-54 year olds, 2013: $67,141
Median household income, 25-34 year olds, 2013: $52,702
Despite the 60% increase in the prevalence of high school diplomas and the near tripling in the prevalence of college degrees that took place among middle-aged people between 1973 and 2013, median household income for this demographic group is practically identical to what it was 40 years ago. Meanwhile, despite impressive gains in educational credentialing relative to their demographic peers of four decades ago, 25-34 year olds have actually seen a decline in median household income between 1973 and 2013.
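A quick sketch of those comparisons, again assuming nothing beyond the census-derived figures listed above (the variable names are mine):

    # Credential growth vs. income change, 1973-2013, from the figures above.
    hs_45_54 = (50, 81)              # % with high school diploma or more, 1973 vs. 2013
    ba_45_54 = (8, 23)               # % with bachelor's degree or more, 1973 vs. 2013
    income_45_54 = (65_988, 67_141)  # median household income, 2013 dollars
    income_25_34 = (55_458, 52_702)

    print(f"HS diplomas (45-54): +{(hs_45_54[1] / hs_45_54[0] - 1) * 100:.0f}%")     # ~+62%: the "60% increase"
    print(f"BA degrees (45-54): {ba_45_54[1] / ba_45_54[0]:.1f}x")                   # ~2.9x: the "near tripling"
    print(f"Income (45-54): {(income_45_54[1] / income_45_54[0] - 1) * 100:+.1f}%")  # ~+1.7%
    print(f"Income (25-34): {(income_25_34[1] / income_25_34[0] - 1) * 100:+.1f}%")  # ~-5.0%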
Consider how extraordinary these figures are, given both the almost incomprehensible increase in the nation’s total wealth over the past four decades ($11 trillion more per year in economic output!) and the fact that, in the US population as a whole, college degrees are today as common as high school degrees were in the 1940s.
Note too that labor force participation is higher today than it was in the early 1970s (approximately 45% of women worked outside the home in 1973, compared to nearly 60% today), which suggests that the same, or in the case of 25-34 year olds lower, median household income now requires more hours of paid labor to produce than it did forty years ago.
It would be something of an understatement to say these statistics call into question the enhanced human capital theory of educational attainment. Instead, they are precisely what we would expect to find if educational credentialing is a positional good: one whose value must invariably deteriorate as it becomes less scarce. (Currently, somewhere between a fifth and a quarter of 25-34 year old college graduates are earning less than the median high school graduate of the same age.)
Meanwhile, consider what has happened to the cost of undergraduate education over this time frame (all figures in 2013 dollars):
Average Private Four-Year Non-profit College Tuition 1973: $10,783
Average Private Four-Year Non-profit College Tuition 2013: $30,094
Average Public Four-Year College Tuition 1973: $2,710
Average Public Four-Year College Tuition 2013: $8,893
(Interestingly, room and board charges have also risen quite a bit, although not as drastically: from $6,200 in 1973 to $11,800 in 2013 at private colleges, and slightly less at public schools.)
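Again, a minimal sketch of the real growth multiples implied by the tuition figures above:

    # Real (inflation-adjusted) tuition growth multiples, 1973-2013.
    private_tuition = (10_783, 30_094)  # private four-year non-profit, 2013 dollars
    public_tuition = (2_710, 8_893)     # public four-year, 2013 dollars

    print(f"Private four-year: {private_tuition[1] / private_tuition[0]:.1f}x")  # ~2.8x in real terms
    print(f"Public four-year: {public_tuition[1] / public_tuition[0]:.1f}x")     # ~3.3x in real terms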
All this in turn suggests that more than a generation’s worth of rhetoric regarding how we must inculcate the rest of society with upper-middle class mores in regard to the value of obtaining educational credentials has ultimately harmed efforts to combat increasing social and economic stratification.
Let me tell you about the very rich. They are different from you and me. They possess and enjoy early, and it does something to them, makes them soft where we are hard, and cynical where we are trustful, in a way that, unless you were born rich, it is very difficult to understand.
— F. Scott Fitzgerald, “The Rich Boy” (1926)
There’s a rich tradition in American culture of celebrating wealth and the possibility of achieving it. This tradition is built upon something of a paradox: the belief that, on the one hand, rich people deserve their economic and social status because they have always had the rare personal qualities that led to their acquisition of uncountable wealth, and on the other, that you, the purchaser of this book, or lecture series, or self-improvement DVDs, etc., can now acquire these rare personal qualities, through sheer discipline and effort (and with the help of a few, very reasonably priced, authorial tips).
The whole power-of-positive-thinking racket is based on ignoring the latent tension between these beliefs. The Gospel of Prosperity, The Millionaire Next Door, The Secret: it’s all the same grift in the end, and yet we the people never seem to tire of it. Consider this delightful specimen of the genre from Steve Siebold, author of, among other works, Problems in Kierkegaard and How Rich People Think.
The truth is successful people are confident because they repeatedly bet on themselves and are rarely disappointed. Even when they fail, they’re confident in their ability to learn from the loss and come back stronger and richer than ever.
This is not arrogance, but self-assuredness in its finest form. The wealthy have an elevated and fearless consciousness that keeps them moving toward what they want, as opposed to moving away from what they don’t want. This often doubles or triples their net worth quickly because of the new efficiency in their thinking. Eventually they begin to believe they can accomplish anything, and this becomes a self-fulfilling prophecy. As they move from success to success, they create a psychological tidal wave of momentum that gets stronger every day, catapulting their confidence to a level so high it is often interpreted as arrogance.
The ideological function of this sort of hokum is fairly clear. What’s less clear, perhaps, is what continues to make it so attractive, in a culture in which the increasingly vast differences in life circumstances between people born into different classes ought to make the concept of some sort of pseudo-Darwinian meritocracy increasingly implausible.
Oslo is dropping out of bidding for the 2022 Winter Olympics, leaving Almaty, Kazakhstan, and Beijing as the only remaining cities seeking to host the event. Why? One reason is that people are starting to realize that spending mega-money to build sporting venues that may never be used again doesn’t make economic sense. Another is that the International Olympic Committee is a notoriously ridiculous organization run by grifters and hereditary aristocrats. Norwegian citizens were particularly amused/outraged (amuseraged) by the IOC’s diva-like demands for luxury treatment during the hypothetical Games. Here’s a piece in the Norwegian media about the controversy, with translation provided by a generous Norwegian reader named Mats Silberg:
They demand to meet the king prior to the opening ceremony. Afterwards, there shall be a cocktail reception. Drinks shall be paid for by the Royal Palace or the local organizing committee.
Separate lanes, not to be used by regular people or public transportation, should be created on all roads where IOC members will travel.
A welcome greeting from the local Olympic boss and the hotel manager should be presented in IOC members’ rooms, along with fruit and cakes of the season. (Seasonal fruit in Oslo in February is a challenge…)
The hotel bar at their hotel should extend its hours “extra late” and the minibars must stock Coke products.
The IOC president shall be welcomed ceremoniously on the runway when he arrives.
The IOC members should have separate entrances and exits to and from the airport.
During the opening and closing ceremonies a fully stocked bar shall be available. During competition days, wine and beer will do at the stadium lounge.
IOC members shall be greeted with a smile when arriving at their hotel.
Meeting rooms shall be kept at exactly 20 degrees Celsius at all times.
The hot food offered in the lounges at venues should be replaced at regular intervals, as IOC members might “risk” having to eat several meals at the same lounge during the Olympics.
See the link for the IOC’s more-in-sorrow-than-in-anger response.
On Saturday, Michigan’s beleaguered football coach Brady Hoke decided to start sophomore Shane Morris at quarterback against Minnesota, over fifth-year senior and long-time starter Devin Gardner. In the first half, Morris was very ineffective against a weak team over which Michigan was favored by double digits, despite the Wolverines’ poor play this season.
Early in the third quarter, Morris injures his ankle. His play goes from ineffective to catastrophic, as the injury appears to grow progressively worse. By early in the fourth quarter, Morris’ mobility seems seriously compromised, yet Hoke makes no move to replace him with Gardner. With about 11 minutes left to go in the game, Morris is subjected to a vicious helmet-to-helmet cheap shot a full second after throwing yet another wild pass downfield.
(The key sequence starts at around 2:30 in the video).
The 80,000 or so remaining fans in the stands and a national TV audience see Morris wobble back toward the huddle, and then appear to be kept from collapsing to the turf by an offensive lineman, who props him up while other players in the huddle signal frantically to the bench, apparently in an effort to get Morris pulled from the game before he suffers yet more serious injuries to his brain. The coaching staff appears to ignore these gestures; in any case Morris runs another play. At this point Michigan’s offensive coordinator starts signaling to Morris to go down to the ground, probably to give the disorganized Michigan sideline enough time to finally put Gardner in the game without incurring a delay penalty.
In any case Gardner enters, and 90 seconds later (in real time) loses his helmet while scrambling. Under college rules he has to leave the game for at least one play unless Michigan uses a time out. Instead of using a time out, the staff tries to insert third string QB Russell Bellomy, but Bellomy can’t find his helmet. Someone then decides to send Morris back into the game instead of using a precious time out (Michigan trails by 23 at this juncture and the game is effectively over). Morris goes in, hands off, and then is replaced by Gardner again, who promptly leads the team down the field for a TD, incidentally producing more offensive effectiveness in one drive than Morris was able to generate all afternoon.
During all of this sequence much of the crowd has been booing loudly, in protest of the recklessness of keeping an obviously injured and probably concussed Morris in the game. Even the usually docile announcers on ESPN express something like outrage and disgust.
After the game, Brady Hoke is asked why he didn’t take Morris out, given the ample evidence that the sophomore QB, who celebrated his 20th birthday last month, had suffered a concussion. This was Hoke’s answer:
I don’t know if he had a concussion or not, I don’t know that. Shane’s a pretty competitive, tough kid. And Shane wanted to be the quarterback, and so, believe me, if he didn’t want to be he would’ve come to the sideline or stayed down.
This response helps fuel a firestorm of criticism, to the point where by Sunday evening the story is being reported in the national news media.
Meanwhile, at some point between the end of the game and Sunday (more on the timing of this below), Morris is officially diagnosed by the Michigan medical staff as having suffered a concussion. Remarkably, at his lunchtime press conference on Monday, Hoke appears not to be aware of this, even though:
(a) The team practiced on Sunday, and it’s standard for the coaching staff to receive injury reports from the trainers and medical staff after a game and prior to the next practice; and
(b) Hoke acknowledges speaking with Morris on both Sunday night and Monday morning, prior to the press conference.
Hoke says that as far as he knows Morris only suffered a high ankle sprain, and if not for that sprain he would have practiced on Sunday with the rest of the team. He also says he hasn’t spoken, at all, to Michigan’s athletic director Dave Brandon, at any time since the incident, even though the incident has now been a national news story for almost 24 hours, and Brandon normally reviews film of Saturday’s game with the coaching staff on Sunday morning.
Finally, at 1:30 this morning, Brandon (a multi-millionaire former CEO of Domino’s Pizza, former Michigan regent, and prospective GOP candidate for Michigan’s governorship) releases a statement admitting that “as of Sunday” Morris had been diagnosed as suffering what Brandon termed a “mild” concussion, and that Hoke’s apparent ignorance of this at the Monday press conference was due to a “mis-communication.”
Later this morning, Brian Cook and John Bacon, two journalists with various sources inside the Michigan AD, separately imply strongly that Brandon spent much of the time between Sunday and Tuesday morning trying to strong-arm the Michigan medical staff into covering up, or at least soft-pedaling, their diagnosis that Morris had suffered a concussion before he was sent back into the game.
All of which raises some obvious questions:

(1) When was Morris diagnosed with a concussion? Brandon’s middle-of-the-night statement is phrased in a suspiciously weasel-like way on this point, noting that “as of Sunday” Morris was determined to have been concussed. This phrase sounds loaded with truthiness: surely Morris would have been examined for a concussion immediately after the game by medical personnel (he was taken off the field and into the locker room on a cart), and if he was diagnosed on, as opposed to “as of,” Sunday, why not just say that?
(2) When precisely did Brandon find out Morris had suffered a concussion? Did he have any contact at all with his head football coach between that moment and the Monday press conference? If not, why not?
(3) Did Brandon, or anyone else associated with the athletic department, attempt to influence any aspect of the medical report regarding Morris’s injuries?
(4) Did Hoke attempt to contact anyone, either in the AD or among the medical personnel, about Morris’s condition prior to the press conference? If not, why not?
(5) What does this previous incident tell us about Hoke’s attitude toward his players?
Ball State reprimanded two coaches after a football player suffered frostbite during a disciplinary workout in subzero temperatures.
Ball State’s athletic director issued letters of reprimand to head coach Brady Hoke and football strength and conditioning coach Aaron Wellman [Wellman now holds the same position at Michigan] after the workout, associate athletics director Joe Hernandez said Friday.
Redshirt freshman receiver Chris Jackson suffered frostbite to several fingers during the 40-minute workout Jan. 31, Hernandez said. Jackson recovered following medical treatment and has returned to workouts.
During the Jan. 31 workout, Jackson and several teammates carried a 25-pound sandbag up and down steps at the school’s stadium, athletic director Bubba Cunningham said.
(6) How long is it going to take for the university’s president and regents to fire Brandon and Hoke?
. . . see also Jon Chait for more background on Brandon’s history of megalomania, and the perennial stupid/evil epistemological puzzle.
Update: Students march on the president’s house. The whole world is watching . . .
Four years ago, one of the best students I’ve had in 24 years of law teaching killed himself, a year to the day after graduating. This suicide, and what I eventually discovered about the events that led to it, played a key role in pushing me toward first educating myself regarding, and then trying to do something about, the law school crisis.
One thing I learned is that depression is apparently epidemic among both law students and lawyers. As I’ve written elsewhere:
(1) Law students are no more prone to depression than anyone else before starting law school. In the course of law school they develop both clinical and sub-clinical depression at extraordinarily high rates, so that by the time they are 3Ls they are roughly ten times more likely to be in these categories than they were prior to entering law school.
(2) Rates of depression among practicing attorneys are also very high. For instance, a 1990 Johns Hopkins study looked at depression in 104 occupational groups. Lawyers ranked first.
(3) These findings are remarkably consistent across studies, and have remained so for several decades.
(4) Although there is as yet little work on what effect recent changes in the legal profession are having on these outcomes, the primary environmental cause of depression appears to be stress, which suggests an already serious problem is likely to get worse.
Why are law students and lawyers so prone to develop depression? The literature suggests numerous causes, most of which have something to do with the effects of an intensely hierarchical, competitive, emotionally cold, and high-stress environment, in which people are socialized to obsess on external status markers and to minimize or ignore things such as learning for its own sake, doing intrinsically valuable work, and maintaining healthy personal relationships.
Depression is a mysterious disease, and for me that mystery was if anything deepened by recently reading William Styron’s Darkness Visible: A Memoir of Madness, his harrowing account of how an episode of deep depression took him to the brink of suicide. Styron’s account is both powerful and eloquent, but ultimately it left me with more questions than answers about this terrible illness. One very useful aspect of the book, for me, was that it conveyed what an inadequate and ultimately misleading word “depression” is to describe the phenomenon, at least in its more ferocious forms.
“Melancholia” would still appear to be a far more apt and evocative word for the blacker forms of the disorder, but it was usurped by a noun with a bland tonality and lacking any magisterial presence, used indifferently to describe an economic decline or a rut in the ground, a true wimp of a word for such a major illness. It may be that the scientist generally held responsible for its currency in modern times, a Johns Hopkins Medical School faculty member justly venerated — the Swiss-born psychiatrist Adolf Meyer — had a tin ear for the finer rhythms of English and therefore was unaware of the semantic damage he had inflicted by offering “depression” as a descriptive noun for such a dreadful and raging disease. Nonetheless, for over seventy-five years the word has slithered innocuously through the language like a slug, leaving little trace of its intrinsic malevolence and preventing, by its very insipidity, a general awareness of the horrible intensity of the disease when out of control.

As one who has suffered from the malady in extremis yet returned to tell the tale, I would lobby for a truly arresting designation. “Brainstorm,” for instance, has unfortunately been preempted to describe, somewhat jocularly, intellectual inspiration. But something along these lines is needed. Told that someone’s mood disorder has evolved into a storm — a veritable howling tempest in the brain, which is indeed what a clinical depression resembles like nothing else — even the uninformed layman might display sympathy rather than the standard reaction that “depression” evokes, something akin to “So what?” or “You’ll pull out of it” or “We all have bad days.” The phrase “nervous breakdown” seems to be on its way out, certainly deservedly so, owing to its insinuation of a vague spinelessness, but we still seem destined to be saddled with “depression” until a better, sturdier name is created.
For someone who, at least until now, has been lucky enough to ponder serious depression strictly from a distance, but who wants to understand it as best he can, Styron’s book was both of great value and a spur to try to learn more. I’d appreciate any suggestions commenters might have regarding other resources for helping to encourage a qualitative, as opposed to a merely statistical, understanding of this illness.
I’m personally observing erev Derek Jeter’s Farewell Game by re-reading Updike’s “Hub Fans Bid Kid Adieu.”
Understand that we were a crowd of rational people. We knew that a home run cannot be produced at will; the right pitch must be perfectly met and luck must ride with the ball. Three innings before, we had seen a brave effort fail. The air was soggy; the season was exhausted. Nevertheless, there will always lurk, around a corner in a pocket of our knowledge of the odds, an indefensible hope, and this was one of the times, which you now and then find in sports, when a density of expectation hangs in the air and plucks an event out of the future.
Fisher, after his unsettling wait, was wide with the first pitch. He put the second one over, and Williams swung mightily and missed. The crowd grunted, seeing that classic swing, so long and smooth and quick, exposed, naked in its failure. Fisher threw the third time, Williams swung again, and there it was. The ball climbed on a diagonal line into the vast volume of air over center field. From my angle, behind third base, the ball seemed less an object in flight than the tip of a towering, motionless construct, like the Eiffel Tower or the Tappan Zee Bridge. It was in the books while it was still in the sky. Brandt ran back to the deepest corner of the outfield grass; the ball descended beyond his reach and struck in the crotch where the bullpen met the wall, bounced chunkily, and, as far as I could see, vanished.
Like a feather caught in a vortex, Williams ran around the square of bases at the center of our beseeching screaming. He ran as he always ran out home runs—hurriedly, unsmiling, head down, as if our praise were a storm of rain to get out of. He didn’t tip his cap. Though we thumped, wept, and chanted “We want Ted” for minutes after he hid in the dugout, he did not come back. Our noise for some seconds passed beyond excitement into a kind of immense open anguish, a wailing, a cry to be saved. But immortality is nontransferable. The papers said that the other players, and even the umpires on the field, begged him to come out and acknowledge us in some way, but he never had and did not now. Gods do not answer letters.
This is the kind of moment that, much to the regret of ESPN et al., can’t be scripted: the combination of the spontaneously perfect athletic feat and the presence of the then-unknown great writer in the stands.
Updike’s essay also features a revelatory number: 10,454. That was the attendance at Ted Williams’ last home game of his career — and Updike even notes that he and most of the rest of the crowd were there primarily to see the greatest player of his generation one last time.
These are the good old days?
Chart from Piketty and Saez’s latest data:
Most LGM readers are familiar with Leiter’s history of cyber-harassment and sock puppetry, so it should come as no surprise that lots of people in the world of academic philosophy are fed up* with his increasingly bizarre bullying.
*The statement of support for Carrie Jenkins (which has now been signed by 149 colleagues and counting) has been temporarily moved to another site, because someone (“no one knows who” — Hyman Roth) lodged a complaint with Google, claiming that the original site violated Google’s terms of service (which apparently include an agreement not to criticize Brian Leiter). Edit: The complaint against the original site has failed. It is now up again.
I’ll just add a few notes to a record that pretty much speaks for itself:
(1) A remarkable number of the targets of Leiter’s cyber-bullying in the world of academic philosophy are women, especially considering the extent to which the field continues to be dominated by men. These two facts are probably connected in some mysterious way, which perhaps the tools made available to us by analytical philosophy could help unpack.
(2) Leiter apparently loves to try to silence critics in the philosophy world with threats of defamation suits. Amusingly, this illustrates the extent to which he longs to play pretend lawyer, although it would be irresponsible not to speculate regarding whether he could even file a motion without professional assistance. He also seems remarkably sensitive (this is a rhetorical phrase; there’s nothing remarkable about it) to claims that he’s not really a philosopher, since he doesn’t have a joint appointment in a philosophy department. All this reminds me of somebody or other’s remark to the effect that while formerly there were philosophers, today we must make do with professors of philosophy.
(3) Leiter’s current professional aspiration appears to be to end up as crazy as Nietzsche became, without the intermediate period of being an interesting thinker.
. . . if you have a strong stomach, check out the craven message Leiter sent to Carrie Jenkins, when he began to suspect that his latest vendetta wasn’t going to turn out well for him.
Update: The comment thread has dozens of excellent remarks; I wanted to highlight this one from Aimai, regarding why Leiter’s cyber-stalking of Carrie Jenkins is so invidious:
Her original post, which essentially celebrated her happy ascension to being a professor in a treasured field, was instantly stalked and trolled and attacked by a prominent professional in her field who put her on notice that nothing she wrote or published would happen without his eye falling on it, that whatever she wrote could be construed as legally actionable, that he would be watching her to make sure that she steered clear of the sin of ever impinging on his gaping wound of an ego. In other words: she’s minding her own business and an important, touchy, asshole turns out to be stalking her and turning her private and professional life into a legal cause of action.
In an instant she went from being a person celebrating and engaging with her field and her colleagues into, apparently, the enemy of a person with zero sense of proportionality and restraint–a person so narcissistic that they go out of their way to threaten legal action against a perfect stranger for a perfectly innocuous post that doesn’t reference Leiter at all.
Like all women she is instantly advised not to engage with her attacker/bully but to “ignore” him and to take actions (like filtering her emails) which might cause her to re-engage with him or provoke him. In other words she is to change her behavior in order to stop drawing his attention and if she finds that difficult to do–like “remembering to forget about the camel’s left knee” well, she’s no different than any other person who is told to continually steer around an obstacle while pretending the obstacle doesn’t exist.
And the proof that she needs to do that is in the second interaction when her innocuous tweet to a third party creates an opening that Leiter exploits to draw her back into an interaction and to imply that all her thoughts and writings and interactions exist only in reference to Leiter.
The guy is absolutely like a stalker and an ex: someone who forces an interaction onto you and then monitors you and your social media to make sure that he still matters to you.
. . . and a very nice summary from Nobdy:
Leiter appears to have lucked into power and influence just by doing something crass and simplistic that nobody thought to do before BECAUSE it was crass and simplistic but that gained an audience because even philosophers are apparently prone to wanting easy well-defined answers even if they are wrong.
At first blush he appears to be quite arrogant about this tiny accomplishment of being willing to oversimplify, but in reality it appears that he is aware of having accomplished nothing and is wracked by insecurity. His constant fretting about and threats re: his reputation reveal that he is terrified of being seen as the fraud he really is, and believes he must do everything in his power to control his image. It’s pathetic but one can’t be sympathetic because in his desperate increasingly unhinged scramble to hide the truth he does real damage to innocent parties.
Supposedly this is cinema verite a la Don Draper, done in one take:
It would be easy to demonize Peterson as an abuser, but the forthrightness with which he talked about using belts and switches but not extension cords, because he “remembers how it feels to get whooped with an extension cord,” as part of his modes of discipline suggests he is merely riffing on scripts handed down to him as an African-American man.
These cultures of violent punishment are ingrained within African-American communities. In fact, they are often considered marks of good parenting. In my childhood, parents who “thought their children were too good to be spanked” were looked upon with derision. I have heard everyone from preachers to comedians lament the passing of days when a child would do something wrong at a neighbor’s house, get spanked by that neighbor, and then come home and get spanked again for daring to misbehave at someone else’s house. For many that is a vision of a strong black community, in which children are so loved and cared for that everyone has a stake in making sure that those children turn out well, and “know how to act.” In other words, it is clear to me that Peterson views his willingness to engage in strong discipline as a mark of being a good father. . .
Stakes are high because parenting black children in a culture of white supremacy forces us to place too high a price on making sure our children are disciplined and well-behaved. I know that I personally place an extremely high value on children being respectful, well-behaved and submissive to authority figures. I’m fairly sure this isn’t a good thing.
If black folks are honest, many of us will admit to both internally and vocally balking at the very “free” ways that we have heard white children address their parents in public. Many a black person has seen a white child yelling at his or her parents, while the parents calmly respond, gently scold, ignore, attempt to soothe, or failing all else, look embarrassed.
I can never recount one time, ever seeing a black child yell at his or her mother in public. Never. It is almost unfathomable.
It has long been time for us to forgo violence as a disciplinary strategy. But as Charles Barkley notes, if we lock up Adrian Peterson, we could lock up every other black parent in the South for the same behavior. Instead, I hope Peterson is a cautionary tale, not about the state intruding on our “right” to discipline our children but rather a wakeup call about how much (fear of) state violence informs the way we discipline our children.
If the murder of Michael Brown has taught us nothing else, we should know by now that the U.S. nation-state often uses deadly violence both here and abroad as a primary mode of disciplining people with black and brown bodies. Darren Wilson used deadly force against Michael Brown as a mode of discipline (and a terroristic act) for Brown’s failure to comply with the request to walk on the sidewalk.
The loving intent and sincerity of our disciplinary strategies does not preclude them from being imbricated in these larger state-based ideas about how to compel black bodies to act in ways that are seen as non-menacing, unobtrusive and basically invisible. Many hope that by enacting these micro-level violences on black bodies, we can protect our children from macro and deadly forms of violence later.