Now, enjoy this dissing of America’s most ridiculously pompous blowhard, and also Donald Trump.
Ugh, the new Matthew Barney movie looks fucking terrible. pic.twitter.com/ln8265ZWuj
— David Roth (@david_j_roth) June 24, 2016
Running a marathon. Why the hell would somebody do that to themselves?
Indeed a vast, disturbing literature has now accumulated on the ill effects of running marathons. Studies find that up to 1 in 12 participants end up seeking medical help during the race. (At hot-weather events, runners can end up “dropping like flies.”) As many as four-fifths report having gastrointestinal problems such as bloating, cramps, vomiting, diarrhea, and fecal incontinence while on the course. Some runners suffer from blood poisoning. Others must endure a blitz of dermatological conditions: sore nipples (affecting up to 1 in 6 on race day); chafing (another 1 in 6); blisters (1 in 3); and jogger’s toe (1 in 40). Given all the risks, it’s no wonder that some marathon organizers have asked doctors to embed as race participants so they can quickly tend to runners who collapse.
When researchers consider all the injuries that accrue during the period of training—and not just on the day of the marathon itself—they find even greater cause for alarm. One study looked at 255 participants in an extended, 32-week marathon training program and found that 90 of them—that’s 35 percent—experienced “overuse” injuries. (Among the most common training ailments are anterior knee pain, Achilles tendinitis, shin splints, and stress fractures.) Another research group surveyed 725 men who raced in the 2005 Rotterdam Marathon, and found that more than half of them had sustained a running injury over the course of the year. Among those who sustained a new injury during the month leading up to the race, one-quarter were still suffering, to some extent, three months later.
Deaths do occur during the marathon, but I’m glad to say they’re very, very rare. Most runners’ ailments will be temporary; then again, most runners won’t have any benefits to weigh against those modest costs. Even if they don’t ruin their knees, twist their ankles, or bang their toes while training, their weekly hobby won’t do much to help their health. Marathoners fail to lose weight, as a rule, and while aerobic exercise may be good for the heart, doing a huge amount of aerobic exercise brings at best diminishing returns.
The sport isn’t merely dangerous; it’s extravagant. It costs more than $250 just to enter the New York City Marathon and to have the chance to chafe your nipples alongside 50,000 other people. Meanwhile, humanity’s oldest form of exercise has spawned a multibillion-dollar industry in footwear. Even efforts to pare down the sport to fundamentals have been absorbed into this marketing, such that there now exists a set of high-priced products known, improbably enough, as “barefoot running shoes.”
I get the feeling that marathoners think of themselves as gritty, motivated types, who would rather train and get things done than sit around watching videos on Facebook. Indeed, they’ll often note the fact of their accomplishment (we might think of this as “showing off”) on social media. For them, the pursuit of running 26 miles may have less to do with any functional reward than with merely having gone through the training in the first place. It’s an exercise of will, not one of purpose; the marathoner views achievement as a virtue of its own—like climbing Everest because it’s there.
It’s telling that this monomania gets rewarded—every single time, with cheering crowds and Facebook likes—despite its lack of substance. (At least Everest has a view!) I guess the form itself excites us: We’re so starved for ways to show self-discipline, and to regiment our time, that any goal will do, even one so imbecilic as the marathon. This only calls attention to the wasted opportunity: If we want to celebrate the act of building up to something hard—if we’re ready to devote ourselves, for at least 100 hours, to regimented training—then we should strive for something better. Instead of spending all that time purely for the sake of having spent it, let’s pursue a goal that has some meaning in itself.
I can think of no form of “leisure” less appealing than running 26 miles. Except for those crazy ultramarathon bastards running 100 miles or whatever while they figure out where to poop.
Opinions may differ.
Growth in median compensation may have slowed lately, or even fallen for some of the highest-paid chief executives. But this is little recompense for workers who have seen their wages stagnate or fall for decades.
Last year, the average chief executive of an S&P 500 company was paid 335 times more than the average nonsupervisory worker, according to the AFL-CIO’s useful interactive site, Executive Paywatch.
This stunning disparity has been the norm since the 1990s, but it wasn’t always this way. In 1965, the average CEO made 20 times the pay of the average worker; it was around 34-to-1 in 1980. By 1998, it was nearly 322-to-1.
What to do about it is fairly obvious. Tax the living heck out of them:
One measure would be returning to the progressive taxation system that operated from the 1940s until 1981, with a top marginal rate of, say, 70 percent as opposed to today’s 39.6 percent.
Another is to eliminate the stock-option loophole, which helps subsidize high compensation. (It allows companies to deduct the market value of the options, even though they are not a real expense, thus lowering their taxes. This arguably encourages companies to grant even more options in big comp packages.) According to a report from Citizens for Tax Justice, 315 big companies have used this to avoid $64.6 billion in taxes over the past five years.
Corporate tax rates could be set higher for firms with high CEO-to-worker pay differentials. Say-on-pay could be made mandatory rather than advisory. Public companies could be required to separate the chairman and chief-executive jobs. And unionization could be made easier, giving workers greater bargaining power.
Unionization certainly does need to be made easier, although I’m not sure that it would really make a difference unless unions start bargaining over peak executive pay. I’d love that, but it seems that tax, tax, tax is the answer, in a variety of ways. Time to reclaim that money for the public good.
Will hidden sexism hurt Hillary Clinton this fall? Quite possibly, although I’d like to see more than yet another story on Real American Voters, i.e., working-class white men from the Rust Belt, to suggest it may be so.
The great Ralph Stanley died last night at the age of 89. Stanley was the last major living figure of the early bluegrass era. He began recording with his brother Carter in 1947. They never had major financial success; really, only Bill Monroe and Flatt & Scruggs did. They were a great band, pretty squarely within the emerging bluegrass tradition.

But when Carter died in 1966, Ralph took his music back in time a bit. He always thought of himself as an old-time singer and banjo player, not a bluegrass musician. And that’s accurate. Bluegrass quickly developed into something pretty slick, with fancy instrumentals and a certain sense of virtuosity. Monroe developed the music by taking old-time and combining it with jazz, pop, and country music. While Stanley never completely rejected that, he emphasized the old-time Appalachian music much more. This led to some really outstanding music in the years after Carter’s death.

I want to point out a few starting points for Dr. Ralph’s discography (he received an honorary Ph.D. from Lincoln Memorial University in Harrogate, Tennessee). His 1969 album, Hills of Home, is an outstanding entry point. While I don’t care about the subject matter, the 1972 gospel album, Cry from the Cross, is probably the best bluegrass gospel album ever recorded. During these years, he mentored a number of young Appalachian singers, including Roy Lee Centers, Keith Whitley, and Ricky Skaggs. The last two of course became stars after switching to country music, while Centers was senselessly murdered. My collection of Stanley is this 2-CD set from these years, including live performances from all three. Really amazing stuff.
I saw Ralph Stanley perform twice. The first was in about 1998 at the Tennessee Theater in Knoxville. By this point, he was singing with his son, Ralph II. His son doesn’t have a good bluegrass lead voice. Good enough for country music, but not good enough for that style. So it wasn’t like seeing him in the 1970s, but it was a ton of fun nonetheless, especially in front of a crowd that cared deeply about that style of music. I saw him again in about 2002 in Albuquerque. By this time, his late-career revival thanks to O Brother, Where Art Thou? had kicked in. He played to a packed house, played “Man of Constant Sorrow” like 3 times, and during the set break, shook every hand and sold every piece of merchandise he could. An old man now, he was going to cash in while he could. And who could blame him, given his long struggle to be financially successful, even if this meant the set break was a full hour.
Rest in Peace, Ralph. You were a true giant of American music. A few sample cuts:
And since this is a political blog, let’s not forget his endorsement of Barack Obama in 2008.
I suspect we’re going to see a lot of this kind of analysis:
Britain’s stunning vote to leave the European Union suggests that we’ve been seriously underestimating Donald Trump’s ability to win the presidential election.
When you consider all his controversies and self-inflicted wounds over the past month, combined with how much he’s getting outspent on the airwaves in the battleground states, it is actually quite surprising that Trump and Hillary Clinton are so close in the polls. He’s holding his own, especially in the Rust Belt.
Does this make sense? Not really:
I was going to cover this ridiculous piece of crap, I swear, but beloved dumb jerk Roy Edroso beat me to it. Having just moved into a new house and being on entertain-the-child duty pretty much 24/7 makes blogging difficult, so I suppose I should thank him. But, dang, this kind of aimless stupidity is really in my wheelhouse.
Anyway, it’s about how too many boobies make men stop right in the middle of inventing their time machines so they can whack it. Now, you may be thinking “But can’t women invent things if the men are too busy choking the chicken?” To which I say “WHAT ARE YOU SOME KIND OF BOOBY-SHOWING FEMINIST?!! WOMEN. DON’T. INVENT. THINGS.” Geez.
I don’t want to fill up the cyberpages of LGM with sordid academic squabbles, but I also don’t want to let Steve Diamond quote me in a fraudulent way without making a record of it. Posted as a comment on Taxprof:
Stephen Diamond is a very dishonest man. Diamond does not link to my LGM post he quotes. This is not merely a matter of netiquette, because he quotes me in a way that intentionally hides the fact that my major criticism of him has nothing to do with his quibble regarding the minor point I made in the part of the post he does quote. He is intentionally misquoting me, and in such an egregious way that his behavior is a form of academic fraud. (ETA: Warren Terra in comments suggests that the phrase “academic fraud” shouldn’t apply to this context — a blog post — even if Diamond is behaving in a way that would be academic fraud in a more formal context. I’m of two minds about this).
Here’s what I wrote:
Let’s go to the numbers. Diamond cites Bureau of Labor Statistics occupational employment stats for his claim that incomes for lawyers “have increased steadily for at least two decades.” That’s a very misleading statement, for two reasons, one relatively minor, and the other not minor at all. The relatively minor reason is that, adjusted for inflation, median salaries (a crucial term, as we’ll see shortly) for lawyers have been essentially flat since the mid-1990s, which is as far back as the BLS stats go: adjusted for inflation, the median salary for lawyers has increased by less than 5%, from $110,000 to $115,000. That’s approximately half the wage growth experienced by the average American worker over the past two decades — which, needless to say, have hardly comprised a banner era for American workers in general.
Note that Diamond removes the bolded portion of the paragraph, for reasons that will soon be painfully evident.
Diamond’s complaint is that I compared growth in median lawyer salaries with growth in mean worker salaries. That is a fair point as far as it goes, but it doesn’t go very far: median salaries of all workers still increased more in percentage terms than median salaries of lawyers, and in any case this is all a distraction from my main initial point, which is that the earnings of salaried lawyers have been, as I said, essentially flat. (Diamond claims that a five percent cumulative growth rate in salaries over 17 years means salaried lawyers are staying “comfortably ahead” of inflation. Over this same time frame, Diamond’s employer increased sticker tuition for Diamond’s students by 60% in constant dollars. I wonder if Diamond’s students think that raising tuition 12 times faster than the growth rate in lawyer salaries constitutes a “comfortable” rate of growth for the cost of a Santa Clara law degree?).
But as I said in the original post, my initial point was a minor one, because Diamond’s claims about lawyer earnings are actually far more misleading. My main point, which Diamond hides from any readers he might have by distorting my text via elision, was this:
But, misleading as that part of Diamond’s statement is in context, that’s a minor point in comparison to another one, which is that the BLS wage statistics Diamond cites don’t include self-employed workers. How important is this omission when calculating the actual compensation of lawyers? (Let alone law school graduates, which is a very different category).
Consider that 75% of American lawyers are in private practice, and the large majority of those people are self-employed, either as individuals or in partnerships, meaning that they’re not salaried or hourly workers, and thus not included in the BLS wage stats. Diamond is aware of this, and thinks it means lawyers are making even more money than the BLS stats suggest:
Now, these numbers are “employed” lawyers so they do not include solo practitioners or partners who qualify as employers. But the first number is relatively small, approximately 4% on average of all practicing lawyers over that time period. And the second number is likely to skew income higher not lower, so excluding that number does not help the critics case that much. Arguably solos do less well financially (though we don’t know for sure based on the BLS data) so perhaps they cancel each other out.
Factor in higher paid partners and [it’s] likely they [lawyers] have stayed comfortably ahead of inflation.
Steve Diamond, a man who pontificates regularly on the economic status of lawyers, thinks that 4% of practicing lawyers are in solo practice. He produces this estimate by citing NALP data on the employment status of law graduates nine months after graduation. But many lawyers — perhaps most — graduated from law school more than nine months ago. How many of them are in solo practice? According to the ABA, the answer is roughly two out of every five, i.e., approximately ten times as many as the learned professor estimated. And what’s happened to their wages?
Fortunately, we don’t have to guess: the mean earnings (the median is certainly much lower) of solo practitioners have declined by 30% in real terms over the past 25 years, from $71,000 to $49,000 per year, inflation-adjusted.
In other words, if we combine the BLS data on median lawyer salaries with tax data on the earnings of self-employed lawyers, we find that the median real compensation for lawyers – again, not law school graduates, but actual employed lawyers — is surely a good deal lower than it was a generation ago.
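The decline in solo-practitioner earnings cited above is easy to verify. A quick sanity check (only the two dollar figures come from the post; the snippet is plain arithmetic):

```python
# Sanity check of the solo-practitioner earnings figures quoted above:
# inflation-adjusted mean earnings fell from $71,000 to $49,000 over 25 years.
then_mean, now_mean = 71_000, 49_000
decline = (then_mean - now_mean) / then_mean * 100
print(f"Real decline in mean solo earnings: {decline:.0f}%")
```

This works out to about 31 percent, which the post rounds to 30.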
As is evident if one actually reads it, the main point of my post, as I emphasized at the time, was that, contrary to the assertions of Diamond and Michael Simkovic, the median earnings of lawyers (not just salaried lawyers) have decreased over the period covered by BLS data, because almost half of all lawyers in private practice are solos, and their income has decreased markedly over this period.
Looking at the actual post would also reveal to readers that Diamond’s analysis of lawyer income was so radically off the mark because, absurdly, he used the employment status of new law graduates to estimate how many lawyers are solos: a figure he underestimated so badly that the real number is 825% higher.
Imagine if Donald Trump claimed that affirmative action was destroying the career prospects of white men in America, and cited the “fact” that in America today only 4% of 25-to-29-year-old white men have college degrees. If it were then pointed out to him that the real figure is 37%, would you expect him to give up on his pretensions of being an expert on the subject, and slink away quietly?
Of course not: what you would expect would be for Trump to then misquote his critics, while throwing rhetorical dust in the air and brazenly ignoring the fact that he had been exposed as someone who has no idea what he is talking about. But at least Santa Clara law students aren’t paying $75,000 per year for the privilege of having that particular lying blowhard spout ignorant nonsense at them.
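The percentage arithmetic behind the “825% higher” figure works the same way in the lawyer numbers and in the hypothetical college-degree analogy above (only the 4% and 37% figures come from the text):

```python
# How far off is an estimate of 4% when the real figure is 37%?
claimed, actual = 4.0, 37.0
understatement = (actual - claimed) / claimed * 100
print(f"The real figure is {understatement:.0f}% higher than the estimate.")
# -> The real figure is 825% higher than the estimate.
```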
So, that didn’t work, on several levels, one of which is illustrated by the photo above. That’s LGM’s Senior Correspondent on British Politics holding the sign. While British politics suddenly became a hell of a lot more interesting, it’s also significantly less important.
After working GOTV all day yesterday, I skipped the count (where I was due to be a verification agent in Plymouth, but I was too spent to stay up until the declaration, which happened around 4AM), and fell asleep by 10PM to Radio 4. One of the last tweets I read before sleeping said that it would take a polling failure “worse” than 2015 or 1992 for leave to win (I don’t recall by whom, but it was either a pollster or an academic psephologist). Both Boris Johnson and Nigel Farage were reportedly pessimistic; Farage would later (around 4AM) say on R4 that he didn’t expect to win. I fell asleep easily.
I woke around 2:30AM with R4 still on, to several gloomy, dire texts. The first authorities had been called, and they were not going as expected for a 50/50 national result, per this guide to expectations as the night progressed. If Sunderland had come in at leave +6%, we’d have been roughly at 50/50 nationally.
Sunderland came in at leave +12%.
I’m somewhat relieved that I didn’t go to the count in Plymouth. While leave was going to win in Plymouth under even the rosiest scenarios, the final result here was Leave 79,997, Remain 53,458 on 71% turnout. That’s a 59.9% leave vote in Plymouth.
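Deriving the 59.9% leave share from the raw Plymouth totals above (only the two vote counts come from the text):

```python
# Plymouth referendum result as reported above.
leave, remain = 79_997, 53_458
share = 100 * leave / (leave + remain)
print(f"Leave share: {share:.1f}% on {leave + remain:,} valid votes")
# -> Leave share: 59.9% on 133,455 valid votes
```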
It’s difficult to say what happened. Turnout was a mixed bag; reports from Scotland indicate it was lower than expected, and significantly lower than in the 2014 independence referendum. It’s likewise possible that the alternative-hypothesis hedge from my post yesterday morning was more accurate than my working hypothesis: that the increase in registration and the relatively high turnout nationwide (72%, higher than in any UK-wide election since 1992) was driven more by lower socio-economic classes, normally relatively inactive electorally, being mobilised by the referendum.
I’m not sure we can call this a polling failure, given the polls were all over the place. Clearly, however, online polls did better than telephone polls, for which I strongly suspect social-desirability bias was the cause (in Brit-speak, “shy Brexiters”). NCP did not do nearly as well as in the 2015 general election. As for myself, at least I was consistent: shit at the 2015 general and equally shit at the referendum. In my defense, I only missed the final result by 4%. Which is no defense at all.
But then, the bookies got it very wrong, as did the markets.
I don’t see how David Cameron can continue as Conservative Party leader or PM. Rumors abound right now of course, including Michael Gove and Boris Johnson in a huddle about Cameron’s future. Likewise, there’s now a real possibility of a snap election before Christmas. This would require getting around the Fixed-term Parliaments Act 2011 (an early election needs a two-thirds Commons vote), but that won’t be a problem. That said, the Labour Party are not ready for it.
Martin McGuinness has called for a poll in Northern Ireland to choose between a united Ireland and remaining in the UK. The SNP see Scotland’s future as “part of the European Union”. It doesn’t take a genius to get the hint. While in 2014 I was solidly opposed to Scottish independence, one of the key arguments for remaining with the United Kingdom was that it presented the easiest and safest route to EU membership. It can be argued that England and Wales did not hold up their end of the bargain.
As for why 52% of voters chose to leave the European Union, this dovetails neatly with a class that I teach here on the effects of globalization on domestic politics. Yes, part of it was racism, xenophobia, and nationalism. But I don’t believe that 52% of the British (and Irish) electorate are those things; those who are simply speak the loudest. To quote my friend, Cllr. Bill Stevens (Labour): “It means England (especially the poorer areas) have felt ignored and saying the only reason they voted that way was due to hate, nationalism, racism etc. will make it worse.”
I don’t have the link, but a couple of days before the referendum, Michael Gove was confronted with the question of market reaction, and he promised (in the glib manner in which the Leave campaign responded to any critique or bothersome fact) that we would wake up on Friday morning and there would be no crash.
Alas, here’s the reaction of the markets:
Now we know how to view Corey Lewandowski’s interview with CNN’s Dana Bash on Monday after he was fired from the campaign of Donald Trump: As an audition.
Following that interview with Bash, Lewandowski went into a meeting with CNN executives. He’s due to make his debut appearance as a CNN political commentator next Monday on the morning program “New Day.”
Who says this presidential campaign thing is a racket?
The strongest parts of Kennedy’s opinion dealt with Fisher’s contention that UT’s attempts to increase diversity on campus can and should be done exclusively through formally race-neutral measures. First of all, UT presented “significant evidence, both statistical and anecdotal,” that race-neutral measures are inadequate to create a sufficiently diverse campus. Given the compelling interest the state has in such diversity, it must be allowed to experiment, he said.
Even more important from a constitutional (if not a policy) perspective, Kennedy correctly argued that calling the Top 10 Percent system “race-neutral” is disingenuous. Kennedy’s opinion quoted Justice Ginsburg’s observation that such plans are “adopted with racially segregated neighborhoods and schools front and center stage,” and that “[i]t is race consciousness, not blindness to race, that drives such plans.” The Top 10 Percent plan was designed to increase racial diversity. Indeed, unless someone has a secret plan to immediately end endemic de facto school and neighborhood segregation, it is bound to have this effect in practice, since it will guarantee college placement for students in predominantly minority schools.
Kennedy’s increasing impatience with the blind formalism of his Republican-appointed colleagues helps to explain why he finally decided to uphold an affirmative action program. In the 2007 case Parents Involved in Community Schools v. Seattle School District No. 1, Chief Justice John Roberts held that race could not be used even as a tiebreaker to choose between equally qualified applicants, based on the fatuous tautology that “[t]he way to stop discrimination on the basis of race is to stop discriminating on the basis of race.” Given a background of extensive historical de jure discrimination and persistent de facto segregation, however, the idea that simply ignoring race is sufficient to address racism is an obvious fiction.
Justice Kennedy regrettably joined Roberts’s judgment, but wrote a lengthy, somewhat tortured concurrence in which he refused to endorse the implied conclusion that affirmative action programs are never constitutionally permissible. I will admit to being skeptical that the distinctions drawn by Kennedy in that case would ever make a difference, but today they did.
Justice Alito, conversely, was entirely unsurprising:
There is no small irony in the fact that Justice Samuel Alito, who wrote a long dissent to Kennedy’s opinion, is an alumnus of the Concerned Alumni of Princeton, a group that opposed the gender and racial integration of the university. He is hardly the first conservative to opportunistically discover the value of “colorblindness” once its effects were to make campuses more, rather than less, white.
Alito’s opinion attempts a number of would-be “gotchas” that aren’t very convincing. At one point, he complains that “UT also offers courses in subjects that are likely to have special appeal to members of the minority groups given preferential treatment under its challenged plan.” I confess it is not obvious to me how offering, for example, a Black Studies program and giving students substantial freedom to choose courses demonstrates that the University of Texas is not really committed to racial diversity. Alito also chides UT for using SAT scores, which have “often been accused of reflecting racial and cultural bias.” But, of course, the correlation between SAT scores and race and socioeconomic status is exactly the kind of factor UT’s holistic admissions evaluations—which Alito considers unconstitutional—are designed to take into account. UT is constitutionally permitted to ignore SAT scores, but it should also be constitutionally permitted to consider them while placing them in the proper context.
Alito’s complaints about colleges offering courses to appeal to minority groups and “give[ing] undergraduates a very large measure of freedom to choose their classes” were telling. I must admit I was not aware that the 14th Amendment enacted Mr. Allan Bloom’s The Closing of the American Mind…