Will hidden sexism hurt Hillary Clinton this fall? Quite possibly, although I’d like to see more than yet another story on Real American Voters, i.e., working-class white men from the Rust Belt, to suggest it may be so.
The great Ralph Stanley died last night at the age of 89. Stanley was the last major living figure of the early bluegrass era. He began recording with his brother Carter in 1947. They never had major financial success–really only Bill Monroe and Flatt & Scruggs did. They were a great band, pretty squarely within the emerging bluegrass tradition.

But when Carter died in 1966, Ralph took his music back in time a bit. He always thought of himself as an old-time singer and banjo player, not a bluegrass musician. And that’s accurate. Bluegrass quickly developed into something pretty slick, with fancy instrumentals and a certain sense of virtuosity. Monroe developed the music by taking old-time and combining it with jazz, pop, and country music. While Stanley never completely rejected that, he emphasized the old-time Appalachian music much more. This led to some really outstanding music in the years after Carter’s death.

I want to point out a few starting points for Dr. Ralph’s (he received an honorary Ph.D. from Lincoln Memorial University in Harrogate, Tennessee) discography. His 1969 album, Hills of Home, is an outstanding entry point. While I don’t care about the subject matter, the 1972 gospel album, Cry from the Cross, is probably the best bluegrass gospel album ever recorded. During these years, he mentored a number of young Appalachian singers, including Roy Lee Centers, Keith Whitley, and Ricky Skaggs. The last two of course became stars after switching to country music, while Centers was pointlessly murdered. My collection of Stanley is this 2-CD set from these years, including live performances from all three. Really amazing stuff.
I saw Ralph Stanley perform twice. The first was in about 1998 at the Tennessee Theater in Knoxville. By this point, he was singing with his son, Ralph II. His son doesn’t have a good bluegrass lead voice. Good enough for country music, but not good enough for that style. So it wasn’t like seeing him in the 1970s, but it was a ton of fun nonetheless, especially in front of a crowd that cared deeply about that style of music. I saw him again in about 2002 in Albuquerque. By this time, his late-career revival thanks to O Brother Where Art Thou had kicked in. He played to a packed house, played “Man of Constant Sorrow” like 3 times, and during the set break, shook every hand and sold every piece of merchandise he could. An old man now, he was going to cash in while he could. And who could blame him, given his long struggle to be financially successful, even if this meant the set break was a full hour.
Rest in Peace, Ralph. You were a true giant of American music. A few sample cuts:
And since this is a political blog, let’s not forget his endorsement of Barack Obama in 2008.
I suspect we’re going to see a lot of this kind of analysis:
Britain’s stunning vote to leave the European Union suggests that we’ve been seriously underestimating Donald Trump’s ability to win the presidential election.
When you consider all his controversies and self-inflicted wounds over the past month, combined with how much he’s getting outspent on the airwaves in the battleground states, it is actually quite surprising that Trump and Hillary Clinton are so close in the polls. He’s holding his own, especially in the Rust Belt.
Does this make sense? Not really:
- I mean, on one level it’s scary that Trump is within 6 points. But still, 6 points, in presidential election terms, is getting your ass kicked. And there’s no reason to think he has much upside potential.
- It might be possible for a formidable campaign organization to overperform the polls. But Trump has the opposite of that. Clinton’s dominance of the airwaves and superior organization is going to make it harder for Trump to overcome a substantial deficit and harder to get his supporters out.
- The argument against these facts seems to be something like “nobody expected Brexit to win, nobody expected Trump to win, but Brexit won, and Trump has already won once, so Trump can win twice.” But this doesn’t really make any sense. Unlike with Brexit, Trump took a commanding lead in the polls early on in the primaries; skeptics (like me) were ignoring the polls. I don’t think there’s any reason to believe there’s a large reservoir of untapped support for Trump that polls aren’t picking up.
- One major comparative advantage for Brexit is that none of the prominent assholes on its side were actually on the ballot. People who would never dream of voting for Nigel Farage or Boris Johnson in a national election could vote Brexit. Implicitly voting against Cameron didn’t require voting for someone you hate as much or more. If the question on the ballot in November was “do you want Hillary Clinton to be president?” I would be pretty worried. But it’s not. If Trump is going to win, he’s going to need a plurality of voters to affirmatively vote for him, and he’s a very well-known and widely despised figure heading a nationally unpopular party while barely running a presidential campaign at all.
- The United States is a much bigger and more diverse country, which really makes a big difference in terms of how a campaign based around mobilizing white resentment will play out. How is Trump going to win Florida, barely a white majority state? What’s his path to the Electoral College without it? (Hint: even if he can win Ohio and Pennsylvania, that’s not enough.)
- Brexit is helpful to Trump for one reason only: if it harms the American economy, it hurts the incumbent party. Will the effects on the American economy be enough to make a big difference? I doubt it, but that’s the only reason to worry about Brexit in terms of the American presidential election.
I was going to cover this ridiculous piece of crap, I swear, but beloved dumb jerk Roy Edroso beat me to it. Having just moved into a new house and being on entertain-the-child duty pretty much 24/7 makes blogging difficult, so I suppose I should thank him. But, dang, this kind of aimless stupidity is really in my wheelhouse.
Anyway, it’s about how too many boobies make men stop right in the middle of inventing their time machines so they can whack it. Now, you may be thinking “But can’t women invent things if the men are too busy choking the chicken?” To which I say “WHAT ARE YOU SOME KIND OF BOOBY-SHOWING FEMINIST?!! WOMEN. DON’T. INVENT. THINGS.” Geez.
I don’t want to fill up the cyberpages of LGM with sordid academic squabbles, but I also don’t want to let Steve Diamond quote me in a fraudulent way without making a record of it. Posted as a comment on Taxprof:
Stephen Diamond is a very dishonest man. Diamond does not link to my LGM post he quotes. This is not merely a matter of netiquette, because he quotes me in a way that intentionally hides the fact that my major criticism of him has nothing to do with his quibble regarding the minor point I made in the part of the post he does quote. He is intentionally misquoting me, and in such an egregious way that his behavior is a form of academic fraud. (ETA: Warren Terra in comments suggests that the phrase “academic fraud” shouldn’t apply to this context — a blog post — even if Diamond is behaving in a way that would be academic fraud in a more formal context. I’m of two minds about this).
Here’s what I wrote:
Let’s go to the numbers. Diamond cites Bureau of Labor Statistics occupational employment stats for his claim that incomes for lawyers “have increased steadily for at least two decades.” That’s a very misleading statement, for two reasons, one relatively minor, and the other not minor at all. The relatively minor reason is that, adjusted for inflation, median salaries (a crucial term, as we’ll see shortly) for lawyers have been essentially flat since the mid-1990s, which is as far back as the BLS stats go: adjusted for inflation, the median salary for lawyers has increased by less than 5%, from $110,000 to $115,000. That’s approximately half the wage growth experienced by the average American worker over the past two decades — which, needless to say, have hardly comprised a banner era for American workers in general.
Note that Diamond removes the bolded portion of the paragraph, for reasons that will soon be painfully evident.
Diamond’s complaint is that I compared growth in median lawyer salaries with growth in mean worker salaries. That is a fair point as far as it goes, but it doesn’t go very far: median salaries of all workers still increased more in percentage terms than median salaries of lawyers, and in any case this is all a distraction from my main initial point, which is that the earnings of salaried lawyers have been, as I said, essentially flat. (Diamond claims that a five percent cumulative growth rate in salaries over 17 years means salaried lawyers are staying “comfortably ahead” of inflation. Over this same time frame, Diamond’s employer increased sticker tuition for Diamond’s students by 60% in constant dollars. I wonder if Diamond’s students think that raising tuition 12 times faster than the growth rate in lawyer salaries constitutes a “comfortable” rate of growth for the cost of a Santa Clara law degree?).
But as I said in the original post, my initial point was a minor one, because Diamond’s claims about lawyer earnings are actually far more misleading. My main point, which Diamond hides from any readers he might have by distorting my text via elision, was this:
But, misleading as that part of Diamond’s statement is in context, that’s a minor point in comparison to another one, which is that the BLS wage statistics Diamond cites don’t include self-employed workers. How important is this omission when calculating the actual compensation of lawyers? (Let alone law school graduates, which is a very different category).
Consider that 75% of American lawyers are in private practice, and the large majority of those people are self-employed, either as individuals or in partnerships, meaning that they’re not salaried or hourly workers, and thus not included in the BLS wage stats. Diamond is aware of this, and thinks it means lawyers are making even more money than the BLS stats suggest:
Now, these numbers are “employed” lawyers so they do not include solo practitioners or partners who qualify as employers. But the first number is relatively small, approximately 4% on average of all practicing lawyers over that time period. And the second number is likely to skew income higher not lower, so excluding that number does not help the critics case that much. Arguably solos do less well financially (though we don’t know for sure based on the BLS data) so perhaps they cancel each other out.
Factor in higher paid partners and [it’s] likely they [lawyers] have stayed comfortably ahead of inflation.
Steve Diamond, a man who pontificates regularly on the economic status of lawyers, thinks that 4% of practicing lawyers are in solo practice. He produces this estimate by citing NALP data on the employment status of law graduates nine months after graduation. But many lawyers — perhaps most — graduated from law school more than nine months ago. How many of them are in solo practice? According to the ABA, the answer is roughly two out of every five, i.e., approximately ten times as many as the learned professor estimated. And what’s happened to their wages?
Fortunately, we don’t have to guess: the mean earnings (the median is certainly much lower) of solo practitioners have declined by 30% in real terms over the past 25 years, from $71,000 to $49,000 per year, inflation-adjusted.
In other words, if we combine the BLS data on median lawyer salaries with tax data on the earnings of self-employed lawyers, we find that the median real compensation for lawyers – again, not law school graduates, but actual employed lawyers — is surely a good deal lower than it was a generation ago.
As is evident if one actually reads it, the main point of my post, as I emphasized at the time, was that, contrary to the assertions of Diamond and Michael Simkovic, the median earnings of lawyers (not just salaried lawyers) have decreased over the period covered by BLS data, because almost half of all lawyers in private practice are solos, and their income has decreased markedly over this period.
Looking at the actual post would also reveal to readers that Diamond’s analysis of lawyer income was so radically off the mark because, absurdly, he used the employment status of new law graduates to estimate how many lawyers are solos: a figure which he then proceeded to underestimate by 825%.
Imagine if Donald Trump claimed that affirmative action was destroying the career prospects of white men in America, and cited the “fact” that in America today only 4% of 25 to 29 year old white men have college degrees. If it were then pointed out to him that the real figure is 37%, would you expect him to give up on his pretensions of being an expert on the subject, and slink away quietly?
Of course not: what you would expect would be for Trump to then misquote his critics, while throwing rhetorical dust in the air and brazenly ignoring the fact that he had been exposed as someone who has no idea what he is talking about. But at least Santa Clara law students aren’t paying $75,000 per year for the privilege of having that particular lying blowhard spout ignorant nonsense at them.
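Since we’re going to the numbers anyway, the percentage arithmetic above is easy to check. A minimal sketch, using only the figures stated in the post (the $110,000-to-$115,000 salaried median, the $71,000-to-$49,000 solo mean, and the 4%-versus-37% gap from the affirmative action analogy); the `pct_change` helper is just an illustrative name:

```python
def pct_change(old, new):
    """Percentage change from old to new."""
    return (new - old) / old * 100

# Salaried lawyers' inflation-adjusted median: $110,000 -> $115,000
salaried = pct_change(110_000, 115_000)
assert round(salaried, 1) == 4.5  # "less than 5%"

# Solo practitioners' inflation-adjusted mean: $71,000 -> $49,000
solo = pct_change(71_000, 49_000)
assert round(solo) == -31  # roughly a 30% real decline

# A 4% estimate against a 37% actual figure understates it by 825%
gap = pct_change(4, 37)
assert gap == 825.0
```

Note that the 825% figure corresponds to the 4%-versus-37% comparison; the “two out of every five” ABA figure would make the understatement even larger.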
Now we know how to view Corey Lewandowski’s interview with CNN’s Dana Bash on Monday after he was fired from the campaign of Donald Trump: As an audition.
Following that interview with Bash, Lewandowski went into a meeting with CNN executives. He’s due to make his debut appearance as a CNN political commentator next Monday on the morning program “New Day.”
Who says this presidential campaign thing is a racket?
The strongest parts of Kennedy’s opinion dealt with Fisher’s contention that UT’s attempts to increase diversity on campus can and should be done exclusively through formally race-neutral measures. First of all, UT presented “significant evidence, both statistical and anecdotal,” that race-neutral measures are inadequate to create a sufficiently diverse campus. Given the compelling interest the state has in such diversity, it must be allowed to experiment, he said.
Even more important from a constitutional (if not a policy) perspective, Kennedy correctly argued that calling the Top 10 Percent system “race-neutral” is disingenuous. Kennedy’s opinion quoted Justice Ginsburg’s observation that such plans are “adopted with racially segregated neighborhoods and schools front and center stage,” and that “[i]t is race consciousness, not blindness to race, that drives such plans.” The Top 10 Percent plan was designed to increase racial diversity. Indeed, unless someone has a secret plan to immediately end endemic de facto school and neighborhood segregation, it is bound to have this effect in practice, since it will guarantee college placement for students in predominantly minority schools.
Kennedy’s increasing impatience with the blind formalism of his Republican-appointed colleagues helps to explain why he finally decided to uphold an affirmative action program. In the 2007 case Parents Involved in Community Schools v. Seattle School District No. 1, Chief Justice John Roberts held that race could not be used even as a tiebreaker to choose between equally qualified applicants, based on the fatuous tautology that “[t]he way to stop discrimination on the basis of race is to stop discriminating on the basis of race.” Given a background of extensive historical de jure discrimination and persistent de facto segregation, however, the idea that simply ignoring race is sufficient to address racism is an obvious fiction.
Justice Kennedy regrettably joined Roberts’s judgment, but wrote a lengthy, somewhat tortured concurrence in which he refused to endorse the implied conclusion that affirmative action programs are never constitutionally permissible. I will admit to being skeptical that the distinctions drawn by Kennedy in that case would ever make a difference, but today they did.
Justice Alito, conversely, was entirely unsurprising:
There is no small irony in the fact that Justice Samuel Alito, who wrote a long dissent to Kennedy’s opinion, is an alumnus of the Concerned Alumni of Princeton, a group that opposed the gender and racial integration of the university. He is hardly the first conservative to opportunistically discover the value of “colorblindness” once its effects were to make campuses more, rather than less, white.
Alito’s opinion attempts a number of would-be “gotchas” that aren’t very convincing. At one point, he complains that “UT also offers courses in subjects that are likely to have special appeal to members of the minority groups given preferential treatment under its challenged plan.” I confess it is not obvious to me how offering, for example, a Black Studies program and giving students substantial freedom to choose courses demonstrates that the University of Texas is not really committed to racial diversity. Alito also chides UT for using SAT scores, which have “often been accused of reflecting racial and cultural bias.” But, of course, the correlation between SAT scores and racial and socioeconomic status is exactly the kind of factor UT’s holistic admissions evaluations—which Alito considers unconstitutional—are designed to take into account. UT is constitutionally permitted to ignore SAT scores—but should also be constitutionally permitted to consider them while placing them in the proper context.
Alito’s complaints about colleges offering courses to appeal to minority groups and “giv[ing] undergraduates a very large measure of freedom to choose their classes” were telling. I must admit I was not aware that the 14th Amendment enacted Mr. Allan Bloom’s The Closing of the American Mind…
Donald Trump was a brash scion of a real estate empire, a young developer anxious to leave his mark on New York. Roy Cohn was a legendary New York fixer, a ruthless lawyer in the hunt for new clients.
They came together by chance one night at Le Club, a hangout for Manhattan’s rich and famous. Trump introduced himself to Cohn, who was sitting at a nearby table, and sought advice: How should he and his father respond to Justice Department allegations that their company had systematically discriminated against black people seeking housing?
“My view is tell them to go to hell,” Cohn said, “and fight the thing in court.”
It was October 1973 and the start of one of the most influential relationships of Trump’s career. Cohn soon represented Trump in legal battles, counseled him about his marriage and introduced Trump to New York power brokers, money men and socialites.
Cohn also showed Trump how to exploit power and instill fear through a simple formula: attack, counterattack and never apologize.
Since he announced his run for the White House a year ago, Trump has used such tactics more aggressively than any other candidate in recent memory, demeaning opponents, insulting minorities and women, and whipping up anger among his supporters.
Cohn gained notoriety in the 1950s as Sen. Joseph McCarthy’s chief counsel and the brains behind his hunt for communist infiltrators. By the 1970s, Cohn maintained a powerful network in New York City, using his connections in the courts and City Hall to reward friends and punish those who crossed him.
He routinely pulled strings in government for clients, funneled cash to politicians and cultivated relationships with influential figures, including FBI Director J. Edgar Hoover, mafia boss Anthony “Fat Tony” Salerno and a succession of city leaders.
In the 1990s, a tragic character based on Cohn had a central place in Tony Kushner’s Pulitzer Prize-winning play, “Angels in America: A Gay Fantasia on National Themes.”
Trump prized Cohn’s reputation for aggression. According to a New York Times profile a quarter-century ago, when frustrated by an adversary, Trump would pull out a photograph of Cohn and ask, “Would you rather deal with him?” Trump remained friends with him even after the lawyer was disbarred in New York for ethical lapses. Cohn died in 1986.
In case you want to feel depressed, you can follow the monthly updates at Gizmodo as the Earth keeps breaking its all-time heat records. This will end well.
It’s completely ridiculous that it took the New York attorney general’s office to force this to happen, but Jimmy John’s is finally ending its practice of making its employees sign non-compete clauses so they can’t take the valuable skills they learned and get a job at Subway.
The Illinois-based sandwich chain has agreed to stop including noncompete agreements in its hiring documents, a practice that was deemed “unlawful” by the New York attorney general’s office.
The announcement follows an investigation by that office into Jimmy John’s use of noncompete agreements with franchisees in New York, which began in December 2014. The agreements had barred departing employees from taking jobs with competitors of Jimmy John’s for two years after leaving the company and from working within two miles of a Jimmy John’s store that made more than 10 percent of its revenue from sandwiches.
“Noncompete agreements for low-wage workers are unconscionable,” Eric Schneiderman, New York’s attorney general, said in a statement. “They limit mobility and opportunity for vulnerable workers and bully them into staying with the threat of being sued. Companies should stop using these agreements for minimum wage employees.”
It seems that this agreement only covers its New York stores, although Illinois is going after it now, so maybe that will finally stop the practice entirely.