Author Page for Paul Campos
NALP has released preliminary employment statistics for the class of 2011 as of nine months after graduation. They are, unsurprisingly, terrible.
12% of 2011 graduates were completely unemployed in February 2012, and another 3% had re-enrolled in further graduate study, which can be treated as the functional equivalent of post-law school unemployment. So the first takeaway from these numbers is the nearly 15% unemployment rate for people who got law degrees from ABA-accredited schools last year. This compares with an 8.2% overall national unemployment rate, which, to my surprise at least, is also the unemployment rate among 25- to 34-year-olds (see Table A-10). So getting a law degree correlates with a doubling of the risk that a young adult will be unemployed nine months after receiving it.
But of course this 85.6% “employment” rate includes every kind of job law graduates obtained: legal, non-legal, full-time, part-time, long-term, and temporary. Let’s work with this preliminary data to make an estimate regarding how many 2011 graduates of ABA law schools had real legal jobs nine months after graduation, with a real legal job defined as a full-time non-temporary paying position requiring a law degree.
We can begin by eliminating jobs for which a law degree was not required. 24% of employed law graduates fell into this category, including the large majority of the 18.1% of all graduates who reported being employed in “business.” (For most law graduates, a job in “business” is shorthand for either a low-paying service-sector job that the graduate could have gotten more easily before going to law school or, in a smaller number of cases, a good job that the graduate was qualified for before law school, often literally the same job the graduate left in order to get a law degree.)
What about those graduates of the 2011 class who had a job for which a law degree was required? Note that only 60% of graduates were working full-time in a job requiring a law degree. (Since it appears the status of somewhere around 7% of graduates was unknown, and since those graduates surely had far worse outcomes than average, this suggests that perhaps 56% to 58% of graduates had full-time legal jobs. 12% of all jobs, legal and non-legal, obtained by graduates were part-time.) Now consider how many jobs in this category have to be tossed out if we are limiting ourselves to real legal jobs, even liberally defined. The 5% of all “jobs” funded by law schools themselves for their own graduates must be excluded, as should the 6% of all private practice jobs that consisted of graduates reduced to the desperate expedient of attempting to start a solo practice straight out of law school.
NALP has not yet reported what overall percentage of jobs were temporary (defined as being for a term of less than one year), but for the class of 2010, 26.9% of all jobs obtained by graduates were defined as temporary. (To be conservative, I’m going to treat all judicial clerkships as full-time long-term legal jobs, even though many state court clerkships are one-year way stations on the road to legal unemployment.) We do know that 7% of all jobs obtained by graduates were reported as both part-time and temporary.
Then we have the always tricky category of jobs with law firms of two to ten attorneys. A remarkable 42.9% of all graduates who obtained jobs in private practice (49.5% of all graduates went into private practice) were listed in this category. Many of these positions are, of course, real, if generally low-paying, associate jobs with established several-lawyer firms. But some are of a much more tenuous nature, including transient law clerk positions with solo practitioners; “eat what you kill” arrangements, in which people are given office space in return for a percentage of whatever they manage to bill; and basically fictional “law firms” consisting of two or three graduates banding together in a last-ditch attempt to avoid formal unemployment. But let’s be optimistic and assume that 80% of new graduates who were reported as obtaining jobs with firms of two to ten lawyers were in fact getting real legal jobs, liberally defined.
Thus once we exclude jobs that don’t require law degrees, law school-funded jobs, other temporary jobs, and part-time jobs, and then make a generous estimate of how many private practice positions with very small firms were real legal jobs, the numbers look like this:
60% of all graduates whose employment status was known were in full-time jobs requiring a law degree.
Minus the 4% of all graduates in law school-funded temporary jobs.
Minus the approximately 15% of all graduates in temporary (less than one year) legal positions other than law school-funded jobs.
Minus an estimated 4.25% of all graduates in fictional “firm” jobs.
Minus the 3% of all graduates working as solo practitioners.
This leaves us with 33.75% of all 2011 ABA law school graduates in real legal jobs nine months after graduation.
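For readers who want to check the arithmetic, the deductions above can be reproduced in a few lines of Python. The figures are the estimates from this post, not NALP's raw data, and the "fictional firm" share is my assumed 20% discount on small-firm jobs:

```python
# Estimated shares of all 2011 ABA graduates (percentages from the text).
full_time_jd_required = 60.0   # full-time jobs requiring a law degree
school_funded = 4.0            # law school-funded temporary jobs
other_temporary = 15.0         # other temporary (< 1 year) legal positions
solo_practitioners = 3.0       # new solos, excluded as real legal jobs

# "Fictional firm" share: the assumed 20% of the 42.9% of private-practice
# jobs at 2-10 lawyer firms, where 49.5% of graduates went into private practice.
fictional_firms = 0.20 * 0.429 * 49.5   # ~4.25% of all graduates

real_legal_jobs = (full_time_jd_required - school_funded - other_temporary
                   - fictional_firms - solo_practitioners)
print(round(fictional_firms, 2))   # ~4.25
print(round(real_legal_jobs, 2))   # ~33.75
```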
This is, in my view, a conservative estimate of the scope of the disaster that has overtaken America’s law school graduates. It counts almost all positions with law firms and with government agencies as real legal jobs, even though we know some of these “jobs” are actually one-year unpaid internships. (See for example these). Indeed it counts whole classes of time-limited jobs that are likely to leave graduates with no legal employment at their conclusion, such as most state judicial clerkships, as long-term rather than temporary employment. Most of all, it makes what by now must be considered the questionable assumption that law schools are reporting these numbers accurately, rather than misreporting them to their advantage.
Yet even this generous estimate of how many 2011 graduates of ABA-accredited law schools managed to get real legal jobs leads to the conclusion that two-thirds did not.
[One thing I've never seen explained is why, after what was actually a very promising start, the NBA didn't draft any high school players for 20 years. The three players who went straight from high school in the mid-70s included one all-time great, one guy who had a 15-year career, and an eight-year journeyman]. Anyway, from 1995 to 2006, until the one-and-done rule got negotiated, a total of 38 players got drafted straight out of high school. (This list omits players who declared for the draft but weren’t drafted. This is a problematic subgroup dominated by players for whom college wasn’t a realistic option, such as Lloyd Daniels). I speculated below that the batting average for these draft picks was probably quite high, and in fact it is.
The complete list is here, and it includes:
FIVE NO-QUESTIONS ASKED SUPERSTARS
Amar’e Stoudemire (OK maybe a question could be asked)
SIX STARS OF VARIOUS MAGNITUDES
Andrew Bynum (potential superstar status pending)
Several minor stars/solid career starters/sixth men
Various role players of variable but real utility.
Kwame Brown (Hey, laugh all you want. He’s made $60 million and counting).
Then you’ve got a couple of special cases in Darius Miles and Eddy Curry — guys who were on the way to at least minor stardom when their careers were derailed by injury.
Genuine busts — guys who didn’t have any kind of real NBA career:
I’m not sure this is precisely the best metric, but that track record kicks the absolute living snot out of any comparably-sized randomly selected list of NBA first round draft choices, let alone a random list of drafted players, or players signed to contracts.
The one-and-done rule looks very much like pure protectionism for fringe NBA veterans.
Edit: This list should probably also include Brandon Jennings, who is tough to evaluate at this point but is likely to end up in at least the solid starter category.
Monday, June 4, 2012
A point about playoff scoring
By Kareem Abdul-Jabbar
The games of the modern era are not usually high-scoring affairs, especially in the playoffs, and that has put the fans who have followed the game since the Lakers-Celtics rivalry of the 1980s in somewhat of a sour mood.
Take for instance the 1985 Finals. In the Celtics’ Game 6 loss, they scored only 100 points, the lowest total for either team in the series. In Game 1, the Celtics had set the benchmark for 3-point shots taking 9 and making 7 (77.8 percent). Both teams focused on creating and taking high-percentage shots, which kept the 3-point attempts much lower than what you see today.
By contrast, in many playoff series this year, even one team cracking 100 is a rarity, and double-digit 3-point attempts (if not makes) are the norm. Read more…
One of the more arcane pleasures afforded by baseball-reference.com is the chance to browse through a season’s worth of 90-year-old box scores.
Apparently, in that simpler, more innocent time, the duration of MLB games was only haphazardly recorded: only 48 of the Yankees’ box scores for the season include this information. What data there is indicates that games today are roughly 50% longer than they were then. The median length of the Yankees’ games that season was 1:56, with a quarter of them lasting 1:45 or less. The longest nine-inning game the team played (among the games for which the information is available) was 2:35.
According to Elias, the average length of MLB games from 2000 to 2009 was 2:58. I don’t know if anyone has tried to figure out systematically why games are so much longer now, although some likely contributors are longer breaks between innings for commercial purposes, pitchers waving off catchers three times between pitches, batters lollygagging at the plate (in my youth Mike Hargrove was known as The Human Rain Delay, but I venture to guess that today he would hardly stand out among his peers), and of course the much larger number of pitching changes that games today feature.
In regard to the latter factor, the Yankees’ staff made a total of 69 relief appearances in 1922, as compared to 356 last season. The 1922 staff threw 100 complete games; the 2011 edition tossed five.
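The "games today are 50% longer" claim can be checked against the two duration figures quoted here. A minimal sketch (using the 1922 Yankees median and the Elias 2000-2009 average from this post; note the 1922 median covers only the 48 games with recorded times):

```python
# Convert the quoted game lengths to minutes and compare.
minutes_1922 = 1 * 60 + 56    # 1:56 -> 116 minutes (median, 1922 Yankees)
minutes_modern = 2 * 60 + 58  # 2:58 -> 178 minutes (MLB average, 2000-2009)

increase = (minutes_modern - minutes_1922) / minutes_1922
print(f"{increase:.0%}")      # ~53%, i.e. roughly 50% longer
```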
I’ve been watching a lot of European soccer this year, and one of the attractions of the sport is that, with rare exceptions, games last just about exactly as long as an MLB game lasted in 1922. This makes me wonder if it might be worth considering altering the rules of MLB in order to speed it up. How about 20 seconds between pitches? Wasn’t there a rule along those lines once that was never enforced? (I seem to remember a story about Charlie Finley installing a horn at Kansas City A’s games that was supposed to be sounded whenever the pitcher took more than the allotted time.) Edit: I see there is a rule already, and it’s 12 seconds! LOL! Managers should get no more than one pitching change per game other than between innings, and pitchers should get no more than two pickoff attempts per at-bat.
Also, there are too many states these days. Please eliminate three.
Brian Tamanaha has a piece in the New York Times, summarizing some of the main contentions and recommendations of his new book Failing Law Schools. Brian’s description of the situation in legal education is refreshingly concise:
The economics of legal education are broken. The problem is that the cost of a law degree is now vastly out of proportion to the economic opportunities obtained by the majority of graduates. The average debt of law graduates tops $100,000, and most new lawyers do not earn salaries sufficient to make the monthly payments on this debt. More than one-third of law graduates in recent years have failed to obtain lawyer jobs. Thousands of new law graduates will enter a government-sponsored debt relief program, and many will never fully pay off their law school debt.
I hope it comes as a salubrious shock to most readers of the Times to discover that law schools can literally charge whatever they want for their increasingly dubious wares, and the federal government will then loan anyone law schools deign to admit the full cost of attendance (tuition plus living expenses), in the form of high-interest, non-dischargeable debt.
That this is a recipe for disaster should be obvious to anyone who got to whatever week in the Econ 101 syllabus started touching on the extent to which “the free market” is a heuristic fiction rather than a sociological fact.
Brian’s recommendations fall into two categories: loan reform and accreditation standards. He argues for caps on the amount of federal loan money law schools should be able to take from students, for making private educational loans dischargeable in bankruptcy, and for changing the ABA’s accreditation standards so as to make it easier for at least some schools to offer significantly cheaper models of legal education.
These recommendations are elaborated on at length in Failing Law Schools, and adopting them would certainly represent a significant improvement on the present absurd situation, in which taxpayers are expected to underwrite the budgets of 199 versions of the Yale Law School.
I suspect that for tactical reasons the piece maintains a discreet silence regarding the fairly obvious fact that, given the underlying statistics it is referencing, a large number of law schools need to disappear. Failing Law Schools consistently puts what is, from the perspective of the status quo, the best possible face on those statistics, again I suspect for tactical purposes: even the most optimistic reading of the data compels the conclusion that the present structure of legal education is unsustainable.
The real bottom line is that there’s a massive and growing oversupply of people with law degrees, and that while stopping the rapid increase and eventually reducing the crazy amounts of debt people are taking on to acquire those degrees would be a positive step, reform efforts must at some point soon address that problem quite directly.
This morning the Bureau of Labor Statistics issued its monthly report on the state of American labor. Keep in mind that, as a technical matter, the Great Recession has been over for three years now (GDP has been growing since June 2009). People who think an economic turnaround is going to save legal academia’s bacon should consider the very real possibility that this is the “economic turnaround,” and that the debt-leveraged growth of recent decades simply isn’t going to be repeated in the foreseeable future.
In any case, let’s consider what’s going on not just in the American economy as a whole, but in the legal sector. Over the last twelve months the legal sector has added a total of 4,800 jobs. Keep in mind that at best perhaps 70% of these jobs have been filled by attorneys, since the sector includes all support personnel (paralegals, administrative positions etc.). So we can estimate that there are about 3,000 more attorneys employed in America today than there were a year ago.
Now a certain number of people who were working as attorneys a year ago aren’t today, because they’ve died, retired, moved into other lines of work, or simply become unemployed. The BLS estimates the total annual “outflow” from the profession to be about 13,000 people at present. So that means that about 16,000 lawyer jobs have been filled over the last 12 months by people who weren’t working as attorneys at the time they moved into these jobs.
Note this does not mean that 16,000 new law graduates got real legal jobs, since some unknown number of these jobs were filled by unemployed attorneys who moved back into the legal work force. It’s true that the 2011 NALP stats claim that 25,654 of the nation’s 44,258 2010 law graduates had a full-time job requiring a law degree nine months after graduation. For quite some time now I’ve been trying to explain why that (atrocious) 42% functional unemployment rate for new lawyers is actually seriously understated.
The BLS statistics suggest that the real unemployment rate for new lawyers is more on the order of 63%, if “employment” is defined as having a real legal job, as opposed to the almost unlimited number of fake legal and quasi-legal jobs the NALP statistics count as full-time employment requiring a law degree (such as, for example, getting hired into a short-term, low-paid position by your alma mater in order to goose its reported employment rate).
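The back-of-the-envelope reasoning behind these percentages can be laid out explicitly. This is a sketch of the estimate in the text, not BLS or NALP methodology, and the 70% attorney share is the post's own generous assumption:

```python
# Back-of-the-envelope version of the estimate in the text.
sector_jobs_added = 4800   # legal-sector jobs added over 12 months (BLS)
attorney_share = 0.70      # generous assumed share of sector jobs held by attorneys
net_new_attorney_jobs = sector_jobs_added * attorney_share  # ~3,400 (call it ~3,000)

annual_outflow = 13_000    # attorneys leaving the profession per year (BLS estimate)
openings_filled = net_new_attorney_jobs + annual_outflow    # ~16,000 lawyer jobs filled

graduates_2010 = 44_258    # class of 2010 (NALP)
nalp_full_time_jd = 25_654 # NALP's count of full-time JD-required jobs

# NALP's own figures imply ~42% functional unemployment...
print(round(1 - nalp_full_time_jd / graduates_2010, 2))   # ~0.42

# ...but if at most ~16,000 openings went to new graduates, the rate is far higher.
print(round(1 - openings_filled / graduates_2010, 2))     # ~0.63
```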
It’s a very positive development that Brian’s critique of the growing economic crisis in American legal education and the legal profession is reaching a wider audience. And it’s probably unrealistic to get that audience to appreciate all at once how serious that crisis really is. After all, law schools have barely begun to grapple with its true extent. But this is an important start.
Update: This “job opening” generated 32 applications from law school graduates within 24 hours of being posted on a law school’s web site.
University of Tennessee law professor Glenn Reynolds has a piece in the New York Post about the student debt crisis, in law school and more generally. It’s obviously a good thing that this issue is beginning to get some real traction in the media, and we can hope will translate into among other things more attention for Brian Tamanaha’s terrific new book Failing Law Schools, which Reynolds cites.
As for immediate practical questions, Reynolds, like most of our generation, has a weakness for Polonius-like platitudes: Read more…
Among other things, the site Top Law Schools offers a fascinating glimpse into the psychology of prospective law students, aka 0Ls. A common affliction among 0Ls is what the more perceptive law students and recent graduates who post on the site refer to as “special snowflake syndrome.” The classic symptoms of SSS tend to be exhibited by 0Ls who ask for advice regarding questions such as this, which is literally the first post I read on the site this morning: Read more…
MSNBC’s Chris Hayes sparked controversy and debate on Sunday when he said that he felt “uncomfortable” calling soldiers killed in action “heroes” because the term can be used to justify potentially unjust wars. He later apologized for the statement. (See apology below.)
Hayes spent a large portion of his Memorial Day-themed show on questions of war and of the people killed on all sides of military conflicts, from American soldiers to Afghan civilians.
After speaking with a former Marine whose job it was to notify families of the death of soldiers, he turned to his panel and, clearly wrestling with what to say, raised the issue of language:
I think it’s interesting because I think it is very difficult to talk about the war dead and the fallen without invoking valor, without invoking the words “heroes.” Why do I feel so [uncomfortable] about the word “hero”? I feel comfortable — uncomfortable — about the word because it seems to me that it is so rhetorically proximate to justifications for more war. Um, and, I don’t want to obviously desecrate or disrespect memory of anyone that’s fallen, and obviously there are individual circumstances in which there is genuine, tremendous heroism: hail of gunfire, rescuing fellow soldiers and things like that. But it seems to me that we marshal this word in a way that is problematic. But maybe I’m wrong about that.
Hayes’ fellow panelists expressed similar discomfort. Linguist and columnist John McWhorter said that he would “almost rather not say ‘hero’” and called the term “manipulative,” even if it was unintentionally so.
Hayes then said that, on the flip side, it could be seen as “noble” to join the military. “This is voluntary,” he said, adding that, though a “liberal caricature” like himself would not understand “submitting so totally to what the electorate or people in power are going to decide about using your body,” he saw valor in it.
The Nation’s Liliana Segura then chimed in, saying that “hero” is often used to paint wars in a “righteous” way.
“These wars in Iraq and Afghanistan … aren’t righteous wars,” she said. “We can’t be so afraid of criticizing a policy.”
Hayes’ words caused a predictable furor with some. One Twitter user said that he was “uncomfortable with calling you an American.”
Others, though, supported Hayes. “Questioning-rather than bolstering-orthodoxies is inherently controversial,” blogger Glenn Greenwald tweeted. “That’s what makes Chris Hayes’ show so rare for TV-& so valuable.”
UPDATE: Chris Hayes issued a statement on Monday apologizing for his comments:
On Sunday, in discussing the uses of the word “hero” to describe those members of the armed forces who have given their lives, I don’t think I lived up to the standards of rigor, respect and empathy for those affected by the issues we discuss that I’ve set for myself. I am deeply sorry for that.
As many have rightly pointed out, it’s very easy for me, a TV host, to opine about the people who fight our wars, having never dodged a bullet or guarded a post or walked a mile in their boots. Of course, that is true of the overwhelming majority of our nation’s citizens as a whole. One of the points made during Sunday’s show was just how removed most Americans are from the wars we fight, how small a percentage of our population is asked to shoulder the entire burden and how easy it becomes to never read the names of those who are wounded and fight and die, to not ask questions about the direction of our strategy in Afghanistan, and to assuage our own collective guilt about this disconnect with a pro-forma ritual that we observe briefly before returning to our barbecues.
But in seeking to discuss the civilian-military divide and the social distance between those who fight and those who don’t, I ended up reinforcing it, conforming to a stereotype of a removed pundit whose views are not anchored in the very real and very wrenching experience of this long decade of war. And for that I am truly sorry.
We live in a culture in which someone like Hayes cannot suggest, even in the most diffident, nuanced, and self-deprecating way, that automatically labeling every American soldier who dies in war a “hero” might be an oversimplification of a difficult set of moral and political questions without thereby releasing such a storm of indignation that he is forced to immediately recant such a terrible heresy.
When it comes to war and peace nothing less than full-throated stupidity is acceptable in our public discourse, and any sign of ambivalence regarding the righteousness of the various causes for which around 1.34 million American soldiers have died is to be stamped out as an offense to the memory of the honored dead. (This view produces some logical problems in the context of America’s bloodiest war, but logic is never an impediment to pseudo-patriotic fervor).
Note too the perniciousness of the idea that Hayes’ civilian status is assumed, even by himself (or at least his contrite persona), to disqualify him from having a valid opinion on such matters, a disqualification that obviously doesn’t apply to the armies of chicken hawk pundits who deploy their keyboards to celebrate whatever foreign adventure they and their masters have deemed worth the cost of someone else’s life.
One of the most horrible features of war is that all the war-propaganda, all the screaming and lies and hatred, comes invariably from people who are not fighting.
Orwell, Homage to Catalonia
Update: This comment sums up what’s wrong with the backlash to Hayes’ unexceptionable observations perfectly:
While Memorial Day comes to us through an interesting mix of folk mourning practices, it is, like Thanksgiving, a holiday that encourages a kind of thinly cloaked national religion. I question whether making a public ritual around an intensely private act and set of feelings (mourning) is a good thing. One of the many violences of war is the loss of individual identity among the fighting and the silencing of debate; as a holiday, Memorial Day, which asks us to mourn heroic, reluctant, unlucky, ambivalent, peace-loving, honorable, and despicable people on the same day is that it throws our military dead into one mass grave and ask us all to drop flowers and shed a tear there. It is, as I see it, a holiday that perpetuates some of the worst lies of the state used to justify war. While I am not a pacifist, I am aggressively opposed to any state, legal, or cultural attempts to normalize or validate war. We should always look at acts of war with skepticism and doubt and unease. Bullshit terms like “our heroes” are naked propaganda terms designed to promote the idea that those who kill and die for the state deserve special reverence.
What would constitute a valid Democratic party analogy for the willingness of prominent GOP politicians (the Secretary of State of Arizona is a fairly high-up elective office; it’s not as if somebody on the Plano, TX city council is ranting on local-access cable) to play to their lunatic fringe, aka “the base”?
I get very elaborate emails from a guy who claims that the Bush family was or is literally full of Nazis (lately he’s been claiming Obama is one as well, so this isn’t a perfect analogy). What if high-up Dems went on the radio with this guy and told him and his audience — of course in this analogy he has a huge radio audience — that they sure hoped it wasn’t true, but there were certainly some disturbing questions about the Bushes’ apparent ties to National Socialism?
Per Intrade, which is probably the most reliable measure of that question at this point.
And of course if Romney is elected president there’s a good chance both houses of Congress will be in GOP hands, since these probabilities are interdependent to a significant extent.
Hard as it may be to summon much enthusiasm for electing the Diversity!(TM) version of Nelson Rockefeller president again, the alternative, which for complex psychological reasons many progressives have been treating as a practical impossibility, is worth contemplating with some attention, for the same reason Dr. Johnson advocated the contemplation of the hangman’s rope.