
Author Page for Paul Campos


Harvard professors will now have $20 co-pays for doctor visits. Thanks Obama!

January 5, 2015

In Harvard’s health care enrollment guide for 2015, the university said it “must respond to the national trend of rising health care costs, including some driven by health care reform,” otherwise known as the Affordable Care Act. The guide said that Harvard faced “added costs” because of provisions in the health care law that extend coverage for children up to age 26, offer free preventive services like mammograms and colonoscopies and, starting in 2018, add a tax on high-cost insurance, known as the Cadillac tax.

Richard F. Thomas, a Harvard professor of classics and one of the world’s leading authorities on Virgil, called the changes “deplorable, deeply regressive, a sign of the corporatization of the university.”

Mary D. Lewis, a professor who specializes in the history of modern France and has led opposition to the benefit changes, said they were tantamount to a pay cut. “Moreover,” she said, “this pay cut will be timed to come at precisely the moment when you are sick, stressed or facing the challenges of being a new parent.”

The university is adopting standard features of most employer-sponsored health plans: Employees will now pay deductibles and a share of the costs, known as coinsurance, for hospitalization, surgery and certain advanced diagnostic tests. The plan has an annual deductible of $250 per individual and $750 for a family. For a doctor’s office visit, the charge is $20. For most other services, patients will pay 10 percent of the cost until they reach the out-of-pocket limit of $1,500 for an individual and $4,500 for a family.

Previously, Harvard employees paid a portion of insurance premiums and had low out-of-pocket costs when they received care.

Michael E. Chernew, a health economist and the chairman of the university benefits committee, which recommended the new approach, acknowledged that “with these changes, employees will often pay more for care at the point of service.” In part, he said, “that is intended because patient cost-sharing is proven to reduce overall spending.”

According to the AAUP, the average salary for Harvard full professors is currently $207,100, and their average total compensation (including the lousy health care plan) is $262,300.

. . . numerous commenters make the fair point that Harvard’s new plan might be quite burdensome to the large number of Harvard employees making a lot less than TT professors. The school offers some protection against high co-insurance costs to lower-paid employees, but it’s also fair to ask why an institution with a $36 billion (!) endowment can’t be more generous to its employees, especially those who aren’t near the top of the pay scale. And although Harvard’s new plan is actually a good one compared to the health care options provided by most employers, that’s just another sign of how dysfunctional the American health care system remains, despite whatever marginal improvements are provided by the ACA.

The tyranny of evil men

January 3, 2015


A couple of days ago I noted that Jay Conison, dean of the most wretched — as measured by the admissions credentials of its victims, aka students — of the Infilaw for-profit law schools, was debating David Frakt at The Faculty Lounge. Frakt, who was kicked out of a literal faculty lounge by Infilaw errand boy Dennis Stone when his dean candidacy presentation to the Florida Coastal faculty began to include subversive material, aka facts, was now challenging Conison to provide evidence — any evidence — for Infilaw’s contention that the unprecedentedly horrible LSAT scores of its recent entering classes weren’t going to produce abysmal bar passage rates.

This question has taken on special sharpness, given that bar passage rates for Infilaw school graduates plunged in 2014, and, as documented here, the entrance numbers for the schools’ most recent entering classes — and Conison’s Charlotte School of Law in particular — are far lower than even the terrible numbers of the 2011 entering classes, which provided most of the graduates who failed various bar exams this summer at rates approaching 50%. (A quarter of Charlotte’s 2014 matrics have LSAT scores of 138 or below; 138 is the 9th percentile of test takers. The average LSAT score of graduates of even the worst for-profit undergraduate schools who take the test — that is, not those who apply to law school, but all those who merely take the LSAT — is much higher).

Conison has now risen to the challenge. I dare you to read that. I double dare you.

The path of the righteous man is beset on all sides by the inequities of the selfish and the tyranny of evil men. Blessed is he, who in the name of charity and good will, shepherds the weak through the valley of darkness, for he is truly his brother’s keeper and the finder of lost children. And I will strike down upon thee with great vengeance and furious anger those who would attempt to poison and destroy my brothers. And you will know my name is the Lord when I lay my vengeance upon thee.

Ideology and belief

January 1, 2015


Lawyers are notoriously bad at math, and some legal academics are apparently even worse.

LGM punching bag Steve Diamond tries to criticize a WAPO piece that points out employment prospects for many new law graduates are grim:

The delay in the enrollment decline [among law schools] occurred because new college grads tried to flood into law schools from 2008-2010 in order to wait out the economic turmoil. The problem they then faced was that the recovery only took hold in 2012-13 and that meant oversupply in the market. The enrollment bubble can be seen in this chart prepared by the ABA. As the economy took off under the influence of low interest rates in 2003 enrollment steadily climbed and jumped up significantly as the credit crisis was in full swing. The peak in first year enrollment was in AY2010-2011 at 52,488. The continuing impact of that bubble period is indicated by the fact that the highest number of JD’s ever awarded in the US occurred in 2013, three years after the peak of first year enrollment.

This doesn’t even begin to make sense on its face, as Diamond argues that enrollment “steadily climbed” after 2003 because the economy was booming during the credit bubble, and then “jumped up significantly” because the economy crashed and “new college grads tried to flood into law schools from 2008-2010.” (Diamond goes on to argue that law school enrollment has declined since 2010 in part because the opportunity cost of attending has increased due to the strengthening economy, apparently forgetting that he argued just the opposite in regard to increasing enrollment during the mid-aughts).

Leaving aside this theoretical confusion, the data Diamond cites don’t actually support his claim, and other, more relevant data flat-out contradict it. Diamond cites law school enrollment numbers, when of course the far more germane data in regard to demand for law school admission are law school applicant figures. Even the enrollment figures don’t back up Diamond’s argument, as 1L enrollment barely budged between 2003 and 2008, going from 48,867 to 49,414, which actually represents a 5.4% decline in enrollment per ABA law school (there were 187 such schools in 2003 and 200 five years later).
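The per-school arithmetic in that last sentence can be checked in a few lines. A quick sketch (Python, using only the enrollment and school-count figures quoted above):

```python
# 1L enrollment and ABA-accredited school counts cited above.
enroll_2003, schools_2003 = 48_867, 187
enroll_2008, schools_2008 = 49_414, 200

per_school_2003 = enroll_2003 / schools_2003  # ~261 first-years per school
per_school_2008 = enroll_2008 / schools_2008  # ~247 first-years per school

# Total enrollment rose slightly, but enrollment per school fell.
pct_change = (per_school_2008 - per_school_2003) / per_school_2003 * 100
print(f"{per_school_2003:.1f} -> {per_school_2008:.1f} ({pct_change:+.1f}%)")
# i.e. roughly a 5.4-5.5% per-school decline, depending on rounding
```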

Enrollment did increase modestly between 2008 and 2010, to 52,488 — this 6.2% increase is Diamond’s “enrollment bubble,” which is supposedly the cause of whatever employment struggles recent graduates have faced — but what Diamond fails to note is that, far from “flooding into law schools between 2008-2010,” significantly fewer people applied to law school during the Great Recession than during the boom years immediately preceding it.

Total applicants to ABA law schools 2004-2006: 285,100

Total applicants to ABA law schools 2008-2010: 257,900

As I’ve noted elsewhere, the fact that demand for law school admission actually declined during the worst economic contraction since the 1930s should have been a warning sign to legal academia. It wasn’t, because instead law schools slashed admission standards, and managed to slightly increase total enrollment — although, again, enrollment fell per school — in the face of declining demand, leading innumerate observers like Diamond, who has a Ph.D. in political science as well as a J.D., to conclude that demand was increasing. (Diamond claims that big year-end bonuses for lawyers at a handful of hyper-elite law firms mean prosperity is just around the corner for the average law graduate, which is akin to arguing that Robert Axelrod’s salary is a good reason to enroll in a graduate program in political science).

Since then things have gotten much, much worse. Over at the Legal Whiteboard, Jerry Organ has a fascinating post detailing the practical collapse of admission standards since 2010 (recall that standards had already slipped a good deal between 2004 and 2010, as the percentage of law school applicants who were admitted to at least one school increased by 23.6% between those years). It’s difficult to pick out the single most hair-raising stat that Organ has assembled, but here’s a good candidate: between 2010 and 2013, the percentage of law school matriculants with sub-145 LSAT scores increased by 56.8%. And that percentage almost certainly grew again this fall, as ABA law schools collectively admitted an astounding 80% of all applicants.

A few days ago, The Faculty Lounge featured an amusing exchange between David Frakt, the dean candidate at Florida Coastal who got kicked out of his presentation to the faculty because that presentation featured too many unpleasant facts about the school’s situation, and Jay Conison, dean of another of Infilaw’s egregious diploma mills. Conison blustered at length about the supposed success the Infilaw schools have had in getting matriculants with bad LSAT scores to pass the bar, but tellingly he provided no data to back up these claims, even after Frakt challenged him explicitly to do so. (This exchange took place three months after my Atlantic piece accusing the Infilaw schools of admitting large numbers of students who have no realistic chance of passing the bar, let alone actually becoming lawyers, so Dean Conison has had plenty of time to assemble a rebuttal. That none has been forthcoming speaks volumes).

The question that interests me, in a somewhat morbid way, regarding Diamond, Conison, and their ilk throughout American legal academia in this the year 2015 of the Christian Era, is the extent to which they actually believe what they say.

The lawyer and sociologist David Riesman described ideology as the kind of sincere mental state that allows a man to habitually believe his own propaganda. American legal academia apparently remains a very sincere place.

A brief history of college football coaching salaries in the context of the new Gilded Age

December 30, 2014

Jim Harbaugh is being introduced as the University of Michigan’s new head football coach today. Harbaugh has signed a contract worth a reported $48 million over six years. It’s unclear whether that figure, if accurate, includes potential bonus payments for winning conference and national titles, curing cancer etc., or merely represents his base pay (Some reports suggest that bonus incentives could potentially push Harbaugh’s compensation closer to ten million dollars per year).

Update: The terms of Harbaugh’s contract are apparently somewhat fluid. He will be paid $7 million this year, which includes a $2 million signing bonus. After this year the AD will make a determination about appropriate deferred compensation and the like. The contract also includes unspecified performance bonuses. The minimum value of the contract, with no performance bonuses or deferred compensation, is $40.1 million over seven years. (This looks like a pretty slick move by Michigan’s AD Jim Hackett. By leaving deferred comp out of the original contract he holds down the up front annual salary number, and the potential backlash. Next year at this time they could up the total value of the contract to $8 million per year and it’s a small story, even locally).

Since it will take a few weeks to FOIA the documents let’s assume for now that his compensation will be $8 million per year.

Now, on one hand, this is obviously deplorable. Current average salaries at the University of Michigan outside the athletic department (which, unlike almost all college athletic departments in the USA, is actually self-funded) look like this:

Administrative poohbahs (president, deans etc.): Several hundred thousand dollars per year

Full professors: $167,000

Associate professors: $114,000

Assistant professors: $101,000

People who make the wheels go round (clerical staff, food service workers, janitors etc): $20,000-$40,000 generally.

Adjunct instructors, aka the people who do the majority of the actual teaching at the institution: A petrified starfish and a bowl of potpourri (parking passes may be provided on a case by case basis).

You can look up salary data at the school here.

So Jim Harbaugh is going to get paid as much per year as 70 University of Michigan professors, or 250 clerical employees, or a nearly infinite number of adjuncts. This seems . . . disturbing.

On the other hand, hiring him is quite likely going to end up being a big net positive for the coffers of the athletic department and even the university generally, so let’s hear it for “the market.” (For example, real estate developer and Miami Dolphins owner Steve Ross is a big Michigan football fan, and he’s expressed his affection for the program and the school by giving $100 million to the AD and another $100 million to the business school. He’s also rumored to be picking up part of Harbaugh’s compensation package).

On yet a third hand, the university can pay Harbaugh more than any other football coach in the known universe and still make a tidy profit on the deal only because college football in America is a multi-billion dollar industry that doesn’t really pay its primary labor force (in this regard, big-time football reflects the economic structure of the contemporary universities which host it).

So — how did we get here?

Something to keep in mind is that big-time college football has been an extremely popular sport in America for more than a century (Indeed, until the 1960s it was more popular than the NFL). And debates about the exploitative economic structure of the game are nearly as old: I recently found a book published by Princeton and Michigan coach Fritz Crisler in 1934, and re-issued in 1948, in which Crisler addresses the apparently lively debate at the time regarding whether college football players should be paid overt wages, since, according to him, many were being paid covertly back in that simpler, more innocent time (On an unrelated but fascinating side note, F. Scott Fitzgerald’s habit of regaling Crisler with alcohol-fueled late night phone calls featuring Fitzgerald’s creative ideas for helping the Princeton football team may actually have inspired the genesis of modern two-platoon football).

Therefore, big-time college football coaches have been very well paid, relatively speaking, for a very long time. But “relatively” is the key term here. (All dollar figures below are in constant 2014 dollars.)

Woody Hayes, Ohio State, 1951: $113,534. Hayes was a 38-year-old first-year coach at football-crazed OSU in 1951, and his salary represented a whole lot of money back then. He was making 63% more than what was then the 95th percentile of family income, which means the hard-charging young coach was in at least the 98th and probably the 99th percentile of income in the country at the time (63% more than the 95th percentile of household income today puts a household well into the 98th percentile, and household income distribution was a good deal flatter during the socialist regimes of Presidents Truman and Eisenhower).

Bear Bryant, Alabama, 1958 (Bryant had just become Alabama’s athletic director as well as its football coach): $142,998. Bryant remained Alabama’s coach until 1982. He is reputed to have insisted throughout his career that his salary should always be at least one dollar less than that of the university’s president.

Hayden Fry, Southern Methodist, 1962: $101,654. Fry was Arkansas’ offensive coordinator when he took a phone call from Lamar Hunt, of the Dallas Hunt brothers, during warmups for the 1962 Orange Bowl, offering Fry the SMU job. He accepted without asking about the salary, and later discovered he was taking a pay cut from what he had been getting as the Razorbacks’ OC (Fry, by the way, played an important and courageous role in integrating college football in the south).

Bo Schembechler, Michigan, 1969: $135,127. Schembechler in 1969 was almost the same coach as Hayes had been in 1951 (One year older, in his first season, coming, as Hayes had, from Miami of Ohio). His salary was only 15% higher than Hayes’ had been, despite the enormous increase in national wealth over the intervening 18 years (GDP exactly doubled in constant dollars over this time).

College football coaching salaries began to increase rapidly in the 1970s. TV money was beginning to pour into the game, although it was still a trickle relative to what it would become. A major change in the compensation structure for coaches took hold in this decade, which is that universities began to divide that compensation into an official university salary, and another sum, with the latter representing pay for ancillary activities, such as hosting a television show, putatively running a football camp associated with the school, and so forth.

So for example by 1981, Schembechler, who had the highest winning percentage of any coach during the 1970s, was being paid a little more than $155,000 in university salary and $130,000 for other contractual obligations, making his total compensation $285,771 (again in 2014 dollars).

Then in January 1982, Texas A&M, awash in oil money and eager to challenge the University of Texas for football supremacy in the Lone Star State, stunned the college football world by offering Schembechler the then-staggering sum of $250,000 per year in 1982 dollars, which would have more than doubled his salary. (This was equivalent to $611,790 in 2014 dollars).

Schembechler turned TAMU down (Domino’s Pizza king Tom Monaghan gave him a Columbus, Ohio franchise), but Pittsburgh coach Jackie Sherrill didn’t, inspiring this amusingly quaint article in the New York Times, which wrestles with the incredible proposition that any employee of a university could be paid a quarter million dollars per year. (Of course today even some non-sports-related university employees make millions).

From there it was off to the races. Nominal coaching salary milestones, with inflation adjustments:

Bobby Bowden, Florida State, 1996: $1,000,000 ($1,505,105 in 2014 dollars)

Steve Spurrier, Florida, 2001: $2,100,000 ($2,800,209 in 2014 dollars)

Bob Stoops, Oklahoma, 2006: $3,000,000 ($3,154,152 in 2014 dollars)

Nick Saban, Alabama, 2007: $4,000,000 ($4,555,777 in 2014 dollars)

Nick Saban, Alabama, 2014: $7,000,000
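All of these constant-dollar figures come from a standard CPI-style adjustment: multiply the nominal salary by the ratio of the price index in 2014 to the index in the original year. A minimal sketch, with approximate annual-average CPI-U values (an assumption on my part; the author’s exact deflator and base months aren’t specified, so the results land near but not exactly on the article’s figures):

```python
# Approximate CPI-U annual averages (assumed values for illustration).
CPI = {1996: 156.9, 2001: 177.1, 2006: 201.6, 2007: 207.3, 2014: 236.7}

def to_2014_dollars(nominal: float, year: int) -> float:
    """Convert a nominal dollar amount to approximate 2014 dollars via CPI-U."""
    return nominal * CPI[2014] / CPI[year]

# Bowden's 1996 salary of $1,000,000 comes out close to the article's $1,505,105.
print(f"${to_2014_dollars(1_000_000, 1996):,.0f}")  # roughly $1.5 million
```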

And now we apparently have an eight to ten million dollar man (I should add that as a Michigan football fan I heartily approve of this particular development, while sincerely deploring the overall system that has brought it about).

A potential irony in all this is that the entertainment industry in general, and sports in particular, is one of the very few areas of the economy where it may actually be possible to construct an efficiency-regarding justification for gargantuan salaries (In the context of college sports, of course, this ignores the grotesque spectacle of the players receiving salaries of zero). It’s a whole lot easier to explain why it makes sense to pay Tom Brady $15 million per year than it is to make a similar argument for why last year a couple of dozen hedge fund managers should have pulled down average compensation packages 60 times larger than that.

Of course efficiency is one thing — and let’s not forget the little detail that Harbaugh’s players won’t be paid anything for their part in this multi-billion dollar annual extravaganza — and justice is another. I suggest it is or ought to be a basic tenet of any even vaguely left or progressive political perspective that any social system in which some people have salaries hundreds — let alone thousands and tens of thousands — of times larger than those of other people* is in need of basic reform.

*Let alone people in the same institution, let alone people in the same non-profit tax-supported educational institution!

A cultural history of inflation in America

December 25, 2014

The title of this post is more in the way of a question regarding whether such a thing exists. The reason I’m asking is that, in the course of researching higher education costs in America back to the middle of the 19th century, I discovered something that flew in the face of what I had always assumed about how inflation works in a money economy. What I assumed was that a moderate amount of price inflation is normal — that is, continual rather than episodic — in such economies, and that deflation is rare. Furthermore, I thought (to the extent that unexamined assumptions can be called thinking) any significant or prolonged deflation is an economic disaster, and is something to be feared and avoided even more than hyper-inflation.

Again, these beliefs were the product of nothing more than the fact that this is how things have “always” been as long as I can remember, and that my extremely limited historical knowledge of the subject stretched back no further than the Great Depression, when deflation did help wreak havoc on both the American and world economy.

As many readers no doubt already know, this historical view of inflation and deflation in America — which I suspect, based on my study featuring an N = 1, is quite widespread — is totally wrong.

In fact, until about 75 years ago, deflation was, as a historical matter, as common in America as inflation. This fact produced what were to me some shocking revelations, including:

(1) Overall prices in the American economy were about the same at the beginning of FDR’s presidency as they had been at the end of George Washington’s second term.

(2) Prices were nearly 25% lower in 1900 than they were in 1800 — that is, on net the 19th century was deflationary.

(3) Prior to the middle of the 20th century, significant inflation, rather than being seen as a normal thing, was very closely associated with, and clearly caused by, war. Indeed, prices would have been very strongly deflationary over a 200-year period if not for bouts of severe inflation during the Revolutionary War, the Civil War, and World War I.

(4) If we consider American economic history from colonial times to the present, the last 75 years have been an almost freakish exception to the normal course of events, in which prices are as apt to fall as they are to rise.

I suspect the last point has had some important cultural and political effects — hence the title of this post. What are the consequences of a generalized sense that prices always rise? Let me suggest just one of many possibilities: people may become relatively desensitized to real as opposed to nominal price increases, because over the long run nominal price increases become so extreme.*

In other words, a general sense that “things cost so much more today than they did back in X” may tend to blur distinctions between different sorts of things, some of which haven’t actually become more expensive in real terms (or have become much cheaper), and some of which very much have.

This of course is just one of many possibilities. In any case, I’m curious about the extent to which the historical anomaly of continual inflation since the end of the Great Depression has been written about, especially in regard to its possible cultural effects.

*I now understand something that puzzled me when I first read Keynes’ “Economic Possibilities For Our Grandchildren,” which was his statement of money values in nominal terms when he compared the 18th century with the early 20th century. Habituated as we are to a world in which prices always rise, I naturally assumed that nominal prices in the former and latter periods had nothing to do with comparative real prices, just as looking at, for example, the nominal price of a car in 1950 tells you nothing about the real price of a car then relative to now. But it turns out that, until the last few decades, economists could treat even 200-year stretches of time as featuring relatively stable prices in the long run!

The legal precariat and the politics of law teaching

December 22, 2014

An LGM reader writes:

I just graduated from law school this past spring, and passed the bar.

I applied for a job and completed a phone interview. The actual “interview” portion was very short, maybe 5 minutes. The attorney then asked me, without offering me a job or at all discussing compensation, to write a motion (a real motion, for a real case that she sent me materials on) as a sort of “let’s see what you can do” type of thing. So now, I will write a motion for free, for a job I may not even get. I may be too cynical, but to me this does not pass the smell test.

A fair response would be to ask if I had spoken to my career services office. I have, and they did not express much concern. But I suspect that they are more concerned about their percentage of graduates with (legal?) jobs 9 months out. I was wondering if you had heard of legal job applicants being strung along and doing unpaid work in similar circumstances for the promise of a job that may not/does not ever materialize.

This kind of thing is actually increasingly common: there is now even a fancy title — the “gratuitous service appointment” — for spending a year or more working full-time as a government lawyer without getting paid. (If you’re wondering how this is legal, the Fair Labor Standards Act is riddled with exceptions for, among other things, members of “learned professions.”)

I’ve published an essay in Radical Teacher on the political implications for law teachers of dealing with the reality of the legal precariat:

The contemporary employment market for new law graduates has taken on a distinctly neo-feudal flavor, in which a willingness to enter into one or more unpaid apprenticeships is becoming a pre-condition for obtaining a paying job (On the other hand, medieval guilds generally required masters to house and feed their apprentices; new law graduates are not so lucky).
Other members of the legal precariat work for pay, but under conditions of employment typical of those endured by casual labor, even when that labor wears a white collar. These include wages that are so low relative to working hours that some graduates find themselves making less than the minimum wage (minimum wage restrictions do not apply to salaried members of professions), extreme employment instability, no fringe benefits, and the sense of powerlessness that comes from knowing that one can be replaced at any moment by someone equally qualified to do one’s job, and even more desperate to collect its meager compensation.

It should be unnecessary to point out that such a system both reinforces and strengthens class stratification. Children of privilege, who can rely on their families to pay the rent and the grocery bills during an awkward year or two while they work for little or literally no pay, in order to get their feet inside the proverbial doors, will end up with the real jobs that eventually appear behind those doors, while many less privileged graduates will have to abandon their dreams of a legal career altogether. . .

Given these grim facts, law teachers now face a difficult conundrum. In a world in which university administrators increasingly speak in a manner that is hard to distinguish from the professional patois of business consultants – in which educational institutions are treated as “brands” to be “synergized” in the appropriate “target markets” and so forth – prudent law faculty will be tempted to suppress any impulse to engage in critical pedagogy regarding the nascent professional and personal crisis faced by so many of their students. They will instead keep, as it were, pushing the product.

Yet such prudence, while no doubt conducive to both professional advancement and personal happiness, requires a certain mortification of both the intellect and the capacity for moral action (Here we can recall Flaubert’s dictum that “to be stupid, selfish, and have good health are three requirements for happiness, though if stupidity is lacking, all is lost.”).

In any case, the law school reform movement has acquired sufficient notoriety that it is becoming increasingly difficult for individual law teachers and law schools as institutions to employ silence and denial as either an unconscious psychological defense mechanism or a conscious business strategy. Indeed, in the contemporary American law school, the employment and debt crisis faced by our students is always present in every encounter with them, if only implicitly, and it is now an abrogation of professional responsibility not to address it at appropriate times.

The road toward open enrollment at American law schools

December 18, 2014

Updated

It’s been almost exactly four years since the publication of David Segal’s original NY Times piece on the employment crisis overtaking recent law school graduates. Inside legal academia, Segal’s article was met largely with scorn, and in retrospect it’s easy to see why.

At the time, transparency regarding law graduate employment outcomes essentially didn’t exist, ABA law schools had just admitted their biggest first year class ever, tuition was at an all time high and still rising much faster than inflation (which demonstrated that law school was a fine investment because The Market), and any so-called employment “crisis” among law grads was obviously a temporary function of the recession, and was seriously exaggerated by bitter scamblog losers who went to bad law schools, and should have known better all along because Rational Maximizing of Individual Utility. Also, too, Caveat Emptor.

Today, well . . .

Elizabeth Olson and David Segal in today’s NYT:

The bottom of the law school market just keeps on dropping.

Enrollment numbers of first-year law students have sunk to levels not seen since 1973, when there were 53 fewer law schools in the United States, according to the figures just released by the American Bar Association. The 37,924 full- and part-time students who started classes in 2014 represent a 30 percent decline from just four years ago, when enrollment peaked at 52,488.

The recession was in full swing then, and many college graduates looked at law school, as they have many times in the past, as a sure ticket to a good job. Now, with the economy slowly rebounding, a growing number of college graduates are examining the costs of attending law school and the available jobs and deciding that it is not worth the money.

“People are coming to terms with the fact that this decline is the product of long-term structural changes that are just not going away,” said Paul F. Campos, a professor at the University of Colorado’s law school. “It’s kind of a watershed moment.”

Even after all this time, there’s a part of me that’s genuinely surprised that so many law schools are at present managing to lose so much money. I mean, consider this “business” model: The government will loan anyone to whom you choose to sell your product the full price of that product, subject to essentially no actuarial controls. And here’s the kicker: you can charge whatever you want, no questions asked! You get the proceeds of the loans, the buyers and eventually the taxpayers take all the risk, and you can do whatever you want with the money.

Losing money in this situation should be pretty hard to do, but if history has taught us anything, it’s that there is no amount of money that can’t be blown on yachts with three helipads, bottle service, and university administration.

Anyway, even back in 2010 the warning signs should have been plentiful, as that largest first-year class ever, the size of which was necessary to pay for the helipads etc., was gathered in by cutting admission standards quite a bit from where they were a few years earlier. That process has since accelerated, with the result that, this fall, it appears that 80% of law school applicants were admitted to at least one ABA school to which they applied.

Although the total number of applicants who were admitted in 2014 isn’t yet available, the 80% figure can be deduced by observing that 54,527 applicants resulted in 37,924 matriculants. [Updated: The figures are now available from LSAC. They indicate that 43,500 applicants were admitted, meaning that 79.8% of applicants were offered admission to at least one law school in the 2013-14 cycle.] In 2013, 86.8% of admitted applicants ended up matriculating somewhere; this latter percentage tends to be very stable. If we assume the same percentage of admitted applicants matriculated in 2014, that would mean the overall admission rate for ABA law school applicants will have looked like this over the past decade:

2004: 55.6%
2005: 58.6%
2006: 63.1%
2007: 66.1%
2008: 66.5%
2009: 67.4%
2010: 68.7%
2011: 71.1%
2012: 74.5%
2013: 76.8%
2014: 79.8%
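
Since this is just back-of-the-envelope division, the deduction described above can be sketched in a few lines of Python; the one assumption (flagged in the post itself) is that the 2013 yield carried over to 2014:

```python
# Estimating the 2014 admission rate for ABA law school applicants.
applicants_2014 = 54_527     # total applicants (from the post)
matriculants_2014 = 37_924   # total 1Ls enrolled in fall 2014
yield_2013 = 0.868           # share of 2013 admits who matriculated

# Assumption: the historically stable yield held steady in 2014.
estimated_admits = matriculants_2014 / yield_2013
admission_rate = estimated_admits / applicants_2014

print(f"estimated admits: {estimated_admits:,.0f}")  # ~43,700
print(f"admission rate:   {admission_rate:.1%}")     # ~80.1%
```

The LSAC figures released later (43,500 admits, a 79.8% admission rate) line up closely with this estimate.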

Here’s how this looks at the individual school level. I randomly looked up the percentage of admitted applicants at 13 schools: the holy trinity of Harvard, Yale, and Stanford, and then ten schools ranging from the sub-elite to the sub-basement. The first percentage represents applicants admitted in 2004. The second is the same figure for 2014:

American: 24.6% 49.7%
Boston College: 16.6% 43.9%
Brooklyn: 23.0% 53.2%
UC-Hastings: 19.5% 49.2%
UCLA: 13.6% 28.1%
Florida Coastal: 35.6% 77.7%
Fordham: 19.3% 35.5%
Hofstra: 26.3% 61.2%
Illinois: 23.1% 41.9%
John Marshall: 35.7% 72.9%

Harvard: 11.3% 15.4%
Stanford: 7.7% 9.1%
Yale: 6.5% 8.9%

So even the high rent district has felt a slight sting, though it’s nothing compared to what’s going on in the outer suburbs.

As to when and where this will all end, applications are down another ten percent so far this year.

Ebola-infected land shark threatens holiday moviegoers with weaponized cannabis

[ 90 ] December 17, 2014 |

Eternal hysteria is the price of vigilance.

Most of the country’s largest theater chains have decided not to show Sony’s “The Interview,” according to a person with direct knowledge of the matter.

The decision follows a strange warning on Tuesday from anonymous hackers that people should avoid going to theaters where “The Interview” is playing.

The comedic film is still scheduled to come out on Christmas Day. Sony (SNE) does not plan to pull the film altogether, but the studio has indicated it won’t object if theaters decide not to show the film, a second source said.

Among the top chains that have decided to not show the movie are Regal (RGC), Cinemark (CNK), Carmike Cinemas (CKEC), Arclight and Southern.

Another smaller chain, Bow Tie Cinemas, has also dropped its plans to show the film.

“It is our mission to ensure the safety and comfort of our guests and employees,” the company said in a statement.

Also:

The shockwaves from the Sony hack have finally reached Hollywood’s development community, as New Regency has pulled the plug on the Steve Carell movie “Pyongyang,” which Gore Verbinski had been prepping for a March start date, an individual familiar with the project has told TheWrap.

Based on the graphic novel by Guy Delisle, “Pyongyang” is a paranoid thriller about a Westerner’s experiences working in North Korea for a year.

Jeb Bush is running for president

[ 199 ] December 16, 2014 |


Of the United States of America.

OK, technically he’s “actively exploring the possibility.” We all know that’s like actively exploring the possibility of drinking this can of Dale’s Pale Ale I just opened.

So, campaign slogans?

I’ll start: “In five years the Bush family will be completely legitimate.”

BTW Chelsea Clinton becomes constitutionally eligible for the office in February. (If you turn 35 after the general election but before the Electoral College vote are you eligible? What about after the College but before the inauguration? What if you’re from a culture that calls people “35” during their 35th year of life? I’ve heard Germans do this. They’re not constitutionally eligible though).

Law school fires (or otherwise terminates with extreme prejudice) nearly 60% of its faculty

[ 19 ] December 13, 2014 |


Long-time LGM readers may remember the Western Michigan Thomas M. Cooley Law School from such posts as “Change the Name if the Product’s Weak,” “If Your Lies Are Really Egregious They Don’t Count as Fraud,” and “SLAPP Suits As Experiential Learning.”

Because certain irresponsible critics have been spreading what WMUTMCLS’s Dean and President for Life Don LeDuc has characterized as the “myth” that it has become difficult for graduates of low-ranked law schools to get jobs as lawyers, the school’s enrollment of federal student loan conduits, otherwise known as JD students, has declined from just under 4,000 four years ago to 1,754 this fall. This led WMUTMCLS to announce in August that it was laying off some faculty, although, as is the way of such things, the school was very vague regarding how extensive these layoffs would be.

“The process is not complete. I don’t have numbers for you,” Robb told the Lansing City Pulse last Thursday. “And I don’t know that we will release numbers, frankly.”

One source told the Lansing City Pulse that layoffs could be higher than 50 percent. Asked about the number, Robb told the publication, “I think you’re hearing wrong.”

This week’s publication of ABA 509 disclosure forms answers the question that Cooley wouldn’t.

Full Time Faculty:

Spring 2011: 101

Fall 2011: 106

Spring 2012: 110

Fall 2012: 103

Spring 2013: 117

Fall 2013: 115

Spring 2014: 119

Fall 2014: 49

Holy new gilded age, Batman. (Among other things, these numbers illustrate how LeDuc and Co. seem to have made the mistake of believing their own propaganda about how prosperity was just around the corner, as the school increased the size of its faculty even after its applicant pool collapsed.)

I guess firing 70 of your 119 full-time faculty in one fell swoop is the kind of gust of creative destruction that’s necessary to protect those precious non-profit margins, the same margins that allowed the school to pay President for Life LeDuc $675,626 last year, to kick $373,550 to the school’s founder Thomas Brennan for what the school estimated to be five hours of “work” per week, and still to maintain a net surplus of $2.5 million in revenues over expenses. (Additionally, I’ve been told, although I will hasten to add before I get served again that I don’t know whether this is actually the case, that WMUTMCLS is a veritable hive of nepotism for the relatives of the school’s powers that be, comparable in this, and in no other regard, to a classic Francis Ford Coppola film.)

I can’t remember at the moment if I’ve already written about the possibility that law schools will use the genuine need for significant financial restructuring as an excuse to “down-size,” in the all-too-common sense of getting rid of people in reverse proportion to both the magnitude of their salaries and the extent to which they do any useful work.

And sure enough, when we look at the category “Deans, librarians and others who teach” (this doesn’t include adjuncts, who are by definition part-time) we find:

Spring 2011: 25

Fall 2011: 26

Spring 2012: 31

Fall 2012: 28

Spring 2013: 26

Fall 2013: 27

Spring 2014: 24

Fall 2014: 26

This principle explains why staff are always fired before faculty, why junior faculty are always fired before their senior colleagues, and why the most useless and highly paid administrators will, along with other remarkably adaptive species, inherit the post-apocalyptic earth.

Law school first year enrollment lowest since 1973 (and other assorted data points)

[ 29 ] December 12, 2014 |

The ABA has put up 509 disclosure forms for 2014. A few preliminary tidbits for possible discussion:

(1) ABA schools enrolled 37,924 1Ls this year. This is the lowest total since 1973, when there were 26% fewer ABA law schools, and student-faculty ratios were approximately 35 to 1 (they were 13.6 to 1 last year).

(2) LSAT scores for low-ranked schools continue to plummet to heretofore unprecedented depths. Not surprisingly, the Infilaw schools are once again leading the way, with Charlotte recording an astounding 142 median LSAT score for its entering class (17.8th percentile) and an even more eyebrow-raising 138 for the class’s 25th percentile. This means that a quarter of the entering class scored somewhere in the bottom ten percent of all LSAT test takers. (To score in the 10th percentile of the test, you have to get 34 of 100 questions correct, on a test where answering randomly will produce on average 20 correct answers. Another way of putting this is that people with 138 LSAT scores are answering about one of every six questions correctly, excluding random effects. This is on a test where a few people record perfect scores every year, and thousands of test takers answer at least five of every six questions correctly.)
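
The guessing-adjusted claim in that parenthetical, that a 138 scorer answers about one of every six questions correctly beyond chance, can be checked directly with the numbers the post supplies:

```python
# Checking the "one of every six" arithmetic for a 138 LSAT score.
total_questions = 100      # approximate number of scored questions
raw_score_at_138 = 34      # correct answers needed for ~10th percentile
expected_by_guessing = 20  # five answer choices, so chance yields ~20

# Correct answers beyond chance, as a share of the questions that
# guessing alone would be expected to miss.
above_chance = (raw_score_at_138 - expected_by_guessing) / (
    total_questions - expected_by_guessing
)
print(f"{above_chance:.1%}")  # 17.5%, i.e. roughly one in six
```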

(3) This year’s 509s include much more information regarding transfers. In this regard, Washington DC law schools reveal legal academic nature red in tooth and claw: American lost 100 1Ls (more than a fifth of the class) to other law schools, with George Washington alone taking 54 American transfers (Georgetown took another 13). A startling aspect of this crosstown traffic is that the median 1L grades of the transfers George Washington accepted were barely above the median 1L grade curve at American, which appears to mean that GW took any American 1L in the top half of the class who applied for transfer. The median LSAT for 2013 GW matrics was in the 92nd percentile, while American matrics were in the 71st, which illustrates how the transfer system is the equivalent of money laundering as applied to academic credentials (only the LSATs of 1L students count for rankings purposes.)

(4) An even more flamboyant example of this game is provided by the hostile symbiosis between Arizona State and yet another Infilaw outfit, Arizona Summit. ASU took 66 transfers, meaning that more than a third of this year’s 2L ASU class spent their first year of law school somewhere else (last year’s ASU 1L class was only 128 students). Exactly two thirds of these transfers — 44 — were escapees from Arizona Summit. Median LSAT for 2013 ASU matrics: 86th percentile. Median LSAT for 2013 Arizona Summit matrics: 23rd percentile.
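
The transfer fractions cited for ASU check out on the post’s own numbers; here is a rough sketch, with the simplifying assumption that this year’s 2L class is just last year’s 1Ls plus incoming transfers (attrition and outbound transfers, which the post doesn’t report, are ignored):

```python
# Rough check of the ASU / Arizona Summit transfer arithmetic.
asu_1ls_last_year = 128  # ASU's own 1L class in 2013
transfers_in = 66        # transfers ASU accepted this year
from_summit = 44         # transfers who came from Arizona Summit

# Assumed 2L class: last year's 1Ls plus incoming transfers.
second_year_class = asu_1ls_last_year + transfers_in
print(f"{transfers_in / second_year_class:.1%}")  # ~34%: over a third
print(f"{from_summit / transfers_in:.1%}")        # ~66.7%: two thirds
```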

The upper middle class

[ 99 ] December 11, 2014 |

Chris Rock, New York magazine interview:

For all the current conversation about income inequality, class is still sort of the elephant in the room.

Oh, people don’t even know. If poor people knew how rich rich people are, there would be riots in the streets. If the average person could see the Virgin Airlines first-class lounge,* they’d go, “What? What? This is food, and it’s free, and they … what? Massage? Are you kidding me?”

*Offers spa treatments, “expert mixologists,” and, at Heathrow, a “lodge and viewing deck” with an “après-ski vibe.”

Once a social system has moved all or nearly all of its members above the level of brute starvation, wealth and poverty soon become inherently relative concepts, but that doesn’t make them any less real. One of the consequences of living in an extremely rich country which features increasingly extreme wealth stratification is that people who would have been considered rich fifteen minutes ago are suddenly part of the “upper middle class.”

Take, for example, what has happened to economic relations within the American university. It’s well known that American colleges and universities must increase their operating budgets every year at rates faster than inflation because of reasons, and therefore it becomes inevitable, given the contemporary economic structure of the country as a whole, that these institutions will spend enormous amounts of time and money currying favor with super-wealthy potential donors. Giving money to a “non-profit” educational institution provides the masters of the universe with sweet tax breaks, while allowing them to indulge in the ego-gratifying pleasures of plastering their names all over various buildings and centers and even whole schools and colleges.

And so it has come to pass that the highest-paid people within universities (aside from some football and men’s basketball coaches, which is a subject for another day) are those employees who are responsible for, respectively, kowtowing before the great and the good, and investing the proceeds gathered up by successful administrative mendicants.

Thus the highest-paid employee of Columbia University is this guy, who runs the school’s endowment, and who was paid more than five million dollars in FY2013 for his trouble.

Meanwhile the university’s president, Lee Bollinger, had to scrape by on a hair under $3.4 million in total compensation. (Bollinger was one of 36 presidents of American private colleges and universities who were paid more than one million dollars last year).

Bollinger got into academic life more than 40 years ago as an assistant professor at the University of Michigan’s law school. By 1979 he was a full professor, and was pulling down $31,500 per year ($103,000 in 2014$). The following year the university’s new president, Harold Shapiro, earned a salary of $75,000 ($215,000 in 2014$).

Now the thing is I bet Lee Bollinger doesn’t feel very rich, despite the fact that he makes, in real, inflation-adjusted terms, more money every two weeks than he did in an entire year back when he was a full professor at an elite law school, before he got into the administrative rackets (Bollinger was provost at Dartmouth and president of Michigan before ascending to his present position).

After all, Bollinger’s job consists largely of hobnobbing with people who “earn” more in two weeks than he makes in a year. And some of those people make less in a year than some other people make every two weeks.

Which is how an academic who makes three and a half million per year ends up feeling sort of “upper middle class.”
