JFK campaigning in 1960.
The kid with the gun looks very bored.
AKA the “hasn’t he suffered enough?” defense, ably articulated this morning by California’s senior senator:
A top Senate Democrat defended David Petraeus on Sunday, saying the Justice Department erred in recommending charges against the former top Army general and Central Intelligence Agency director.
“This man has suffered enough in my view,” Sen. Dianne Feinstein of California, the former Senate Intelligence Committee chairwoman, told Gloria Borger, CNN’s chief political analyst, on CNN’s “State of the Union.”
Her comments come after news that the Justice Department is recommending charges against Petraeus, first reported by The New York Times.
Feinstein called Petraeus, who led U.S. efforts in Iraq and Afghanistan under President George W. Bush and later President Barack Obama, “the four-star general of our generation” and “a very brilliant man.”
She said Petraeus’ affair with Paula Broadwell, his biographer, and his allowing her access to some classified government documents while she was with him was a mistake — but not one for which he should face criminal charges.
“It’s done, it’s over. He’s retired. He’s lost his job,” Feinstein said. “I mean, how much does government want?”
Her comments came on the heels of similar criticism by Republican Sens. John McCain of Arizona and Lindsey Graham of South Carolina, who called the investigation “grievously mishandled.”
Note: Defense not applicable in all cases.*
*Use this easy test to check whether you’re eligible to have your advocates employ this line of argument in the offices of the executive branch and the courts of public opinion:
If you passed classified information to your mistress, how many senators would appear on Sunday morning talk shows to talk about what a great person you are?

(a) None.

Stop. Do not pass Go. Do not collect $2,000,000 from Kohlberg Kravis Roberts.

(b) One.

Defense may be applicable in your case. Consult the editorial board of the Washington Post for further guidance.

(c) Two or more, at least one of which is from each major party.

Congratulations, you are a Genuine American Hero™, and as such outside the jurisdiction of federal criminal law. Please be sure to collect your Augusta National Golf Club membership and other complimentary gifts at the door.
George Zimmerman, the Florida man acquitted in 2013 of the shooting death of unarmed black teen Trayvon Martin, was arrested in Florida late Friday night on charges of aggravated assault with a weapon. . .
Zimmerman is being held on $5,000 bond and has been ordered to surrender all firearms even though this incident didn’t involve one, the judge said. [So much for the 2nd amendment]
Since his high-profile acquittal, Zimmerman has had three other encounters with the Lake Mary police.
In September 2013, Zimmerman’s estranged wife, Shellie Zimmerman, called 911 to tell police he had punched her father and was threatening her with a gun. She opted not to press charges.
In the second incident, which occurred in November 2013, Zimmerman was arrested and accused of domestic violence by girlfriend Samantha Scheibe, who later said investigators had misinterpreted her statements and dropped charges.
In September 2014, Zimmerman was involved in an incident of road rage.
This is the first of a projected series of posts on the economics of American higher education.
“Civilization’s going to pieces,” broke out Tom violently. “I’ve gotten to be a terrible pessimist about things. Have you read ‘The Rise of the Colored Empires’ by this man Goddard?”
“Why, no,” I answered, rather surprised by his tone.
“Well, it’s a fine book, and everybody ought to read it. The idea is if we don’t look out the white race will be — will be utterly submerged. It’s all scientific stuff; it’s been proved.”
“Tom’s getting very profound,” said Daisy, with an expression of unthoughtful sadness. “He reads deep books with long words in them. What was that word we ——”
“Well, these books are all scientific,” insisted Tom, glancing at her impatiently. “This fellow has worked out the whole thing. It’s up to us, who are the dominant race, to watch out or these other races will have control of things.”
“We’ve got to beat them down,” whispered Daisy, winking ferociously toward the fervent sun.
“You ought to live in California —” began Miss Baker, but Tom interrupted her by shifting heavily in his chair.
“This idea is that we’re Nordics. I am, and you are, and you are, and ——” After an infinitesimal hesitation he included Daisy with a slight nod, and she winked at me again. “— And we’ve produced all the things that go to make civilization — oh, science and art, and all that. Do you see?”
Tom Buchanan and Nick Carraway were freshmen at New Haven in 1910, if I’ve got my literary math right. How much did the school spend to turn them into Yale men?
The answer can be deduced from the following:
Yale’s endowment was $12.1 million in 1910. Assuming a 4.5% payout, the endowment generated $544,500 in expendable income that year; since, according to this, endowment income covered exactly half the school’s budget, Yale’s total operating budget was $1,089,000.
How much is that in 2014 dollars? Using standard CPI inflation calculators, the answer is about $25.3 million. Yale’s total enrollment that year in all its various schools and colleges was 3,319 students, meaning that it cost $7,623 per year, in 2014 dollars, to enlighten Nick and Tom. (They paid about $3,000 per year, in 2014 dollars, for tuition and room and board.)
One hundred years later, how much did it cost to bring their institutional descendants into the light?
By 2010, Yale’s endowment had grown to $16.7 billion (it was $23.9 billion in June of last year). This sum generated about $751,500,000 in expendable income, which in turn provided 41% of the school’s general fund budget. $751,500,000 is 41% of roughly $1.83 billion. In 2010 Yale had a total enrollment of 11,520, which means the school was spending, in 2014 dollars, about $172,030 per student. (In comments, Mal points out that you could properly back out the nearly 20% of the total budget that represents the operations of the medical complex, as these costs have only a very tangential relation to the cost of educating the vast majority of Yale students. So you might want to reduce that $172,000 to around $140,000.)

Yale technically charged its students $467,000,000 in tuition, but the school actually distributed 63% of that total — $295,000,000 — back to students in grants, which means that in 2010 the average Yale student paid $14,931, or somewhere around 8% to 12% of the total cost of his or her education, depending on various budgetary assumptions.
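The arithmetic in the two Yale comparisons above is simple enough to sketch in a few lines. The CPI conversion factors below are round-number assumptions of mine (roughly 1910→2014 and 2010→2014), so the outputs only approximate the figures quoted in the text:

```python
# Back-of-the-envelope: Yale's cost per student, 1910 vs. 2010, in 2014 dollars.
# The CPI multipliers are assumed round numbers, not official figures.

CPI_1910_TO_2014 = 23.2   # assumed multiplier, 1910 -> 2014 dollars
CPI_2010_TO_2014 = 1.08   # assumed multiplier, 2010 -> 2014 dollars

# 1910: a 4.5% endowment payout covered exactly half the operating budget.
payout_1910 = 12.1e6 * 0.045          # $544,500 of expendable income
budget_1910 = payout_1910 * 2         # $1,089,000 total budget
per_student_1910 = budget_1910 * CPI_1910_TO_2014 / 3319

# 2010: $751.5M of endowment income was 41% of the general fund budget.
budget_2010 = 751.5e6 / 0.41          # roughly $1.83 billion
per_student_2010 = budget_2010 * CPI_2010_TO_2014 / 11520

print(round(per_student_1910))   # roughly $7,600
print(round(per_student_2010))   # roughly $172,000
```

Small differences from the numbers in the text come entirely from the assumed inflation multipliers.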
We shook hands and I started away. Just before I reached the hedge I remembered something and turned around.
“They’re a rotten crowd,” I shouted across the lawn. “You’re worth the whole damn bunch put together.”
I’ve always been glad I said that. It was the only compliment I ever gave him, because I disapproved of him from beginning to end. First he nodded politely, and then his face broke into that radiant and understanding smile, as if we’d been in ecstatic cahoots on that fact all the time. His gorgeous pink rag of a suit made a bright spot of color against the white steps, and I thought of the night when I first came to his ancestral home, three months before. The lawn and drive had been crowded with the faces of those who guessed at his corruption — and he had stood on those steps, concealing his incorruptible dream, as he waved them good-by.
I thanked him for his hospitality. We were always thanking him for that — I and the others.
In Harvard’s health care enrollment guide for 2015, the university said it “must respond to the national trend of rising health care costs, including some driven by health care reform,” otherwise known as the Affordable Care Act. The guide said that Harvard faced “added costs” because of provisions in the health care law that extend coverage for children up to age 26, offer free preventive services like mammograms and colonoscopies and, starting in 2018, add a tax on high-cost insurance, known as the Cadillac tax.
Richard F. Thomas, a Harvard professor of classics and one of the world’s leading authorities on Virgil, called the changes “deplorable, deeply regressive, a sign of the corporatization of the university.”
Mary D. Lewis, a professor who specializes in the history of modern France and has led opposition to the benefit changes, said they were tantamount to a pay cut. “Moreover,” she said, “this pay cut will be timed to come at precisely the moment when you are sick, stressed or facing the challenges of being a new parent.”
The university is adopting standard features of most employer-sponsored health plans: Employees will now pay deductibles and a share of the costs, known as coinsurance, for hospitalization, surgery and certain advanced diagnostic tests. The plan has an annual deductible of $250 per individual and $750 for a family. For a doctor’s office visit, the charge is $20. For most other services, patients will pay 10 percent of the cost until they reach the out-of-pocket limit of $1,500 for an individual and $4,500 for a family.
Previously, Harvard employees paid a portion of insurance premiums and had low out-of-pocket costs when they received care.
Michael E. Chernew, a health economist and the chairman of the university benefits committee, which recommended the new approach, acknowledged that “with these changes, employees will often pay more for care at the point of service.” In part, he said, “that is intended because patient cost-sharing is proven to reduce overall spending.”
According to the AAUP, the average salary for Harvard full professors is currently $207,100, and their average total compensation (including the lousy health care plan) is $262,300.
. . . numerous commenters make the fair point that Harvard’s new plan might be quite burdensome to the large number of Harvard employees making a lot less than TT professors. The school offers some protection against high co-insurance costs to lower-paid employees, but it’s also fair to ask why an institution with a $36 billion (!) endowment can’t be more generous to its employees, especially those who aren’t near the top of the pay scale. And although Harvard’s new plan is actually a good one compared to the health care options provided by most employers, that’s just another sign of how dysfunctional the American health care system remains, despite whatever marginal improvements are provided by the ACA.
A couple of days ago I noted that Jay Conison, dean of the most wretched — as measured by the admissions credentials of its ~~victims~~ students — of the Infilaw for-profit law schools, was debating David Frakt at The Faculty Lounge. Frakt, who was kicked out of a literal faculty lounge by Infilaw errand boy Dennis Stone when his dean candidacy presentation to the Florida Coastal faculty began to include subversive material, aka facts, was now challenging Conison to provide evidence — any evidence — for Infilaw’s contention that the unprecedentedly horrible LSAT scores of its recent entering classes weren’t going to produce abysmal bar passage rates.
This question has taken on special sharpness, given that bar passage rates for Infilaw school graduates plunged in 2014, and, as documented here, the entrance numbers for the schools’ most recent entering classes — and Conison’s Charlotte School of Law in particular — are far lower than even the terrible numbers of the 2011 entering classes, which provided most of the graduates who failed various bar exams this summer at rates approaching 50%. (A quarter of Charlotte’s 2014 matriculants have LSAT scores of 138 or below. 138 is the 9th percentile of test takers. The average LSAT score of graduates of even the worst for-profit undergraduate schools who take the test — that is, not those who apply to law school, but all those who merely take the LSAT — is much higher.)
Conison has now risen to the challenge. I dare you to read that. I double dare you.
The path of the righteous man is beset on all sides by the inequities of the selfish and the tyranny of evil men. Blessed is he, who in the name of charity and good will, shepherds the weak through the valley of darkness, for he is truly his brother’s keeper and the finder of lost children. And I will strike down upon thee with great vengeance and furious anger those who would attempt to poison and destroy my brothers. And you will know my name is the Lord when I lay my vengeance upon thee.
Lawyers are notoriously bad at math, and some legal academics are apparently even worse.
The delay in the enrollment decline [among law schools] occurred because new college grads tried to flood into law schools from 2008-2010 in order to wait out the economic turmoil. The problem they then faced was that the recovery only took hold in 2012-13 and that meant oversupply in the market. The enrollment bubble can be seen in this chart prepared by the ABA. As the economy took off under the influence of low interest rates in 2003 enrollment steadily climbed and jumped up significantly as the credit crisis was in full swing. The peak in first year enrollment was in AY2010-2011 at 52,488. The continuing impact of that bubble period is indicated by the fact that the highest number of JD’s ever awarded in the US occurred in 2013, three years after the peak of first year enrollment.
This doesn’t even begin to make sense on its face, as Diamond argues that enrollment “steadily climbed” after 2003 because the economy was booming during the credit bubble, and then “jumped significantly” because the economy crashed and “new college grads tried to flood into law schools from 2008-2010.” (Diamond goes on to argue that law school enrollment has declined since 2010 in part because the opportunity cost of attending has increased due to the strengthening economy, apparently forgetting that he argued just the opposite in regard to increasing enrollment during the mid-aughts.)
Leaving aside this theoretical confusion, the data Diamond cites don’t actually support his claim, and other, more relevant data flat-out contradict it. Diamond cites law school enrollment numbers, when of course the far more germane data in regard to demand for law school admission are law school applicant figures. Even the enrollment figures don’t back up Diamond’s argument, as 1L enrollment barely budged between 2003 and 2008, going from 48,867 to 49,414, which actually represents a 5.4% decline in enrollment per ABA law school (there were 187 such schools in 2003 and 200 five years later).
Enrollment did increase modestly between 2008 and 2010, to 52,488 — this 6.2% increase is Diamond’s “enrollment bubble,” which is supposedly the cause of whatever employment struggles recent graduates have faced — but what Diamond fails to note is that, far from “flooding into law schools between 2008-2010,” significantly fewer people applied to law school during the Great Recession than during the boom years immediately preceding it.
Total applicants to ABA law schools 2004-2006: 285,100
Total applicants to ABA law schools 2008-2010: 257,900
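The per-school and applicant arithmetic above can be checked directly; the only inputs are the enrollment, school, and applicant counts quoted in the post:

```python
# Per-school 1L enrollment, 2003 vs. 2008, and the applicant decline.
enrolled_2003, schools_2003 = 48867, 187
enrolled_2008, schools_2008 = 49414, 200

per_school_2003 = enrolled_2003 / schools_2003   # ~261 1Ls per school
per_school_2008 = enrolled_2008 / schools_2008   # ~247 1Ls per school
pct_change = (per_school_2008 - per_school_2003) / per_school_2003 * 100
print(round(pct_change, 2))   # about -5.45, the ~5.4% per-school decline

# Applicant totals fell even as raw enrollment inched up.
applicants_2004_06 = 285_100
applicants_2008_10 = 257_900
applicant_drop = (applicants_2008_10 - applicants_2004_06) / applicants_2004_06 * 100
print(round(applicant_drop, 1))   # about -9.5
```

The point the code makes concrete: total 1L enrollment rose slightly only because the number of schools grew faster than the applicant pool.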
As I’ve noted elsewhere, the fact that demand for law school admission actually declined during the worst economic contraction since the 1930s should have been a warning sign to legal academia. It wasn’t, because instead law schools slashed admission standards, and managed to slightly increase total enrollment — although, again, enrollment fell per school — in the face of declining demand, leading innumerate observers like Diamond, who has a Ph.D. in political science as well as a J.D., to conclude that demand was increasing. (Diamond claims that big year-end bonuses for lawyers at a handful of hyper-elite law firms mean prosperity is just around the corner for the average law graduate, which is akin to arguing that Robert Axelrod’s salary is a good reason to enroll in a graduate program in political science.)
Since then things have gotten much, much worse. Over at the Legal Whiteboard, Jerry Organ has a fascinating post detailing the practical collapse of admission standards since 2010 (recall that standards had already slipped a good deal between 2004 and 2010, as the percentage of law school applicants who were admitted to at least one school increased by 23.6% between those years). It’s difficult to pick out the single most hair-raising stat that Organ has assembled, but here’s a good candidate: between 2010 and 2013, the percentage of law school matriculants with sub-145 LSAT scores increased by 56.8%. And that percentage almost certainly grew again this fall, as ABA law schools collectively admitted an astounding 80% of all applicants.
A few days ago, The Faculty Lounge featured an amusing exchange between David Frakt, the dean candidate at Florida Coastal who got kicked out of his presentation to the faculty because that presentation featured too many unpleasant facts about the school’s situation, and Jay Conison, dean of another of Infilaw’s egregious diploma mills. Conison blustered at length about the supposed success the Infilaw schools have had in getting matriculants with bad LSAT scores to pass the bar, but tellingly he provided no data to back up these claims, even after Frakt challenged him explicitly to do so. (This exchange took place three months after my Atlantic piece accusing the Infilaw schools of admitting large numbers of students who have no realistic chance of passing the bar, let alone actually becoming lawyers, so Dean Conison has had plenty of time to assemble a rebuttal. That none has been forthcoming speaks volumes).
The question that interests me, in a somewhat morbid way, regarding Diamond, Conison, and their ilk throughout American legal academia in this the year 2015 of the Christian Era, is the extent to which they actually believe what they say.
The lawyer and sociologist David Riesman described ideology as the kind of sincere mental state that allows a man to habitually believe his own propaganda. American legal academia apparently remains a very sincere place.
Jim Harbaugh is being introduced as the University of Michigan’s new head football coach today. Harbaugh has signed a contract worth a reported $48 million over six years. It’s unclear whether that figure, if accurate, includes potential bonus payments for winning conference and national titles, curing cancer, etc., or merely represents his base pay. (Some reports suggest that bonus incentives could potentially push Harbaugh’s compensation closer to ten million dollars per year.)
Update: The terms of Harbaugh’s contract are apparently somewhat fluid. He will be paid $7 million this year, which includes a $2 million signing bonus. After this year the AD will make a determination about appropriate deferred compensation and the like. The contract also includes unspecified performance bonuses. The minimum value of the contract, with no performance bonuses or deferred compensation, is $40.1 million over seven years. (This looks like a pretty slick move by Michigan’s AD Jim Hackett. By leaving deferred comp out of the original contract he holds down the up front annual salary number, and the potential backlash. Next year at this time they could up the total value of the contract to $8 million per year and it’s a small story, even locally).
Since it will take a few weeks to FOIA the documents let’s assume for now that his compensation will be $8 million per year.
Now on one hand this is obviously deplorable. Current average salaries at the University of Michigan outside the athletic department (which, unlike almost all college athletic departments in the USA, is actually self-funded) look like this:
Administrative poohbahs (president, deans etc.): Several hundred thousand dollars per year
Full professors: $167,000
Associate professors: $114,000
Assistant professors: $101,000
People who make the wheels go round (clerical staff, food service workers, janitors etc): $20,000-$40,000 generally.
Adjunct instructors, aka the people who do the majority of the actual teaching at the institution: A petrified starfish and a bowl of potpourri (parking passes may be provided on a case by case basis).
You can look up salary data at the school here.
So Jim Harbaugh is going to get paid as much per year as 70 University of Michigan professors, or 250 clerical employees, or a nearly infinite number of adjuncts. This seems . . . disturbing.
On the other hand, hiring him is quite likely going to end up being a big net positive for the coffers of the athletic department and even the university generally, so let’s hear it for “the market.” (For example, real estate developer and Miami Dolphins owner Steve Ross is a big Michigan football fan, and he’s expressed his affection for the program and the school by giving $100 million to the AD and another $100 million to the business school. He’s also rumored to be picking up part of Harbaugh’s compensation package).
On yet a third hand, the university can pay Harbaugh more than any other football coach in the known universe and still make a tidy profit on the deal only because college football in America is a multi-billion dollar industry that doesn’t really pay its primary labor force (in this regard, big-time football reflects the economic structure of the contemporary universities which host it).
So — how did we get here?
Something to keep in mind is that big-time college football has been an extremely popular sport in America for more than a century (indeed, until the 1960s it was more popular than the NFL). And debates about the exploitative economic structure of the game are nearly as old: I recently found a book published by Princeton and Michigan coach Fritz Crisler in 1934, and re-issued in 1948, in which Crisler addresses the apparently lively debate at the time regarding whether college football players should be paid overt wages, since, according to him, many were being paid covertly back in that simpler, more innocent time. (On an unrelated but fascinating side note, F. Scott Fitzgerald’s habit of regaling Crisler with alcohol-fueled late night phone calls featuring Fitzgerald’s creative ideas for helping the Princeton football team may actually have inspired the genesis of modern two-platoon football.)
Therefore big-time college football coaches have been very well paid, relatively speaking, for a very long time. But “relatively” is the key term here: (All dollar figures below are in constant 2014 dollars).
Woody Hayes, Ohio State, 1951: $113,534. Hayes was a 38-year-old first-year coach at football-crazed OSU in 1951, and his salary represented a whole lot of money back then. He was making 63% more than what was then the 95th percentile of family income, which means the hard-charging young coach was in at least the 98th and probably the 99th percentile of income in the country at the time (63% more than the 95th percentile of household income today puts a household well into the 98th percentile, and household income distribution was a good deal flatter during the socialist regimes of Presidents Truman and Eisenhower).
Bear Bryant, Alabama, 1958 (Bryant had just become Alabama’s athletic director as well as its football coach): $142,998. Bryant remained Alabama’s coach until 1982. He is reputed to have insisted throughout his career that his salary should always be at least one dollar less than that of the university’s president.
Hayden Fry, Southern Methodist, 1962: $101,654. Fry was Arkansas’ offensive coordinator when he took a phone call from Lamar Hunt, of the Dallas Hunt brothers, during warmups for the 1962 Orange Bowl, offering Fry the SMU job. He accepted without asking about the salary, and later discovered he was taking a pay cut from what he had been getting as the Razorbacks’ OC (Fry, by the way, played an important and courageous role in integrating college football in the south).
Bo Schembechler, Michigan, 1969: $135,127. Schembechler in 1969 was almost the same coach as Hayes had been in 1951 (one year older, in his first season, coming, as Hayes had, from Miami of Ohio). His salary was only about 19% higher than Hayes’ had been, despite the enormous increase in national wealth over the intervening 18 years (GDP exactly doubled in constant dollars over this time).
College football coaching salaries began to increase rapidly in the 1970s. TV money was beginning to pour into the game, although it was still a trickle relative to what it would become. A major change in the compensation structure for coaches took hold in this decade, which is that universities began to divide that compensation into an official university salary, and another sum, with the latter representing pay for ancillary activities, such as hosting a television show, putatively running a football camp associated with the school, and so forth.
So for example by 1981, Schembechler, who had the highest winning percentage of any coach during the 1970s, was being paid a little more than $155,000 in university salary and $130,000 for other contractual obligations, making his total compensation $285,771 (again in 2014 dollars).
Then in January 1982, Texas A&M, awash in oil money and eager to challenge the University of Texas for football supremacy in the Lone Star State, stunned the college football world by offering Schembechler the then-staggering sum of $250,000 per year in 1982 dollars, which would have more than doubled his salary. (This was equivalent to $611,790 in 2014 dollars).
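The inflation adjustment in the TAMU anecdote is just a ratio of price index values. A quick sketch, where the CPI-U index numbers are approximate annual averages I am assuming (they are not from the post):

```python
# Convert a 1982 salary into 2014 dollars via a CPI-U ratio.
# Index values are assumed approximate annual averages (1982-84 = 100).
CPI_1982 = 96.5
CPI_2014 = 236.7

offer_1982 = 250_000
offer_2014 = offer_1982 * CPI_2014 / CPI_1982
print(round(offer_2014))   # roughly $613,000, close to the $611,790 quoted
```

The same two-line ratio generates every "in 2014 dollars" figure in this post, given the right index values for each year.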
Schembechler turned TAMU down (Domino’s Pizza king Tom Monaghan gave him a Columbus, Ohio franchise), but Pittsburgh coach Jackie Sherrill didn’t, inspiring this amusingly quaint article in the New York Times, which wrestles with the incredible proposition that any employee of a university could be paid a quarter million dollars per year. (Of course today even some non-sports-related university employees make millions).
From there it was off to the races. Nominal coaching salary milestones, with inflation adjustments:
Bobby Bowden, Florida State, 1996: $1,000,000 ($1,505,105 in 2014 dollars)
Steve Spurrier, Florida, 2001: $2,100,000 ($2,800,209 in 2014 dollars)
Bob Stoops, Oklahoma, 2006: $3,000,000 ($3,154,152 in 2014 dollars)
Nick Saban, Alabama, 2007: $4,000,000 ($4,555,777 in 2014 dollars)
Nick Saban, Alabama, 2014: $7,000,000
And now we apparently have an eight to ten million dollar man (I should add that as a Michigan football fan I heartily approve of this particular development, while sincerely deploring the overall system that has brought it about).
A potential irony in all this is that the entertainment industry in general, and sports in particular, is one of the very few areas of the economy where it may actually be possible to construct an efficiency-regarding justification for gargantuan salaries (in the context of college sports, of course, this ignores the grotesque spectacle of the players receiving salaries of zero). It’s a whole lot easier to explain why it makes sense to pay Tom Brady $15 million per year than it is to make a similar argument for why last year a couple of dozen hedge fund managers should have pulled down average compensation packages 60 times larger than that.
Of course efficiency is one thing — and let’s not forget the little detail that Harbaugh’s players won’t be paid anything for their part in this multi-billion dollar annual extravaganza — and justice is another. I suggest it is or ought to be a basic tenet of any even vaguely left or progressive political perspective that any social system in which some people have salaries hundreds — let alone thousands and tens of thousands — of times larger than those of other people* is in need of basic reform.
*Let alone people in the same institution, let alone people in the same non-profit tax-supported educational institution!
The title of this post is more in the way of a question regarding whether such a thing exists. The reason I’m asking is that, in the course of researching higher education costs in America back to the middle of the 19th century, I discovered something that flew in the face of what I had always assumed about how inflation works in a money economy. What I assumed was that a moderate amount of price inflation is normal — that is, continual rather than episodic — in such economies, and that deflation is rare. Furthermore, I thought (to the extent that unexamined assumptions can be called thinking) any significant or prolonged deflation is an economic disaster, and is something to be feared and avoided even more than hyper-inflation.
Again, these beliefs were the product of nothing more than the fact that this is how things have “always” been as long as I can remember, and that my extremely limited historical knowledge of the subject stretched back no further than the Great Depression, when deflation did help wreak havoc on both the American and world economy.
As many readers no doubt already know, this historical view of inflation and deflation in America — which I suspect, based on my study featuring an N = 1, is quite widespread — is totally wrong.
In fact, until about 75 years ago, deflation had been, as a historical matter, as common in America as inflation. This fact produced what were to me some shocking revelations, including:
(1) Overall prices in the American economy were about the same at the beginning of FDR’s presidency as they had been at the end of George Washington’s second term.
(2) Prices were nearly 25% lower in 1900 than they were in 1800 — that is, on net the 19th century was deflationary.
(3) Prior to the middle of the 20th century, significant inflation, rather than being seen as a normal thing, was very closely associated with, and clearly caused by, war. Indeed, prices would have been very strongly deflationary over a 200-year period if not for bouts of severe inflation during the Revolutionary War, the Civil War, and World War I.
(4) If we consider American economic history from colonial times to the present, the last 75 years have been an almost freakish exception to the normal course of events, in which prices are as apt to fall as they are to rise.
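Point (2) above is worth translating into an annual rate: a 25% net price decline spread over the whole 19th century implies a surprisingly tiny average rate of deflation. A quick back-of-the-envelope, assuming the decline were perfectly smooth:

```python
# Average annual rate implied by a 25% net price decline over 100 years.
ratio_1900_over_1800 = 0.75          # prices ~25% lower in 1900 than in 1800
annual_rate = ratio_1900_over_1800 ** (1 / 100) - 1
print(f"{annual_rate:.3%}")   # about -0.287% per year
```

In other words, a century of net deflation is consistent with prices drifting down by only about a quarter of a percent per year on average, with much larger war-driven swings in both directions along the way.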
I suspect the last point has had some important cultural and political effects — hence the title of this post. What are the consequences of a generalized sense that prices always rise? Let me suggest just one of many possibilities: people may become relatively desensitized to real as opposed to nominal price increases, because over the long run nominal price increases become so extreme.*
In other words, a general sense that “things cost so much more today than they did back in X” may tend to blur distinctions between different sorts of things, some of which haven’t actually become more expensive in real terms (or have become much cheaper), and some of which very much have.
This of course is just one of many possibilities. In any case, I’m curious about the extent to which the historical anomaly of continual inflation since the end of the Great Depression has been written about, especially in regard to its possible cultural effects.
*I now understand something that puzzled me when I first read Keynes’ “Economic Possibilities For Our Grandchildren,” which was his statement of money values in nominal terms when he compared the 18th century with the early 20th century. Habituated as we are to a world in which prices always rise, I naturally assumed that nominal prices in the former and latter periods had nothing to do with comparative real prices, just as looking at, for example, the nominal price of a car in 1950 tells you nothing about the real price of a car then relative to now. But it turns out that, until the last few decades, economists could treat even 200-year stretches of time as featuring relatively stable prices in the long run!
An LGM reader writes:
I just graduated from law school this past spring, and passed the bar.
I applied for a job and completed a phone interview. The actual “interview” portion was very short, maybe 5 minutes. The attorney then asked me, without offering me a job or discussing compensation at all, to write a motion (a real motion, for a real case that she sent me materials on) as a sort of “let’s see what you can do” exercise. So now I will write a motion for free, for a job I may not even get. I may be too cynical, but to me this does not pass the smell test.
A fair response would be to ask if I had spoken to my career services office. I have, and they did not express much concern. But I suspect that they are more concerned about their percentage of graduates with (legal?) jobs 9 months out. I was wondering if you had heard of legal job applicants being strung along and doing unpaid work in similar circumstances for the promise of a job that may not/does not ever materialize.
This kind of thing is actually increasingly common: there is now even a fancy title — the “gratuitous service appointment” — for spending a year or more working full-time as a government lawyer without getting paid. (If you’re wondering how this is legal, the Fair Labor Standards Act is riddled with exceptions for, among other things, members of “learned professions.”)
I’ve published an essay in Radical Teacher on the political implications for law teachers of dealing with the reality of the legal precariat:
The contemporary employment market for new law graduates has taken on a distinctly neo-feudal flavor, in which a willingness to enter into one or more unpaid apprenticeships is becoming a pre-condition for obtaining a paying job. (On the other hand, medieval guilds generally required masters to house and feed their apprentices; new law graduates are not so lucky.)
Other members of the legal precariat work for pay, but under conditions of employment typical of those endured by casual labor, even when that labor wears a white collar. These include wages that are so low relative to working hours that some graduates find themselves making less than the minimum wage (minimum wage restrictions do not apply to salaried members of professions), extreme employment instability, no fringe benefits, and the sense of powerlessness that comes from knowing that one can be replaced at any moment by someone equally qualified to do one’s job, and even more desperate to collect its meager compensation.
It should be unnecessary to point out that such a system both reinforces and strengthens class stratification. Children of privilege, who can rely on their families to pay the rent and the grocery bills during an awkward year or two while they work for little or literally no pay, in order to get their feet inside the proverbial doors, will end up with the real jobs that eventually appear behind those doors, while many less privileged graduates will have to abandon their dreams of a legal career altogether. . .
Given these grim facts, law teachers now face a difficult conundrum. In a world in which university administrators increasingly speak in a manner that is hard to distinguish from the professional patois of business consultants – in which educational institutions are treated as “brands” to be “synergized” in the appropriate “target markets” and so forth – prudent law faculty will be tempted to suppress any impulse to engage in critical pedagogy regarding the nascent professional and personal crisis faced by so many of their students. They will instead keep, as it were, pushing the product.
Yet such prudence, while no doubt conducive to both professional advancement and personal happiness, requires a certain mortification of both the intellect and the capacity for moral action (Here we can recall Flaubert’s dictum that “to be stupid, selfish, and have good health are three requirements for happiness, though if stupidity is lacking, all is lost.”).
In any case, the law school reform movement has acquired sufficient notoriety that it is becoming increasingly difficult for individual law teachers and law schools as institutions to employ silence and denial as either an unconscious psychological defense mechanism or a conscious business strategy. Indeed, in the contemporary American law school, the employment and debt crisis faced by our students is always present in every encounter with them, if only implicitly, and it is now an abrogation of professional responsibility not to address it at appropriate times.
It’s been almost exactly four years since the publication of David Segal’s original NY Times piece on the employment crisis overtaking recent law school graduates. Inside legal academia, Segal’s article was met largely with scorn, and in retrospect it’s easy to see why.
At the time, transparency regarding law graduate employment outcomes essentially didn’t exist, ABA law schools had just admitted their biggest first-year class ever, tuition was at an all-time high and still rising much faster than inflation (which demonstrated that law school was a fine investment because The Market), and any so-called employment “crisis” among law grads was obviously a temporary function of the recession, and was seriously exaggerated by bitter scamblog losers who went to bad law schools, and should have known better all along because Rational Maximizing of Individual Utility. Also, too, Caveat Emptor.
Today, well . . .
Elizabeth Olson and David Segal in today’s NYT:
The bottom of the law school market just keeps on dropping.
Enrollment numbers of first-year law students have sunk to levels not seen since 1973, when there were 53 fewer law schools in the United States, according to the figures just released by the American Bar Association. The 37,924 full- and part-time students who started classes in 2014 represent a 30 percent decline from just four years ago, when enrollment peaked at 52,488.
The recession was in full swing then, and many college graduates looked at law school, as they have many times in the past, as a sure ticket to a good job. Now, with the economy slowly rebounding, a growing number of college graduates are examining the costs of attending law school and the available jobs and deciding that it is not worth the money.
“People are coming to terms with the fact that this decline is the product of long-term structural changes that are just not going away,” said Paul F. Campos, a professor at the University of Colorado’s law school. “It’s kind of a watershed moment.”
Even after all this time, there’s a part of me that’s genuinely surprised that so many law schools are at present managing to lose so much money. I mean consider this “business” model: The government will loan anyone to whom you choose to sell your product the full price of that product, subject to essentially no actuarial controls. And here’s the kicker: you can charge whatever you want, no questions asked! You get the proceeds of the loans, the buyers and eventually the taxpayers take all the risk, and you can do whatever you want with the money.
Losing money in this situation should be pretty hard to do, but if history has taught us anything, it’s that there is no amount of money that can’t be blown on yachts with three helipads, bottle service, and university administration.
Anyway, even back in 2010 the warning signs should have been plentiful, as that largest first-year class ever, the size of which was necessary to pay for the helipads etc., was gathered in by cutting admission standards quite a bit from where they were a few years earlier. That process has since accelerated, with the result that, this fall, it appears that 80% of law school applicants were admitted to at least one ABA school to which they applied.
Although the total number of applicants who were admitted in 2014 isn’t yet available, the 80% figure can be deduced by observing that 54,527 applicants resulted in 37,924 matriculants. In 2013, 86.8% of admitted applicants ended up matriculating somewhere, and this yield tends to be very stable. If we assume the same yield held in 2014, then roughly 37,924 / 0.868 ≈ 43,700 applicants were admitted, or about 80% of the total. [Updated: The figures are now available from LSAC. They indicate that 43,500 applicants were admitted, meaning that 79.8% of applicants were offered admission to at least one law school in the 2013-14 cycle.] On that assumption, the overall admission rate for ABA law school applicants will have looked like this over the past decade:
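The deduction can be sanity-checked in a few lines. This is just a back-of-the-envelope sketch using the figures quoted in the post, with the 2013 yield assumed to carry over to 2014:

```python
# Back-of-the-envelope check of the ~80% admission-rate deduction.
# All figures are from the post; the "yield" (share of admitted
# applicants who matriculate) is the 2013 value, assumed stable.
applicants = 54_527
matriculants = 37_924
yield_rate = 0.868  # 2013 matriculation rate among admitted applicants

implied_admits = matriculants / yield_rate
implied_rate = implied_admits / applicants
print(f"implied admits: {implied_admits:,.0f}")        # roughly 43,700
print(f"implied admission rate: {implied_rate:.1%}")   # roughly 80%

# The later LSAC figure of 43,500 actual admits gives:
actual_rate = 43_500 / applicants
print(f"actual admission rate: {actual_rate:.1%}")     # 79.8%
```

The implied figure (about 43,700 admits) lands within a few hundred of the number LSAC later reported, which is why the 80% estimate held up.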
Here’s how this looks at the individual school level. I randomly looked up the percentage of admitted applicants at 13 schools: the holy trinity of Harvard, Yale, and Stanford, and then ten schools ranging from the sub-elite to the sub-basement. The first percentage represents applicants admitted in 2004. The second is the same figure for 2014:
American: 24.6% → 49.7%
Boston College: 16.6% → 43.9%
Brooklyn: 23% → 53.2%
UC-Hastings: 19.5% → 49.2%
UCLA: 13.6% → 28.1%
Florida Coastal: 35.6% → 77.7%
Fordham: 19.3% → 35.5%
Hofstra: 26.3% → 61.2%
Illinois: 23.1% → 41.9%
John Marshall: 35.7% → 72.9%
Harvard: 11.3% → 15.4%
Stanford: 7.7% → 9.1%
Yale: 6.5% → 8.9%
So even the high rent district has felt a slight sting, though it’s nothing compared to what’s going on in the outer suburbs.
As to when and where this will all end, applications are down another ten percent so far this year.
Most of the country’s largest theater chains have decided not to show Sony’s “The Interview,” according to a person with direct knowledge of the matter.
The decision follows a strange warning on Tuesday from anonymous hackers that people should avoid going to theaters where “The Interview” is playing.
The comedic film is still scheduled to come out on Christmas Day. Sony (SNE) does not plan to pull the film altogether, but the studio has indicated it won’t object if theaters decide not to show the film, a second source said.
Among the top chains that have decided to not show the movie are Regal (RGC), Cinemark (CNK), Carmike Cinemas (CKEC), Arclight and Southern.
Another smaller chain, Bow Tie Cinemas, has also dropped its plans to show the film.
“It is our mission to ensure the safety and comfort of our guests and employees,” the company said in a statement.
The shockwaves from the Sony hack have finally reached Hollywood’s development community, as New Regency has pulled the plug on the Steve Carell movie “Pyongyang,” which Gore Verbinski had been prepping for a March start date, an individual familiar with the project has told TheWrap.
Based on the graphic novel by Guy Delisle, “Pyongyang” is a paranoid thriller about a Westerner’s experiences working in North Korea for a year.