Cocaine cut with the veterinary drug levamisole could be the culprit in a flurry of flesh-eating disease in New York and Los Angeles.
The drug, used to deworm cattle, pigs and sheep, can rot the skin off noses, ears and cheeks. And over 80 percent of the country’s coke supply contains it. . . .
[Dr. Noah] Craft is one of several doctors across the country who have linked the rotting skin to tainted coke. The gruesome wounds surface days after a hit because of an immune reaction that attacks the blood vessels supplying the skin. Without blood, the skin starves and suffocates.
For a period in the 1990s, levamisole was hailed as an effective component in adjuvant chemotherapy for stage III colon cancer; then a few studies came out showing that the drug significantly worsened the prognosis for patients (and caused potentially life-threatening side effects like white cell crashes), and it was eliminated from the cocktails. Now it’s used almost exclusively on animals. Its use as a cutting agent in cocaine is apparently a recent (and evolving) phenomenon, as this fascinating 2010 piece from The Stranger describes in great detail — among other things, levamisole is virtually undetectable, adds significant bulk to crack (while surviving the purification process), and may potentiate cocaine’s effects. And if you scan back through Google news since the mid-1990s, you can track the growing estimates of the presence of levamisole in the American coke supply, from around 30 percent in 2005 to current estimates of 80 percent or more.
And to think that some people insist that our War on Drugs doesn’t produce measurable results.
So the usual suspects are evidently pissing themselves over the latest, wildly overstated claim that a reduction in sunspot activity will initiate a “mini ice age” and make Al Gore fatter and weepier. Never mind that one of the participants in the relevant study has rejected the wingnut gloss it’s received over the past 24 hours; the larger problem is that AGW doubters insist on citing a hypothesis that has been reluctantly abandoned by one of its original proponents and (for lack of corroborating evidence) ignored or dismissed by nearly everyone else.
Put briefly, climate change “skeptics” often propose an argument based in large part on a small group of studies from the 1990s that subsequent research failed to corroborate. Between 1991 and 1997, two Danish scientists — Eigil Friis-Christensen and Henrik Svensmark — developed a notion that shifts in the intensity of sunspot activity correlate with upward or downward shifts in global temperatures. The mechanism for this relationship, they suggested, was a link between sunspots and cloud cover, mediated by cosmic rays: more sunspots meant fewer cosmic rays reaching the atmosphere; fewer cosmic rays, fewer clouds; fewer clouds, warmer temperatures. They graphed the data that seemed to validate their hypothesis.
It all sounded plausible, except for the part about the bullshit:
[T]he two key graphs [from Friis-Christensen and Svensmark's work] are based on flawed data. There is no correlation between global warming and solar activity, and no correlation between cloud cover and cosmic rays, the critics say.
The flaws were first identified by Peter Laut, a Danish scientist who was once science adviser to the Danish Energy Agency. Laut, now retired, demonstrated in a study first aired in 2000 and published in a peer-reviewed journal in 2003 that both graphs contained serious errors. When these flaws were corrected, the apparent correlations between global warming and solar activity, and cosmic rays and cloud cover, disappeared.
. . . Six leading experts, including one Nobel laureate, agreed with Laut’s analysis that the graphs of Friis-Christensen and Svensmark showing apparent correlations between global warming, sunspots and cosmic rays are deeply flawed.
Friis-Christensen now accepts that any correlation between sunspots and global warming that he may have identified in the 1991 study has since broken down. There is, he said, a clear “divergence” between the sunspots and global temperatures after 1986, which shows that the present warming period cannot be explained by solar activity alone.
As anyone familiar with the research would tell you, the relationship between sunspot activity and global temperatures is poorly understood. Aside from the dubious mechanism proposed by Svensmark and others, there does seem to be a measurable (though tiny) correlation between sunspots and overall solar luminosity. But as Joe Romm points out, even if we were looking forward to a “grand minimum” phase of solar activity — a questionable prediction in the first place — the decrease in luminosity would perhaps amount to a temperature reduction of .1 to .3 degrees Celsius. (That’s a decimal point in there, by the way.)
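If you want to see why the number lands in tenths of a degree, a back-of-envelope sketch gets you there. The specific inputs below are my assumptions, not figures from Romm’s post: a Maunder-minimum-scale drop of roughly 0.1 percent in total solar irradiance, an Earth albedo of 0.3, and a mid-range climate sensitivity of about 0.8 K per W/m² including feedbacks.

```python
# Rough sketch: how much cooling would a "grand minimum" actually buy?
# All parameter values below are illustrative assumptions.
TSI = 1361.0          # W/m^2, present-day total solar irradiance
DELTA_FRAC = 0.001    # assumed ~0.1% drop during a grand minimum
ALBEDO = 0.3          # fraction of sunlight reflected back to space
SENSITIVITY = 0.8     # K per W/m^2 of forcing, assumed mid-range value

# Absorbed sunlight is spread over the whole sphere, hence the factor of 4.
delta_forcing = TSI * DELTA_FRAC * (1 - ALBEDO) / 4.0
delta_temp = SENSITIVITY * delta_forcing
print(f"forcing change: {delta_forcing:.2f} W/m^2, cooling: ~{delta_temp:.2f} K")
```

Under those assumptions the answer comes out around two-tenths of a degree, which is squarely in Romm’s .1-to-.3 range and nowhere near “mini ice age” territory.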
Lacking a stronger mechanism to link solar variation with climate change, it would seem the deniers have only magic and miracles at their disposal. Which I guess explains a lot after all.
Given that our only apparent alternatives are to consider Anthony Weiner’s dick or Sarah Palin’s application of the Barnum Effect to the history of the American Revolution, allow me to remind everyone that today also happens to be the International Day of Slayer. Bludgeon yourselves accordingly.
Via TPM, I see that Mike Huckabee is marketing a home remedy for whatever historical literacy your kids might be acquiring in school. As I understand it, Huckabee has marshaled the power of 1990s computer animation software to reveal truths about the American past that only a quintet of impressionable, time-traveling teenagers — one of whom is evidently named “Barley” — could discover. The trailers suggest the series won’t be quite as inspiring as the Drunk History oeuvre, but the no-risk, 30-day introductory offer comes with a pair of snappy blue binoculars and something called a “shoulder sack,” so there’s that to consider.
But before I squander my completely undeserved share of LGM’s advertising revenue on a trial membership, I need to know if the videos possess any scholarly rigor. Well, consider me reassured. To guarantee that the videos meet Huckabee’s exacting standard of historical accuracy, multiple levels of quality control have been put in place. Here’s how:
First, our lead researcher goes through various primary and secondary sources, including printed and online resources to determine the most important events and themes that will be included in each episode. Those events and themes are then woven into a script in which the animated characters experience the history first-hand. After its completion, the script is reviewed by at least two members of Learn Our History’s Council of Masters, who suggest changes to make the film as historically accurate as possible.
So it’s peer-reviewed! By a Council of Masters, no less!
Unfortunately, this particular bukkake party includes the University of Dayton’s Larry Schweikart, whom you may remember from such books as Bill Clinton’s Penis Killed the Indians and Four Dozen Strawmen in Search of an Argument. Schweikart has developed a comfortable niche for himself as a Fox News-approved historian, and my guess is that Huckabee’s new series will turn out to be more or less a badly-animated version of Schweikart’s triumphalist narrative of US history. It would be difficult to survey adequately the depths of hackery this fellow has mined in the last decade. I’ve tried — see the last two links — but you can judge for yourself here, in the introduction to Schweikart’s book about warfare and why the United States, like Charlie Sheen, is always winning. (Surprise! The hippies help!) Schweikart has also written a new book about applying the wisdom of the Founders to contemporary political issues. In the introduction, he warns us that “[w]hen [the food Nazis] come for your Ho Hos, they won’t stop until they dictate every morsel that goes into your mouth.” Yeah, well, something tells me the kids from the Time Travel Academy will have something to say about that.
The rest of the Council is considerably less interesting, although it does include a fellow who received his most recent degree from Wayland Smithers Baptist University and now teaches at Bryan College in Dayton, Tennessee. I can’t speculate precisely what historical contributions he might offer to the series, but I expect he’ll be available for Huckabee’s next project, “Fucking Magnets: How Do They Work?”
When outlining “The Case for Cursive,” a journalist ought to provide an actually compelling argument. The best substitute, it seems, runs something like this:
Might people who write only by printing — in block letters, or perhaps with a sloppy, squiggly signature — be more at risk for forgery? Is the development of a fine motor skill thwarted by an aversion to cursive handwriting? And what happens when young people who are not familiar with cursive have to read historical documents like the Constitution?
I don’t see why they would be. Why standardized, grade-school instruction in cursive handwriting should be celebrated as a useful device in the war against forgery is beyond my comprehension in the era of electronic identity. More broadly, the assumption that cursive is more difficult to forge rests, I suspect, on the dubious premise that cursive script supplies a graphic fingerprint, an expression of individuality that surpasses that of any other style of writing. I can’t imagine there’s much — if any — evidence to back up such a claim.
Probably not. The Palmer Method — which I believe still serves as the deep background for (the obviously disintegrating) cursive handwriting instruction in the US — emphasized proximal muscle movements (e.g., shoulder and upper arm) rather than distal muscles on the assumption that fine motor skills would “evolve” from the stability provided by the larger muscles. But as I understand the literature, the relationship between proximal and distal muscle development isn’t entirely clear when it comes to handwriting, and — most importantly — there’s nothing particularly special about a cursive style that facilitates any of the motor advantages that are claimed for it. Handwriting in general is obviously still essential to education, and there are important links between legible handwriting and cognitive development, visual/perceptual acuity, motor control and planning, and academic performance and self-esteem more broadly. But while writing still constitutes a huge percentage of what kids do in school, it’s certainly not the only means of developing fine motor skills.
Only an archivist would care. There’s not much to say about this except that the author of this piece clearly needed to come up with a third reason to fret about the disappearance of cursive handwriting instruction. That this is what she came up with tells us pretty much everything we need to know about the severity of the crisis.
Now, I’m sure my views on cursive handwriting are shaped to significant degrees by the humiliation of being exiled to remedial handwriting class for several weeks during the 5th grade. A perennial “C” student in penmanship, I was neither practically assisted nor aesthetically inspired by the therapist’s suggestion that I imagine Mark Spitz gliding through the water as I helplessly stabbed at the paper in front of me. (This was 1981, mind you. How I was supposed to visualize Mark Spitz — who won most of his Olympic medals when I was a year old — during the pre-YouTube era is anyone’s guess. I suspect Eric Heiden would have been more comprehensible to me, at least as a metaphor.) In any event, my cursive skills continued to moulder through the years until at some point in high school we were quietly untethered from the style and allowed to submit our work in whatever fashion we chose. My handwriting continues to be one of our generation’s greater atrocities, but I can’t imagine I would have fared any better a century ago, when my teachers would have clubbed me on the shins for failing to articulate a proper upper-case “Q.”
I awoke this morning with a notion of ridiculing the marginal voices on the intertubes who’d begun winding themselves into knots over a new German “study” claiming that compact fluorescent lights will transform our skulls into husks of cauliflower-sized tumors. Now, however, I see that Reynolds and Althouse are promoting the story, which means it’s reached Full Wingnut Velocity more quickly than I’d expected.
For those keeping score at home, the damning new evidence against CFLs consists of a report by a small German industrial laboratory that appears to spend most of its time doing air quality consulting for businesses and building contractors. The lab — whose website can be read in butchered Yahoo translation here — has no academic affiliation, has never hosted research that’s appeared in a peer-reviewed scientific journal, and doesn’t appear to employ anyone with an advanced degree in chemistry (much less medicine in general or oncology in particular). But their report evidently claims that “several carcinogenic chemicals and toxins were released when the environmentally-friendly compact fluorescent lamps (CFLs) were switched on, including phenol, naphthalene and styrene.”
Sounds terribly frightening, except that (a) it’s not entirely clear that any of these chemicals are actually carcinogenic in humans, and (b) all three — though toxic in truly massive doses — are abundant at low levels in virtually any indoor setting. Phenol, for example, makes up about 12 percent of the putty that’s used to fuse the metal base of a bulb socket to the glass bulb itself; so far as I’m aware, there’s nothing special about compact fluorescent bulbs that would surpass the minuscule (and completely innocuous) levels of gaseous phenol that might be emitted by a warming socket. Similarly, naphthalene (usually in the form of phthalic anhydride) and styrene are found in an almost endless variety of the ordinary crap that clutters Glenn Reynolds’ house. And tonight’s box of merlot will do more damage to Ann Althouse’s liver than the lifetime’s worth of bulbs that George W. Bush will have made her purchase.
CFLs, to put it bluntly, would have to produce an implausibly massive fog of these chemicals to pose even the most remote chance of acute toxicity, much less a distant risk of cancer. But since I assume Al Gore is still fat, there’s probably nothing to be gained from pointing this out.
If you thought the advocates of sham medicine would have the proper sense to avoid offering their services to the Japanese people in their time of distress — well, you would apparently be wrong:
(R)adioactive material carried by wind and air currents may spread contaminated material to neighboring islands and countries. For all concerned, there are protective steps that can be taken with homeopathy. Key remedies that have been used either in research or historically to prevent or treat radiation poisoning include the following: Cadmium iodide; Cadmium-sulph; Phosphorus; Strontium-carbonicum; and X-ray. If at risk of radiation exposure, any one of the above remedies may be taken as an emergency response, three times a day in a 30C potency. Do not exceed 6 doses without guidance from your homeopath.
The last line is especially hilarious, because for chrissakes, we wouldn’t want anyone overdosing on magical homeopathic water.
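For anyone who hasn’t run the numbers on why “magical homeopathic water” is the right description: a 30C potency means thirty successive 1:100 dilutions, a factor of 10⁶⁰. A quick sketch (assuming, generously, a full mole of active substance in the starting tincture) shows what survives:

```python
# Back-of-envelope check on what a homeopathic "30C potency" actually contains.
# 30C = thirty successive 1:100 dilutions, i.e. a total dilution of 100**30.
AVOGADRO = 6.022e23        # molecules in one mole

dilution_factor = 100 ** 30          # 1e60
# Generous assumption: the undiluted tincture held one full mole of remedy.
molecules_remaining = AVOGADRO / dilution_factor
print(f"expected molecules left: {molecules_remaining:.1e}")
```

The expected count comes out around 10⁻³⁷ molecules, i.e. the odds that even one molecule of the original substance remains in the dose are effectively zero. Overdose away.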
The folks responsible for the e-mail quoted above — an Australian group called, inexplicably, “Homeopathy Plus” — have helpfully provided the world with a ponderous, gibbering series of dubious claims about the efficacy of various homeopathic remedies for treating the side effects of radiotherapy and chemotherapy (as if there were any meaningful comparison between fractionated, 60 gray blasts of photon beams and, say, uncontrolled exposure to Cesium-137). To their mild credit, our water-bearing friends don’t advise cancer patients to forego treatments that have actually been proven effective. But they do warn that homeopathic remedies are so overwhelmingly powerful that they must be used with caution and proper timing. To wit:
Not only can homeopathy treat many problems, it can also prevent them . . . . For this reason homoeopathic [sic] remedies for chemotherapy and radiotherapy side-effects should not be used AHEAD of treatment as they may also block the cell destroying effects of these approaches. Until more research is available in this area, it would be wise to use homeopathy only as side-effects occur AFTER treatment.
Good to know!
Meantime, over at the Hippie Lancet, er, Huffington Post, someone is reminding us that lots of miso soup and brown rice will surely thwart the effects of radiation poisoning. Also good to know!
We’re reading Michael Taussig’s ubiquitous essay on the “Space of Death” this morning in my introductory social science seminar, and in the course of trying to find something interesting to launch the discussion, I happened upon this brief New Yorker piece about Taussig’s apocalypse seminar at Columbia.
The discussion moved on to Freud, Adorno, diarrhea-related fatalities, the banality of long-term catastrophe versus the excitement of instant apocalypse, Surrealism, a friend who started doing yoga after her father died, drowning polar bears, Jon Stewart, and, finally, swine flu and whether it is a sign of a real or a “mediated” apocalypse.
Taussig interrupted. “If you were living in Mexico City, how on top of your game would you really be?” he asked. “How well could you cope with catastrophe unfolding around you? That’s really where I’d like to leave this class.”
“Oh!” he said, as students solemnly loaded their backpacks. “Don’t forget! Potluck tomorrow! Eight o’clock!”
Wearing gray wool uniforms, hoop skirts, leather jackets and business suits, several hundred men and women marched to the Alabama Statehouse on Saturday afternoon, where they delivered defiant speeches, fired heavy artillery, and swore in an amateur actor playing Jefferson Davis as president of the Confederacy, 150 years and one day after the event took place.
Confederate minimizers have always appreciated Jefferson Davis’ inaugural address because it allows them to pretend that the defense of slavery had nothing to do with secession and the formation of the confederate government. Unlike the constitution to which he swore allegiance — and unlike the Confederate apostles who promoted disunion throughout December 1860 and January 1861 — Davis had the delicate taste to refrain from actually using the word “slavery” in his address. At the same time, however, it takes an extraordinarily naive reading of that speech to miss the central point of the Confederacy’s political revolution. For instance:
With a Constitution differing only from that of our fathers in so far as it is explanatory of their well-known intent, freed from sectional conflicts, which have interfered with the pursuit of the general welfare, it is not unreasonable to expect that States from which we have recently parted may seek to unite their fortunes to ours under the Government which we have instituted. For this your Constitution makes adequate provision; but beyond this, if I mistake not the judgment and will of the people, a reunion with the States from which we have separated is neither practicable nor desirable. To increase the power, develop the resources, and promote the happiness of the Confederacy, it is requisite that there should be so much of homogeneity that the welfare of every portion shall be the aim of the whole. When this does not exist, antagonisms are engendered which must and should result in separation.
Everyone in February 1861 would have understood that the “well-known intent” of the Constitution, so far as the Confederate leadership was concerned, was to protect all species of property and prohibit the federal government from discriminating against the (non-trans-Atlantic) traffic in human property. Since Lincoln was elected in part on the strength of his party’s vow to resist the Dred Scott decision — a ruling that every good slaveholder regarded as a vindication of the Constitution’s “well-known intent” to protect slavery everywhere — Confederate mobilizers once again yowled, as they had for the better part of the past fifteen years, that Northern madmen were determined to harm the sectional “welfare” of the slave power. The ascendancy of Black Republicanism threatened the “homogeneity” of values needed to view the extension of slavery as an essential national good.
Davis’ inaugural address, in other words, did nothing to deny that slavery was central to Southern states’ rights nationalism; rather, it chose to make all the usual pro-slavery arguments in more opaque form. (His second inaugural address, which almost no one ever thinks about, is somewhat less guarded than his first effort. There, he continues to avoid using the word “slavery,” but he nevertheless fumes about the North’s “warfare on the domestic institutions of the Southern states” and the South’s need to maintain “our ancient institutions.” And if for some reason you aren’t familiar with 19th century pro-slavery euphemisms, “institutions” is loosely translated as “owning some motherfuckers.”)
At any rate, all of this should be plain to anyone who isn’t a complete moron. What you probably didn’t realize, however, is that the plot against Southern heritage is exactly like all the horrible things that happened to Harry Potter.