A rant:
In the pecking order of physicians, both in terms of salary and status, I'm not especially high up there. As an infectious disease doc, I make a bit more than primary care docs, a bit less than hospitalists, a lot less than some of my fellow internal medicine subspecialists such as gastroenterologists and cardiologists, and a whole lot less than general and plastic surgeons. All of which is fine with me, as salary wasn't the main reason I got into medicine in the first place, and I have always put very little stock in status. Moreover, I have always maintained a huge level of respect for the vast majority of cardiologists and surgeons, most of whom are not only consummate physicians but incredibly hard workers as well. I'm content to play the bottom-feeding catfish to their swordfish (cardiologists, in this somewhat strained analogy) or shark (general surgeons, of course).
Also in this metaphorical fishbowl of American medicine are the clownfishes: small but beautiful, always the envy of the other fish in that they draw admiring stares from people while flitting through the tank. Orthopedic surgeons are the clownfishes. They're not a big subspecialty: of the roughly 22,000 residency positions available last year, only 641 were for orthopedics. But oh, do they do well: the median income of an orthopod is estimated to be over $400,000 (!). Hence, a lot of very good medical students in the US work very, very hard as they pine for the joys of knee arthroscopy, laminectomy, and hip replacement, to say nothing of the Cabernet Sauvignon or trips to Bermuda that await.
One of the central ironies of American medicine is that several of these very fine students, whom many a prestigious Internal Medicine or General Surgery program would be delighted to train, will have spent an inordinate amount of time and energy learning medicine only to forget the vast majority of it during their residencies. You see, the orthopedic surgery residency takes people who have doctorates in medicine and turns them into doctors of bones! They spend their residency years un-learning all the medicine that their expensive education gave them in the first place. My experience is that most orthopods can't even deal with the simplest postoperative medical issue for their patients, and they tend to punt to a medical consult problems that even a third-year medical student could handle competently.
Again, bone surgery is not what interests me, so I'm not trying to rain on their parade, and I'm not trying to cry about unequal compensation. I do think, however, that with all that compensation come a few obligations.
Take, for instance, the patient that I saw this weekend while covering for the local ID physician at the nearby community hospital. My pager chirped early Saturday morning and I got one of the hospitalists on the line. "Billy, I'm not sure what's going on with this lady, and she's not even 'mine'," said the hospitalist, meaning that he wasn't the attending of record, but just a consultant managing the medical issues. The attending of record was an orthopedic surgeon from one of those "Sports Medicine Associates"-type groups (not the real name). It turned out that the patient had gotten a new hip eleven days before, and her postoperative course was complicated by a fever that never seemed to go away, even though cultures, x-rays, a CT scan of the hip, and a few other tests turned up nothing. Plus her white count was normal.
I relate this story not because of its "House"-like interest (though MDs out there are welcome to take a crack at the diagnosis), but because during those eleven days she was not once seen by the orthopedic surgeon who put the new hip in. Nor was she seen by the surgeon's partners who were on call; she got visits from three different Physician Assistants, all of whom assured her that she was on the mend (she wasn't) and would be discharged the next day. Eleven days!
You know what? I don't care that the clownfishes make the big bucks until I see a patient not merely suffering (sometimes that's unavoidable), but feeling abandoned, which is inexcusable! I know that they're just doctors of bones and not real doctors at this stage of their careers, and that they can't manage anything other than deciding between a press-fit stem and a cemented stem, but the patients don't know that! It's fine, let me and the other consultants who actually know how to be doctors of people do the real work of taking care of the patient--I'm handsomely compensated as far as I'm concerned, even if it's a third of what they make. Just try to make the patient feel like you care! So Man Up, you asshole, and see your fucking patient! She's sick!
Lest you think I'm overstating the case, take a look at this 2007 NYT article about "specialty" hospitals not being able to handle sick patients. Not all of these specialty hospitals are exclusively orthopedic, but a lot of them are. You know what's really rich about this? These hospitals are often built by the physicians themselves because they want to cut out "traditional" hospitals so that they can receive even higher levels of compensation. Amazing! (And yes, "rich" was an intentional pun.) Physician salaries are often a delicate matter to discuss, because docs want to be well paid, but this isn't just your typical societal working-out of a doc's salary. This is greed! Good grief.
When I signed off for the weekend to the regular ID doc, I related this story. He sighed. "Yeah, we have really had problems with that group over the years," he said. "The amazing thing is that when something goes wrong and the patient doesn't just recover in three days like normal, they tend to get mad at the patients for sticking around."
--br
Where a spiritual descendant of Sir William Osler and Abbie Hoffman holds forth on issues of medicine, media and politics. Mostly.
Tuesday, January 18, 2011
Wednesday, January 12, 2011
Billy Rubin, The Patient
Over the past year I have been having increasing difficulty with my breathing, with frequent episodes of wheezing and chest tightness, to the point where over the summer I could no longer exercise. I had never had such a problem before, but knew enough to know that albuterol would temporarily halt the symptoms. I stole some of my son's albuterol nebs, and the incredible relief they brought made me realize that I shouldn't be self-medicating, so I arranged an appointment with my doc. His take (after noting my room-air pulse oximetry reading of 93%--abnormally low for a 40-year-old with no major medical problems) was that I had new-onset asthma, probably due to reflux, and while I was skeptical, I duly arranged to take a battery of medications, including the acid suppressor Protonix, a steroid nasal spray, a steroid inhaler, and, most importantly at the time, a two-week tapering course of prednisone, which by the end had me feeling like a million bucks and allowed me to get back on my bike again.
Two months later, though, despite being nearly completely faithful to the medications, I was tightening up again. Each week exercise became harder and harder, and I got more and more reliant on the albuterol (which helps symptoms but doesn't fix the problem). I went to see an ENT doc and he found nothing. My doc shrugged and suggested that I see a pulmonologist. When I called to make an appointment, though, the first opening they had was in mid-February, and I was getting worse by the day. Finally, by last week I had reached the end of my rope and decided to prescribe myself a steroid taper. A doctor treating himself is definitely not considered smart, but I felt I had little option. When the pulmonologist had a cancellation a few days later, I came in and sought to apologize for the self-medication. After she listened to both my story and my lungs, though, she looked me straight in the eye. "You sound terrible, even now, a few days after you've started the prednisone," she said. "You absolutely did the right thing; I just can't believe you waited as long as you did." Well, how about that, I thought: Billy Rubin, model patient!
Then came the conversation about cats.
I knew it was coming. Anyone who suffers from asthma is advised to minimize exposure to a variety of allergens known (or at least suspected) to drive the pathology in the first place. So my pulmonologist asked me if I had cats. I replied that I had three. She paused, then said, "You may need to consider placing them somewhere." (Disclosure--I can't remember if those are her precise words, but it's damn close.) Place them somewhere?
Doc, you seem nice and all, but I ain't getting rid of the cats.
A story leaps to mind. Back in med school I had a classmate whose wife had just delivered. She had two cats who had lived with her for about ten years and regarded them as "her kids," but the process of having human-based children had a profound effect on her. "I just came home from the hospital and looked at them," she told us, "and I said, 'Oh my God, they're just cats!' It was like I never really realized it until I had my own."
The family joke around the Rubin household is a bit different: I'm fond of saying that I have two and a half children and two and a half cats, the third cat being a good deal more than just some friendly fur that eats food and wants its box cleaned. This little feller sleeps with us, cries when we leave, comes right to the door as soon as we walk in and jumps straight to my shoulder. The other two are your basic cats, and while I'm very partial to them I suppose that I could banish them if my life were threatened. But this third cat? My baby? Oh, you must be kidding.
During my infectious disease fellowship, I had a clinic with a few patients who had advanced HIV and owned cats. Cats, especially outdoor ones, carry a parasite called Toxoplasma gondii--the cause of toxoplasmosis--that's harmless to them and in general harmless to us. However, those with compromised immune systems have to be careful--thus the warnings against changing your cat's litter box if you are pregnant, are a transplant recipient, or have advanced HIV. Usually there's some easily found solution so that the person at risk doesn't have to change the box, but twice I had patients who simply didn't have options: it was change the cat's box themselves or get rid of the cat. Both times I suggested they consider getting rid of the cat.
I remember both of these moments because I remember the looks I got in response to my suggestion. It was the same look I gave Doctor Pulmonologist last week.
In medicine, sometimes the equations are simple and so is the advice. But the solutions (and the docs who give them) have to take into account the whole of the person: knowing what's adjustable and what's non-negotiable. I don't blame my doc for her advice--it is, after all, the right initial advice to give--nor do I have any trouble with the clinical detachment with which she delivered it. She's supposed to be detached! As we continue to visit, though, I'm hoping that she'll come to appreciate how very un-simple the equation of getting rid of some cats would be for Billy Rubin, The Patient.
--br
Wednesday, January 5, 2011
The Whitewashing of Mark Twain
The most appalling tidbit in this story about an English professor who has redacted the word "nigger" from Mark Twain's novels The Adventures of Huckleberry Finn and The Adventures of Tom Sawyer is not the preciously childlike logic motivating his actions, thoroughly childlike though it is. The professor, one Alan Gribben of Auburn University at Montgomery--more thoughts on the Auburn connection anon--said that he felt the more polite term "slave" should be put in "nigger"'s stead, in part because he was approached by a number of local teachers who said they would love to teach the book but can't because "in the new classroom, it's not really acceptable." No, what is truly, deeply appalling, even beyond this triumph of bowdlerizing stupidity, is that Gribben actually justifies his decision in the wake of critical e-mails he has received. "None of them mentions the word. They dance around it," he says, and one can hear a certain tone of schoolmarmish pride.
At the Billy Rubin Blog, while we unquestionably react viscerally and angrily to the casual use of such words, we do not shy away from mentioning words that have real power to broadly offend in American society (of which we would only list two: the aforementioned "nigger," and "cunt") simply because such words are deemed impolite.
[A digression: "bitch" used to be in this category but has attained a certain level of acceptability, with restrictions: noting that you've had a "bitch of a day" in polite chit-chat with colleagues won't raise eyebrows, but referring to one of the female SCOTUS Justices as a "bitch" certainly would. And obviously people immediately think of various words that in part helped launch the career of comedian George Carlin--"shit," "fuck," "motherfucker," "twat," "cocksucker," et cetera--words which might induce a scowl from others at work or play for their overall rudeness, but have basically become accepted at all but the most austere gatherings of the D.A.R. or Focus On The Family. Indeed, some of Carlin's words that couldn't be mentioned on television in 1978 are considered fully acceptable, if perhaps vaguely crude: "piss," "turd," and "fart" (this being a favorite word of Benjamin Franklin's...Billy delights in having selected writings of Franklin sit in his bathroom with the large title "FART PROUDLY!" referring to an essay of the same name). Such words, while having perhaps the power to titillate, almost never--except in the most tight-wadded communities--have the power to immediately and thoroughly offend. "Nigger" and "cunt," however, maintain an absolutely-out-of-bounds status in more social situations than any other words currently in use in the US. And we are, of course, ignoring the fascinating-but-complicated appropriation of the word "nigger" by African-Americans themselves. Suffice it to say that no sane white person would utter the word except among like-minded racists.]
Professor Gribben's exaltation of prissiness (and the similarly comic reactions of at least some mainstream media outlets such as USA Today, which refused to actually write the word "nigger" since its policy prohibits it, admirable though such misguided intentions may be) ironically misinterprets Twain in more than one way. The clear error, as explained by Jonathan Turley, who delicately tries to avoid the direct use of the word himself, invokes the notion that "to truly appreciate great works of fiction, such books must be read with an understanding of the mores and lexicon of its time." (The blog Rightpundits.com--we can safely assume that the Billy Rubin Blog concurring with such a site is rare if not unprecedented--merely decries the nonsense but doesn't bother to argue the point, since it seems obviously ludicrous, an attitude toward which I am not entirely unsympathetic.) That said, Gribben's logic is not merely embarrassing because it fig-leafs Twain's equivalent of David (though my favorite has always been Pudd'nhead Wilson), but because it seems to operate on the premise that since Twain's work is great, Twain therefore is a Great American, and Great Americans by definition could not possibly have meant to use "nigger" the way most white people actually meant "nigger" until only 20 years ago--and, it goes without saying, the way the word is still used as the butt of jokes in a good many social circles, though much more discreetly.
But the reality is a good deal more complicated: Gribben's attempt to anoint Twain as saintlike in his pursuit of racial equality ignores the fact of Twain's actual, explicit racism, and astonishingly manages to undervalue his incredible contributions to the American discussion about race (at least from this white Jewish kid's late-20th-century perspective). Take, for example, Twain's love of the minstrel show, which without question would make Gribben blush himself into oblivion: "If I could have the nigger show back again in its pristine purity and perfection I should have but little further use for opera," he waxed nostalgic in his Autobiography, published in 1906 [my emphasis]. How does Whitewasher-in-Chief Professor Gribben account for such an unguarded and honest remark from his hero? Could the autoclaved Twain ever have uttered such ugliness?
Illustrating Twain's contradictions, though, should never be used to feed the canard that Twain was just another white racist who used the word "nigger" without care or concern for the people it referred to or those who used it. Twain maintained an unequivocal, palpable disgust at slavery and the inequalities between whites and blacks, and said so quite vocally throughout his adult life. If one can read Pudd'nhead Wilson and not feel the rage against racism fly off of every page, then one perhaps belongs at Professor Gribben's little Sam Clemens picnic of decorum, where the finer aspects of human cruelty are swept under the rug in the attempt to tell comforting bedtime stories, where nobody curses and all are treated with respect.
The actual Mark Twain, though, wrote about the real world, and he unleashed his vehemence at the peculiar racial injustices of America over the course of decades. Before he became the affable, wry man portrayed by Hal Holbrook--Mark Twain! Utterer of Witticisms! Large Mustache and Pre-Einstein Hair!--he was a guy willing to skewer existing attitudes at great personal risk. The following is an essay entitled Only A Nigger that Twain wrote while living in Buffalo, commenting with acidity in the newspaper The Buffalo Express (of which he was part owner) on the lynching of an innocent man in 1869, and emphasizing the word "nigger" by putting it in quotation marks. Twain himself calls deliberate attention to the word--it's no accident! How would Professor Gribben even begin to try to teach this essay to his innocents, sunk as it is in the mire of foul language?
At the Rubin Blog, we consider ourselves to be at one with Twain's rage, though of course the issues have changed (but see the following paragraphs) and we do not believe that Professor Alan Gribben's Dolores-Umbridge-inspired prettyfying of American history does anyone any good as it generally short-circuits the justifiable rage that one could muster about any number of political issues. Such is the dismaying attitude not only of a silly professor in Alabama, but of the vast majority of political pundits and various television celebrities, all of whom would assiduously avoid saying "nigger" but who would be loath to remark on racial matters with anything approaching honesty.
Case in point: as one final note, I can't help but relish the irony that such a rationale is being dished out by an employee of Auburn University, which if you haven't been paying attention, is about to play for the national championship. Look at their webpage, which as of this writing has a gleaming picture of their star quarterback Cam Newton! Cam has been at the center of a number of stories detailing inappropriate transactions between various schools and his father in a "pay-for-play" arrangement.
How badly does this story stink? It depends on whether you choose to point the finger at Newton and his family, or rather the system which blithely chews up and spits out players, the vast majority of whom do not have a future in the NFL. This system does this, moreover, with generally little regard for a given player's education (or indeed, their eligibility to attend such schools), but rather in the quest for tens of millions of dollars in advertising revenue as funneled through ESPN and other major networks. Most of the colleges of the major conferences such as the SEC, Big-10, ACC, Pac-10 and former Big-12 have large numbers of football players, a good many of whom are African-American and frequently are descended from the very people that Twain wrote about; these players are being systematically exploited in ways that any impartial observer would at least find vaguely similar to the economic system of slavery that every Serious Thinker publicly declares as a thing of the dark ages. And what is the skin color of most of the boosters and coaches and University students that come to the stadium on Saturday ready for the rah-rah-rah! as long as their players are winners, damned be everything else? I'm thinking they're mostly white.
Go team!
Sometime soon: Billy Rubin, The Patient.
--br
At the Billy Rubin Blog, while we unquestionably react viscerally and angrily to the casual use of such words, we do not shy away from mentioning words that have real power to broadly offend in American society (of which we would only list two: the aforementioned "nigger," and "cunt") simply because such words are deemed impolite.
[A digression: "bitch" used to be in this category but has attained a certain level of acceptability, with restrictions: noting that you've had a "bitch of a day" in polite chit-chat with colleagues won't raise eyebrows, but referring to one of the female SCOTUS Justices as a "bitch" certainly would. And obviously people immediately think of various words that in part helped launch the career of comedian George Carlin--"shit," "fuck," "motherfucker," "twat," "cocksucker," et cetera--words which might induce a scowl by others at work or play for their overall rudeness, but have basically become accepted at all but the most austere gatherings of the D.A.R. or Focus On The Family. Indeed, some of Carlin's words that couldn't be mentioned on television in 1978 are considered fully acceptable, if perhaps vaguely crude: "piss," "turd," and "fart" (this being a favorite word of Benjamin Franklin's...Billy delights in having selected writings of Franklin sit in his bathroom with the large title "FART PROUDLY!" referring to an essay of the same name). Such words, while having perhaps the power to titillate, almost never except in the most tight-wadded communities have the power to immediately and thoroughly offend. "Nigger" and "cunt," however, maintain an absolutely-out-of-bounds status in more social situations than any other words currently in use in the US. And we are, of course, ignoring the fascinating-but-complicated appropriation of the word "nigger" by African-Americans themselves. Suffice it to say that no sane white person would utter the word except among like-minded racists.]
Professor Gribben's exultation of prissiness (and the similarly comic reactions of at least some mainstream media outlets such as USA Today, who refused to actually write the word "nigger" since their policy prohibits it, admirable though such misguided intentions are) ironically misinterprets Twain in more than one way. The clear error, as explained by Jonathan Turley, who delicately tries to avoid the direct use of the word himself, invokes the notion that "to truly appreciate great works of fiction, such books must be read with an understanding of the mores and lexicon of its time." (The blog Rightpundits.com--we can safely assume that the Billy Rubin Blog concurring with such a site is rare if not unprecedented--merely decries the nonsense but doesn't bother to take the time to argue the point since it seems obviously ludicrous, an attitude toward which I am not entirely unsympathetic.) That said, Gribben's logic is not merely embarrassing because it fig-leafs Twain's equivalent of David (though my favorite has always been Pudd'nhead Wilson), but because it seems to operate on the logic that since Twain's work is great, Twain therefore is a Great American, and Great Americans by definition could not have possibly meant to use "nigger" the way most white people actually meant "nigger" until only 20 years ago, and it goes without saying, still used as the butt of jokes in a good many social circles, though much more discreetly.
But the reality is a good deal more complicated, and Gribben's attempt to anoint Twain as saintlike in his pursuit of racial equality ignores the fact of Twain's actual, explicit racism, and astonishingly manages to undervalue his incredible contributions to the American discussion about race (at least from this white Jewish kid's late-20th century perspective). Take, for example, Twain's love of the minstrel show, which without question would make Gribben blush himself into oblivion: "If I could have the nigger show back again in its pristine purity and perfection I should have but little further use for opera," he waxed nostalgic in his Autobiography published in 1906 [my emphasis]. How does Whitewasher-In-Chief, Professor Gribben account for such an unguarded and honest remark from his hero? Could the autoclaved Twain ever have uttered such ugliness?
Illustrating Twain's contradictions, though, should never be used to create the shibboleth that Twain was just another white racist who used the word "nigger" without care or concern for the people it referred to or those who used it. Twain unequivocally maintained a palpable disgust at slavery and the inequalities between whites and blacks, and did so quite vocally throughout his adult life. If one can read Pudd'nhead Wilson and not feel the rage against racism fly off page after every page, then one perhaps belongs in Professor Gribben's little Sam Clemens picnic of decorum, where the finer aspects of human cruelty are swept under the rug in the attempt to tell comforting bedtime stories, where nobody curses and all are treated with respect.
The actual Mark Twain, though, wrote about the real world, and he unleashed his vehemence at the peculiar racial injustices of America over the course of decades. Before he became the affable, wry man as portrayed by Hal Holbrook--Mark Twain! Utterer of Witticisms! Large Mustache and Pre-Einstein Hair!--he was a guy willing to skewer existing attitudes at great personal risk. The following is an essay that Twain wrote while living in Buffalo entitled Only A Nigger, commenting with acidity in the newspaper The Buffalo Express (of which Twain was part owner) on the lynching of an innocent man in 1869, and deliberately emphasizing the word "nigger" by putting it in quotation marks. Twain himself calls deliberate attention to the word--it's no accident! How would Professor Gribben even begin to try to teach this essay to his innocents, sunk as it is in the mire of foul language?
A dispatch from Memphis mentions that, of two negroes lately sentenced to death for murder inthat vicinity, one named Woods has just confessed to having ravished a young lady during the war, for which deed another negro was hung at the time by an avenging mob, the evidence that doomed the guiltless wretch being a hat which Woods now relates that he stole from its owner
and left behind, for the purpose of misleading.
Ah, well! Too bad, to be sure! A little blunder in the administration of justice by Southern mob-law; but nothing to speak of.
Only "a nigger" killed by mistake -- that is all. Of course, every high toned gentleman whose chivalric impulses were so unfortunately misled in this affair, by the cunning of the miscreant Woods, is as sorry about it as a high toned gentleman can be expected to be sorry about the unlucky fate of "a nigger." But mistakes will happen, even in the conduct of the best regulated and most high toned mobs, and surely there is no good reason why Southern gentlemen should worry themselves with useless regrets, so long as only an innocent "nigger" is hanged, or roasted or knouted to death, now and then. What if the blunder of lynching the wrong man does happen once in four or five cases! Is that any fair argument against the cultivation and indulgence of those fine chivalric passions and that noble Southern spirit which will not brook the slow and cold formalities of regular law, when outraged white womanhood appeals for vengeance? Perish the thought so unworthy of a Southern soul! Leave it to the sentimentalism and humanitarianism of a cold-blooded Yankee civilization! What are the lives of a few "niggers" in comparison with the preservation of the impetuous instincts of a proud and fiery race? Keep ready the halter, therefore, oh chivalry of Memphis! Keep the lash knotted; keep the brand and the faggots in waiting, for prompt work with the next "nigger" who may be suspected of any damnable crime! Wreak a swift vengeance upon him, for the satisfaction of the noble impulses that animate knightly hearts, and then leave time and accident to discover, if they will, whether he was guilty or no.
At the Rubin Blog, we consider ourselves to be at one with Twain's rage, though of course the issues have changed (but see the following paragraphs), and we do not believe that Professor Alan Gribben's Dolores-Umbridge-inspired prettifying of American history does anyone any good, as it generally short-circuits the justifiable rage that one could muster about any number of political issues. Such is the dismaying attitude not only of a silly professor in Alabama, but of the vast majority of political pundits and various television celebrities, all of whom would assiduously avoid saying "nigger" but who would be loath to remark on racial matters with anything approaching honesty.
Case in point: as one final note, I can't help but relish the irony that such a rationale is being dished out by an employee of Auburn University, which, if you haven't been paying attention, is about to play for the national championship. Look at their webpage, which as of this writing has a gleaming picture of their star quarterback Cam Newton! Cam has been at the center of a number of stories detailing inappropriate transactions between various schools and his father in a "pay-for-play" arrangement.
How badly does this story stink? It depends on whether you choose to point the finger at Newton and his family, or rather the system which blithely chews up and spits out players, the vast majority of whom do not have a future in the NFL. This system does this, moreover, with generally little regard for a given player's education (or indeed, their eligibility to attend such schools), but rather in the quest for tens of millions of dollars in advertising revenue as funneled through ESPN and other major networks. Most of the colleges of the major conferences such as the SEC, Big-10, ACC, Pac-10 and former Big-12 have large numbers of football players, a good many of whom are African-American and frequently are descended from the very people that Twain wrote about; these players are being systematically exploited in ways that any impartial observer would at least find vaguely similar to the economic system of slavery that every Serious Thinker publicly declares as a thing of the dark ages. And what is the skin color of most of the boosters and coaches and University students that come to the stadium on Saturday ready for the rah-rah-rah! as long as their players are winners, damned be everything else? I'm thinking they're mostly white.
Go team!
Sometime soon: Billy Rubin, The Patient.
--br
Wednesday, December 29, 2010
Dumb Science + Politics = Media Fascination!
The recipe is pretty simple: take some current topic of theoretical interest to many people (e.g. coffee drinking, organic food consumption, political views, amount of daydreaming, or some such), then pick some biological outcome to measure (e.g. death, prostate cancer, brain size, happiness, or some such) and try to see whether they correlate. Then, after you have made the fascinating finding (study finds that coffee drinkers less likely to develop constipation as octogenarians!), you get interviewed by "science journalists" and end up on TV and radio shows, even though most of these "journalists" appear to understand very little about science and have dismally low standards as journalists.
Case in point today comes from Salon as it references an article from the British paper The Telegraph about how different political views correlate with differences in the size of certain parts of the brain. The gist: "conservatives" appear to have larger amygdalas; the amygdala plays a central role in the processing of emotional reactions, and The Telegraph article describes it as "often associated with anxiety and emotions." Moreover, conservatives have smaller anterior cingulates--a part of the brain the articles describe as "responsible for courage and optimism."
The reality? It's likely nonsense. The study hasn't been published yet (indeed, one very big red flag is the fact that the study findings are being released to the media before going through the theoretically rigorous process of peer review) so I can't comment with great precision on its validity. But it sure doesn't smell right: they took the "brain scans" (unclear from either story whether they were MRIs or CT scans) of two, yes, two Members of Parliament, one from Labour, the other a Conservative, and then reviewed the brain scans of 90 "students" (again, unclear if college-aged or not) and matched the sizes of the brain structures against a questionnaire designed to elicit political views. Again, the questions weren't made available, so it's really hard to know how they measured political values. Even so, the study design doesn't seem especially impressive. Students? Isn't the point of being a student to create one's worldview, meaning that some students will surely drift further toward the right in life, some further left, and some change not much at all? So how can this population be used to measure political attitudes? Is the distribution equal between men and women? Between whites and blacks? Immigrants and natives? Students and non-students? Well, we already know the answer to that question--and couldn't that bias the results all by itself?
Even more concerning is the whopper unloaded by the researcher himself when he said that he and his team were "very surprised to find that there was an area of the brain that...could predict political attitude." But his study predicted nothing at all! They took scans, administered a questionnaire, and then after the fact compared the two. Prediction would entail administering the scan and guessing what the responses would be before they were given. That's a freshman-level mistake and sure doesn't boost my overall confidence in the sophistication of this guy.
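To make that distinction concrete, here's a toy sketch in plain Python. The numbers are entirely invented for illustration (they have no relation to the actual study's data); the point is only to show the difference between finding a group difference after both measurements are in hand, and committing to a rule first and then testing it on unseen cases:

```python
import random

random.seed(42)

# Invented stand-ins: a "brain structure size" and a binary "attitude",
# generated with a deliberately weak relationship.
n = 90
sizes = [random.gauss(0, 1) for _ in range(n)]
attitudes = [1 if s + random.gauss(0, 2) > 0 else 0 for s in sizes]

# Post-hoc association: compare the two groups AFTER collecting both
# measurements. This is all the study described could have done.
mean_1 = sum(s for s, a in zip(sizes, attitudes) if a == 1) / attitudes.count(1)
mean_0 = sum(s for s, a in zip(sizes, attitudes) if a == 0) / attitudes.count(0)
print(f"group difference found after the fact: {mean_1 - mean_0:.2f}")

# Prediction proper: fix a rule using only the first half of the sample,
# then see how often it guesses the unseen second half correctly.
train_s = sizes[:45]
test_s, test_a = sizes[45:], attitudes[45:]
threshold = sum(train_s) / len(train_s)  # rule fixed before seeing test data
guesses = [1 if s > threshold else 0 for s in test_s]
accuracy = sum(g == a for g, a in zip(guesses, test_a)) / len(test_a)
print(f"out-of-sample prediction accuracy: {accuracy:.0%}")
```

Only the second half of the sketch earns the word "predict"; the first half is mere correlation, which is all the researchers appear to have measured.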
But the most odious part of this sorry little news flare is found in the first line of The Telegraph article, which begins, "Scientists have found that people with conservative views have brains with larger amygdalas." What's so singularly frustrating about this opener is that it quite possibly leaves less eagle-eyed readers with the impression that this is both a) unquestionably true because it was "found" like a little fact in the forest, and b) "scientists" are all agreed as to its truth--both of which seem dubious at best. At least in Salon, they change the passive verb formulation "scientists have found" to a more-cautious note of "a study to be published next year suggests."
It's still quite likely prattling nonsense, careful phrasing or no. And as the links supplied in the Salon piece make clear, political bloggers (apparently from both sides of the aisle) have seized on the findings and spun their own theories, each sounding as silly as the study probably is. People want scientific literacy? Find some journalists who have a sense of what constitutes good science and have them leave the junk to languish in throwaway journals, that's one place to start.
--br
Tuesday, December 28, 2010
A Tale of Two Values
On Christmas Day, the NY Times ran a story whose title really does give readers most of the necessary information in the article: "In Budget Crunch, Science Fairs Struggle to Survive." The article goes on to quote various educators worried about the impact of these cutbacks, but there aren't any real surprises.
But while the story is accurate in collecting and relaying the details, I'd argue that it is narrowly telling the story (granted, as good journalism often should). Even without a strained economy, science education--or indeed, education of any sort--has long played second fiddle to that Great American Obsession, sports. For much of the 20th century the two were able to coexist in something approaching harmony since there was plenty of money to go around. But even before the Great Recession of recent years, public schools have been faced with cutback after cutback, while both amateur and professional sports have gotten fatter and fatter, budgetarily speaking. I recall, as a kid growing up in northern Ohio, listening to the radio on election nights as my family heard the majority of school levies, most of which were asking for a pittance, go down to defeat.
Not so the public financing of stadiums, which the public seems to adopt with a certain slobberingly stupid zeal, as witnessed in these articles on the financing of the Florida Marlins baseball stadium, an attempt to finance a professional soccer team stadium in Portland, Oregon, a brief take on the return-on-investment for Paul Brown Stadium in Cincinnati, and details and an opinion about the huge tax subsidies the city of New York granted in the construction of the new Yankees stadium. Comparing the amounts of public money that these businesses receive to the amount that school boards meekly request is akin to comparing the size of an NFL lineman to that of a Pop Warner player: you're not in the same, um, ballpark.
And we're surprised at the decline in educational standards in this country? Right.
--br
Thursday, December 16, 2010
No, There Is Not a Cure For HIV
One of the guiding principles of the Billy Rubin Blog is quite simple: namely, that mass media in the US provides grossly distorted and exaggerated "news" to a largely uncritical public that appears not merely more interested in ESPN than in the news, but interested only in said network. This blog's major concern is the distortion of scientific and medical news, and this week provided an example of just how irresponsible "journalists" can become in pursuit of a sexy headline.
"German doctors declare 'cure' in HIV patient," the Reuters headline proclaimed Wednesday. The story got legs immediately, and was picked up by nearly all of the outlets one would consider mainstream: the Fox coverage can be found here, CNN's article can be found here (more on CNN shortly), Yahoo's article is here, the US News & World Report brief is here. Once the stories showed up on the respective websites, they rapidly rocketed onto the "most viewed" and "most e-mailed" lists: at the time of this writing, the CNN site lists just over 10,000 Facebook "shares" and 413 comments--easily among the top 5 health stories for that website over the past week; over at Fair & Balanced, the numbers were almost identical.
One small problem: it's not really true. Oh, sure, the facts of the story, as reported, are correct: there was a patient infected with HIV, and this same patient now does not appear to have HIV due to a bizarre quirk of cell biology and transplant medicine (more on this shortly). But the truth of the matter is that one patient with HIV became improbably, almost impossibly, lucky, and aside from the scientific curiosity that the case has for the physicians and scientists who study and treat HIV, there's nothing so hugely important in this story that merits a splashy shout-out in the health sections of major news outlets.
The headline does most of the damage: doctors declare cure. The online dictionary Your Dictionary has three separate relevant definitions for the word "cure," which are: a) restoration to health or a sound condition; b) a medicine or treatment for restoring health, remedy; c) a system, method, or course of treating a disease, ailment, etc. In declaring "cure," the German doctors were using the first definition. Technically, it is a perfectly adequate definition. But the reason why the news media ran with the story, and the reason why it's gotten flashed all over the internet in very little time, is because they understood the word "cure" in the latter two senses--that there was some new magical treatment on the horizon for the tens of millions of people infected with the virus. And that is, quite simply, a dream: there's no cure at all. One guy just got very lucky.
This reality did not stop major networks from jumping on the "cure" bandwagon, when their science and health "correspondents" should have known better. Here is CNN's Elizabeth Cohen barely containing her enthusiasm for the story:
It's important to note that the anchor, in framing the story, asks his audience to "listen up" and says that "doctors in Germany claim they may have found a cure for HIV"--which isn't what they claimed at all, and again wildly exaggerates the importance of the story based on that misreading of the word "cure." This is followed by Cohen performing the two-step of noting the facts, but ignoring the meaning those facts have, which is that this is a whole lotta ado about nothing. "This is so fascinating that, even if this isn't the cure for HIV, which it's not, it's still amazing what these doctors in Germany did," she gushes [my emphasis]. What ensues is a lot of back-and-forth graphics reminiscent of Ross Perot's presidential bid. There are, indeed, caveats that lace their way through the discussion, but the overall tone of excitement and optimism is unmistakable.
What makes such enthusiasm unfortunate is that it's a lie. Whether Cohen has deceived herself is unclear, but nevertheless the amount of fake hope generated by such a story is corrosive in at least two ways: it oversells what medicine is capable of, and in doing so helps to promote a backlash against the genuinely amazing things that modern medicine can accomplish.
CNN, to their credit, does make an attempt to contextualize the finding with an opinion piece stressing extreme caution when reading the story. But even here they screw the pooch a bit by throwing up the title "Why HIV advance is not a universal cure." Again, this is remarkably misleading--it's a cure only for one lucky person! A better title might have been, "Why HIV advance is not a cure at all."
What happened with this patient? You can read the articles for details but the quick version is that HIV lives in special immune cells, and this HIV-infected man had a cancer of the immune system. One of the principal ways we treat immune cell cancers is by completely destroying the patient's immune system and transplanting a different person's immune system into the patient (we also sometimes harvest the patient's own immune cells, get rid of the cancerous ones, and transplant them back in after "nuking" the patient's body, which is called an autologous transplant). The person who donated the immune cells to the patient had a special mutation in his immune cells that prevents HIV from entering the cell and setting up house; this donor is literally immune to HIV--part of a very, very small number of people on earth who cannot be infected by the virus.
Anyway, this is the Cliff's Notes explanation. What it misses (and what the news stories largely ignored as well) is that the bone marrow transplant that led to this patient's "cure" is basically a game of Russian roulette, with a high one-year mortality rate, a not-especially-impressive remission rate (it varies depending on the type of immune cell cancer, but the prognoses are nearly universally horrible), and unimaginable pain and suffering in the months before and after the transplant. Even if there were an abundance of similar donors, we'd kill tens of thousands for the iffy chance of having "cured" an equal number. In the age of perfectly adequate HIV drugs--which keep the virus at bay but do not produce "cure" and thus need to be taken for life--this seems like a mad scientist's dream.
Where was the breakdown? Part of it lies at the feet of the German physicians, who might have thought more carefully about the implications of touting cures that simply aren't practical and thus don't really, truly exist. But I'd direct my venom at the health and science editors of the news outlets I've mentioned; there just doesn't seem to be any care taken in investigating this story, and in the case of Elizabeth Cohen's CNN piece, it's worse than that, as she runs away from the more sober implications of the story, even though she is fully aware of such implications, swept up as she is in the wonder of the Modern Medical Miracle. It's tremendously irresponsible journalism. This is a story that nobody should have touched.
Curiously, after seeing the article in a Reuters link at The New York Times, I have been unable to find any other mention of it at the Times website. Here is their page devoted to news on HIV. As of this evening (December 16), they appear not to think of it as worthy of their attention or their readers' attention. I not only concur with them, I applaud their restraint!
--br
Wednesday, December 15, 2010
Evolution Ain't a Ladder, and We Ain't at the Top
The idea that we humans are the most privileged, most developed, most special creatures on earth has been around, of course, for a very long time, further back than when the Bible creation stories came into being, further back than Gilgamesh, probably further back than when some guys took to painting on a wall at Lascaux. Since Darwin, though, we've known better: humans are just another species trying to make its way in the world, quite intelligent to be sure, but no more or less special than, say, a bacterium happily living in the soil.
But thinking of ourselves as the Most Special Species--where evolution is a ladder and the top rung is occupied by humans, who preside over all life much like Adam on the Seventh Day--has been hard to root out. Even intelligent and thoughtful people can fall into the metaphor in the midst of an otherwise lucid discussion about science. Such a thing happened today when Gina Kolata, the science writer for the New York Times, was writing about some new research on the killer bug Staph aureus.
Ironically, when I saw the article I thought that it would make a great blog entry from the perspective of how warped laypersons' perceptions are about the relative dangers of various microbes. Even those with only a nodding acquaintance with science have often heard of Ebola, no matter that Ebola is a very rare disease and no human case has ever occurred in the US (not from the lethal strains, anyway). Staph, meanwhile, which is an outright killer, was barely known outside of medical circles until the past decade, when the drug-resistant staph strain MRSA was the culprit in various outbreaks, including a famous outbreak among the St. Louis Rams football team. I often feel like I could have summarized the major clinical take-home point of my fellowship in Infectious Diseases in two words: "Staph kills."
Alas--a more in-depth discussion about the warped perception the public has about various diseases will have to wait, because Kolata fell into the we're-on-top ditch midway through the article. While discussing the predilection that Staph has for human blood over other animal species, Kolata wrote, "Staph definitely preferred human blood...but there also was a definite trend, the higher up the evolutionary scale an animal was, the more the bacteria liked its blood" [my emphasis]. But Gina, it just ain't so! Those creatures whose blood was more liked by Staph were more closely related to humans on the evolutionary tree, but not higher on a scale, not more "advanced", not more worthy of the description "living thing." Evolution allows for adaptation to particular living conditions, and the creatures that occupy each niche are all equal. The best way to visualize the web of life is as a two-dimensional non-vertical tree (the vertical version, with "man" at the top, can be found in an exemplary drawing from the early 20th century below). We're just hanging out at the end of a branch like every other living thing.
Tomorrow, a brief commentary about this Reuters piece announcing a "cure" in an HIV patient. While the facts are true, the message is deeply misleading.
--br
But thinking of ourselves as the Most Special Species--where evolution is a ladder and the top rung is occupied by humans, who preside over all life much like Adam on the Seventh Day--has been hard to root out. Even intelligent and thoughtful people can fall into the metaphor in the midst of an otherwise lucid discussion about science. Such a thing happened today when Gina Kolata, the science writer for the New York Times, was writing about some new research on the killer bug Staph aureus.
Ironically, when I saw the article I thought that it would make a great blog entry from the perspective of how warped laypersons' perceptions are about the relative dangers of various microbes. Even those with only a nodding acquaintance with science have often heard of Ebola, no matter that Ebola is a very rare disease and no human case has ever occurred in the US (not from the lethal strains, anyway). Staph, meanwhile, which is an outright killer, was barely known outside of medical circles until the past decade, when the drug-resistant staph strain MRSA was the culprit in various outbreaks, including a famous outbreak among the St. Louis Rams football team. I often feel like I could have summarized the major clinical take-home point of my fellowship in Infectious Diseases in two words: "Staph kills."
Alas--a more in-depth discussion about the warped perception the public has about various diseases will have to wait, because Kolata fell into the we're-on-top ditch midway through the article. While discussing the predilection that Staph has for human blood over that of other animal species, Kolata wrote, "Staph definitely preferred human blood...but there also was a definite trend, the higher up the evolutionary scale an animal was, the more the bacteria liked its blood" [my emphasis]. But Gina, it just ain't so! The creatures whose blood Staph liked more were more closely related to humans on the evolutionary tree, but not higher on a scale, not more "advanced", not more worthy of the description "living thing." Evolution allows for adaptation to particular living conditions, and the creatures that occupy each niche are all equal. The best way to visualize the web of life is as a two-dimensional, non-vertical tree (the vertical version, with "man" at the top, can be found in an exemplary drawing from the early 20th century below). We're just hanging out at the end of a branch like every other living thing.
Tomorrow, a brief commentary about this Reuters piece announcing a "cure" in an HIV patient. While the facts are true, the message is deeply misleading.
--br
Wednesday, December 8, 2010
The Dearth of Republican Scientists
There's an essay in Slate today that wins points for turning arguments completely upside-down. In "Lab Politics," author Daniel Sarewitz encapsulates pretty much his entire point in the subtitle: "most scientists in this country are Democrats, and that's a problem."
So far, so good--although my own politics aren't likely to turn Red anytime soon, I don't disagree with his observation. Problem is that he seems to think that the fault of this imbalance lies largely with the scientists themselves, rather than a Republican Party whose unequivocal anti-scientific, anti-intellectual rhetoric has driven the vast majority of scientists away.
Sarewitz spends most of his time discussing the climate change debate, and making the very slippery contention that "disagreements over climate change are essentially political--and that science is just carried along for the ride." To be fair, he doesn't quite assert this: rather, he asks it rhetorically, implying that there's more than a grain of truth to it. He explains:
For 20 years, evidence about global warming has been directly and explicitly linked to a set of policy responses demanding international governance regimes, large-scale social engineering, and the redistribution of wealth. These are the sort of things that most Democrats welcome, and most Republicans hate. No wonder the Republicans are suspicious of the science.
There are at least two misconceptions here. First is that "evidence" about global warming is directly linked to a political program by the scientists themselves--implying that scientists are working in lock-step with the Democratic political establishment to bring about policy change. This is largely nonsense, and ironically describes fairly accurately how the other party operates. Second, nobody "welcomed" climate change as an excuse to legislate "large-scale social engineering"--whatever that means--so his whole analysis sounds like a daydream to me.
Climate change is a fact. Its extent can be debated, but not its existence. To pretend that "Republicans are suspicious of the science" is to give the Republicans far too much credit: they simply deny that there's any scientific validity to anything with which they disagree a priori. No wonder the Republicans are suspicious of the science--good grief! The Republicans don't believe in evolution! How could you expect them to believe in anything that requires scientific literacy? No Republican politician in this country can reasonably expect to gain the presidential nomination if they support the teaching of evolution in public schools. How could you expect scientists to willingly join such ranks?
There are thinkers on the public scene who are vocal enthusiasts for science and who simultaneously hold beliefs that don't dovetail with the Democratic party: author and football columnist Gregg Easterbrook is one such voice of whom Billy is fond. A generation ago, his politics would have put him squarely in the center of the Republican party. But the Republicans sold their soul to the devil around that time, and decided that the best way to deal with the fallout of unpleasant scientific facts was to attack science itself. Those intellectuals are in search of a home, neither comfortable with the Democratic party (Billy sympathizes, though he no longer finds his home there because he's fallen off the other edge) for political reasons, nor with the Republicans because of their hate-mongering anti-intellectualism.
So please, Mr. Sarewitz, spare me the false equivalencies.
--br
Tuesday, November 16, 2010
It's Not Nice To Fool Mother Nature
The AP reported this week about an unusual method of battling dengue fever. Dengue, a virus that's transmitted by the bite of the Aedes aegypti mosquito, causes hundreds of millions of infections worldwide and accounts for about 20,000 deaths each year, the vast majority in children. Since early in the 20th century there have been only occasional cases of home-grown dengue in the states along the Texas-Mexico border, though in the past year there appears to have been a full-scale outbreak in Key West, Florida, and a case of dengue in a man who never traveled out of Miami was just reported by that city's newspaper.
While dengue has been a known disease for centuries, serious research on the disease dates to about World War II, when Albert Sabin (famed father of the polio vaccine) did experiments with US Army recruits, actively infecting them with the virus and studying their immune response. (It's not an exceptionally lethal virus, killing only a tiny fraction of those who are infected, but still. Times have changed.) Despite the 60+ years of work, a successful vaccine remains elusive, although there are many different groups hard at work on vaccines in various stages of clinical testing. So far, the leader is the company sanofi aventis (no caps is correct, don't ask), whose Sanofi Pasteur division has its candidate vaccine in what is called Phase 3 clinical trials, the final stage before receiving governmental approvals. Their vaccine is based on a very cool concept. They've taken the Yellow Fever vaccine, called 17D, which has been around seemingly forever--it's the first true modern vaccine--and is quite safe and very well tolerated, and spliced a piece of dengue's genetic code into the YF 17D genetic code. Since Yellow Fever and dengue are closely related viruses, this is, in concept, simple to do, and the genes that are swapped are those that encode for the surface proteins, so the human immune system "sees" a virus that looks like dengue, but the vaccine still uses the 17D proteins to replicate itself. We still need to see the results of the Phase 3 trial but it looks very promising at this point.
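The splice-the-surface idea can be cartooned in a few lines of code. The gene names below (C, prM, E, NS1, NS5) follow standard flavivirus nomenclature, but this is a sketch of the concept only, not a description of the actual genome engineering:

```python
# Cartoon of the chimeric-vaccine idea: keep the YF 17D "backbone"
# (the replication machinery) but swap in dengue's surface-protein genes,
# so the immune system "sees" dengue while the vaccine replicates like 17D.

yf_17d = {"C": "YF", "prM": "YF", "E": "YF", "NS1": "YF", "NS5": "YF"}
dengue = {"C": "DEN", "prM": "DEN", "E": "DEN", "NS1": "DEN", "NS5": "DEN"}

# prM and E encode the envelope/surface proteins the immune system targets
SURFACE_GENES = {"prM", "E"}

chimera = {gene: (dengue[gene] if gene in SURFACE_GENES else src)
           for gene, src in yf_17d.items()}

print(chimera)
# The surface genes now read "DEN" while the backbone stays "YF"
```

The point of the toy is only that the swap is surgical: everything the virus needs to copy itself stays 17D, and only the parts the immune system looks at come from dengue.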
That said, it makes perfect sense to look for other solutions to the dengue problem, and one approach has been to control the mosquito populations. Olivia Judson of the NY Times speculated about this in her blog over two years ago. The AP report this week indicates that the company Oxitec Ltd has figured out a way to genetically engineer male mosquitoes so that they are sterile. After the science of gene manipulation, the concept is simple: release enough sterile males into the environment so that they mate with females who proceed not to reproduce, and just like that your mosquito population is lowered. The AP story notes that they've released 3 million of the critters in a small area on the Cayman Islands as a pilot project. The story also goes on to quote some worry-warts who fear that the company is jumping into an ecological experiment without having considered all of the variables. The story quotes Pete Riley, the founder of GM Freeze, a British group that opposes genetic modification. "If we remove an insect like the mosquito from the ecosystem, we don't know what the impact will be," says Riley.
Count me among the worry-warts this time. "Teaching" our immune system to recognize pathogens has been a wildly successful strategy at keeping people alive and healthy for over two centuries; messing with the environment as a means to control disease has occasionally worked, but just as often has spectacularly failed. In the 50s and 60s, there were wholesale "malaria eradication" programs in the western hemisphere where DDT and other insecticides were used in an effort to stamp out mosquito populations, and we know how that solution worked out. One point Riley makes is that a collapse of the mosquito population will have unknown effects in the food chain, and could possibly lead to some other disease being introduced by an insect predator stepping into the breach. I'm a little skeptical about this, only because Aedes aegypti populations have taken off precisely because of human development over the past 150-200 years. These critters are as domestic and reliant on civilization as we are, as we constitute their happy hunting grounds--unlike other mosquito species, who live quite placidly in the wild. That said, I do agree that you never know what's gonna happen when you do a large-scale ecological experiment, and these kinds of projects almost always, forgive the pun, come back to bite us.
Oxitec notes that they aren't trying to bring about permanent change since the genetically sterile males can only hang around for a generation, although to me that raises the question: what's the long-run solution? Are we going to keep introducing genetically sterile males, generation after generation, to keep mosquito populations low? And how will we define "low," anyway? Leave aside the ecological implications, which carry the possibility of unintended consequences (in a way that a new vaccine almost certainly doesn't); it's also an incredibly costly, top-heavy solution. And the last time I checked, a huge number of the countries that really suffer from the burden of dengue don't have the kind of cash or infrastructure that would make this kind of an approach feasible. You'd get a lot more bang for your buck if you just hired people to come visit houses and identify and eliminate all areas of still water where mosquitoes lay their eggs. Do that, and watch your skeeter population drop just as much as with the fancy genetic experiment.
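The logic of the sterile-male release, and the keep-releasing-forever worry, can both be seen in a toy back-of-the-envelope model. Every number below is invented for illustration and has nothing to do with Oxitec's actual program:

```python
def next_generation(females, sterile_males, growth=3.0, carrying_cap=100_000):
    """Toy sterile-insect model: a female's brood is viable only if she
    happens to mate with a fertile wild male rather than a sterile one.
    Assume wild males roughly equal females in number."""
    wild_males = females
    fertile_fraction = wild_males / (wild_males + sterile_males)
    offspring = females * growth * fertile_fraction
    return min(offspring, carrying_cap)  # crude density cap

pop = 100_000.0
history = []
for gen in range(10):
    release = 300_000 if gen < 5 else 0  # stop releasing after 5 generations
    pop = next_generation(pop, release)
    history.append(pop)
    print(f"gen {gen}: {pop:,.0f} females" + ("  (releases ongoing)" if release else ""))
```

Run it and the population crashes while the releases continue, then rebounds within a few generations once they stop, which is exactly why "the sterile males only hang around for a generation" is reassuring ecologically but damning economically.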
--br
Wednesday, November 10, 2010
Truth, Found in Football
Killing a bit of time earlier this evening, and I happened upon an article in Slate--don't ask me why, I'm still trying to figure that out myself--by the sportswriter Stefan Fatsis. Stefan, who parlayed his weekend warrior soccer talents into a Paper Lion-style project as a placekicker for the Denver Broncos, has been a keen commentator on NFL issues given his insider experiences. I usually hear him on NPR, but since he had an article in Slate writing about the narcissism of Brett Favre, I figured I'd take a look with a few spare minutes.
Much as I'd like to talk about Favre, what caught my eye was an assertion that Fatsis made about another NFL primadonnish narcissist, Dallas Cowboys owner Jerry Jones. For those who don't follow the sport, Jones is regarded as something of a football equivalent of the Yankees' former owner George Steinbrenner: megalomaniacal, autocratic, a man who interferes with his coaches and management because he thinks that he knows everything about everything. Fatsis describes him like this:
...the meddling owner who considers himself a businessman, promoter, player personnel expert, general manager, entrepreneur, public speaker, draft guru, and coach.
This description is couched as part of a larger point, which is the link between Jones's personality and the Cowboys' rather spectacularly bad 1-7 record so far this season. Leave aside the devastating injury to Tony Romo, the team's star quarterback, which has ruled out easy wins for the past several weeks: the Cowboys were still a bad team this year, despite being picked to be a contender to play in their very own stadium for this year's Super Bowl. The question is why they're so bad. Here's Fatsis's analysis:
It is hard to pinpoint exactly how and why an NFL team falls to pieces. There are so many moving parts. But a hefty share of the blame should go to the meddling owner...That ego seeps into every nook and cranny of that organization and clogs up the machinery. The result is a 1-7 team: uninspired and looking forward to Jan. 3, the first day of the offseason.
Truth can be found in a discussion of football, and the truth is this: it's not hard, especially in an age of easily procured data, to fact-check one's arguments, yet people often seem to rely on lazy assertions as long as it fits in with the prevailing narrative. Fatsis, in writing about Jones, makes not only a specious argument, but makes one while the facts are sitting quietly like Christmas presents, eagerly waiting to be opened and used by their owners.
You see, Jerry Jones has owned the Cowboys for over twenty years...there's a very long track record to establish if his ego does indeed "clog up the machinery." And to judge by his record--or rather, his team's record--it does nothing of the sort. I googled "Cowboys record by season" and within four minutes learned that, since 1989, when Jones bought the team, the Cowboys have a record of 185-159 for a winning percentage of .538, plus three Super Bowl victories. That's not too shabby, especially from the point of view of this long-suffering Cleveland Browns fan. Take away his one true lemon season, his first (which one could argue was a perfect storm of new owner combined with the departure of a legendary coach in Tom Landry), and his record is 184-144, with an even more impressive .560 winning percentage. So if Jones's megalomania is really responsible for their failure, why have they had so many successes during his ownership?
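The arithmetic really is a four-minute job; here's the whole fact-check in a few lines, using the win-loss totals quoted above (the 1989 first season was 1-15, which is where the "excluding the lemon year" figures come from):

```python
def winning_pct(wins, losses):
    """Winning percentage as customarily quoted: wins / total games."""
    return wins / (wins + losses)

# Cowboys under Jones, 1989 through the 2009 season (totals quoted above)
overall = winning_pct(185, 159)
# Excluding the 1-15 first season: 185 - 1 = 184 wins, 159 - 15 = 144 losses
excluding_first = winning_pct(184, 144)

print(f"{overall:.3f}")          # prints 0.538
print(f"{excluding_first:.3f}")  # prints 0.561
```

Which is the point: the numbers that sink the "ego clogs the machinery" thesis were a Google search and one division away.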
Instead, Fatsis cherry-picks the data, musing on the awful 1-7 record of this season, completely ignoring the rest of the data set. As a result, what we get is a tale of an owner run amok, and a moral lesson about the proper role of humility: don't think you're a genius at everything--just look at what Jerry Jones did to the Cowboys! Never mind that this reasoning is totally at odds with Steinbrenner's success as an owner, nor does it explain the perennially bad showing of the LA Clippers, owned by the hands-off Donald Sterling, who appears to take little interest in his team and almost never bids high for a player's services. Indeed, one reading of Jones's stewardship could be the polar opposite of the Slate article: if you are confident in yourself and trust your judgement, provided your judgement is good, sometimes that will pay off even if you don't rely on (or even overrule!) so-called "experts." (The Cowboys' record, it should be noted, is more erratic than consistently good, as the winning percentage only gives an average of the 20+ years and is misleading. They were the best team in football in the early-mid 90s, while from 2000-2003 they were positively mediocre. So Jones isn't always right, but sometimes he appears to have been as right as one could get.)
I don't care a great deal about Jones, and I care even less for the Cowboys, but what I found singularly irritating about Fatsis's assertion is that it could easily have been checked. Why he didn't is a mystery, since he normally is an intelligent writer. What is much more disturbing is that such lazy baseless assertions are not at all confined to the trivial world of sports commentary. And the assertions that people are going to make in the next few years may--I only say "may"--have a huge impact on how we fare locally and internationally for many more years still.
Sometime soon: cat scans, mammography, and lung cancer, and the media coverage thereof.
--br