Friday, August 19, 2011

When a Microbe "Eats" a Human

There they go again. My guess is that the science & health "editors" at the major television media outlets felt a frisson of excitement when they heard of the deaths of some teenage kids exposed to pond or lake water containing an extremely rare amoeba known as Naegleria fowleri. Why? That's lots of eyes of worried parents zooming in to their websites and passing the story along to other worried parents. It's good for the news business. ABC News's piece is here; CBS's story, with a link to a piece giving tips on staying safe, is here; MSNBC's take is here. Of the majors, only CNN appears to have taken a pass at the time I write this; at Fair & Balanced, the story is buried in the "Children's Health" tab here.

That the media Bigs love a good scare story, particularly with respect to some spooky infection, isn't saying anything new (and is discussed thoroughly in Marc Siegel's great book False Alarm: The Truth About the Epidemic of Fear). Suffice it to say that, depending on how you slice the numbers, thousands of American children die every year, and the three deaths so far due to Naegleria hardly indicate that we need to take all of our children out of the lake. Indeed, about a thousand kids die annually from drowning, but this substantially larger problem isn't grabbing headlines and isn't even mentioned as a comparison in the Naegleria stories to give some sense of proportion. Yes, lakes can be dangerous places: but mostly because teenagers drink alcohol and do stupid things on boats, not because a microscopic beast lurks underwater.

Which is actually what the Rubin blog is preoccupied with at the moment: the description of Naegleria. "Microscopic beast" is something of a contradiction in terms, right? Naegleria is smaller than a speck of dust and almost pretty to look at under a microscope. Beasts, by contrast, are big. They look scary! They have big, giant...teeth. And with those teeth, they eat. No surprise, then, that a sensationalistic news item indulges in a little sensationalist imagery, as every one of the news stories above refers to Naegleria as a "brain-eating amoeba."

But it's nonsense for the most part. Humans are, for Naegleria, what we call an accidental host: it makes its living by hanging out in the water feeding on tiny bacteria. Yes, it does consume brain cells once it finds itself inside a human head, but to call it "brain-eating" just amps up the raise-the-hair-on-the-back-of-your-neck factor. Why not just call it "lethal," as it is almost universally so?

While we're on the subject, "flesh-eating bacteria" is--are you at all surprised?--likewise a misnomer. There is no particular species of flesh-eating bacteria; the condition can be caused by any number of bacteria. The most common bug behind necrotizing fasciitis (the condition caused by so-called flesh-eating bacteria) comes from the genus Streptococcus, members of which live harmlessly in the nasal passages, mouth, and gut of humans. The problem isn't the bacteria per se; the real problem is when the bacteria manage to get deep into the soft tissues of the body (usually the legs, sometimes the arms, less commonly the trunk or face). In the upper layers toward the skin, bacteria face lots of physical impediments to causing infection, and by the time they've lumbered along to a new patch of tissue, the immune system usually kicks in and clears the infection. We call that cellulitis.

In rare cases, though, these bacteria can dive deep and get down to an area called the fascia. Once there, there are no physical impediments, so the bacteria can spread rapidly and make people incredibly sick very quickly, and typically the only "cure" is to filet open the person's limb, take out the dead tissue, and hope that they survive. Often the affected limb needs to be amputated, and there's a high mortality rate. But there's nothing special about the bacteria themselves, although you wouldn't know that from the news stories put out by the august organizations noted above.

Thursday, August 18, 2011

Media Overstatement on a Slow News Day

Right now one of the lead stories at the NY Times website deals with a potential new "miracle drug" called SRT-1720. With heavy emphasis on the scare-quotes. The article's title, "Drug Is Found to Extend Lives of Obese Mice," might be generating a huge buzz on the obese mouse circuit, but beyond this, I'm puzzled as to why this story is given such prominence in the Paper of Record. You could even argue that the story is barely worth running at all, even if placed deep in the science section of the website.

Bottom line is that this is a very preliminary study of an experimental drug. Studies like this are a dime a dozen, and it turns out that lots of fascinating things can be done in mice, but most of the time those fascinating things either don't work in humans, or end up having unacceptable risks compared to the benefits. I don't mean to belittle the experiment--it sounds very exciting--but I'm not sure that it's ready for primetime among laypeople just yet. Could it be part of a bigger article talking about strides that science is making in the field of aging? Sure: that's what the TV show NOVA is about, among other forms of popular science media. But there ain't no miracle drug coming down the pike that's going to extend the lives of obese people by 44 percent. So time to bury the story.

I can't wait to see what Gary Schwitzer is going to do to this story on his HealthNewsReview blog. Go get 'em!

Saturday, August 6, 2011

"Owning" a Patient & Other Quick Medical Thoughts

a. The subspecialties of medicine each have their own slightly different personalities and subcultures, and although this is a gross overgeneralization I have long thought of surgeons--that is, general surgeons and their ilk, not orthopods or, say, urologists--as the Baddest Motherfuckers in the business. These guys & gals are the toughest & most reliable hombres: they work the longest hours and rightly take enormous pride in their work. When I was in medical school and a patient was admitted to surgery, the culture of the team was that nothing, absolutely nothing, would get in the way of caring for the patient--not sleep, not food, not any kind of distraction. Although I wandered down the internal medicine pathway, I have always admired the attitude with which surgeons owned their patients.

The word "ownership" is a term we use in medicine, and while it sounds rather paternalistic, I have a fondness for it, as it signifies a special level of responsibility. When you "own" a patient, it means you consider yourself the most important member of a team of doctors & nurses--that the buck stops with you. And as I've said, during my training I never saw a group that took ownership more seriously than general surgeons and their subspecialties such as cardiothoracic, colorectal, & vascular. At the risk of redundancy, these folks are tough.

Only I've seen some weird things happening at my academic medical center as well as at my little community hospital over the past year or two, and this week while attending as a consultant I witnessed something that I found quite surprising, and I'm wondering if that cast-iron sense of ownership is eroding amongst that hardcore group. I was asked to see a patient about a pre-operative infectious issue before the patient was due to get a mitral valve--should the patient be on antibiotics & if so how long, does the surgery have to go on hold, that sort of question. When I finished the consult I had my team get on the phone to talk to the intern, whom they dutifully paged. Only the intern who answered wasn't the surgical intern, it was the medicine intern.

"Wait, this lady's on the medicine team?" I said in frank astonishment. The patient had been admitted to the hospital specifically to get a valve replacement; that's purely a surgical issue. She didn't have a lot of medical problems that required an internist to be her primary doc in the hospital. And yet, somehow, she was sitting there on a medicine team with the cardiothoracic docs serving as consultants.

For laypeople out there it may be hard to grasp why this is a bad idea, but suffice it to say that you manage patients differently based on the kind of training you've had, as well as the kind of patients you care for. Surgeons are better at taking care of patients undergoing surgery because, well, they do surgeries! And the surgical patient has a host of problems that internists don't encounter in the same way: fluid shift issues, mostly, which don't sound like much but can be the difference between life and death if you misread the signals. This lady really did not belong on an internal medicine service, and search me as to why she was.

This isn't isolated, as I've seen pancreatitis patients, diverticulitis patients, cholecystitis patients all get turfed to medicine in the recent past. Some of these are borderline calls and could be taken care of adequately either way; some of these are what I would consider clear-cut surgical patients and I scratch my head when they are refused by the surgeon and sent to medicine (where I work, internal medicine does not have the luxury of refusing patients except in extreme circumstances).

Anyway, I'm happy to "own" such patients although I'm not sure that it's always in the patient's best interests for internal medicine doctors to be managing surgical cases. I'm also wondering if something's changed in the ethic of those surgeons whom I have held in such high esteem for so long.

b. Many months back I took my best shot at discussing a book before I had read it. The link, which can be found here, is a discussion of a book that had made a bit of a flap in the psychiatry community called Anatomy of an Epidemic by Robert Whitaker. Since I hadn't read the book, I did not venture to offer an opinion about it, but wondered about how the reviews framed what appeared to be a startling hypothesis: namely, that psychiatric drugs have, for at least a generation, made patients who suffer from psychiatric disease worse on the whole. Was the book worth reading? was my simple question, and I concluded it was and that I'd get around to it as soon as I could.

Well, I did, and my initial reaction is wow. Whitaker's book goes to the core of psychiatry and takes a sledgehammer to it, and he makes one hell of a powerful case that there's nothing behind the curtain. This is not the work of a pseudoscientific idiot who is raging against the machine; Whitaker supports his thesis by citing reputable scientific sources, and does so quite thoroughly. Unlike Celia Farber, an AIDS denialist who is short on facts and long on paranoia, Whitaker lays out his argument with the kind of precision and scientific grounding medical schools hope & pray they can impart to their students.

Whether Whitaker's contentions are completely right I cannot say; I just don't know the literature of psychiatry well enough. (A small quibble: I don't think he portrayed Peter Kramer's excellent book Listening to Prozac fairly, but it's been a long time since I read that book.) But he's without doubt persuasive, and has written a book that anyone seriously interested in the broad sweep of modern psychiatry should read. Next up on my reading list is Dr. Dan Carlat's Unhinged; the utterly awesome Dr. Marcia Angell, the former Editor-in-Chief of The New England Journal of Medicine and author of The Truth About the Drug Companies, reviews both books (as well as Irving Kirsch's The Emperor's New Drugs) in a recent New York Review of Books, which can be found here.

Anyway, the point is simple: read this book.

c. Prior to her death, my only knowledge of Amy Winehouse was that she was a singer, and that her escapades with drug addiction were tabloid fodder. I had never listened to her music, but the comparison of her to Janis Joplin in the NYT obit, as well as the description of her as a jazz singer, caught my interest. I downloaded Frank and over the span of the next several days I heard her voice while driving to and from work, and suddenly shared in the collective frustration over a life that held such promise and exhibited such talent. I have not yet gotten around to listening to her signature album Back to Black, but I have become a fan.

Her place in music history is of course an open question, but even if I am blown away by Back to Black, I think her troubles with addiction, which led to her decline and untimely death, will place her in that rank of singers whose talents we'll never really know. Joplin was the Times's point of comparison, but I've spent some time thinking about Billie Holiday as I listen to Winehouse. Holiday gave more of her music to the world, surviving to 44 instead of Winehouse's 27, but she too represents the kind of talent that ventures too close to the flame. Match that against perhaps the greatest singer ever, Ella Fitzgerald, who took care of herself her entire life (she died from diabetic complications, not drugs or alcohol) and devoted it mostly to singing, and it's hard to listen to Holiday without some twinge of regret. Holiday in some way played the Charlie Parker to Fitzgerald's Dizzy Gillespie, the former dancing with demons on a nightly basis, the latter plugging away like a tortoise racing against a hare.

But Holiday, and Winehouse too, may have made the Faustian bargain of communing with the dark side in order to create their art, and it may not make sense at some level to shake our heads at their self-destructive recklessness. I love this clip of Holiday singing "Fine and Mellow" (the one that opens "my man don't love me") from a program CBS television aired in 1957 called The Sound of Jazz, two years before her death. (There are many things to love about this, actually: that one of the three major networks had a primetime program showcasing America's greatest and most serious artists; that Holiday is hardly the only important face in this ensemble, surrounded by a kind of Fania All Stars version of American jazz--Coleman Hawkins, Lester Young, Gerry Mulligan, and others; and that it's the last time Young & Holiday had a musical tête-à-tête before they both succumbed to their addictions.) The song is beautiful, but it's dark, jagged, and bloody; in short, it is the perfect song for Holiday at her peak. Would Winehouse have been able to step into that mix and take over for Lady Day? I think so. Would the First Lady of Song? I think not.

All of which is to say that Winehouse may have paid a price for her short-lived brilliance, but to judge that bargain as inherently wrong (or indeed, to understand it as anything other than a bargain) may be to fundamentally misunderstand her talents. I tell my med students, when they are confronted with the peculiar vices of their patients: don't judge, just understand. Should we not do the same for Winehouse, as her audience?