
Saturday, January 31, 2009


Ed Doerr, in Free Inquiry magazine, quotes Don McLeroy, creationist and chair of the Texas State Board of Education:

If science is limited only to natural explanations but some natural phenomena are actually the result of supernatural causes then science would never be able to discover the truth - not a very good position for science. Defining science to allow for this possibility is just common sense. . . . Then the supernaturalist will be just as free as the naturalist to make testable explanations of natural phenomena.

Now you see, what we have in this country is a political discourse which is not built around competing interests, or values, or intellectually respectable analyses of the state of affairs. Of course those exist and they are what actually underlies much of politics. But they are hidden behind arguments between people who use information and reason to reach conclusions, and people who are total idiots. Bill O'Reilly, Rush Limbaugh, Glenn Beck and Samuel J. Wurzelbacher flaunt ignorance and bigotry as virtues. Among Republicans, it counts as evidence against a conclusion that it is held by smart, well-educated people.

As for Doctor (yep, he's a dentist) McLeroy, he undertakes to lecture the scientific establishment on the correct way to undertake science when he clearly hasn't got the slightest idea of what science is or which end of his alimentary canal is which. As a scientist, I hereby invite him to propose any testable explanation he likes for natural phenomena. Problem solved.

Friday, January 30, 2009

Hard Times All Over

Brandeis University, which bestowed a degree upon me, may be on the verge of making an even bigger mistake. Colleges everywhere have seen their endowments hammered, and that is creating some real pain in academic programs and other areas. Brandeis has bigger problems than most, however, in part because it is a relatively young institution that didn't have all that big of an endowment in the first place, but also because many of its major donors had entrusted their money to Bernie.

I don't know how much it's been in the news nationally, but around here it's a very big deal that the trustees, at the urging of president Jehuda Reinharz, voted to sell off the collection of the university's Rose Art Museum. They think it might be worth $300 million, which would more or less replace the lost funds, but as you can well imagine this is not sitting well with anyone. Meanwhile, the Carl and Ruth Shapiro Foundation, an important benefactor of many cultural, charitable and health care institutions in the Boston area, has cancelled all new grant making for the year because Bernie stole half of their endowment.

The Madoff victims are particularly poignant, and it is particularly obvious that they were robbed; however, the truth is that we've all been robbed, mostly by people who aren't going to jail and who in fact are walking away with the loot. All of the gains in the financial markets for several years now have been illusory, the product of pretending that borrowed money was income -- in other words the whole thing was a giant Ponzi scheme. One of the saddest consequences of all this is that it will just increase inequality. It will make higher education harder to afford, it will cause tens of millions of people to lose their health care and quite possibly spiral down into disability and lifelong poverty, it will make poor kids poorer and send middle class families into poverty. The charitable institutions and state agencies that provide a safety net will be collapsing just as they are most needed.

President Obama has branded his legislation as an economic stimulus, but it's more than that. It's a two-minute drill to save our asses and give us a chance to play again next Sunday. It is really frosting my pumpkin that the punditocracy is saying the Democrats will be held accountable for "failure" if we don't have a strong recovery by 2010. Believe me, if we just manage to hang on to social decency through 2010, it will have been a success.

Thursday, January 29, 2009


Note the new e-mail address in the sidebar, which is now preferred. (The old one will still work for a while.) As ever, my employer is not responsible for my offensive ravings.

Now let me say a bit more about this evolution thing. As you all know if you haven't slept through the winter, this year is the 200th anniversary of Charles Darwin's birth and the 150th anniversary of the publication of On the Origin of Species. Massive blog swarming is planned, along with other festivities, and the inevitable counterfestivities. So we can expect this thing to heat up. In fact, we're already getting very close to the birthday, on February 12. You can definitely expect me to participate.

So here's a bit of warm up. There are two main reasons why people don't accept the overwhelming evidence for evolution and the complete adequacy of evolution to explain the phenomenon of life on earth. One is that they were indoctrinated as children to believe otherwise and it's just very difficult for people to overcome the beliefs stuffed into their heads by their parents and other authority figures. All we can do about that is expose people to more and better information.

But the second reason is that a lot of people just find the real world we have discovered since the 19th Century to be unsatisfying. They don't want to live here, it doesn't feel good to them to believe that they are what they really are, and so they cling fiercely to a fantasy. This problem does not have to be intractable. It turns out that once you understand it, humanism can be a perfectly satisfying philosophy after all. So one of my contributions to the birthday party will be to serve as a positive spin doctor. I hope I can spin you into ecstasy as well.

A Time to Worry

Somebody should ask James Dobson and Rick Warren why the Intelligent Designer has taken to designing antibiotic resistant bacteria. I'll be particularly interested in His reasons for designing vancomycin-resistant Enterococcus faecium. The Lord works in mysterious ways.

Wednesday, January 28, 2009

What I do

In response to C. Corax, unfortunately there are some weird rules to the science game -- rules which may be undemocratic, but which I have to play by. In particular, I'm not allowed to go public with research results until they have been published, except in a limited way at academic conferences. That means we've found out some stuff here that I think is interesting, but I can't tell you about it or I might get in trouble. In fact it would be helpful to get some feedback from a broad audience, and would undoubtedly improve the interpretation and applicability of results, but that would be violating the privileges of the secret society.

So let me at least tell you more generally about my interests.

I originally became interested in what is generally framed as the problem of cross-cultural competency in medicine, and the broader issue of how language and culture shape people’s understanding and engagement with their health and health care, and that of significant others. These interests developed largely for reasons of personal history, but the subject also happens to be of topical importance, complex, and intellectually interesting.

I quickly recognized that the cross-cultural situation just adds a layer of complexity to what is already a very problematic kind of encounter, and that the problem of cultural competency is often misconstrued. So here are a few observations I made early on that are central to my current perspective.

In 1996 I had the opportunity to audiotape 150 pediatric visits, mostly in primary care but also a few pulmonology (all asthma), lead clinic, and growth and development specialty visits. About 2/3 of the families in the set are Latino, with every possible language situation: Dr and mother (or the occasional aunt or grandmother and a couple of fathers) both speak English fluently; mother gets by on less than great English; mother and Dr both speak Spanish fluently; Dr gets by on less than great Spanish (but probably thinks he’s Cervantes); there’s an interpreter (a bad one, in 100% of cases); and in one case, the 12 year old sister of the sick infant interprets. There is also a case in which a Haitian physician and Cape Verdean mother communicate with each other in broken Spanish.

The journal articles which have come out of this data set all concern interpretation; unfortunately I haven’t had the time or resources to turn my numerous conference abstracts on other subjects into articles. But here are the bullets:

Cross-cultural competency was once understood as being all about people's culturally specific health beliefs and practices – the weird voodoo and herbal concoctions of those colorful, primitive exotics. Providers are always getting dragged off to these workshops where an expert will tell them all about mal de ojo and Santeria. Pish tosh . . . and that is now generally recognized.

These practices and beliefs obviously do exist, but learning about them has next to nothing, or maybe less than nothing, to do with becoming a culturally competent provider. After all, quite a few suburban WASPs who played on the same college golf team with the doctor gobble potions they buy at the GNC, have the nuns pray for them, wear copper bracelets, or chant. Providers can always ask about that stuff if they think it’s important.

The real problems of cross-cultural competency are just a crust on the standard casserole. The language barrier is a huge issue of course, and interpretation is at best a necessary evil and hardly a solution – of which more anon. But setting that aside for the moment, cross cultural encounters differ in degree, not in kind.

My observations – as yet informal, so let’s say hypotheses – are that cross cultural encounters are often relatively ineffective due to the following characteristics:

• Misalignment of expectations about role relationships and interaction styles. For example, Latinos often perceive that Anglo doctors are “cold,” overly businesslike, and unfriendly. I don’t know about medicine, but in social services and behavioral health we often run into boundary issues – the clients want to invite the therapist to the family barbecue or the baptism.

• It may seem paradoxical, but this does not imply an expectation of lesser social distance. On the contrary. Providers may be unaware of the extent to which their cultural authority inhibits people from providing intimate or embarrassing information, asking questions, or indicating that they do not understand something.

• Non-comprehension of people’s lifeworlds. Providers don’t appreciate, and don’t think to ask about people’s social, economic and physical context and how it may interact with adherence to medications, life style recommendations, appointments and follow-up, etc. This includes the specific issue of individualism vs. family and community in treatment decision making and self care. (Hint: the dominant Anglo culture assumes these are essentially issues for the patient as an individual.)

You’ll notice right away that patients don’t have to be exotic for these problems to apply, one way or another.

Also notice that I haven’t said anything about health literacy, comprehension of scientific theories of disease and treatment, or remembering and following complex instructions, and that’s because none of that has anything to do with whether a situation is cross-cultural or not, assuming we get past the basic issue of communicating with people in a language they understand. It helps, obviously, when patients have more formal education, but I have found that even well-educated people whose education doesn’t happen to include a lot of biology and biomedicine can be pretty much at a loss when it comes to etiological and therapeutic theories.

It’s important to remember, however, that in general, people don’t know what they don’t know. We can observe from the outside that people’s understanding of how their doctors explain their diseases and treatments is not well aligned with what their doctors actually think, but people very seldom complain that their doctors say things to them that they do not understand. By and large, they either think they do understand, or it goes right over their heads without their really paying attention. Their complaint, if any, is likely to be that they weren’t told anything at all: the doctor never mentioned that. The concerns of “health literacy” and instrumental understanding are pretty much etic to patients. Most of the time, they’ll fill in the blanks with a story that satisfies them, rather than decide they didn’t understand something.

So, what do patients take away from their encounters with their physicians, and vice versa? (Note that the question of what physicians understand about their patients is not as commonly asked.) How do treatment decisions really get made, what communication strategies result in better mutual understanding, more success by both physician and patient at managing disease, and better lives for people?

A second broad interest concerns the social production of health, of which medical care is not such a huge part after all. Call it health equity. Justice if you will. Again, I tend to see it through a frame of culture and ethnicity but that's just an extra layer, you don't have to be a foreigner or a minority group member to get screwed, one way or another. My first graduate degree is in environmental policy (which is how I learned that economics is a crock) and I'm trying to understand how communication in the clinic and people's lifeworlds are connected.

Finally, there is still that problem of language. How does language construct reality, what are the limitations of interpretation -- ultimately meaning simply cannot be entirely the same in different languages -- and how can language barriers be minimized in clinical practice?

So those are the areas in which I believe I am some sort of expert, but that doesn't mean I know more than you do about them. It just means I know about them in a particular kind of way. We all experience our own lives, our own health, our own encounters with the medical institution, and we know all about those subjects. So I intend to do research in a way that is still fairly unconventional, and that is in partnership with people who used to be treated entirely as subjects: what we call participatory research. So I'm hoping that all of you can be participants as well.

Tuesday, January 27, 2009

More on Democracy and Science

Dennis Overbye, in a very well-written essay, argues that a healthy scientific enterprise is the mark of a healthy democracy. He uses as his counterexamples Communist China and the Soviet Union, but he opens by invoking the past 8 years and the elation we all feel at the restoration of science to its proper place of honor in this country.

Overbye sees science itself as an essentially democratic and democratizing exercise, and I agree that in the long run it has proved to be so. Nevertheless I have long argued here that science as actually practiced is much less democratic than it ought to be, and that many people -- probably most Americans, in fact -- see it as exclusionary and even oppressive. Rank and file creationists don't cling to their beliefs because they are an inferior breed, but because they feel scorned by an establishment they perceive as arrogant and hostile to their values. In other words many on the scientific side of the divide do look down on them, or at least on the leaders and champions they respect.

For the cause of science to triumph, we must continually struggle to bring more people inside, and we don't invest enough in doing that. The academy is insular and obsessed with hierarchies of rank and title, degrees, institutions, journals, and awards. People won't be convinced by scientists who insist on speaking a private language of exclusion, strutting about in their glorious professorships, and not letting your kids into their university. Writing for popular consumption and speaking to a mass audience actually earns you demerits at Harvard and has even done some famous professors out of a job. Paul Starr and Cornel West come to mind.

I pledge to work to build a research institute without walls. It doesn't make any sense to study physician-patient communication entirely from the physician's side. We're going to bring in patients as full partners in this enterprise, not as research subjects but as participants who contribute equally to the scientific product. It may seem less than obvious how to go about that in biology, or physics, or cosmology, but I believe it can be done. Of course hard won expertise and exceptional talent are essential to good science, but arcane knowledge and membership in exclusive societies do not contribute to making the right choices about what questions to ask and what sense to make of the answers. Those are the rightful domain of everyone.

Monday, January 26, 2009

If they can send a man to the moon . . .

The other day a guy got on the elevator with me, and I saw from his badge that he was Doctor Umptyump, Rheumatology. So I asked him, "Can you cure my medial epicondylitis?" That's how you say tendinitis in the elbow in doctorese.*

"No, but I can inject it."

"I understand that just makes it worse in the long run."

"Yep, it's not a good idea. You should just live with it. That's what I do."

If you look this up (it's not actually tennis elbow; it's the corresponding tendon on the inside of the elbow, but same idea) the book says that with a few weeks rest, it will ordinarily resolve. Sometimes, and sometimes not. It can also be chronic and quite intractable, as my new friend the rheumatologist obviously knows. And there's not a damn thing medical science can do about it. (The injection he referred to is a cortisone injection, which will knock down the inflammation, but can permanently weaken the tissue. Sometimes athletes have it done so they can get back in the game but it's not a trade off that makes sense to me.)

The larger point here is that medical advances have largely missed some of our most prevalent annoyances. They still can't cure or prevent the common cold, osteoarthritis, or chronic tendinitis. I don't think that colds have much of an impact beyond being a nuisance, but the musculoskeletal deterioration we tend to suffer as we grow older does cause people to become less physically active and so can contribute to bigger problems -- such as diabetes and heart disease, not to mention depression. I don't intend to slow down if I can possibly help it, but not everybody's pain threshold is as high as mine. Just a little hint for the good people at NIH.

*I once saw a podiatrist because I had suddenly developed a very stiff big toe. He said, "Oh, you have hallux rigidus." I said, "What's that?" He replied, "That means a stiff big toe." I wasn't paying for a Latin lesson. And no, there was absolutely nothing he could do about it except teach me how to say it in Latin. It's osteoarthritis.

Friday, January 23, 2009

A bit more on the politics

I'll try to answer one question briefly. Why do the other members of the United Federation of Planets have some form of universal health care, but we don't? It has a bit to do with accidents of history, and more to do with our political culture.

The accident of history is a bit paradoxical. The present system which is largely based on insurance provided by employers got established during WWII, actually before most of the other countries set up their universal systems. It happened in part because of wage controls imposed during the war, in the middle of a tight labor market. Employers couldn't raise wages, but they needed to offer perks to attract and retain employees, so they padded the benefits, of which health insurance was a particularly nice one. Unions liked this system because it gave them something they could bargain for and win for their members as well.

Remember that back then, health care didn't cost nearly as much as it does now, but on the other hand it wasn't as wonderful a thing. Most of us are too young to realize that it wasn't until WWII and really the post-war era that doctors actually had a decent clue what they were doing. Antibiotics were developed during the war and became generally available afterwards. That was the biggie in itself, and it led to the possibility of reasonably safe surgery. Then came a growing understanding of heart disease and cancer, orthopedic surgery and devices, and so on. Until then, doctors did at least as much harm as good. (That balance is still a lot closer than we'd like it to be, but it has certainly tipped.)

So, when England established its National Health Service in 1948, and as the Canadian single payer system developed over the decades through the 60s, medicine was just coming into its own. The vested interest represented by the pharmaceutical companies and the medical establishment was not as powerful as it is in the U.S. today, and health insurance companies in those countries were minor players. Even so, the Canadian single payer system had to overcome considerable resistance from doctors. In fact, the doctors in Saskatchewan went on strike in 1962, but they kind of lost steam when the death rate immediately went down.

Harry Truman tried to introduce a national health program in the U.S., but the American Medical Association was an implacable opponent, and the doctors' lobby was too well funded and too powerful to defeat. Meanwhile, since many workers already had insurance through their jobs, and health care wasn't all that expensive anyway, the pressure for reform wasn't all that powerful. By the time John Kennedy became president, however, the plight of uninsured elderly and low income people was obvious, and he proposed creating programs to address their needs. So the AMA had Ronald Reagan make a recording called "Reagan Speaks Out Against Socialized Medicine," which was sent out to the Ladies' Auxiliary of the AMA (yep, doctors were presumptively male and their wives formed an Auxiliary), to be played at garden parties. Reagan said, famously, that if Medicare passed, "one of these days you and I are going to spend our sunset years telling our children and our children's children what it once was like in America when men were free."

Lyndon Johnson ultimately got Medicare and Medicaid passed in 1965, but in the larger picture, that took off much of the pressure to create a truly comprehensive system. Meanwhile, even as the AMA shifted its position, the power of the drug and insurance companies grew. So, the bottom line is, we've had both bad timing, and a deep-seated cultural resistance to anything that can be labeled "socialism." No, Medicare didn't turn the U.S. into a totalitarian dungeon, but the drug peddlers are still screaming and yelling that a single payer system will. And enough working class people believe it that we don't have a unified constituency in favor.

Why do working class people in the U.S. fear government intervention to promote the general welfare? That's a longer story, but I'll try to tell it soon. All I'll say for now is that the election of Barack Obama could change that, if he's bold enough. So far he doesn't appear to be, but we'll see.

Thursday, January 22, 2009

You don't just have to take it from me . . .

Lots of good freebies in today's NEJM -- they're making more and more material of broad public interest open access, and so they're getting credit where it's due. Long-time readers know that I have castigated them unmercifully about this, so I guess it's the awesome influence of Stayin' Alive that's brought about the change.

Anyway, I commend your attention to J. Oberlander's essay on the prospects for health care reform. He largely agrees with me in seeing the political interests as stacked against major changes. The drug and insurance companies just aren't going to allow it if they can possibly help it. And if they won't give up any income, real cost containment isn't going to happen because Obama's talk about saving money through electronic medical records and enhanced preventive efforts is speculative at best and unlikely to yield major savings. But he sees a glimmer of hope in all the bad news -- if enough people lose their health care insurance in Great Depression II, and the federal deficit is already a gazillion dollars anyway, maybe providing some real form of universal, affordable coverage will become feasible. I don't know if it's something we ought to wish for, but maybe so.

Michael Sparer (who might be a guy I went to college with, I don't know for sure but it's not the world's most common name) thinks that the idea of expanding Medicare and allowing people to buy into it won't fly, largely because it will invoke a term he considers to be radioactive, to wit "single payer." So he recommends expanding and allowing buy-in to Medicaid, instead. That seems to me like a pretty feeble reason for preferring what, in my opinion, would be a highly inferior option. Michael, or Dr. Sparer, depending on whether I actually know him or not, sees advantages in the state-level administration of Medicaid, which allows the states to try different policies and thereby we can muddle through to whatever works best. However, as far as I'm concerned, Medicare already works, so what's the problem? And as for any stigma that may attach to the term single payer, I'd rather work to change the culture than surrender to idiocy.

Finally, they give rare free access to a research article, in this case an epidemiological study of the effect on life expectancy of reductions in general population exposure to fine particle pollution. This is important as we now have an administration that has pledged to make environmental policy based on scientific truth rather than the venal interest of polluters. The amount of healthy life we can buy by reducing air pollution is impressive indeed -- about 7 or 8 months for every 10 micrograms per cubic meter. Since fine particle concentrations in U.S. cities today are typically around 15 mcg/m^3, we still have room for improvement. (Near major highways, people are exposed to high levels of even smaller ultrafine particles, which are even more dangerous. But that's a story for another day.)
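The arithmetic here is simple enough to sketch out. Here is a minimal back-of-the-envelope calculation, assuming the "7 or 8 months per 10 micrograms per cubic meter" figure cited above (I split the difference at 7.5) and treating the target concentration as a purely hypothetical policy choice, not anything from the study itself:

```python
# Back-of-the-envelope estimate of life expectancy gained from
# reducing fine particle (PM2.5) pollution, per the figures above.

MONTHS_PER_10_UG = 7.5  # midpoint of the "7 or 8 months" estimate

def life_expectancy_gain(current_pm25: float, target_pm25: float) -> float:
    """Estimated months of life expectancy gained by lowering
    annual-average PM2.5 from current_pm25 to target_pm25 (mcg/m^3)."""
    reduction = current_pm25 - target_pm25
    return reduction / 10.0 * MONTHS_PER_10_UG

# E.g., tightening a typical U.S. city from 15 down to a hypothetical
# 10 mcg/m^3 would buy, on this crude linear estimate:
print(life_expectancy_gain(15.0, 10.0))  # months of life expectancy
```

This assumes the relationship is linear across the relevant range, which is a simplification; the point is just that even a few micrograms of reduction translates into months of life across a whole population.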

So, let's see if Congress tightens the standard for PM2.5 pollution, which means better controls on auto exhaust, power plants and industrial emissions. It might save your life.

Wednesday, January 21, 2009

Natural Supports

I was interested to see that one of our commenters is an oncological massage therapist. It so happens I was recently involved in a project concerning that very subject. There is good evidence -- real scientifical type stuff -- that massage can be helpful in palliation of cancer. It can relieve pain and malaise, reducing the need for narcotics and other drugs, and relieve stress. Massage therapy is certainly something people should consider in various circumstances, so long as the claims for it aren't overblown.

Our project developed a product -- a manual and a video, based on a full-day instructional workshop given by professional massage therapists -- to teach caregivers (i.e. spouses, significant others, adult children, siblings, friends, etc.) of people with cancer basic therapeutic massage techniques. I'm not trying to put our friend out of work -- there's obviously a role for professionals -- but there are also extra benefits to mobilizing natural supports in this way. For one thing, as became very clear in the workshop and follow-up, it can mean a lot to the loved ones to have something so tangible to offer. It can strengthen bonds and make the ordeal of cancer and cancer treatment easier to bear for both parties in profound ways that go well beyond the immediate physical benefits.

As we try to cope with the rising burdens of morbidity and disability that will inevitably come with an aging population, strengthening and enabling natural supports is one strategy that we ought to pursue more aggressively. People can stay at home longer or get out of institutions -- hospitals, intermediate care facilities, rehab hospitals, etc. -- sooner if we give caregivers some help. That includes skills, assistance, emotional support, maybe a little bit of money, and respite. Of course, sadly, not everybody has people in their lives who can do this, but there are strategies to fill even that need. There's a danger in the temptation to force this on people beyond the point where it's a positive choice, in order to save money, but on the other hand many people would prefer it.

Right now, Medicare, Medicaid and most private insurance offer little support for these strategies, or hedge what they do offer with arbitrary time limits and other counterproductive restrictions. Payers -- contrary to their own financial interests -- actually force people into institutions rather than strengthening and supplementing informal supports so that people can stay out of them. We need to rethink these policies, on both financial and humanitarian grounds.

Tuesday, January 20, 2009

Nobody cares what I say today . . .

But for the sake of good discipline, I'll do a post. It so happens that Mr. Obama and I both started new jobs today. (Also, the eruption of Mount St. Helens occurred on my birthday, so I'm just a portentous guy.) My new job is at an academic medical center, so even though I'm a researcher not involved in patient care, I had to go through the standard orientation. I now know how to handle hazardous chemicals, what to do if I get blood or excrement splashed in my face, and what to do if somebody abducts a baby. (The latter is a Code Pink, by the way.)

While much of this may not have been directly relevant to my work, as a medical sociologist I was certainly interested in observing it. I actually took a great deal away from the experience, but I'll just note here how much hospitals - at least big urban hospitals - have to be concerned about security issues. Health care workers have the highest probability of any profession of being assaulted on the job, and that includes police officers. Hospitals not only have a lot of mentally ill, delirious and/or demented people in them more or less by definition, but they also have a lot of people who are distraught about the fate of family members or themselves, and may get belligerent about it. People also like to go into hospitals to steal, not only babies, but narcotics and other stuff.

The buildings are wide open, anybody can walk in, and they have to serve everybody regardless of how the people behave or how unsavory they may appear. Hospitals can't do security screening, or make people sign in, or stop pretty much anybody from just walking up to the elevators and heading wherever. It wouldn't be practical, and it wouldn't be friendly.

So there is a great deal going on, largely behind the scenes, to provide security. This is just one indication of the complexity of these organizations and the substantial institutional challenges that they face. The experience of being a patient or the loved one of a patient is often alienating and infuriating, and I do aim to make it better, but at the same time, I ask you to cut them some slack -- it isn't easy. More on all this anon.

Monday, January 19, 2009

Inferiority Complex

I'm seldom at a loss for words, but I sat down today feeling that I could not say anything adequate to the occasion. I have a feeling there's a lot of that going around. Suddenly, I feel as though perhaps I belong in this country after all. In a heartbeat, the values I share have become essential, celebrated, honored, after decades of scorn and contempt from the arbiters of respectable opinion. My friends and relatives have canceled their plans to move to Canada or dust off that Irish ancestry and head for the Eurozone.

Sure, it's just a fad, and we all know that inside of a month Tweety and Modo and Brian Williams will be back to spewing their usual swill. However, it is not too much to hope that racism will no longer be dispositive in convincing working class people to betray their own interests on election day. It's not too much to hope that Americans will stop viewing war like a football game and confusing arrogant bullying with respectability. It's not too much to hope that God will go home to the chapel and the value of life will be understood as pertaining to the living. It's not too much to hope that we'll behave responsibly toward our progeny and toward each other. It's not too much to hope.

Friday, January 16, 2009

Murder Pays

If you're a drug manufacturer. You may have heard that Eli Lilly has agreed to pay a total of $1.42 billion to resolve criminal and civil charges related to its marketing of the antipsychotic drug Zyprexa (generic name olanzapine). Like the other so-called "atypical" antipsychotics, the drug was tested in clinical trials to control symptoms of psychosis. Not a whole lot of people have psychoses, however, so they looked for bigger markets.

The linked article describes their efforts to market the drug in nursing homes and assisted living facilities, to be given to people with dementia; and to primary care doctors, who obviously would have no business prescribing it since psychoses are only appropriately managed by specialists. In fact, like all the atypical antipsychotics, the drug is extremely dangerous. It causes weight gain, diabetes, and heart rhythm irregularities and is therefore associated with both long-term development of cardiovascular disease and increased risk of sudden cardiac death.

The AP article doesn't mention what might be the worst part, but the New York Times does: Lilly was heavily promoting the drug for "disruptive" children. The side effects of weight gain and hyperglycemia are particularly pronounced in children, and obviously set them on a lifelong path of ill health and risk for premature death. They got a lot of help from a psychiatrist at Harvard named Joseph Biederman, who has almost singlehandedly created an epidemic of diagnosing bipolar disorder in children and feeding them antipsychotic drugs. According to the NYT:

In the past decade, Dr. Biederman and his colleagues have promoted the aggressive diagnosis and drug treatment of childhood bipolar disorder, a mood problem once thought confined to adults. They have maintained that the disorder was underdiagnosed in children and could be treated with antipsychotic drugs, medications invented to treat schizophrenia.

Other researchers have made similar assertions. As a result, pediatric bipolar diagnoses and antipsychotic drug use in children have soared. Some 500,000 children and teenagers were given at least one prescription for an antipsychotic in 2007, including 20,500 under 6 years of age, according to Medco Health Solutions, a pharmacy benefit manager.

What Biederman wasn't telling us -- what he was in fact actively lying about -- was that during this period he took more than $1.6 million in consulting fees from drug companies, including Lilly; and that his studies of antipsychotic drugs in children, funded by the drug companies, were too small and poorly designed to show the benefits he claimed they showed. In fact, whether such a thing as bipolar disorder even exists in children, or could be accurately diagnosed, is questionable. But the harm done by these drugs is unquestioned.

As for elderly people with dementia, Lilly knew for years that Zyprexa increased their risk of death, and the FDA finally issued a "black box" warning to that effect in 2006. But the marketing campaign was so successful that it continued to be effective long afterwards. We discovered that my own father had been given antipsychotics, in an assisted living facility in 2007 and in a nursing home in 2008, without consulting my mother and, in the second case, contrary to her specific orders. The reason? He was wandering around and it was easier for the staff to zonk him out with drugs than to keep an eye on him. And that was indeed the basis for much of the marketing -- that stoning old folks made it easier on the staff.

Now, you may think that $1.42 billion in fines would be a pretty strong deterrent to this sort of behavior, but in fact, Zyprexa brings in $4 billion a year, so Lilly is happy to pay it. As for Biederman, he's in big trouble -- they've made him promise not to do it again. In fact, the executives at Lilly, and Dr. Biederman, are murderers, and their motive is greed. They should be dealt with accordingly.

Thursday, January 15, 2009

Is South Carolina a real place?

State Sen. Fuckwit Ford has introduced legislation to fix the country's real problem:

Be it enacted by the General Assembly of the State of South Carolina:

SECTION 1. Article 3, Chapter 15, Title 16 of the 1976 Code is amended by adding:

"Section 16-15-370. (A) It is unlawful for a person in a public forum or place of public accommodation wilfully and knowingly to publish orally or in writing, exhibit, or otherwise make available material containing words, language, or actions of a profane, vulgar, lewd, lascivious, or indecent nature.

(B) A person who violates the provisions of this section is guilty of a felony and, upon conviction, must be fined not more than five thousand dollars or imprisoned not more than five years, or both."

Don't know if this ridiculous fucking shit will pass, but either way, nothing would please me more than for the South to secede again. Then we'd have a whole nation of dipshits.

Wednesday, January 14, 2009

Some feeble reflections

First of all, thanks to all for the condolences and good wishes. I suspect that long-time readers may have figured out the situation -- that my father had progressive dementia. It was a terribly long process, and his death I'm sad to say really came too late. But it's a relief that it has finally happened.

I have not written a great deal about the problems of long term care and the related issue of the growing prevalence of dementia. That is essentially because I haven't felt I had much to say that was in any way original or profound. There are some policy reforms that would ameliorate this problem to some extent, but there is no real policy solution. It's going to be a great burden on families and on society, part of the price we pay for increased longevity, unless we come up with some technical fixes to prevent or greatly slow the progress of dementia.

As for policy reforms, there is certainly a justice case to be made for socializing the cost of long-term care by expanding Medicare benefits to cover it. As you probably already know, the family has to spend down to near penury, whereupon Medicaid takes over. The quality of Medicaid financed care varies from state to state. Fortunately, in Connecticut, the benefits are sufficient to pay for good quality care, although you have to be lucky enough -- as my family was -- to find a good nursing home. And the only way to do that is for the person to enter the nursing home as a private payer. The good ones won't accept Medicaid patients, although they will let the person stay once they make the transfer.

My parents had a decent nest egg, but it was completely wiped out, as happens to just about everybody in their situation. It takes true wealth to survive four or five years of custodial care. Expanding Medicare to cover this means that everybody will pay more in taxes, but we'll be buying insurance against this catastrophe. It would also end the discrimination against low and moderate income people who must depend on Medicaid and who often end up in situations that we should not tolerate.

However, the most logical existing structure through which to do that would be Medicare Part A, which is financed by a regressive payroll tax. There is nothing unethical, in my view, about requiring all workers to take on a fair share of the burden of caring for their elders, but such a system ought to have progressive financing. And I'm afraid that right now, it just isn't politically viable. Everybody is screaming and yelling that the cost of Medicare is already unsupportable and has to be reduced. To stand up and say, no, we have to raise taxes and expand Medicare is probably feckless. But listen up folks, we're going to pay for it, one way or another.

There are some additional advantages that could come with an expansion of Medicare. It could pay for home based services and adult day care programs, which most people would likely prefer as long as they were a viable option; and then for assisted living rather than nursing care. More use of these kinds of services would reduce overall costs. Right now, some people are forced to accept nursing home placement prematurely because that's the only modality Medicaid will pay for.

Sadly, though, this is not going to happen any time soon. In the current discussion of health care reform, it's not even a footnote.

Sunday, January 11, 2009

Personal news

My father died this morning, after a long illness. It was expected. I probably will not be able to post for a couple of days, but I'll be back here as soon as I can.

Perhaps I'll have something to say about long-term care and end of life issues, but unfortunately my family's case is all too common. Anyway, it's good that it's finally over.

The God who failed?

I'm happy to be able to say that I'm not an economist. I do know something about economics, however, because I had to pass qualifying exams in that subject en route to my Ph.D. Notice I said I know something about economics thanks to those studies, not that I know something about the economy. Asking an economist to explain the economy is like asking a theologian to explain biology. A lot like that, actually. Exactly like that.

Well, this curmudgeon is actually pleased -- yes I am -- to see that some economists are actually coming forward and admitting that yep, their profession is full of crap. Here's Uwe Reinhardt, who is big stuff, here just happening to notice that 99% of economists failed to see the big crackup coming:

If groupthink is the cause, it most likely is anchored in what my former Yale economics professor Richard Nelson (now at Columbia University) has called a "vested interest in an analytic structure," the prism through which economists behold the world.

This analytic structure, formally called “neoclassical economics,” depends crucially on certain unquestioned axioms and basic assumptions about the behavior of markets and the human decisions that drive them. After years of arduous study to master the paradigm, these axioms and assumptions simply become part of a professional credo. Indeed, a good part of the scholarly work of modern economists reminds one of the medieval scholastics who followed St. Anselm’s dictum “credo ut intellegam”: “I believe, in order that I may understand.”

Well now isn't that exactly what I have been saying here for years? Unfortunately, Professor Reinhardt doesn't draw the obvious logical inference, which is that universities should stop paying these clowns to indoctrinate students with unmitigated baloney. Economists actually have the audacity to claim that they are the only "hard" social scientists, a distinction which they apparently think they earn because they use a lot of mathematics. In fact, if you start out by assuming a lot of stuff that isn't true, then manipulating those imaginary entities mathematically is not any kind of science, hard, soft, liquid or gaseous. It's masturbation. It is unnecessary to teach college students how to masturbate.

The entire discipline is utterly worthless. Economics departments ought to be abolished and the professors turned out to do useful work, clerking in bookstores or something. They need to start over, with the study of reality, but other people will have to do it, who possess working intellects. As Reinhardt concludes,

As far as diagnoses of economic trends and predictions about the future are concerned, the profession’s preferred analytic structure and the groupthink it begets might work superbly well on planet Vulcan, whence hails the utterly logical Mr. Spock of “Star Trek” fame.

On Planet Earth, however, that analytic prism can seriously blur one’s vision. It simply cannot accommodate the fact that our entire 21st-century banking sector, managed as it is by graduates of the nation’s top business schools, supported by highly trained financial engineers, and monitored around the clock by thousands of allegedly bright financial analysts, immolated itself with highly toxic assets, purchased with borrowed money, and in the process infected the entire world economy.

And thus the economics profession slept comfortably as Wall Street was imploding. One can only hope that the medical profession would do better, should America ever be struck by a serious epidemic.

Right. Well, incompetent physicians lose their licenses. Think about it.

Friday, January 09, 2009

Okay, now where were we?

Sorry for erratic posting, I'm dealing with a family emergency. It may take me off line for a few days, but I'll let you know.

Anyway, Borderline Personality Disorder is terrible nomenclature and I really wish they'd change it. The name comes from a time when they thought BPD was somewhere on the borderline (get it?) between the historic constructs of psychosis and neurosis. (Those terms are still used, but the meaning has evolved somewhat.) The idea is that people with the diagnosis stubbornly interpret reality in ways which differ from the perceptions of others, i.e. there is something close to delusional about them; it just drives you nuts trying to get them to have useful insight into their own behavior.

Specifically (and this is my personal description, not by the book), people -- who are mostly female -- with this diagnosis have a powerful aversion to being alone and a pervasive fear of loneliness and abandonment. This causes them to develop extravagant crushes, romantic or otherwise, on people who may not reciprocate the attraction -- or who may exploit it -- and to be highly manipulative, clinging and needy in these relationships, whether the relationships exist largely in their own heads or have some mutuality. Naturally, this excessive neediness and the passive-aggressive manipulativeness that comes with it have the effect of driving their objects away, whereupon the person develops a narrative of victimhood and cruel abandonment, putting all the blame on the formerly beloved object who is now utterly contemptible. Along this path there may be an episode of stalking behavior.

This pattern repeats over and over, accompanied by extreme emotional lability (possibly buying a co-diagnosis of bipolar disorder), and often substance abuse, suicidal ideation and gestures (though uncommonly real suicide attempts), sexual promiscuity, self harm, crying fits and tantrums.

As you can see, the people who are most likely to be attracted to this person include narcissists and sociopaths, so you may have some very ugly situations.

The conventional etiological story about this is quite Freudian sounding, and also intuitively fairly convincing. The idea is that it comes from a failure to internalize objects, specifically the protective and nurturing adults of infancy. Babies eventually get the idea that even though Mommy isn't here right at this moment, she still exists and will come back. We learn to comfort ourselves with the dependable love of people even when they aren't with us. As adults, even the deceased are still a part of us. But if you can't do this, you have a problem.

And indeed, people with the diagnosis have a higher than usual prevalence of lack of nurturing during infancy and childhood, even of abuse. It doesn't have to be there, but it's a pretty good indication if it is.

Now, here's my argument for the essential validity of classifying this as inherently a deficit in a human personality, whether or not you like the term "disease," and why this is not similar to labeling homosexuality as a pathology. It turns out that the only reason homosexuals have historically been unhappy and even suicidal is because they have been despised and persecuted. Take away the stigma, and you take away the problem. We now know, from ample experience, that it is perfectly possible to be happy and functional as a homosexual so long as you are in an environment which is sufficiently accepting and affirming. Take it from me, I live in Jamaica Plain and work in the South End. (I'm not gay, but I'm fabulous anyway.)

But the problem with BPD or whatever you want to call it is not moral stigma. People who behave that way are inherently difficult and burdensome to others. It is very difficult to imagine a human society in which BPD is not dysfunctional and a continual source of grief and pain, for the sufferer and those around her. Human social organization is hugely variable, but it is built out of a finite set of elements that work something like Lego blocks -- they link together in a specific, limited way that gives you the freedom to create great, complex edifices.

There is a natural social grammar, if you will, analogous to the natural syntax on which our vast diversity of languages is built. Children who are not autistic quickly acquire this social grammar as easily as they acquire language. It is intuitive, and it is also universal. Although politeness rituals and taboos vary across cultures, these can be described for us by an etiquette book or a human guide, and armed with this intellectual understanding we can readily associate with people of any culture and form friendships and even intimate attachments. We understand that relationships require reciprocity, recognition of the other's needs, boundaries, how much emotional availability they have for you, and whether or not they feel like spending time with you at this particular moment, among other kinds of limits. We get that demanding too much of other people is counterproductive.

The word "normal" has three meanings. It can mean corresponding to moral norms; it can mean average or typical; and it can mean "as we would desire things to be." Whether a person is average or typical in some way often just doesn't matter, or a person can be deviant in a good way, e.g. abnormally intelligent. Homosexuality is abnormal in the sense of being a minority condition, but whether it is abnormal in either the first or third sense is entirely dependent on circumstances. We can just decide that it's not immoral and that's the end of that, and after that the only reason I can think of offhand that it might conceivably be undesirable is if we are in danger of extinction, which we aren't. But BPD is inherently abnormal in the third sense. It just doesn't work.

Finally, there is a lesson here for all of us. Even if we don't merit the diagnosis, there may be a little bit of the maladaptive behavior in most of us. It's easy to slip into dependency, or demanding more of people than they can comfortably give, not giving enough in return, making a relationship more about ourselves than about each other. It's always good to think about that.

Wednesday, January 07, 2009

Disease, sickness, illness, disorder, condition . . .

First of all, I should acknowledge that Kathy A. is basically right; I was a bit sloppy with my vocabulary. ASPD is a pretty vague and often circular diagnosis that can be applied to anybody who badly misbehaves; what I really meant to defend is the much more specific, and rare, label of psychopathy. What I meant to say is that the diagnostic label should be reserved for people who truly lack the capacity for empathy.

So let's back up a bit. In medicine, diseases are usually defined in terms of an etiological process. Similar symptoms may be ascribed to completely different diseases, e.g. infection by different viruses; while people with no subjective symptoms may be given a diagnosis because of some biological fact which is believed to create a risk for illness (the subjective experience of poor health) in the future, e.g. HIV infection or hypercholesterolemia. Physicians generally don't like to call clusters of symptoms without a known etiology diseases, rather they call them syndromes, and they tend to argue about the ontological status of such entities. Good examples are fibromyalgia and chronic fatigue syndrome.

In psychiatry, however, etiology is generally unknown. Practitioners like to tell stories about causation but they are usually poorly supported by evidence. Drug companies also market theories about biological causes of DSM-defined diseases but these are largely mendacious. For example, they have convinced most people that a disease called depression is caused by a deficiency of the neurotransmitter serotonin, but this is almost certainly false. The evidence for this is supposedly that drugs which increase serotonin levels alleviate depression, but even if that were true -- which it is not for the large majority of people diagnosed with depression -- it would not demonstrate that depression is caused by serotonin deficiency. Amphetamines cause people not to feel hungry even when they don't eat, but that doesn't mean that the cause of hunger is amphetamine deficiency, even less that the solution to hunger is amphetamines.

The DSM entities are defined by symptoms, mostly consisting of descriptions of behaviors, not by etiology. Since human behavior is continuously variable, highly flexible, highly complex, and mutable, it's very hard to rigorously sort people into behavioral buckets. Generally speaking, only the most stereotypical, rigid and simple kinds of behaviors can be measured unambiguously. For example, Tourette's Syndrome has a strong ontological claim, because we can say quite definitely whether or not a person exhibits uncontrollable, repetitive tics, including, in some cases, the involuntary injection of inappropriate words into conversation. But those sorts of entities are usually thought of as neurological rather than psychiatric, and once a definite etiology is discovered, they cease to be the province of psychiatry at all.

So, now I hope I've cleared the decks enough to discuss BPD. My basic reason for wanting to do so is to argue that it is not entirely feckless to look for psychodynamic explanations for emotional and behavioral problems, and that perhaps there is a coherent story to be told about "normal" human personality development and the ways in which the social environment can derail it. We need a definition of "normal" in order to say that, and in fact the word has multiple meanings. But it can mean more than just whatever the culture currently happens to find morally meritorious, or whatever happens to be most common. In both those respects, homosexuality used to be abnormal, as it still is for some people, and probably always will be in the latter respect. But I mean to argue that BPD is different.

Tuesday, January 06, 2009

The wind blows cold on the borderline

As you know, I'm highly skeptical of the ontological status of the DSM diagnoses, which is a fancy way of saying I think for the most part that they don't refer to real entities. My philosophy is realism: if it's real, I believe in it, but I don't believe in the DSM-IV. Well, the book is real, the rhetorical categories in it exist, but what I mean is that their referents are not really real.

This is most true of the so-called personality disorders. If you read the diagnostic criteria, you'll probably start thinking, "Oh yeah, that sounds kind of like Uncle Fred or Cousin Mae." But do Uncle Fred and Cousin Mae have a disease, or are they just a pain in the ass? And then you'll think, "Well, Fred isn't exactly like that. A couple of those items don't describe him after all, and one or two others aren't exactly right." Research has shown that the diagnostic reliability of the personality disorders is poor -- people get different diagnoses from different clinicians. Furthermore, lots of people get a diagnosis of a little bit of this, and a little bit of that. The person is avoidant with dependent features, that sort of thing. Exactly where to draw the line between your friend the drama queen and a person with the disease of histrionic personality disorder is impossible for anyone to say. Finally, there are no evidence-based etiological theories for most of these disorders.

So all of the above knocks the pins out from under any claim that these are labels for real entities. They are more like character sketches from novels, that resemble actual people to varying degrees, and the main reason for calling them diseases is to provide the required diagnostic label for billing purposes. That is not to say that people who have difficulties with social relationships might not benefit from counseling, and these frameworks might even prove useful as a way in to discussing the problem. I'm not saying nobody should ever talk in these terms, that's a more complicated argument.

However, I'm now going to surprise you (maybe) by saying that I find two of the personality disorders reasonably convincing. One is antisocial personality. It has a clear, well-bounded definition: lack of empathy, pervasive indifference to the rights and well-being of others. Something quite specific is missing which we expect a properly put-together human being to have. It is nevertheless rather odd to call it a disease since the person to whom the label applies probably doesn't think anything is wrong with (usually) him, and doesn't want to be fixed. I have discussed this at greater length in the past and will return to it.

The other personality disorder that pretty much works for me has the grotesquely wrong label of borderline personality. The term comes from a historical misperception that it was close to being a psychosis, since the person consistently fails to see certain aspects of reality which are evident to the people around (usually) her. My friend Gary Greenberg has argued that borderline personality should be thought of, not as a disease, but as a kind of moral judgment, much as homosexuality used to be considered a psychiatric disease essentially because of social disapproval. (I can't link to Gary's original article, which as far as I know is not on-line, so the link is to a discussion of his argument by Linda Nicolosi.) That is different from denying the ontological status of both conditions: homosexuality is obviously real, and Gary's argument presumes that borderline personality is just as real.

I agree with Gary about the latter assertion, but the former -- that it's only social convention that makes borderline personality a problem -- is one of those rare instances where we disagree. I'll take that up in the next post.

Monday, January 05, 2009

The chronosynclastic infundibulum

As Vonnegut fans know, that's the place where all possible opinions are true. My recent encounter with AIDS denialists, like an earlier (longer lasting and more intense) encounter I had here with creationists, has forced me to think a bit about the nature of knowledge, and why there are so many people out there who fervently believe stuff that I know damn well just isn't true. Obviously, they think the same about me, so I have to be able to show that there's a difference, that you, dear reader, have a basis for concluding that I'm right and they're wrong.

This is obviously a complicated subject, which is why half of philosophy is epistemology. There's a denialism blog, if you're interested in getting into the subject in depth. But what's the difference between denialism and legitimate intellectual dissent? There have been plenty of examples of people whose ideas were considered wacko at one time, but are now entirely accepted, e.g. Copernicus and Stanley Prusiner. The Denialism bloggers define it this way:

Denialism is the employment of rhetorical tactics to give the appearance of argument or legitimate debate, when in actuality there is none. These false arguments are used when one has few or no facts to support one's viewpoint against a scientific consensus or against overwhelming evidence to the contrary. They are effective in distracting from actual useful debate using emotionally appealing, but ultimately empty and illogical assertions.

Okay, but what makes me and Mark and Chris Hoofnagle the judges of what constitutes overwhelming evidence and what constitutes empty and illogical assertions? Our friend PZ Myers at Pharyngula (see sidebar) spends his blogging days arguing with creationists; he takes the time to engage them point by point. I personally agree that he proves they are deluded and generally incapable of coherent thought, but that does nothing to convince them or for that matter half of the American people. (Granted most of them know nothing about biology or paleontology and just believe whatever Rick Warren tells them to believe.)

I personally have made a substantial study of the subjects I write about here, including the pathology of HIV, and the scientific evidence concerning the history of the earth and of life hereon. However, I must admit that I fit a lot of other beliefs into my complete system without having in-depth knowledge. I accept the explanations of physicists and chemists essentially because I trust the institutional structures within which they work. I do read their stuff in Scientific American but that doesn't mean I understand the foundational work on which the conclusions they present for educated lay people are built. I basically just have to accept what they are saying.

This does not, however, make me comparable to creationists. True, most of them do invest credibility in some form of institution, usually a church. But it's the source of that credibility which is different. Faith-based denialism can be described and explained fairly easily, I think, as I have done in my post Thinking Backwards and a follow-up post.

But AIDS denialism is another matter. It has nothing to do with religion, or the violation of some prior belief system. The same goes for the thimerosal and autism nonsense, and global warming denial. These faux controversies are driven by people who claim to be rational empiricists, who are following the evidence where it leads them. It's just that their arguments are utterly nonsensical. What they have in common is a kind of political stance, that the entire scientific establishment, in each respective field, is actually a vast conspiracy against the public. Purported motivations include greed for research grants, a deeply vested interest in conclusions to which they jumped too soon, or an ulterior motive such as the destruction of capitalism or profits from drug sales.

That so vast a conspiracy is implausible -- involving, as it does, not only established scientists who lead research teams and publish in leading journals, but also every Ph.D. candidate, all of whom are successfully recruited into the conspiracy and none of whom blows the whistle; research assistants who even if they don't have a doctorate know enough to figure out the fraud; journal editors; public and private funders; and scientists in related fields who know enough to see through the fraud -- does not give them pause. But then you read the denialists' actual arguments and they are just absurd -- wrong on both facts and logic.

So we really need to look at their motivations. Some are con artists who are simply taking money from suckers. Others are more interested in self-aggrandizement and attention. I'd put RFK Jr. in that category, though he's made some money off of the deal as well. Some AIDS denialists just don't want to face up to the reality of their own situation, I suppose. How would you explain it?

Sunday, January 04, 2009

Now let's get real here

Okay folks, I have a responsibility to be absolutely clear and straight about this. There is no doubt whatsoever that:

a) Human Immunodeficiency Virus, HIV, a very well-studied and well-characterized entity, is the cause -- the sole cause, and a sufficient cause all by itself -- of a disease in humans called Acquired Immunodeficiency Syndrome, AIDS.
b) Combinations of drugs that suppress replication of HIV, collectively called Highly Active Antiretroviral Therapy, or HAART, slow or entirely prevent the progression of HIV disease, and can reverse it when it has already developed.
c) People who are infected with HIV, who use HAART and adhere closely to the required dosage schedule, live much longer, and in better health, than people with HIV who do not use HAART.

Denying this is flat earth stuff, folks. It's like claiming that aspirin causes headaches. We know this, we don't just think it or believe it, and the way you can know it too doesn't require any scientific training and it doesn't require you to believe anything scientists say about the nature of HIV or the way the drugs work. Here's how you know:

Many tens of thousands of people all around the world, including in the United States, who had never taken any antiretroviral drugs but who were nevertheless terribly sick with AIDS, even terminally ill and at the point of death, recovered from their illness when they started taking antiretroviral drugs. They quickly, seemingly miraculously, got better. Their opportunistic infections disappeared. Once wasted away, they started gaining weight. Once bedridden and incapable of the basic activities of daily living, they arose from their beds, became active and vigorous, and resumed almost normal lives.

Doctors have observed and documented this phenomenon not only in writing, but in photographs and film documentaries. I have personally met and spoken with many people to whom this has happened -- people who fully expected to die and were preparing for death who suddenly found that they had lives again, and whose biggest problem was rediscovering life.

Remember that before the advent of HAART, AIDS was the leading cause of death for people age 20-49 in the United States. Thousands of people were dying every year. Furthermore, there were hundreds of children born every year with HIV infection transmitted from their mothers. Untreated, these children -- none of whom, obviously, had ever abused drugs or had repeated STDs or been malnourished, or any of the other reasons AIDS denialists say people with AIDS really get sick -- got sick and died, usually before they were six years old. With the advent of HAART, they stopped getting sick and they stopped dying and now they live right on into adolescence. We don't yet know how they'll do in the longer run but so far, so good. And, even better news, ARV drugs given to mothers absolutely do prevent transmission of HIV to their babies, so perinatal HIV in the United States is now extremely rare. Large programs that helped care for children with HIV and support their families have gone out of business. A whole cohort of pediatric specialists has had to find other work.

These are facts. They are incontrovertible. They speak for themselves. HIV causes AIDS. HAART prevents AIDS. QED.

Now, the drugs have side effects. For some people, these are worse than they are for others, but the side effects can be pretty bad. They include redistribution of body fat leading to physical deformity, hypercholesterolemia, diabetes, peripheral neuropathy, nausea, diarrhea. Many people who take HAART end up taking other drugs to manage the side effects, and they often end up feeling like a walking drugstore. Some people take 12 or 15 different medications.

Believe me, I hear you. That can be a stone drag. It's perfectly understandable that people often want to stop taking the drugs and take their chances with HIV, at least for a while. People with HIV do this so often that it even has a name -- a drug holiday. People might also decide that they'd rather wait to start taking HAART until their HIV disease is advanced enough that they really have no choice.

I'm certainly sympathetic to these positions, but I should also tell you that doctors have done a lot of experiments to try to find the best strategies for managing HIV disease, and they have concluded that drug holidays are not a good idea. They increase the chance that the virus will become resistant to the drugs, and they allow degradation of the immune system and other damage to occur. Delaying treatment too long is also not a good idea. While the consensus is not for immediate treatment, it now seems clear that it's not a good idea to let your T-cell count go below 300 if you want the best long-term prognosis.

These are the facts. Now, sometimes physicians don't do a good job of working with their patients to help them make the choices that work best for them. Doctors can be unsympathetic to the problems people have following their regimens, and respond to those difficulties with judgmental attitudes, scolding, or just callousness. These behaviors by doctors are counterproductive and can even lead to people stopping their medications.

Now, if you do stop, of course you will immediately feel better because you will no longer experience drug side effects and the psychological burden of having to be reminded that you have a serious disease every time you take the pills will be lifted from you. But eventually, your HIV disease will progress and then you will feel a whole lot worse. That's the way it is.

If you've decided to stop, that's your choice and I certainly don't think it's a moral issue or it makes you a bad person or anything like that. But please don't go around giving wrong information to other people.

Friday, January 02, 2009

Is that your final answer?

Orac and others have been all over the death of AIDS denialist Christine Maggiore, so I won't try to re-invent that particular wheel. I'll just give you a brief summary. Maggiore was a young woman who became infected with HIV and then had the very bad fortune to meet wacko pseudo-scientist Peter Duesberg, who persuaded her that HIV does not cause AIDS but antiretroviral drugs do. She refused to take AZT to prevent transmission of the virus to her baby Elizabeth, who subsequently died of AIDS at the age of 3, although Maggiore claimed the cause was something else. She went on to become a crusader against antiretroviral drugs and personally contributed to the deluding of South African president Thabo Mbeki and the deaths of hundreds of thousands of people. Maggiore herself has now died, apparently of AIDS-related pneumonia although, as one who walks in the light of reason, I won't jump to any definite conclusion about that unless we get more evidence -- which her family is withholding, of course.

I will use this as the occasion for what was going to be my next post anyway. As I have written previously, it is understandable that the experience with AZT monotherapy caused some people to doubt the standard HIV story. However, I simply do not see how anyone can remain in denial following the introduction of so-called Highly Active Antiretroviral Therapy (HAART), in which protease inhibitors and other classes of drugs are used in conjunction with AZT-like drugs.

When these regimens were introduced in the late 1990s, the results were absolutely incontrovertible. People all over the country who were days or hours from death, some who had been administered last rites, suddenly, seemingly miraculously, rose up and were healed. This was called the Lazarus Syndrome, and I have personally spoken with many people who experienced it for themselves, people who had ordered their lives around preparing for death and suddenly had to figure out how to live again. Until then, AIDS had been all about dying, and the institutions that grew up around the epidemic were hospices, and workshops about how to face the end, and grotesque financial instruments whereby people with AIDS would collect payments from people who wanted to gamble on their date of death.

With the introduction of HAART, the community had to completely reinvent itself around problems of living. The hospices went out of business, to be replaced by living centers, medication adherence counselors, and Positive Prevention programs to help people avoid being reinfected or transmitting the virus to others. The problem of paying for HAART therapy became a major focus of HIV-related activism. Face it Duesberg: people stopped getting AIDS, and they stopped dying, when they started taking the drugs. QED. End of story. Everything else the denialists say is equally bullshit. Yes, the virus is present in the blood and elsewhere of people who test positive for HIV antibodies, in virtually 100% of cases. Yes, the level of virus particles in the blood correlates strongly with progression of HIV disease. Yes, people who take the drugs faithfully have suppressed viral loads and they don't get AIDS. People who don't take the drugs do get AIDS. The sun rises in the east.

So the question is why this weird movement persists. Partly it's just that people get dug in to a position and they can't face up to their responsibility for error. To admit that you have contributed to mass slaughter is undoubtedly difficult. But they also attract followers who don't share that initial responsibility, but who somehow find the dissent appealing, as a symbol of resistance to authority, perhaps, or personal distinction. Whatever the psychology, it's extraordinarily frustrating -- quite similar to the thimerosal thing, and broader anti-vaccination activism. Some things we just know.

Thursday, January 01, 2009


The difficulty with most con games is that you have to promise people something in return for their money, and when they don't get it, the suckers are disappointed and call the cops and so on. All Bernie Madoff had to do was promise 10-12% a year in returns, no matter what was happening to the markets, and the money rolled in. But then some of the people wanted to cash in their profits, so he's going to lose his mansions and yachts, and pretty soon he'll be eating shit on a shingle while wearing an orange jump suit and deeply lamenting that he cannot join either the Crips or the Aryan Nations.

If Bernie were smarter he would have become a preacher. Promising the rewards after the people are dead eliminates the disadvantages of other confidence games. Nobody has ever come back from the dead to report that the Kingdom of Heaven is a lot of hooey. And, to the extent people need a little bit of more immediate gratification to keep the tithes flowing, you've got that covered too. When things go well, God gets the credit; when they go badly, your faith sustains you. If you fuck up, you're forgiven, and you may even get a magic cracker to make sure of it.

L. Ron Hubbard said it straight out -- the way to really make a lot of money is to start a religion, and by golly, he went out and did it. That Scientology is highly successful at raking in the dough for its leaders in spite of their transparent venality and L. Ron's on-the-record declaration is testament to the ineluctable gullibility of the human spirit.

I was about to say, and in spite of the utter absurdity of Scientological beliefs, but come to think of it, they are no more absurd than any other religion. A virgin gave birth to a guy who walked on water, raised the dead, drove some demons who were possessing a dude into a herd of pigs that then ran over a cliff, walked around for a month after he himself was dead, and then soared up into the sky -- right, you betcha.

You might argue that all this is just harmless fun and makes people feel better, but there are several reasons why I disagree. In the first place, that's an awful lot of society's scarce resources going for the Pope's silk slippers, the building and maintenance of non-productive edifices, clerical salaries, and magic crackers. In the second place, and this is most important, if you believe one thing on faith, you can believe anything -- and a lot of what the preachers tell people to believe is truly, profoundly damaging -- whether to individuals such as homosexuals and atheists, whom they teach their followers to despise, or to effective problem solving, when they deny scientific facts and tell children that the world is run by magic and fantasy.

And it does real political damage as well, and stops people from standing up for their own best interests and the interest of the community. Joe Hill said it best:

You will eat bye and bye,
In that glorious land above the sky.
Work and pray, (work and pray),
Live on hay, (live on hay),
You'll get Pie in the Sky,
When you die, (that's a lie!)

And the starvation army they play,
They sing and they dance and they pray,
Till they get all your coin on the drum,
Then they tell you when you're on the bum:
You will eat bye and bye,
In that glorious land above the sky.
Work and pray, (work and pray),
Live on hay, (live on hay),
You'll get Pie in the Sky,
When you die, (that's a lie!)

If you fight hard for the good things in life,
They will tell you to stop all the strife,
Be a sheep for the bosses they say
Or to hell you are surely on the way!

Sure, religious institutions do some good works, but so do secular institutions, and without the ulterior motive of trying to convert people to superstition. I know what sorts of things religious apologists will say in response, and if any of them read this and care to leave a comment, bring 'em on. I'm not afraid to discuss this.

And yes, it has a lot to do with public health. Religious leaders have done immense harm to public health, in such fields as HIV prevention and the science education that produces the medical and public health work force of tomorrow. It's my responsibility to confront damaging superstitions, and it's long past time that more of us did it.