Map of life expectancy at birth from Global Education Project.

Friday, September 29, 2006

Don't waste time in mourning . . .

Yup, everybody's depressed today. Even the General can't find any humor in the situation. I can't find the inspiration to do the posts I had lined up on the shortage of Emergency Department services, the exodus of physicians from primary care, or the mildly creepy authoritarian public health philosophy of Prof. John Banzhaf. But I will get around to them. This is just one more day in history -- there are many more to come.

I lived through Vietnam and the Nixonian power grab, and long before I was old enough to vote, I was organizing and protesting. That's what you did in those days, if you didn't like killing and burning and torturing people in your name, and you still wanted to live in a nation of laws and liberties. And in the end, we won. We saved our Republic. And we did more than that -- we ended legalized discrimination, won the right to vote, won greater equality for women, changed some of the basic norms of our culture, apparently forever. So, we got complacent. The vandals were just outside the city gates, gathering in the woods, plotting our downfall. But we were growing soft.

The mass media culture grew even shallower, over the past 25 years, something I would not have thought possible. At its best, for a time, journalism was a profession with a sense of mission and honor. Now it's nothing more than a category of high class whoredom. Back then, people were engaged in politics through true mass organizations -- unions, local community groups, statewide and national issue-oriented mobilizations that had real members who went to meetings, held local events, and sent in money to keep the national program going. Now about the only people who do that are fascist religious fanatics.

When I first moved to the Boston area I got involved with Mobilization for Survival. (Now Boston Mobilization.) It was just an average month for us to have 10,000 people on the Boston Common protesting U.S. intervention in Central America, five or six talks in front of church or community groups (yup, churches used to be against war, and for social justice, imagine that), monthly meetings of four different program committees with 12 or 15 active members each, a small demonstration at a Congressional office -- and meantime there were 10 other groups that would join us in coalitions when they weren't doing their own thing at approximately the same rate, from SANE to CPPAX to the DSA.

If any president had tried to pull off one twentieth of the outrages of this present gang of murderous thieves, the Boston metropolitan area, and a lot of other cities around the U.S.A., would have been shut down. Hell, we did shut them down when Nixon invaded Cambodia, and things got so hot when Reagan wanted to send troops to Central America that we stopped him. I remember it very well; I trained marshals and organized for a march on the Pentagon of 45,000 people, and there wasn't even a publicly announced intention to go to war, just some state-sponsored terrorism by the CIA.

But now we are a self-absorbed people, fat and comfortable, yet timorous and clinging -- maybe because we're afraid we really don't deserve what we have and that guilt makes us fearful of losing it. Why the wealthiest, most powerful and by far the most militarily secure society in history would sell its soul to a bunch of clownish megalomaniacs out of paranoia and cowardice is pretty hard to explain. But don't ever forget that Americans are no better than other people. And today, we are worse than most.

Too easy for a hint

Why do dictators torture people?

It's not to obtain information, or to protect the nation from enemies and subversives -- although they always say those are the reasons.

It is to proclaim their limitless power, and to terrorize anyone who would question or challenge it. Anyone who has ever lived under a dictatorship knows that.

Thursday, September 28, 2006

They hate us for our freedom

I generally try to stick to subjects on which I might have something special or at least idiosyncratic to add, and I resist dropping my own halfpenny worth on the topic du jour. But I mean, really. Goodness gracious. Heavens to Betsy. It seems the Congress needed to do something about the executive branch demolishing the Bill of Rights, the separation of powers, the rule of law, and 100 years of progress on international norms of conduct, so the solution is to legalize it all and grant the President of the United States the powers pertaining to a psychopathic dictator.

This is the effect of the legislation which has passed the House of Representatives and is now being debated in the Senate:

If, in the opinion of George W. Bush, I have "purposefully and materially supported hostilities against the United States" (let's say, by giving to Pakistani earthquake relief, as I urged people to do here last year), he can declare me an unlawful enemy combatant and make me disappear forever into a military prison, where he can have me tortured, with no recourse to the legal system, except that I will be tried by a "military tribunal." That I happen to be a U.S. citizen doesn't protect me. Oh sure, he wouldn't do that -- this bill is only directed at people who support terrorists on purpose, not to worry.

The punditocracy has ruled that this is a clever political maneuver by the Republicans. If Democrats vote against the bill, they will be enablers of terrorism, and it will redound to Republican advantage at the polls. It can happen here. It is happening here. It has happened here.

Wednesday, September 27, 2006

Two new links

I've added Unicorn Hat and Political Health to the blogroll. (Look to the left.) Do check them out; you won't be sorry.

This time, you really did read it here first

Here is the link to the uncorrected pre-publication proofs of The Future of Drug Safety: Promoting and Protecting the Health of the Public, the Institute of Medicine's long-anticipated report on the FDA and the drug approval process. (200+ page PDF) It will cost you significant dollars to buy the typo-free version once it comes out, so if you're a health policy junkie, this will save you enough for a trip to Dunkin' Donuts, or preferably the produce aisle.

The key recommendations, as usual, are another open door crashed through. If you've been reading this space, you know that everything they say has been said before by health care policy drones in every corner of the land. But that's how it works -- once the grunts, laboring in their anonymity, break the door down, the brass can stroll in and find themselves shocked, shocked, at what goes on here. Some highlights:

  • (Ooh, this is a good one!) 3.3: The committee recommends the Secretary of HHS direct the FDA commissioner and Director of CDER [Center for Drug Evaluation and Research], with the assistance of the Management Advisory Board, to develop a comprehensive strategy for sustained cultural change that positions the agency to fulfill its mission, including protecting the health of the public. (Damn, that's nasty.)

  • 3.5: To restore appropriate balance between the FDA’s dual goals of speeding access to innovative drugs and ensuring drug safety over the product’s lifecycle, the committee recommends that Congress should introduce specific safety-related performance goals in the Prescription Drug User Fee Act IV in 2007. (Whoops! Guess they forgot a teeny weeny little item last time. Well, those Congress people have a lot to think about, what with having to raise money and all . . .)

  • 4.1: The committee recommends that in order to improve the generation of new safety signals and hypotheses, CDER (a) conduct a systematic, scientific review of the AERS [Adverse Event Reporting System] system, (b) identify and implement changes in key factors that could lead to a more efficient system, and (c) systematically implement statistical-surveillance methods on a regular and routine basis for the automated generation of new safety signals.

  • In addition, CDER’s ability to test drug safety hypotheses is limited. Wait a minute -- isn't that supposed to be, like, their job? I keep getting the feeling that I'm missing something here . . .
    4.2: The committee recommends that in order to facilitate the formulation and testing of drug safety hypotheses, CDER (a) increase their intramural and extramural programs that access and study data from large automated healthcare databases and (b) include in these programs studies on drug utilization patterns and background incidence rates for adverse events of interest, and (c) develop and implement active surveillance of specific drugs and diseases as needed in a variety of settings. In other words, they should start to collect the actual data that would be necessary in order to evaluate drug safety. Golly, that does sound like a good idea.

And so it goes, with a whole series of recommendations that just make me say, "Well duhhhhh." Shorter IOM:

The FDA needs to develop the intention and the ability to protect the public. Neither of which it currently has.

(Thanks to Badri for the tip.)

Tuesday, September 26, 2006

Yes, all their dogs are in the hunt . . .

...but why should the drug pushers have all the horses? The American Psychological Association has just released an in-depth report on the use of psych meds in children. (56K warning: large pdf) As far as Stayin' Alive is concerned, it's another open door crashed through, but maybe some people will listen. Yes, yes, they aren't real doctors, they're doctors of philosophy, but still.

The bottom line is that (remember, you read it here first, if you happened to read it here first) kids with behavioral and emotional problems (APA goes along with the gag and calls them "mental disorders," without questioning the ontological status of the diagnoses) get drugs that have not been adequately tested for safety and long term efficacy, but they generally don't get psychosocial interventions that have been proven to work, because it's harder to get anyone to pay for them, and there aren't enough trained clinicians. A couple of money quotes:

The evidence base for treatment efficacy is somewhat uneven across disorders, with some of the most severe mental health conditions of childhood, including bipolar disorder and schizophrenia, receiving proportionally less attention from treatment researchers. Most of the evidence for efficacy is limited to acute symptomatic improvement, with only limited attention paid to functional outcomes, long-term durability, and safety of treatments. Few studies have been conducted in practice settings, and little is known about the therapeutic benefits of intervention under usual, or real-life, conditions.

. . .

It is the opinion of the working group that the decision about which treatment to use first be in general guided by the balance between anticipated benefits and possible harms of treatment choices (including absence of treatment), which should be the most favorable to the child. It is recommended that the safest treatments with demonstrated efficacy be considered first before considering other treatments with less favorable side effect profiles. For most of the disorders reviewed herein, there are psychosocial treatments that are solidly grounded in empirical support as stand-alone treatments. The preponderance of available evidence indicates that psychosocial treatments are safer than psychoactive medications. Thus, the working group recommends that in most cases psychosocial interventions be considered first.

The APA report goes on to consider the commonly diagnosed mental disorders of childhood in turn, beginning with our good friend Attention Deficit-Hyperactivity Disorder. (I won't get into whether not doing what children typically don't like to do -- sit quietly in rows, concentrate for long periods on boring tasks, and speak or move only with permission -- is properly called a disease. Granted, we all need to do things we don't enjoy, and this is one of them.) Anyhow, APA tells us that

Since the 1970s, a large number of studies have shown that behavioral interventions cause short-term amelioration of ADHD symptoms and impairment and that these acute effects are comparable in most domains to those obtained with low to moderate doses of stimulant medication (Pelham & Waschbusch, 1999). In contrast to the results of studies of stimulant medication that focus on improving the core symptoms of ADHD, studies of behavioral treatments have focused on improving the key domains of impairment associated with ADHD and thought to mediate long-term outcomes: parenting practices, peer relationships, and academic/school functioning (Pelham, Fabiano, & Massetti, 2005).

Hmm. Sounds to me like parents who really feel that something needs to be done might want to go to a psychologist first, instead of a drug dealer. But now you don't have to take my word for it.

Monday, September 25, 2006

Well now, this is a surprise

A few weeks back I attempted to define the concept of "drugs." As we've discussed quite a bit over the eons here, on the one hand the government wants you to know that "speed kills," on the other hand doctors are prescribing speed to perhaps 2% of American school children. Now, if you hand out a dangerous and street saleable drug to large numbers of kids, what might happen?

Right. Lots of kids will show up in emergency rooms with adverse consequences of drug abuse. Rather interestingly, the rate at which kids 12-17 taking Ritalin for "medical" use visit EDs for adverse consequences was about the same as the rate for kids taking it for "non-medical" uses. For kids taking amphetamine, there are more visits to EDs for non-medical use than for medical use, but both rates are much higher than the rates for young adults, who you would normally expect to be more at risk for drug abuse. We usually see the prevalence of illegal drug use rising after age 18, but in the case of these particular drugs, acute adverse consequences of both illegal use and of prescription use appear to be more prevalent in teenagers than in young adults.

A separate SAMHSA survey finds that the prevalence of illicit use is higher among young adults than among teenagers, but that is obviously not reflected in ED visits. Maybe college kids can use speed more safely -- it was very common when I was in college during final exam week, and as far as I know nobody died from it -- but I remain very skeptical of such widespread prescribing to children.

Sunday, September 24, 2006


The recently announced discovery of a nearly intact skeleton of a juvenile Australopithecus afarensis is an appropriate occasion for the final installment in my promised series on evolution. That doesn't mean I won't continue to discuss the subject, but this completes the previously stated agenda.

The human lineage diverged from the lineage of chimpanzees about 6 million years ago. More than 4 million years ago, the genus Australopithecus became fully bipedal. What the new fossil skeleton tells us, however, is that 3 million years ago the species was otherwise much more like an ape than a human -- including having a brain not much larger than a chimp's.

Not long, in geological terms, after the time of the Australopithecus child (or cub) -- a little over 2 million years ago -- the species Homo habilis and Homo erectus emerged. Their brains were bigger, though not as big as ours. And they made stone tools. However, in marked contrast to us, their ways of life were highly static. They continued to make very similar looking tools for more than a million years. We have found no evidence of art, or religion, or cultural development among them, although it could be hard to find given their great antiquity.

Creatures that paleontologists consider to be Homo sapiens appear in the fossil record about 400,000 years ago, and skeletons completely indistinguishable from those of modern humans appear 100,000 years ago.

Something extraordinary and as yet quite unexplained happened only about 50,000 years ago -- a sudden explosion of cultural development. Instead of seeing the same toolkit appearing in ancient sites over hundreds of thousands of years and across great distances, we see continual innovation and dramatic variation in time and space. We begin to see ritual burials, works of art, ornamentation. At almost exactly the same time, people began to spread outward from Africa. Within a mere 5,000 years, they had landed in Australia, having evidently crossed a large stretch of open ocean in boats. Within 30,000 years, people had penetrated to every corner of the earth except Antarctica. They had developed agriculture, adapted to climates ranging from the high arctic to deep forest to barren desert, invented baskets, pottery, sewn garments, spear throwers, musical instruments. Within a few thousand years more, they had founded civilizations, built great cities, forged tools of metal, begun to write down their current histories and mythical pasts.

This event was explosive, unprecedented on earth, astonishingly rapid. In a geological eyeblink, we have radically reshaped the planet's ecosystems, and now we are remodeling its very atmosphere and climate. We have even climbed out of the atmosphere and the gravity well and visited our planet's satellite, and sent our robots to other planets and out of the solar system entirely. And we have radically remade our own lives as well.

But what happened 50,000 years ago? The temptingly obvious answer is the development of a huge vocabulary and fully syntactical communication -- that is, language. With that came the possibility of elaborate reasoning, preservation and dissemination of knowledge, collective problem solving, transmission of knowledge, ideas and values from generation to generation -- the constructed, modifiable, improvable edifice of human culture and social organization. But that only raises the question.

How and why could such a powerful and complex faculty as language emerge with such suddenness? Did it depend on a biological event, a genetic modification invisible in the skeleton but profoundly altering the functioning of the brain? Or was the brain somehow prepared to acquire and use language, only awaiting its discovery? The latter may seem improbable, but recent experiments have shown that apes can learn a vocabulary of a few hundred words and use them in a limited syntax. So presumably our earlier ancestors had at least some limited capability to use language, although whether they did so or not we cannot say.

But somehow, at that magic moment, a group of humans acquired the gift of gab -- the fount of unlimited potential, unbounded wonder, and horrific danger.

Friday, September 22, 2006


Being isolated here in the ivory tower with my chardonnay and brie, it's hard for me to understand those 40% of the people who tell pollsters they think the White House Occupant is doing a good job.

Things you have to believe to be a Republican:

  1. The universe was created 10,000 years ago. Or at least, there's a legitimate debate about whether it's 10,000 years old or maybe somewhat older, and we need to teach that debate in science class.
  2. A microscopic single cell has exactly the same moral status as a crying baby, if it happens to contain molecules of Deoxyribonucleic acid with nucleotide sequences characteristic of human beings. Jesus said so. It's in the Bible.
  3. The allegation that burning fossil fuels is increasing the concentration of CO2 in the atmosphere, thereby causing the globe to get warmer, is a hoax perpetrated by environmentalists in order to get foundation grants.
  4. Saddam Hussein attacked the United States on September 11, 2001.
  5. The United States is in a war against Terror, aka The Evildoers, which will go on until the President says it's over, or forever, whichever comes first. As long as We're At War, the President must have the same powers as Adolf Hitler in Nazi Germany, or you and your family will not be safe. People who don't want him to have those powers care about The Evildoers more than they care about your family.
  6. Osama bin Laden, the leader of Terror, aka The Evildoers, is as dangerous as Hitler or Stalin.
  7. Osama bin Laden really doesn't matter. We're not that concerned about him.
  8. If people of the same sex are allowed to marry, my family will be destroyed.
  9. If we don't repeal the Death Tax, I won't be able to pass on my raised ranch and my '87 Ford Ranger pickup to my kids.
  10. Iraq is a shining example of democracy which will transform the Middle East. People in Iraq used to get tortured and killed, they couldn't practice their religion freely, and women had to do difficult jobs like being college professors and engineers. Now people can practice their religion freely (even though they get tortured and killed for it), and women get to stay home all the time so they won't be kidnapped and raped. Anyhow, we had to invade because Saddam Hussein attacked us on Sept. 11, 2001, he had huge stockpiles of chemical and biological weapons that he was planning to give to Osama bin Laden, he was making nuclear bombs, and he's an Evildoer.
  11. Democrats are the party of fiscal irresponsibility. We need to keep Republicans in office so they can't run up huge budget deficits.

I'm sure you can think of a few more but that's enough for now. As usual, I'm out of here until Sunday.

Thursday, September 21, 2006

Lies and the Lying Liars who Fund Them

NEJM has once again provided a free full-text article to the common rabble, this time by Robert Schwartz, M.D., on stem cell research. It provides a quick, lay-friendly review of the current status of biomedical research using embryonic and adult stem cells.

What I want to chop out for your consideration today is this:

According to the New York Times, Karl Rove, head of the White House's Office of Political Affairs, has declared that embryonic stem cells aren't required because there is "far more promise from adult stem cells." Yet the notion that adult stem cells have the same developmental potential as embryonic stem cells, let alone "more promise," is dubious. It seems that the White House received this idea from David Prentice, a senior fellow for life sciences at the Family Research Council and an advisor to Republican members of Congress. In a report of the President's Council on Bioethics, Prentice claimed that adult stem cells can effectively treat more than 65 diseases. Not only is this assertion patently false, but the information purveyed on the Family Research Council's Web site is pure hokum.

In case you didn't already know, the Family Research Council is a "Christian" organization founded by James Dobson. It is famous, among other reasons, for accusing SpongeBob of being gay.

Then there are all those "research institutes" that claim that human activity is not causing global warming. Turns out Exxon is paying them to say that. The British Royal Society (equivalent of our National Academy of Sciences) has asked them to stop.

This is the first time the society has written to a company to challenge its activities. The move reflects mounting concern about the activities of lobby groups that try to undermine the overwhelming scientific evidence that emissions are linked to climate change. The groups, such as the US Competitive Enterprise Institute (CEI), whose senior figures have described global warming as a myth, are expected to launch a renewed campaign ahead of a major new climate change report.

ExxonMobil is also, of course, a major contributor to the Republican Party and Republican candidates. Then there are all those researchers who claimed that tobacco doesn't cause cancer and heart disease. Paid for, of course, by the tobacco industry -- also a major backer of Republicans.

Why do all these ideologues and greed heads have to fund their own, private research institutes? Why aren't there scientists working in universities and independent research settings (such as the one where I work) who will say the things they want said? Is it because academia has a liberal bias, and refuses to hire people who don't wear pink underwear?

Nope. It's because reality has a liberal bias. The reason that conservatives lie all the time is because they have to. The truth is their enemy.

Wednesday, September 20, 2006

Reality Basis for the National Review

A few days back we had an inquiry about an article in the National Review by some clown named Deroy Murdock claiming that all those pinko commie national health care programs in Europe are much worse than our capitalist paradise system. My initial response was perhaps too sophisticated for our lay readers: the guy is making it up and he's full of shit. That's technical language for failure to provide appropriate citations and cherry picking of out of context anecdotes.

Anyhow, I didn't say anything further right away because I happened to know that we were about to get the real deal answer. Here it is, free to all, including liars who write for the National Review. (Is there some other kind of National Review writer? I'm just curious.)

The point of this study from the Commonwealth Fund was not really to compare the U.S. health care system performance to other countries, but rather to whatever seemed to be the best benchmarks available. But where those happen to be other countries, the conclusion is inescapable. As I keep repeating until it becomes like a spike driven into the brain, we spend twice as much of our GDP on health care as the median of wealthy countries, yet somehow manage to be the only one that doesn't guarantee coverage to everyone. Do we get what we pay for?

On "mortality amenable to health care" -- and we've had a lot of discussion lately about what that is and isn't, but it's something like half of all mortality -- we are close to the bottom. That's right, Deroy, we aren't the best, we're more like the worst. The top scoring countries have 80 annual deaths per 100,000 population; we have 115. The worst is 130. On healthy life expectancy at age 60, we are at 15.3 years for men, and 17.9 for women, vs. 17.4 and 20.8 for the best performing countries. Most of the benchmarks they use are from the best performing states or insurance plans, rather than other countries, but those two seem pretty powerful, because they are, after all, the bottom line.
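To see what those rate gaps mean in absolute terms, here is a back-of-envelope sketch. The rates (80 amenable deaths per 100,000 for the top performers, 115 for the U.S., 130 for the worst) are the ones quoted above; the U.S. population figure of roughly 300 million in 2006 is my own round-number assumption, not from the Commonwealth Fund report.

```python
# Rough arithmetic on "mortality amenable to health care."
# Rates are annual deaths per 100,000 population, as quoted above.

def excess_deaths(rate, benchmark_rate, population, per=100_000):
    """Annual deaths attributable to the gap between two mortality rates."""
    return (rate - benchmark_rate) * population / per

US_POPULATION = 300_000_000  # assumption: ~300 million in 2006

# Gap between the U.S. (115) and the best performers (80):
print(excess_deaths(115, 80, US_POPULATION))  # -> 105000.0
```

In other words, if those benchmark figures hold, matching the top-scoring countries would mean on the order of a hundred thousand fewer deaths a year -- which is the sense in which this benchmark really is the bottom line.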

Here's another: the countries that do the best have 22% of patients reporting experiencing a medical, medication, or lab test error. In the U.S., it's 34%. Deroy claims that in Canada and other pinko commie countries, people have to wait forever to see a doctor. Hmm. In the top scoring countries, 81% of people who need medical attention get to see a doctor by the next day. In the U.S., it's 47%. (I'm surprised it's that high.) I could go on, but you get the idea. Read the National Review for a good laugh, not for information.


Okay, so here is yet another headline about a medical error, in this case a hospital that managed to kill three premature infants by giving them adult doses of heparin.

Medical errors -- particularly medication errors -- obviously can happen outside of hospitals, but we don't have good data on errors and adverse events in ambulatory care. For hospitals, however, there is a commonly accepted range of estimates: from 44,000 to 98,000 Americans die every year from avoidable errors made in hospitals. The number who are injured, including many serious injuries (e.g., having the wrong leg amputated), is obviously much higher.

You can see where those estimates come from here, which is the first page of the E-book version of the Institute of Medicine's report "To Err Is Human: Building a Safer Health System." The E-book format is kind of dodgy: access to the publication is free, but you can't download it and print it out, you have to look at it on your computer one page at a time. In other words, they're still hoping you'll pay for the printed version. But it's there if you're really interested.

Okay, to kick off this discussion, let me make some basic observations:

Medical intervention, like flying an airplane, is inherently dangerous. You have a long way to fall from the sky, and you also can do a lot of damage by cutting people open, sticking tubes in them, or pumping in or feeding them powerfully bioactive chemicals.

Everybody makes mistakes. If I make a mistake at work (not that it would ever happen), the most dire consequence might be that somebody is sitting around in a conference room wondering where the hell I am, or a questionnaire goes out with an embarrassing typo. If a doctor, nurse or pharmacy technician makes a mistake, well . . .

Modern medicine is a very complex undertaking. There are new drugs, new tests, new procedures all the time. New information about risks and contraindications for existing drugs, tests and procedures comes out all the time. It's nearly impossible for anybody to keep track of all the information that might affect patient safety, even in a narrow field. Take my post yesterday about sodium phosphate: most gastroenterologists apparently don't know that it's dangerous for people with kidney failure. That seems pretty basic, but it's also a different specialty.

Historically, the principal method by which medical providers have been held accountable for errors has been malpractice litigation. This does not efficiently discourage errors because:

  • Mistakes are not the same as malpractice, which requires a finding of negligence. You can make a mistake without being negligent.

  • Malpractice litigation is an adversarial procedure. It encourages doctors to fight the allegations rather than trying to figure out how to make sure it doesn't happen again. That means trying to suppress information, or interpreting it in the most favorable possible light. It means not coming forward in the first place if you know you did something wrong, hoping nobody will notice. It drives physicians to stick together like thieves, creating a culture of cover-up and avoidance.

  • Malpractice litigation is mostly directed at finding fault in individuals, and getting them and/or their insurance companies to pay up. It doesn't encourage analyzing systems to find ways of making them mistake-proof.

So, there is growing interest in systems approaches to medical errors. Try to figure out where the points of vulnerability are that lead to mistakes, and fix the physical environment, the procedures, the job descriptions so that mistakes are impossible to make. For example, people used to be injured by getting hooked up to the wrong kind of gas. Now the fittings for oxygen and anaesthetics are incompatible. And have you noticed how they always ask your birthday before handing you your prescription? That's to make sure you are the right Pemberton G. Throckmorton.

We'll get into more depth on this later. But in the meantime, remember, even without mistakes, medical intervention is dangerous -- and as a matter of fact, it can be hard to draw the line between mistakes and bad luck. There is an awful lot of judgment involved in trying to trade off risks and benefits. There are deep psychological and philosophical issues in deciding what is appropriate and what is just plain nuts. Hospitals are always going to be very dangerous places.

Tuesday, September 19, 2006

Now they tell me . . .

I happened to be reading the British Medical Journal yesterday and I came across the information -- apparently sufficiently unfamiliar to your average physician that it required a major warning article with a lot of very basic background in one of the world's foremost generalist medical journals -- that sodium phosphate can be very dangerous for some patients. (Y Mun Woo, Susan Crail, Graham Curry, Colin C Geddes. A life threatening complication after ingestion of sodium phosphate bowel preparation. BMJ 2006;333:589-590) Not for me, as it turns out -- the major risk is to people with malfunctioning kidneys -- but they never actually asked me about my kidneys before they sent me the instructions to drink the industrial waste.

It turns out that ingesting sodium phosphate causes a transient spike in phosphate in the blood, and probably more important, a drop in calcium concentration. People with kidney failure can't excrete the phosphate and as a result can't get their serum calcium back up. Very bad news, because serum calcium is an essential electrolyte and without it the nervous system doesn't function right and the heart doesn't beat right. These authors say that mortality after ingesting sodium phosphate among people with damaged kidneys may be 33%. But the low calcium (hypocalcaemia) may also be dangerous for people with heart disease, liver disease, and known electrolyte imbalances, and people who are just plain old and frail. That would seem to include a fair proportion of people who get colonoscopies.

They go on to state that "Evidence shows that many endoscopists may not be aware of groups of patients who are at high risk and their potential for complications after ingestion of sodium phosphate."

So, what are we to make of this? Many endoscopy providers, as a default, tell all their patients to prepare using sodium phosphate. It definitely works,* and it's cheap. But there are also several alternatives which are safer for some patients. This is an obvious example of a practice which needs to be embedded in systems to prevent recommending it to people for whom it is not appropriate.

If I did have kidney failure, or heart disease, or for some other reason should not have ingested sodium phosphate, whose fault would it be if I were injured or killed? The triage nurse who called me ahead of time and didn't find out about my contraindications? The doctor who performed the procedure? Whoever developed the protocols used in the clinic? And maybe the protocols are sound, but they just weren't followed -- but then whose fault is that? Who should my survivors have sued? Would they have won?

As I mentioned a couple of days ago, doctors are thinking hard about these sorts of questions, as are people like me who are not real doctors (I'm a doctor of philosophy). I'll try to review some of the common thinking shortly.

*For those of you who are fans of Dr. Science, the way it works is pretty simple -- it has extremely high osmolarity, so it just sucks water out of your body and into your colon. Whooosh, you're a human fountain. No fun, but mighty impressive.

Call me old fashioned . . .

I dunno, it just hadn't occurred to me before that giving the preznit the authority to torture people on his personal whim would be a winning political platform. Evidently it will make people feel safer. For some reason I can't quite put my finger on, it doesn't make me feel that way.

Monday, September 18, 2006

Nobody here but us chickens . . .

And I'd like to introduce everyone to our new security guard, Ms. Fox.

Okay, how many of you have heard of the Office of Information and Regulatory Affairs? Raise your right wing, please. That's what I thought. Cluck, cluck.

OIRA is an agency of the Office of Management and Budget, part of the Executive Office of the President, which among its other duties reviews proposed federal regulations to assure that they are compliant with federal policies. The operative executive order was issued by Bill Clinton, but the principles it contains are subject to err, interpretation. For example, point 6, "Each agency shall assess both the costs and the benefits of the intended regulation and, recognizing that some costs and benefits are difficult to quantify, propose or adopt a regulation only upon a reasoned determination that the benefits of the intended regulation justify its costs," and point 11, "Each agency shall tailor its regulations to impose the least burden on society, including individuals, businesses of differing sizes, and other entities (including small communities and governmental entities), consistent with obtaining the regulatory objectives, taking into account, among other things, and to the extent practicable, the costs of cumulative regulations."

So who do you think the Emperor of Mesopotamia wants to appoint to head this office? Public Citizen (blogger ethics alert: I worked for them briefly in my misspent youth) wants you to know:

The nomination of Susan Dudley as administrator of the Office of Information and Regulatory Affairs (OIRA) represents another attack by the Bush administration on the government’s ability to hold industry accountable and keep Americans safe, according to a report released today by Public Citizen and OMB Watch. . . .

As director of regulatory studies at the industry-funded Mercatus Center, Dudley has sought to strike down countless environmental, health and safety rules. She has opposed such safeguards as the EPA’s attempts to keep arsenic out of drinking water and lower levels of disease-causing smog. She has questioned NHTSA’s life-saving air bag regulations and the Department of Transportation’s hours-of-service rules to keep sleep-deprived truck drivers off the roads. She has championed energy deregulation, which has led to skyrocketing prices and little consumer relief during record-setting heat waves. . . .

“With Susan Dudley’s nomination, President Bush is proposing to install one of the nation’s leading anti-regulatory zealots as the gatekeeper for all regulatory safeguards,” said Peg Seminario, director of health and safety for the AFL-CIO. “Not only has Dudley opposed virtually all new worker safety and health protections, she has also strongly advocated rolling back legal rights and protections that workers have already gained.”

“Throughout this administration, OIRA has weakened already troubled agencies,” said Public Citizen President Joan Claybrook.* “If Dudley is confirmed by the Senate, she will further strip them of their ability to stand up to government secrecy, politicization and corporate interests. On behalf of the public, we are urging the Senate to reject her nomination.”

So what do you think? Will the Senate reject her nomination? Why don't we ask Joe Lieberman what he plans to do about this.

*BTW, I have it on good authority that the rumors about Joan and Ralph are false. So there.


I don't know about you, but lately I often have the feeling that it just isn't worth worrying about the stuff I'm paid to worry about, at least not all that much, because we may have much bigger problems. The biggest problem of all, of course, is the vicious gang of malignant clowns who are running the U.S. government.

You don't have to take it from Sy Hersh any more - maybe you've already noticed that Time magazine is making it sound like plans for war with Iran are well under way, presumably just before the November elections, and too late to mount a meaningful public response. For non-subscribers, Josh Marshall has stolen the money quote:

The first message was routine enough: A "Prepare to Deploy" order sent through naval communications channels to a submarine, an Aegis-class cruiser, two minesweepers and two mine hunters. The orders didn't actually command the ships out of port; they just said to be ready to move by Oct. 1. But inside the Navy those messages generated more buzz than usual last week when a second request, from the Chief of Naval Operations (CNO), asked for fresh eyes on long-standing U.S. plans to blockade two Iranian oil ports on the Persian Gulf. The CNO had asked for a rundown on how a blockade of those strategic targets might work. When he didn't like the analysis he received, he ordered his troops to work the lash up once again.

What's going on? The two orders offered tantalizing clues. There are only a few places in the world where minesweepers top the list of U.S. naval requirements. And every sailor, petroleum engineer and hedge-fund manager knows the name of the most important: the Strait of Hormuz, the 20-mile-wide bottleneck in the Persian Gulf through which roughly 40% of the world's oil needs to pass each day. Coupled with the CNO's request for a blockade review, a deployment of minesweepers to the west coast of Iran would seem to suggest that a much discussed—but until now largely theoretical—prospect has become real: that the U.S. may be preparing for war with Iran.

This seems utterly insane, but to me it is plausible. This is the Hail Mary pass for the PNAC. A Democratic takeover of either house of Congress in November will mean Congressional hearings that will strip away their armor of lies, hearings that the corporate media will not be able to ignore or spin away. Their plot to terminate the American republic and replace it with a permanent dictatorship of the possessors will hit the rocks. War with Iran is the one event they believe will let them recapture the magic glow of the two years following Sept. 11, 2001, when they led the country to disaster riding a flying carpet of mass delusion.

They don't care what happens to the United States, or its people, or humanity, or about any principles or ideals. They care only for power, and greed. If the military leadership, and the corporate media, and the five or six Republican members of Congress with a shred of decency and self respect allow them to finally immolate the constitution on the altar of megalomaniacal fantasy, then the American experiment will have failed.

So, it's a little hard to write about public health and medical sociology when I'm in this kind of mood. But don't worry, I'll get back to work anon.

UPDATE: And while you're waiting, you might want to check out John Mueller in Foreign Affairs. "We're at war!" with a phantom -- the projection of our own worst nature.

Friday, September 15, 2006

No Comment

What do you think, based on the facts as alleged? Is this homicide?

WAUKEGAN, Illinois (AP) -- A coroner's jury has declared the death of a heart attack victim who spent almost two hours in a hospital waiting room to be a homicide.

Beatrice Vance, 49, died of a heart attack, but the jury at a coroner's inquest ruled Thursday that her death also was "a result of gross deviations from the standard of care that a reasonable person would have exercised in this situation." . . .Vance had waited almost two hours for a doctor to see her after complaining of classic heart attack symptoms -- nausea, shortness of breath and chest pains, Deputy Coroner Robert Barrett testified.

She was seen by a triage nurse about 15 minutes after she arrived, and the nurse classified her condition as "semi-emergent," Barrett said. He said Vance's daughter twice asked nurses after that when her mother would see a doctor.

When her name was finally called, a nurse found Vance slumped unconscious in a waiting room chair without a pulse, Barrett said. She was pronounced dead shortly afterward.

I will have more to say about this later.

Return to the Magic Mountain?

A major controversy in public health in the late 20th Century concerned the work of Thomas McKeown, who argued that medical intervention had little to do with the decline in death rates and growth of population in the industrialized countries prior to the 20th Century. Rather, he argued, economic growth and attendant better living conditions, particularly better nutrition, were principally responsible.

McKeown was aggressively attacked and, as James Colgrove put it (American Journal of Public Health, March 2002): "The consensus among most historians about the McKeown thesis a quarter century after it first stirred controversy is that one narrow aspect of it was correct -- that curative medical measures played little role in mortality decline prior to the mid-20th century -- but that most of its other claims, such as the assessment of the relative contributions of birth rates and of public health and sanitation measures to population growth, were flawed."

This is now supposed to be the smart kids' view of McKeown, but I find it quite odd. That is not a "narrow aspect" of McKeown's thesis -- it is what most readers found to be most essential about it. It's pretty much the whole point. Granted, he offended public health practitioners by playing down the clean water thing, and he was probably wrong about that. However, his most famous analysis had to do with the decline in tuberculosis mortality in England and Wales. TB used to be a major killer, but it had become rare before there were any effective medical treatments. Remember how much of 19th Century literature is about TB, from Mann to Keats to Alexandre Dumas? But who ever worried about it in the 1930s? McKeown's foremost critic, Simon Szreter (yup, I spelled that correctly) argued that he had confused tuberculosis and other respiratory diseases in death records, and so gotten the timing of the decline of tuberculosis wrong. This was all supposed to be quite devastating, but it is really nitpicking. It remains true that TB became unimportant as a cause of death in the developed countries before effective treatments came along.

Anyway, I dredge all this up today because of the considerable alarm that has arisen over the emergence of so-called Extensively Drug Resistant Tuberculosis (XDR TB) in many areas of the world, particularly in association with HIV. It appears that drug resistant strains have arisen independently in various places.

As you know if you've been reading for a while, drug resistant pathogens result, among other causes, from erratic use of antibiotics or failure to complete courses of treatment. Tuberculosis can infect people without producing symptoms. It is most likely to cause illness in people who are immunocompromised or generally debilitated, and of course it is people with active symptoms, who are coughing and bringing up sputum, who are most likely to be infectious, and most likely to be caught by people who are in close proximity with infected people in poorly ventilated circumstances such as prisons or shelters for the homeless. So we can see why TB would decline with improving living conditions, and why TB is mostly seen today in conjunction with HIV, in poor countries, and among socially marginalized people in the rich countries.

So, the appearance of XDR TB is worrisome. TB control depends on antibiotics, and if we lose them, TB could once again become disastrous for humanity. Recommended measures include making sure to add multiple new drugs to regimens that prove ineffective, instead of just one; and so-called Directly Observed Therapy, making sure that people take all their pills; and finding as many cases of TB as possible and bombing them with multiple, powerful antibiotics. In this way, it is hoped, we can keep a lid on the problem.

Still, I wonder if we shouldn't give poor old McKeown more credit in this situation. Eliminating the social disadvantages which help TB thrive in the human population would help just as much, if not more. But that's obviously unrealistic.

Thursday, September 14, 2006

An interesting, though probably doomed, experiment

One e-mail list I'm glad to be on is that of Nicholas Zamiska, the WSJ's most excellent health and science correspondent based in Hong Kong. (And best of all, he never sends me Chinese pornography or offers to enlarge any of my body parts.) He tips us off to an experiment by Nature Magazine* to open up the peer review process to any competent lector, not just the two or three reviewers assigned by the journal.

Authors who agree to it have their submitted manuscripts placed on this website, where anybody can download the PDF and where qualified people can then leave their own comments. (The comments are moderated so it ain't Eschaton. Sorry, Adrian Spidle.**)

While Nick's article (WSJ, Sept. 14, page B1) emphasizes some recent failures of the peer review system to weed out fraud, this new procedure isn't likely to help a whole lot with that problem and it really isn't intended to do that. The possible virtues of this experiment, in my view, include:

  1. It contributes to the democratization of science. Interested lay people, or experts in fields other than those covered by the articles on the site, can see what scientific work in progress looks like and how fellow scientists in the same field critique it. The discussions are bound to be quite technical and hard for non-experts to follow, but you can still glean a good bit of insight, I think. I would certainly consider using this as a resource in science classes, at the college level and even for bright high school students.
  2. It can speed up discovery by giving the community of science early access to research. Publication can take months to a year or more, and meanwhile other scientists may be busy reinventing the wheel, or losing time by not having the results to build on.
  3. It can certainly improve the quality of published research by providing a broader range of critiques ahead of time. Again though, even a thousand readers can't necessarily detect fraud. Lies don't always give themselves away.

A downside is that jealousy or competitiveness could lead people to trash each other's work, but that's actually less of a worry than it is with the traditional peer review process, since commenters must identify themselves, whereas peer reviewers are anonymous. Another problem is that people could steal ideas, I suppose, though it's hard to see how they'd get away with that -- if the results are valid, they'll be published long before somebody else can repeat them and get them printed. Some scientists might prefer to hoard their own findings while they work on the next step, but I don't find that honorable -- the realm of the unknown is infinite, it's silly to try to keep some of it for yourself.

A more serious downside is that journalists or activists of one kind or another might seize on these unpublished findings and publicize them or try to use them for advocacy in some way, and it will turn out that they are rejected by the peer review process and are not truly valid. Once a wrong idea gets embedded in the culture, it's awfully hard to get rid of it. (No, taste buds for sweetness are not concentrated on the tip of the tongue.) Well, we'll see.

*and if you check out their web site, you can get free access to the much-ballyhooed report on the Neanderthals' purported last stand near Gibraltar.
** Adrian won the Rising Hegemon award for "most psychotic commenter" in 2004.

Oh no, where will we find the money?

According to the World Health Organization, resources needed for global HIV-related programs will be $18.1 billion in 2007, but only $10 billion is expected to be available. (PDF. Page 17.) An $8.1 billion gap is a lot of money, and it's hard to imagine where it's going to come from.

That's less than it costs the U.S. to occupy Iraq for ten days.

Update: The official numbers are more like one month, but that's not counting the "long tail" of expenses for veterans' benefits and health care, replacing all the equipment that's being destroyed, and survivor benefits -- including at least five more bereaved American families today. How much longer will the American people put up with this world historical crime?
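For the arithmetically inclined, the comparison is easy to sketch. The WHO figures are from the report linked above; the roughly $8 billion a month occupation cost is my own assumption based on contemporary estimates, so treat this as illustration, not gospel:

```python
# Back-of-the-envelope comparison. WHO figures are from the report cited
# above; the Iraq monthly cost is an assumed round number (~$8 billion).
needed = 18.1e9      # global HIV program resources needed in 2007, USD
available = 10.0e9   # funding expected to be available, USD
gap = needed - available

iraq_monthly = 8.0e9           # assumed occupation cost per month, USD
iraq_daily = iraq_monthly / 30

print(f"Funding gap: ${gap / 1e9:.1f} billion")                    # $8.1 billion
print(f"Equivalent occupation time: {gap / iraq_daily:.0f} days")  # ~30 days
```

Which is how you get from "ten days" to "more like one month," depending on which cost estimate you believe.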

Wednesday, September 13, 2006

You might want to try this . . .

You may have heard about the Japanese study that finds that drinking green tea is significantly protective against dying from heart disease and stroke. (Abstract here.) If you want my opinion, this is about as persuasive as an epidemiological cohort study (as opposed to a randomized controlled trial) can get.

The effect is very large, particularly for women, and particularly for people who drink a lot of it. Women who drank 5 or more cups of green tea a day had 26% lower total mortality over 11 years than abstainers; men had 17% lower mortality. For heart disease specifically, the effects were even more dramatic. The researchers controlled for everything they could think of, but it didn't make much difference, except that the effect was weaker in men who had been, or were currently, smokers.

Now, this kind of study can't prove that drinking green tea is really the cause of this effect. It could conceivably be something associated with green tea drinking that they just didn't think of and couldn't control for. But it's hard to see what that might be. And it might not work as well in other populations because it might depend on other dietary, environmental or lifestyle components prevalent in that part of Japan. But the effect size is so large it seems awfully compelling.
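To put those percentages in more familiar epidemiological terms -- and to be clear, the numbers below are hypothetical, invented purely to show the arithmetic, not the study's actual rates:

```python
# Hypothetical illustration of how "X% lower mortality" maps onto a
# rate ratio. These are made-up numbers, not data from the Japanese study.
def rate_ratio(percent_lower):
    """A '26% lower' mortality rate corresponds to a ratio of 0.74."""
    return 1.0 - percent_lower / 100.0

women_rr = rate_ratio(26)   # women, 5+ cups/day vs. abstainers
men_rr = rate_ratio(17)     # men, 5+ cups/day vs. abstainers

# Apply to an assumed baseline of 10 deaths per 1,000 person-years:
baseline = 10.0
print(round(women_rr * baseline, 1))   # 7.4 deaths per 1,000 person-years
print(round(men_rr * baseline, 1))     # 8.3 deaths per 1,000 person-years
```

The point being that a 26% relative reduction is a big effect by the standards of observational nutrition research, where ratios of 0.9-something are the norm.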

Black tea, by the way, didn't work in this study. (Black tea comes from the same plant, but it's fermented, so some of the chemicals in the tea leaves are destroyed.) It will be bad if people decide that if they drink green tea, it's okay to get fat, not exercise, smoke, and eat doughnuts. But if you don't use it as an excuse for doing bad stuff, it looks like a winner. Oh, and it's fine to drink it cold.

(BTW, unlike other studies, this one didn't show any protective effect against cancer. In fact, cancer death rates were non-significantly higher for the green tea drinkers, although I suspect that's because they weren't being killed off first by heart disease, so they were still around to die of cancer.)

There are 8 kinds of people in the U.S.

Those who divide the U.S. into 8 kinds of people, and those who don't, further subdivided by urban and rural residence, and high and low income . . .

You may well have read news accounts of the study by Christopher Murray et al, published yesterday in PLoS Medicine, about disparities in life expectancy among various segments of the U.S. population. The newspaper account I read did not explain it very well, however. It is called Eight Americas. As Tony Snow would put it, "There's nothing new here," and it is in fact an extended exercise in the reification of constructs. But it still manages to be instructive, at least as a way of dramatizing what we already know.

Because they needed historical data going back a few decades, they used the 1977 "OMB statistical policy directive 15" standard for classifying the population by race, which means people get to be White, Asian, Native American, or Black. You can't be Latino, or anything else, such as Brazilian or Haitian or Somali or Assyrian. In fact, based on more recent data, the authors classified Asians living in places where 40% or more of the Asian population by the old standard is in fact Pacific Islander (Hawaii and Alaska, basically) as part of an amorphous Middle America, which is mostly white.

Anyway, starting with this 19th Century pseudo-scientific racist system, they went on to look at broader socio-demographic characteristics of counties and, based on what felt good at the moment, divided us all into 8 categories: Asians who don't live near a lot of Pacific Islanders; White people in low income rural counties of the northern plains; White people in Appalachia and the Mississippi valley who live in low-income counties; all other White people; Black people living in low-income counties of the South; Black people living in urban counties with high homicide rates; Native Americans in the Mountain and Plains states, mostly living on or near reservations; and everybody else, who is collectively called Middle America.

All this may strike you as arbitrary and odd, even vaguely offensive. Whatever. They then calculated life expectancy and mortality rates at various ages for these categories. As you know from your assiduous, dedicated reading of this blog, life expectancy is also an elaborate construct that does not necessarily represent any actual individual person's chances of living to any actual particular age. Nevertheless it does provide a snapshot of the condition of sub-groups of the population at a given moment.
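For readers who want to see what's under the hood, here is a minimal sketch of a period life table -- the machinery behind the construct -- using an invented, flat mortality schedule rather than anyone's real data:

```python
# Minimal period life table sketch. The mortality schedule below is
# invented for illustration; real tables use observed age-specific rates.
def life_expectancy_at_birth(qx):
    """qx: probability of dying within each one-year age interval."""
    survivors = 1.0       # radix: start with a cohort of 1
    person_years = 0.0
    for q in qx:
        deaths = survivors * q
        # Assume deaths fall, on average, at mid-interval.
        person_years += survivors - deaths / 2.0
        survivors -= deaths
    return person_years

# Toy schedule: a flat 2% annual death rate over 100 age intervals.
toy_qx = [0.02] * 100
print(round(life_expectancy_at_birth(toy_qx), 1))   # about 42.9 years
```

Change the qx schedule and the "expectancy" changes, even though no actual individual's prospects did -- it's a summary of this year's death rates, not a prophecy about anybody's lifespan.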

The payoff is that disparities in life expectancy among the various "Americas" (and sorry if you don't happen to be in any of them, if you don't exist I guess your life expectancy must be zero) are as great as some of the most dramatic international comparisons. For example, the Asian females of America 1 have a life expectancy 12.8 years longer than the rural Black women of America 7 -- similar to the gap between women in Japan and women in Nicaragua. The Asian males of America 1 can expect to outlive the urban Blacks of America 8 by 15.4 years -- similar to the gap between Icelandic men and men in Uzbekistan.

Okay. As I said before, nothing new here, just an elaborate way of telling us what we already know -- which is not nearly enough since we have such limited data on ethnicity, social and economic status, health care access and utilization, environmental exposure, and behavioral risk factors for the U.S. population, as the authors say. Still, it does seem more than disgraceful, doesn't it?

Oh look -- terrorists! They might kill somebody!

Tuesday, September 12, 2006

No excuse

I've addressed the trans fat issue briefly before, but it deserves a post of its own. I assume that most of my readers already know that trans fats are bad for you and that they are basically unnatural. It would be very easy to remove 80% or more of trans fats from the American food supply, and according to an estimate by Mozaffarian, et al (NEJM, 354(15), 13 April, 2006) doing so would reduce heart attacks and deaths from coronary artery disease by about 12% (more or less).

Fatty acids are hydrocarbon chains terminating in a carboxyl group (COOH). If you remember your high school chemistry you know that carbon has a valence of 4, so carbon atoms in chains can have single bonds with the carbons before and after, and two hydrogen atoms attached; or a double bond with one of their neighbors and one hydrogen atom attached. If the hydrogen atoms on each side of the double bond are on the same side of the chain, that's called the Cis configuration; if they are on opposite sides, that's called the Trans configuration. Fats with no missing hydrogen atoms, and no double carbon bonds, are called saturated fats.

Most animal fats are saturated, most vegetable oils are unsaturated. Naturally occurring unsaturated vegetable oils are almost entirely Cis fats. Trans fats are rare in animals as well, though bacteria in the guts of ruminants produce small amounts of trans fats which end up in milk and meat. Everybody knows that eating a lot of animal fat -- meat and dairy -- raises your level of "bad" cholesterol (Low Density Lipoprotein) and so increases your risk of heart attacks and strokes. So, you should eat vegetable oil and particularly monounsaturated fats (i.e., one double carbon bond) such as olive oil, and stay away from animal fat, although a little is okay.

Now, you'll note that animal fats are generally solid at room temperature. That's because the saturated fatty acid chains are straight. They lie next to each other, tightly packed, and form a solid or at least a buttery glob. Vegetable fats, however, bend at the Cis bond. Therefore they can't pack tightly, they slide past each other, and you have liquids such as a beautiful silky olive oil. Industrial "food" manufacturers discovered that they could add hydrogen atoms to polyunsaturated vegetable oils by heating them under pressure in the presence of hydrogen gas and a metal catalyst. In the process, some of the remaining Cis bonds are transformed to Trans bonds. As a result, the chains straighten out and the fats become semi-solid. The manufacturers like these kinds of fats because they don't turn rancid easily, they hold up well to the high heat of deep frying, and since they are semi-solid they can be used as shortening in crackers and packaged cakes and so on.

So now they are ubiquitous in the American diet. While dairy and meat fats may be 1 to 8% Trans fats, in french fries and other deep fried fast foods Trans fats are 25% or more of total fats. And those "healthy" granola bars? 18%. In crackers, the number is 34%. Doughnuts 25%, cookies 25%. Etc. (Mozaffarian et al)

So what does that do to you? Not only do trans fats raise levels of LDL, unlike saturated fat they also lower levels of the "good" HDL cholesterol. They also raise the levels of triglycerides, and cause other changes in blood lipids that raise the risk of heart disease and stroke. There is also evidence that they promote inflammation, further raising the risk of heart disease and diabetes. There are numerous other less well established harmful effects of trans fats.

Now, here's the bottom line: there is no health benefit from consuming trans fats, unless you count the empty calories they provide. On the other hand, they are very nasty poisons. They kill you the hard way.

Unfortunately, although manufacturers are now required to list Trans fats on food labels, if the specified serving contains less than 500 milligrams, they can give the amount as "zero," i.e., they can lie. Which they do, with glee. If you eat several cookies or a couple of handfuls of potato chips, you may consume several grams of Trans fats, even though the label says "zero trans fats." If you read the label more carefully, and see that "partially hydrogenated vegetable oils" are listed among the ingredients, you know that the "zero trans fats" claim is a lie, and you should avoid the product. However, that won't do you any good in restaurants.
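The loophole is pure arithmetic. Here's a sketch with a hypothetical cookie -- the 0.49-gram figure is invented, but the under-half-a-gram-rounds-to-zero rule is the real one:

```python
# Hypothetical cookie illustrating the labeling loophole: amounts under
# 0.5 g of trans fat per serving may legally be labeled as zero.
def labeled_trans_fat(grams_per_serving):
    """Grams of trans fat the label reports for one serving."""
    return 0.0 if grams_per_serving < 0.5 else grams_per_serving

per_serving = 0.49   # actual grams of trans fat per cookie (invented figure)
servings = 6         # a few cookies

print(labeled_trans_fat(per_serving) * servings)   # label implies 0.0 g
print(round(per_serving * servings, 2))            # actually 2.94 g
```

So a snack of "zero trans fat" cookies can deliver nearly three grams of the stuff, which is why the ingredient list, not the nutrition panel, is the thing to read.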

Denmark has outlawed the sale of foods containing more than 2% artificially produced trans fats, which pretty much eliminated them from the food supply. Manufacturers were able to substitute Cis fats, and some saturated fats from palm oils, which are not good but better than Trans fats. Nobody noticed the difference. If you order your Freedom Fries or Chicken McNuggets in a McDonald's in Denmark, you will eat zero Trans fats, and they will taste exactly the same. (Which is still a good reason not to eat them). Other than that, nothing bad happened.

We don't even need legislation. The FDA could decide that Trans fats are no longer "generally regarded as safe," which would largely eliminate them. Don't hold your breath.

Monday, September 11, 2006

You too can be a medical sociologist

Just pay attention, and think about what's happening.

After a brief phone conversation with me, my doctor contacted the hospital with which his practice is affiliated to have them schedule my colonoscopy. I first heard from the hospital (let's call it Joe's Bar and Surgery) via a letter. It consisted of a page of single-spaced, dense type, stating the date, place and time at which I was to present myself, followed by lengthy, complicated instructions. I didn't do a formal literacy level test, but I'm accustomed to estimating and I'd put it at tenth or twelfth grade. That means the majority of people would not be able to process it.

The requirements are not only complicated, but quite onerous. Your last meal is breakfast the day before. After that, you are restricted to "clear liquids," which they don't exactly define, although they do give some examples -- jello, strained fruit juice. At about dinner time the night before, you have to drink some extremely nasty tasting stuff called sodium phosphate, which gives you instantaneous severe diarrhea. (It's actually marketed as a laxative, but I can't believe anybody would consume it voluntarily.) The diarrhea winds down in three or four hours. The next morning, you have to do another shot of poison, drink some water, and then have nothing -- not even water -- until after the exam. As a bonus, you get to experience another three hours of diarrhea while you are starving. I was scheduled to show up at 3:30 pm, so I spent a long day in mortal combat with my hungry id. Oh yeah -- you must have someone to drive you home. They won't let you leave without an adult escort, even if you promise to take public transportation. Fortunately, my brother works nearby so I had a solution. Not everybody does, of course.

Joe's did have someone call me, not so much to go over the instructions as to make sure I didn't have any contraindication for this, such as a drug allergy or severe chronic alcoholism or heroin addiction. I suppose that some people, perhaps most, have the chance to go over the procedure with their primary care physician ahead of time, but since we already know that immediately after a physician visit, people cannot recall 50% of what their doctor told them, I don't know how helpful that is.

I arrived at Joe's endoscopy center at 3:30, as required, and sat in the waiting room until 3:50. Okay, that's not too bad; usually physicians keep you waiting for at least half an hour. They had me strip and put on the notorious hospital johnny, then lie on a gurney and wait. A nurse came briefly to insert an IV, and I had the opportunity to ask her if they had a lot of no-shows for this procedure. The answer is yes.

So there I am, in infantile pajamas, in bed, in what is indistinguishable from a hospital ward. Other patients were lined up there in areas separated by curtains, but for the most part the curtains were not drawn and everyone was out in the open. Some of them were inpatients who were very sick. I lay there for 50 minutes, ignored, with nothing to read -- I didn't have my glasses anyway -- in the official clothing and posture of an invalid, on an invalid's furniture. This humiliation may appear pointless, but it's actually essential. Without the ritual redefinition of my social role from that of middle-aged professional and free citizen to that of helpless infant (aka patient), it would not be possible for a man to insert a long tube into my anus and snake it through my digestive tract.

For those who are considering going through this, the procedure itself was much less unpleasant than you might think. They heavily drugged me and I really wasn't aware of what was going on, although I was conscious enough to see the video, which was sort of interesting, although my memory of it is fairly hazy.

So, here are my conclusions and questions for further research.

1) There are many barriers to people getting this procedure, in addition to the financial barrier, which is substantial. (My understanding is that the full cost is about $1,000. Even with good insurance, my co-pay is $100.) There is the need to fast for two days, the very unpleasant purging, and of course the need to take at least one day off from work. I did go to work the day before, but it's hard to be fully productive when you are starving.

For people who have difficulty understanding the instructions, or find them intimidating, there must be a strong temptation to just forget the whole thing. And I wouldn't be surprised if many people who take the first dose of sodium phosphate decide not to go through with the second one. Many people must succumb to temptation and cheat on the fast, as well. So I would be curious to know:

a) What percentage of people just don't show up for the appointment? What characteristics of patients, geography, and methods of communication by providers are associated with no-show rates? What kinds of actions (what we call interventions) can increase attendance?

b) What percentage of people do show up, but have made less than adequate preparations? What percentage of procedures are compromised by failure to properly clean the colon? Again, what kinds of communication and supports could help people succeed at this quite onerous requirement?

I presume that some research has been done on these questions, but it's pretty obvious to me that Joe's, at least, doesn't have the answers yet.

2) I don't see why people going through such procedures can't be granted more dignity. I can tell you that there is nothing unique about colonoscopy in the way that people who undergo it are systematically infantilized. Why must everyone don the same baby's smock, always with that dull gray check pattern, and the back open with your ass blowing in the breeze? Why can't the nurses manage the flow of patients so that people aren't asked to change until just before the procedure, and why can't we wait in a comfortable lounge, sitting on chairs, reading four-month-old issues of Popular Mechanics, instead of lying in bed among the suffering and the doomed?

When I was hospitalized some years ago following major surgery (long-time readers know this tale, which I never quite finished and suppose I must), after five or six days I convinced the nurses to let me wear doctor's scrubs instead of a backless smock. There was actually no medical reason why I shouldn't, as it turns out. And the cost to the hospital laundry was the same. The result? I felt like a grownup again. I got to wear a shirt, and long pants, just like the big kids. I think it helped me get better.

Sunday, September 10, 2006

The Tree of Life

Not Yggdrasil, but the conventional representation of evolutionary relationships.

For those concerned about me, the results of my examination were entirely reassuring. I will offer my sociological observations anon. Meanwhile, for reasons which will soon become apparent, this seems a good occasion for the next installment on evolution.

The major groups of metazoa -- the multicellular animals -- are all present in the fossil record going back to about 600 million years ago. We presume that animals with similar patterns of embryonic development are related to each other, but we can only conjecture about which may have come first and whether the more complex ones are necessarily descended from the simpler ones.

Sponge Bob is unique among his kind in that he is mobile and has a nervous system. Sponges in general (phylum Porifera) have no specialized organs, and cannot move. They may not really be multicellular organisms at all, but rather a highly evolved form of colonial living by individual cells. But the rest of the metazoa are classified as:

Diploblastic: Having two embryonic cell layers. These are the cnidaria (jellyfish, anemones, and so on) and the comb jellies.

Triploblastic: Having three embryonic cell layers. Triploblastic organisms are thought to be more closely related to each other than to the diploblasts, but they come in several varieties. Those lacking a true internal body cavity (that is, not the gut, but a fluid-filled cavity between the gut and the body wall) are called acoelomates or pseudocoelomates, because the body cavity is called the coelom (from the Greek). In coelomate animals -- those having a true body cavity, and that includes you -- there are two distinct ways in which the body is formed during embryonic development. In the protostomes -- which include the molluscs, annelids (earthworms etc.), arthropods (insects, crustaceans, and what not), and various lesser known phyla -- the first opening which appears in the embryo becomes the mouth. In the deuterostomes -- which include you, and Sponge Bob's friend Patrick the starfish and all the other echinoderms -- that opening becomes the anus, and the mouth forms later. So, it appears that the closest relatives of the vertebrates are that very different looking phylum, the pentagonally symmetrical sea stars, sand dollars, and sea urchins.

You may have noticed that one thing that all of the triploblastic organisms have in common is a gut, a hollow tube running through the body. The food goes in one end, gets serially disassembled and the useful components absorbed, and the waste goes out the other end.

That is a major innovation, and for those contemplating possible options for reincarnation, it's a good reason to request not to come back as a diploblastic jellyfish. (Thanks to Jan Pechenik for this observation.) The food comes in, and the waste goes out, through the same opening. Life without an anus is clearly inferior. Not only is it impossible to take a meal until the last one has been discharged, but movement involves physical distortion of the digestive cavity and expulsion of much of what is in there. So, you can't digest and swim at the same time. Finally, you have to discharge your gametes and embryos through the same opening. Yuck.

So, let's hear it for one of our most important body parts, that gets very little respect.

Friday, September 08, 2006

I know we're only supposed to talk about the ABC 9/11 movie . . .

And I wasn't going to post today anyway, because I haven't eaten in more than 24 hours and when my blood sugar is down, you don't want anything to do with me, but ...

I decided somebody ought to pay attention to other stuff that's going on. The Dems finally pried loose part of the Senate Intelligence Committee report on the "intelligence" leading up to the Iraq war and, well, it says what I knew the second the bullshit came out of Colin Powell's mouth in February 2003. But now it's official. As the AP summarizes:

WASHINGTON (AP) -- There's no evidence Saddam Hussein had a relationship with Abu Musab al-Zarqawi and his al Qaeda associates, according to a Senate report on prewar intelligence on Iraq. Democrats said the report undercuts President Bush's justification for going to war.

The declassified document being released Friday by the Senate Intelligence Committee also explores the role that inaccurate information supplied by the anti-Saddam exile group the Iraqi National Congress had in the march to war.

The report comes at a time that Bush is emphasizing the need to prevail in Iraq to win the war on terrorism while Democrats are seeking to make that policy an issue in the midterm elections.

It discloses for the first time an October 2005 CIA assessment that prior to the war Saddam's government "did not have a relationship, harbor, or turn a blind eye toward Zarqawi and his associates," according to excerpts of the 400-page report provided by Democrats.

Bush and other administration officials have said that the presence of Zarqawi in Iraq before the war was evidence of a connection between Saddam's government and al Qaeda. Zarqawi was killed by a U.S. airstrike in June this year.

White House press secretary Tony Snow played down the report as "nothing new."

Yeah, "nothing new." Then why was it widely reported in the European press before the invasion that Zarqawi had nothing to do with the Iraqi regime? (In fact, he operated in Iraqi Kurdistan, completely outside of Saddam's control.) Why was this publicly available information not reported in the United States? Why did President Cheney go around the country lying about precisely this issue for two years after the invasion of Iraq? Why do 60% of Americans still believe the lies? And why did Pfc. Vincent M. Frassetto, 21, of Toms River, N.J., 1st Battalion, 10th Marine Regiment, 2nd Marine Division, II Marine Expeditionary Force, die Sept. 7 while conducting combat operations in Al Anbar province, Iraq?

Well, I'm off now to get my interior inspected. But I'll tell you what I already know -- there's a cancerous growth on the country.

Thursday, September 07, 2006

Hear, Hear!

Ezekiel Emanuel, in the new JAMA, discusses the medical curriculum, and says exactly -- well almost exactly -- what needs to be said. Unfortunately, it's subscription only, and there's not even an abstract, which annoys me a whole lot because this commentary is important not only to doctors, but to all of us on the other end of the doctoring. Consequently, I'm going to steal liberally. I think it's fair use, but JAMA can sue me if they want to. Emanuel writes:

Today, the fundamental components of medicine go beyond the biomedical sciences to include its humanistic, legal, and management aspects. While science is absolutely essential, especially with greater precision in determining disease etiology through genetics and environmental influences, the limitations in practice increasingly result from systematic problems of implementation. Many of the medical services being delivered are irrelevant or harmful 10,11; much of what has been proven effective is not being routinely delivered to patients. 12–14 Consequently, hundreds of thousands are suffering and even dying prematurely while billions of dollars are wasted. 15 These problems are not the result of a few “bad apples,” but of systematic failures to deliver proven interventions to patients. 6,7,12 To apply 21st-century scientific advances effectively in the care of patients requires more emphasis on the humanistic, legal, and management sciences.

He might have added that many people are unhappy with their communication and relationships with their physicians, as some readers have expressed here -- doctors may do okay at curing, but badly at healing. It is usually seen as an intractable problem that the four-year medical school curriculum is already far too full of science, and far too demanding, so how can we possibly add all this touchy-feely stuff? Emanuel suggests that the solution begins with changing the pre-med requirements:

Why are calculus, organic chemistry, and physics still premed requirements? Mainly to “weed out” students. Surely, it would be better to require challenging courses on topics germane to medical practice, research, or administration to assess the quality of prospective medical students, rather than irrelevant material. 3


As the mere existence of the Hippocratic oath attests, ethical challenges are inherent in medical practice and research. 17 Yet there is no premed ethics requirement. Students need the ability to distinguish ethical issues from communications, economic issues, or aesthetic issues, to make ethical arguments, and to give ethical reasons that justify their decisions. Requiring a general ethics course is preferable to a focused bioethics course, which should wait until students have experience with actual patients and clinical dilemmas.

Moreover, much of the practice of medicine, as well as dealing with a research team and administering organizations, entails dealing with people and, therefore, human psychology. Requiring that students take a psychology course that provides education about established notions of human behavior, such as the fundamental attribution error, hindsight bias, transference, and moral distancing, could enhance physicians' interactions with patients, colleagues, and employees, not to mention their own families.

I would add that medical school admissions should look for people who haven't just studied ethics and psychology, but who are ethical and empathic people. One way to start is by refusing to take anybody right out of college -- spend a couple of years in the real world, and grow up first, find out what kind of person you are, find out how real people live, and maybe show something about yourself. Then apply to medical school.

And then there is the question of how much of that medical school "basic science" is really necessary. In fact, I can tell you without any doubt that medical students forget 80% of it the day after the final exam:

Determining what courses are included and excluded from the curriculum is subject to fierce faculty battles. Each professor has a list of what could safely be eliminated, which is usually someone else's offering. Personally, despite being taught the Krebs cycle (twice during medical school as well as twice in college), I have never used it in my practice or research. 8 My drug prescribing habits tended to be influenced by the handbooks I carried around rather than my pharmacology course. A lot of the pathology and cytology courses had virtually no impact either.

He specifically recommends:

the challenge is to ensure that communications and bioethics education is more systematic and thoughtful. 20 In the first year, expert faculty should provide a formal introduction with a guiding framework. During clinical rotations, there should be repeated explicit instruction about practical applications of this framework as ethical issues arise and good communications can be modeled. In addition, in the fourth year a course should explicitly consolidate students' bioethics and communications learning. The “summing up” bioethics course could combine discussions about actual cases students have experienced in their clerkships with relevant readings; using standardized patients, good communication skills could be reinforced.

Of greater importance are areas that lack explicit LCME requirements, or are amalgamated into more vague requirements and may not be formally taught at all. While having statistics as a premed requirement will ensure that entering medical students grasp the basics, a formal statistics course in the first year would reinforce and apply the knowledge to reading the medical literature, analyzing research data, understanding health services research, and improving quality. Some schools teach statistics but most do not.

Health care now consumes more than $2 trillion per year. Health care is more than knowing what to provide. Reimbursement policies often determine what services are provided, and to whom. Practitioners and physician executives must understand how the financing system is structured, what services are covered by private insurance, Medicare, Medicaid, and other payers, and the incentives for clinicians implicit in reimbursement systems. It seems amazing to graduate physicians who have no idea what Medicare Part B is, the data on how copayments affect the use of medical services, how the resource-based relative value scale is determined and affects reimbursement, and why an aspirin administered in the hospital costs $20.

At the medical school with which I am associated, we used to have a course for first-year students called Patient, Doctor and Society that covered all this stuff -- well, not statistics, but everything else. It was team-taught in small sections by physicians and a person like me who studies medicine from the outside. I taught it for four years. But the school ended up eliminating it. Why? Because it was just too much for the students to be writing seminar papers while they were studying for the anatomy final, and none of this stuff was going to help them pass the boards.

The dean had a very unpleasant musculoskeletal disorder in which he was sitting on his shoulders. We need to fix this.

Wednesday, September 06, 2006

This is probably more information than you needed . . .

Anthropologists, traditionally, are social scientists who go to societies that are strange and exotic to them, and describe what they find for the folks back home. Sociologists are people who talk about where they live. (Yeah yeah, that distinction has blurred over time and I'll probably have the entire American Anthropological Association storming my house with torches and pitchforks, but it's basically how it was.)

So anyway, as a medical sociologist, I am also, as Sy Sperling used to say, "a cloyent." I'm with my late friend and mentor Irving Kenneth Zola, who was a founder of the sociological study of disability, an advocate for the full participation of people with disabilities in the life of the community, a person with disabilities, and a very famous and important medical sociologist. Irv's position was that we should not pretend to be detached, Olympian observers of the reality we study, but rather fully proclaim our involvement, and our point of view. Eschew the passive voice, or, as another teacher of mine, Shulamit Reinhartz, put it, more or less (I forget the exact quote): the construct of the disembodied observer undermines the very foundations of sociology.

So, here's what you don't want to know. I'm scheduled for a colonoscopy on Friday. You can read what the CDC has to say about this here. I have expressed reservations about some screening procedures -- notably prostate cancer screening, particularly using the Prostate Specific Antigen test, and routine mammography, particularly for women in their 40s and early 50s. The intuitive idea that it can't hurt to do something that might detect cancer early is wrong -- a false positive test result can hurt a lot, and the cost of screening everybody to find a few cancers earlier than they might have been detected otherwise might just not be worth it. So these really are personal decisions that ought to be influenced by your individual risk factors as well as your individual fears and aversions. Check out what I have to say about Bayes Theorem and screening tests in general, and a recent post on what people don't understand about screening, if you're really interested.
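The point about false positives can be made concrete with a quick Bayes' theorem calculation. This is just an illustrative sketch: the sensitivity, specificity, and prevalence figures below are hypothetical round numbers, not the actual statistics for any real screening test.

```python
# Bayes' theorem applied to screening: what is the chance you actually
# have the disease, given a positive test result?
# All numbers are hypothetical, chosen only for illustration.

def positive_predictive_value(prevalence, sensitivity, specificity):
    """P(disease | positive test)."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

# A test that is 90% sensitive and 95% specific, in a population
# where 1% actually have the disease:
ppv = positive_predictive_value(prevalence=0.01,
                                sensitivity=0.90,
                                specificity=0.95)
print(f"{ppv:.0%}")  # about 15% -- most positive results are false
```

Even with a pretty accurate test, when the disease is rare most of the positives come from the vastly larger pool of healthy people, which is exactly why screening everybody is not automatically a good idea.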

Colon cancer screening is another matter, however. There is a slight risk of injury from an inexperienced operator, but there really isn't a false positive problem. If lesions are detected during colonoscopy, they are removed as part of the procedure -- which is already good even if they aren't cancerous, because they could become cancerous later -- and they can be examined under a microscope to find out what's really going on. In other words, colonoscopy can actually prevent cancer as well as detect it, and that's a major plus. The test is definitely more expensive than other screening tests, but its cost-effectiveness has been pretty well established. One reason, and again it's counterintuitive, is that there really aren't strong risk factors known for colon cancer. You're at somewhat higher risk if you have a family history, but you aren't off the hook if you don't. If you have inflammatory bowel disease, you're also at higher risk, but in that case the doctors will probably want a look anyway.

Actually, in my case, I'm not even doing this just as a screening test -- it's diagnostic as well: they want to get a look at what appears to be diverticulosis.

So, here is one case where that issue of people without health insurance really screams injustice. This is a procedure that just about nobody who doesn't have good health insurance can afford, or will get. For a percentage of people, that means that their lack of health insurance is a death sentence. How's that for compassionate conservatism and keeping us safe?

Finally, it's unpleasant and it is, well, a pain in the ass. You have to fast and purge, you have to go through something to which men in particular are averse due to what I guess are psycho-sexual hangups, and it really takes the better part of a day out of your much too busy life. But, bottom line [sic], if you're over 50 you probably ought to do it.

The sociological part comes later -- I'll share my observations after it's over.

Tuesday, September 05, 2006

Keeping us Safe

The talking heads are predicting Democratic gains in the fall elections, possibly even capture of a majority in the House. But, as White House Political Director Sara Taylor told Jill Zuckman of the Chicago Tribune: "There's no question that this will be a tough cycle." Still, she added, "I'm confident we'll retain our majorities." When it comes to handling the war on terrorism and guiding the economy, Taylor said, Americans are more likely to trust the GOP. "I think Americans have confidence in the Republican Party and the president to keep them safe," she said.

So, what does it mean to be safe? I guess most people would say, at least in this context, that we're talking about not dying before your time, as they say. So what makes us not safe? The leading causes of death don't really tell the story, because most people die when they are old, and, well, sorry to say, it can't be helped. So the question is not what's likely to carry people off in the end, but what costs the most in total years of life, which gives more weight to things that kill younger people. This is called Years of Potential Life Lost, or YPLL, and in the most basic method, it consists of subtracting the age of decedents from 75 and counting up the total years for each cause of death. (People who die at 75 or older just aren't counted -- they're presumed to be on bonus time.)
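The basic method described above is simple enough to sketch in a few lines of code. The ages here are made up purely to show the arithmetic:

```python
# Years of Potential Life Lost (basic method): each death before the
# cutoff age contributes (cutoff - age at death); deaths at or after
# the cutoff don't count. Ages below are hypothetical.

def ypll(ages_at_death, cutoff=75):
    return sum(cutoff - age for age in ages_at_death if age < cutoff)

deaths = [45, 60, 74, 80, 92]   # made-up ages at death
print(ypll(deaths))             # 30 + 15 + 1 = 46
```

To get the rate CDC reports, you would divide the total by the population and multiply by 100,000.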

There are various complications and adjustments, and different methods can yield slightly different results, but CDC does this exercise every year for important causes of death. You can find it all in Table 30 of the annual report Health United States, but I'm not going to bother with a link, it's in a PDF hundreds of pages long. Anyway, in 2003, the leading causes of YPLL per 100,000 population were actually the leading causes of death -- cancer and heart disease -- because they can strike relatively young people. Heart disease was responsible for 1,187.9 YPLL/100,000, and malignant neoplasms for 1,586.9. Of course cancer is really many different diseases; lung cancer was #1 among all cancers, at 412.2. Unintentional injuries were the third leading cause among the very broad categories, at 1,084.6, with motor vehicle injuries accounting for more than half of that total. Homicide, at 274.3, was pretty low on the list, though ahead of HIV (thanks to antiretroviral medications).

So where is terrorism in this picture? Believe it or not, we can estimate the answer from Table 30. In 2000, the YPLL due to homicide was 266.5; in 2002 and 2003, it was 274 and change. In 2001, with the inclusion of the 9/11 victims, it was 311.0. If we'd had a smooth increase from 2000 to 2002, the number would have been about 270, so as a reasonable guess, the 9/11 attack caused an increment of 41 YPLL/100,000 population, which means it wouldn't even have appeared on the list if it were broken out separately. (I'd need to be able to get the raw numbers from which the table was prepared, plus the ages of all the 9/11 victims, to compute the actual YPLL from the attack. But the visible spike looks like it's probably a decent approximation.)
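The back-of-the-envelope estimate above is easy to reproduce. The figures are the ones quoted from Table 30; treating the no-attack 2001 value as the midpoint of 2000 and 2002 is my own simplifying assumption:

```python
# Homicide YPLL per 100,000, as quoted from Table 30.
ypll_2000 = 266.5
ypll_2002 = 274.0           # "274 and change"
ypll_2001_observed = 311.0  # includes the 9/11 victims

# Assume a smooth increase from 2000 to 2002: interpolate 2001.
expected_2001 = (ypll_2000 + ypll_2002) / 2
increment = ypll_2001_observed - expected_2001
print(increment)  # 40.75 -- roughly the 41 YPLL/100,000 in the text
```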

It's less than half the YPLL due to influenza and pneumonia, 1/4 the total from cirrhosis of the liver, less than 1/6 the total due to diabetes, 1/9 the total due to suicide, and 1/14 the total due to car crashes. And of course, those other causes go on year after year after year. Even if we had terrorist attacks comparable to 9/11/2001 every single year, they would amount to no more than a tiny drop in the bucket of what makes us not safe.

Just something to think about if you're interested in who's going to keep you safe.


The Ten Commandments don't actually contain a broad prohibition against lying -- only the narrower injunction, "You shall not bear false witness against your neighbor." Nevertheless, Christians generally agree that it's a sin to tell a lie. Journalists obey an additional commandment: it's a sin to say that George W. Bush is lying.

Here's Peter Canellos in the Boston Globe, who comes as close as he can without endangering his immortal soul:

The president's campaign mantra this time is the same as in 2004 -- that the Iraq war is the war on terrorism, and that Democrats don't properly understand the terrorist threat.

Most of the recent violence has been along sectarian lines, between Iraqi Shi'ites and Sunnis who are battling for control of the country, but Bush nonetheless identifies the enemy as ``the terrorists," and last week, he reminded voters that Osama bin Laden wants the United States to fail in Iraq.

Bush's argument is that a withdrawal from Iraq -- which would be encouraged by a Democratic senatorial majority -- would signal US weakness, and would embolden terrorist groups.

This depends on the highly debatable notion that groups like Al Qaeda, which have no country and which operate on multiple continents, are deterred by the fear of US retaliation against countries in the Middle East.

Nonetheless, Bush's argument plays well on the stump. It showcases his steely determination and, perhaps, gives voters the misperception that US troops in Iraq are locked in combat with armies of terrorists.

Canellos thinks that this "argument" -- which is actually, you know, a lie, even though he isn't allowed to say so -- is likely to succeed, and will enable the Republicans to hold onto the Senate. But what is a necessary condition for such a lie to be effective? That the newspapers and television and radio stations that transmit it to the public refrain from pointing out that it is, in fact, a lie.

If you simply google the phrase "Bush lies" you will find that innumerable citizens have taken up the challenge to document the White House Occupant's innumerable lies. I didn't have time to thoroughly check them out, but here's one that's pretty well organized and straightforward. So it really doesn't require any serious journalistic enterprise to debunk the administration's lies or to conclude that George W. Bush is, in fact, a habitual and remorseless liar. But they just let him get away with it. Here's what Eric Alterman had to say about this way back in 2002, while the lies added up only to Mount Rainier, not yet Mount Everest.

President Bush is a liar. There, I said it, but most of the mainstream media won't. Liberal pundits Michael Kinsley, Paul Krugman and Richard Cohen have addressed the issue on the Op-Ed pages, but almost all news pages and network broadcasts pretend not to notice. In the one significant effort by a national daily to deal with Bush's consistent pattern of mendacity, the Washington Post's Dana Milbank could not bring himself (or was not allowed) to utter the crucial words. Instead, readers were treated to such complicated linguistic circumlocutions as: Bush's statements represented "embroidering key assertions" and were clearly "dubious, if not wrong." The President's "rhetoric has taken some flights of fancy," he has "taken some liberties," "omitted qualifiers" and "simply outpace[d] the facts." But "Bush lied"? Never.

Ben Bradlee explains, "Even the very best newspapers have never learned how to handle public figures who lie with a straight face. No editor would dare print this version of Nixon's first comments on Watergate for instance. 'The Watergate break-in involved matters of national security, President Nixon told a national TV audience last night, and for that reason he would be unable to comment on the bizarre burglary. That is a lie.'" . . .

Let us note, moreover, that Bradlee's observation, offered in 1997, did not apply to President Clinton. Reporters were positively eager to call Clinton a liar, although his lies were about private matters about which many of us, including many reporters, lie all the time. "I'd like to be able to tell my children, 'You should tell the truth,'" Stuart Taylor Jr. of the National Journal said on Meet the Press. "I'd like to be able to tell them, 'You should respect the President.' And I'd like to be able to tell them both things at the same time." David Gergen, who had worked for both Ronald Reagan and Richard Nixon as well as Clinton and therefore could not claim to be a stranger to official dishonesty, decried what he termed "the deep and searing violation [that] took place when he not only lied to the country, but co-opted his friends and lied to them." Chris Matthews kvetched, "Clinton lies knowing that you know he's lying. It's brutal and it subjugates the person who's being lied to. I resent deeply being constantly lied to." George Will, a frequent apologist for the lies of Reagan and now Bush, went so far as to insist that Clinton's "calculated, sustained lying has involved an extraordinarily corrupting assault on language, which is the uniquely human capacity that makes persuasion, and hence popular government, possible."

George W. Bush does not lie about sex, I suppose--merely about war and peace. Most particularly he has consistently lied about Iraq's nuclear capabilities as well as its missile-delivery capabilities. Take a look at Milbank's gingerly worded page-one October 22 Post story if you doubt me. To cite just two particularly egregious examples, Bush tried to frighten Americans by claiming that Iraq possesses a fleet of unmanned aircraft that could be used "for missions targeting the United States." Previously he insisted that a report by the International Atomic Energy Agency revealed the Iraqis to be "six months away from developing a weapon." Both of these statements are false, but they are working. Nearly three-quarters of Americans surveyed think that Saddam is currently helping Al Qaeda; 71 percent think it is likely he was personally involved in the 9/11 attacks. . . .

Reporters and editors who "protect" their readers and viewers from the truth about Bush's lies are doing the nation--and ultimately George W. Bush--no favors. Take a look at the names at that long black wall on the Mall. Consider the tragic legacy of LBJ's failed presidency. Ask yourself just who is being served when the media allow Bush to lie, repeatedly, with impunity, in order to take the nation into war.

Well, they allowed it, and here we are today. And he's still lying. And they're still allowing it. And his "Christian" followers are as devoted as ever. What a world.