
Wednesday, July 30, 2008

O-oh, Mexico

Oh, Mexico
I never really been but I'd sure like to go
Oh, Mexico
I guess I'll have to go now


Actually I have been, but only to Baja California Norte. Sunday I'm heading to la Ciudad de México for the International AIDS Conference. People tell me it's more political than scientific but I am going to present an itty bitty teeny weeny bit of science, or at least empirical observation, about the ways doctors talk to patients.

More on that anon, but for now I just wanted to let y'all know. I'll do my best to blog regularly from the conference, which is usually more than slightly raucous, and also to give my impressions of Mexico City, about which the same is said. I have the advantages of speaking the language and having a native guide or two, so I hope to offer something more than la perspectiva de una turista.

Let me know if there's anything in particular you'd like me to watch out for. I'm having a Carta Blanca as I write to get warmed up.

Tuesday, July 29, 2008

One of my DFH characteristics

I have not consumed the flesh of tetrapods since I was in college. It is not, however, because I believe it is morally wrong in principle for humans to eat meat. Biologically, we're omnivores, and animals eat each other. That's nature. No, it's because of the way meat is produced in our society. Here is an important report that has gotten far too little attention, called "Putting Meat on the Table: Industrial Farm Animal Production in America." It was produced through a grant from the Pew Charitable Trusts, which created the Pew Commission on Industrial Farm Animal Production. Here are some brief excerpts from the introduction. I wish I could present more, but do read the whole thing.

Over the past 50 years, the production of farm animals for food has shifted from the traditional, extensive, decentralized family farm system to a more concentrated system with fewer producers, in which large numbers of animals are confined in enormous operations. While we are raising approximately the same number of swine as we did in 1950, for example, we are doing so on significantly fewer, far larger farms, with dramatically fewer farm workers. This production model—sometimes called industrial farm animal production—is characterized by confining large numbers of animals of the same species in relatively small areas, generally in enclosed facilities that restrict movement. In many cases, the waste produced by the animals is eliminated through liquid systems and stored in open pit lagoons. . . .

This transformation, and the associated social, economic, environmental, and public health problems engendered by it, have gone virtually unnoticed by many American citizens. Not long ago, the bulk of the fruit, grain, vegetables, meat, and dairy products consumed by the American people were produced on small family farms. These farms once defined both the physical and the social character of the US countryside. However, the steady urbanization of the US population has resulted in an American populace that is increasingly disassociated from the production system that supplies its food. Despite the dramatic decline in family farms over the past 50 years, many Americans, until very recently, continued to think that their food still came from these small farms.

While increasing the speed of production, the intensive confinement production system creates a number of problems. These include contributing to the increase in the pool of antibiotic-resistant bacteria because of the overuse of antibiotics; air quality problems; the contamination of rivers, streams, and coastal waters with concentrated animal waste; animal welfare problems, mainly as a result of the extremely close quarters in which the animals are housed; and significant shifts in the social structure and economy of many farming regions throughout the country.


There is also a substantial environmental justice component to this problem. In the latest issue of the American Journal of Public Health (subscription only), Steve Wing and colleagues tell us that "Industrial hog operations in North Carolina are disproportionately located in communities of low income people and people of color, where inadequate housing, poor nutrition, lack of access to medical care, and simultaneous exposure to other environmental and occupational hazards may exacerbate their impact."

Other negative externalities of industrial meat production stem from its immense resource intensity. Eight times as much grain is ultimately consumed when people eat meat as when they eat the grain themselves. This grain is grown using diesel-powered tractors and synthetic fertilizers, then trucked from cornfield to hog farm, driving up the cost of both vegetable foods and fossil fuels, and polluting the atmosphere. In rural China, people once kept pigs and fed them field and kitchen waste. Today, more and more, they eat pork from factory "farms," and that is an important reason for the rising cost of food and fuel in the world. In the U.S., dairy farmers used to keep their cows in pastures and supplement the forage with corn they grew themselves, fertilized with their own cows' manure. (That is still done, by the way, in Windham County.) Today, more and more, dairy "farms" are vast industrial operations that keep their cows in pens, feed them grain trucked in from elsewhere, and store their waste in lagoons from which it is eventually discharged into surface water.

It isn't morally wrong for people to eat animals that have been permitted a normal life, but factory farms, in my view, are inhumane. I have discussed the problem of antibiotic resistance resulting from the practice of routinely putting antibiotics in animal feed, which is necessary because of the crowded and unsanitary conditions in which animals are kept. The social, environmental, and moral costs of this industry constitute an atrocity.

Don't patronize it. If you are going to eat meat, find out where it comes from. It won't be easy to find ethical sources, believe me. The easiest and best thing to do is just stop.

Monday, July 28, 2008

Nanny state, or just common sense?

I have long been a proponent of banning synthetic trans fats from restaurant food. As you probably know, California has now taken this step.

I have found myself getting into squabbles with self-styled libertarians on this issue, who protest that it is their "right" to decide what they want to eat. There is, however, a simple and obvious reason why the libertarian argument fails. When you eat in a restaurant, you have absolutely no idea what's in the food. You aren't making choices or exercising rights, you're just eating whatever the Board of Health says is legal. I don't hear those same libertarians insisting on their "right" to eat cockroach droppings, salmonella, or the leftovers from the last two people who ordered the same thing and didn't finish it.

Rather than supporting a ban on trans fats in groceries, I support labeling -- which has worked quite well, actually. Most food manufacturers don't want to have to put on their labels that their products contain a deadly ingredient. Public awareness of the dangers of trans fats is high enough, and the cost of eliminating them low enough, that it has turned out to be worthwhile for the big companies to get rid of them. They put a big red heart and "0 grams trans fat" on the front of the package. They're proud of it. And the potato chips -- which you still really shouldn't eat anyway -- taste just fine. Requiring food labeling doesn't take away your liberty, it enhances it. Knowledge is power.

Now, you could argue that restaurant food could be dealt with in the same way -- require labeling, whether it be disclosure on menus or a big sign in front or something. But there are important objections to that. First of all, we don't have the same flexibility and choice in restaurant patronage that we do in the supermarket. There just aren't all that many restaurants, and in our neighborhood there might only be a couple in our price range.

Second, it's not very cost effective. Restaurants would have to update their signage every time they changed their recipes.

Third, it's impossible to enforce. Health inspectors can look at the cleanliness of the kitchen and the temperature of the refrigerators, but they can't possibly test all the oils for cis- or trans-hydrogenation. Banning trans fats, however, can be enforced at the level of the distributor.

So this is a case where a hard regulation actually increases freedom. The logic is that most people -- and the polls say so -- don't want trans fats in their food. But they can't achieve that without the regulation. If there is a minority out there that prefers to eat poison, even though they can't tell the difference in the taste and the price difference is negligible, their "right" -- which has no evident rational basis -- is simply inferior to my right not to be killed.

Just one more reason why libertarianism is a very confused philosophy.

Friday, July 25, 2008

Time and Tide

Colin Farrelly's comments on the previous post (and please read if you haven't already) require us to think about some of the most fundamental questions in public health and health care ethics.

First let me get some administrative business out of the way. One of the questions I raised was about priorities. I never said we should not invest in research into slowing the aging process, nor do I see any contradiction between doing such research and simultaneously tackling other urgent problems. After all, we're spending $10 billion a month in Iraq and squandering even larger sums on building stealth bombers and what not.

But that's the problem. I would expect that anti-aging research has a relatively politically powerful constituency, among affluent people in affluent countries. My voice will probably continue to cry in the wilderness about other issues.

A couple of other quick points. Some of my physician friends are less optimistic than Dr. Farrelly about the prospects for effective anti-aging therapy any time soon. I have no expertise to judge that question. I will note that recently, we have made substantial progress in understanding the biology of what I might call endogenous (as opposed to infectious) disease processes, such as cancer, auto-immune disorders, and obesity. However, so far, progress in translating this understanding into effective therapeutics has been disappointing. The processes turn out to have layers of complexity that make intervention difficult and often counterproductive. Ironically, for example, while cancer is associated with aging, anti-aging therapies might perversely greatly increase the risk of cancer. So it remains to be seen how soon these speculations will confront reality. That's not an argument against continuing research, obviously, just an observation.

Now for the deep stuff. There are two possible scenarios for an anti-aging intervention, as Dr. Farrelly suggests. One is that it compresses the morbidity associated with aging -- that it means we go along through our 70s and perhaps beyond without the woes that have afflicted the aged since our ancestors crawled out of the ocean. (I'm assuming that fish experience aging differently than do tetrapods. I had to ground the figure somewhere in time.) Then we suddenly fall apart like the Wonderful One-hoss Shay, and go peacefully. It's hard to argue with that.

Possibility two is that it just pushes the whole thing back a few years. When you're 70, you have the physiology of a typical 63-year-old. At 80, you're like a 73-year-old, and so on. Now this is a different matter altogether. We get some more good years out of the deal, but then all is as before. Here we must reflect on the lifespan as a fundamental fact. Should the goal of medicine be to extend it? Most of us want more life, of course, as a selfish proposition, but there are tradeoffs for society.

Assuming that the earth has a carrying capacity, that resources are limited and that there are costs to a larger human population, if we live longer, we're taking something from our children. In addition to consuming resources, we're standing in their way. If we keep working through our seventies and beyond, we're clogging up the pipeline for the most desirable jobs, for leadership roles, for status. Human society is designed around the fact of aging and death. If we radically change the relevant parameters, we change society. Of course those young people will eventually get that promotion, or academic appointment, but without the energy and idealism of youth. The time of family formation and child rearing will be one of lesser affluence and less satisfying and enriching employment.

How would all this affect the culture, the general mood? Politics? And even the health of the young?

I am not one to invoke mystical ideas about "nature." Unlike some who would attack the idea of conquering age from positions that are equally likely to be labeled right or left, I believe that human nature is what we make of it, not some sacred and immutable quality. If we can be smarter, healthier, happier, let's go for it. But there is the law of unintended consequences. Fool around with one thing, and something else will bite you on the ass. So I'm just saying, let's think this thing through. And don't forget -- there's no telling what numbers we're talking about here. If we could delay aging by 7 years, why couldn't we delay it by 20? Or more? What then?

Thursday, July 24, 2008

The best is yet to be?

Two essays in the latest BMJ* argue along rather similar lines that it is time for a substantial investment in biomedical research into slowing the aging process in humans. (Colin Farrelly authors one of them; a multitude led by Robert Butler of the International Longevity Center presents the other.)

Here is the problem, in a nutshell. Current efforts to combat the various diseases associated with aging, from cancer to heart disease to osteoporosis to Alzheimer's, can have, at best, a limited payoff. However well you succeed in saving an old person from death from cancer, you won't gain a great deal in disability-free lifespan, because by the time we hit our 70s, comorbidity is the rule. The person also has arthritis, heart disease, cognitive decline. One way to think of it is that these are not diseases of aging, they are aging.

Aging is a global process affecting the organism -- metabolic level, DNA repair, tissue repair, all decline. The immune system becomes less effective. The brain shrinks. Muscles atrophy. We can slow some of these effects to some extent by maintaining a good diet, physical activity, and mental engagement, but we can't stop them. These authors argue that we are getting close enough to understanding the basic biology of aging that the possibility is not far off of a pill that will slow it down. As Farrelly puts it, "If we succeed in slowing ageing [sic -- British spelling] by seven years, the age specific risk of death, frailty and disability will be reduced by about half at every age. People who reach the age of 50 in the future would have the health profile and disease risk of today's 43 year old."
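Farrelly's arithmetic rests on the familiar regularity that adult mortality risk roughly doubles every seven to eight years of age. Here's a rough back-of-the-envelope sketch of why a seven-year delay would then cut age-specific risk roughly in half; the Gompertz-style assumption and the illustrative numbers are mine, not anything taken from the BMJ essays.

```python
# Rough Gompertz-style sketch (my assumption, not from the BMJ essays):
# adult risk of death grows exponentially with age, doubling every ~7.5 years.
DOUBLING_TIME = 7.5          # years for mortality risk to double (approximate)
BASE_HAZARD_AT_50 = 0.004    # illustrative annual risk of death at age 50

def hazard(age, delay=0.0):
    """Annual mortality risk at a given age, with aging delayed by `delay` years."""
    effective_age = age - delay
    return BASE_HAZARD_AT_50 * 2 ** ((effective_age - 50) / DOUBLING_TIME)

for age in (50, 60, 70, 80):
    normal = hazard(age)
    delayed = hazard(age, delay=7)
    print(f"age {age}: {normal:.4f} vs {delayed:.4f} (ratio {delayed / normal:.2f})")
# The ratio comes out around 0.52 at every age: a seven-year delay
# roughly halves age-specific risk, which is Farrelly's point.
```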

So, consider cancer. A man's risk of being diagnosed with cancer in the next 20 years is 21.4% at age 50 and 34.5% at age 60. Cancer cost $219 billion in the U.S. in 2007, including $89 billion in direct medical costs. So investing even a few billion dollars in anti-aging research, if it reduced the age-adjusted risk of cancer by even a few percent, as the population ages, would seem an extremely cost-effective investment. And how many of us wouldn't want to feel seven years younger?

However, it seems to me that this is not quite such a clear cut ethical question as it might seem. First of all, as I keep repeating, the situation is a lot different in much of the world, where the ills of aging are the least of people's worries, and children are lucky to make it past the age of five. The scourges of infectious and parasitic disease continue to burden children and adults, shortening lives, sapping energy and productivity and leaving whole communities mired in poverty. Of course, money spent on anti-aging research doesn't have to be taken from money that would otherwise be spent on combating tropical diseases and other afflictions of the poor, but as advocates, what should be our priorities?

Second, of course, delay is not elimination. Even if it gets us seven years later, it's still gonna get us. Perhaps it will be a better world if rich Europeans and Americans live to be 90 instead of 83, and Tom Watson is still finishing in the top ten on the seniors tour when he's 75. I don't know. But think again -- it isn't actually going to save us any money. After seven years, people will have caught up to their previously expected biological age and we'll be back on track with the same levels of disease and disability. We would have been able to bank some dough in the meantime -- savings from Medicare, basically -- but will that actually happen?

In order for this to work, we'll have to re-engineer the life course socially as well as biologically. The expectation that at around age 65, most people will leave the workforce and take up golf, crocheting, and martini drinking on the veranda full-time on the public dole is not going to work -- particularly if they're going to do it for 25 or 30 years.

What might a longer, healthier old age look like, that would truly benefit the old, and the rest of society? Is it really possible? Is it worth pursuing? How would you use a few extra years?

*Subscription only, which normally disqualifies a reference here, but this seemed important enough to violate the rule.

Tuesday, July 22, 2008

More LDVD

That stands for the Fristian practice of long distance video diagnosis. The observation that John McCain has said some odd things lately is no longer confined to underemployed bloggers. Here CBS notes some (not all) of the recent "gaffes" such as substituting Iraq for Afghanistan, Somalia for Sudan, Russia for Germany, Czechoslovakia for the Czech Republic, Al Qaeda for Shiite "terrorists," and the Pittsburgh Steelers for the Green Bay Packers. I will also note that he recently was unable to remember Mitt Romney's name.

Now, it is obvious that these so-called "gaffes" all have something in common -- he's having trouble with proper names. I'm quite sure that Sen. McCain knows the difference between Russia and Germany, and Iraq and Afghanistan, but his brain is slipping a cog when it goes to access the label. I can't swear that he knows Sunni from Shiite but I'm sure he'd been briefed on it shortly before the news conference with Joe Lieberman in which he committed the error. (The Packers/Steelers thing could be a case of dishonest pandering, of course, but it's sort of hard to believe he thought he could get away with it. A cognitive slip could at least have contributed to the initial error, although a more honest man would have corrected it.)

This is not, as I have said before, a symptom of early stage Alzheimer's disease, which is characterized by failures of short-term memory integration, and looks very different. We all make such errors occasionally. Proper names are actually the most difficult kind of word for the brain to file and retrieve. Who hasn't had the horrible experience of not being able to remember the name of someone you have known fairly well? Still, it does seem to be happening to McCain a lot, and mostly in a fairly specific semantic category, which is actually a bit more suggestive of a real problem than would be a general occasional error in word choice. That he has to labor to read aloud is also suggestive of a problem in generating spoken language.

If this is more than just a few senior moments, the most likely diagnosis would be called primary progressive aphasia, which is a form of frontotemporal dementia. This poorly understood process is usually characterized only by language difficulties in the first two years or so, and often for longer, without other kinds of impairment, although the language difficulties tend to grow more profound and start to extend to comprehension as well as language production. Eventually, the disability spreads to other cognitive functions, affecting memory and personality. It may seem inappropriate to mention this, and perhaps it is. The evidence is certainly weak; as I say, people can make such mistakes without it being a sign of anything seriously wrong. But this man is running for president of the United States.

Finally, full disclosure: I'm a social scientist with a subspecialty in sociolinguistics, not a neurologist. But I have studied the sociology of brain injury, and my father has this precise disease -- now in a very advanced stage. So I have some idea what I'm talking about. McCain's age is not a big issue for me, so long as he's healthy. (Obviously, I don't think he is qualified to be president on other grounds, and I intend to vote for Obama.) But he does have a responsibility to make sure that he is fit for the job. I certainly hope that he has had his physicians perform an in-depth neurological assessment, and recently.

Monday, July 21, 2008

Speech Act Theory and Political Discourse

One of the distinctions we make in our method for discourse analysis is distinguishing between what we call "representatives" and "expressives." A representative is the kind of speech act that has traditionally been the most salient concern of linguistic philosophers,* that is, an assertion which can be assigned a truth value based on a method of verification available to a set of beings with requisite sensory apparatus, logical processing capability, and opportunity for observation -- normally meaning cognitively and sensorily normal humans. Formally, positivists say that the meaning of a statement is equivalent to the means by which it can be verified.

Most utterances in natural speech do not have a truth value, even in principle -- they are not subject to verification. These include commissives (promises, offers, commitments), directives (orders, recommendations, encouragement, convincing, etc.), questions (which are a special kind of directive), social ritual (please and thank you and so on), and jokes. (Is it true or false that the chicken crossed the road to get to the other side?)

Expressives (thanks Mike for noticing the brain fart) are somewhere in between. They are assertions about the state of the speaker's wetware -- what the speaker feels, wishes, believes. We can form an opinion about whether the speaker is being truthful, of course, and there may be strong grounds for deciding this. Most neuroscientists will tell you that such mental states correspond to physical processes in the brain and may even hope that one day, they will be able to measure them directly, in other words read minds.

Nevertheless, even if our individual mental states could one day be brought into the intersubjective realm in the sense that assertions about them are verifiable, they will still never be intersubjective in the same sense as "the earth goes around the sun." There are two main reasons for this. The first is the problem of consciousness. Our consciousness is inaccessible to others, even if the neurological processes that generate it are not. It is our own experience, and can never, so far as we know, be anyone else's.

More important, however, facts about mental states are not the same as facts about the world out there. It may be verifiably true that you believe that human life begins at the moment the gametes fuse to form a zygote, and that ergo abortion is murder, but it is nevertheless not verifiably true that abortion is murder. All I have to do is feel otherwise, and that's the end of it. The fact that you believe it has a truth value, but the assertion itself does not.

Unfortunately, many people can't tell the difference between these kinds of assertions. Is removing U.S. troops from Iraq "defeat"? If it is "defeat," is that unacceptable for some reason? Do people have a fundamental right to keep all of their property? Should we avoid causing pain to non-human species? Are there certain factual assertions that should be outlawed, such as "the Nazis did not try to exterminate the Jews of Europe"? There are verifiable matters of fact that enter into such arguments, but ultimately they cannot be decided without reference to expressive, rather than representative, utterances. In order to have a constructive discussion, it is essential to distinguish rigorously between the two kinds of assertions and their respective places in the structure of an argument. Unfortunately, people often fail to do this and wind up arguing in separate universes.
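For the concretely minded, here is a toy sketch of how a coding scheme built on this taxonomy might be represented. The category names come from the discussion above; the code and the sample utterances are purely hypothetical illustrations, not our actual coding instrument.

```python
from enum import Enum

class SpeechAct(Enum):
    """Toy version of a speech-act taxonomy; categories taken from the post above."""
    REPRESENTATIVE = "assertion about the world, verifiable in principle"
    EXPRESSIVE = "assertion about the speaker's own feelings, wishes, beliefs"
    COMMISSIVE = "promise, offer, commitment"
    DIRECTIVE = "order, recommendation, encouragement"
    QUESTION = "request for information (a special kind of directive)"
    SOCIAL_RITUAL = "please, thank you, greetings, and so on"

# Hypothetical coded utterances from a clinical visit -- for illustration only.
coded_utterances = [
    ("Your blood pressure is 150 over 95.",         SpeechAct.REPRESENTATIVE),
    ("I'm worried about how you're doing.",         SpeechAct.EXPRESSIVE),
    ("I'll call you when the lab results come in.", SpeechAct.COMMISSIVE),
    ("Try to cut back on salt.",                    SpeechAct.DIRECTIVE),
    ("Are you still taking the lisinopril?",        SpeechAct.QUESTION),
    ("Thanks for coming in today.",                 SpeechAct.SOCIAL_RITUAL),
]

for text, act in coded_utterances:
    print(f"{act.name:15} {text}")
```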



*Okay, that's a broad generality. I haven't done a formal survey, but this is my impression.

Friday, July 18, 2008

End Times?

Remember this?

"Some seem to believe that we should negotiate with the terrorists and radicals, as if some ingenious argument will persuade them they have been wrong all along," Bush told the Israeli parliament.

"We have heard this foolish delusion before.

"We have an obligation to call this what it is -- the false comfort of appeasement, which has been repeatedly discredited by history," he said, drawing parallels with the 1930s capitulation to the Nazis. . . .

McCain, in a conference call with bloggers, used similarly emotional language as Bush and repeatedly accused Obama of being naive in his willingness to negotiate with the Iranian leadership. "If Senator Obama wants to sit down across the table from a country that calls Israel a stinking corpse and [whose president] comes to New York and says he wants to wipe Israel off the map, what is it that he wants to talk about with them?" McCain asked.


That was May 15 -- just two months ago. Today, we read this:

Iran's foreign minister today praised the attendance of a U.S. diplomat at this weekend's nuclear talks as "a new positive approach" and suggested that additional steps toward reconciliation could soon take place. . . .

The U.S. has shifted from its confrontational policy of isolating Iran in favor of a diplomatic approach. The administration is also floating a proposal to open a de facto U.S. Embassy in Tehran. U.S. diplomats would go to Iran for the first time in almost 30 years, since the countries broke relations after the 1979 Islamic revolution.


I suppose the equivalent of Kremlinology in this context would be WhiteHouseology? Anyway, I'm no expert at it. This is a puzzling development. Unless it's a feint intended to provide a fig leaf for the rumored attack on Iran ("We tried making nice, we tried to negotiate, but it just didn't work.") it would seem to signal the collapse of the neocon contingent within the administration. That it so clearly and dramatically undercuts the McCain candidacy makes it all the more astonishing.

Anybody who isn't paranoid these days is nuts, so I'm not taking all this at face value. But one way or another, something interesting is going on.

Thursday, July 17, 2008

If you don't want to take it from me . . .

Take it from David Himmelstein:

Health Care for America Now (HCAN) is pushing a superficially attractive health reform model that has a long record of failure – akin to prescribing a placebo for a serious illness when effective treatment is available. They would offer Americans a new public insurance plan and a menu of private ones, with subsidies for coverage for low income families.

This approach reprises the format of Medicare’s ongoing privatization. Despite promises of strict regulation and a level playing field that would allow the public plan to flourish, private insurers would (as they have done in Medicare) predictably overwhelm regulatory efforts through crafty schemes to selectively recruit profitable, lower-cost patients, and avoid the expensively ill. Like the Medicare Advantage program, originally touted as a market-based strategy to improve Medicare’s efficiency, the HCAN plan would evolve into a multibillion dollar subsidy for private insurers whose massive financial power (amassed largely at government expense) would prove a political roadblock to terminating the failed experiment.


Etc. etc. Remind you of anything somebody wrote yesterday?

We need universal, comprehensive, single payer national health care. That's what to advocate for, that's what to yell for. The time to compromise, if at all, is at the end of the struggle, not the beginning. Selling out to what seems politically feasible, before the battle is even joined, means you won't even get that.

Universal, comprehensive, single payer national health care.

Say it again.

Wednesday, July 16, 2008

A half-good example?

Props to our friend Ana -- what's not to love about a country that gave us fine chocolate, holey cheese, precision watches, leather pants, giant wooden trumpets, yodeling, sliding down mountainsides on various contrivances, a treacly shipwrecked family that inspired an iconic 60s sci-fi show, resolute neutrality, secret bank accounts for obscenely wealthy tax evaders, and Vatican guards in clown costumes?

But in spite of all these marvelous innovations, I do have a bone to pick with the land of melting ski resorts, and that is their health care system, to which many an American wonk, including some of my best friends here in Massachusetts and the presumptive Democratic nominee for president, is looking for inspiration. The basic idea is, maybe it isn't perfect, but a) it's better than what we've got now and b) it just might be politically feasible. Yes, somewhat, as to point (a), but I've got a problem with (b).

Ana can clean this up if any of it is imprecise, but the basic description of the system is that there is a universal, individual mandate to purchase a basic health insurance product, covering a specified set of services, from any of a number of competing private insurers. The insurance companies are required to offer the same prices regardless of people's health status or other factors that might predict their risk of consuming services; in other words, they must use community rating. (On a canton-by-canton rather than national basis, I believe.) They can charge more based on sex (women get pregnant) and age, to some extent, although the age differential is regulated. People can choose to make tradeoffs among premium costs, co-payments and deductibles, but in general out-of-pocket costs are high. Finally, there are subsidies so that nobody pays more than 10% of their income for insurance.
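To make that last rule concrete, here's a minimal sketch of the cap-at-10%-of-income idea; the flat rule and the numbers are my own invention for illustration, since the actual Swiss subsidy formulas vary by canton and are considerably more complicated.

```python
def subsidy(annual_premium, annual_income, cap_share=0.10):
    """Toy Swiss-style premium subsidy: the household never pays more than
    cap_share of its income for the mandated basic policy.
    Numbers and the flat rule are made up for illustration only."""
    max_out_of_pocket = cap_share * annual_income
    return max(0.0, annual_premium - max_out_of_pocket)

print(subsidy(annual_premium=4200, annual_income=30000))  # 1200.0 -- partly subsidized
print(subsidy(annual_premium=4200, annual_income=60000))  # 0.0 -- pays the full premium
```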

The Swiss spend less than the U.S. on health care, although they spend the most of any country in Europe, indeed the second most of any country in the world. And just about everybody is insured. So it would seem like a step forward for the U.S., right? Proponents of the system call it "consumer driven," and claim that the reason for the lower costs is that with high co-payments and deductibles, people shop around for good deals and don't consume services they don't really need.

Sadly, no. Swiss consumers have almost no information about the comparative cost and quality of competing providers, and like health care consumers everywhere, they pretty much depend on their doctors to tell them what they need. The reason why costs are lower is that prices are regulated. Insurers negotiate fee schedules with doctors, which must be approved by the canton. Drug prices are federally controlled. There is also a set of mandated guidelines for cost-effective services, and people have to buy supplementary insurance to get services beyond the basic benefits package. Finally, the insurance companies aren't allowed to make a profit on the basic plans.

Now, you know as well as I do that if we were to introduce such a universal mandate system here in the U.S., the drug companies would make sure there were no price controls on drugs; the AMA would make sure there were no controls on physician incomes; the insurance companies would make sure they could hold on to 100% of their current profits; the health care industry as a whole would yell and scream about "rationing" and we'd have the same squandering of resources on $10,000 cancer treatments that give people 2 1/2 weeks of added survival that we do now. Subsidies for low income people would be insufficient, insurers would wiggle out of the community rating requirements and find ways to cherry pick healthy consumers, people in their 50s and early 60s would be forced to buy insurance they couldn't afford, that didn't offer them any real benefits because the deductibles and co-pays would be so high, and the whole thing would just end up being a massive infusion of even more money into the pharmaceutical industry and high-tech medical specialties.

Alright, call me a cynic. Just you watch.

Tuesday, July 15, 2008

Credit for Honesty Department

As I was waiting for the subway this morning the following announcement came over the loudspeaker:

Red Line train service is running normally. We apologize for the inconvenience.


Apparently they meant to apologize for an earlier service disruption, but it was just fine the way it came out. The point, for this blog, is that there's one policy decision the Obama administration and the new Congress can collaborate on, along with all of our Democratic governors and legislatures, that will make this nation better and stronger in a whole lot of ways at once, and that's to take a couple of hundred billion dollars a year out of the military budget and invest it in mass transit.

Sorry to tarnish your legacy, Mr. Ford -- noting in passing that it's already bad enough that you were a Nazi, but we aren't supposed to mention that -- but cars do a whole lot of evil at once. We run them on fuel that we obtain by sending trillions of dollars to politically unstable regions of the world, followed by spending trillions of dollars more to invade and occupy said regions because we're worried that they won't let U.S. based companies reap a big chunk of the profits. Both halves of that proposition are bankrupting the country, which means you won't get Medicare when you retire if we don't fix it, and quick.

Second, they spew CO2 into the atmosphere and, well, you know where that's going.

Third, although they don't tell you this and nobody seems terribly upset by it, they spew oxides of nitrogen, ultrafine hydrocarbon particles and other pollutants into the air, which people then breathe, and which cause asthma, lung cancer, and heart disease, thereby making millions of people sick and shortening their lives. More than 42,000 Americans die every year in motor vehicle crashes as well (which happened to my uncle, widowing my aunt and orphaning my cousins). Motor vehicle injuries result in about half a million hospitalizations every year and cost $150 billion.

Cars cause residential development patterns in which housing is built in sprawling tracts far from work, stores, and cultural amenities. This not only causes people to do a lot of driving, incurring all of the above costs; it also causes them to spend time driving when they could be doing something else. Something else includes walking to those destinations, which would make them healthier, and particularly less obese. Furthermore, it separates the housing from community gathering places, thereby fostering isolation and alienation.

Mass transit means less imported oil, less pollution, more safety, and denser mixed-use development around transit stations. For example, I live in Jamaica Plain, which is within the city limits of Boston but which is functionally a classic streetcar suburb, with its own Main Street district of retail stores and restaurants, including music venues, and easy access to downtown jobs via mass transit, which would be even more popular if we invested enough in the subway system to make it more reliable and less crowded. I usually don't even start my vehicle during the week; I only use it when I need to get out of town. People in the burbs, without access to mass transit, may spend two hours or more in their cars every day.

This really is a miracle cure, folks. It kills every predatory bird that threatens us with one shot: global warming, foreign oil, foreign wars, national bankruptcy, air pollution, traumatic injury and death, even bowling alone. Automobiles are the leading cause of every single one of these problems and mass transit is a magic bullet. Yep, it really is that good. But it's way down the list of public priorities.

Monday, July 14, 2008

Pure Drivel

Frist! Not! Unlike the former Senator, I'm not a long-distance video diagnostician. Nevertheless, people importune me to opine on whether Sen. McCain is displaying early symptoms of dementia. Alright, enough, I will pronounce judgment.

No, I don't think so. The cognitive limitations Sen. McCain displays do not appear consistent with early stage Alzheimer's disease or frontotemporal dementia. The first manifestation of Alzheimer's is normally impaired short-term memory integration -- the person will tell the same story twice in two minutes, or forget what you just said, that sort of thing. I haven't observed McCain doing this. McCain's inability to keep Shia and Sunni straight, or to read a teleprompter, is a symptom of a different problem, to wit, he is a dolt. He finished next to last in his class at the Naval Academy, and it's a fair surmise that he would not have graduated at all if his father hadn't been a prominent Navy officer. He has always proclaimed that he doesn't understand economics, so it's certainly no wonder that he doesn't understand foreign policy either. He has trouble reading a teleprompter because he has trouble reading, period.

As for saying that he gave his Vietnamese captors the names of the Pittsburgh Steelers defensive line, when he had always said previously that it was the Green Bay Packers offensive line, that's not a sign of dementia either. Only as dementia becomes more advanced does the memory for events in the distant past start to fade. The reason McCain swapped lines is not because he's demented, it's because he's a liar. Of course, he'd have to be pretty stupid to think he could get away with claiming it was the Steel Curtain just because he was in Pittsburgh, when anybody could look it up. Is he really that stupid? Actually a more credible theory is that he became easily confused because he'd made up the whole story in the first place, in other words he never did the Green Bay offensive line trick either.

So, demented? No. A lying doofus? Absolutely.

Sardonic wit: Yes, yes, obviously the point of the New Yorker cover is to satirize the wingnut characterization of the Obamas. Or rather, it's obvious to you and me, but that's not the problem. I used to have a habit of trying to create sardonic humor by saying outrageous things on the presumption that the people listening knew me well enough to know that I was mocking people who would say such things. Alas, I learned to my sorrow that people didn't necessarily get that right away, and even if they did, they often didn't like it anyway.

The "controversy" over the cover is going to be a perfect excuse for Fox News and CNN to display it 24 hours a day for two weeks, not to New Yorker subscribers but to tens of millions of people who will not get the double layer of irony, accompanied by a panel stacked heavily with wingnuts who will -- surprise! -- fail to correctly construe that it is a devastating satire of them and who will make endless remarks to the effect that, regrettably, the American people still feel that they don't really know Obama and they aren't convinced of his loyalty or his resoluteness in confronting America's dangerous enemies, and here we have an illustration of how these doubts and concerns are out there, oh tut tut it's certainly offensive but yadda yadda yadda.

Profound, inexcusable error in judgment.

Wonkier, more on-topic stuff to come.

Friday, July 11, 2008

The Time of Peril

I've commented more than once, I believe, that it can be difficult to bring myself to write about the many specific and narrow concerns of public health and health care policy and practice when such large and disconcerting dangers loom. Another Great Depression, or just the collapse of the health care system (such as it is) here in the U.S., or continued growth in inequality and the impoverishment of more and more millions, widespread global famine, water shortages, ecological catastrophe -- I could continue but the point is, any such events would overwhelm the daily concerns I write about here.

It struck me this morning that I spent more than half of my life under the shadow of Mutual Assured Destruction and the entirely plausible prospect of a civilization-ending nuclear war. And yet the world somehow feels even less secure now, even though that unthinkable horror has receded (though it hasn't disappeared). I would say there are a few major reasons why the present feels so critical, and so uncertain.

One is the complexity of so many interlinked problems. Remember that the Great Depression began with the collapse of a financial bubble, specifically the U.S. stock market. Now we have another spectacular financial collapse, but it is compounded by the petroleum shortage, an exogenous event that I am fairly well convinced has no reasonable prospect of amelioration for many years. This crisis is just deepened by what I consider to be deliberate and mendacious denial. OPEC claims there is plenty of oil supply and that extraction will reach 113 million barrels a day by 2030. I happen to think they are flat out lying. Here's one of innumerable very well informed people who agree with me. But the persistence of conventional wisdom that the petroleum peak is still somewhere over the misty horizon of time (not that 15 or 20 years is a lifetime, but it's still long enough to procrastinate) lets political and cultural leaders off the hook for actually doing anything meaningful about it.

And there you'll find the biggest monster in my closet of anxieties. It is the utter failure of our institutions -- government, corporate news media, business leaders, opinion leaders, politicians -- to face the facts. We aren't having a serious discussion of the state of the world. We're in the middle of a contentious presidential campaign in which there are some issues in play but neither candidate is willing to do any more than nibble around the edges of what is at stake. Obama knows perfectly well that the American people, and most certainly the corporate media, are not willing to hear the truth and would destroy him if he tried to say it. McCain, however, is not up to the job and probably has little idea of what's really going on in the world.

Even Richard Nixon knew damn well that World War III was not an option, and Ronald Reagan, an ideologically deluded and ultimately demented fool, was sufficiently in touch with reality, at least on alternate Tuesdays, that he knew he had to make sure it never happened and took meaningful steps toward that goal. Right now, however, the reality based community is marginalized. The whole country needs a dope slap. Who's gonna give it?

Thursday, July 10, 2008

More on our study

Okay, now that I've dealt pretty thoroughly with the limitations, what are we trying to accomplish and why do I think the limitations are acceptable?

First, in answer to C. Corax's question, yes, the evidence we have is that recording natural interactions isn't very disruptive. People quickly forget that they are being recorded, as far as we can tell, and behave normally. Obviously, that's difficult to prove, because it's considered unethical to record people without their permission. But experiments in which other assessments of interactions -- such as participant answers to questionnaires about the encounter, observations, and outcomes such as subsequent patient behaviors -- are compared in recorded and non-recorded encounters don't find any effect of recording per se. Anyhow, it's the best we can do. If you think doctors are on their best behavior when they are recorded, and that's different from how they normally behave, then you think that most of the time, they aren't really trying. And at least we're learning what their best is -- which, by the way, isn't necessarily very good.

We're actually finishing up one study as we start the new one, and we're trying to apply the lessons we've learned so far to our methods, our hypotheses, and our analyses. I can't get too far ahead on this blog -- we need to write up our results, get them peer reviewed, and publish them, and it's considered unseemly to say too much ahead of time, except for academic conferences. But I can tell you what we've already presented at conferences, and something about our questions and our methods.

The culture of the medical profession, and the wider culture too, I would say, awards a lot of prestige to physicians mostly because of their technical knowledge and skills. They are a kind of priesthood, sole possessors of arcane powers, and that's why they make the big bucks and your mother wanted you to marry one. In order to be admitted to medical school, you need top grades in science courses, and you need to do well on a test of scientific knowledge and aptitude. However, you don't need to study any humanities, and you don't need to be able to write a coherent sentence, let alone a paragraph. You can be a complete jerk as well, although most medical students I have known are nice people who want to do good by their patients as well as buy a horse farm some day, and some of them even have no interest in the horse farm. But it's not a qualification.

And that's a problem, because the effective practice of medicine is not achieved by technical skill alone. The most important skill, in fact, in my view (which is probably self-serving, but you're getting it anyway) is communication, in all its dimensions. Many people, including as a matter of fact many linguists, have a large part of their conceptual space occupied by the idea that the main function of language is representing reality -- that communication means using auditory symbols to create knowledge about concrete facts in the brain of the listener. We don't have to think about it very long to realize that this is only one of many functions of language, and it's not the most important function of doctor talk at all.

So we're trying to learn about how the talk between doctors and patients functions in myriad ways, and how those ways work together to create a complete encounter with whatever effects it has on people's subsequent well-being. More specifics to come.

Wednesday, July 09, 2008

The Epistemology Wonk Strikes Again

Something us science persons seldom like to admit is the goodly distance between reality and data. Data is the muck we run through the ol' statistical sausage grinder to give us the lapidary pictures we put in the journals, and the naive view (sorry if you think I'm talking about you) is that we're slicing and dicing the real world and telling it like it is. But in fact, by the time the input gets to the regression model, it's already been fattened, slaughtered, gutted, skinned, defatted, disinfected, flash frozen, slow thawed, braised, roasted, deboned, and selected.

Or, more specifically and non-analogously in the case of the sociolinguistic research I'm doing, first the doctor and the patient have to meet the eligibility criteria, come into contact with the recruitment efforts, agree to participate in the study, and show up on the appointed days. We had to make a number of decisions in advance that determine who, out of the millions of possible physician-patient dyads in the world, even have a chance of ending up at the top of the set of screens, and determine or influence which ones will get through them and end up on the feedlot, as it were.

Then they go ahead and have their interaction, knowing that they are being recorded and will also be interviewed, poked and prodded. Is their interaction changed in any way by this knowledge, or their previous interactions with us? No doubt. We can try to minimize and account for these changes, or explain them away, but there is a Heisenberg principle of social science, as of physics and most scientific endeavors. It's impossible to observe something without changing it.

Then, some of the dialogue will be inaudible, drowned out by extraneous noises, lost to mechanical malfunctions, or lost because one or another of the parties decided she or he didn't want to be recorded after all, possibly in the middle of the visit. Once we get the recording, we have to transcribe it. In case you think it is physically possible to make an accurate written representation of natural human discourse, you are sadly mistaken. We have to invent an elaborate series of rules for doing this, some of which quite consciously involve throwing away information, others of which require judgments which not every transcriptionist will necessarily make in the same way every time.

To analyze this data we need to create a complex set of rules for dividing it into units of analysis, and then operationalize variables -- descriptions or measurements of the units -- by creating a list of possible values and definitions of the conditions or properties which correspond to each of those value labels. Again, the consistency with which different observers may make each of these judgments is variable, but almost never close to perfect. The concepts we use to create the variables, and the ways in which we operationalize them, fully and profoundly limit the possibilities for what we can observe.
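One standard way to put a number on that coder-to-coder consistency is chance-corrected agreement, such as Cohen's kappa, which compares how often two coders assign the same value to a unit against how often they would agree by luck alone. Here is a minimal sketch; the labels below are made-up illustrations, and I'm not claiming this is the particular statistic we report.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders' labels for the same units."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    counts_a, counts_b = Counter(coder_a), Counter(coder_b)
    labels = set(coder_a) | set(coder_b)
    expected = sum((counts_a[l] / n) * (counts_b[l] / n) for l in labels)
    return (observed - expected) / (1 - expected)

# Hypothetical labels from two coders for ten utterances -- illustration only.
a = ["directive", "question", "question", "expressive", "directive",
     "representative", "question", "directive", "expressive", "question"]
b = ["directive", "question", "directive", "expressive", "directive",
     "representative", "question", "directive", "representative", "question"]
print(round(cohens_kappa(a, b), 2))  # 0.72 -- well above chance, but not perfect
```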

Although we are setting out on a journey of discovery, we have some idea of what we are looking for, and even what we hope it will be. To some extent those expectations and wishes will influence our choices unconsciously, but to an even greater extent, I would say, we are fully aware of what we are doing and why. We look for these specific properties of the dialog because that's what we are interested in, that's what is meaningful to us, what might support our theory, or show up our rivals, get us published, make us famous.

Not that there's anything wrong with that. Ultimately, our observations and our conclusions will be as credible as we can make them, and absolutely true. We're honest and ethical and pure of heart. But they will be true within the limited and carefully designed universe we create for purposes of the investigation. Whether you end up believing that they truly explain the real world -- the sloppy, messy, infinite, buzzing confusion in which you live -- depends on whether you agree that all those decisions we made along the way don't take us too far away from it.

And that's why it's so frustrating for us to argue with denialists -- creationists, global warming deniers, AIDS deniers, and the like. They see our honest and rigorous admission of our limitations as fatal weaknesses, and proof that beliefs are arbitrary. But that's not it at all. You have to walk the path with us, that's the point. Look through the telescope, see those spots moving around Jupiter, test the telescope on comparatively nearby objects and see that it gives an honest picture, watch those spots for a while and see how we deduce that they are circling the planet, see the phases of Venus, see how they correspond to the hypothesis that Venus is illuminated by the sun and is circling it. Look through the telescope.

Nothing I didn't know . . .

Today we're going to have to endure watching Barack Obama join with the majority of Democratic senators to repudiate the Fourth Amendment and the rule of law. Obama is afraid of the ads accusing him of denying the president the tools he needs to protect you, your children and this puppy from the Islamofascistocommunistoterrorists, who threaten the extinction of Western Civilization, and of whom he is a covert agent. ("Exactly what conversations between Americans and foreign Islamic terrorist masterminds does B. Hussein Obama -- pictured here photoshopped with darkened skin and a nose hooked like a scimitar -- not want the National Security Agency to be able to intercept?")

His colleagues, of course, are merely afraid that the telecommunications companies will give millions of dollars to their opponents in the next election.

As a youth, I knew perfectly well that the United States was ruled by money, and that its imperialist aggression was in the interest of preserving the wealth and power of the privileged people who run the country. I don't know what caused me to forget that for a while.

Sorry for the erratic posting, I've been very busy starting up a new project. I'll talk about it shortly.

Monday, July 07, 2008

All you have to do is read the New York Times . . .

to find out what's wrong with the New York Times -- and the rest of the journalistic establishment.

First, David Carr puts his ass on the line to describe the vicious tactics used by Fox News against critics and potential critics. Yes, it's about Fox, but it also happens to be a powerful indictment -- whether quite intended or not -- of the entire news industry, because guess what? These tactics work. Here's one of the money quotes:

At Fox News, media relations is a kind of rolling opposition research operation intended to keep reporters in line by feeding and sometimes maiming them. Shooting the occasional messenger is baked right into the process.

As crude as that sounds, it works. By blacklisting reporters it does not like, planting stories with friendlies at every turn, Fox News has been living a life beyond consequence for years. Honesty compels me to admit that I have choked a few times at the keyboard when Fox News has come up in a story and it was not absolutely critical to the matter at hand.


But here's the most important tidbit:

In the last several years, reporters from The Associated Press, several large newspapers and various trade publications have said they were shut out from getting their calls returned because of stories they had written. Editors do not want to hear why your calls are not being returned, they just want you to fix the problem, or perhaps they will fix it by finding someone else to do your job.


And this explains why the Washington press corps is in the bag for the Bush administration: access is the coin of the realm. You can't cover the Administration if they won't talk to you, and they'll only talk to you if they like the stories you write.

Then there's this from Tim Arango. Vincent Bugliosi, a well-known author who has written three best-sellers including Helter Skelter and a recent very serious, very well received book on the Kennedy Assassination, has a new book out. The title? "The Prosecution of George W. Bush for Murder," and it is just what it appears to be. Writes Arango:

Mr. Bugliosi could be forgiven for perhaps thinking that a new book would generate considerable interest, among reviewers and on the broadcast talk-show circuit. But if he thought that, he would have been mistaken. . . . Internet advertising has been abundant, but ABC Radio refused to accept an advertisement for the book during the Don Imus show, said Roger Cooper, the publisher of Vanguard Press, which put out the book.


And he hasn't gotten a single review in a mainstream newspaper, nor a single appearance on television -- including the Daily Show and MSNBC, both of which declined to have him on. Oh yeah -- the New York Times has not reviewed the book, although it is number 14 on the NYT's own best seller list.

Who needs state-controlled media? The corporate media does the same job. And oh yes, I appreciate the irony that both of these stories appeared in the New York Times -- in the business section, actually. That's the beauty of the whole arrangement. These exposés can slip through the cracks, where a few eccentrics like me will notice them, but they have no effect on the institution as a whole. The composition of the information stream will remain overwhelmingly fawning and deferential to the establishment, while the exceptions merely serve to disguise the fact.

It's all over?

One of the most puzzling proclivities of humans throughout history has been widespread belief in the imminent destruction of the world, or perhaps a better word is fundamental transformation. The belief is not in the end of existence, but in the extinction of familiar reality. The new universe is generally to be some sort of paradise or divine kingdom, although typically only the chosen people -- and that means us, whoever it is who shares this belief -- will be around to enjoy it. The rest will either be exterminated, or wallowing in hell.

The most widespread version of millennialism in the U.S. is of course the Christian fundamentalist version based on the biblical book called Revelation. Chip Berlet and Nikhil Aziz discuss this belief system and its relationship to U.S. Middle East policy here. It is downright horrifying that a powerful nation would be influenced in matters of war and peace by a delusional movement that places no value whatever on human life or even the continuation of civilization, on the premise that the known universe is about to end, but that is indeed the case. There is plenty of evidence that GW Bush believes in the imminence of the End Times and that his foreign policy is fundamentally shaped by that conviction.

Ian McEwan notes that according to opinion polls, 44% of Americans believe that "Jesus will return to judge the living and the dead within 50 years." To me, this belief is simply insane, indistinguishable in every way from the delusions of a paranoid schizophrenic, with the sole distinction that it is shared by hundreds of millions. But as McEwan tells us, it has been an essential current in Christian belief from the beginning (frequently leading to campaigns to slaughter Jews, by the way) and there are comparable apocalyptic beliefs in Judaism and Islam.

Yet apocalyptic belief is not confined to conventional religion, or religion at all. Remember the survivalists stocking up on bottled water and canned rations, and heading for the hills because the Y2K computer bug was going to destroy civilization? And now, perhaps even more bizarrely, there is a substantial movement that believes the world will end in 2012 based on -- get this -- the Mayan calendar. People are doing a booming business in survival supplies based on this notion, and buying up land in remote locations and building bunkers. But the notion is not even accurate. Just because a cycle of the Mayan calendar ends in 2012 doesn't mean the Mayans thought the world would end then. But even if the Mayans did think that, I mean, WTF?

Clearly, there is something very attractive to humans about these kinds of belief. Maybe it's just that we're dissatisfied with the world as it is and yearn so powerfully for a different one that we come to believe in our fantasies. But these delusions are very dangerous and horribly destructive. We have got to make this world work, it's the only one we've got or ever will have. The apocalyptic cults which have seduced as many as half of all Americans may produce real horrors if they continue to drive our decisions about war and peace, environmental protection, and the social order, but their adherents are going to be very disappointed by what emerges on the other end. It won't include any rapture, I can guarantee that.

Friday, July 04, 2008

You're an ignorant idiot . . .

Says Rick Shenkman. Well, okay, maybe you're an exception, but most Americans' ignorance of history, civics, and political issues is absolutely appalling. Shenkman doesn't get into other areas, such as science -- actually Europeans aren't all that scientifically literate either, but at least more than half of them believe in evolution. I'm having a hard time getting used to the finding that 25% of Americans believe that the sun revolves around the earth, and that 35% believe that astrology is "sort of scientific" (whatever that means).

But as for civics, the best Shenkman can do for good news is "Encouragingly, today the number of Americans who can correctly identify and name the three branches of government is up to 40%." Most Americans do not know the names of their own Senators or Representative. Forty-nine percent think the President has the power to suspend the Constitution. Oh wait -- it turns out that he does, at least under the current Congress. But I would be inclined to call that a self-fulfilling delusion.

While this is obviously depressing, it's also quite puzzling. We have universal literacy, universal public education, with the large majority making it through 12 years and the rest at least getting 10, and a very highly educated elite that controls the pervasive information environment. Twelve years in school ought to be enough to learn that the earth goes around the sun, that there are three branches of government, and the name of the Chief Justice of the Supreme Court. But instead, our Republic is an idiocracy. Does the candidate say the pledge of allegiance? Does he go to church? Do I like his wife? Is he a manly man or a girly man? And of course, the worst thing he can possibly be is smarter than me. One of the current president's chief qualifications for the job, according to many voters, is that he is a dimwit who shares their basic ignorance of the world and their 12th-century belief system.

Saying all this does not make me an elitist. On the contrary, my point is that I want more people to be better educated, more knowledgeable, and more capable of critical thinking. But somehow we have managed to fail miserably as a society in that endeavor.

Thursday, July 03, 2008

Mmmmmh, Doughnuts!

The other half of the system that controls our behavior is, of course, pleasure. Life would be a stone drag if we were only guided by aversion, but fortunately there are some things that positively make us feel good. Oddly enough, more than a few people subscribe to ethical systems which tend to condemn those pleasures, and if George Carlin were still alive I'd bring him in to discuss that oddity, but I'll have to leave it to another day.

The bad news is that, like the aversive system that manifests as pain, the pleasure system evolved in creatures less intelligent than ourselves and hence it operates rather crudely, from our supersmart point of view. We can often figure out that deferring some form of gratification now will make us happier in the long run, but we can't always manage to do it, and that makes us even more unhappy because now we are disgusted with ourselves as well as lacking whatever that longer-term prize was.

The most notorious example nowadays has to do with food, of course. Back in the days when calories were always in short supply, and the only really sweet thing we were likely to encounter was a fruit -- full of fiber and vitamins as well as sugar, and not likely to last very long before something else got it -- the best thing to do was gobble it all up right on the spot. The animals we hunted didn't have much fat on them, but what they had was fuel for the next day's hunt, and it was going to be rancid by then anyway if the hyenas didn't steal it, so chow down!

Sadly for us all today, there's far too much of that sweet and fatty stuff around for our own good, but we just can't help ourselves. There's even worse news for many people, which is that we've discovered chemicals that drive that pleasure system directly. It's biology, not sin. Here's a reasonably succinct, if rather aloof, explanation. By the way, although in my view the biological addiction model is a bit strained when it comes to behaviors which are intrinsically rewarding, like eating and sex, it works very well when it comes to gambling, which alters brain processes in much the same way as chemical addictions do.

Following up on yesterday's post, by the way, opioids are useful for the treatment of pain even though they don't actually block pain. They don't make pain go away, they make you not care about it -- just the gift I was asking the intelligent designer to give us. Alas, there is that awful price of addiction. The drug becomes an end in itself, for some people the only end, at least when it's difficult to get.

I've talked about the illusion of free will before, but this is the most straightforward attack of all. In the case of addiction and compulsive behavior, we may actually have the conscious experience of acting against our own wishes. But does that make any sense?

Wednesday, July 02, 2008

Ouch!

This slightly odd article in PLoS Medicine got me to thinking. The point of the study was to try to assess whether the usual signs that an infant is in pain -- which would be, you know, grimacing and crying and stuff -- are necessarily going to happen whenever pain is present. That may seem like a dumb question, but I don't suppose it is. The neurological equipment is still getting hooked up in very young infants and maybe the pain-related reflexes aren't always happening even though the pain already is.

So these people measured blood flow in brain regions normally associated with pain when infants got stuck in the heel for a blood sample. (Don't worry, these were blood samples that the doctors were taking anyway for clinical assessment.) They compared the results of this assessment with scores on a behavioral assessment of pain in premature infants, and they found that sometimes there were signs of pain in the blood-flow data that didn't show up in the facial expression and physiological responses. Now, the authors seem to assume that this means the infants are subjectively experiencing pain, and in fact I have no idea whether that's true. Indeed, I have no idea whether very young infants even can experience pain. Certainly none of us remembers it, and if we're just going to forget it anyway, what difference does it make? All very philosophically deep.

Okay, interesting, not a big whoop. But it got me to thinking. What and why is pain? I have a friend who has some chronic pain problems -- sciatica and tendinitis -- and I once remarked to him that a more intelligent designer could have given us the functional equivalent of pain without making it actually, you know, hurt. In other words, we could get a warning signal -- like a flashing red light -- that told us that something was wrong -- don't put weight on that leg, pull your hand away from the fire, you've been cut, whatever -- without torturing us. Does that make any sense? Is the negative subjective experience of pain actually necessary?

Evolution, of course, doesn't care. If agony works, then that's what we get. And it's pretty obvious that in order for the less unpleasant signal to be effective, you need an intelligent creature that can link the signal to an undesirable outcome. "Oh, I guess I'd better do something about that wound before I bleed to death." You need, in other words, an intellectual understanding of what the implications would be of ignoring the signal, motivation to act, and a plan about what to do. Presumably less intelligent ancestors couldn't possibly have evolved such a system, so we're stuck with the one that did evolve. And anyway, there might be something circular about the idea. We have to be averse to the consequences of ignoring the signal, which means that something we consider painful is coming along down the road. If we can't feel pain, what's the point of life? Can we be conscious in any meaningful way without really feeling the pleasures of rewards and the agonies of punishment?

And sure, if we didn't have pain, we'd probably say, "Mañana, I'll get around to it." You know how it is.

So we need the hurt. It's what makes us alive, it's what makes experience. Without it we're just automata. The trouble is, much of the time, it isn't really doing the job it's supposed to do. It's just making us miserable and we can't do anything about it, or we don't know what to do about it, or we do the wrong thing about it. So pain is both friend and foe. Can't live with it, can't live without it.