That's the subtext of an article which appeared last night on the Washington Post website, written by a demography professor and a graduate student. It says lots of interesting and odd things:
The consequences of Operation Iraqi Freedom for U.S. forces are being documented by the Defense Department with an exceptional degree of openness and transparency. Its daily and cumulative counts of deaths receive a great deal of publicity. But deaths alone don't indicate the risk for an individual. For this purpose, the number of deaths must be compared with the number of individuals exposed to the risk of death. The Defense Department has supplied us with appropriate data on exposure, and we take advantage of it to provide the first profile of military mortality in Iraq.
This is the beginning, and a good beginning it is. Note how the article immediately points out the need to define risk carefully, by comparing the deaths with the number of individuals exposed to the risk of death? This is what was wrong with the earlier wingnut cries about Iraq being less dangerous than Washington, D.C., because more Americans died in the latter place.
But the article then zooms ahead with just this one correction:
Between March 21, 2003, when the first military death was recorded in Iraq, and March 31, 2006, there were 2,321 deaths among American troops in Iraq. Seventy-nine percent were a result of action by hostile forces. Troops spent a total of 592,002 "person-years" in Iraq during this period. The ratio of deaths to person-years, .00392, or 3.92 deaths per 1,000 person-years, is the death rate of military personnel in Iraq.
How does this rate compare with that in other groups? One meaningful comparison is to the civilian population of the United States. That rate was 8.42 per 1,000 in 2003, more than twice that for military personnel in Iraq.
The comparison is imperfect, of course, because a much higher fraction of the American population is elderly and subject to higher death rates from degenerative diseases. The death rate for U.S. men ages 18 to 39 in 2003 was 1.53 per 1,000 -- 39 percent of that of troops in Iraq. But one can also find something equivalent to combat conditions on home soil. The death rate for African American men ages 20 to 34 in Philadelphia was 4.37 per 1,000 in 2002, 11 percent higher than among troops in Iraq. Slightly more than half the Philadelphia deaths were homicides.
The death rate of American troops in Vietnam was 5.6 times that observed in Iraq. Part of the reduction in the death rate is attributable to improvements in military medicine and such things as the use of body armor. These have reduced the ratio of deaths to wounds from 24 percent in Vietnam to 13 percent in Iraq.
The short message here is this: Hey, Iraq isn't dangerous at all! In the long run we are all going to die! And young black men in dangerous places die at even higher rates (though of course they don't have armor)! And more people died in another war!
Indeed. Maybe we should ship our elderly to Iraq, to benefit from the lower risk of death there?
This is all hogwash. What are the person-years, by the way? A person-year is one person exposed to the risk for one year, so 592,002 person-years could mean 592,002 troops serving one year each, or fewer troops serving longer tours. Are all the mortality statistics in the above quotes expressed per person-year? It's not clear at all, and if there is a skip from rates per person-year to rates per thousand people, are the results still the same?
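To make the arithmetic concrete, here is a little sketch in Python of how the person-year rate is computed from the article's figures, and why it is not automatically the same thing as a rate per thousand people (the count of distinct troops below is invented purely for illustration):

```python
# Death rate per 1,000 person-years, using the article's figures.
deaths = 2321
person_years = 592_002  # total troop exposure, March 2003 to March 2006

rate_per_1000_py = deaths / person_years * 1000
print(f"{rate_per_1000_py:.2f} deaths per 1,000 person-years")  # ~3.92

# A per-capita rate divides by people, not person-years. If those
# person-years came from, say, 400,000 distinct troops on tours
# averaging about 1.5 years (a number assumed here for illustration
# only), the per-person rate over the whole period would be higher.
distinct_troops = 400_000
rate_per_1000_people = deaths / distinct_troops * 1000
print(f"{rate_per_1000_people:.2f} deaths per 1,000 troops over the period")  # ~5.80
```

The two denominators agree only when every person is observed for exactly one year, which is why it matters whether the civilian rates quoted for comparison use the same yardstick.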
But that's not the main reason for the hogwashiness of the article. Just consider this paragraph:
How does this rate compare with that in other groups? One meaningful comparison is to the civilian population of the United States. That rate was 8.42 per 1,000 in 2003, more than twice that for military personnel in Iraq.
Who on earth could call that comparison "meaningful"? The overall death rate includes the deaths of people in their eighties and nineties, and it includes all natural deaths. It covers the whole lifetimes of all individuals. How is comparing that to the risk of death from a war meaningful? Are the authors trying to tell us that going to Iraq is safer than just living an ordinary life to its natural end?
The only way they could make such an argument would be if they also standardized the time periods of exposure. That would mean having American military personnel born in Iraq, growing up in Iraq, and staying in Iraq all their lives, under war conditions the whole time. Then the comparison would make some sense.
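To see what exposure time does to a rate, here is a rough sketch of what the Iraq rate implies when it is compounded over something like a lifetime rather than a single year (the 75-year span is my assumption, purely for illustration):

```python
# Compound the annual Iraq death rate over a hypothetical 75-year
# lifetime of exposure. The annual rate is the article's; the 75-year
# horizon is assumed here for illustration only.
annual_rate = 3.92 / 1000  # deaths per person-year, from the article
years = 75

prob_die = 1 - (1 - annual_rate) ** years
print(f"Cumulative risk over {years} years: {prob_die:.1%}")  # ~25.5%
```

And even that understates the contrast, since the civilian figure of 8.42 per 1,000 comes from a population in which, over a full lifetime, the risk of death is of course 100 percent.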
The sharper criticism of all this is that the authors here failed to distinguish between the general risk of death from just living for a long time and the very specific risk of death from war. The results seem to give the military in Iraq a lower risk of death only because those folks are much younger than the general population of this country. It's the nursing-home population which faces the highest risk of death, you know, and they are excluded from our forces in Iraq.
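The standard demographic fix for this is age standardization: apply each group's age-specific death rates to one common age distribution before comparing. A minimal sketch, with every rate and weight invented purely to show the mechanics:

```python
# Direct age standardization: weight age-specific death rates by a
# common age distribution, so that age composition stops driving the
# comparison. All numbers below are invented for illustration; they
# are not real vital statistics.

# Hypothetical civilian age-specific rates, per 1,000 person-years
civilian_rates = {"18-39": 1.5, "40-64": 5.0, "65+": 45.0}

# The civilian population's own age mix vs. a military-like age mix
civilian_mix = {"18-39": 0.40, "40-64": 0.40, "65+": 0.20}
military_mix = {"18-39": 0.85, "40-64": 0.15, "65+": 0.00}

def weighted_rate(rates, mix):
    """Average the age-specific rates, weighted by the age mix."""
    return sum(rates[age] * share for age, share in mix.items())

print(f"Crude civilian rate:           {weighted_rate(civilian_rates, civilian_mix):.2f}")  # 11.60
print(f"Standardized to military ages: {weighted_rate(civilian_rates, military_mix):.2f}")  # ~2.02
```

The same underlying rates produce wildly different headline numbers depending on the age mix, which is exactly what makes the crude civilian comparison so misleading.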
What about the second comparison, then, the one to young black men in certain areas of Philadelphia? That is a valid reminder of the shock and shame we should all feel for allowing such places to exist in the United States of America. But the fighters in Philadelphia's gang wars don't have armored trucks or helmets. Neither do they have support troops, who usually face lower risks of death. If we are to compare this area to the Iraq war arena, we should use only the death rates of those in direct combat positions.
More generally, what are we trying to do when calculating the risk of death from wars? Suppose that we send 100,000 soldiers into an area for one year and that 100 of them die. As a proportion this is one in a thousand. Suppose then that a soldier not yet sent into that area sees the data and regards himself or herself as an average soldier. This would make the one in a thousand the relevant odds for that soldier to consider.
In short, we translate actual data on deaths into a measure of probability, one that would apply, on average, to future deaths if nothing major changes in the war. If the soldier we are looking at is not "average", we might need more detailed data on deaths by military branch or by support-versus-combat duty, and if those data were available we could figure more precise odds of death for him or her.
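As a sketch of what that conditioning looks like (the combat-versus-support split below is invented, since no such breakdown is given; only the totals are the article's):

```python
# Per-person-year risk, overall and by duty type. The overall totals
# match the article's figures; the combat/support split is invented
# purely to illustrate conditioning the odds on a subgroup.
groups = {
    "combat":  {"deaths": 1900, "person_years": 250_000},  # hypothetical
    "support": {"deaths":  421, "person_years": 342_002},  # hypothetical
}

for name, g in groups.items():
    rate = g["deaths"] / g["person_years"] * 1000
    print(f"{name:8s}: {rate:.2f} deaths per 1,000 person-years")

total_deaths = sum(g["deaths"] for g in groups.values())
total_py = sum(g["person_years"] for g in groups.values())
print(f"overall : {total_deaths / total_py * 1000:.2f} per 1,000 person-years")  # ~3.92
```

A soldier headed for a combat slot should look at the combat row, not the blended average.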
From this point of view, the best way to calculate the differential risk of dying caused by the Iraq war would be to find out what the average risk of dying would have been for the same military population in the absence of the war and then to compare that to the actual death figures. We can't do this, as it's impossible to measure the alternative death rates of a reality that didn't come about (the Gore presidency, perhaps), but we can do almost as well by finding out what the risk of death is for the American military not in Iraq at the present time, always assuming that this group has the same age, sex and race distribution as the Iraq group.
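A sketch of that differential, or excess, risk calculation, using the article's Iraq rate and a baseline rate for comparable non-deployed troops that I have simply assumed for illustration:

```python
# Excess mortality attributable to the war: the Iraq rate minus the
# rate for a demographically matched military population not in Iraq.
# The baseline below is assumed for illustration; the real figure
# would have to come from Defense Department data on non-deployed troops.
iraq_rate = 3.92       # deaths per 1,000 person-years, from the article
baseline_rate = 1.00   # hypothetical non-deployed military rate

excess_rate = iraq_rate - baseline_rate
person_years = 592_002
excess_deaths = excess_rate / 1000 * person_years
print(f"Excess risk: {excess_rate:.2f} per 1,000 person-years")
print(f"Implied excess deaths over the period: {excess_deaths:.0f}")  # ~1,729
```

That differential, not the raw rate set against some irrelevant civilian yardstick, is the number that measures what the war itself costs in lives.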
Note that it's all in the questions we pose. If we ask the wrong questions, we get the wrong answers, and each question we ask means something different politically.