In recent years, outbreaks of diseases such as avian flu, severe acute respiratory syndrome (SARS), Ebola, and mad cow disease have frightened the public, disrupted global commerce, caused massive economic losses, and jeopardized diplomatic relations. These diseases have also shared one worrisome characteristic: the ability to cross the Darwinian divide between animals and people. None of these illnesses depends on human hosts for its survival; each can retreat into an animal reservoir, and so all of them persist today, far beyond the reach of medical intervention.
Meanwhile, humanity has become ever more vulnerable to cross-species illnesses, thanks to modern advances such as the rapid transportation of both goods and people, increasing population density around the globe, and a growing dependence on intensified livestock production for food. The global transport of animals and animal products, which includes hundreds of species of wildlife, also provides safe passage for the harmful bacteria, viruses, and fungi they carry, not to mention the prion proteins that cause insidious illnesses such as mad cow disease and chronic wasting disease in deer and elk.
Influenza pandemics, which date back to antiquity, have posed the greatest threat of a worldwide calamity caused by infectious disease. Over the past 300 years, ten influenza pandemics have occurred among humans. The most recent ones came in 1957-58 and 1968-69, and although several tens of thousands of Americans died in each, they were considered mild compared to others. The 1918-19 pandemic was not. According to recent analysis, it killed 50 to 100 million people globally. Today, with a global population of 6.5 billion, more than three times that of 1918, even a "mild" pandemic could kill many millions of people.
The Bush administration has proclaimed a doctrine of unilateral preemption as a core part of its National Security Strategy. The limits of this approach are demonstrated daily in Iraq, where the United States is bearing the burden for security, reconstruction, and reform essentially on its own. Yet the world cannot afford to look the other way when faced with the prospect, as in Iraq, of a brutal ruler acquiring nuclear weapons or other weapons of mass destruction (WMD). Addressing this danger requires a different strategy, one that maximizes the chances of early and effective collective action. In this regard, and measured against the changes under way in the parallel realm of humanitarian intervention, the biggest problem with the Bush preemption strategy may be that it does not go far enough.
President George W. Bush has singled out terrorist nuclear attacks on the United States as the defining threat the nation will face in the foreseeable future. In addressing this specter, he has asserted that Americans' "highest priority is to keep terrorists from acquiring weapons of mass destruction." So far, however, his words have not been matched by deeds. The Bush administration has yet to develop a coherent strategy for combating the threat of nuclear terror. Although it has made progress on some fronts, Washington has failed to take scores of specific actions that would measurably reduce the risk to the country. Unless it changes course—and fast—a nuclear terrorist attack on the United States will be more likely than not in the decade ahead.
The Bush administration's new "National Strategy to Combat Weapons of Mass Destruction" (WMD), announced in December 2002, is wise in some places, in need of small fixes in others, and dangerously radical in still others. Most important, the strategy's approach to nuclear issues seems destined to reduce international cooperation in enforcing nonproliferation commitments rather than enhance it. America's willingness to use force against emergent WMD threats, as in Iraq, can stir the limbs of the international body politic to action. But a truly effective strategy to reduce nuclear dangers over the long term must bring along hearts and minds as well.
The WMD proliferation problem involves biological, chemical, and nuclear weapons, but the last of these raises the most telling issues. Chemical and biological weapons are prohibited outright by treaty, so the challenge they pose is basically one of enforcement. Nuclear weapons, on the other hand, are temporarily legal in the five countries recognized as nuclear powers under the Nuclear Nonproliferation Treaty, not illegal in three others that never joined it, and forbidden essentially everywhere else: a complex and inconsistent arrangement that presents a unique set of dilemmas.
History often places before the world a problem whose solution lies outside the bounds of contemporary political acceptability. Such was the case, for example, in the 1930s, when the rise of Hitler posed a threat to the European democracies that they lacked the resolve to face. To check Nazi aggression, most historians now agree, the democracies would have had to oppose it early and resolutely, as Winston Churchill advocated. But Churchill's prescriptions were beyond the pale of mainstream political thinking at the time, and he was forced "into the wilderness," as he famously put it. Not until the late 1930s did his ideas win political acceptance, and by then the price of stopping Hitler was World War II.
Since World War II, public health strategy has focused on the eradication of microbes. Using powerful medical weaponry developed during the postwar period—antibiotics, antimalarials, and vaccines—political and scientific leaders in the United States and around the world pursued a military-style campaign to obliterate viral, bacterial, and parasitic enemies. The goal was nothing less than pushing humanity through what was termed the "health transition," leaving the age of infectious disease permanently behind. By the turn of the century, it was thought, most of the world's population would live long lives ended only by the "chronics"—cancer, heart disease, and Alzheimer's.