As the 21st century unfolds, a new truth is gradually being recognized: Nuclear weapons and human security cannot co-exist.
Almost two decades after the end of the Cold War, some 25,000 nuclear weapons remain in existence. About 95 percent are held by the United States and Russia, with smaller numbers possessed by the United Kingdom, France, China, India, Pakistan, and Israel. All told, half of humanity still lives in a nuclear-weapons state. These countries have spent more than $12 trillion on their nuclear arsenals, a stupendous sum, a fraction of which could have resolved the problems of mass poverty, inadequate health care, and neglected education.
During the Cold War, the rationale for the superpowers’ buildup of strategic nuclear weapons was the theory of deterrence. The U.S. and the Soviet Union were each deterred from using their nuclear weapons, according to the theory, in the knowledge that the opponent had the capacity to strike back overwhelmingly. This stand-off was called Mutual Assured Destruction (MAD). But with the post-Cold War emergence of the United States as the sole superpower, a new nuclear age has begun in which the war-fighting use of nuclear weapons is actually considered and threatened.
The Nuclear Posture Review, conducted by the Bush administration in 2001, set out expansive plans to “revitalize” U.S. nuclear forces and all the elements that support them, within a new triad of capabilities combining nuclear and conventional offensive strikes with missile defenses and the nuclear-weapons infrastructure. Under the subsequent post-9/11 National Security Strategy, the administration declared that it would take “anticipatory action” (a euphemism for pre-emptive strikes) against enemies of the United States, and it has not ruled out the use of nuclear weapons, which remain a cornerstone of U.S. national security policy.