The Dangers of the Manhattan Model for Fighting Climate Change

Professor Merrill Singer
Department of Anthropology
University of Connecticut

Invoking the Manhattan Project, the effort that developed the first nuclear weapons, as a model for addressing global climate change1 suggests that environmental activists need to closely examine the lessons of the model in question.

The starting point for this endeavor is determining the role of the Project in ending WWII, on which its claim of success as a technological, scientific, and military initiative is based. It has now been thoroughly documented, as affirmed in Dwight Eisenhower’s personal memoir The White House Years, that with American aircraft attacking Japanese cities at will in 1945, the Japanese wanted to surrender before nuclear weapons were used. They felt, however, that they could not accept the stark terms demanded of them in the Potsdam Proclamation, most notably the total elimination of Japan’s imperial house. As expressed in a memorandum sent by General Douglas MacArthur to President Roosevelt, high-ranking Japanese officials agreed to practically all of the other terms specified in the Proclamation2. American policy makers maintained, nonetheless, that only the dropping of an atomic bomb would convince the Japanese to give up. Some critics, however, argue that with the Soviet Union declaring war on Japan, the bombing was in no small measure intended as a message about U.S. military superiority, and hence Hiroshima might reasonably be called the first shot of the Cold War. At the time of the bombing of Hiroshima, Japan had already begun negotiating with the Soviet Union for a peace treaty, something well known to American leaders because Japanese military codes had been broken. Moreover, in his farewell address upon leaving office, President Eisenhower warned of the dangerous rise of the military-industrial complex, and the Manhattan Project was surely a significant step in the making of that powerful and still thriving entity.

When there was no immediate response from Japan after the bombing of Hiroshima, a second target, the city of Nagasaki, was also subjected to nuclear annihilation. Many of those directly involved in the Manhattan Project were dismayed by the bombing of Nagasaki, believing it was not necessary from a military standpoint3. Thus J. Robert Oppenheimer, head of the Los Alamos Laboratory and a recognized “father of the atomic bomb,” immediately traveled to Washington to deliver a letter to the Secretary of War expressing his disgust and his desire for a total ban on nuclear weapons. Some later historians would come to understand the demand for the unconditional surrender of Japan as a “policy of revenge [that] hurt America’s national self-interest [and] prolonged the war4”. And further, as General Curtis LeMay, a known hawk who eventually headed the Strategic Air Command, phrased it at the time, the atomic bomb “had nothing to do with the end of the war5.” Some critics reasonably argue that the primary motivation for dropping the bomb was to position the United States to dominate the postwar global economy. The Manhattan Project, in short, did not magically bring the brutality of world war to an end; rather, it set the stage for other grave events that were to follow.

After the end of World War II, Congress set up the United States Atomic Energy Commission to promote the development of atomic science. President Harry Truman established public control of nuclear energy with the Atomic Energy Act of 1946. Eighteen years later, however, President Lyndon Johnson signed the Private Ownership of Special Nuclear Materials Act, which gave private corporations quick access to nuclear resources. Within ten years, 233 nuclear power plants were up and running, under construction, or on order in the United States. Today there are 99 commercially operated nuclear reactors spread across 30 states6. In other words, it was a short leap from the Manhattan Project to the private production of nuclear energy, a technological and economic transformation that few people concerned about anthropogenic impacts on the planet see as either safe or sustainable. From nuclear disasters like those at Chernobyl, Ukraine in 1986 and Fukushima, Japan in 2011 to the continued lack of a safe, reliable solution for managing the growing stockpile of radioactive waste produced by nuclear plants, nuclear energy has proven costly in lives, in environmental damage, and in fiscal expenditures on construction, maintenance, and post-disaster clean-up.

While there is no denying that state-level action is needed to address an issue as large and complex as climate change, it is important that we choose our models for action wisely, lest we fail to carefully assess the politics of policy decisions and the brand of economics that brought us a warming planet to begin with7. The old adage, ‘Be careful what you wish for, lest it come true,’ could not be more apropos.

This post is a response to the CAOS Provocation Climate Change: War Footing or Peaceful Solidarity?


  1. Oreskes, Naomi (2013) We Need a New Manhattan Project to Deal With Climate Change. The New York Times.
  2. Wainstock, Dennis (1996) The Decision to Drop the Atomic Bomb. Westport, CT: Praeger Publishers, p. 132.
  3. Monk, Ray (2012) Robert Oppenheimer: A Life Inside the Center. New York; Toronto: Doubleday.
  4. Weber, Mark (1997) Was Hiroshima Necessary? The Journal of Historical Review 16(3): 4-11.
  5. Alperovitz, Gar (2015) Seventy years after the bombing, will Americans face the brutal truth? The Nation, August 6.
  6. U.S. Energy Information Administration (2015) How many nuclear power plants are in the United States, and where are they located?
  7. Klein, Naomi (2014) This Changes Everything: Capitalism vs. The Climate. New York: Simon & Schuster.