J. Robert Oppenheimer, the father of the atomic bomb, spent years wrestling with the conflict between his science and the dictates of his conscience. In part because he publicly expressed his concerns about the hydrogen bomb and a nuclear arms race, Oppenheimer, the subject of a new biopic, ended his career as a martyr in Cold War politics. Fortunately, many other early nuclear experts, including the University of Chicago scientists who first produced a chain reaction, felt an obligation to help prevent the misuse of atomic science. These scientists understood something that today's pioneers in artificial intelligence and genetic engineering also need to recognize: The people who usher revolutionary advances into the world have both the expertise and the moral responsibility to help society address their dangers.
In laboratories at universities and at for-profit companies today, researchers are working on technologies that raise profound ethical questions. Can we engineer plants and animals resistant to natural predators without upsetting the balance of nature? Should we allow patents on life forms? Can we ethically fix supposed abnormalities in human beings? Should we allow machines to make consequential decisions: for instance, whether to use force to respond to a threat, or whether to launch a retaliatory nuclear strike? Atomic scientists at Chicago and elsewhere left behind a model for the responsible conduct of science, a model as applicable now as it was in Oppenheimer's day.
The race to the atom bomb began at the Metallurgical Laboratory at the University of Chicago, where, on December 2, 1942, the first engineered, self-sustaining nuclear-fission reaction occurred. The scientists gathered in what had become known as an "atomic village" included Leo Szilard, a Hungarian-born physicist who a few years earlier had helped persuade Albert Einstein to warn President Franklin D. Roosevelt that a weapon of awesome power was within scientific reach, and that Hitler's scientists knew it too. The now-famous Einstein-Szilard letter, which launched the United States on the crash course known as the Manhattan Project, was the nuclear age's first great act of scientific responsibility. The first lesson from the Met Lab was: Scientific knowledge, once obtained, cannot be called back. Perceiving the world-changing potential of recent discoveries in nuclear physics, Szilard and his colleagues had to inform the leaders of our democracy.
Chicago's atomic village had an eclectic mix of scientists. Some, such as the physicist John Simpson, were young Americans who had grown up amid New Deal social reforms. Among the more established scientists were a number of Jewish émigrés, including Szilard, the German physicist James Franck, and the Russian-German biophysicist Eugene Rabinowitch, whose experiences before leaving Europe had sensitized them in various ways to the moral dimensions of science. Franck, in fact, had firsthand experience of the subjection of science to politics. While working as a young researcher in Germany when World War I began, he had volunteered for the kaiser's army and was an officer in the unit that introduced chlorine gas onto the battlefield. His friend Niels Bohr, the distinguished Danish physicist and Nobel laureate, harshly criticized his decision to accept the role, which Franck came to regret deeply.
By 1943, the primary work on nuclear-bomb development had shifted to Oak Ridge, Tennessee; Hanford, Washington; and Los Alamos, New Mexico. The scientists remaining at Chicago’s Met Lab had time to try to shape decisions about the use of nuclear technology, both in what remained of World War II and in the looming postwar period. The second lesson from the Met Lab was that, although scientific discovery is irreversible, its effects can be regulated. In her 1965 book, A Peril and a Hope: The Scientists’ Movement in America, 1945-1947, the historian Alice Kimball Smith, drawing on archival material and interviews, chronicled the intense discussions raging among the scientists during this period. The Met Lab scientists ultimately arrived at specific objectives that were lofty, practical, or both. They wanted to give Japan a preview of the atomic bomb’s power and the opportunity to surrender before being subjected to it. They also wanted to free science from the fetters of official secrecy, avert an arms race, and design international institutions to govern nuclear technology.
The third lesson from the Met Lab was that major decisions about the application of new technology should be made by civilians in a transparent democratic process. In the mid-’40s, the Chicago atomic scientists began bringing their concerns to leaders of the Manhattan Project and then to public officials. The Army bureaucracy preferred to keep secrets, but the scientists fought it every step of the way. Szilard, Franck, Rabinowitch, Simpson, and scores of their colleagues led the effort to educate politicians and inform the public about nuclear dangers. The scientists organized associations, among the first of which was the Atomic Scientists of Chicago. They gave lectures, wrote opinion essays, and founded publications, most notably the Bulletin of the Atomic Scientists, which Met Lab scientists edited and published on the University of Chicago campus. Working with colleagues at the other Manhattan Project sites, they marshaled support for the passage of the Atomic Energy Act, which created an independent agency of civilians, accountable to the president and Congress, to oversee the development and deployment of nuclear science. Their efforts continued well into the Cold War, with the successful campaigns for nuclear test bans, nonproliferation compacts, and arms-control agreements.
In the 21st century, many decisions about the development and deployment of new technologies are taking place in private laboratories and corporate executive suites, out of the public's sight. Like the military's secrecy requirements so resented by the Met Lab scientists, exclusive private ownership of scientific ideas impedes collaboration and the free flow of knowledge upon which the progress of science depends. The primacy of private decision making is an abrogation of the public's right to participate, through the democratic process, in ethical decisions about the application of scientific and technical knowledge. These are the kinds of choices the Met Lab scientists considered to be the public's to make.
In August 1945, two atomic bombs caused the immediate or eventual deaths of 150,000 to 220,000 people in Hiroshima and Nagasaki. Some months later, at a meeting in the White House, Oppenheimer told Harry Truman, "Mr. President, I have blood on my hands." But Truman reminded the physicist that the decision to drop the bombs was his own. Having made the weapon possible, the nation's atomic scientists nevertheless acquitted themselves well. The regime they fostered, the template they created for responsible science, helped make that first use of nuclear weapons the only use in war to date. We would be wise to heed the lessons they learned.
This article was originally published on The Atlantic.