One hundred years ago, a half century before the atomic bombing of Hiroshima and Nagasaki, the discovery of x rays spotlighted the extraordinary promise, and peril, of the atom. From that time until 1942, atomic research was in private hands. The Second World War and the Manhattan Project, which planned and built the first atomic bombs, transformed a cottage industry of researchers into the largest and one of the most secretive research projects ever undertaken. Scientists who had once raced to publish their results learned to speak in codes accessible only to those with a "need to know." Indeed, during the war the very existence of the man-made element plutonium was a national secret.
After the war's end, the network of radiation researchers, government and military officials, and physicians mobilized for the Manhattan Project did not disband. Rather, they began working on government programs to promote both peaceful uses of atomic energy and nuclear weapons development.
Having harnessed the atom in secret for war, the federal government turned enthusiastically to providing governmental and nongovernmental researchers, corporations, and farmers with new tools for peace--radioisotopes--mass-produced with the same machinery that produced essential materials for the nation's nuclear weapons. Radioisotopes, the newly established Atomic Energy Commission (AEC) promised, would create new businesses, improve agricultural production, and through "human uses" in medical research, save lives.
From its 1947 creation to the 1974 reorganization of atomic energy activities, the AEC produced radioisotopes that were used in thousands of human radiation experiments conducted at universities, hospitals, and government facilities. This research brought major advances in the understanding of the workings of the human body and the ability of doctors to diagnose, prevent, and treat disease.
The growth of radiation research with humans after World War II was part of the enormous expansion of the entire biomedical research enterprise following the war. Although human experiments had long been part of medicine, there had been relatively few subjects, the research had not been as systematic, and there were far fewer promising interventions than in the late 1940s.
With so many more human beings as research subjects, and with potentially dangerous new substances involved, certain moral questions in the relationship between the physician-researcher and the human subject--questions that were raised in the nineteenth century--assumed more prominence than ever: What was there to protect people if a researcher's zeal for data gathering conflicted with his or her commitment to the subjects' well-being? Was the age-old ethical tradition of the doctor-patient relationship, in which the patient was to defer to the doctor's expertise and wisdom, adequate when the doctor was also a researcher and the procedures were experimental?
While these questions about the role of medical researchers were fresh in the air, the Manhattan Project, and then the Cold War, presented new ethical questions of a different order.
In March 1946, former British Prime Minister Winston Churchill told an audience in Fulton, Missouri, that an "iron curtain" had descended between Eastern and Western Europe--giving a name to the hostile division of the continent that had existed since the end of World War II. By the following year, Cold War was the term used to describe this state of affairs between the United States and its allies on the one hand and the Soviet bloc on the other. A quick succession of events underscored the scope of this conflict, as well as the stakes involved: In 1948 a Soviet blockade precipitated a crisis over Berlin; in 1949, the American nuclear monopoly ended when the Soviet Union exploded its first atomic bomb; in 1950, the Korean War began.
The seeming likelihood that atomic bombs would be used again in war, and that American civilians as well as soldiers would be targets, meant that the country had to know as much as it could, as quickly as it could, about the effects of radiation and the treatment of radiation injury.
This need for knowledge put radiation researchers, including physicians, in the middle of new questions of risk and benefit, disclosure and consent. The focus of these questions was, directly and indirectly, an unprecedented public health hazard: nuclear war. In addressing these questions, medical researchers had to define the new roles that they would play.
As advisers to the government, radiation researchers were asked to assist military commanders, who called for human experimentation to determine the effects of atomic weapons on their troops. But these researchers also knew that human experimentation might not readily provide the answers the military needed.
As physicians, they had a commitment to prevent disease and heal. At the same time, as government advisers, they were called upon to participate in making decisions to proceed with weapons development and testing programs that they knew could put citizens, soldiers, and workers at risk. As experts they were asked to ensure that the risks would not be excessive. And as researchers they saw these programs as an opportunity for gathering data.
As researchers, they were often among the first to volunteer to take the risks that were unavoidable in such research. But the risks could not always be disclosed to members of the public who were also exposed.
In keeping with the tradition of scientific inquiry, these researchers understood that their work should be the subject of vigorous discussion, at least among other scientists in their field. But, as government officials and advisers, they understood that their public statements had to be constrained by Cold War national security requirements, and they shared in official concern that public misunderstanding could compromise government programs and their own research.
Medical researchers, especially those expert in radiation, were not oblivious to the importance of the special roles they were being asked to play. "Never before in history," began the 1949 medical text Atomic Medicine, "have the interests of the weaponeers and those who practice the healing arts been so closely related." This volume, edited by Captain C. F. Behrens, the head of the Navy's new atomic medicine division, was evidently the first treatise on the topic.
It concluded with a chapter by Dr. Shields Warren, the first chief of the AEC's Division of Biology and Medicine, who would become a major figure in setting policy for postwar biomedical radiation research. While the atomic bomb was not "of medicine's contriving," the book continued, it was to physicians "more than to any other profession" that atomic energy had brought a "bewildering array of new problems, brilliant prospects, and inescapable responsibilities."
The text, a prefatory chapter explained, treats "not of high policy, of ethics, of strategy or of international control [of nuclear materials], as physicians these matters are not for us."[3] Yet what many readers of Atomic Medicine could not know in 1949 was that Behrens, along with Warren and other biomedical experts, was already engaged in vigorous but secret discussions of the ethics underlying human radiation experiments. At the heart of these discussions lay difficult choices at the intersection of geopolitics, science, and medicine that would have a fundamental impact on the federal government's relationship with the American people.
Radiation has existed in nature from the origins of the universe, but was unknown to man until a century ago. Its discovery came by accident. On a Friday evening, November 8, 1895, the German physicist Wilhelm Roentgen was studying the nature of electrical currents by using a cathode ray tube, a common piece of scientific equipment.
When he turned the tube on, he noticed to his surprise that a glowing spot appeared on a fluorescent-coated black paper screen across the room. Intrigued, he soon determined that invisible but highly penetrating rays were being produced at one end of the cathode ray tube. The rays could expose photographic plates, leaving shadows of dense objects, such as bone.
After about six weeks of experimenting with his discovery, which he called x rays, Roentgen sent a summary and several "shadow pictures" to a local scientific society. The society published the report in its regular journal and wisely printed extra copies. Roentgen sent copies to physicists throughout Europe, and news spread rapidly; one Berlin physicist recalled: "[I] could not help thinking that I was reading a fairy tale . . . only the actual photograph proved to everyone that this was a fact."
Physicians immediately recognized these rays as a new tool for diagnosis, a window into the interior of the body. The useless left arm of German Emperor Wilhelm II was x-rayed to reveal the cause of his disability, while Queen Amelia of Portugal used x rays of several of her court ladies to vividly display the dangers of "tightlacing."
Physicians began to use x rays routinely for examining fractures and locating foreign objects, such as needles swallowed by children or bullets shot into adults. During World War I, more than 1.1 million wounded soldiers were treated with the help of diagnostic x rays.
In 1896, Roentgen's insight led to the discovery of natural radioactivity. Henri Becquerel, who had been studying phosphorescence, discovered that shadow pictures were also created when wrapped photographic plates were exposed to crystals partly composed of uranium. Could this radioactive property be concentrated further by extracting and purifying some as-yet-unknown component of the uranium crystals? Marie and Pierre Curie began laborious chemical analyses that led to the isolation of the element polonium, named after Marie's native Poland. Continuing their work, they isolated the element radium. To describe these elements' emission of energy, they coined the word radioactivity.
As with x rays, popular hopes and fears for natural radioactivity far exceeded the actual applications. One 1905 headline captured it all: "Radium, as a Substitute for Gas, Electricity, and as a Positive Cure for Every Disease." Initial enthusiasm that radiation could, by destroying tumors, provide a miracle cure for cancer gave way to discouragement when irradiated tumors reappeared.
Despite distressing setbacks, research into the medical uses of radiation persisted. In the 1920s French researchers, performing experiments on animals, discovered that radiation treatments administered in a series of fractionated doses, instead of a single massive dose, could eliminate tumors without causing permanent damage. With the new method of treatment, doctors began to report impressive survival rates for patients with a variety of cancers. Fractionation became, and remains, an accepted approach to cancer treatment.
Along with better understanding of radiation's benefits came a better practical appreciation of its dangers. Radiation burns were quickly apparent, but the greater danger took longer to manifest itself. Doctors and researchers were frequently among the victims, yet radiation researchers were slow to take steps to protect themselves from the hidden danger. One journal opened its April 1914 issue by noting that "[w]e have to deplore once more the sacrifice of a radiologist, the victim of his art."
Clear and early evidence of tragic results sharpened both expert and public concern. By 1924, a New Jersey dentist had noticed an unusual rate of deterioration of the jawbone among local women. On further investigation he learned that all had at one time worked painting a radium solution onto watch dials.
Further studies revealed that as they painted, the women licked their brushes to maintain a sharp point and so absorbed radium into their bodies. The radium gradually revealed its presence in blood disease and, eventually, a painful, disfiguring deterioration of the jaw.
There was no question that radium was the culprit. The immediate outcome was a highly publicized crusade, investigation, lawsuits, and payments to the victims. Yet despite the publicity surrounding the dial painters, the response to the danger remained agonizingly slow, and patent medicines containing radium, along with radium therapies, continued to be sold.
The tragedy of the radium dial painters and similar cases of patients who took radium nostrums have provided basic data for protection standards for radioactive substances taken into the body. One prominent researcher in the new area of radiation safety was Robley Evans.
Evans was drawn into the field by the highly publicized death in 1932 of Eben Byers, following routine consumption of the nostrum Radithor. Byers's death spurred Evans, then a California Institute of Technology physics graduate student, to undertake research that led to a study of the effects on the body of ingesting radium; this study would continue for more than half a century.
Evans's study and subsequent studies of the effects of radium treatments provided the anchor in human data for our understanding of the effects of radiation within the human body. As the dangers of the imprudent use of x rays and internal radiation became clear, private scientific advisory committees sprang up to develop voluntary guidelines to promote safety among those working with radiation. When the government did enter the atomic age, it often referred to the guidelines of these private committees as it developed radiation protection standards.
In 1913, the Hungarian chemist Georg von Hevesy began to experiment with the use of radioactive forms of elements (radioisotopes) to trace the behavior of the normal, nonradioactive forms of a variety of elements. Ten years later Hevesy extended his chemical experiments to biology, using a radioisotope of lead to trace the movement of lead from soil into bean plants. In 1943, Hevesy won the Nobel Prize for his work on the use of radioisotopes as tracers.
Previously, those seeking to understand the life processes of an organism had either to extract molecules and structures from dead cells or organisms and study them by arduous chemical procedures, or to use traceable chemicals that were foreign to the organism being studied but that mimicked normal body chemicals in some important way. Foreign chemicals could alter the very processes being measured and, in any case, were often as difficult to measure precisely as were normal body constituents.
The radioactive tracer--as Our Friend the Atom, a book written by Dr. Heinz Haber for Walt Disney Productions, explained in 1956 to readers of all ages--was an elegant alternative: "Making a sample of material mildly radioactive is like putting a bell on a sheep. The shepherd traces the whole flock around by the sound of the bell. In the same way it is possible to keep tabs on tracer-atoms with a Geiger counter or any other radiation detector."
By the late 1920s the tracer technique was being applied to humans in Boston by researchers using an injection of dissolved radon to measure the rate of blood circulation, an early example of using radioactivity to observe life processes. However, research opportunities were limited by the fact that some of the elements that are most important in living creatures do not possess naturally occurring radioactive isotopes.
The answer to this problem came simultaneously at faculty clubs and seminars in Berkeley and Boston in the early 1930s. Medical researchers realized that the famed "atom smasher," the cyclotron invented by University of California physicist Ernest Lawrence, could be used as a factory to create radioisotopes for medical research and treatment. "Take an ordinary needle," Our Friend the Atom explained, "put it into an atomic reactor for a short while. Some of the ions contained in the steel will capture a neutron and be transformed into a radioisotope of iron. . . . Now that needle could be found in the proverbial haystack without any trouble."
In 1936, two of Lawrence's Berkeley colleagues, Drs. Joseph Hamilton and Robert Stone, administered radiosodium to treat several leukemia patients. In 1937, Ernest Lawrence's brother, physician John Lawrence, became the first to use radiophosphorus for the treatment of leukemia. This application was extended the following year to the treatment of polycythemia vera, a blood disease. This method soon became a standard treatment for that disease.

In 1938, Hamilton and Stone also began pioneering work in the use of cyclotron-produced neutrons for the treatment of cancer. The following year, not long before the war in Europe began, Ernest Lawrence unveiled a larger atom smasher, to be used to create additional radioisotopes and hence dubbed the "medical cyclotron." The discovery that some radioisotopes deposited selectively in different parts of the body--the thyroid, for example--inspired a spirited search for a radioactive "magic bullet" that might treat, or even cure, cancer and other diseases.
In Cambridge, the age of "nuclear medicine" is said to have begun in November 1936 with a lunchtime seminar at Harvard, at which MIT President Karl Compton spoke on "What Physics Can Do for Biology and Medicine." Robley Evans, by that time at MIT, is reported to have helped prepare the portion of the talk that led medical researchers at the Massachusetts General Hospital's thyroid clinic to realize that MIT's atom smasher could produce a great research tool for their work--radioisotopes.
Soon, doctors at the thyroid clinic began a series of experiments, including some involving humans, that would lead to the development of radioiodine as a standard tool for diagnosing and treating thyroid disease.
In late 1938, the discovery of atomic fission in Germany prompted concern among physicists in England and the United States that Nazi Germany might be the first to harness the power of the atom--as a propulsion method for submarines, as radioactive poison, or most worrisome of all, as a bomb capable of unimagined destruction. In the United States, a world-famous physicist, Albert Einstein, and a recent émigré from Hungary, Leo Szilard, alerted President Franklin D. Roosevelt to the military implications of the German discovery in an August 1939 letter.
Roosevelt assigned his own science adviser, Vannevar Bush, to the task of determining the feasibility of an atomic bomb; his simple "O.K.," scrawled on a piece of paper, set in motion the chain of events that would lead to the largest and most expensive engineering project in history. Soon, Ernest Lawrence's Radiation Laboratory and its medical cyclotron were mobilized to aid in the nationwide effort to build the world's first atomic bomb. In a related effort, Drs. Stone and Hamilton, and others, would turn their talents to the medical research needed to ensure the safety of those working on the bomb.
On August 6, 1945, when the atomic bomb was dropped on Hiroshima, the most sensitive of secrets became a symbol for the ages. A week later, the bomb was the subject of a government report that revealed to the public the uses of plutonium and uranium. Immediately, debate began over the future of atomic energy. Could it be controlled at the international level? Should it remain entirely under control of the military? What role would industry have in developing its potential? Although American policymakers failed to establish international control of the bomb, they succeeded in creating a national agency with responsibility for the domestic control of atomic energy.
The most divisive question in the creation of the new agency that would hold sway over the atom was the role of the military. Following congressional hearings, the Atomic Energy Commission was established by the 1946 McMahon Act, to be headed by five civilian commissioners. President Truman appointed David Lilienthal, former head of the Tennessee Valley Authority, as the first chairman of the AEC, which took over the responsibilities of the Manhattan Engineer District in January 1947.
Also in 1947, under the National Security Act, the armed services were put under the authority of the newly created National Military Establishment (NME), to be headed by the secretary of defense. In 1949 the National Security Act was amended, and the NME was transformed into an executive department--the Department of Defense. The Armed Forces Special Weapons Project, which would coordinate the Defense Department's responsibilities in the area of nuclear weapons, became the military heir to the Manhattan Engineer District. The Military Liaison Committee was established as an intermediary between the Atomic Energy Commission and the Defense Department and to help set military requirements for the number and type of nuclear weapons needed by the armed services.
Even before the AEC officially assumed responsibility for the bomb from the Manhattan Project, the Interim Medical Advisory Committee, chaired by former Manhattan Project medical director Stafford Warren, began meeting to map out an ambitious postwar biomedical research program. Former Manhattan Project contractors proposed to resume the research that had been interrupted by the war and to continue wartime radiation effects studies upon human subjects.
In May 1947, Lilienthal commissioned a blue-ribbon panel, the Medical Board of Review, that reported the following month on the agency's biomedical program. In strongly recommending a broad research and training program, the board found the need for research "both urgent and extensive." The need was "urgent because of the extraordinary danger of exposing living creatures to radioactivity. It is urgent because effective defensive measures (in the military sense) against radiant energy are not yet known." The board, pointing to the AEC's "absolute monopoly of new and important tools for research and important knowledge," noted the commensurate responsibilities--both to employees and others who could suffer from "its negligence or ignorance" and to the scientific world, with which it was obliged to "share its acquisitions . . . whenever security considerations permit."

In the fall of 1947, as recommended by the Medical Board of Review, the AEC created a Division of Biology and Medicine (DBM) to coordinate biomedical research involving atomic energy and an Advisory Committee for Biology and Medicine (ACBM), which reported directly to the AEC's chairman.
Not surprisingly, the DBM and ACBM became gathering places for the luminaries of radiation science. The ACBM was headed by a Rockefeller Foundation official, Dr. Alan Gregg. Dr. Shields Warren, a Harvard-trained pathologist, was chosen to serve as the first chief of the DBM. Warren, as we shall see, would play a central role in developments related to radiation research and human experimentation.
In the 1930s, focusing on cancer research, and influenced by the work of Hevesy and the pioneering radioisotope work being done in Berkeley and Boston, Warren turned to the question of the effects of radiation on animals and the treatment of acute leukemia, the "most hopeless . . . of tumors at that time." As the war neared, Warren enlisted in the Naval Reserve. He continued medical work for the Navy, turning down an invitation to join Stafford Warren (no relation) on "a project . . . that he couldn't tell me anything about [the Manhattan Project]."
While most of the AEC's budget would be devoted to highly secret weapons development and related activities, the biomedical research program represented the commission's proud public face. Even before the AEC opened its doors, Manhattan Project officials and experts had laid the groundwork for a bold program to encourage the use of radioisotopes for scientific research, especially in medicine. This program was first presented to the general public in a September 1946 article in the New York Times Magazine. The article began dramatically by describing the use of "radioactive salt" to measure circulation in a crushed leg, so that a decision on whether to amputate below or above the knee could be made.
By November 1946, the isotope distribution program was well under way, with more than 200 requests approved, about half of which were designated for "human uses." From the beginning, the AEC's Isotope Division at Oak Ridge had in its program director, Paul Aebersold, a veritable Johnny Appleseed for radioelements.
In presentations before the public and to researchers, Aebersold, dubbed "Mr. Isotope," touted the simplicity and low cost with which scientists would be provided with radioisotopes: "The materials and services are made available . . . with a minimum of red tape and under conditions which encourage their use." At an international cancer conference in St. Louis in 1947, the AEC announced that it would make radioisotopes available without cost for cancer research and experimental cancer treatment. This, Shields Warren later recalled, had a "tremendous effect" and "led to a revolution in the type of work done in this field."
To AEC administrators, Aebersold emphasized the benefits to the AEC's public image: "Much of the Commission's success is judged by the public and scientists . . . on its willingness to carry out a wide and liberal policy on the distribution of materials, information, and services," he wrote in a memo to the AEC's general manager.
The AEC biomedical program as a whole also provided for funding of cancer research centers, research equipment, and numerous other research projects. Here, too, were advances that would save many lives. Before the war, radiotherapy had reached a plateau, limited by the cost of radium and the inability of the machines of the time to focus radiation precisely on tumors to the exclusion of surrounding healthy tissue.
AEC facilities inherited from the Manhattan Project could produce radioactive cobalt, a cheaper substitute for radium. In addition, the AEC's "teletherapy" program funded the development of new equipment capable of producing precisely focused high-energy beams.
The AEC's highly publicized peacetime medical program was not immune to the pressures of the Cold War political climate. Even young researchers in the AEC Fellowship Program conducting nonclassified research were subject to Federal Bureau of Investigation review, despite protests from commission members.
Congressionally mandated Cold War requirements such as loyalty oaths and noncommunist affidavits, Chairman Lilienthal declared, would have a chilling effect on scientific discussion and could damage the AEC's ability to recruit a new generation of scientists. The reach of the law, the Advisory Committee for Biology and Medicine agreed, was like a "blighting hand; for thoughtful men now know how political domination can distort free inquiry into a malignant servant of expediency and authoritarian abstraction." Nonetheless, the AEC accepted the congressional conditions for its fellowship program and determined to seek the program's expansion.
The AEC's direct promotional efforts were multiplied by the success of Aebersold and his colleagues in carrying the message to other government agencies, as well as to industry and private researchers. This success led, in turn, to new programs.
In August 1947, General Leslie Groves, who had headed the Manhattan Project, urged Major General Paul Hawley, the director of the medical programs of the Veterans Administration, to address medical problems related to the military's use of atomic energy. Soon thereafter, Hawley appointed an advisory committee staffed by Stafford Warren and other medical researchers. The advisers recommended that the VA create both a "publicized" program to promote the use of radioisotopes in research and a "confidential" program to deal with potential liability claims from veterans exposed to radiation hazards. The "publicized" program soon mushroomed, with Stafford Warren, Shields Warren, and Hymer Friedell among the key advisers.
By 1974, according to VA reports, more than 2,000 human radiation experiments would be performed at VA facilities, many of which worked in tandem with neighboring medical schools--as in the relationship between the UCLA medical school, where Stafford Warren was now dean, and the Wadsworth (West Los Angeles) VA Hospital.
While the AEC's weapons-related work would continue to be cloaked in secrecy, the isotope program was used by researchers in all corners of the land to achieve new scientific understanding and help create new diagnostic and therapeutic tools. It was, however, only a small part of an enormous institution. By 1951 the AEC would employ 60,000 people, all but 5,000 through contractors. Its land would encompass 2,800 square miles, an area equal to Rhode Island and Delaware combined.
In addition to research centers throughout the United States, its operations "extend[ed] from the ore fields of the Belgian Congo and the Arctic region of Canada to the weapons proving ground at Enewetak Atoll in the Pacific and the medical projects studying the after-effects of atomic bombing in . . . Japan." The Isotope Division, however, would employ only about fifty people and, when reactor production time was accounted for, occupy only a small fraction of the commission's budget and resources.