by Richard Rhodes
[Richard Rhodes, historian and author of numerous books including the Pulitzer Prize-winning The Making of the Atomic Bomb, was the keynote speaker at a special dinner held in observance of the 75th anniversary of the discovery of nuclear fission at the American Nuclear Society 2013 Winter Meeting. Many ANS members and others, both in attendance and unable to attend, have expressed a desire to see in print his remarkable presentation on the fundamental technological revolutions and advances of the past century, especially the monumental discovery and application of nuclear technology. The speech is printed in its entirety in the January edition of Nuclear News, and below.]
Karl Compton, the American physicist who was for many years president of the Massachusetts Institute of Technology, liked to tell a story about his sister, who lived in India. She hired an electrician there to make improvements in her house. She didn’t know much about electricity, so she had trouble explaining what she wanted done. Finally she told the electrician, “Oh, you know what’s needed here, just use common sense and do it.” The man shrugged. “Alas, madam,” he said, “common sense is a gift of God. I’m just a humble soul with a technical education.”
I’m a humble soul without even a technical education, but I do know a little about history, particularly the history of technology. This evening I’d like to share with you some of what I’ve learned. You work on the front lines—you are people, as the writer William Burroughs liked to say, who know how to get the Spam to the front lines—Spam in this case being not unwanted advertising but the canned meat product that fed our troops in World War II. I thought you’d enjoy looking back a little at the last hundred-some years. Whatever the Luddites think, what technology accomplished across the past century is remarkable—not the least of which, of course, was the discovery of nuclear fission in December 1938 and its elaboration since then to produce today about 11 percent of world electricity.[i] That’s an extraordinary and, I have to say, an under-appreciated achievement.
In the late 1920s, a newspaper editor named Mark Sullivan reviewed the first quarter of the 20th century in a six-volume compendium of stories and statistics called Our Times. Early in the first volume, Sullivan looked back to the beginning of the 20th century. His portrait is focused on the United States, but it applies equally to the rest of the industrialized world.
“In his newspapers of January 1st, 1900,” Sullivan writes, “the American found no such word as radio, for that was yet 20 years from coming; nor ‘movie,’ for that too was still mainly of the future; nor chauffeur, for the automobile was only just emerging and had been called ‘horseless carriage’ when treated seriously, but rather more frequently, ‘devil-wagon,’ and the driver, the ‘engineer.’ There was no such word as aviator—all that word implies was still a part of the Arabian Nights.…In 1900 doctors had not yet heard of…insulin; science had not heard of relativity or the quantum theory. Farmers had not heard of tractors…nor sailors of oil-burning engines.” Sullivan continues this catalog of a world not yet invented for several more paragraphs, then turns to the condition of the land, finding a landscape far more blighted than nostalgia recalls:
Only the Eastern seaboard [Sullivan writes] had the appearance of civilization having really established itself and attained permanence. From the Alleghenies to the Pacific Coast, the picture was mainly of a country still frontier and of a people still in flux: the Allegheny mountainsides scarred by the axe, cluttered with the rubbish of improvident lumbering, blackened with fire; mountain valleys disfigured with ugly coal-breakers, furnaces, and smokestacks; western Pennsylvania and eastern Ohio an eruption of ungainly wooden oil-derricks; rivers muddied by the erosion from lands cleared of trees but not yet brought to grass, soiled with the sewage of raw new towns and factories; prairies furrowed with the first breaking of sod. Nineteen hundred was in the floodtide of railroad-building: long fingers of fresh dirt pushing up and down the prairies, steam-shovels digging into virgin land, rock-blasting on the mountainsides. On the prairie farms, sod houses were not unusual. Frequently there were no barns, or, if any, mere sheds. Straw was not even stacked, but rotted in sodden piles. Villages were just past the early picturesqueness of two long lines of saloons and stores, but not yet arrived at the orderliness of established communities; houses were almost wholly frame, usually of one story, with a false top, and generally of a flimsy construction that suggested transiency; larger towns with a marble Carnegie Library at Second Street, and Indian tepees at Tenth. Even as to most of the cities, including the Eastern ones, their outer edges were a kind of frontier, unfinished streets pushing out to the fields; sidewalks, where there were any, either of brick that loosened with the first thaw, or wood that rotted quickly: rapid growth leading to rapid change. At the gates of the country, great masses of human raw materials were being dumped from immigrant ships. Slovenly immigrant trains tracked westward. 
Bands of unattached men, floating labor, moved about from the logging camps of the winter woods to harvest in the fields, or to railroad-construction camps.…
One whole quarter of the country, which had been the seat of its most ornate civilization, the South, though it had spots of melancholy beauty, presented chiefly the impression of the weedy ruins of 35 years after the Civil War. . . .
In 1900 the United States was a nation of just under 76 [million] people. . . [ii]
I count two fundamental technological revolutions in the 20th century and the beginnings of two more that are still unfolding. The two 20th century revolutions were in public health and nuclear energy; the two still unfolding are digital information and molecular biology.
Medicine, including public health, began a remarkable advancement in the early years of the century. I had occasion to understand that change some years ago when I wrote a profile of the Mayo Clinic of Rochester, Minnesota. The Mayo Brothers succeeded in part because they began their group practice just when surgery was developing aseptic technique. There was a reservoir of suffering humanity in the world at the beginning of the century; the dam that confined it was medical ignorance. People were afraid to risk abdominal surgery for fear of deadly infection. Blood transfusion had not yet been devised. Long before patients visited the Mayo Clinic from everywhere in the world, crowds of patients came to Mayo from the upper Midwest: chronic gallbladders and infected appendixes misdiagnosed as “colic” and “stomach disease” and “dyspepsia”; tens of thousands of goiters in that region without iodine in the soil (the Mayos treated 37,228 cases of goiter between 1892 and 1934); ovarian cysts that grew so large, filling with fluid, that women sometimes wore special harnesses their farmer husbands made for them to hold their abdomens up (the largest ever removed at Mayo, in 1920, weighed 140 pounds).
Not chronic suffering but stark death from disease was the lot of infants in those days. Life expectancy at the turn of the 20th century for both men and women throughout the industrialized world was less than 50 years—50 years—but that average masks a disproportionate loss in infancy. In the United States in the second half of the 19th century, between 15 and 20 percent of all infants died before their first birthday. In large cities that number reached 30 percent—one out of three. Today infant mortality in the United States—not the most progressive in the developed world—is barely one percent.
No other modern reduction in mortality—not that from controlling tuberculosis, venereal disease, or even epidemic infections—comes close to the reduction in infant mortality. A hundred years ago the New York Times editorialized that “There is no more depressing feature about our American cities than the annual slaughter of little children.”[iii] Much of the annual slaughter came during the summer, and the killer was contaminated milk. The great modern reduction in human mortality was generally the result of improved sanitation and nutrition, but pasteurization of milk was crucial to the reduction in infant mortality in the cities.
You won’t be surprised to hear that food purists in those days as well as many physicians opposed pasteurization, just as food purists today oppose food irradiation even though it would prevent tens of thousands of serious illnesses and save thousands of lives. The arguments offered today against irradiation are the same arguments offered a hundred years ago against pasteurization: that it wasn’t “natural,” that it changed the taste, that it destroyed some mysterious vital factor, that it encouraged farmers and processors to produce an unsanitary product. With pasteurization, at least, wiser heads prevailed. By 1921, pasteurized milk predominated in more than 90 percent of American cities above 100,000 population, and epidemics of summer mortality among infants had essentially ceased.
It’s obvious today that medicine and public health are triumphant technologies, but as late as 1937 in the United States, technology professionals evidently didn’t think of them as such. That year a commission chaired by Secretary of the Interior Harold Ickes reported to President Franklin Roosevelt on “the kinds of new inventions which may affect living and working conditions in America in the next 10 to 25 years.” The commission’s remarkably pedestrian findings made no mention of military, public health, cultural, or ecological consequences.
Such striking omissions partly reflect an understandable preoccupation with the Great Depression. Millions of people had been thrown out of work. Some of them blamed machines. I find numerous attacks on industry and technology in the writings of that era, and a few sturdy champions. An essayist, George Boas, defended technology articulately:
We are first told [Boas wrote] that though man invented [machines] to be his servants, he has become theirs.… This argument is a gross exaggeration. Man is no more a slave of his machines now than he has ever been, or than he is to his body, of which they are…an extension. A farmer is certainly as much of a slave to his primitive plow or sickle as a factory hand to his power loom or engine.…Steam undoubtedly produces much of the ugliness and dirt of our cities, but we are not for the moment discussing the aesthetic aspects of the question. Why steam is more mechanical than wind or falling water or muscle-driven hammers is somewhat obscure. A sailboat, a rowboat, an inflated goatskin, a log are all equally machines. A linotype, a hand-press, a pen, a reed, a charred stick are all machines. They are all mechanical supplements to man’s corporeal inadequacies.…
When I have pointed this out in conversation with primitivistic friends [Boas continues] I have been invariably charged with sophistry. They have always insisted that my definition of “machine” was too broad. My answer is that the only alternative they offer, arbitrarily identifies a machine with a bad machine.…
As one digs into this discussion, one finds the instinctive hatred that many people have always had for innovation. We do not hate machines, we hate new machines.… I have heard a gardener in France inveighing against chemical fertilizers which [violate the earth], as if horse manure were non-chemical. Sailors in the windjammers railed against the steamboat, and steamboat crews think none too kindly of the johnnies who sail oil-burners. Greek and Roman literature is full of invective against any kind of navigation, for it takes the pine tree off its mountain top and sends men wandering.[iv]
“Obviously,” Boas concludes, “a new machine, like an old one, must be judged on its merits, not on its novelty.”
Here, I would add, is one basis for the continuing hostility among some of our citizens to nuclear power, a truly novel new source of energy that only emerged to the light of day 75 years ago next month—not a long time where great energy transitions are concerned. When the Elizabethan English had cut down the forests around London so extensively that wood had become prohibitively expensive to transport, and therefore had to begin the transition from wood to coal, which had been little used before, you wouldn’t believe the outcry from pulpit and parliament. Preachers argued that coal was literally the Devil’s excrement—it was black and dirty, after all, it stank of sulfur, and it was obviously unsuitable to burn in English fireplaces, which in those days often lacked chimneys so that the sweet woodsmoke they formerly produced could waft through Elizabethan houses, harden the rafters, and sweeten the air. Beef roasted over coal fires was nearly inedible. The English only began accepting coal as a substitute for wood when Queen Elizabeth died and was succeeded on the throne by the Scottish James I. The Scots had transitioned to coal earlier, and their coal was less sulfurous; when the king began burning coal it became fashionable, easing the transition—and incidentally, beginning the chain of technological developments that led to the industrial revolution.
Energy transitions are tough.
Public health is an organizational technology—software rather than hardware. But judged on its merits, public health was by far the most important technological development of the past century—dependent, of course, on progress in biology. Two American demographers, Kevin White and Samuel Preston, took its measure in a 1996 study that asked the question, “How many Americans are alive because of 20th century improvements in mortality?”[v] Their surprising conclusion applies with equal validity to the rest of the developed world.
“Mortality reduction throughout the world,” they write, “has been more rapid in the 20th century than in any previous period. The expansion in longevity ranks among the great social achievements of our time. Life expectancy at birth in the United States has increased from 47.3 years in 1900 to 75.7 years in 1994.” Today in the United States it’s almost 79 years. But White and Preston found their most startling result when they asked the question in the title of their paper:
If mortality had remained at 1900 levels throughout the [20th] century, holding everything else constant, the population [of the United States] in the year 2000 would be almost exactly half its actual size: 139 million people instead of 276 million. Half of Americans today can attribute their being alive to mortality improvements in the 20th century: 51 percent of females and 49 percent of males.…
Half [of that half] represent…those who would have been born but would subsequently have died [before they were old enough to reproduce].… [And therefore,] most of the additional people below age 30 would never have been born. They are the indirect beneficiaries of mortality reductions among their mothers, grandmothers, and great-grandmothers.
In other words, public health saved more lives in the 20th century in the United States alone than were lost throughout the world in all that century’s terrible wars, when losses of combatants and civilians are estimated to have totaled approximately 120 million deaths. And millions more lives would have been saved had the benefits of public health extended beyond the developed world to developing countries as well. That humane extension is still in progress. Smallpox has been eradicated; polio eradication is almost complete; measles will follow and other diseases as well.
War is one significant cause of premature death. So war is another problem in public health. In the first half of the 20th century, war was a seemingly intractable problem, escalating in destructiveness as governments improved its technologies and widened its acceptable range of victims. A graph of man-made deaths from war and war’s attendant privation in the 20th century shows annual peaks in the low millions during the First World War and the Russian Revolution, a huge peak four times as high during the Second World War—15 million deaths in 1943, partly from combat and privation, partly from the Holocaust—and then an abrupt drop-off after 1945 to a smoldering one or two million deaths annually ever since—nothing to be proud of, to be sure, but only about one-fifth as many as the annual toll of deaths from smoking. The world would celebrate a comparable drop-off in a disease epidemic and judge it to be clear evidence that the epidemic was being brought under control. Yet the knowledge of how to release nuclear energy—knowledge embodied in weapons so deadly that no nation has dared to explode one in anger since the end of the Second World War—goes largely uncelebrated. Who can doubt that such knowledge put an end to world-scale war? What else explains the abrupt decline in man-made deaths from war after 1945?
I said earlier that nuclear energy was one of two profound technological revolutions our century has seen. I mean first of all its effect on the arbitrary exercise of power by nation-states. At the end of the Second World War, many people believed that the only way to prevent another such disaster was to install over the national governments that confronted each other in international anarchy, a world government armed with nuclear weapons—a truly frightening notion. A few visionaries had a better idea, embodied in a 1946 U.S. government document called the Acheson-Lilienthal Report. That report was prepared for President Truman by a committee of scientists, engineers, and industrialists familiar with the work of the Manhattan Project, a committee that included the American theoretical physicist and former Los Alamos lab director Robert Oppenheimer. Through Oppenheimer, the Nobel laureate physicist I. I. Rabi contributed ideas indirectly to its formulation, as did the great Danish physicist Niels Bohr.
The Acheson-Lilienthal Report envisioned a world where a distributed network of nuclear knowledge and infrastructure guarded the peace, where many countries conducted and benefited from nuclear research and nuclear power, where many, if not most, were therefore capable of, and had the materials for, building nuclear weapons in a matter of months, but where, by mutual agreement, no tangible arsenals of such weapons were stockpiled. Given sufficient transparency, technical monitoring, and intrusive inspection, the agreement would have policed itself, since any country that began building nuclear weapons would essentially have been declaring war, an act that would have triggered a similar response from others, effectively nullifying the escalation at a higher level of risk. Although the recommendations of the Acheson-Lilienthal Report were rejected at the United Nations, we have nevertheless voluntarily, because of the obvious benefits, moved a long way in the direction of installing such a world. We’re not there yet, but we’re not all that far away.
The main difference between the vision of the Acheson-Lilienthal Report and the real world we live in, of course, is that nine nations have in fact built and stockpiled nuclear weapons—10, if you count South Africa, the only nation that also, in 1993, dismantled and renounced the small arsenal it had built. Our confidence falters when new nuclear powers emerge, as North Korea did a decade ago, but everyone here knows that many more countries could similarly burden themselves with the expense and reprobation and increased insecurity of actual nuclear arsenals if they chose. The marvel isn’t that we have several new nuclear powers in the wake of the Cold War; the marvel is that we don’t have dozens. When, in the years ahead, the declared nuclear powers come to trust that the world will be a safer place with three months’ delivery time from factory to target than it is with 30 minutes’ delivery time from submarines and missile silos, then the vision of distributed deterrence that the Acheson-Lilienthal Report described in 1946 will be fulfilled. Then, as Niels Bohr liked to say, nations can compete with the magnitude of their good works rather than threaten with their arsenals.
Not long before his death Joseph Rotblat, the Nobel Peace Prize laureate, told Jonathan Schell that “The main enemy now is poverty, which we don’t need a war to fight.”[vi] I agree. The ultimate cause of conflict in the world today is surely structural violence—meaning violence that’s built into the structure of societies by limitations and restrictions on development. Structural violence is mortality that vaccination could prevent if preventive medicine were more equitably distributed. Structural violence is malnutrition from poverty from lack of infrastructure such as roads, education, and energy that a more equitable distribution of resources might supply. Structural violence is the average 10 years’ shorter lifespan of African-Americans in the United States, a number that quantifies the effect of long years of racial discrimination in this country and that has its counterparts in racial and ethnic conflicts in other places. Structural violence is the massive unemployment and diminished prospects of young men in the Middle East and North Africa, the breeding grounds of terrorism.
In a paper published almost 50 years ago, two aerospace engineers, T. J. Gordon and A. L. Shef, examined the effect of technology on human progress—that is, on the alleviation of structural violence. They found surprising regularities. “The technological status of the world as a whole,” they wrote, “advances at a roughly constant exponential rate, doubling every 20 years, or in effect every generation. Although slight temporal differences exist from an overall viewpoint, growth rate from at least the beginning of the 20th century has been relatively constant for the world as a whole. Furthermore, the present [meaning 1968] technological status of the world is roughly equivalent to the level of the United States alone at the beginning of the 20th century.” Gordon and Shef found, remarkably, that the technological growth rates for developed and developing countries were approximately the same. That finding might imply that developing countries would always lag relatively behind, but by 1965, the two engineers noted, Japan had managed to increase its technological growth rate sufficiently to cross over from developing to developed, and China was on its way.
Gordon and Shef drew several intriguing conclusions from their study, conclusions that would still seem to apply today. They found that technology is growing exponentially, with the technological index approximately doubling every 20 years. They found that the rate of growth of technology appears to be accelerating. National programs, they found, control the content of technology, not its rate of production.[vii] These findings remind me of the work of the Italian physicist Cesare Marchetti, who has identified remarkable regularities in human activity by starting with the assumption, in Marchetti’s words, “that society is a learning system, that learning is basically a random search with filters, and that random searches are characterized by logistic functions”[viii]—that is, by growth curves like those common to biological forms.
The ultimate goal of technology is the alleviation of human suffering. That admirable morality is inherent in the technological enterprise, not added on. The scholar Elaine Scarry, echoing Francis Bacon, defines the function of human imagination embodied in invention as “the progressive materialization of the world.” Out of the silence of the inanimate we shape material objects so as to inform them with human purpose:
The naturally existing external world [Scarry writes]—whose staggering powers and beauty need not be rehearsed here—is wholly ignorant of the “hurtability” of human beings. Immune, inanimate, inhuman, it indifferently manifests itself in the thunderbolt and hailstorm, rabid bat, smallpox microbe, and ice crystal. The human imagination reconceives the external world, divesting it of its immunity and irresponsibility not by literally putting it in pain or making it animate but by, quite literally, “making it” as knowledgeable about human pain as if it were itself animate and in pain.…
The general distribution of material objects to a population means that a certain minimum level of objectified human compassion is built into the revised structure of the external world and does not depend on the day-by-day generosity of other inhabitants.… It is almost universally the case in everyday life that the most cherished object is one that has been handmade by a friend; there is no mystery about this, for the object’s material attributes themselves record and memorialize the intensely personal, extraordinary because exclusive, interior feelings of the maker for just this person—this is for you. But anonymous, mass- produced objects contain a collective and equally extraordinary message: Whoever you are, and whether or not I personally like or even know you, in at least this small way, be well.[ix]
Nuclear energy, by offering essentially unlimited energy to the human project, promises equally exceptional alleviation, particularly of the structural violence that follows from inequalities in the distribution of material resources. Adding to the energy supply is a rising tide that lifts all boats. I personally believe that those who oppose increasing the supply of nuclear power are more than simply misinformed and elitist. I believe strongly that their opposition is immoral. It contributes to human suffering and premature death by perpetuating structural violence.
David Lilienthal, the first chairman of the U.S. Atomic Energy Commission and the Lilienthal of the Acheson-Lilienthal Report, spoke to this point long ago. “Energy is part of a historic process,” he said, “a substitute for the labor of human beings. As human aspirations develop, so does the demand for and use of energy grow and develop.”[x]
Satisfying human aspirations is what our species invents technology to do. Some people, secure in comfortable affluence, may dream of a simpler and smaller world. However idealistic they imagine such a dream to be, its hidden agenda is brutalizing. Millions of children still die every year in our resource-rich world for lack of adequate resources—clean water, food, medical care. The development of those resources is directly dependent on energy supplies. The real world of real human beings needs more energy, not less. As oil and coal continue their historic decline, as climate change accelerates, that energy across at least the next 50 years will necessarily come from nuclear power and natural gas.
Nuclear energy is an important part of the answer to climate change, of course. It’s poised to take off in Asia, as it originally did in the United States, partly as a remedy for noxious air pollution from coal burning, partly to meet the increasing demand for electricity from populations working and moving in the direction of greater prosperity.
To that point, and consistent with my emphasis on the public-health benefits of adequate supplies of energy, Pushker Kharecha and James Hansen, of the Columbia University Earth Institute and the NASA Goddard Institute for Space Studies, recently published in the journal Environmental Science & Technology a detailed estimate of the effect nuclear power has had on preventing deaths related to air pollution. They estimate that global nuclear power production for the historical period 1971 to 2009 prevented some 1.84 million deaths from air pollution by replacing the burning of coal and natural gas. They estimate further that nuclear power could additionally prevent between 420,000 and 7.04 million deaths between 2010 and 2050, depending on which fuels it replaces. These public-health effects are in addition to its effects, past and future, mitigating climate change in comparison to fossil fuels, natural gas in particular.[xi] It has long seemed to me important, in discussing nuclear power, to emphasize its public-health effects. Years ago I interviewed the president of Duquesne Light Company, the company in Pittsburgh that built the first commercial nuclear power plant in the United States at Shippingport. He told me that the most important argument for building the plant had been its mitigating effect on the terrible coal smoke pollution around Pittsburgh. It was, he said, the greenest available energy. It still is.
How digital technology and genetic engineering will change the world we are only dimly beginning to see. The changes will be deep, perhaps as deep as the changes from the discovery of nuclear fission have been. But it’s incontrovertible that public health and nuclear energy have already saved and improved hundreds of millions of human lives. Of course both technologies have problems, as all technologies do—after all, they’re the work of humble souls with technical educations.
Technology has taken a beating across the modern era. It deserves better press. In the midst of your meetings, I hope you’ll pause occasionally to recall the value and the virtue of your work. I hope you’ll remind yourselves that the wholly honorable purpose of your enterprise is nothing less than the alleviation of human suffering.
Attendees at the 75th Anniversary Dinner
[i] 11 percent of world electricity: World Nuclear Association (online).
[ii] Mark Sullivan, Our Times (Vol. I: The Turn of the Century). Charles Scribner’s Sons, 1926, pp. 22-31.
[iii] Quoted in Richard A. Meckel, Save the Babies, Johns Hopkins University Press, 1990, p. 11.
[iv] George Boas, “In Defense of Machines,” Harper’s 165 (June 1932).
[v] Kevin M. White and Samuel H. Preston, “How many Americans are alive because of twentieth-century improvements in mortality?” Population and Development Review 22(3): 415-428 (September 1996).
[vi] Jonathan Schell, “The Gift of Time,” The Nation, February 2/9, 1998, p. 29.
[vii] T. J. Gordon and A. L. Shef, “National Programs and the Progress of Technological Societies,” in Philip K. Eckman, ed., Technology and Social Progress—Synergism or Conflict? AAS Science and Technology Series, Vol. 18, Proceedings of the Sixth AAS Goddard Memorial Symposium held March 12-13, 1968, Washington DC, AAS Publications Office, 1968, pp. 105-109.
[viii] Cesare Marchetti, “Society as a Learning System: Discovery, Invention, and Innovation Cycles Revisited.” Technological Forecasting and Social Change 18, 267-282 (1980), p. 268.
[ix] Elaine Scarry, The Body in Pain, Oxford University Press, 1985, pp. 288-292.
[x] David Lilienthal, Atomic Energy: A New Start, Harper & Row, 1980, p. 10.
[xi] Pushker A. Kharecha and James E. Hansen, “Prevented Mortality and Greenhouse Gas Emissions from Historical and Projected Nuclear Power,” Environ. Sci. Technol. 47, 4889-4895 (2013).
Richard Rhodes is a historian and best-selling author of numerous books, including The Making of the Atomic Bomb which won a Pulitzer Prize in Nonfiction, a National Book Award, and a National Book Critics Circle Award.