From: Seeing the Light: The Case for Nuclear Power in the 21st Century
On December 7, 2015, seventy-one Nobel Laureate scientists presented a document that spoke in one voice to the United Nations climate summit in Paris. The document they had signed, known as the Mainau Declaration, was a forceful call to action for all the world’s nations. The threat faced by humanity from a warming climate, it stated, was greater than any that had ever existed, with one possible exception.
Indeed, human civilization on planet Earth clearly faces a growing existential crisis. This risk is beyond denial, confirmed many times over by scientific work, and is now fully evident in the climatic changes that people are witnessing where they live and, increasingly, bearing the impacts of, from the Arctic to the equator. Science is conservative in embracing new truths. The requirements for evidence are prodigious and the place of doubt and debate central. Climate change has not only moved well beyond these stages of validation. It has repeatedly exceeded every forecast for its effects…
Yet climate change, for all its risks, is only part of the existential crisis today. There is another type of threat, once thought to be quelled but that has returned with a vengeance and looms over large parts of humanity. Nearly 18,000 people lose their lives every day due to toxic material in the air they breathe. This amounts to 6.5 million deaths annually, which the World Health Organization (WHO) emphasizes is significantly greater than the combined fatalities from HIV/AIDS, tuberculosis, and traffic accidents. Energy production is the main contributor to this global threat to health, above all the burning of fossil fuels, particularly coal…
It turns out, then, that two of the greatest threats facing humanity largely have the same cause. They therefore can be dealt with, at least in part, using the same means…
Nuclear power became available in the immediate wake of the largest, most destructive war ever fought by humanity. For a large number of people, it has been associated with this conflict, with the two monstrous bombs that ended it, ever since. If understandable in some measure, this has ended up as a great misfortune, since nuclear plants could now be everywhere and the climate crisis almost non-existent. Despite having injured or killed the smallest number of people of any major energy source in the past 60-70 years, nuclear is thought by more than a few to be the most risky. On the basis of the most extensive, long-term study, between 56 and 61 deaths (records are less than perfect) and some 4,000 cancers are associated with Chernobyl after three decades, numbers that are dwarfed by the 12,000 who died in London’s killer smog of 1952. This is not to mention the yearly toll of thousands whose lives are lost to extreme weather events today, nor the hundreds of thousands who die from toxic air in Asian cities--deaths that do not require an accident or other extraordinary event. It would seem time to put away the fear that associates nuclear power with nuclear weapons and face the real threats to society that now exist…
Nuclear power will be a growth industry for the 21st century, and this is a good thing. Two-thirds of humanity now lives in countries that have this source of electricity. During the next several decades, the figure will likely rise to three-quarters. In 2015, the world had 436 reactors for power generation. By 2035, the total number could approach 600, and by mid-century, 1,000 or more. Commercial reactors have operated since the 1950s, mainly in advanced nations. This will change: the global nuclear era has begun. Because of it, billions of tons of carbon emissions and lethal air pollution will never be produced. Millions of lives will be saved.
And yet much talk today about climate change and the energy future has ignored the nuclear option. A common tendency is to pose fossil fuels against the ascent of renewables, sources of carbon against non-carbon energy, with natural gas as a bridge between them. But in nations like China and India, with key roles in carbon emissions, nuclear power defines a core part of the effort to replace fossil energy. Just as important will be the expansion of nuclear power into dozens of developing nations for whom coal, not natural gas or renewables, has been the default choice. As this book will show, the future for nuclear power will be larger and more essential than commonly thought.
To many who live in the West or Japan, the idea of a thousand or more reactors in the world may come as a bit of a shock. What about the effects of Chernobyl and the more recent accident at Fukushima Daiichi? This incident in Japan, combined with a great earthquake and tsunami, did cause massive evacuations and did spread the acids of fear and distrust once again in many parts of the world. Nuclear power, it seemed to some, would never again find favor as a major energy option. Humanity would now enter a twilight of the nuclear idols.
Not true. Though of huge impact in Japan, the global significance of Fukushima has been greatly overstated. In the immediate wake of the accident nations did re-evaluate their plans to grow or launch nuclear programs. More than a few considered shutting down or retiring their operating plants. But only a tiny handful ever did so. Moreover, each of these nations—Germany, Belgium, Italy, Switzerland—remains a “nuclear state” in real terms, as they continue to operate research reactors for materials testing and the production of special isotopes for medicine and industry.
In every other country that discussed a phase-out, experts ultimately determined the accident did not warrant such an extreme measure. On the contrary. That 12 other reactors plus a uranium enrichment plant, all affected by the earthquake, shut down safely was viewed as an impressive success. Adding to this view was that these plants endured the largest earthquake ever recorded in this quake-prone country, undergoing levels of ground acceleration far greater than they were designed to withstand. For nations with less risk of quakes and tsunamis--most of the world, after all--the conclusion was all the stronger.
From: The Shape of the New: Four Big Ideas and How They Built the Modern World
Arguably the three most powerful men of the twentieth century never lived to see it. Adam Smith, Karl Marx, and Charles Darwin could hardly have imagined the forms of wealth, revolution, and science that would emerge in their names during the decades after 1900 or the ugly dogmatism, pseudoscience, and staggering brutality. It would surprise them no less to find that their names were familiar to every well-educated person in a world of billions. Had each of them lived only a few decades longer, they might have seen inklings of this. What they could not have guessed was that their formative role in modern history would only grow with time.
Smith, Marx, and Darwin were not kings or military commanders. Nor were they political leaders or religious prophets. They were intellectuals. Their field of effort and the origin of the influence they exerted after their deaths lie in the realm of ideas. The ideas they articulated, in the hands of followers, detractors, and many others, provided the radioactive substance of transformation. It is impossible to talk about the rise of modern economics and the capitalist system—a system that profoundly changed the nature of the world and that is now fully global—without referring to Adam Smith. Marx set loose ideas that sought to destroy this system, that became the inspiration for revolutions and wars that swept away entire societies, changing and also ending the lives of many millions of people. And Darwin? His thought redefined the universe of living things and their relation to human beings, while both radically weakening the explanatory power of religion and radicalizing its reactive response to modernity.
Needless to say, these are not minor developments. They must be considered essential to the “modern,” however defined. Moreover, the conflicts and debates that led to these developments, and the struggles over them, are far from over. If the past two hundred years have revealed anything, it is that engagements over fundamental ideas—those elemental to the building of institutions, to changes in governments and the organization of society, to concepts like individualism and human rights—have not at all receded and show no prospect of an end. The battle over free markets and government power can hardly be called settled. The collapse of the Soviet Union has not erased state control from the globe and turned the world democratic by default. Modern biology has not destroyed fundamentalist religion. The confrontations waged over these primary matters have a continuing history to them that is still lived—in extreme as well as moderate form, today as much as a century ago. No group, nation, or party has definitively won the battle of ideas.
Contemporary society, in short, has been built over time from the materials of thought. We live within institutions and under political systems created and shaped by ideas that often began in the imagination of major thinkers. When first made public, many of these seemed in their time to be so original, sometimes so daring, that they were dismissed as implausible or even dangerous. For a great many of us, meanwhile, it is easy to assume that the society we inhabit has long been in place and sits on firm foundations. We are not quite prepared to accept that fundamentally new ways of looking at the world might come to reshape the reality we inhabit. But they have. Indeed, they are largely the source of our social existence and even a fair part of our beliefs about it. By this is meant not only grand theories about economics, history, and life but ideas concerned with liberty, the individual, the role of religion, education, and, not least, the nation-state. Concepts about these and similar institutions are often called by other names: policies, principles, schemes, plans; but they all come back to basic, underlying philosophies about the nature of society and how it should work. Ideas, therefore, are not mere mental substance. Operating through leaders, the public, interest groups, and ordinary individuals, they are a determining factor in the creation of social reality.
We are hardly the first to argue such a position. One of the twentieth century’s great economists, John Maynard Keynes, completed his most ambitious work, The General Theory of Employment, Interest and Money, with these pointed words:
The ideas of economists and political philosophers, both when they are right and when they are wrong, are more powerful than is commonly understood. Indeed the world is ruled by little else. Practical men, who believe themselves to be quite exempt from any intellectual influence, are usually the slaves of some defunct economist. Madmen in authority, who hear voices in the air, are distilling their frenzy from some academic scribbler of a few years back. I am sure that the power of vested interests is vastly exaggerated compared with the gradual encroachment of ideas. … [S]oon or late, it is ideas, not vested interests, which are dangerous for good or evil. (1936/2009, 383)
We agree with Keynes in most respects. But as we inhabit a later time and have seen a good deal more of history unfold, we would amend his conclusion in a significant way. We would emphasize that economists and political philosophers, important as they are (and as close to Mr. Keynes’s heart as they undoubtedly were), do not compose the entire taxonomy of thinkers who have delivered us to the present. We should not leave out the ideas from such pivotal domains as science and religion, for example. And in this book, we do not. Nor do we ignore the dangers that Keynes refers to, meaning dangers that have come from extreme and often violent interpretations of key ideas.
From: The Chicago Guide to Communicating Science, Second Edition
Science exists because scientists are writers and speakers. We know this, if only intuitively, from the very moment we embark upon a career in biology, physics, or geology. As a shared form of knowledge, scientific understanding is inseparable from the written and spoken word. There are no boundaries, no walls, between the doing of science and the communication of it; communicating is the doing of science. If data falls in the forest, and no one hears or sees it… Research that never sees the dark of print remains either hidden or virtual or non-existent. Publication and public speaking are how scientific work gains a presence, a shared reality in the world.
These basic truths form a starting point. As scientists, we are scholars too, steeped in learning, study, and, yes, competitive fellowship. Communicating is our life’s work—it is what determines our presence and place in the universe of professional endeavor. And so we must accept the duties, as well as the demands and urges (and, fortunately or unfortunately, the responsibilities too) of authorship. But aside from noble sentiment, there are other reasons for being able to communicate well with our intellectual brethren.
No one who aspires to a scientific career can afford to overlook the practical implications of what has just been said. The ability to write and speak effectively will determine, in no uncertain terms, the perceived importance and validity of your work. To a large degree, your reputation will rest on your ability to communicate. The reason to improve your skill in this area, therefore, is not to please English teachers past and present (though these may well haunt us till we shed our mortal coil). It is to gain something very real in the professional world, something of advantage. To communicate well is to engage in self-interest. Another way of saying this is that writing and speaking intelligibly are required forms of professional competence—nothing less.
Contrary to what you may feel, however, based on your own experience and the stories of others, this situation is not a fatal one. Creating and sharing knowledge are truly profound but also eminently performable acts. Indeed, they are among the highest achievements of which human beings are capable. Every time you put finger to keyboard, step up to the podium, or clear your throat in front of a class, you become a full participant in the history of what has clearly become humankind’s most powerful domain of intellectual enterprise.
The purpose of this guide is to help you, the scientist, deal competently, even eloquently, with your role as an author. My intent is to aid you in learning how to feel at home with, and even take significant pride in, the communicating you will do as a member of the greater scientific community. This can be done, as it happens, without torture or torment, golden rules or iron systems. What it does require, among other things, is patience, a willingness to learn from others, and a certain way of looking at authorship.
The Importance of Attitude
Writing, we know, does not always come easily to scientists. Innumerable tales can be told of brilliant researchers whose papers would blind the eye of a freshman composition instructor. Yet, in reality, good writing rarely comes easily to anyone, in any discipline, whether quantum mechanics or art history. Writing is aptly called a skill, or, more accurately, a collection of skills. It is never entirely mechanical and always involves a level of emotional engagement, as well as forbearance and discipline. The Japanese have an excellent proverb for what it takes to learn a skill: Ishi no ue ni, san nen. Three years, standing on a rock.
I’m not suggesting that we try this (1-2 years, with time off for good behavior, should be plenty). But it points in a certain direction. What has our training, as scientists, been like in this area? In fact, a major difference between the humanities and sciences is that composing, critiquing, and revising papers forms a central part of learning in the former, while in the sciences it does not. Moreover, immersing oneself in eloquent writing of the past is also prominent in humanities training, whereas scientific instruction tends to avoid this sort of thing almost entirely. We don’t read Newton (or much of him) in a basic physics class, Linnaeus in a botany course, Lavoisier or Lyell in a chemistry or geology curriculum. Why is this so? The reasons are complex, and have much to do with the recent history of science. But the effects are clear: good writing is something that scientists are supposed to “pick up,” either from a course or two in technical writing while in school, or through osmosis after entering the caffeine-ridden world of professional research.
If formal communication can be intimidating for scientists and engineers, what is the best way to help gain back the upper hand? Much begins with how one thinks about writing in particular and about scientific language in general. To communicate well, you need to feel at least some degree of control over the language you are using. This means a basic awareness that you, the writer, are able to take words and images and create something out of them. It also means an understanding that you are doing this by employing certain forms and structures toward the goal of persuading—telling a story to—a very particular kind of audience.
Too often in science we have the feeling that language is our opponent, something we have to wrestle with and subdue. Technical speech can seem like something hardened and formal that we have to obey, that predetermines a great deal of what we can and cannot say. There is a drop of truth here; scientific writing is generally flat, unromantic, heavily reliant upon pre-existing technical terms and phrases. Journal editors are unlikely to smile favorably at Shakespearean turns of phrase, passionate outbursts, or fanfares to the gods of invention. Yet this hardly describes the whole of the matter. Science may sound anonymous to the ear, but it is fully human and personal to the touch. The calm, declarative “voice” of technical speech is one we must make anew, every time, through a host of choices, a number of which are actually quite flexible. If we look closely enough, we can find many avenues where personal eloquence may be put to practical use. The creative and the individual have a very important dimension in our writing (we’ll say more about this in chapter 4).
At the same time, we scientists have certain advantages over our (distant?) cousins in the humanities. Some of the same aspects that make our language seem flat and formal work in our favor. Abundant use of technical words and phrases does, in fact, mean that pieces of our discourse are prefabricated. There are more moments, that is, during the composition of any paper when a series of words flow easily from the fingers into place, as if by automation. This is not a sign of cybernetic rebirth, but actually something close to the opposite: an intuitive sense of when this is needed or possible. How do we acquire this? The answer is probably not very shocking—by internalizing the discourse of our subject and field. Such can come from long years of reading and reciting (at meetings) the relevant literature, until it becomes second speech. But there are other ways that require far less time, that graduate students can use. We will go over them in chapter 3. The point here is that scientists shouldn’t feel writing is a lonely chore or errand in the wilderness. It is communal at every step and also comes with help.
Much begins and ends with attitude, therefore. Reasonably confident authors see their writing as a deserved opportunity, and they transfer this to the reader. Their science tends to be effective, less hesitant. If, however, you are terrified of writing, it is likely that your writing will terrify others (or worse, inspire humor). On the other hand, if you view the composition of technical papers as an unbounded creative exercise, with enthrallment as its goal, you will meet a quick and scarlet end at the hands of the first editor you encounter. This book has been written to protect you from both fates…
From: Does Science Need a Global Language?
When I first met Ben, I thought he must be in customer service, so friendly and practiced was his smile. In fact, he is a biochemist from Uganda. Very dark-skinned and always neatly dressed with a touch of elegance, he speaks fluent and natural English that bubbles with an East African accent. His eyes have a sharp intelligence that can penetrate solid objects. We were forced colleagues, our boys playing on the same sports team, and so I decided to ask how he became a chemist. Every researcher has a story; his was something more.
“I am lucky to be a scientist,” Ben began, “but my luck was no accident.” Born in 1958, four years before Uganda’s independence from Britain, Ben spent his early years near Mubende, a town northwest of Kampala, where Bantu is spoken. He attended a local district school like most other children and was taught English. His father had worked in the colonial bureaucracy and often spoke the language at home with his son. “He had high hopes for me,” Ben said, without further explanation. “He saved enough to send me to a private academy, where a British man taught.” This man, an expatriate engineer of Indian descent, quickly recognized in Ben an aptitude for math. With the father’s permission, he gave the boy private lessons and much encouragement. “He was a mentor,” Ben noted, “and a lifeline.”
In 1972, the new dictator, Idi Amin, ordered all Asians to leave the country within ninety days. The teacher was forced to flee and never returned. In the face of mounting chaos and murders caused by the regime, Ben’s father sent his son to an uncle in Tanzania and then, with help from other family members, to San Francisco, where a relative owned a small restaurant. Ben was granted refugee status and attended school while working part time in the restaurant; since his English was both excellent and polite, he helped conduct business with suppliers. With his earnings, he eventually enrolled in a community college. Ben’s parents told him he must remain in the United States, so he eventually transferred to the University of Oregon, where a scholarship helped him earn a BS in mathematics and an MS in biochemistry. Chemistry drew him, he said, because of its powers of transformation. “I know this is the ancient view, of the alchemists. But it is true; in chemistry I found a kind of hope.” He studied the biochemistry of plants for his PhD, then took a job with a firm in Chicago.
Since 1990, Ben has specialized in food-related research. When I asked why, he replied, “Because this is what the world needs most.” He has had professional assignments in Brazil, India, Japan, Norway, and elsewhere, and has presented papers at many scientific meetings. He enjoys these meetings a great deal and attends several every year, as he almost always comes away with new research ideas and collaborations. Yet he said he had been thinking about returning to Uganda to teach…“I feel science must be shared,” he said. “It is not mine to keep. I can speak to my countrymen in a language that will not take sides with any group.”
In a globalizing world, language is power. The more human beings and institutions with which we can communicate, the more access to the offerings and agents of the larger world we gain. This may seem merely a matter of numbers, but far more is involved. Language has a role in the oldest dream for a better world: the dream of a universal language that allows people everywhere to commune and work together. It is the vision of a unified humanity, harmony on a planetary scale. In the West, we know this dream through the image of its loss: the biblical story of the Tower of Babel, a great structure erected to reach the heavens, designed no doubt by engineers and scientists of the time, but left incomplete when a jealous God shattered the once-universal language into thousands of tongues that could not understand one another.
What if, after a significant pause, a new chapter and verse might be added to the tale? What if, in our own time, a worthy alternative to Babel has emerged, lacking in arrogance, extending not merely to the empyreal realm but deep into the atom and as far as the distant galaxies? Such questions have already been answered. For the first time in history, science— humanity’s great tower of knowledge—has a global tongue. In truth, it is a global language for numerous domains, with science being one case among many. It is a special case, to be sure, but one whose meaning can’t be probed without an understanding of this larger reality.
Today, close to 2 billion people in over 120 nations speak English at some level of proficiency. This extraordinary number includes a broad spectrum of ability, without any doubt. Yet it testifies to the global draw this language now commands. For the natural sciences, medicine, and large areas of engineering, English utterly dominates in international communication. This does not mean that it rules in every circumstance, in every country. Its dominance has definite limits, being confined mainly to situations with an international or, especially, a global dimension. Yet this is crucial, as we will see, since science has itself entered a new, globalizing era. English, in short, is the global tongue for this era of globalization…
None of this is to say that the situation is final. Major changes can certainly occur during the present century. Yet any such changes would have to reverse a momentum of profound, global extent. Empirically, the dominance of English in science stands beyond question. From lab to classroom, democracy to autocracy, researchers can and do communicate well in a language accepted as a kind of universal currency.
It would be wrong, however, to assume that scientists everywhere possess this coin, or possess it to the same degree. They do not. And as with any form of capital, uneven possession is widespread and means inequality, with large implications. There are realities that a story like Ben’s doesn’t bring to the eye or ear. An international tongue can be a hard master. Those who have it, as Ben did from an early age (“my luck was no accident”), may gain opportunity, mobility, and more. But consider the young Korean biochemist whose English is poor, who must struggle or pay to get her slides translated for an upcoming meeting, to work on her script, pronunciation, anxiety. Those who do not possess command of the dominant tongue find themselves limited, confined, even disenfranchised, ignored. Much forced accommodation exists among scientists who do not know English well. Local tongues and possibly cultures are affected. A language that spreads to many nations is one toward which many millions of people will migrate, perhaps leaving behind part of their native linguistic heritage. Casualties exist, in other words. History (as we will see in chapter 5) suggests, when we examine the past spread of other world languages like Chinese, Arabic, Spanish, and Russian, that such casualties may not be altogether avoidable.
Yet there are huge advantages to learning the dominant tongue, another point taken from history. This is of course a reason why so many scientists and engineers do so without a sense of injustice. They place great value on being able to reach so many others, to gain higher levels of recognition for their work, to join and share this work with the greatest number. If we look at the past, we see this has happened time and again. A language such as Latin or Arabic began its spread as the tongue of conquerors and traders, but then evolved into an unparalleled storehouse of texts and knowledge that remained vital long after the respective empires crumbled away. Few scholars of nature in sixteenth-century Paris or tenth-century Cordoba could have made a contribution without Renaissance Latin or classical Arabic. These languages provided reservoirs into which thinkers from many places added their contributions. As languages of power, they weakened the motive for doing sophisticated work in local tongues. But this did not mean that thinkers had to abandon their speech as a matter of course. The situation has rarely been either/or. For researchers today as well, knowledge of English is often an added skill.
From: Science in Translation: Movements of Knowledge through Cultures and Time
Knowledge, whatever its contents, has always been a mobile form of culture. However one cares to define it—as a body of fact and hypothesis, the product of a specific labor, or an instrument of domination—human understanding, literary or scientific, has undergone enormous passages between peoples and places over the span of history. Its movement has come on the heels of war and conquest, commerce and trade, religious conversion, immigration, discovery. It has resulted, no less significantly, from the travels of itinerant scholars, pilgrims, and adventurers, either in the service of wealthy patrons or of their own curiosity and ambition. The mobilization of knowledge has taken place suddenly, during brief historical periods. It has occurred, more quietly and perhaps more profoundly, across the creep of millennia, as an elemental feature of daily life along national and linguistic borders, both within and between cultures.
Beyond any doubt, the transfer of learning has been critical to the building of societies, those we call “modern” most of all. Time and again, the introduction of new concepts and methods—Roman law, the system of Hindu-Arabic numerals, the abacus, Newtonian physics, linear perspective—has proved the source of new capacities for ordering, directing, and expanding human existence. Placing the knowledge of one people into the hands of another involves the transfer of powers…Such transfer therefore defines a key historical process: it is what scholars really mean when they speak (and they do so often) of “influence” and “impact” between different periods or societies.
Thus a question: how is knowledge rendered mobile? What makes it able to cross boundaries of time, place, and language?...
“Translation,” one soon realizes, is not a word that describes any single activity. As the second oldest profession on the streets of authorship, it is generally conceived in fairly obvious terms, as a matter of rendering the words of one language into those of another, hopefully with little spillage of meaning. Yet this is hardly a definition; it is more in the manner of a description. It deals not at all with the enormous variety and complexity of the transfer itself.