The Scientific Citizen
Let us start with a flashback to the 1660s—to the earliest days of one of the world’s first scientific academies, the Royal Society of London. Christopher Wren, Robert Hooke, Samuel Pepys, and other “ingenious and curious gentlemen” (as they described themselves) met regularly. Their motto was to accept nothing on authority. They did experiments; they peered through newly invented telescopes and microscopes; they dissected weird animals. But, as well as indulging their curiosity, they were immersed in the practical agenda of their era: improving navigation, exploring the New World, and rebuilding London after the Great Fire.
Today, science has transformed our lives. Our horizons have hugely expanded; no new continents remain to be discovered. Our Earth no longer offers an open frontier, but seems constricted and crowded—a “pale blue dot” in the immense cosmos. Despite all that has changed, the values and attitudes of the Royal Society’s founders are enduring ones. Today’s scientists are specialists, not polymaths. But, like their forebears, they probe nature and nature’s laws by observation and experiment; and they should also engage broadly with society and with public affairs.
Indeed, their engagement is needed more than ever before. But science isn’t just for scientists. We should all have a voice in ensuring that it’s applied ethically, and to the benefit of both the developing and the developed world. We must confront widely held anxieties that genetics, brain science, and artificial intelligence may “run away” too fast. As citizens, we all need a feel for how much confidence can be placed in science’s claims.
Science as culture
Quite apart from its pervasive impact on our lives, science is part of our culture. In 2009 the bicentenary of Charles Darwin’s birth was celebrated—especially, of course, in his home country, but also around the world. Darwin’s impact on nineteenth-century thought was profound, but it resonates even more today. His concept of natural selection was described by Dan Dennett, with only slight hyperbole, as “the best idea anyone ever had.” His insights are pivotal to our understanding of all life on Earth, and the vulnerability of our environment to human actions. Other sciences have disclosed the nature of atoms, DNA, and the stars, and have enlarged our cosmic horizons. It’s a cultural deprivation not to appreciate the panorama offered by modern cosmology and Darwinian evolution—the chain of emergent complexity leading from some still-mysterious beginning to atoms, stars, and planets—and how, on our planet, life emerged and evolved into a biosphere containing creatures with brains able to ponder the wonder of it all. This common understanding should transcend all national differences—and all faiths too.
Science is indeed a global culture. Its universality is especially compelling in my own subject of astronomy. The dark night sky is an inheritance we’ve shared with all humanity, throughout history. All have gazed up in wonder at the same vault of heaven, but interpreted it in diverse ways. There is a natural fascination with the big questions: Was there a beginning? How did life emerge? Is there life in space? And so forth.
The simplest building blocks of our world—atoms—behave in ways physicists can understand and calculate. And the laws and forces governing them are universal: atoms behave the same way everywhere on Earth—indeed they are the same even in the remotest stars. We know these basics well enough to enable engineers to design all the mechanical artifacts of our modern world, from radios to rockets. Our environment is far too complicated for every detail to be explained but our perspective has been transformed by great insights—great unifying ideas. The concept of continental drift, for instance, helps us to fit together a whole raft of geological and ecological patterns across the globe. And Darwin’s great insight revealed the overarching unity of the entire web of life on our planet.
There are patterns in nature. There are even patterns in how we humans behave—in how cities grow, how epidemics spread, and how technologies like silicon chips develop. The more we understand the world, the less bewildering it becomes, and the more we’re able to change it.
These “laws” or patterns are the great triumphs of science. To discover them required dedicated talent—even genius. But to grasp their essence isn’t so difficult: most of us appreciate music even if we can’t compose or perform it. Likewise, the key ideas of science can be accessed and enjoyed by almost everyone; the technicalities may be daunting, but these are less important for most of us and can be left to the specialists.
Politics and scientific uncertainty
The twenty-first century is a crucial one. The Earth has existed for 45 million centuries but this is the first period in which one species, ours, can determine, for good or ill, the future of the entire biosphere. Over most of history, the threats have come from nature—disease, earthquakes, floods, and so on. But now they come from us. We’ve entered what some have called the “Anthropocene” era.
But despite the concerns, there are powerful grounds for optimism. For most people in most nations, there’s never been a better time to be alive. The innovations driving economic advance—information technology, biotechnology, and nanotechnology—can boost the developing as well as the developed world. Creativity in science and the arts is nourished by a wider range of influences—and is accessible to hugely more people worldwide—than in the past. We’re becoming embedded in a cyberspace that can link anyone, anywhere, to all the world’s information and culture, and to most other people on the planet. Twenty-first-century technologies can offer everyone a lifestyle that requires little compromise on what Americans and Europeans aspire to today, while being environmentally benign and making lower demands on energy and resources.
Some changes happen with staggering speed. Everyday life has been transformed in less than two decades by mobile phones and the Internet. Computers double their power every two years. Spin-offs from developments in genetics could soon be as pervasive as those we’ve already seen from the microchip. These rapid advances, and others across the whole of science, raise profound questions. Who should access the readout of our personal genetic code? How will our lengthening life spans affect society? Should we build nuclear power stations, or wind farms, if we want to keep the lights on? Should we use more insecticides, or plant genetically modified (GM) crops? Should the law allow “designer babies”? How much should computer technology be permitted to invade our privacy?
Such questions don’t feature much in national election campaigns in any country. That’s partly because they transcend party politics, but it’s also because they’re long-term, and tend to be trumped by more urgent items on political agendas. But often science has an urgent impact on our lives. Governments and businesses, as well as individuals, then need specialist advice—advice that fairly presents the level of confidence and the degree of uncertainty.
Issues come up unexpectedly. For instance, in April 2010, the volcanic eruptions in Iceland that disrupted air travel in Northern Europe raised urgent questions about volcanology, about wind patterns, and about how volcanic dust affects jet engines. In that instance, the knowledge was, basically, there; what was lacking was coordination, and an agreement on the acceptable level of risk. After this episode, a code of practice was agreed that should ensure that future events of this kind are handled more smoothly. Sometimes, though, the key science isn’t known. An interesting example was the outbreak of BSE, or “mad cow disease,” in Britain in the 1980s. At first, experts conjectured that this disease posed no threat to humans because it resembled scrapie in sheep, which had been endemic for 200 years without crossing the species barrier. That was a reasonable conjecture, and comforting to politicians and the public, but it proved wrong. The pendulum then swung the other way. Banning “beef on the bone,” for instance, was, in retrospect, an overreaction, but at the time seemed a prudent precaution against a potential tragedy that could have been far more widespread than it actually turned out to be.
Likewise, governments were prudent to stockpile vaccines against swine flu—even though, fortunately, the epidemics have so far proved milder than feared. Indeed, if we apply to pandemics the same prudent analysis whereby we calculate an insurance premium—multiplying probability by consequences—we’d surely conclude that measures to alleviate this kind of extreme event should actually be scaled up. (And these measures need international cooperation. Whether or not an epidemic gets a global grip may hinge, for instance, on how quickly a Vietnamese poultry farmer can report any strange sickness.)
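To make that “insurance premium” logic concrete, here is a minimal sketch of the probability-times-consequences arithmetic, with purely illustrative numbers that are not drawn from this essay:

```python
# Expected-loss ("insurance premium") arithmetic: probability multiplied by consequences.
# The probabilities and costs below are hypothetical placeholders, chosen only to illustrate the point.

def expected_annual_loss(annual_probability: float, cost_if_it_happens: float) -> float:
    """Expected loss per year = chance of the event in a given year x the cost if it occurs."""
    return annual_probability * cost_if_it_happens

# A low-probability, high-consequence event (say, a severe pandemic)...
rare_catastrophe = expected_annual_loss(annual_probability=0.01, cost_if_it_happens=5e12)  # $5 trillion
# ...compared with a familiar, modest, frequently realized risk.
routine_hazard = expected_annual_loss(annual_probability=0.5, cost_if_it_happens=1e9)      # $1 billion

print(f"Rare catastrophe: expected loss ~${rare_catastrophe:,.0f} per year")
print(f"Routine hazard:   expected loss ~${routine_hazard:,.0f} per year")
# Even at a 1 percent annual probability, the rare event dominates the expected loss,
# which is the argument for scaling up precautionary measures against it.
```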
Incidentally, there’s a mismatch between public perception of very different risks and their actual seriousness. We fret unduly about carcinogens in food and low-level radiation. But we are in denial about low-probability, high-consequence events that should concern us more. The recent financial crash was one such event; but others that haven’t yet happened—lethal pandemics are one example—should loom higher on the agenda.
The varied topics mentioned above show how pervasive science is, in our lives and in public policy. President Obama certainly recognized this when he filled some key posts in his administration with a dream team of top-rate scientists. He opined that their advice should be heeded “even when it is inconvenient—indeed especially when it is inconvenient.” In Britain we have a chief science advisor, and separate independent advisors in most government departments.
Winston Churchill valued advice from scientists (which was plainly crucial in World War II), but kept them in their place; he insisted that they should be “on tap, not on top.” It is indeed the elected politicians who should make decisions. But the role of scientific advice is not just to provide facts, still less to support policies already decided. Experts should be prepared to challenge decision-makers, and help them to navigate the uncertainties. But there’s one thing that scientific advisors mustn’t forget. Whether the context be nuclear weapons, nuclear power, drug classification, or health risks, political decisions are seldom purely scientific: they involve ethics, economics, and social policies as well. And in domains beyond their special expertise, scientists speak just as citizens, with no enhanced authority.
There’s no denying where science has recently had the most contentious policy impact, and where the long-term stakes are highest: climate change. Climate science is complex, involving a network of intermeshing effects. But there is one decisive piece of evidence: the amount of carbon dioxide in the atmosphere is higher than it’s been for at least half a million years, and it is inexorably rising, mainly because of the burning of fossil fuels. This measurement isn’t controversial. And straightforward chemistry tells us that carbon dioxide is a so-called greenhouse gas; it acts like a blanket, preventing some of the heat radiated by the Earth from escaping freely into space. So the carbon dioxide buildup in the atmosphere will trigger a long-term warming, superimposed on all the other complicated effects that make climate fluctuate.
What is uncertain, however, is the predicted rate of warming—calculations yield a spread of projections, depending on how much the poorly understood “feedback” from water vapor and clouds enhances the blanketing. The high projections are of course more threatening than the low ones, and so much is at stake that it’s crucial to develop better understanding so that predictions can be firmed up. Nonetheless, even the existing science convinces me that the threat of seriously disruptive climate change is significant enough to justify its priority on the political agenda.
This confidence may surprise anyone who has dipped into all that’s been written on the subject. Any trawl of the Internet reveals diverse and contradictory claims. How do you make up your mind? The following analogy suggests an answer.
Suppose you seek medical guidance. Googling any ailment reveals a bewildering range of purported remedies. But, if your own health were at stake, you wouldn’t attach equal weight to everything in the blogosphere: you’d entrust yourself to someone with manifest medical credentials and a successful record of diagnosis. Likewise, we get a clearer “steer” on climate—though not, of course, a complete consensus—by attaching more weight to those with serious credentials in the subject. But, as already noted, it’s crucial to keep “clear water” between the science on the one hand and the policy response on the other. Risk assessment should be separate from risk management.
Even if there were minimal uncertainty about how the world’s weather might change, there would still be divergent views on what governments should do about it. Climate scientists would themselves still have a range of opinions on what the best policies were: but they should express these views as citizens, and not claim any special weight for their policy judgments.
There’s a balance to be struck between mitigating climate change and adapting to it. And there are other questions. How much should we sacrifice now to ensure that the world is no worse when our grandchildren grow old? How much subsidy should be transferred from the rich world, whose fossil fuel emissions have mostly caused the problem, to the developing nations? How much should we incentivize clean energy? Should we gamble that our successors may devise a technical fix that will render nugatory any actions we take now? On all these choices there’s as yet minimal consensus, still less effective action. But policies, and investment priorities, are being influenced by climate change projections. So it’s inevitable, and right, that climate science is under especially close scrutiny.
In the past, most people unquestioningly accepted “authorities” on any topic, but that has now changed. We can all access far more information than ever before and we want to weigh up evidence for ourselves. Such scrutiny should be welcome; just as there are instances of shoddy work, error, or even malpractice in the medical and legal professions, so there are occasionally in science.
Current practice in archiving and managing data is not uniform across all fields, nor across all countries. Nor is there a consensus on the appropriate guidelines for making such information available. Under the United Kingdom’s Freedom of Information Act, anyone, whether a UK taxpayer or not, and whether they have good reason or not, can impose burdensome demands on researchers by repeated requests. It is not obvious that this is right. Similar issues arise in the United States, where members of Congress have made vexatious requests for data from researchers, or for access to e-mails. On the other hand, we surely need to facilitate open debate, to ensure that scientific claims are robust and firmly grounded.
Scientists are their own severest critics. They have more incentive than anyone else to uncover errors. That’s because the greatest esteem goes to those who contribute something unexpected and original, and especially to those who can overturn a consensus. That’s how initially tentative ideas get firmed up—not only on climate change, but (to cite some examples from earlier years) regarding the link between smoking and lung cancer, and between HIV and AIDS. But that’s also how seductive theories get destroyed by harsh facts. Science is organized skepticism.
The path toward a consensual understanding is often winding, with many blind alleys being explored before reaching it. Sometimes, a prior consensus is overturned—though Thomas Kuhn’s famous book on scientific revolutions perhaps exaggerates how often this happens. The Copernican cosmology, overthrowing the concept of a geocentric cosmos, would qualify as a genuine revolution, as would quantum theory. But most advances transcend and generalize the concepts that went before, rather than contradict them. For instance, Einstein didn’t overthrow Newton. His work led to a theory that had broader scope and gave deeper insights into the nature of space and gravity, but Newton’s laws are still good enough to predict the trajectories of spacecraft. (There is, incidentally, one practical context where Einstein’s refinements are needed: the accuracy of the Global Positioning System (GPS) satellites used in SatNav systems would be fatally degraded if proper allowance wasn’t made for the slight difference between the clock rates on Earth and those in orbit that is predicted by relativity theory.)
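As a rough back-of-envelope check of that GPS point (using standard textbook values for the Earth and the GPS orbit, not figures taken from this essay), the two relativistic effects can be estimated directly:

```python
# Back-of-envelope estimate of the relativistic clock offset for a GPS satellite.
# Standard textbook constants are used; none of these numbers come from the essay itself.
import math

GM = 3.986004e14          # Earth's gravitational parameter, m^3 s^-2
c = 2.99792458e8          # speed of light, m/s
R_EARTH = 6.371e6         # mean radius of the Earth, m
R_ORBIT = 2.656e7         # GPS orbital radius (about 26,560 km), m
SECONDS_PER_DAY = 86400

# General relativity: a clock higher in the Earth's gravitational potential runs faster.
grav_rate = GM / c**2 * (1 / R_EARTH - 1 / R_ORBIT)

# Special relativity: the satellite's orbital speed makes its clock run slower.
v_orbit = math.sqrt(GM / R_ORBIT)
vel_rate = v_orbit**2 / (2 * c**2)

net_rate = grav_rate - vel_rate
offset_us_per_day = net_rate * SECONDS_PER_DAY * 1e6
range_error_km_per_day = net_rate * SECONDS_PER_DAY * c / 1000

print(f"Orbiting clock runs fast by roughly {offset_us_per_day:.0f} microseconds per day")
print(f"Left uncorrected, ranging errors would grow by roughly {range_error_km_per_day:.0f} km per day")
```

This crude estimate lands near the familiar textbook figures of about 38 microseconds and roughly 10 kilometers per day, which is why GPS satellite clocks are deliberately tuned to run slightly slow before launch.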
As a student at Cambridge University in the 1960s I watched at close hand a standoff between Martin Ryle and Fred Hoyle—two outstanding scientists utterly different in their personal and professional styles—on whether the universe had emerged from a big bang or whether it had existed forever in a so-called steady state. New evidence settled this debate in Ryle’s favor, an outcome to which Hoyle was never fully reconciled, though by the end of his life he was advocating a compromise “steady bang” theory that gained little traction with others. Likewise, there have been high-profile vendettas on conceptually important issues in evolutionary biology and sociobiology. There are fewer purely intellectual disputes today that become so deeply personalized. This is not due to the sweeter dispositions of a younger generation of scientists, but because, as data accumulate, there is progressively less scope for viable but strongly divergent hypotheses; and there is a growing incentive toward collaborative rather than isolated research.
When rival theories fight it out there is eventually just one winner—at most. Sometimes, one crucial piece of evidence clinches the case, as happened for the big bang cosmology and also for continental drift. In other cases, an idea gains only a gradual ascendancy as alternative views get marginalized until their leading proponents die off. Sometimes, the subject moves on, and what once seemed an epochal issue is bypassed or sidelined.
Our scientific knowledge and capabilities are actually surprisingly patchy. Odd though it may seem, some of the best-understood phenomena are far away in the cosmos. Back in the seventeenth century, Newton could describe the “clockwork of the heavens”; eclipses could be both understood and predicted. (Indeed, even in Babylonian times, regularities and repetitions were discerned and some prediction was possible even in ignorance of the underlying mechanism.) But few other things are so predictable. For instance, it’s hard to forecast, even a day before, whether those who go to view an eclipse will encounter clouds or clear skies. And our understanding of some familiar matters that interest us all—diet and child care, for instance—is still so meager that expert advice changes from year to year. Everyday phenomena, especially those involving entities as complex as human beings, can be more intractable than anything in the inanimate world.
If you ask scientists what they are working on, you will seldom get an inspirational reply like “seeking to cure cancer” or “understanding the universe.” Rather, they will focus on a tiny piece of the puzzle and tackle something that seems tractable. Scientists are not thereby ducking the big problems, but judging instead that an oblique approach can often pay off best. A frontal attack on a grand challenge may, in fact, be premature. For instance, forty years ago President Richard Nixon declared a war on cancer, envisaging this as a national goal modeled on the then-recent Apollo Moon-landing program. But there was a crucial difference. The science underpinning Apollo—rocketry and celestial mechanics—was already understood, so that, when funds gushed at NASA, the Moon landings became reality. But in the case of cancer the scientists knew too little to be able to target their efforts effectively.
It’s easy to think of other examples. Suppose that a nineteenth-century innovator had wanted to develop better machines to reproduce music. He could have made very elaborate mechanical organs or pianolas, but wouldn’t have accelerated the advent of radio or the MP3 player. And a medical program to seek ways to see through flesh certainly wouldn’t have stimulated the serendipitous discovery of X-rays.
The word “science” is being used here in a broad sense, by the way, to encompass technology and engineering; this is not just to save words, but because all of these disciplines are symbiotically linked. The mental process of problem solving motivates us all, whether one is an astronomer probing the remote cosmos or an engineer facing a down-to-earth design conundrum. There is at least as much challenge in the latter, a point neatly made by an old cartoon showing two beavers looking up at a hydroelectric dam. One beaver says, “I didn’t actually build it, but it’s based on my idea.” The Swedish engineer who invented the zip fastener made a greater intellectual leap than that achieved by most pure academics.
Nixon’s cancer program, incidentally, facilitated a lot of good research into genetics and the structure of cells. Indeed, the overall research investment made in the twentieth century has paid off abundantly. But the payoff happens unpredictably and after a time lag that may be decades long, which is why much of science has to be funded as a public good. A fine exemplar of this point is the laser, invented in 1960. The laser applied basic ideas that Einstein had developed more than forty years earlier, but its inventors couldn’t have foreseen that lasers would later be used in eye surgery and in DVD players.
Communication and assessment
Traditionally, discoveries reach public attention only after surviving peer review and being published in a scientific journal. This procedure actually dates back to the seventeenth century. In the 1660s, the Royal Society started to publish Philosophical Transactions, the first scientific journal, which continues to this day. Authors were enjoined to “reject all amplifications, digressions and swellings of style.” This journal pioneered what is still the accepted procedure whereby scientific ideas are criticized, refined, and codified into public knowledge. Over the centuries, it published Isaac Newton’s research on light, Benjamin Franklin’s experiments on lightning, reports of Captain Cook’s expeditions, Volta’s first battery, and, more recently, many of the triumphs of twentieth-century science.
But this procedure for quality control is under increasing strain, due to competitive or commercial pressures, 24-hour media, and the greater scale and diversity of a scientific enterprise that is now widely international. (And of course scientific journals are now mainly distributed electronically rather than as paper copies.)
A conspicuous departure from traditional norms happened in 1989 when Stanley Pons and Martin Fleischmann, then at the University of Utah, claimed at a press conference to have generated nuclear power using a tabletop apparatus. If credible, this “cold fusion” fully merited the hype it aroused: it would have ranked as one of the most momentous breakthroughs since the discovery of fire. But doubts set in. Extraordinary claims demand extraordinary evidence, and in this case the evidence proved far from robust. Inconsistencies were discerned; and others failed to reproduce what Pons and Fleischmann claimed they had done. Within a year, there was a consensus that the results had been misinterpreted, though even today some believers remain.
The cold fusion claims bypassed the normal quality controls of the scientific profession, but did no great harm in the long run, except to the reputations of Pons and Fleischmann. Indeed in any similar episode today, exchanges via the Internet would have led to a consensus verdict even more quickly.
But this fiasco holds an important lesson, one that I’ve already emphasized in the context of climate science. What’s crucial in sifting error and validating scientific claims is open discussion. If Pons and Fleischmann had worked not in a university but in a lab whose mission was military, or commercially confidential, what would have happened then? If those in charge had been convinced that the scientists had stumbled on something stupendous, a massive program might have gotten under way, shielded from public scrutiny and wasting huge resources. (Indeed, just such waste has occurred, more than occasionally, in military laboratories. One example was the X-ray laser project spearheaded by Edward Teller at the Livermore Laboratory as part of President Reagan’s Star Wars initiative in the 1980s.)
The imperative for openness and debate is a common thread through all the examples I’ve discussed. It ensures that any conclusions that emerge are robust and that science is self-correcting. Even wider discussion is needed when what’s in contention is not the science itself but how new findings should be applied. Such discussions should engage all of us, as citizens, and, of course, our elected representatives. Sometimes this has happened, and constructively too. In Britain, ongoing dialogue with parliamentarians led, despite divergent ethical stances, to a generally admired legal framework on embryos and stem cells—a contrast to what happened in the United States. But Britain has had failures too: the GM crop debate was left too late, to a time when opinion was already polarized between eco-campaigners on the one side and commercial interests on the other. As a result, GM food, which has been eaten by 300 million Americans without any manifest ill effects, is severely constrained throughout the European Union.
With regard to the successful communication of science to the public, Darwin’s On the Origin of Species, published in 1859, is an exemplar. The book was a best seller, readily accessible—even fine literature—as well as an epochal contribution to science. But that was an exception. In glaring contrast, Gregor Mendel’s 1866 paper entitled “Experiments with Plant Hybrids,” reporting the classic experiments on pea plants conducted in his monastery garden, was published in an obscure journal and wasn’t properly appreciated for decades. (Darwin had the journal in his private library, but the pages remained uncut. It is a scientific tragedy that he never absorbed Mendel’s work, which laid the foundations for modern genetics.)
No twenty-first-century breakthroughs could be presented to general readers in such a compelling and accessible way as Darwin’s ideas were; the barrier is especially high when ideas can be fully expressed only in mathematical language. Few read Einstein’s original papers, even though his insights have permeated our culture. Indeed, that barrier already existed in the seventeenth century. Newton’s great work, the Principia, highly mathematical and written in Latin, was heavy going even for his distinguished contemporaries like Halley and Hooke; certainly a general reader would have found it impenetrable, even when an English version appeared. Popularizers later distilled Newton’s ideas into more accessible form—as early as the 1730s a book appeared entitled Newtonianism for Ladies. What makes science seem forbidding is the technical vocabulary, the formulas, and so forth. Despite these impediments, the essence (albeit without the supportive arguments) can generally be conveyed by skilled communicators. It’s usually necessary to eschew equations, but that by itself is not enough. The specialist jargon—and, even more, the use of familiar words (like “degenerate,” “strings,” or “color”) in special contexts different from their everyday usage—can be baffling too.
The gulf between what is written for specialists and what is accessible to the average reader is widening. Literally millions of scientific papers are published, worldwide, each year. They are addressed to fellow specialists and typically have very few readers. This vast primary literature needs to be sifted and synthesized, otherwise not even the specialists can keep up. Moreover, professional scientists are depressingly “lay” outside their specialties—my own knowledge, such as it is, of recent biological advances comes largely from excellent popular books and journalism. Science writers and journalists do an important job, and a difficult one. I know how hard it is to explain in clear language even something I think I understand well. But journalists have the far greater challenge of assimilating and presenting topics quite new to them, often to a tight deadline; some are required to speak at short notice, without hesitation, deviation, or repetition, before a microphone or TV camera.
In Britain there is a strong tradition of science journalism. But there is an impediment: few in top editorial positions have any real background in science. The editors of even the so-called highbrow press feel they cannot assume that their readers possess the level of knowledge we might hope high school graduates would have achieved. Yet the same organs would not talk down to their readers on financial topics or on the arts pages: economic articles are often quite arcane, and a music critic would be thought to be insulting readers by defining a concerto or a modulation. About half of the readers of the quality press have some scientific education or are engaged in work with a technical dimension; it is those who control the media (and those in politics) who are overwhelmingly lacking in such basic knowledge.
Science generally only earns a newspaper headline, or a place on TV bulletins, as background to a natural disaster, or health scare, rather than as a story in its own right. Scientists shouldn’t complain about this any more than novelists or composers would complain that their new works don’t make the news bulletins. Indeed, coverage restricted to newsworthy items—newly announced results that carry a crisp and easily summarized message—distorts and obscures the way science normally develops. Scientific ideas are better suited to documentaries and features. The terrestrial TV channels offer the largest potential audience, but commercial pressures, and concern that the viewers may channel-surf before the next advertising break, militate against the kind of extended and serious argument presented in TV classics such as Jacob Bronowski’s The Ascent of Man and Carl Sagan’s Cosmos. Fortunately, cable channels and the Internet open up niches for more specialized content—lectures given at universities and scientific meetings are now routinely webstreamed and archived.
The best way to ensure that one’s views get through undistorted is via the written word, in articles and books. Some distinguished scientists have been successful authors, but most generally dislike writing, though present-day students are far more fluent (if not more literate) than my own pre–e-mail and pre-blog generation ever was. Many of the most successful writers of scientific books are interpreters and synthesizers rather than active researchers. Bill Bryson, for instance, has marvelously conveyed his zest and enthusiasm for “nearly everything” to millions.
I would derive less satisfaction from my astronomical research if I could discuss it only with professional colleagues. I enjoy sharing ideas, and the mystery and the wonder of the universe, with nonspecialists. Moreover, even when we do it badly, attempts at this kind of communication are salutary for scientists themselves, helping us to see our work in perspective. As already emphasized, researchers don’t usually shoot directly for a grand goal. Unless they are geniuses (or unless they are cranks), they focus on timely, bite-sized problems because that’s the methodology that pays off. But it does carry an occupational risk: we may forget that we’re wearing blinkers and that our piecemeal efforts are only worthwhile insofar as they’re steps toward some fundamental question.
In 1964, Arno Penzias and Robert Wilson, radio engineers at the Bell Telephone Laboratories in Holmdel, New Jersey, made, quite unexpectedly, one of the great discoveries of the twentieth century: they detected weak microwaves, filling all of space, which are actually a relic of the big bang. But Wilson afterward remarked that he was so focused on the technicalities that he didn’t himself appreciate the full import of what he’d done until he read a popular description in the New York Times, where the background noise in his radio antenna was described as the “afterglow of creation.”
Incidentally, we scientists habitually bemoan the meager public grasp of our subject—and of course all citizens need some understanding, if policy debates are to get beyond tabloid slogans. But maybe we protest too much. On the contrary, we should, perhaps, be gratified and surprised that there’s wide interest in such remote topics as dinosaurs, the Large Hadron Collider in Geneva, or alien life. It is indeed sad if some citizens can’t distinguish a proton from a protein; but equally so if they are ignorant of their nation’s history, or are unable to find Korea or Syria on a map—and many people can’t.
Misperceptions about Darwin or dinosaurs are an intellectual loss, but no more. In the medical arena, however, they could be a matter of life and death. Hope can be cruelly raised by claims of miracle cures; exaggerated scares can distort health-care choices. When reporting a particular viewpoint, journalists should clarify whether it is widely supported, or whether it is contested by 99 percent of specialists. The latter was the case when a doctor claimed that the MMR vaccine (offering combined protection against measles, mumps, and rubella) could induce autism in small children—a claim that was later discredited.
Noisy controversy need not signify evenly balanced arguments. Of course, the establishment is sometimes routed and a maverick vindicated. We all enjoy seeing this happen, but such instances are rarer than is commonly supposed. The best scientific journalists and bloggers are plugged into an extensive network that should enable them to calibrate the quality of novel claims and the reliability of sources.
But what about ideas beyond the fringe? Here there’s less scope for debate—the two sides do not share the same methods or play by the same evidence-based rules; as an astronomer, I’ve not found it fruitful to have much dialogue with astrologers or with creationists. We shouldn’t let a craving for certainty—for the easy answers that science can seldom offer—drive us toward the illusory comfort and reassurance that such notions provide.
Scientists should expect media scrutiny. Their expertise is crucial in areas that fascinate us and matter to us all. And they shouldn’t be bashful in proclaiming the overall promise that science offers—it’s an unending quest to understand nature, and essential for our survival.
Scientists as campaigners and concerned citizens
I’ll end, as I began, with a flashback, this time to World War II. The scientific community was then engaged in the war effort, most monumentally in the Manhattan Project that led to the development of the first atomic bomb, but also in radar, operations research, and code-breaking. Most scientists returned with relief to peacetime academic pursuits but for some, especially those who had helped build the bomb, the ivory tower was no sanctuary. They continued not just as scientists but as engaged citizens, promoting efforts to control the power they had helped to unleash.
Among them was Joseph Rotblat, a physicist of Polish origin who went to Los Alamos as part of the UK scientific contingent. In his mind there was only one justification for the bomb project: to ensure that Hitler didn’t get one first and hold the Allies to ransom. As soon as this ceased to be a credible risk, Rotblat left the Manhattan Project, the only scientist to do so at that juncture. He returned to England, became a professor of medical physics, an expert on the effects of radiation, and a compelling and outspoken campaigner. In 1955, he met Bertrand Russell and encouraged him to prepare a manifesto stressing the extreme gravity of the nuclear peril. Rotblat got Einstein to sign, too; it was Einstein’s last public act—he died a week later. This Russell-Einstein manifesto was then signed by nine other eminent scientists from around the world. It led to the initiation of the Pugwash Conferences—so-called after the village in Nova Scotia where the inaugural conference was held. These meetings, which continue to this day, helped to sustain a dialogue between scientists in Russia and the West throughout the Cold War. Such contacts eased the path for the Partial Test Ban Treaty of 1963 and the later Anti-Ballistic Missile Treaty. When the Pugwash Conferences were recognized by the 1995 Nobel Peace Prize, half the award went to the Pugwash organization and half to Rotblat personally, as their prime mover and untiring inspiration.
The goal of Rotblat’s crusade was to rid the world completely of nuclear weapons, an aim that was widely derided as the product of woolly idealism. But this goal gained broader establishment support over the years, and in 2007 the Gang of Four—George Shultz, Sam Nunn, William Perry, and Henry Kissinger—espoused a similar cause. They have been supplemented by European gatherings of senior politicians, including former ministers of defense. More importantly, President Obama has reactivated the disarmament agenda by persuading the United States Senate to ratify the New START treaty, and—in, for instance, his inspirational speech in Prague in 2009—has set zero as an ultimate goal.
Few of the generation with senior involvement in World War II are alive today. In the United States, they have been followed by an impressive cohort of scientists—people from succeeding generations who have done a spell in government, or in high-tech industry, and who serve regularly as consultants to the Pentagon or on advisory committees. But in my own country, Britain, there are depressingly few younger scientists who can match the credentials and expertise of their US counterparts in providing independent advice. The reasons for this transatlantic asymmetry aren’t hard to find. In the United States, senior staff shuffle between government jobs and posts in, for instance, the Brookings Institution whenever the administration changes. There are always some who are “out” rather than “in.” Britain, in contrast, doesn’t have a revolving-door system; government service is still generally a lifetime career. For this reason, and because secrecy is more pervasive, discussions of defense issues tend to be restricted to a closed official world.
The atomic scientists of the World War II generation were an elite group—the alchemists of their time, possessors of secret knowledge—and independent British scientists cannot aspire to the wisdom and expertise of that battle-hardened generation. But defense and arms control are a diminishing part of the agenda for today’s citizen scientists: the agenda is now far wider and more complex—and the issues span all of the sciences. They are far more open and often global. There is less demarcation between experts and laypersons; campaigners and bloggers enrich the debate. But professionals have special obligations to engage and men like Rotblat were inspiring exemplars. You would be a poor parent if you didn’t care about what happened to your children in adulthood, even though you may have little control over it. Likewise, scientists, whatever their expertise, shouldn’t be indifferent to the fruits of their ideas. Their influence may be limited, but they should try to foster benign spin-offs, commercial or otherwise. They should resist, so far as they can, dubious or threatening applications of their work and alert the public and politicians to perceived dangers.
Unprecedented pressures confront the world, but there are unprecedented prospects too. The benefits of globalization must be fairly shared. There’s a widening gap between what science allows us to do and what it’s prudent or ethical actually to do—there are doors that science could open but which are best left closed. Everyone should engage with these choices, but their efforts must be leveraged by scientific citizens—engaging, from all political perspectives, with the media, and with a public attuned to the scope and the limit of science.