As a medical student I was aware that the teaching of anatomy was in decline. An angry letter from an elderly surgeon in The Lancet bemoaned the poor state of knowledge possessed by younger doctors and those still at medical school. A few weeks later the journal printed a response from an old classmate of the surgeon’s. When they were students, it said, they learnt a lot of anatomy, but that was because there was little else to learn. As an example they summarised what they had been taught of immunology in their student days: that the blood had white cells in it and that they were somehow relevant.
In the decades since, anatomy teaching has changed sufficiently to further chill the hearts of those who fondly remember it being taken more seriously. Rather than laboriously dissecting their own corpses, as even I did, students now barely dissect at all, relying on models or on dissections done by others. The loss has been deliberate. Archie Cochrane, a twentieth-century pioneer who did much to improve the state of knowledge in medicine, had won the prize for anatomy when he trained at Cambridge. Years later he found he could hardly recall anything about it. It was not worth arguing about whether the training had been useful, he said; it was worth an experiment. Students should be randomised to being taught detailed anatomy or not, and their effectiveness as doctors measured. Not a bad proposal but one overtaken by fate. Those fighting the battle for detailed anatomy teaching have lost.
They have lost because there is so much else to learn. The statistical tools that underlie clinical trials, and the trials themselves and the reasons for doing them, take up swathes of the curriculum. What is it that doctors now need to learn? What should be in the books they study? Those questions come down to asking what it is that doctors do. Behind that lies the question of how far they can be helped by technology. Or replaced. We are right to worry about getting the answers to those questions wrong, but not to hanker for the fine old days, when doctors were caring and compassionate and looked at their patients rather than computer screens. It is not the artifice of technology we should fear. ‘All things are artificial, for nature is the Art of God’, wrote the physician Sir Thomas Browne in his Religio Medici. The amount of time doctors spend with their hospital patients has not changed in the fifty years over which we have been measuring it, and technology can be used to aid the human relationships good medicine requires.1
Evidence exists about what computers do well and what doctors do well.2 More is needed but it always will be. That’s the nature of evidence even when it relates to nothing of any practical use – even more so when the utility is real. Our ability to acquire evidence about what works and what doesn’t, and our improvements in understanding what that evidence needs to consist of in order to be reliable, have driven the transformation of medicine over the past seventy-five years. As much as technology, the technique of the randomised controlled trial has created medicine’s effectiveness: it’s just that the technology is more impressive.3 ‘Knowing why is more important than learning what’, said James Watson.4 In many settings, clinical medicine included, the reverse is true. Knowing why something has its effect may help you plot your next experiment but the experiment is what tells you what the effect is; knowledge by itself cannot predict that. The techniques of acquiring good evidence are not going to appear in a film drama about the future. They should. They were key in medicine changing from a profession that did more harm than good from the beginnings of history through to the 1930s,5 into one partly responsible for our modern improvements in life and health. Most medical decisions today are taken on the basis of good evidence but about a fifth remain guesses, and even those based on decent evidence could often helpfully be based on better.6 Finessing our knowledge matters as much as improving our technology. The single change that would most improve human health over the coming century would be if randomised trials, rather than being separated from the practice of daily medicine by a bureaucracy that discouraged them, were embedded within it. To call that the single greatest possible change is to say that it would do more than the elimination of tobacco, which is to say a very great deal.
If we need no more evidence about the harms of tobacco, we need it about almost everything else. We need it because the results of a medical intervention are unpredictable. No matter our understanding of the theory and the underlying science, reality can be relied upon to wrong-foot us. Drug trials go through a number of formal stages. By the time they get to phase 3, we are confident they are going to work. Phase 1, the first small test of a treatment in people, happens only after theory and expert opinion and animal experiment all justify it. Only if the results are good does the drug then proceed to phase 2, in which its physical effect is measured to check it meets expectations. Only if those results are good does it go to phase 3, in which its actual impact on clinical practice is tested. Given the high degree of confidence everyone possessed even before phase 1, phase 3 should be a formality. Such was our certainty that these final obstacles were unnecessary that, up until a few decades ago, we didn’t routinely do such phase 3 trials. Many still feel that these hurdles are formalised bureaucracy that stifles innovation and costs lives. Why hold the experts back, why not trust them? ‘I think that both with regards to AIDS and cancer and any other life-threatening disease,’ the US Congressman in charge of health said in 1995, ‘we ought to make available to people as quickly as possible drugs and other therapies that may extend their lives and not wait until we know with certainty that something is going to be effective.’7 Many doctors feel the same. All are wrong. By the time a new intervention gets to a phase 3 study, it does not face a final bureaucratic hoop but a vital test. Results from phase 3 trials, when you look at them as a series stretching back decades, are fifty–fifty. The new interventions are as likely to harm as to help.8 With regard to AIDS and cancer and all life-threatening diseases we ought to make available to people as quickly as possible drugs and other therapies that will extend or improve their lives. But first we need to know with certainty, or as much as life allows, that they really do help rather than harm.
The physical world and the human body are so tangled and intricate that predictions are unreliable and tests are needed. If one makes a change at a more basic level of organisation than that of an enzyme or neurotransmitter – at the level, say, of a gene responsible for them – then the convolutions are greater. Recall how few genes we have. With 20,000 proving responsible for all that makes us, each gene is almost always going to be doing more than one thing and the permutations and combinations with which they interact will be astronomical. Predictability goes down, not up. Hence the certainty that gene therapy will not proceed by a gene-programmer conjuring a blueprint for a new human being. It will get going by learning how to make small changes to dreadful diseases, where the consequences of doing nothing are so awful they justify the hazards of doing anything at all. From there it will edge forward by aiming, each time, a little higher than before and eventually a little higher than average.
The repeated lesson of the twentieth century is of the refrain running through this book: that a host of small changes accumulate into something big, and that at some stage they result not only in a quantitative accumulation of improvements – a steadily increasing life expectancy – but a qualitative change. A different sort of life. Writing in 1898, in a book about the 1800s called The Wonderful Century: Its Successes and Failures, Alfred Russel Wallace thought the technological achievements of the previous hundred years were best compared not with the hundred years before that but with the entirety of previous human history. Change will continue to accelerate. All attempts will fail to predict accurately even what will seem, in retrospect, to have been obvious. ‘One of the most prominent features of our century has been the enormous and continuous growth of wealth’, wrote Wallace in 1898, ‘without any corresponding increase in the well-being of the whole people; while there is ample evidence to show that the number of the very poor… has enormously increased.’9 Our power to predict the future does not match our ability to explain the past, and Wallace was getting both wrong. Wealth had already increased the well-being of the whole people and has continued to do so. It is likely to go on doing so, however much rises in inequality make that progress stutter and however many stories tell truthfully of the exceptions.
Medawar called science the art of the possible. Through experiments on cows and mice, he showed how the immune system learnt self from non-self. That led to the conclusion that it must, in theory, be possible to conduct a successful organ transplantation. If the system for differentiating self from non-self could be understood, it could be altered. The eventual success did not come from techniques Medawar developed. His effect on the emergence of organ transplantation was to encourage the people who brought it about to believe that they could. The notion that the germ warfare of microorganisms could be harnessed therapeutically had been around for some time but had never been taken seriously. Pasteur mentioned it, noting how fungi killed bacteria, and speculated in passing that it might be medically useful. But the idea did not seem to belong to the realm of the possible. Fifty years passed before Ehrlich, excited by the emergence of organic dyes that stained particular cells and tissues in unique ways, noted that some of these dyes were toxic. If one could find a dye that was taken up only by bacteria, and was toxic to them, the discovery would amount to a ‘magic bullet’ that would harm the invading microbe and leave the host untouched. Ehrlich took the term from German folklore and used it so persuasively that he put a new idea into the realm of the possible. Doing so triggered a serious search to track down the antibiotics he had predicted were out there. It was a matter of time and the time was only ever going to be short.
Sweeping fantasies and bold predictions are rarely helpful. But sometimes, if they are based closely enough on a sense of how the world actually is, they are. They open up a window onto new views of life. Some willingness to risk making a fool of oneself by being imaginative is necessary, lest the realm of the possible be narrowed down to the scope of what is already there. Ehrlich’s idea that dyes could kill microbes was ludicrous – even in retrospect, his idea that they needed to be dyes was a ridiculous muddle – but it was an idea of enough imaginative genius to lead to the prevention of millions and millions of premature deaths. His prediction could have been made decades earlier. It was the insight that was lacking, not the technology.
There is no profit in routine and banal predictions that we will soon rebuild, redesign and improve all bits of the human body and mind: their effect is to demean the incorrigible intricacy of the world. But the future is uncertain and hence its excitement. The uncertainty comes not only from the importance of imagination and genius but also from the way tiny incremental gains will continue to add up to profound changes, and these are unpredictable. Water gradually evaporates as the temperature rises in a kettle, but at a certain point everything changes and it boils. Each passenger pigeon or dodo that was killed subtracted only a small amount from the sum of life, except when the final deaths, of birds no different from the others, meant the extinction of their species. These phenomena are predictable because we know them; when they first happened, they weren’t. The computers I use today are incrementally faster than those I used yesterday but the difference does not mean they are doing the same things faster. They are doing different things entirely. ‘Often quoted is the saying attributed to the architect Ludwig Mies van der Rohe, “Less is more”’, wrote the physicist John Archibald Wheeler, adding an alternative:
‘More is different’. When you put enough elementary units together, you get something that is more than the sum of these units. A substance made of a great number of molecules, for instance, has properties such as pressure and temperature that no one molecule possessed. It may be a solid or a liquid or a gas, although no single molecule is solid or liquid or gas… When enough simple elements are stirred together, there is no limit to what can result.10
In sleeve notes written to accompany a recording of Mozart’s Requiem, Stephen Jay Gould – in his capacity as a student of history and evolution – remarked that ‘Unpredictable contingency, not lawlike order, rules the pathways of history.’11 How easily, he pointed out, Mozart might have survived as long as Händel. The collision of microorganisms with a single immune system had consequences we still feel. But then so did the good luck of Mozart surviving smallpox, typhoid and rheumatic fever as a child – and of having existed at all. As it was in the past so shall it be in the future. The trends of technology show some predictability. If Florey, Chain and Heatley had not discovered penicillin, someone else would have.* Computers would have been interconnected through a worldwide web had Tim Berners-Lee never been born. But human culture is less easy to prophesy. Science has not changed that and shows no sign of being able to. In the discussion after a talk given by Gould, Peter Medawar asked the audience whether anyone present could give a single example of a successful prediction about human societies made on the basis of biology. None could.12 Genetics and evolution and biology are full of interest, but their insights cannot be applied to all problems. Sociologists, psychologists, economists, political scientists and historians have precious little to offer. It is precious because it is the best going. ‘Say not, let there be light,’ wrote Hazlitt, ‘but darkness visible’.13 It was the same thought that Keats put so famously:
Several things dove-tailed in my mind, and at once it struck me what quality went to form a Man of Achievement, especially in Literature, and which Shakespeare possessed so enormously – I mean Negative Capability, that is, when a man is capable of being in uncertainties, mysteries, doubts, without any irritable reaching after fact and reason.14
‘Science lives in a perpetual present, and must always discard its own past as it advances’, wrote Clive James.15 Bewitchingly put and entirely wrong. Science is a historical project. ‘Poetry comprehends all science’, Shelley said, which was better.16 He meant that science was necessarily imaginative. Its processes can be automated but not its creativity. ‘A man cannot say I will write poetry’, he asserted; ‘the greatest poet even cannot say it’. Imagination comes with no guarantees. ‘An art of discovery is not possible’, noted William Whewell, the man who coined the term ‘scientist’. One can write down words or perform an experiment but that is no guarantee of producing either poetry or science. What is true of individuals is not true of societies. At a higher level of organisation, results cease being random and become at least partly predictable. In the right cultural environments, science and poetry prosper.
‘A scientist can revisit scientific history at his choice’, wrote James; ‘a humanist has no choice: he must revisit the history of the humanities all the time, because it is always alive, and can’t be superseded’. That was wrong too. A man doing a scientific experiment has to base it on history, and the better his judgement of history the more likely he is to have sensed the future possibilities he hopes his experiment will reveal. A man who wishes to be a scientist must live in the past, or at least with an acute awareness of how the past influences the present and how the two together determine the future. ‘Fashion is something barbarous,’ wrote Santayana, ‘for it produces innovation without reason and imitation without benefit.’17 Without a feel for history, there is no distinguishing the weak touch of fashion from the grip of reality. The difference is vital. ‘The certainty that life cannot be long, and the probability that it will be much shorter than nature allows,’ said Samuel Johnson, ‘ought to awaken every man.’18 Ehrlich’s insight about the existence of antibiotics carried weight with his audience for the same reason that it turned out to be so broadly and usefully correct. It was based on a sense of the possible. It was a piece of imagination, of poetic inspiration. Such moments – in science as in all else – come by grace, and grace comes by art, and art does not come easy.19 Ehrlich had spent his life putting in the work that gained him such a feel for the history and progress of science that he could sense its possibilities. What seems obvious in retrospect was obvious at the time to nobody but him, and not even to him until he thought of it.
Over a century ago Osler pointed out that anaesthesia was a miracle not anticipated even in the Bible. Today we can add that neither were antibiotics, nor the progress the world has seen in mortality and morbidity, in social mobility, in racial and sexual equality, in freedom of behaviour, in the open access technology offers to the best of culture. Riches for all have been preached by religions, but the riches were of the afterlife or they were spiritual; they were never the worldly ones we have achieved. It is easy to overlook that Christ told the rich man to give his money to the poor in order to help the rich man’s soul, not the poor man’s life. Even the New Testament’s message of love focused on the benefits of being a better person, not of making a better world. To aid and ease the poor and the diseased has been a common aspiration of religions since religions began. The elimination of absolute poverty and the annihilation of premature disease have not. We have gone some way to managing what our greatest prophets never conceived was possible.
Miseries that were once common have become less so. The achievement is no less precious for being incomplete. In his memoir of the First World War, the tank commander Wilfred Bion recalled the noises of a quiet night on the Western Front, sounds that he remembered ever after in ‘the watch fires of a thousand sleepless nights’.20 The sounds had struck him when he first heard them as being the cries of bitterns, of marsh birds. In fact they were the cries of the men stuck in the mud of no-man’s-land, sinking and dying. ‘Do you mean no stretcher bearer gets them?’, he asked, on the night he first understood what they were. ‘No stretcher bearer would be such a fool’, he was told. Those who left the tracks and paths in order to try sank into the mud themselves, and were never found.
Many have been lost and many will continue to be, their lives and their hopes sucked down without a trace. Everyone reading this will be able to think of a hundred counter-examples to progress, a hundred reasons not to give in to cheerfulness and optimism. But everything is disappointing in practice. Counter-examples do not outweigh what has happened in human history and what is set to continue happening, which, for all its flaws, is justly called progress. So many people have drowned in the mud of war and in the metaphorical mud of disease, lack of opportunity and life’s other hazards, the hazards in which lives that could be well lived are swamped and sunk. It matters that the fate befalls fewer than ever before, and knowing that this is so matters also. It matters because fortifying our sense that the world is improvable helps our efforts to improve it. The world has never been so beautiful. Those born into the most miserable and bleak of today’s countries face a brighter future than those born into the richest and best of two centuries ago. The fact that these are generalisations means they have exceptions. It also means that in general they are true. The improvements are likely to keep coming, so long as climate change or nuclear war or the toxic effects of autocracy and mob rule do not eliminate their ability to do so. Science has proven its worth as a way of making life better. To think the world will continue to be improved by science is an act of optimism that has to be shakily advanced, but to think it will be put right without science is not even good enough to be shaky. Today few of us have first-hand experience of the loss of one of our children, let alone several. First-hand experience of death before old age is still normal, in the statistical sense. It is shocking when a friend dies in middle age. It will become more so. Generations of children will grow up for whom losing a parent prematurely will be as rare an experience as losing a child now is for a parent.
Ötzi the ice-man fell into the Alpine ice 2,000 years before the Pharaoh Tutankhamun walked the earth. He emerged in 1991.
When Ötzi was alive civilisation was flourishing. We have the poetry and laws and medical knowledge of Sumer, and much of it feels modern. From concerns that publicans might take advantage of consumers by serving them small measures,21 through to its curses (‘may your columns… fall to the ground like tall young men drunk on wine’22), it is a culture we can recognise. A teacher tells a student about to begin his adult education, ‘you will never return to your blinkered vision’. Eyes once opened to ideas cannot be easily closed on them, nor, he explains, should they be – ‘that would be greatly to demean due deference, the decency of mankind’.23
How much of this cultural knowledge reached Ötzi, with his flint knife, copper-headed axe and clothing of grass and leather, we cannot know. Some fraction. The clothes he wore and the equipment he carried are recognisable to us; so too would have been the thoughts in his mind. Both are less than they would be today. The tools his hand and his mind could grasp were the tools of his age. Our capacities have expanded. The bioprosthetics available to Alpinists are beyond what he could have dreamt of – and so are their dreams.
Free of a modern diet, Ötzi’s teeth showed no signs of decay, but the roughness of his diet had ground his molars flat. His incisors were worn too, probably the result of needing to use them as tools, biting down to work the leather his life depended on. He might have been twenty when he died or he might have been sixty; we can’t tell. He was arthritic in his back and in his right hip. He had broken his ribs long enough before his death for them to have healed. He had the calcified blood vessels of atherosclerosis. He had survived frostbite, but incompletely – his left little toe bore the scars. The bones of his legs were thickened from hard use.
Away from the great civilisations of his day, Ötzi lived and died at the shadowy edge of the world. Even for the luckiest, life was shorter and more run through with violence and uncertainty and random death. The scope of human imagination was not so broad, nor were lives buoyed up or made rich by thousands of years of written culture. It was not only teeth and bones that were worn down. Ötzi was about 1.65 metres (5 ft 5 in), but modern times would have changed his thoughts more fundamentally than his measurements. The real difference it would have made would have been to the life of his mind. When life was wilder and more inexplicable, when the horrors of fate that were common could not even be understood, Ötzi would have lived in fear. He was right to. His fate was to die from being shot in the back.24
*
Darwin was thought of by many as a lucky plodder. He still is, the verdict of those who notice that alongside On the Origin of Species and The Descent of Man he wrote chiefly on the details of worms, barnacles, orchids and corals. Medicine is often felt to consist of advances like penicillin – of discoveries where the word ‘breakthrough’ must always be applied. Both views are wrong. It was absorption in detail that meant Darwin produced grand sweeps of view that were not grandiose. His study of worms and corals and natural selection showed how small changes could create landscapes and create species. Medawar wrote of how often the public assumed biology institutes were full of people arguing about the definition of life, when actually the question was of no interest there because it inspired no experiments. Detail and uncertainty are hallmarks of seriousness. The effortful interest in truth is supported by the habit of always reaching for explanations whose conceptions match the phenomena one is trying to explain. The length of the coastline of Maine is measurable with a ruler on a map but not with one on a beach. ‘At each stage, entirely new laws, concepts and generalisations are necessary… Psychology is not applied biology nor is biology applied chemistry.’25
Most of the time we sense there is something wrong with any explanation that makes out life is simple. Much about the world is too muddled to be systematic. From the crooked timber of humanity, nothing straight was ever made.26 When even physics serves as an imperfect model for the world, biology and social science cannot hope to do better. Knowledge and information too easily accumulate enough to impress but not sufficiently to help. ‘She had a great many opinions,’ said V. S. Naipaul of a character he disliked, ‘but taken together they did not add up to a point of view.’27 Life can never be seen clearly, but having a point of view allows one a position to occupy when peering at it. Collections of facts and opinions are not substantial enough to ward off ideology, and ideology subtracts from what one can see of life.† Such thoughts are not out of place in a book about medicine and medical science. Famine and war count when it comes to avoidable mortality. They counted in the twentieth century and they count today. Liberal democracies are deeply, profoundly and permanently flawed, packed full of compromise and dishonesty, self-interest and fudge. ‘The worst government is often the most moral’, wrote H. L. Mencken. ‘One composed of cynics is often very tolerant and humane. But when fanatics are on top there is no limit to oppression.’28 Liberal democracies remain the fundamental drivers of human improvement. They have been since they started and they started a long time ago. When the Romans invaded Britain and health improved sufficiently for human height to rise, the improvement came because Roman society was better than that of the Britons. To say that is just saying the same thing twice. What’s not tautological is to point out that it is evidence that Roman society, for all it was far from being liberal or democratic in a modern sense, allowed more human freedom and fairness, more tolerance and liberality, than had been managed by pre-Roman Britons. When freedoms and opportunities are balanced and compromised in a more civil way than before, people get healthier. One does not need to wait for perfection in order to look for improvement.
Science and ideology are mutual antagonists. Being empirical – experimental – science opposes the idea that the world can be understood in advance, through theory, which itself is the bedrock of ideology. ‘There is a mask of theory over the whole face of nature’, said William Whewell in the nineteenth century.29 It was science that was capable of seeing past the mask, and it still is, because science is the methodical prioritising of experience over expectations. Science strengthens our ability to learn from experience just as ideology weakens it. Science flourishes best in societies whose beliefs are incomplete and inconsistent. It is blighted by orthodoxy, as Soviet agriculture was by Lysenkoism, a system of beliefs about biology and genetics that fitted well with communist ideals and badly with reality. Scientists who objected were killed and their objections proven correct when crop yields fell. Many more then died too, and they died from starvation. Compared with that, measles outbreaks among ‘anti-vaxxers’ in the southern USA feel like lessons in progress.
Progress in medicine equates to expansion. Two hundred years ago Britain’s 11 million inhabitants were served by a few hundred physicians. Now there are 65 million people living here and if the number of doctors had gone up in proportion there would be a couple of thousand, not many more than staff my own hospital. Instead, there are a quarter of a million. Will medicine just get ever bigger, occupying more and more of our lives? Potentially, yes, and let us hope so. Not only will it get bigger as it gets more effective, it will get bigger as we need to spend less of our time on other occupations. It’s no coincidence that medical activity is increasing as other forms of employment are shrinking. It’s one of the luxuries progress buys us. As we develop we get better at doing many things with fewer people. Mechanisation reduces some opportunities but by making the world more efficient it creates others. That could leave us with greater inequality, and a large underclass of long-term unemployed, or it could make us strive to ensure that entry into the professions that remain is never limited by lack of education. It’s certainly not limited by genes. Societies with fewer menial jobs will need to direct ever more investment into education. Increasing wealth should mean a rising number of people employed in education and in health care. More is not necessarily better when it comes to health and education but a better world certainly has more. There are no limits to education. Whether results are worth the expense and the effort depends on how much we have to spend and on how wisely we spend it. The same applies to medicine. The notion that health-care services will shrink as we get better at preventing illness was a fond fantasy and a bad mistake. Age shall always weary us, and the years condemn, and staving off premature death ever more effectively will only make the problems of old age weigh more heavily. As we cope with those problems we have to remember to celebrate the fact we have them.
Health promotion and healthy living do not catch the idea of what medicine is now about. The notion that we ought to live well, physically speaking, has a moral edge to it, an edge sharp enough to cut. Its puritanism contains more than a seed – more even than a sapling – of the notion that the proper way forward is not to swallow pills or submit to surgery but to eat little, drink less, exercise heartily (preferably out of doors and in bracing weather while wearing shorts) and fend off illness through strength of self-denying spirit. One sees this in the policies that health promotion and healthy living lead to and in the way they are pushed. It accounts for the absence of clarity about the evidence, and the absence of quantification when it comes to what we buy ourselves through brisk walks and small meals. The lack of evidence and lack of clarity result from the policies being based on puritanism, not science. Puritanism believes healthy living is a moral good, and moral goods need not be justified by measurement or assessment. ‘The Puritan hated bear-baiting,’ wrote Macaulay, ‘not because it gave pain to the bear but because it gave pleasure to the spectators.’ That spirit still lives.
The prolongation of healthy life is what medicine is now about, but this is obscured by its overlap with the tradition of good advice and puritanism. The overlap is harmful. Critics of medicine’s increasing tendency to intrude into our healthy lives take it for being puritanism. The extra days of health that pills buy us are not worth their costs, these critics point out. They should be encouraged: encouraged and answered, for sometimes they might be right. Health prolongation is no more something that should be sold on the basis of moral superiority than is healthy living. The precise costs and benefits of pills need measuring. Only on the basis of good measurements, of what they have to offer and what burden they bring, can anyone decide whether to take them. There should be no obligation on anyone to accept or to refuse any part of the therapeutic arsenal. But there should be an obligation on everyone to be clear what they are about. Rejecting the proven benefit of a drug in the full understanding of what it is will always be a reasonable response, but if the person doing the rejecting is simultaneously swallowing a pill proven to have no benefits and some harms, as with many vitamins, they are not thinking clearly enough to be reasonable. To defend a choice as being a thoughtful one is always a good defence except when the thoughtfulness is a fraud.
‘Wait thirty years and then look out over the earth. You shall see marvel upon marvels, added to those whose nativity you have witnessed; and conspicuous above them you shall see their formidable Result – Man at almost his full stature at last! – and still growing, visibly growing, while you look.’ That was Mark Twain’s prediction in 1889.30 There is no reason not to aim for that prediction always being true, save that the notion of a full stature is a false one. Neither evolution nor our genes have placed limits we need bump our heads on. Physical height will not extend indefinitely but culture and intelligence and thoughtfulness offer gardens for cultivation that have no boundaries.
Twain wrote that as part of Walt Whitman’s seventieth birthday celebrations. The hazards of reductionism, the way they damped down the fires of imagination and froze the human face divine into a mask of theory, were as palpable to Whitman as they had been to Blake and Wordsworth.
Of physiology from top to toe, I sing;
Not physiognomy alone, nor brain alone, is worthy for the Muse
I say the Form complete is worthier far.31
Reducing phenomena to their constituents is a hazard only when taken too far. In some situations it works, and works marvellously. Science depends on it, just as it depends on not stretching it beyond the bounds of what is likely to add to understanding. History gives us a guide to where those bounds are, and where they are likely to be tomorrow. Much of what most fascinates us will remain enthralling rather than explicable. Science will never pluck out the heart of our mystery. When it gets close to trying it ceases to function and it ceases being science.
Beware of trying to say too much. In his diary the ageing Whitman wrote of adjusting himself to the half-paralysis that followed a stroke. ‘The trick is, I find, to tone your wants and tastes low down enough, and make much of negatives, and of mere daylight and the skies.’32 Predictions need to be good enough to be wrong. The grandest aren’t. Modest successes come from modest efforts. We should all properly be half paralysed in the face of the world’s complexity, and nowhere more than when we prophesy. I have made much of negatives, of what does not seem possible or likely or predictable. But the daylight and the skies of history – the brave o’erhanging firmament, the majestical roof fretted with golden fire – make for a fine view of what we have achieved and what is stretched ahead of us. What a piece of work is man. The beauty of the world, the paragon of animals: and a work in progress.
Of Life immense in passion, pulse, and power,
Cheerful, for freest action form’d under the laws divine,
The Modern Man I sing.
* Fleming discovered a property of a broth of penicillin mould. It was Florey, Chain and Heatley who isolated the specific molecule. Had they given it a new name, as is normal when a drug is discovered, their role would have been clearer. Bayer, when they discovered Aspirin and Heroin, would never have dreamt of calling them Willow Bark or Poppy.
† ‘As Darwin discovered the law of evolution in organic nature,’ said Engels, ‘so Marx discovered the law of evolution in human history.’ Cited in Terence Ball, ‘Marx and Darwin: A Reconsideration’, Political Theory (1979) 7(4):469–483. To describe the socioeconomic structures he saw in society, and to point out who benefited and who did not, was one thing. To claim human society had to be seen in the light of the model, rather than the model in the light of observations of society, was another. It was consistent with his false analogy between biology and social history.