1 Science and Society

There is not a discovery in science, however revolutionary, however sparkling with insight, that does not arise out of what went before.

From Adding a Dimension: Seventeen Essays on the History of Science, Isaac Asimov (1964)

I feel the story should be told, partly because many of my scientific friends have expressed curiosity about how the double helix was found, and for them an incomplete version is better than none. But even more important, I believe, there remains a general ignorance about how science is ‘done’. That is not to say that all science is done in the manner described here. This is far from the case, for styles of scientific research vary almost as much as human personalities. On the other hand, I do not believe that the way DNA came out constitutes an odd exception to a scientific world complicated by the contradictory pulls of ambition and a sense of fair play.

From The Double Helix, James D. Watson (1968)

The saddest aspect of life right now is that science gathers knowledge faster than society gathers wisdom.

From Isaac Asimov’s Book of Science and Nature Quotations, Isaac Asimov and Jason A. Shulman (1988)

1.1 What’s It All About?

This is a book about bioethics but we are starting with a consideration of the practice of science and its relationship with wider society. Why? Consider the following four case studies:

These case studies are, on the surface, very different from each other. However, they all describe situations in which ethical dilemmas have been raised by advances in science and by the way that the science, through its application, may have impacts on the lives of individuals and/or on wider society. The issues presented in these case studies are discussed in detail in later chapters. In the meantime, it is important to consider briefly the factors that influence our decision‐making in these and similar situations:

Thus, science is one of the factors that inform bioethical decision‐making; we cannot avoid thinking about science: why and how it is done, and how it relates to wider society. And that is what we explore in the rest of this chapter.

1.2 What Is Science?

1.2.1 Introduction: Some History (But Not Very Much)

We get the word science from the Latin scio, which means ‘I know’, and in its original usage science simply meant knowledge. The application of the word specifically to knowledge about the material nature of the universe, gained by a particular set of methods, dates back less than 200 years (see the more detailed discussion towards the end of this section). Some of those whom we regard as the great scientists of the past, such as Isaac Newton or Robert Boyle, would not have called themselves scientists. Indeed, Newton’s position at Cambridge was Lucasian Professor of Mathematics and his major work was called (translating from the Latin original) The Mathematical Principles of Natural Philosophy. The term natural philosophy denoted what we now call ‘science’, although, because of the emphasis of the science of the time, in practice it came to mean physics. Indeed, it was used in this way in the older Scottish universities well into the second half of the 20th century. However, we now have a very clear idea of what we mean by the more general term science: the word implies a whole approach to the material world, to methods of acquiring knowledge about that world and to the body of knowledge thus acquired. So, how did we arrive at this situation?

To an early human being, the world around them must have seemed a strange and often hostile place. It was certainly a place of contrasts, embodying both provision and threat. So while plants could be harvested, some were poisonous; while animals could be hunted, some animals, including some quarry animals, were very dangerous. Further, there were (and indeed still are) unpredictable and often devastating events such as storms, earthquakes and volcanic eruptions. Nature was not to be taken lightly and it was important that knowledge of the positive and negative aspects of the natural world was passed on verbally from generation to generation. Doubtless humankind’s investigation and knowledge of nature remained at this level for tens of thousands of years. However, dating from over 75,000 years ago, there is evidence of art; as that art, over successive millennia, became more sophisticated, it came to rely on quite detailed observations of nature. One has only to look at rock art and cave paintings in places as diverse as Australia, France, Siberia, South Africa2 and Spain, dating from between 25,000 and 10,000 years ago, to become aware of this. Furthermore, as cultures evolved, so did descriptive knowledge of the times and seasons, so that there was confidence that the sun would rise daily, that the seasonal rains would fall, that certain animals migrated and that plants grew at particular times. Some of that knowledge may have been very sophisticated; in Britain, for example, the alignment of particular stones in the stone circle at Stonehenge with the sunrise on the summer solstice and the sunset on the winter solstice indicates quite a detailed knowledge of astronomical events through the year. Stonehenge dates to about 2800 BC, around the same time as the building of the pyramids at Giza in Egypt was under way. The alignment of the pyramids shows that the Egyptians could ascertain the direction of true north, another indication of growing knowledge of the natural world.
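For the quantitatively minded reader, the astronomical knowledge embodied in such an alignment can be illustrated with a modern calculation (ours, of course, not anything the builders could have written down; the figures use present-day values and ignore atmospheric refraction and the height of the local horizon). The azimuth $A$ of sunrise, measured east of north, satisfies

$$\cos A = \frac{\sin\delta}{\cos\varphi},$$

where $\delta$ is the Sun’s declination ($\approx +23.4^{\circ}$ at the summer solstice) and $\varphi$ is the observer’s latitude ($\approx 51.2^{\circ}$ at Stonehenge). This gives

$$\cos A \approx \frac{0.397}{0.627} \approx 0.63, \qquad A \approx 51^{\circ},$$

that is, a midsummer sunrise in the north‐east, which is indeed the approximate direction of the monument’s solstitial axis. Fixing that direction in stone implies careful observation repeated over many years.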

The Egyptians also put knowledge about the natural world to good use in their daily lives. The river Nile provides water in a land that would otherwise be very arid. The ancient Egyptians observed that the river flooded every spring and that the silt spread by the floods provided a fertile substrate for growth of crops. Indeed, by measuring the volume of the flood water at different places, estimates were made of the likely crop yield that year (and therefore what the tax ‘take’ was likely to be!). But, despite this sophistication, apparently there was no knowledge of the spring run‐off from the mountains of the upper Nile basin that causes the annual flooding. The Egyptians were thus observers of nature as it affected their lives, but was this science? They also applied their observations and had significant engineering expertise, expertise good enough for the building of the pyramids, but again we may ask, was it science?

To the extent that science simply equals knowledge, the ancient Egyptians (and the ancient Britons who built Stonehenge) were scientists. But as far as we can tell, there was no theorising about the reasons for the phenomena they observed, beyond ascribing them to the work of myriad gods. It was in Greek culture, with its emphasis on mind, that theorising about the reasons for, and the nature of, the universe began to flourish, and this theorising was tied in with other areas of thought, including especially mathematics, philosophy and ethics (see Chapter 2). The Greeks, like the Egyptians, were accomplished builders and technicians, putting their knowledge to practical use. But they were not great experimenters, despite Archimedes’s fortuitous bath‐time discovery about volume and water displacement, from which he cleverly deduced information about the density of metals. So, although the flowering of Greek culture saw the development of theories about many natural phenomena, even a great physician such as Hippocrates carried out very little actual experimental testing of those theories. Nevertheless, the Greeks added significantly to our knowledge of the universe and thus they practised science. Indeed, their knowledge of planets and stars, albeit at a time when most believed that the Earth was the centre of the solar system, enabled the construction, in about 150 BC, of a simple analogue ‘computer’, the Antikythera mechanism, with which to calculate the positions of planets and stars (Figure 1.1).
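Archimedes’ deduction about the density of metals can likewise be expressed as a simple modern calculation (the notation and the illustrative numbers are ours, not his). An irregular object’s volume equals the volume of water it displaces, so its density follows directly:

$$\rho = \frac{m}{V_{\text{displaced}}}.$$

For example, a crown of mass $1000\,\mathrm{g}$ that displaces $60\,\mathrm{cm^3}$ of water has a density of about $16.7\,\mathrm{g\,cm^{-3}}$, well below that of pure gold ($\approx 19.3\,\mathrm{g\,cm^{-3}}$) and consistent with adulteration by a lighter metal such as silver ($\approx 10.5\,\mathrm{g\,cm^{-3}}$).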


Figure 1.1 The ancient Greek Antikythera mechanism (c. 150 BC), a simple analogue computer.

Source: Picture from Wikimedia Commons, reproduced under the terms of the GNU Free Documentation Licence. https://creativecommons.org/licenses/by‐sa/3.0/deed.pt.

Thus it is legitimate to ask whether lack of an experimental approach precludes an activity from being called science. As the Egyptians, the Greeks and possibly the builders of Stonehenge show, information about how the universe works can come from careful and repeated observations and measurements; how else would knowledge of planetary movements, for example, have been obtained? Further, Aristotle’s vision of what we now call science was a vision of a dual path: generalising from specific observations to universal laws, and then reasoning back from universal laws to predictions about what might be observed. The ability to make predictions, which may themselves be tested, is today regarded as a criterion for the validity of scientific hypotheses.

Continuing with this theme, there has recently been increased interest in the science carried out in medieval times, showing clearly that, in western Europe, investigation of the natural world did not go through a ‘dark age’ in which enquiry was suppressed by the Church. The scholar, philosopher and theologian Robert Grosseteste (1175–1253), who became Bishop of Lincoln, provides a clear example of the scientific activities of those times. He clearly understood the importance of Aristotle’s dual‐path vision (above) and has been described by the science historian Alastair Crombie as ‘the real founder of the tradition of scientific thought in medieval Oxford and in some ways, of the modern English intellectual tradition’. He introduced to western Europe the concept of controlled experiments and related that approach to observational science, as one among several ways of arriving at knowledge of the natural world. Grosseteste’s books on light (De luce) and on the rainbow (De iride) show a great understanding of the nature of light, of optics and of colour. He conjectured that the universe was born in an explosion3 followed by the crystallisation of matter to form stars and planets; in De luce he presented the first attempt to describe the Earth and the heavens using a single set of physical laws. Indeed, the ‘Ordered Universe’ research group4 are very much in agreement with Crombie (see above) and regard Grosseteste’s work as a clear demonstration that pre‐Renaissance science was far more advanced than we previously thought.5

However, as we hinted above, much of the scientific and scholarly activity of medieval times has been overlooked or forgotten, so much so that Robert Grosseteste has been described as ‘the greatest mind you’ve never heard of’. And so we jump forward four centuries to Francis Bacon (1561–1626), who played a key role in the formalisation of science. He was very impressed by the discoveries of Copernicus and insisted that understanding nature required evidence that could only be gathered by experiment, by careful measurement and by rigorous observation. This has become known as the Baconian revolution, and Bacon is often referred to as the Father of Science and the Secretary of Nature, titles which, with our new understanding of medieval scholarship, now seem a little ‘unfair’ to Grosseteste and other scholars. Bacon and his contemporary, Galileo, are credited with abolishing forever the Aristotelian view of nature (notwithstanding the importance of Aristotle’s ‘dual path’ approach; see above). The adoption of Bacon’s concepts led to a rapid expansion of scientific knowledge in the 17th and 18th centuries, typified by, for example, the work of Newton, and leading thence to modern science.

We need to make one last point. As we mentioned briefly above, at the time of Bacon, and indeed of Newton, the term science was not used to describe the systematic investigation of the natural world, nor would its practitioners have called themselves scientists. They were ‘natural philosophers’. The term natural philosophy may be taken as meaning love of wisdom about the natural world (Greek philo, loving; sophia, wisdom). The coining of the word ‘scientist’ can be dated to 24 June 1833. At a public meeting, the poet Samuel Taylor Coleridge told the natural philosophers, ‘You must stop calling yourselves “natural philosophers”’: what we now call the ‘scientific method’ did not resonate with the poet’s view of what philosophy was. The geologist, mathematician and philosopher William Whewell was quick to respond with the word ‘scientist’, and with it the sense of science as the gathering of knowledge about the universe by a particular set of methods. That sense of the word is still embodied in the way we use it today, but for some its meaning has grown not only to include the scientific method and the knowledge obtained by that method (Latin scire, to know; scio, I know) but also to carry the implication that it is the only source of knowledge about the universe. The latter view is known as scientism.6

1.3 Modern Science

Science as practised in the 21st century continues to embody the principles set out by Bacon, and thus we can say that science is an investigation of the material nature of the universe by a set of methods that include observation, experiment, the formulation of hypotheses and the testing of those hypotheses. But within that overall definition, there is room for much variety. Different sciences place different emphases on observation and experiment. Hypotheses come in different forms, as do methods of testing them. Science as practised is not a single type of activity, although it all takes place within a single overarching framework. This was clearly understood by Nobel laureate James Watson, whose words head this chapter.

Let us then open this up a little more and explore briefly some ideas in the philosophy of science and the nature of scientific knowledge. This is important because misunderstandings of what science is and how it works can lead to negative attitudes to science, to scientists and to the applications of science. A most important basic principle is that, at any one moment, scientific knowledge is incomplete (we do not and cannot know everything) and provisional (it is possible that our current understanding may be modified by subsequent findings). For this reason many aspects of scientific ‘knowledge’ are actually hypotheses that are open to further testing. Nevertheless, scientists assume that there is an objective reality to which this partial and provisional knowledge relates. This is what the science philosopher Karl Popper called verisimilitude – approach to the truth. Progress in scientific knowledge and understanding is generally said to be made by the ‘scientific method’ outlined above, and in particular by the testing of hypotheses. Further, Popper maintained that ‘real’ hypotheses are those that carry the possibility of being proved wrong (i.e. they are falsifiable). So, according to this view, science can only progress by the formation of falsifiable hypotheses that are then tested by further work. Put like that, it seems a very sterile description of an activity that many find very exciting.

Indeed, amongst many practising scientists and growing numbers of science philosophers, there is a view that the ‘Popperian’ approach to science is too sterile and stereotyped. Science as actually practised is more flexible. It embodies serendipity (making significant discoveries by accident, as has happened for one of us), intuition (in which an interpretative leap is made that goes beyond the strict limits of what the data tell us) and even guesswork. When Watson and Crick turned one strand of the double helix upside down (and in doing so achieved a workable and essentially correct model for the structure of DNA), they were acting on either a ‘lucky’ guess or a piece of brilliant intuition, depending on whom one reads. So science can make progress by methods other than the direct testing of specific hypotheses, although, of course, these ‘non‐conventional’ findings can themselves be verified or falsified by subsequent work, as in the case of the double helix, where the opposite orientation of the two strands was confirmed by experiment.

The strictly conventional view of science also fails on two other grounds. Firstly, it is clear that scientific hypotheses come in a variety of forms; some are so well established, and so widely and generally applicable, that they should be regarded as paradigms. In scientific language they are usually called theories. Evolution comes into this category. Indeed, the use of the term ‘theory of evolution’ has led to a good deal of misunderstanding amongst those who seek to promote other views: in scientific usage the word theory indicates something that is very well established. On the other hand, some hypotheses are very local in application and may also be very tentative because of the scarcity of relevant information, such as when we have data based on observations of just a tiny number of patients or from one small experiment. Secondly, Popper’s description of real hypotheses as those capable of being proved wrong cannot be universally applied. Experiments are often carried out in order to ascertain whether there is evidence to support, rather than refute, a hypothesis. Further, there are some facets of scientific knowledge that, as pointed out by John Polkinghorne,7 are here to stay; these include atoms and the helical structure of DNA. In our view, then, Popper’s account does not fully accommodate such permanent gains in knowledge.

1.4 Science, Ethics and Values

1.4.1 Introduction

Science progresses in a stepwise manner; some of the steps are large (and then the public media often talk of a breakthrough) but mostly they are small. But whether the steps are large or small, and whether the new data support or refute an earlier hypothesis, one thing is clear: scientific progress depends on what has gone before. If one of us sets up an experiment based on published data, it is expected that those data were not falsified or fudged and that the author in whose paper the data appear has given a correct account of what he or she has done. We can only see further than previous scientists because we are, metaphorically, standing on their shoulders (whether or not they are giants8). The reader will be quick to appreciate that this implies a trust in those who have gone before, a trust that they did not make up their data. Without this ability to trust what other scientists publish, the whole edifice of science would tumble. A parallel situation occurs in competitive sport, where throwing a game for the sake of financial reward and cheating to achieve victory are both seen as going against the whole ethos of sporting competition. Thus, amongst other responsibilities, a scientist has ethical responsibilities to the whole science community, indeed to science itself. To suggest that a scientist has lied about his or her results (as has happened in some of the debates about genetic modification of crops; see Chapter 10) is therefore a very serious accusation.

1.4.2 Scientific Fraud

Despite the seriousness of scientific fraud, it certainly takes place, in various forms: fabricating data, manipulating data in ways that are not justified and claiming other people’s data as one’s own (plagiarism). With scientists under increasing pressure to produce results, to publish ‘significant’ results in high‐impact journals, to ‘win the race’ to make a particular discovery or to obtain the next large grant, scientific fraud appears to be increasing in frequency.9 There is no doubt that instances of fraud give science a bad name, albeit temporarily. When discovered, the perpetrators of fraud invariably lose their jobs, either because their employment is terminated or because they resign. In very serious cases, fraudsters may be stripped of previously earned awards and honours, even if those were earned legitimately. The scientific community treats fraud as a very serious breach of scientific ethics, not least because, as we stated in the previous paragraph, it is essential for the progress of science that we can trust those who have gone before. We will encounter a number of examples of scientific fraud in other chapters (see Chapter 5 for some especially notorious cases). In the meantime, readers who have an interest in this topic are referred to some helpful books.10

1.4.3 Science and Societal Values

In addition to the ethics specifically associated with the practice of science, we must also emphasise that science is not value‐free. The image of the scientist working in a social vacuum, driven purely by curiosity, is no longer valid and perhaps never was. At the personal level, scientists may speak of competition, of the race to reach a particular research goal and of the desire to have one’s name associated with a major discovery. James Watson suggests that he and Francis Crick set out to solve the structure of DNA because it was then the biggest prize in biological science. Personal ambition is often a major driver of the scientific enterprise, but more altruistic motives may also lead to research on particular topics; for example, some are drawn to work on vaccines for malaria or on drought‐tolerant crops because they hope for applications that will aid less‐developed countries. Scientists do not leave behind their aspirations, world view or personality when entering the lab. Indeed, these may affect the choice of research area and the context in which the research is performed.

For scientific discovery there is an important parallel here with learning theory in general. The influential Russian psychologist Vygotsky wrote of the importance of the ‘zone of proximal development’, the idea that the social and physical environment is vital for learning to take place. He believed that a successful learner is in some manner ‘scaffolded’ – supported – by ‘able others’. Whilst scientists pushing forward the frontiers of our understanding can often make intuitive leaps based on the interpretation of their observations, such individual effort is usually, in terms of the overall investigation, a small part of the whole, and others will have contributed significantly to that breakthrough moment.

So, the context in which science is done is socially constructed. The gentleman or lady scientist doing original research paid for from their own financial means is today very rare indeed. Science has grown into a major world activity, embedded in national economies and employing many tens of thousands of people across the world. In the developed world, the applications of science are woven into our daily lives and are very much taken for granted. Science publishing is now a major business, with thousands of journals, increasing numbers of which are published only in electronic form, competing with each other to attract the best research papers in their particular subject areas. Modern science needs extensive funds, and the allocation of funds for particular types of research is a societal decision, whether made as a result of government policy or of industrial priorities. Even in so‐called blue‐skies research, it is easier to obtain funds for some research topics than for others. Resource allocation reflects what society at the time deems to be valuable.

So then it is clear that there are ethical issues arising from some types of scientific research. These include the use of animals, possible environmental damage, the participation of human subjects, concerns about possible applications of results and the allocation of scarce (financial) resources, to mention a few. There are also issues relating to individual and to societal values. We cannot say that science is value‐free, although some scientists still try to claim that it is. All these have a bearing on the way that science is regarded and on the way that its findings are applied. We therefore continue by examining changing attitudes to science.

1.5 Attitudes to Science

1.5.1 Science and the Enlightenment

Societal attitudes to science in the early years of the 21st century are somewhat different from those of 50 or so years ago. A closer look at changes in prevailing world views shows why this may have occurred, especially in northern Europe. The Baconian revolution in science occurred very early in a period characterised by an intellectual movement known as the Enlightenment, which, from roots in the 16th and 17th centuries, flourished especially in the 18th century on both sides of the Atlantic.11 The Enlightenment placed great value on the abilities of humankind; the Church was no longer seen as the source of all knowledge. The use of human reason was regarded as the major way to combat ignorance and superstition and to build a better world. Many adherents of the Enlightenment rejected religion and were thus humanists. On the other hand, there were also Enlightenment thinkers who did not reject religion; they regarded the human mind as the pinnacle of God’s creation. Thus, whether religious or not, members of the Enlightenment movement placed great stress on the human intellect. Combining this stress on the intellect with the Baconian approach to investigating nature placed science in very high esteem.

1.5.2 Science, Modernism, Modernity and Postmodernism

Although the Enlightenment as a movement died out towards the end of the 18th century, many of its attitudes continued into the 19th century,12 including, for the most part, a positive attitude to science and its applications. There were, however, some voices of dissent, early signs of an arts–science divide. Goethe suggested that the view of the world espoused by Newton and his successors was cold, hard and materialistic, turning nature into a machine. The romantic poet Keats, referring to Newton’s work, wrote:

Philosophy will clip an Angel’s wings,

Conquer all mysteries by rule and line,

Empty the haunted air, and gnomed mine –

Unweave a rainbow…

However, in general, the 19th century witnessed widespread applications, especially of the physical sciences, in technology and engineering. There was continued confidence that science could reveal objective truth about the world and that human ingenuity could put that knowledge to good use. Thus emerged a philosophy known as modernism that, although we can trace its beginnings back through the Enlightenment to the Baconian revolution, flourished in the later years of the 19th century right through into the middle years of the 20th century. There was a confidence that a better world could be built through science and technology. In the arts, according to JG Ballard,13 modernists wanted to strip the world of mystery and emotion. Thus, according to modernists, previous and traditional forms of art, architecture and literature were now outdated in an increasingly industrialised world. The poet Ezra Pound typified this approach with his clarion call to ‘Make It New’ in 1934.

However, for many, the occurrence of two world wars dented idealistic views of humans as moral agents. Despite this, there remained an immense confidence in humankind’s creative and technological abilities. Indeed, there was a widening acceptance in Western cultures of modernity. This is subtly different from modernism14 in that it embodies a strong reliance on evidence, an increasing level of secularisation in a world dominated by capitalism and a very high regard for progress. Thus, in 1963, Harold Wilson, shortly before becoming prime minister of the United Kingdom, spoke of the country benefiting from the ‘white heat’ of technological revolution. Science and technology shaped many aspects of culture in the 1960s on both sides of the Atlantic. The contraceptive pill opened the way for a widespread change in sexual behaviour at a time when traditional values were being widely questioned; there was great public interest in the conquest of space; telecommunications and information technology were on the verge of huge expansion. The press (but not the science community) spoke of nuclear energy as likely to provide ‘electricity too cheap to meter’, thus providing an ‘atoms for peace’ counter to background angst about nuclear warfare. Such confidence in science in all its aspects continued, in general, right through the 1970s.

However, the arts–science divide that had surfaced early in the 19th century was becoming more marked. The scientist, public administrator and novelist C.P. Snow wrote extensively in his novels about the work of scientists in public life and about the relationship between science and other aspects of society and culture. In 1959, he coined the term ‘the two cultures’ to describe a great divide between science and the arts amongst the educated classes in the United Kingdom. His claim was that, despite the central position of science and technology in modern life, a high proportion of well‐educated people understood very little about science, a cultural divide that continues today. Further, we need to note that in the 1970s a philosophical shift had already started: a shift towards postmodernism.

In order to understand this philosophical shift, we need to look back to the 19th‐century philosopher Nietzsche. On the basis of his view that ‘God is dead’, he suggested that there are no external reference points; individuals define their own moral and cultural values and are indeed free to ‘reinvent’ themselves. This leads to a fragmentation in ideas about truth and culture. If individuals can define their own moral values, then there is nothing to stop a person deciding on courses of action that work out best for themselves rather than with reference to any wider framework. This approach to moral decision‐making is known as rational egoism and is the most extreme of the consequentialist ethical systems (discussed in the next chapter), in that it considers only the consequences for the individual making the decision. It is thus in the philosophy of Nietzsche that we see the origins of postmodernism: the belief that anyone’s world view, concept or version of the truth, or ethical value system, is as valid as anyone else’s. If this leads an individual to adopt rational egoism as an ethical system, so be it.

Although the roots of postmodernism were planted in the 19th century, its growth and flowering have been very much a feature of the 20th and early 21st centuries. It is not our intention here to discuss this philosophy in detail, but we do need to mention some of the main strands within it. In the United Kingdom, the Cambridge philosopher Wittgenstein insisted that words, including scientific terms, must be interpreted in their social context. Taken to its ultimate conclusion, this leads to the view that no word can have a universally accepted meaning15 and that there can be no underlying universal truth. That conclusion is certainly reached by writers such as Derrida and Foucault and by the ‘deconstructionist’ school of literary criticism, all of whom emphasised the absence of universal truths, of overriding themes or ‘metanarratives’. In some academic circles, it is now acceptable to state that ‘all things are relative’, despite the inherently self‐defeating nature of this statement; relativism has thus become a distinct philosophy under the postmodern umbrella.

Although the average person in the street has probably not heard of postmodernism, this mode of thinking has certainly seeped into popular culture, especially in northern Europe (rather less so in the United States). Although in the second decade of the 21st century there is evidence that the influence of postmodernism is beginning to decline, it is nevertheless probable that most people in the United Kingdom, especially those under 55, think in a postmodernist way, very much influenced by media that are pervaded by postmodernism. An overarching postmodernism will clearly affect general ethical thinking, as mentioned above and as discussed in the next chapter. But what about science?

If all ‘truth’ is culturally constructed, then that will include scientific truth.16 Postmodernism will thus argue that published scientific data have little or no relation to objective reality, even if it is accepted that the scientists themselves published those data in good faith. In the most extreme versions of this view, it is suggested that the actual results obtained by scientists are socially constructed. Obviously, if this were so, the whole edifice of science would collapse, as we mentioned earlier in the context of falsification of data: experiments done in one continent within one culture would yield different results from the same experiments done in another continent within another culture. That is not the experience of the scientific community, and scientists in general have not espoused postmodernism, at least in respect of science. Postmodernism is thus seen as a threat to science. However, scientists do acknowledge that, because science is an activity of people, its practice is not free from personal values, including reasons for choosing particular lines of research, personal ambition, altruism, desire for recognition and so on (as mentioned earlier). Science is not done by robots. Further, in the practice of modern science, some types of research are regarded as more deserving (or demanding) of financial support than others; there is thus, as we noted earlier, a strong societal element in the support of science. Nevertheless, scientists argue strongly that the actual results obtained in scientific experiments are not socially constructed: the source of the money does not determine the outcome of the research. It is acknowledged, however, that there are cases in which results have been suppressed because of commercial interests (as happened, for example, in the tobacco industry with data indicating the adverse effects of smoking on health) or, more generally, because the results do not support the policies or activities of the funders.

1.5.3 Postmodernism and ‘Pseudo‐modernism’

Although the influence of postmodernism is still pervasive (despite the signs of decline mentioned above), many social commentators believe that it is currently giving way to another mode of thinking, named by some as pseudo‐modernism. Here it is not that there are no overarching truths or metanarratives; rather, anyone can claim expertise as to what those truths or metanarratives are. As discussed in the next section, real expertise is now often regarded with as much suspicion as respect. Although we are not of the opinion that scientific experts should be put on a pedestal, we nevertheless suggest that pseudo‐modernism may be as unhelpful to science as postmodernism. Indeed, some commentators suggest that in this rather vague ‘twilight zone’ it is possible for activist groups opposed to particular technologies (e.g. GM crops; vaccines) to establish a ‘parallel science’ that apparently supports the ideologies of those activists.

Thus Marcel Kuntz writes:

[While] Pseudo‐sciences may harm naive believers, parallel ‘science’ is harming democracy. It is a component of a predetermined political project to the exclusive benefit of the ideological views of a minority. ‘Parallel science’ seemingly resembles science, but it differs from science since its conclusions precede experimentation.

Parallel ‘science’ has been created to replace scientists, especially in risk assessment, by ‘experts’ (often self‐proclaimed) supportive of a political project. This parallel ‘science’ is hidden behind positive‐sounding terms, such as ‘citizen science’ or ‘independent’ or ‘whistleblower’, while mainstream scientists are accused of having ‘conflicts of interest’ or having ties with ‘industry’. In order to further propagate distrust in current risk assessment, parallel ‘science’ will invoke unrelated past health problems or environmental damages, but never the way science has solved problems.17

1.5.4 Public Attitudes to Science

Finally, we return to public attitudes to science. There is certainly a greater ambivalence towards science now than in the middle years of the 20th century, at least in northern Europe. So, although the influence of science and technology is as central as ever, it is not uncommon to hear anti‐science views expressed, some of which echo the words of Goethe and Keats. The seepage of postmodernism, and now pseudo‐modernism, into general modes of thinking has affected societal attitudes to science. Thus, an individual may accept or reject a particular scientific finding according to whether it coincides with that individual’s pre‐existing ideas or, indeed, whether it is useful to do so. The discovery of a gene involved in a particular disease may be hailed as a breakthrough: the application of science in medicine generates a good deal of respect for the authority of science. On the other hand, the findings of some scientific enquiries are rejected because they do not coincide with the views of particular groups. My view is as valid as yours, even if you are the so‐called expert. Thus, in the United States, Tom Nichols writes18:

I fear we are witnessing the ‘death of expertise’: a…collapse of any division between professionals and laymen, students and teachers, knowers and wonderers – in other words, between those of any achievement in an area and those with none at all. By this, I do not mean the death of actual expertise, the knowledge of specific things that sets some people apart from others in various areas. There will always be doctors, lawyers, engineers, and other specialists in various fields. Rather, what I fear has died is any acknowledgement of expertise as anything that should alter our thoughts or change the way we live.

Similar sentiments were expressed by the UK journalist Lucy Mangan19:

Being a bona fide expert is a tricky business these days. Expertise is no longer something to be admired. It is a liability. It seems, to our confused modern sensibilities, somehow elitist and undemocratic. What was once respected – the careful, deliberate acquisition of knowledge – is now an affront.

The TV scientist Brian Cox has commented, in respect of ‘anti‐expertise’, that ‘it’s entirely wrong, and it’s the road back to the cave’. Nowhere was the rejection of expertise more apparent than during the lead‐up to the 2016 referendum on whether the United Kingdom should remain in the European Union (EU). Expert opinion in a range of fields was almost unanimous in suggesting that leaving the EU would be very damaging to many aspects of UK life, including, incidentally, science. Nevertheless, senior politicians campaigning to leave the EU were very happy to state that ‘The British public are sick of experts’. A similar situation occurred during the 2016 American presidential election campaign, during which one polling organisation stated that ‘facts just don’t cut it’. Indeed, Oxford Dictionaries now lists the word post‐truth, reflecting a ‘post‐truth’ society in which objective facts are valued far less than emotional reactions and ‘gut responses’. In the world of politics it means that campaigners can get away with saying things that are obviously untrue. In the world of science and medicine it leads to situations like that in Texas where, by 2016, there were over 40,000 families in which children had not been vaccinated and where there is a clear anti‐vaccination ‘thread’ in Republican party policies.20

However, in the United Kingdom, a society‐wide 2014 Ipsos‐MORI poll21 actually revealed a wide range of attitudes to science. So, for example, although about 80% of respondents thought that the findings of science are important in our lives, only 55% thought that the benefits of science outweigh possible harmful effects. Related to this, in a similar survey in 2011, 56% had agreed that ‘people shouldn’t tamper with nature’. In respect of the trustworthiness of scientists, 35% thought that scientists alter their results to get the answers they want (which, as we have noted already, is a cardinal ‘sin’ for scientists22). In the 2011 survey, 41% of respondents agreed that ‘scientists seem to be trying new things without stopping to think about the consequences’, while about a quarter agreed that ‘the more I know about science, the more worried I am’.

Thus, acceptance of scientific authority and of the validity of scientific findings is patchy. To some extent this reflects the sources of scientific information available to the wider public: a majority use one or more of the broadcast media (especially TV23), newspapers and, to a lesser extent, the Internet. In most cases they have no way of checking the validity of what they see, hear or read. This in itself is a problem because the broadcast media especially often set up discussions between a genuine expert and a total non‐expert (albeit one with strong views). This may be done from a distorted sense of fairness or in the postmodern spirit of the age in which everyone can be their own expert (see above), but it certainly does not help those without the relevant knowledge to obtain a clear understanding. An added problem is that those reporting science findings in the broadcast media or in the press may well not understand what they are trying to report. This is reflected in the 2014 survey, in which the majority of respondents believed that most journalists reporting science in news broadcasts (as opposed to specific science programmes) were not appropriately qualified and that they failed to check their facts. Thus, the public distrust the media on which they rely for information, and most think that scientists themselves should be better at communicating. The situation is further exacerbated when celebrities with no scientific qualifications make confident and often bizarre statements, usually about health, that have no scientific or medical basis.24 Unfortunately, their celebrity status often means that people take notice of them.

Finally, as illustrated in the case studies set out at the end of the chapter, science is widely used as a marketing tool. A survey carried out at the University of Exeter revealed that science was used in some way in the advertising or marketing of several hundred products, mostly but not entirely in the cosmetic and personal care sectors. Science is used in five different ways:

  1. ‘Sciency’‐sounding terms that mean nothing but are used to impress
  2. Genuine science terms used in a meaningless way
  3. Genuine science terms used in unsupportable claims
  4. Genuine science terms used correctly
  5. Science being used as a ‘validator’ as in ‘scientifically proven’ or ‘clinically proven’

It was also found that many of the genuine science terms used were actually beyond the average level of science knowledge amongst the public (Table 1.1), many of whom, in the United Kingdom at least, had not studied science past the age of 16 years.

Table 1.1 Some of the more advanced science terms used in advertising. Many people will have no idea of the meaning of these terms.

Antioxidant
Cellular matrix
Ceramide
Hyaluronic acid
Lycopene

Thus, there are many factors that do not help the wider public, even those who are genuinely interested, to understand science. This is further reflected, as also shown in the 2011 Ipsos‐MORI poll, in a rather ‘patchy’ and even inconsistent range of attitudes to the ethical issues that arise from modern science, especially bioscience. It is these issues that we now deal with in the rest of this book.

Key References and Suggestions for Further Reading

  1. Bell R (1992) Impure Science: Fraud, Compromise and Political Influence in Scientific Research. Wiley, Chichester, UK.
  2. Fang FC, Steen RG, Casadevall A (2012) Misconduct accounts for the majority of retracted scientific publications. Proceedings of the National Academy of Sciences USA 109, 17028–17033.
  3. Genetic Literacy Project. https://geneticliteracyproject.org/ (accessed 26 October 2017).
  4. Goldacre B (2008) Bad Science. Fourth Estate, London.
  5. Goodstein D (2010) On Fact and Fraud: Cautionary Tales from the Front Lines of Science. Princeton University Press, Princeton, NJ.
  6. Ipsos‐MORI (2014) Public Attitudes to Science, 2014. https://www.ipsos‐mori.com/researchpublications/researcharchive/3357/Public‐Attitudes‐to‐Science‐2014.aspx (accessed 18 September 2017).
  7. Kuntz M (2014a) OGM, la question politique. Presses Universitaires de Grenoble, Grenoble, France.
  8. Kuntz M (2014b) ‘Parallel science’ of NGO advocacy groups: how post‐modernism encourages pseudo‐science. Genetic Literacy Project, 15 July 2014. https://www.geneticliteracyproject.org/2014/07/15/parallel‐science‐of‐ngo‐advocacy‐groups‐how‐post‐modernism‐encourages‐pseudo‐science/ (accessed 18 September 2017).
  9. Kuntz M (2016) Scientists should oppose the drive of postmodern ideology. Trends in Biotechnology 34, 943–945.
  10. McLeish T (2014) Faith and Wisdom in Science. Oxford University Press, Oxford.
  11. McLeish T, Casper G, Smithson H (2015) Our latest research partner was a medieval bishop. The Conversation, 7 June 2015.
  12. Nichols T (2014) The death of expertise. The Federalist, 17 January 2014. http://thefederalist.com/2014/01/17/the‐death‐of‐expertise/ (accessed 18 September 2017).
  13. Okasha S (2016) Philosophy of Science: A Very Short Introduction, 2nd edition. Oxford University Press, Oxford.
  14. Sense about Science. http://senseaboutscience.org/ (accessed 26 October 2017).
  15. Stewart CN (2011) Research Ethics for Scientists. Wiley‐Blackwell, Chichester, UK.

Notes