Chapter Four
The Dangers of Radiation
IN BELITTLING THE dangers of radiation from the Fukushima reactors, Professor Sato of Fukushima Medical University (see previous chapter) follows a wider pattern of denial about the harmful effects of radiation, joining a surprisingly broad range of people and institutions. Such denial has a long history.
At the beginning of the atomic age, scientists understood little about ionising radiation, and took a casual approach towards exposure. Marie Skłodowska-Curie — the co-discoverer of radium, which she chemically separated from pitchblende — died of aplastic anaemia at the age of 66, as the result of her work with radiation. Her daughter, Irène Joliot-Curie, who, with her own husband, Frédéric, carried on Skłodowska-Curie’s research, died of leukaemia from the same cause. Skłodowska-Curie’s research documents and even her cookbooks were so strongly irradiated that they are still kept in lead-lined boxes in Paris and may only be accessed by researchers wearing heavy protective clothing. By 1897, a register of scientific statistics had recorded 69 radiation researchers who had suffered biological damage from their work. The number rose to 170 in only a few years. By 1911, 94 scientists and other researchers known as the ‘medical martyrs’ had contracted cancerous tumours induced by X-rays.
Americans were as casual about radiation as their French colleagues. Sabin von Sochocky was a doctor with a lucrative X-ray business in beauty parlours in New Jersey in the 1920s. He bombarded his patients with X-rays to cure arthritis, gout, hypertension, sciatica, lumbago, and diabetes. He advised some of his patients to drink radium in solution. He was also an amateur artist who used radium in his paintings because it glowed in the dark. During World War I, he employed 250 young women in his Radium Luminous Material Corporation factory in Orange, New Jersey, to paint luminous watch dials for the doughboys. He encouraged the women to lick their paintbrushes to give them a fine point before applying the radium. By 1924, some of his employees were crippled, their jaws, spines, and legs corroded by the alpha particles from the radium that had lodged in their bones. Many died, precipitating a call for new standards of protection from ionising radiation.
In 1928, the International Commission on Radiological Protection (ICRP) was established by British scientists concerned about the effects of radiation on the human body. After many years of research, it recommended in the early 1930s a maximum permissible dose for X-ray and radium workers of 0.2 roentgens per day, or 60 roentgens per year. In 1936, American scientists halved this to 0.1 roentgens per day, a measurement used for ‘standard workers’ of a certain height, weight, and age; this was strictly observed by staff in the Manhattan Project. But just before the ‘Trinity’ atom-bomb test at the Alamogordo, New Mexico testing ground in July 1945, the Army commander of the project, General Leslie Groves, refused on security grounds to warn nearby inhabitants — including sheep and cattle ranchers, and a tribe of Apache Indians — of the radiation dangers. They received more than 40 times the upper limit prescribed for the standard workers. Many suffered from cancers, and probably sustained mutagenic damage as well, but no epidemiological records were kept, and the matter was hushed up.
Official American radiation denial continued even after the atomic bombings of Hiroshima and Nagasaki in August 1945. Doctors in the US occupation force asserted in the weeks after the explosions that victims had died from flash burns and the kinetic effects of the blast only, refusing to admit the existence of ionising radiation, let alone an atomic ‘plague’. When Japanese doctors were finally permitted to enter the blast zones, several weeks later, they established for themselves what the Americans already knew — that people otherwise unmarked by blast or heat were affected by blood disorders, general debilitation, dehydration, anaemia, and other unspecified illnesses, and that many of them were dying. Later evidence of birth defects in the victims’ offspring, who had been in utero at the time of the bombings, was hushed up as much by the victims as by Japanese and American authorities. It was regarded as shameful to be hibakusha, or atom-bomb victims, and their prospects of finding healthy partners or social acceptance into Japanese society were small.
At the time, most Japanese internalised the human damage sustained at Hiroshima and Nagasaki as part of the price of losing the war. But American nuclear tests at Bikini Atoll in the Marshall Islands in the Pacific in 1954 created a firestorm of indignation in Japan when a Japanese fishing boat, the Daigo Fukuryū Maru, was showered with intense radiation. The American authorities were reluctant to admit that there had been miscalculations about the extent of the fallout, or that injuries had been caused, but they did pay compensation to the families of some of the victims who later died from related illnesses. (For a fuller account, see Chapter Six.)
Geneticists had earlier argued that radiation from cosmic rays and naturally occurring radioactive elements, such as uranium and thorium, had helped mankind evolve. But this view was modified by data derived from atmospheric nuclear testing, a frenzy of which had begun in 1951. The geneticists started to suspect that a substantial increase in radiation through artificial means could destroy the natural genetic equilibrium. The initial radium standard was quietly abandoned as the numerous fission by-products created in nuclear detonations were shown to attack the human body in different ways. Among the most notoriously carcinogenic were iodine-131, which can cause thyroid cancer; strontium-90, a calcium analogue that can cause bone cancer; and caesium-137, a potassium analogue which can irradiate muscle cells and nearby organs. Plutonium-239, which fuelled many of the atom bombs, was not as prevalent in the fallout from atmospheric tests, as most of it had fissioned in the explosions. (See Appendix for more detail on radio-toxic isotopes.)
The ICRP adopted a new occupational-health standard in 1950, which limited exposure by weekly rather than daily doses, providing greater safety through less exposure for atomic-plant operators. Workers could now face 0.3 roentgens a week or 15.6 roentgens a year. But there were still no national or international standards for people living in test areas, for the wider community, or indeed, for the military, who could be ordered to fight in radioactive war theatres. US troops were routinely ordered to march through the Nevada testing grounds where atomic bombs had been detonated, to prove that such weapons could be used in warfare without compromising the ability of the troops to fight. At Emu Field and Maralinga in outback Australia, British and Australian servicemen were similarly exposed to radiation from British nuclear tests, and Royal Australian Air Force pilots flew their Canberra jet bombers through atomic clouds to gather radiation samples.
The American government continued to promote the view in the wider community that a nuclear war with the Soviet Union was survivable and winnable. Nuclear air-raid shelters were built into the basements of tall buildings in major cities. Families were told how to construct their own fallout shelters, complete with food and water supplies, chemical toilets, and filters to keep out radioactive particles. School children were told that if an atom bomb was dropped while they were in class, all they had to do was hide under their desks until the danger had passed. This ridiculous optimism was later epitomised by the views of Thomas K. Jones, deputy undersecretary of defence during the Reagan years, which were derided by the writer Robert Scheer in his 1983 book, With Enough Shovels: Reagan, Bush, and nuclear war. As mentioned earlier, the 1982 cartoon story When the Wind Blows, by Raymond Briggs, describes the sad fate of a fictional English couple who believed similar advice.
Nuclear myopia also infected the staff and workers at the Hanford nuclear plant, established in February 1943 on the banks of the Columbia River in the south-east corner of Washington state, to make plutonium for American atomic weapons. Until the early 1950s, there were no filters on the chimneys of the reactors, and radioactive gases and particulates flew freely into the air; radioactive water ran unimpeded into the Columbia River. The plant and surrounding countryside became heavily polluted with radiation. In the interests of national security, workers at Hanford and their families were sworn to secrecy about what the plant produced, but in the 1980s a scandal broke when Tom Bailie, an employee at the plant and resident in the nearby town of Richland, admitted to the press that members of his family had become ill from mysterious causes. His father and four uncles all had intestinal tumours; his grandfather had died of liver cancer, and his grandmother of colon cancer; his sisters all had thyroid problems; and his mother and six of her friends had all had miscarriages. Bailie’s neighbours had taken great pride in building the atom bombs that had ended the Pacific War, and could not believe that the government had lied to them about radiation dangers.
Bailie was shunned and hated in Richland as a whistleblower. Gradually, however, the truth was accepted by a growing number of people. Numerous lawsuits were launched by ‘downwinders’ against the US government for compensation for cancers allegedly caused by emissions from Hanford. The plaintiffs came from as far north as Spokane, and as far south as the California–Oregon border. Lawyers employed by DuPont and its successor at the plant, General Electric, argued that no link could be proven between radiation emanating from Hanford and illness. Some downwinders received compensation; others didn’t. A few were offered the derisory amount of US$10,000 for thyroid cancer. Meanwhile, litigants got older and sicker, and many died without receiving any compensation.
Denial about the radio-toxic effects of nuclear materials has continued into the 21st century. A prime example is depleted uranium (DU), made from what is left over after natural uranium is enriched by increasing the concentration of the fissile isotope uranium-235. The remaining uranium-238 is a fertile rather than fissile isotope, and only weakly radioactive, with a half-life of 4.5 billion years. Its low but persistent radioactivity is accompanied by dangerous levels of chemical toxicity. Because it is denser than lead and harder than many forms of steel, it is used in munitions by US and British military forces, and probably those of some other countries. For example, the US Army uses it in 25 mm ammunition for the M242 gun mounted on its Bradley Fighting Vehicles; the Air Force, in the GAU-8 Gatling-guns mounted in its A-10 ‘Tankbuster’ aircraft; the Marines, in ordnance used in their AV-8B Harrier jump-jet aircraft and Cobra helicopter gunships; and the Navy, in its Phalanx close-in anti-missile defence system. On hitting targets, these munitions partly aerosolise into a fine powder that can be inhaled, settling in the nose, mouth, lungs, and gut. The rounds are also pyrophoric, burning after impact and creating more radioactive dust.
The US Defence Department has strict rules about the way DU ammunition is to be handled by US and allied troops. But the department has consistently denied in public that DU is radio-toxic or chemically toxic, and has had few inhibitions about its use against an enemy. About 320 tonnes of DU munitions were used in the 1991 Gulf War; smaller amounts in Bosnia, in Serbia and Kosovo in 1999, and in the recent Iraq War. Subsequent evidence suggested that the chemical toxicity of DU and its radioactivity reinforce each other in a so-called ‘synergistic effect’. Alexandra Miller of the US Armed Forces Radiobiology Research Institute found significant evidence of genetic damage in laboratory animals implanted with DU pellets. DU also depresses the immune system, and is suspected of causing birth defects. A major survey, conducted by the London School of Hygiene and Tropical Medicine on behalf of the UK Ministry of Defence in 2004, found that babies whose fathers had served in the Gulf War were 50 per cent more likely to have physical abnormalities, and that women whose partners served in the Gulf War had a 40 per cent greater risk of suffering miscarriages. While the authors of the study advised caution in the interpretation of the results, the data are significant.
The possible effects among civilian populations appear to be even more alarming. In September 2009, for example, of 170 children born at the Fallujah General Hospital, 24 per cent died within seven days, and three-quarters of those who died exhibited deformities such as two heads, no head, a single eye in the forehead, or missing limbs. During a comparable month before the invasion, when there were 530 births, only six babies died, and only one was deformed. On 13 November 2009, the British Guardian newspaper quoted Dr Ayman Qais, director and senior specialist at Fallujah General Hospital, as saying, ‘We are seeing a very significant increase in central nervous system anomalies ... There is also a very marked increase in ... brain tumours.’ In an earlier story about Iraqi boys playing with the remains of US rockets, Sky News interviewed a Fallujah gravedigger, who said that of the four or five newborn babies he buried every day, most had deformities. As further evidence was unearthed, American and British military authorities began reluctantly to admit that DU munitions posed possible long-term health risks, but continued to assert that much more research needed to be done, and refused to attempt to decontaminate former battle areas — a difficult task indeed.
Proponents of civil nuclear power, citing its safety and economic benefits, are keen to downplay the carcinogenic and mutagenic side effects of ionising radiation. Denials after each of the four most serious civil nuclear accidents prior to Fukushima are particularly revealing. The accidents occurred at Windscale in the UK, in 1957; at Kyshtym in Russia, also in 1957; at Three Mile Island in the US, in 1979; and at Chernobyl in Ukraine, in 1986.
In the early 1950s, Britain built two nuclear reactors at Windscale (later called Sellafield) in Cumbria, initially to make plutonium for nuclear weapons, and later, as an afterthought, to generate electricity. The reactors, primitive by today’s standards, were fuelled by natural uranium, cooled by air, and moderated by graphite. In October 1957, the core of the first unit caught fire, releasing large amounts of radioactive isotopes into the surrounding country. Five hundred square kilometres around the plant were contaminated, and large quantities of milk from cows in the area were diluted and dumped into the Irish Sea. The British government never released the full report on the accident, and epidemiological records are incomplete. While the number of cancers induced by the accident will never be known with certainty, unofficial analyses by John Garland, formerly of the UK Atomic Energy Authority, and Richard Wakeford, a visiting professor at the University of Manchester, put the possible death toll at around 240, with many more premature deaths left unrecorded.
During the Cold War, Kyshtym was a town between Sverdlovsk and Chelyabinsk, on the edge of the West Siberian Plain in the Soviet Union. Like Richland, near Hanford, it was a dormitory town for workers at nearby nuclear reactors — in this case, dedicated to the production of plutonium-239 for a rapidly expanding Soviet nuclear-weapons program. Working conditions were primitive and highly dangerous. Radioactive waste from the reactors was dumped in open, unlined trenches, close to the plant, generating a highly concentrated surface layer of plutonium. This produced much heat, which at some stage late in 1957 (the Soviet government never disclosed details) caused a gigantic steam explosion in the ground water, releasing a massive radioactive cloud into the surrounding environment. The number of people who subsequently died is not known, but later scientific journals referred to mass evacuations, and hospitals filled with patients suffering from acute radiation poisoning. Huge quantities of food from local markets were seized, and ‘clean’ food was brought in and sold from the backs of trucks. The main north–south highway through Kyshtym was closed for nine months; when it was reopened, drivers were advised to speed through the area with their windows closed. A decade later, doctors were still advising pregnant women to have abortions.
According to Peter Pringle and Jim Spigelman in The Nuclear Barons, a Soviet physicist who drove through the area two years after the accident observed the following: ‘As far as I could see was empty land. The land was dead — no villages, no towns, only chimneys of destroyed homes, no cultivated fields or pastures, no herds, no people — nothing. It was like the moon for many hundreds of square kilometres, useless and unproductive for a very long time, maybe hundreds of years.’ The Soviet government tried desperately to stop news of the disaster from leaking, but it inevitably got out. Yet when a Russian-refugee biochemist, Zhores Medvedev, casually referred to the Kyshtym disaster in an article he wrote for the London-based New Scientist in 1976, he was met with angry and defensive denials from British, French, and American nuclear officials and scientists, who were having their own problems with nuclear waste. Most memorable was the response of Sir John Hill, chairman of the UK Atomic Energy Authority, who said Medvedev’s account was ‘pure science fiction’, and the allegation that a waste dump had exploded was ‘rubbish’.
The accident at Three Mile Island occurred on 28 March 1979. The complex, on the Susquehanna River in Dauphin County near Harrisburg in Pennsylvania, comprised two pressurised-water reactors (PWRs) built in the 1970s by Babcock and Wilcox. The accident occurred in unit number two, when a failure in the non-nuclear secondary cooling system was followed by a pilot-operated relief valve sticking open in the primary system, allowing radioactive coolant to escape. The operators initially failed to recognise it as a loss-of-coolant incident, and their delay in rectifying the situation resulted in a partial meltdown of the fuel rods. Ninety-two point five million gigabecquerels of radioactive gases escaped from the reactor, some of it iodine-131. One hundred and fifty thousand litres of contaminated cooling water also escaped into the Susquehanna River. One hundred and forty thousand people, primarily pregnant women and preschool-age children, were evacuated from the surrounding countryside.
The accident rated five on the seven-point nuclear accident scale, an ‘accident with wider consequences’, resulting in a clean-up bill of US$1 billion and a further US$2.4 billion in property damage. President Jimmy Carter appointed a commission of investigation headed by John G. Kemeny, president of Dartmouth College. After a drawn-out procedure with many hearings, the commission concluded that no cancers had occurred, nor lives been lost, as a result of the accident. An independent public-health official, Joseph Mangano, initiated a radiation and public-health project in which he concluded that two years after the accident there had been a spike in infant mortality in the downwind community. Another report, prepared by the anti-nuclear activist Harvey Wasserman, claimed that a plague of death and disease had occurred among farm livestock and wild animals, as well as a sharp fall in the reproductive rate of horses and cows in the area. Both the Mangano and Wasserman findings were handicapped by methodological problems and the lack of proper control-group scrutiny, but it is likely that radiological harm exceeded official pronouncements.
Whatever the truth of the matter, Three Mile Island resulted in no new nuclear reactors being built in the United States for the next 32 years. In February 2010, however, President Obama began a process of resurrecting the industry by announcing that his administration would offer a US$8.33 billion loan guarantee to Georgia Power for it to build two new nuclear reactors at its Vogtle power station. Anti-nuclear groups were amazed that an unpopular and dangerous technology should suddenly get a kick-start from a new Democratic administration, but one fact might help explain the initiative: Exelon, a Chicago-based company that owns one of the largest fleets of nuclear-power stations in the country, including the reactors at Three Mile Island, was a major financial contributor to Obama’s presidential election campaign.
The Chernobyl nuclear complex went online in 1977, near the city of Prypiat in the Soviet republic of the Ukraine, close to the border with Belarus and the Dnieper River. Chernobyl’s four reactors were of a Soviet design known as the RBMK, constructed with no containment building, moderated by graphite, and known to be subject to dangerous power fluctuations. On 26 April 1986, a systems test was being conducted in reactor number four when a sudden power surge prompted the operating crew to attempt an emergency shutdown. A more extreme power spike occurred, leading to explosions which ruptured the reactor vessel. As at Fukushima in 2011, the explosions involved hydrogen gas, generated when overheated fuel cladding and graphite reacted with steam at extreme temperatures. Firefighters were sent into the complex without knowing of the massive radiation emanating from the stricken core.
Subsequently, 30 per cent of the reactor’s 90 tonnes of fuel was distributed over the reactor buildings and surrounding areas; 1 to 2 per cent was ejected into the atmosphere. The reactor’s entire inventory of radioactive gases was released at the same time. But the main source of radiation was the fire in the reactor’s 1700 tonnes of graphite moderator, which lasted for eight days. The World Health Organization estimated that the total radiation released was 200 times the amount released from both of the atom bombs dropped on Japan.
Three hundred and forty thousand people were evacuated from the most heavily irradiated areas in Belarus, and resettled in other parts of the Soviet Union. A dense cloud of radioactivity from the plant blanketed considerable areas of the Ukraine, Belarus, Moldova, Slovakia, Slovenia, Austria, Germany, Switzerland, the United Kingdom, Sweden, Finland, and Turkey. The resulting economic dislocation and attempts by Soviet authorities to cover up the worst aspects of the disaster were a significant catalyst for Mikhail Gorbachev’s policy of glasnost — openness and transparency in government — that prompted the reforms which led to the dissolution of the Soviet Union in 1991.
Ever since the catastrophe, widely conflicting claims have been made about the severity of radiation and the number of casualties. The problem lies in the fact that, unlike with the atom-bomb survivors, there has been no internationally coordinated, well-funded, prospective and properly controlled study. The data are piecemeal, derived from the cancer registries of individual countries, where such registries exist. Reliable internal-dosimetry figures are scarce, so that even if one could prove increased cancer rates, and even if all those relocated could be traced, it is difficult to sheet individual cancers home to Chernobyl. Another problem is that most radiogenic solid cancers take a couple of decades or more to develop (leukaemia develops sooner); atom-bomb-survivor studies, for example, only began showing a statistically significant increase in solid-cancer rates 25 years after the bombs were dropped. Using currently accepted risk coefficients, it is estimated that there will be 16,000 deaths among inhabitants throughout the fallout area by 2065. But the natural incidence of cancer among the 600 million people exposed to some of the Chernobyl fallout will account for about 160 million cancer deaths, a background against which any such excess is very hard to detect. Finally, the doses averaged across Western Europe are about 0.3 millisieverts per person, too small for any statistically meaningful effect to be demonstrated.
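Projections such as the 16,000-death figure cited above come from simple risk-coefficient arithmetic: a collective dose, summed over the exposed population in person-sieverts, is multiplied by a coefficient for fatal cancers per sievert. The sketch below illustrates the calculation; the coefficient of 0.05 fatal cancers per person-sievert is the commonly cited order of magnitude, but the collective-dose input is a hypothetical number chosen for illustration, not a value from this chapter.

```python
# Linear no-threshold (LNT) arithmetic: excess fatal cancers are
# projected as collective dose (person-sieverts) multiplied by a
# risk coefficient. The 0.05/Sv coefficient is the commonly cited
# order of magnitude; the input dose below is hypothetical.

RISK_PER_PERSON_SV = 0.05  # fatal cancers per person-sievert

def projected_excess_deaths(collective_dose_person_sv):
    """Project excess fatal cancers from a collective dose."""
    return collective_dose_person_sv * RISK_PER_PERSON_SV

# Under this model, a hypothetical collective dose of 320,000
# person-Sv would project 16,000 excess fatal cancers.
print(projected_excess_deaths(320_000))
```

The same arithmetic also shows why detection is hard: a projected excess of thousands of deaths is spread across a population in which cancer already kills roughly a quarter of everyone.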
In the absence of conclusive studies, advocates of nuclear power, including uranium miners and those involved in the international nuclear industry, tend to play down such figures as do exist. Hundreds of investigations have been carried out, most of their findings published in Russian or Ukrainian. If we look at the conclusions of seven major analyses conducted some 20 years after the catastrophe, we might get some idea of the truth.
At the lower end of estimated casualties is the Chernobyl Forum report, published in September 2005 with contributions from the International Atomic Energy Agency (IAEA), the World Health Organization (WHO), the UN Scientific Committee on the Effects of Atomic Radiation (UNSCEAR), other agencies, and the governments of Belarus, Russia, and Ukraine. It put the total number of predicted deaths at around 4000, of which 2200 would come from the ranks of the 600,000 emergency workers, or ‘liquidators’, at the plant. The report stated that, apart from a 30-kilometre area around the site and a few restricted lakes and forests, radiation levels had returned to acceptable levels.
A report published in 2006 strongly disputed these findings. Dubbed the TORCH report (‘The Other Report on Chernobyl’), it was initiated by Rebecca Harms, a German member of the Greens party of the European Parliament, and included the research of many independent scientists. It pointed out that the Chernobyl Forum failed to examine casualties outside the former Soviet Union, where 40 per cent of the fallout from Chernobyl had fallen. The TORCH report predicted about 30,000 to 60,000 excess cancer deaths, seven to 15 times greater than the 4000 estimated in the earlier report. It predicted excess cases of thyroid cancer at between 18,000 and 66,000 in Belarus alone. According to the Australian physician and expert in nuclear medicine Dr Peter Karamoskos, thyroid-cancer cases are estimated to be around 20,000 to 30,000, and will peak around the year 2030. With good surveillance and treatments among the victims, he thinks these cancers will be unlikely to lead to death.
A third report, also published in 2006, was sponsored by Greenpeace. The report noted that recently published figures indicated that in Belarus, Russia, and Ukraine alone, 200,000 cancer deaths probably occurred between 1990 and 2004, and suggested that 270,000 cases of cancer eventually would be attributable to fallout from Chernobyl, and that 93,000 of these would probably be fatal. The Greenpeace report sharply contested a claim in the Chernobyl Forum report that of the thousands of thyroid cancers among children, 99 per cent were operable and curable.
An even more sobering report concerned deaths among liquidators. Prepared in 2005 by Union Chernobyl, a liquidator advocacy group, it calculated that, at its time of publication, 10 per cent of the 600,000 recruited had died and 165,000 had developed a disability, not necessarily all from radiation exposure.
A fifth report, released in 2006 by International Physicians for the Prevention of Nuclear War, examined genetic defects that had occurred as a result of Chernobyl. Entitled ‘Health Effects of Chernobyl: 20 years after the reactor catastrophe’, it claimed that 10,000 people had been affected by thyroid cancer and that 50,000 new cases were expected; also that there had been 10,000 deformities and 5000 deaths among newborns following the disaster.
A sixth report was published in 2009 by the New York Academy of Sciences. Entitled Chernobyl: consequences of the catastrophe for people and the environment, it was prepared by three Russian researchers, Alexey Yablokov, Vassily Nesterenko, and Alexey Nesterenko, who painstakingly sifted through medical records, dating from 1986 to 2004, in the districts surrounding Chernobyl. They reached what has become recognised as the upper limit of estimates about radiation-induced mortality by concluding that a total of 985,000 deaths had occurred as a result of the catastrophe at Chernobyl, with most occurring in Belarus, Russia, and Ukraine. As with the other reports, opinion about the veracity of this report is divided, with some critics suggesting its findings are too radical to be taken seriously.
For completeness, it is necessary to refer to a UNSCEAR report of 2008, which reversed the Chernobyl Forum’s earlier findings of 4000 deaths. UNSCEAR decided that poor data and statistical uncertainty made it impossible to determine how many deaths, if any, there would be, even though it had participated in the earlier Chernobyl Forum. The UNSCEAR decision has been waved about by pro-nuclear groups like a talisman to ‘prove’ that the perils of ionising radiation are highly exaggerated or non-existent.
Whatever conclusions one draws from these various findings, one thing is clear: until Fukushima, Chernobyl was the most serious nuclear accident in history, and its long-term health effects have been and will continue to be extremely severe. Meanwhile, the intensity of radiation in northern Japan is still being measured, but the findings to date are alarming. In March 2012, Haruo Sato of the Japan Atomic Energy Agency’s Horonobe Underground Research Centre in Hokkaido confirmed that, with every month that passed, caesium-137 was sinking further into the ground in various agricultural and urban areas of northern Japan. In agricultural areas, it has tightly bonded with the soil, imposing enormous burdens on those trying to decontaminate the ground. Where the caesium has landed on hard surfaces in urban areas, it has infiltrated tiny pores, and can only be removed by grinding off the surface — a procedure which creates contaminated dust. Over a year since the accident, a wide range of radioisotopes, in varying degrees of concentration, continue to be found across north-eastern and central Japan, including in the Tochigi, Ibaraki, Saitama, and Chiba prefectures. According to the United States Geological Survey, fallout from Fukushima has not just pervaded Japan. In a report published on 1 March 2012, it estimated that it took just 18 days for the fallout to circle the globe. Officials in Vermont confirmed that traces of caesium-134 and -137 originating from Fukushima had been found in drinking water, milk, vegetation, and maple products.
In the absence of completely accurate and irrefutable epidemiological evidence of the effects of such radiation over a long period, controversy about its effects will continue to flourish. Objective research has been frustrated by pro-nuclear lobby groups that turn the debate into a battleground. Keith Baverstock, a former health and radiation advisor to the WHO, suggests there is still ample room for further forensic research on Chernobyl (and no doubt Fukushima), and has called for a comprehensive investigation into cancers and birth defects. He calculates that such an undertaking on Chernobyl would cost 10 million for the first ten years, and believes it should be funded by the European Commission. No one has yet made a similar estimate for Fukushima.
Leaving such estimates and speculations aside, what is ionising radiation, how is it measured, and why is it so devastating to human tissue and genetic structures in the body? The Australian radiation expert Dr Peter Karamoskos gives the following explanation.
Broadly speaking, there are two types of ionising radiation: particulate and electromagnetic, or EMR. Particulate radiation is due to nuclear instability, where the nucleus of an atom tries to achieve a lower energy state and gives off alpha particles, beta particles, or neutrons. Nuclear instability can also give off high-energy EMR, which is called gamma radiation. Think of gamma radiation as intensely powerful sunlight, whereas the particulate kind is like masses of sub-microscopic ball bearings. One source of ionising radiation is nuclear fission, in which heavy atoms of uranium or plutonium split, releasing large amounts of energy (gamma rays) and neutrons (particles similar in size to protons). Ionising radiation is so named because when it strikes atoms it knocks electrons loose, creating electrically charged atoms or molecules called ions, which can destroy living tissue. In DNA, such radiation breaks chemical bonds and plays havoc with genes and chromosomes, either directly or through the formation of free radicals.
We measure ionising radiation by its ability to ionise air. The roentgen (R) is the unit of exposure that produces one electrostatic unit of charge in one cubic centimetre of dry air. If we want to measure a dose absorbed by the human body, we use the gray (Gy), which is equivalent to the absorption of one joule per kilogram. It is important to note here that not all radiation has the same biological effect. Some types (for example, alpha particles) have much greater potential to do damage than others (such as X-rays), so we reflect this in a radiation-weighting factor. The unit of equivalent biological dose is the sievert (Sv). Using this measure, we also adjust for the differing sensitivities of various body tissues. In general, the faster a tissue multiplies, the more sensitive it is to the harmful effects of ionising radiation. Gut, breast, and blood cells have a higher tissue-weighting factor than bone. In babies, the thyroid is particularly sensitive because of its rapid growth.
Adjusting for the tissue-weighting factor gives us the effective biological dose. From a risk perspective, it is the sievert that matters — any other unit measures something other than radiation damage to human tissue. Radioactivity itself, for example, has its own units, which measure the intensity of radiation in inanimate substances, including soil, rather than the biological dose. The basic unit is the becquerel (Bq), which equals one nuclear disintegration per second.
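The chain of units Karamoskos describes (gray for absorbed dose, a radiation-weighting factor for the type of radiation, and a tissue-weighting factor for the organ exposed) can be sketched in a few lines of Python. The weighting values below are illustrative ICRP-style figures rather than the full regulatory tables:

```python
# Sketch of the dose chain: absorbed dose (Gy) -> equivalent dose (Sv)
# -> effective dose (Sv). Weighting factors are illustrative, ICRP-style values.

RADIATION_WEIGHT = {"x-ray": 1, "gamma": 1, "beta": 1, "alpha": 20}

TISSUE_WEIGHT = {"breast": 0.12, "colon": 0.12, "bone_marrow": 0.12,
                 "thyroid": 0.04, "bone_surface": 0.01}

def equivalent_dose_sv(absorbed_gy, radiation):
    """Absorbed dose scaled by the radiation-weighting factor."""
    return absorbed_gy * RADIATION_WEIGHT[radiation]

def effective_dose_sv(organ_doses_sv):
    """Sum of organ equivalent doses, each scaled by its tissue-weighting factor."""
    return sum(TISSUE_WEIGHT[organ] * dose
               for organ, dose in organ_doses_sv.items())

# The same 1 mGy absorbed dose: alpha particles count 20 times more than X-rays.
print(equivalent_dose_sv(0.001, "x-ray"))   # 0.001 Sv
print(equivalent_dose_sv(0.001, "alpha"))   # twenty times that, about 0.02 Sv
```

The point in the text survives the simplification: an alpha emitter lodged in bone delivers a far larger equivalent dose than the same absorbed energy from X-rays.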
However, a major weakness in using the sievert is that it uses as its model an ‘ideal’ human being — a 70 kg Caucasian male or female of around 170 cm in height. Clearly, very few people exactly conform to the ‘ideal’ human, and so it is merely a tool for delivering an approximate dose estimate (not too far off the mark, but still imprecise). In this context, the maximum-radiation-level maps for the Fukushima catastrophe (mentioned in Chapter Three and published daily in the Japanese press), which provided the biological dose to humans over a given period of time, were too generalised to be reliable.
According to authoritative estimates, the average background dose-rate from naturally occurring radiation is 3 mSv per annum, or 0.3 microsieverts per hour. Compare this with northern Japan after Fukushima: on 25 October 2011, The Japan Times map showed the abandoned village of Iitate (39 kilometres north-west of Fukushima Dai-ichi) as having a radiation dose-rate of 2.041 microsieverts per hour, the highest of anywhere in Japan (except in the reactors themselves) — that is, six times the average natural radiation level. Overall, Fukushima prefecture registered 1.02; Miyagi, 0.059; Iwate, 0.022; Ibaraki, 0.081; and Tokyo, 0.058. The maps were made by aerial-survey planes, and were averaged over areas. Their weakness was that they didn’t reveal hot spots such as those created by caesium-137, an aerosol which does not spread out uniformly like a gas, its deposition being dependent on air currents, rain, snow, and geographical features such as mountain tops. Early in the crisis, a cloud of caesium travelled over Tokyo before moving to the south, where it irradiated a tea plantation. If it had rained that day, Tokyo itself would have become heavily contaminated, and parts of the city would have become uninhabitable.
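The arithmetic behind these comparisons is straightforward unit conversion: an hourly dose-rate in microsieverts, multiplied over the 8,760 hours of a year and set against the natural background. A quick sketch:

```python
# Convert a measured dose-rate (microsieverts per hour) to an annual dose (mSv/yr).
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_msv(usv_per_hour):
    """Annual dose in millisieverts from a constant hourly rate in microsieverts."""
    return usv_per_hour * HOURS_PER_YEAR / 1000

# The Iitate reading of 2.041 microsieverts per hour, held for a full year,
# set against the ~3 mSv natural background cited in the text:
iitate = annual_msv(2.041)        # about 17.9 mSv/yr
print(round(iitate / 3.0, 1))     # 6.0 -- the 'six times' ratio in the text
```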
The intensity of radiation in foods or any material is measured by the becquerel. A survey conducted by the Ministry of Education, Culture, Sports, Science, and Technology (MEXT) of 2200 locations within a 100-kilometre radius of the Fukushima plant showed that more than 30 had caesium-137 contamination of 1.48 million Bq per square metre, the level set by the Soviet Union for forced resettlement after the 1986 Chernobyl disaster. Another 132 locations had a combined amount of caesium-134/137 of more than 555,000 Bq per square metre, the level at which Soviet authorities called for voluntary evacuation and imposed a ban on farming. Local inhabitants were suspicious that TEPCO or the Japanese government were not telling them the full story of the severity of radiation in food.
Nor did they believe that the upwardly adjusted dose limits of ionising radiation for human beings were trustworthy. In April 2011, MEXT set a new dose limit for nursery, kindergarten, primary-school, and junior-high-school children of 3.8 microsieverts per hour, or 33 millisieverts over a year. This was much higher than the ICRP recommendation of 1 mSv per year for the general public, and close to the maximum permissible dose of 50 mSv in any one year recommended by the ICRP for adult nuclear-industry workers. A critical factor here is that not everyone faces the same risk. For infants (under one year of age), the radiation-related cancer risk is four to five times higher than for adults; and female infants are twice as susceptible as males. Furthermore, females’ overall risk of cancer related to radiation exposure is 40 per cent higher than for males. Foetuses in the womb are the most radiation-sensitive of all. After Fukushima, Japan increased the maximum dose for nuclear-industry workers from its own self-imposed limit of 100 mSv over five years to 250 mSv per annum. It became an arbitrary case of ‘When the river rises, build a higher bridge.’
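As a rough check on the figures in the text, the 3.8-microsievert hourly limit does indeed work out to about 33 mSv over a year:

```python
# Check: MEXT's hourly school limit expressed as an annual dose.
HOURS_PER_YEAR = 24 * 365               # 8760
annual_msv = 3.8 * HOURS_PER_YEAR / 1000  # microsieverts -> millisieverts
print(round(annual_msv, 1))             # 33.3 -- some 33 times the ICRP's 1 mSv
```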
Two other problems complicate measuring acceptable and unacceptable doses of radiation. First, no threshold exists below which radiation poses no risk of cancer or genetic damage. In 2006, the US National Academy of Sciences produced the Biological Effects of Ionising Radiation (BEIR) VII report on low doses of ionising radiation (doses less than 100 mSv). The report stated that ‘it is unlikely that there is a threshold below which cancers are not induced’. In other words, there is no safe dose of ionising radiation. Second, the effects of radiation are cumulative, just like the withering and cancerous effects of sunlight on human skin: the more exposure one has, the greater the risk of cancer.
Just as there are dissidents in the global-warming debate who say that climate change is not happening, so there are heterodox advocates who belittle the risks of ionising radiation. One of the most prominent is Professor Wade Allison, a particle physicist at Keble College, Oxford, who in his book Radiation and Reason asserts that small doses of radiation are good for you (a theory known as hormesis), and that international bodies such as the ICRP and the IAEA are biased toward setting stricter radiation limits than are needed, thus driving up the cost of nuclear power. Allison’s thesis is comprehensively taken apart in a review of his book by Keith Baverstock, who was mentioned earlier in this chapter as a former WHO radiation expert, but who is also a mainstream radiation scientist in the Department of Environmental Science at the University of Eastern Finland. Baverstock claims that some of Allison’s analogies are inept, and that his thesis is subject to serious errors — notably, his criticism of the linear no-threshold model used by regulatory authorities. Baverstock concludes that ‘as the adverse effects of radiation do not manifest themselves for a long time, and we still do not fully understand the risks of exposure to low doses, we believe it is wise to keep to the principle of “as low as reasonably achievable” in respect of exposure to ionising radiation’. The fact is that Allison and some other scientists sit at the fringe of science.
Many significant areas of doubt persist in Japan about the seriousness of radiation released from Fukushima Dai-ichi. Apart from those workers subject to severe radiation doses, the effects on others will not manifest themselves for ten, 20, 30, or even 40 years. And whether those effects prove fatal will depend as much as anything else on each individual’s inherited susceptibility to cancer. The science is clear that the chances of developing cancers increase with exposure to ionising radiation. But this is a statistical probability, not a certainty. It is similarly fallacious to assert that all heavy smokers will contract lung cancer (many will, but some won’t), although the odds of doing so increase with the frequency and duration of smoking. Deaths from radiation usually come years later, and the numbers are lost in the white noise of general mortality in the population. This provides a challenge for epidemiologists; but until they have firmer evidence, it allows the advocates of nuclear power to continue to get away with their claims that it is the safest way to generate electricity.
As the months have unfolded since 11 March 2011, evidence has continued to emerge about the ineluctable spread around Japan of radiation from the Fukushima reactors. A particularly disturbing discovery was made on 26 March 2012 by Arnie Gundersen, chief nuclear engineer of Fairewinds Associates, a US-based energy consulting company. In February, Gundersen took soil samples in Tokyo public parks, playgrounds, and rooftop gardens. He found that all of his samples would be considered nuclear waste if found in the United States. At a Nuclear Regulatory Commission conference in Washington from 13 to 15 March 2012, the NRC chairman, Gregory Jaczko, emphasised his concern that the nuclear industry presently does not consider the costs of radioactive contamination or mass evacuation in the cost–benefit analyses used to license nuclear power plants.
On 4 February 2012, the Yomiuri Shimbun published what appears to be the first attempt to calculate the number of civilian deaths directly attributable to the Fukushima accident. The survey covered 13 towns and villages in Fukushima, all in the no-entry, emergency-evacuation, or expanded-evacuation zones around the nuclear complex. It found that 573 deaths could be certified as ‘disaster-related’ — that is, deaths due to chronic disease exacerbated by the disaster. This stopped short of attributing the deaths to radiation, something that TEPCO and the central government were particularly sensitive about and anxious to avoid. But it is an indicator of what will probably become a much greater number of radiation-related casualties as time goes on.
Meanwhile, whatever the number of radiation deaths and illnesses may be, the psychological damage to many Japanese is severe, and likely to lead to significant mental-health problems. According to Evelyn Bromet and Johan Havenaar of the Department of Psychiatry at the State University of New York, a study of Chernobyl 20 years after the accident showed that rates of depression, anxiety, and medically unexplained physical symptoms among exposed Ukrainians were two to four times higher than among non-exposed Ukrainians. Cognitive impairment and suicide rates were also higher. In general, morbidity patterns were consistent with psychological impairment documented after other cataclysmic toxic events, such as the atomic bombings of Hiroshima and Nagasaki, the Three Mile Island incident, and the toxic contamination at the Union Carbide factory at Bhopal in India in 1984. Given the magnitude and persistence of adverse mental-health effects on the general population in Tōhoku, it would seem to be an urgent task for the government to initiate long-term educational and psycho-social interventions on a wide scale. At the time of writing, no government agency, let alone TEPCO, had initiated such a program.