ION MEYER GENTLY LIFTS away the sheaf of white tissue concealing the woman’s face. Immediately, it’s apparent that there is something terribly wrong with it. The skin is scarred and uneven-looking, and the area around her closed left eye is red and swollen. As I bend forward to take a closer look, I see that the flesh from the bridge of her nose, down to her left nostril, and extending across her left eye socket has been eaten away; through her closed eyelid, the white curve of an eyeball peeks out. A brass plate puts a name to the face:
Maren Lauridsen
Lupus vulgaris
2.7.18.
This isn’t a hospital morgue, but a storeroom at the back of Copenhagen’s Medical Museion, and the date refers not to 2018, but to a hundred years earlier.
Today, few people are familiar with Lupus vulgaris, or skin tuberculosis, but a hundred years ago, when Maren walked Copenhagen’s streets, it was a disease that was particularly feared. Caused by the same bacterium responsible for tuberculosis of the lungs, it often begins at the centre of the face with painless brown nodules that spread outwards, developing into ulcers that wolfishly consume the flesh (lupus is Latin for ‘wolf’).
There was no cure, so physicians would resort to slowing its spread by burning away the infected flesh with heated irons or corrosive chemicals like arsenic. No wonder people lived in terror of catching it. Once afflicted, victims would become isolated from their friends, family and community, left to face this torture alone.
Although Maren Lauridsen is long dead, this imprint of her disfigured face lives on. Meyer, who oversees the museum’s collections, pulls down another crate, and then another; each containing yet more examples of horribly mutilated flesh, immortalised in wax. One face looks as though it was immersed in seawater for several days – it’s impossible to tell if it’s a man or a woman. Other crates contain just sections of faces: a mouth and jaw reduced to an angry red mush; a pitted, blistered nose.
The models were produced by taking a plaster imprint of the victim’s face, then pouring molten wax into the mould and painting the finished cast. Their function was to document the extent of people’s injuries before they underwent a revolutionary new treatment that they hoped would cure them. That treatment was light. Filtered and concentrated through a series of glass lenses and cooled by passage through a water-filled tube, ultraviolet rays were directed onto the faces of the patients, where they set to work killing the disfiguring, flesh-eating bacteria.
The man who came up with this treatment was Niels Ryberg Finsen, and he would go on to win a Nobel Prize for his efforts. He would also usher in a new era of interest in the health benefits of sunlight, which continues to this day. Finsen’s work had nothing to do with circadian rhythms; it dealt with the direct impact of the sun’s rays on bacteria and our skin.
Finsen was born on 15 December 1860, in the Faroe Islands, a small jigsaw of dramatic and improbable-looking peaks that jut out of the North Atlantic some 177 miles north-west of the Shetland Islands. Buffeted by climatic depressions that bring frequent cloud, rain and storms, the islands would have offered few sunny days during Finsen’s childhood. Perhaps this is what motivated him to try to capture those sunrays and concentrate them, in the process rendering them powerful enough to heal.
Arriving in Copenhagen to study medicine when he was twenty-two, Finsen lived cooped up with his books in a north-facing room where the sun never penetrated. He suffered from anaemia and tiredness – but he noticed that his health improved if he was exposed to sunlight.
In fact, Finsen was in the initial stages of Niemann-Pick disease, a progressive disorder characterised by the abnormal metabolism of fats, which begin to accumulate in internal organs, including the liver, heart and spleen, eventually compromising their ability to function. As Finsen progressed through medical school, his conviction about the health-promoting effects of sunlight continued to grow. He collected descriptions of the sun-seeking behaviour of plants and animals and noted how a cat lying out in the sun would repeatedly change position in order to avoid being caught by the shade.1
Finsen was particularly inspired by a paper he discovered in an 1877 volume of the Proceedings of the Royal Society of London. Written by two British scientists, Arthur Downes and Thomas Blunt, it described an experiment in which test tubes filled with sugar water were left on a south-east-facing windowsill. Half of the tubes were left in sunlight; the other half were covered by a thin sheet of lead. After a month, the researchers noted that those tubes that had been exposed to the sun remained clear, while those that had been covered were foul and cloudy. It was some of the first proof that sunlight could kill bacteria; soon afterwards, the famed bacteriologist Robert Koch – who had only recently identified the bacterium responsible for causing tuberculosis – showed that it, too, could be killed by sunlight.
But these scientists were by no means the first people to take an interest in the sun’s healing power at that time. In 1860, the year that Finsen was born, the English nurse and social reformer Florence Nightingale published her Notes on Nursing, which contained a section about light. She wrote: ‘It is the unqualified result of all my experience with the sick, that second only to their need of fresh air is their need of light; that, after a close room, what hurts them most is a dark room. And that it is not only light but direct sunlight they want.’2
Nightingale observed how, in hospital rooms that contained windows, almost all patients lay with their faces turned towards the light, ‘exactly as plants always make their way towards the light’ – even when lying on that side of their body was uncomfortable or painful to them.
She emphasised how the morning and midday sun (the very times when hospital patients were likely to be in bed) were of most importance. ‘Perhaps you can take them out of bed in the afternoon and set them by the window, where they can see the sun,’ she suggested. ‘But the best rule is, if possible, to give them direct sunlight from the moment he rises till the moment he sets.’
Although the ancient Babylonians, Greeks and Romans had embraced the healing properties of sunlight, for centuries the idea had lain forgotten. Now, in the sun-starved cities of northern Europe, sunlight was being rediscovered. In an era before antibiotics, the revelation that light could kill bacteria constituted a major medical breakthrough. And it was Finsen who developed the first practical application of this discovery.
After graduating from medical school, Finsen found work teaching anatomy in the building that now houses Copenhagen’s Medical Museion. His fascination with sunlight continued, nevertheless, and he began experimenting with devices that might harness it more effectively. Today, the shelves of the museum’s storeroom heave with the glass and quartz lenses that Finsen developed to conduct his early research into the healing effects of light. He even turned guinea pig himself, quantifying the amount of sun exposure needed to trigger sunburn.
Because sunshine was so often in short supply in Denmark, Finsen also began collaborating with Copenhagen Electric Light Works to develop an artificial light that could be used when the sun was absent. While working there, he encountered an engineer called Niels Mogensen, whose face was blighted with severe and painful ulcers caused by tuberculosis. After just four days of Finsen’s light treatment, Mogensen showed a dramatic improvement.
From this collaboration, the Finsen Light was born: an elaborate rigging of telescope-like tubes and lenses that filtered, compressed and cooled the rays from a carbon arc lamp, and could be used to treat multiple patients at once. In 1896, Finsen established the Medical Light Institute, which enabled even more patients to be treated, with impressive results: of the 804 people who underwent light treatment for skin tuberculosis between 1896 and 1901, 83 per cent were cured, and only 6 per cent showed no improvement.
Through his experiments, Finsen concluded that it was ‘chemical light’ – blue, violet and ultraviolet light – that was responsible for the healing effects. Initially he thought it was because the rays themselves killed the bacterium that causes tuberculosis, but more recent experiments have revealed that the Finsen Lamp concentrated UVB rays, which reacted with substances called porphyrins inside the bacteria, producing unstable molecules called reactive oxygen species that then killed the bacteria.3 Later, Finsen hypothesised that the light was somehow stimulating the body to heal itself, which may also be true.
By the time Finsen received the Nobel Prize in 1903, his own health had deteriorated to the point where he was in a wheelchair, and he died just a year later, aged forty-four.
Despite his use of electric lights, his passion for sunshine was unwavering, and he frequently encouraged patients to walk about naked in the sun. In an interview conducted shortly before his death he remarked: ‘All that I have accomplished in my experiments with light, and all that I have learned about its therapeutic value, has come because I needed the light so much myself. I longed for it so.’4
* * *
The nineteenth century was a time of immense change. Not only did it see the invention of new forms of artificial lighting, but the Industrial Revolution also brought flocks of people to the cities seeking work in the mills and factories that were then springing up. A similar thing is happening in developing countries today, and vitamin D deficiency – caused by smog, sun avoidance and clothing which completely covers the skin – is a growing problem, even in sunny countries in the Middle East, Africa and parts of Asia.
Vitamin D is essential for regulating the amount of calcium and phosphorus in our bones, teeth and muscles, and is needed to keep them strong and healthy. Although we get some vitamin D from our diet, mainly from sources such as oily fish, eggs and cheese, most of our requirements are met by manufacturing it in the skin. A substance in the skin called 7-dehydrocholesterol absorbs UVB rays from sunlight and is converted into vitamin D3. This circulates in the blood and is further metabolised into the active form of vitamin D elsewhere in the body. In growing children, vitamin D deficiency causes rickets – characterised by soft, weak bones, stunted growth and skeletal deformities; in adults it causes the bones to soften, resulting in bone pain, fractures and muscle weakness.
By the middle of the nineteenth century, rickets was widespread in urban Britain and in other rapidly industrialising countries. A survey by the British Medical Association in the 1880s highlighted the urban nature of the problem: rickets was virtually absent from small settlements and agricultural areas. Many who flocked to the booming cities found themselves living in cramped and gloomy conditions, and the burning of coal for the new industries – not to mention the production of gas for lighting – cast a thick blanket of smog that blotted out the sunlight and made spending time outdoors unpleasant. Children played in narrow alleyways, between tall buildings, further isolating them from any sunlight that did get through. Add nutritional deficiencies caused by poverty into the mix and a legion of bowed, deformed skeletons was the result.
Various theories were put forward to explain the cause of rickets. John Snow, best known for his detective work in tracing the origin of cholera to a water pump in London’s Soho district, believed that bread adulterated with aluminium sulphate was to blame; this, he said, might interfere with the absorption of dietary phosphorus, needed to mineralise and strengthen the skeleton. Others pointed the finger at air pollution.
It was in the late 1880s that an English missionary called Theobald Palm suggested that sunlight deficiency was the cause. He had recently returned to Cumberland in northern England to practise medicine after spending ten years in Japan, and he was struck by the contrast: suddenly he was encountering deformed children, something he’d never seen during his time overseas.
After consulting other medical missionaries in China, Ceylon, India, Mongolia and Morocco, Palm became convinced that rickets was a disease of grey skies and gloomy alleyways. He suggested that ‘the systematic use of sunbaths’ might be a solution.5
Combined with the observations of Downes and Blunt about the bactericidal effects of sunlight, and Finsen’s success in treating skin tuberculosis with light, Palm’s ideas heralded a new appreciation for the sun. It started with wound infections, tuberculosis and rickets, but over the next forty years, ‘sun cures’ became a mainstay of medical treatment. It was as though sunlight – or more specifically, the UV rays contained within it – somehow boosted health more generally. Exposure to it also made people feel good, and increasingly, society decided, made them look better as well.
* * *
In 1903, the year that Finsen was awarded the Nobel Prize, a Swiss doctor called Auguste Rollier was turning his back on a conventional medical career, following the suicide of a close friend who had become disabled by skeletal tuberculosis. Besides infecting the skin and lungs, tuberculosis can also infect the bones and joints, causing the spine to deform and protrude outwards, or the hip joints to degenerate, resulting in lameness. It was the latter problem that had afflicted Rollier’s friend. Parts of his knee and hip joint had been surgically removed while he and Rollier were still at school, but this had failed to contain the disease. Subsequent surgery as a young adult had left him mutilated, and he eventually took his own life.
Soon afterwards, Rollier’s fiancée contracted tuberculosis of the lungs. Perhaps in desperation, he turned to a folk remedy that he’d learned from some of his patients: heading to the high mountains and lying out in the sun. In 1903, he accepted a job at a rural surgery in Leysin, in the Swiss Alps, and the couple moved to this sunny village with spectacular views of the fang-like peaks of the Dents du Midi. It was here that Rollier began to develop an alternative treatment for tuberculosis.
‘At an altitude of 5,000 feet,’ he wrote in his 1927 book, Heliotherapy, ‘the air is never oppressively hot, even in the height of summer; in winter, although the atmosphere is intensely cold, the brilliance of the sun more than counteracts this quality.’
On sheltered, outdoor terraces, ‘debilitated and miserable patients’ would be laid out in loincloths in the sun, ‘under conditions that impart to their bodies a more effective means of self-defence than could be possible in the flat country. The sick people regain their lost vital energy through the instrumentality of the sun and alpine air.’6
This wasn’t sunbathing as we know it today, where the sun-starved masses of northern climates fling their bodies onto Mediterranean sand for a week of intense baking. Instead, Rollier advocated slow and progressive exposure to sunlight, starting with just five minutes on the feet, which would gradually increase over the next three weeks, until all but the fairest patients were taking daily two- to three-hour ‘sunbaths’ during the summer, and three- to four-hour baths in winter. Believing that the combination of hot air and sunshine was bad for people’s health, he banned patients from sunbathing at midday during summer, and preferred to make use of the early morning sun.
Not only did Rollier’s fiancée recover; soon many other patients were regaining their health under his supervision. Before-and-after photos documented the startling transformation of hunched and deformed children’s spines reverting to their normal curvature within eighteen months of sun treatment. Others captured men lying sprawled out in loincloths in front of enormous sunny windows, and young boys waving from their beds on sunny outdoor terraces.
In such ‘internal’ cases of tuberculosis, it seems unlikely that UV light was directly killing the disease-causing bacterium, as it did in the skin. Neither could this bactericidal effect of sunlight explain its role in preventing rickets.
The breakthrough came in 1925, when an American doctor called Alfred Hess discovered that feeding rickety rats the skin of humans or calves that had been irradiated with UV light cured them of their rickets.7 The mystery curative factor it contained was eventually characterised as vitamin D.
We now know that the reason Rollier’s sunlight treatment may have been so effective against internal forms of tuberculosis is that the vitamin D it prompts the skin to manufacture helps to turn on the body’s first line of defence against bacterial invaders inside the body. When immune cells such as macrophages – which detect, engulf and destroy foreign bodies, including bacteria – encounter an invader, they start converting the inactive precursor of vitamin D into its active form, and producing receptors that will enable them to respond to it. As a result, they spew out an antimicrobial peptide called cathelicidin, which helps to kill the bugs. The same process is thought to reduce our susceptibility to other chest infections besides TB.8
By the end of the 1920s and into the 1930s, sunlight was being touted as a cure for, well, pretty much every disease under the sun. In his 1929 book, The Sunlight Cure, the medical writer Victor Dane concluded: ‘If you wish to obtain a general idea of the powers of the sun, and to know the names of the various diseases it benefits, buy a medical dictionary and memorize the names of all the diseases you find therein. The sun is the great healer of healers, truly an “elixir vitae”.’9 Sunlight had hit the mainstream, and suntans were to become a must-have fashion accessory.
However, not everyone bought into the idea of sunlight as a cure-all. A 1923 paper in The Lancet noted that ‘the results on tuberculosis of the lungs have been, in many hands, disappointing, and have led to avoidance of the treatment by many physicians, and even to its condemnation by some as a dangerous and unjustifiable form of therapy’.10 In some cases, unsupervised sunbaths had resulted in an increased temperature, worsening cough, and the coughing up of blood.
Some went even further in their criticism. In his 1947 book, Nothing New Under the Sun, the distinguished British surgeon John Lockhart-Mummery dismissed sunlight therapy as ‘pseudo-magic’, adding that ‘most of the benefit patients get from such treatment is due to their faith in the magical results, rather than direct benefit.’11
By then, however, the popularity of sunlight as a panacea was already beginning to fade – even if the fashion for bronzed skin that it helped kick-start continued for decades. The discovery of antibiotics rendered sunlight therapy for infectious diseases obsolete, and as smog from the cities cleared, and cod liver oil was identified as a rich dietary source of vitamin D and routinely spooned down children’s throats, the threat from rickets also diminished. Today, phototherapy is still used, but it is confined to certain skin diseases, including psoriasis, atopic eczema, and other forms of dermatitis.
Even so, as concerns about antibiotic resistance grow in our own age, there’s renewed interest in harnessing the bactericidal effects of light. Fixtures that use a narrow spectrum of visible indigo-coloured light to kill bacteria are being deployed in hospitals to disinfect surfaces and clean the air. So too is the subtype of UV light called UVC, which is unable to penetrate human skin or the outer layer of the eye yet is deadly to smaller bacterial cells. A recent trial published in The Lancet found that UVC machines cut transmission of four drug-resistant superbugs – MRSA, vancomycin-resistant enterococci, Clostridium difficile, and Acinetobacter – by 30 per cent.12 Unlike antibiotics, which target specific cellular systems, UV light damages the nucleic acids that make up bacterial DNA, leaving the cells unable to replicate or perform vital cellular functions.
It isn’t only these deadly aspects of UV light that are prompting a renewed interest in sunlight. In our bustling twenty-first-century cities people are clamouring for it. And even though vitamin D supplements can be used to manage rickets, and antibiotics given to fight off stubborn infections, there are other reasons why our access to sunlight is more important than ever.
* * *
Less than three hours by car from Hanna and Ben’s house in rural Pennsylvania lies New York: the city that never sleeps. Arriving in Sonia’s dad’s truck, fresh from our stay with the Amish, was like being transported into an alternative universe. The blinds in my Airbnb room in Lower Manhattan were broken, but the light made little difference to my sleep, as I was kept awake by the constant roar of the city: first late-night revellers; then the sound of garbage trucks and trash collectors; then the growing rumble of cars and pedestrians as the working day began.
New York is one of the most densely populated areas in the world, and of its five boroughs Manhattan tops the tables – although its population has fallen since the early twentieth century, when whole families were crammed into tiny tenement apartments on the Lower East Side and sunlight deprivation was rife.
Such was the demand for land that many developers tried to maximise the space they owned by building upwards. By now, the link between sunlight and diseases such as tuberculosis and rickets had filtered into the public consciousness, and people were talking about a ‘right to light’ – similar to the ‘right to light’ in contemporary English law, itself based on the Ancient Lights law, which was motivated by the simple desire to have enough light by which to see in your own home. In the UK, this law enables homeowners to block developments that will obstruct their daylight, provided they have enjoyed access to it from across a neighbour’s land for twenty years or more.
The growing clamour in New York prompted the authorities to introduce zoning regulations in 1916, which specified that after a certain height, developers would have to ‘set back’ the upper floors of buildings, leading to the classic ‘wedding cake’ design of many Manhattan skyscrapers.
These issues have recently returned to the fore, as Manhattan’s population begins to expand again. The New York city planning department estimates that by 2030, Manhattan will house between 220,000 and 290,000 more people – roughly one new neighbour for every six current residents. Unsurprisingly, this influx is generating a demand for growth in pockets of the city, which still – arguably – have room to be developed.
Like many American cities, Manhattan is laid out on a grid system, which is perfectly rectangular – except for Broadway, which seems to snake through the neatly ordered boxes of concrete as it chooses. While uptown is generally considered to lie to the north, and downtown to the south, the grid is actually rotated 30 degrees east of north. This means that on two days a year, 5 December and 8 January, the rising sun aligns precisely with the street grid, flooding both the north and south sides of each cross street with light. And on 28 May and 11 July, the towering pillars of glass and concrete neatly frame the setting sun – a phenomenon dubbed ‘Manhattanhenge’ that draws thousands of tourists and office workers out on to the streets to observe it.
The shimmering towers of Manhattan are impressive to behold, many of them reflecting sun and sky in their glassy surfaces. But on the ground, it’s a different story. As the city grows upwards, New Yorkers are progressively being deprived of their lunchtime dose of sunshine as their public outdoor spaces are plunged into shade.
Home to iconic skyscrapers such as the Chrysler Building, Rockefeller Center, and United Nations Headquarters, east Midtown is a densely packed district of Manhattan that will seem familiar to anyone with access to a TV set. Yet the city planning department believes that there’s still room to grow – particularly at its outer edges, where the buildings are just eight to ten storeys high, and the sidewalks are planted with trees.
Here, next to a small synagogue, and opposite two unpretentious Thai and Japanese restaurants, is the entrance to the ‘vest-pocket’ Greenacre Park – a space so tiny that, on my first attempt to find it, I overshot and almost missed it entirely.
The park was opened for the people of New York City by the late philanthropist Abby Rockefeller Mauzé, in 1971, ‘in the hope that they will find here some moments of serenity in this busy world’. There’s a certain irony in the fact that the inherited wealth that funded this haven of sunlight and serenity came from Abby’s grandfather, John D. Rockefeller Senior, whose fortune was built on refining oil to kerosene, the market for which was fuelled by the growing demand for indoor living.
Greenacre Park is no bigger than a tennis court, and is entered through a wooden gateway, which supports a pergola that extends around the left-hand side of the park to a raised area, where people sit around chatting and eating their lunch. In her old age, Mrs Rockefeller used to sit here herself, reading, chain-smoking and admiring the enormous waterfall that gushes out of the leaf-covered back wall into the rectangular pool below. The other unusual feature is the grove of spectacular honey locust trees planted in the park’s central area. These slender, reddish-brown-trunked trees form a canopy of delicate, fern-like fronds that filter and dapple the sunlight, scattering it into a constantly moving mosaic of softly dancing light and shadow. Combined with the waterfall, it’s like being transported into a Hawaiian glade, right in the middle of New York City.
Next to a small café by the entrance, I chance upon Charles ‘Charlie’ Weston, an African American man in a brown park-keeper’s uniform, who helped build the park. If I want to know how it might look if the proposed rezoning goes ahead, he suggests, I should head over to Paley Park, another pocket park located between Madison and Fifth Avenue. I take his advice, and discover an almost identical space, complete with waterfall and honey locusts – yet here the shade from the encroaching skyscrapers has robbed the trees of their mid-level growth, taking much of the park’s character along with it.
‘Before the skyscrapers, there was a lot of sun; very beautiful trees,’ says Tony Harris, another long-standing park-keeper, who has worked here for three decades. ‘Nowadays, we still get the sun, but it comes and goes.’ I ask if it’s made a difference to the number of visitors, and he cracks a broad smile: ‘Of course not; it’s Paley Park: best-kept little park in the whole of New York.’
Others disagree: a shady park soon becomes a disused park – particularly during the colder months when the lack of sunshine makes it too unpleasant to linger outdoors. In a place like New York, where the price of land is at such a premium, it’s hard to justify maintaining a disused park, which puts these outdoor spaces at risk. The ‘Fight for Light’ campaign centred on Greenacre Park says a lot about people’s inherent thirst for sunlight – although, at the time of writing, it had fallen on deaf ears, and the Midtown rezoning plans were expected to go ahead.
Similar battles are raging elsewhere. In London, Roman Abramovich’s plans to build a new £1 billion stadium for Chelsea Football Club have clashed with the desire of local families to maintain access to sunlight in their homes and gardens.13 Even in sweltering Delhi, where the sun shines on 350 days of the year, a relaxation of the height restrictions on new buildings has prompted concerns about neighbouring buildings being cast into shadow. It’s already an issue in Mumbai, where a recent report by the consultancy firm Environment Policy and Research India recommended at least two hours of ‘uninterrupted sunlight’ for buildings each day.14
The discovery that city dwellers’ access to sunlight is crucial for their physical health was hard-won. Now, as ever more of us pour into already crowded cities, we risk forgetting those lessons. Parks and other public outdoor spaces shouldn’t be a luxury. A recent review of published evidence by the World Health Organization concluded that access to urban green space is beneficial to people’s mental health, reduces the toll from cardiovascular disease and type 2 diabetes, and improves pregnancy outcomes. And for children and young people, there’s another reason why spending more time outdoors is good for them – and it has to do with the shape of their eyeballs.
* * *
Ian Morgan lays claim to having once been the world’s foremost expert on the chicken retina. Most of all, he was interested in how the eye makes the switch from low-light to high-light vision – a process that involves the signalling molecule dopamine. ‘If you tell that to people at a dinner party they fall asleep around you,’ says Morgan, in a strong Australian accent. Tell them you’re working on a cure for myopia, however, and they start to sit up and take notice – particularly when you mention that your work may change the fate of a billion children in China.
East Asia is grappling with an epidemic of myopia (short-sightedness), a condition that, like rickets, begins in childhood. In many urban areas – including Guangzhou – prevalence often exceeds 90 per cent, whereas sixty years ago, just 10 to 20 per cent of the Chinese population was short-sighted. This is very different from Morgan’s native Australia, where just 9.7 per cent of Caucasian children are short-sighted.
Guangzhou, in South China, is at the heart of one of the most populous, built-up areas on the planet. It’s also home to China’s largest eye hospital, where Morgan is a visiting fellow – yet even here they are struggling to keep up with demand for their services. On some days, you can’t pass through the corridors for all the patients.
Yet they are the lucky ones: in some rural areas, a misconception that glasses will harm children’s eyesight means that many are left untreated. Because they can’t see the blackboard, they fall behind with their schoolwork.
So few people are short-sighted in Australia that Morgan wasn’t even entirely sure what myopia was during the early part of his career. However, every now and then an academic paper on the subject would cross his radar, so one day he decided to start reading.
This taught him two things: 1) although many textbooks claim that myopia is a genetic condition, its prevalence is rising far faster than can be explained by natural selection; and 2) myopia isn’t simply a case of needing glasses: it is also a leading cause of adult-onset blindness.
When Morgan heard about the soaring levels of myopia in East Asia, he spotted an opportunity to make a genuine difference to people’s lives, and set out to investigate. His first task was finding out just how prevalent the disorder was in Australia compared to neighbouring Asian countries. His results were startling.15 By the age of seven, the prevalence of myopia in Australia was 1 per cent; in Singapore it was 30 per cent. Wondering if genetics might have something to do with it, Morgan then looked at myopia prevalence among children of Chinese ancestry who had been raised in Australia: it was just 3 per cent.
‘The only factor we could think of that differed was the amount of time that kids reported being outdoors,’ Morgan says. Further studies revealed that, whereas Australian children spend some four to five hours outdoors per day, children in Singapore spend more like 30 minutes.
Morgan’s theory that outdoor light might be protective was strengthened by a series of animal experiments by other labs. Raising baby chickens in low light was found to significantly increase their chances of becoming myopic, and a separate study found that housing chicks in the equivalent of outside light protected them from developing an experimental form of myopia.
Myopia is caused by the eyeballs growing too long, resulting in the light from distant objects focusing a short distance in front of the retina, rather than directly onto it. In severe cases, the inner parts of the eye stretch and thin, resulting in complications such as cataracts, retinal detachment, glaucoma and blindness.
The current best guess is that light stimulates the release of dopamine in the retina, which blocks the elongation of the eye during development (unfortunately, bright light doesn’t seem to reverse myopia in adults). Retinal dopamine is regulated by a circadian clock and usually ramps up during the day, enabling the eye to switch from night- to daytime vision. The theory is that, in the absence of bright daytime light, this rhythm becomes disrupted, resulting in unregulated eye growth. Further studies have revealed that intermittent exposure to bright light – as one would naturally get if one spent lots of time outdoors on this rotating planet – is even more protective against experimentally induced myopia.
Ironically, given the impact that myopia can have on children’s education, it is the very desire to see their children excel at school that is contributing to the problem in East Asia. The combination of intensive schooling and a lifestyle that actively discourages children from playing outside is depriving them of daylight – an essential ingredient of healthy eye development. ‘Children don’t go outside at recess time, because they’re told it is bad for their skin; that girls will never get a husband if they have dark skin,’ says Morgan. ‘In Australia it’s a punishment if you don’t go outside.’16
However, myopia isn’t an issue only for East Asia. In both the UK and US, myopia rates have doubled since the 1960s, and continue to rise. In Western Europe, some 56 per cent of people are expected to be short-sighted by 2050; in North America it’s projected to be 58 per cent. And even outdoor-loving Aussies aren’t immune from the trend towards increased indoor living and time spent in front of electronic screens: based on current trends, some 55 per cent of Australasians are projected to be myopic by 2050.
Once myopia starts, it generally progresses until late adolescence, so the hope is that delaying its onset by even a few years may drastically reduce the number of people who go on to develop severe myopia, with its associated risks.
The solution may be relatively simple. In 2009, Morgan launched an ambitious trial to put his theory about the protective effects of outdoor light to the test in Guangzhou. At six randomly selected schools, the six- and seven-year-old children would have a compulsory 40-minute outdoor class bolted on to the end of each school day. The children were also sent home with activity packs containing umbrellas, water bottles and hats with an outdoor activity logo; and they were rewarded if they completed a diary of weekend outdoor activities. Children at six additional schools carried on as normal, serving as controls. Three years later, Morgan and his colleagues compared rates of myopia between the two sets of schools: in the schools with the outdoor intervention, 30 per cent of children had developed myopia, whereas in the control schools it was 40 per cent.17
That may not sound like much, but the children of Guangzhou were only getting an extra 40 minutes of daylight per day, five days a week during term time – and from a very low starting point. Also, barely any families took up the challenge of venturing outdoors more frequently at the weekends. ‘Our hypothesis would be that you need to be aiming for four or five hours of outdoor time each day, like Australian children have,’ Morgan says. A separate American study found that spending ten to fourteen hours per week engaged in ‘outdoor and sport activities’ was associated with approximately half the risk of developing myopia, compared to children doing less than five hours per week.
In Taiwan, schools have tried being more forceful. In 2010, the Taiwanese government launched an initiative called Recess Outside Classroom, recommending that elementary schools (catering for seven- to eleven-year-olds) turn children outside during their recess periods, which total 80 minutes each day. ‘They made the kids get outside, turned off the classroom lights, and locked the classroom doors,’ Morgan says. After one year, they reported that myopia incidence had halved among schools that adopted the programme.
Such assertive methods are unlikely to work everywhere. Neither is it necessarily healthy to send children to sit outside in direct sunlight: suffering sunburn during childhood or adolescence more than doubles an individual’s chances of developing the potentially deadly skin cancer melanoma in later life. However, avoiding direct sunlight altogether can create other issues. Rickets is still a problem in many East Asian countries – and is making a comeback in Western cities, such as London, due to a combination of malnutrition and indoor living.
Researchers are also waking up to the possibility that vitamin D has other important effects on our health – even before we are born – and that the sun may affect our biology in other, unexpected, ways. That mystery factor that so fuelled the popularity of sun cures during the first half of the twentieth century is becoming less mysterious by the day. Sunlight may not be the ‘elixir vitae’ that Victor Dane proposed – and there’s no doubting that it’s harmful in large amounts – but the sun’s influence over our biology runs deep, and there are some recesses that are only beginning to be explored.