It never ceases to amaze me that we can actually see the stars in the night sky. That might sound like a rather silly thing for an astronomer to say, but really sit and think for a moment about just how far starlight has had to travel before finally reaching our eyes. Next time you catch a glimpse of the night sky, see if you can find the three stars in Orion’s Belt. The closest star in Orion’s Belt is 11 quadrillion kilometres away, or 1,200 light years. That means it took that light 1,200 years to travel from that star to our eyes.55 Not only are we seeing the star as it was 1,200 years ago, but somehow, a tiny part of the light which was sent out in all directions across the Universe has managed to make it to our eyes across such a vast distance.
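To get a feel for those numbers, here is a quick back-of-envelope check, sketched in Python using the standard values for the speed of light and the length of a year, that 1,200 light years really does come out at around 11 quadrillion kilometres:

```python
# Rough check that 1,200 light years is about 11 quadrillion kilometres.
light_speed_km_s = 299_792.458            # speed of light in km/s
seconds_per_year = 365.25 * 24 * 3600     # seconds in a (Julian) year
km_per_light_year = light_speed_km_s * seconds_per_year  # ~9.46 trillion km

distance_km = 1_200 * km_per_light_year
print(f"{distance_km:.2e} km")  # ~1.14e+16 km, i.e. about 11 quadrillion km
```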

Think of how lights like torches and car headlights get so much fainter when we get further away from them. So now, just stop to picture how bright those stars truly have to be for us to be able to spot them with a quick glance towards the sky from our bedroom windows, despite being quadrillions of kilometres away and competing with the glare of the street lamp across the road. This is why I catch my breath every time I look at the sky. I get lost in the knowledge of how easy it is for us to merely look up and see even tiny pinpricks of light that have been on the most epic of journeys.

Every star you can see in the night sky is in our local neighbourhood of the galaxy. The light from the stars further away in the Milky Way, over on the other side of the galaxy, combines into one big faint fuzzy glow that looks like someone has spilled milk across the sky (hence how our galaxy originally got its name; the word galaxy even comes from the Greek gala, galaktos, meaning milk). Those of you who have seen a truly dark sky, away from the light pollution of cities and towns, won’t have been able to miss the arch of the Milky Way (it’s a flat spiral shape, with all the stars orbiting in one plane like the planets in the Solar System, so it appears as a strip across the sky), and those of you who have only ever seen the night sky from a city will probably not know what I’m talking about. Fainter still in the night sky is the galaxy Andromeda, which is made up of over a trillion stars; it’s visible from the northern hemisphere as a small fuzzy blob, but in reality it extends about the width of six full Moons across the sky. It’s just so far away that the light from those trillion stars is so faint we can only just detect it with our eyes.

The view is very different when you break out a telescope, though – when Galileo trained his telescope on the fuzzy glow of the Milky Way back in the 1600s, he was shocked to see it resolve into lots of individual stars. Telescopes have allowed us to see further and fainter things in greater detail than we could ever see with our own eyes. And not just telescopes that see in visible light, like our eyes do, but those that detect radio waves (like those used by Bell Burnell and Hewish to discover pulsars) and also incredibly energetic X-rays.

As we saw earlier, X-rays and radio waves are all forms of light; just light with different wavelengths across the entire spectrum. The rainbow doesn’t end at red and violet, it’s just that our eyes can’t detect the light beyond those colours. It was Scottish physicist James Clerk Maxwell who made that jump in understanding of what’s actually ‘over the rainbow’ in 1865. Maxwell’s equations, as they’re now known, are the foundation of every single physics university course the world over. They explain what light is – a wave made up of an electric and a magnetic part (an electromagnetic wave) – and how these waves travel. Maxwell concluded that visible light was an electromagnetic wave with a very short wavelength, and predicted the existence of other electromagnetic waves with both longer and shorter wavelengths and different properties.

Maxwell’s equations were just that though: equations. Mathematics only. No one had yet proved that light was actually an electromagnetic wave, or observed the waves with longer or shorter wavelengths that Maxwell had predicted. But just twenty years later, in 1887, a German physicist named Heinrich Hertz invented a device that would generate what we now know as radio waves: electromagnetic waves with a much longer wavelength than visible light. Over the next few years he would go on to prove that they behaved just as Maxwell had predicted, and crucially behaved the same way that visible light did. They reflected, refracted (changing direction when passing from, say, air to glass, as Fraunhofer found to his intense frustration) and diffracted (spreading out around an obstacle or opening, like ocean waves in a cove).

Hertz’s discovery wasn’t just the first ever recorded generation of radio waves, it was the very first proof of Maxwell’s equations and ideas about what light actually is. It opened the door for the discovery of even more types of electromagnetic radiation; and in particular for the ‘accidental’ discovery of X-rays in 1895 by another German physicist, Wilhelm Röntgen. Röntgen was working at the University of Würzburg, playing around with some of Thomson’s cathode ray tubes. As Thomson later discovered, cathode rays are essentially a stream of electrons flowing from a negatively to a positively charged rod of metal. The electrons are accelerated to huge speeds by the voltage difference between the two rods.

Electrons are tiny particles, invisible to the naked eye, so we can’t see the actual cathode ray itself, but what people noticed at the end of the nineteenth century was that if the electrons hit the inside of the glass tube, the glass would glow. The atoms in the glass were absorbing some of the electrons’ energy and emitting it as light – this is fluorescence.

Röntgen was trying to establish whether it was possible to get the cathode ray out of the tube through a little opening in the glass (the opening was made from aluminium, to block light but conduct electrons). He figured that if he covered the entire thing in thick black paper to shield any of the fluorescent glow from inside the glass, he could then see whether any fluorescence appeared outside the opening. To check if his paper cover was completely light-tight, he covered the aluminium opening with his black paper and then turned off all the lights in his lab. He didn’t see any fluorescent glow coming from his paper-sheathed creation and so, satisfied, he went to turn the light back on. It was then, in the dark of the lab, that he spotted something shimmering on a bench a few metres away from the tube – far further than anyone expected the cathode ray to be able to travel through air. Quite famously, electrons need a good conductor to travel through, like copper, which is why all our houses are full of copper wire (or even copper-sheathed aluminium) to deliver us precious electricity.

Röntgen, not believing what his eyes were seeing, repeated the experiment a few times, running a voltage through the paper-sheathed glass tube repeatedly before being convinced this fluorescence was real. He determined that the fluorescence must be caused by a brand new type of radiation. Since these rays were unknown to him, he used the classic mathematical symbol for an unknown property, ‘x’, and dubbed them ‘X-rays’. That term has stuck, at least in English anyway – in many European languages X-rays are actually known as Röntgen rays.

He then dove into understanding as much as possible about these new ‘X-rays’. What materials could they travel through? How much fluorescence did they cause? How were they generated? He recorded all of this with photographic plates: in the early days of photography, images were created by exposing plates coated in silver-based salts that were sensitive to light. Where light hit the plate, the substance would turn dark (we know this today as a negative image). His biggest breakthrough came when he moved a piece of lead in front of the opening of the cathode ray tube and noticed that both the lead and his own hand blocked the X-rays. After seeing a ghostly image of his own hand on the photographic plate, he started to conduct his experiments in secret, fearing his scientific reputation was on the line. However, other scientists had already noticed that photographic plates became exposed if left too close to a cathode ray tube, with American physicist Arthur Goodspeed noticing that a photographic plate with two coins left on it developed to show two dark circles.

So Röntgen, despite his doubts, decided to continue investigating which substances blocked these ‘X-rays’ and which didn’t. It fell to his wife, Anna Bertha Ludwig, to act as guinea pig in his experiment, and he managed to capture the very first recognisable medical X-ray of the bones in her hand. Her bones and the ring on her finger blocked more X-rays than the muscle and skin surrounding them, and so appeared darker on the image. The image looks so familiar and recognisable to us in the twenty-first century (an X-ray is barely anything to blink at when they appear in the background of an episode of Grey’s Anatomy), but on seeing the image of her skeletal fingers, the first of its kind, Anna Bertha is reported to have said, ‘I have seen my death!’

By December 1895, Röntgen had published his work and the discovery of this new kind of radiation took both the public and the scientific world by storm. Practically every physicist had a cathode ray tube in their lab at that time, which meant they could drop everything to recreate Röntgen’s experiment and further study these mysterious new rays themselves. But it was Röntgen himself who recognised how useful they would be in medicine, writing letters about his discovery to every doctor he knew. Within a year, the medical community across the world was using X-rays to locate bullet fragments, see bone fractures, locate comical swallowed objects and more (although with a bit more of a devil-may-care attitude than nowadays, as they didn’t know the dangers that continuous exposure to high doses of X-rays can pose56).

Start of image description, An X-ray image of Anna Bertha Ludwig’s hand, taken by her husband, Wilhelm Röntgen, on 22 December 1895. The hazy image shows the bones of her left hand and her wedding ring., end of image description

The first ever X-ray published by Wilhelm Röntgen in 1896, of the hand of his wife, Anna Bertha Ludwig. Darker areas are where bone and jewellery block more X-rays. Lighter areas are where fewer X-rays are blocked.

It wasn’t until 1912 that Max von Laue (another German physicist), together with his students doing the grunt work, would figure out what Röntgen’s X-rays were: an electromagnetic wave. They were light, but with a much shorter wavelength than visible light; generated when the electrons in the cathode ray collided with the aluminium covering the opening in the glass tube, and then carried on unhindered through the heavy paper covering it. Röntgen never patented his discovery on ethical grounds, believing something so beneficial to medicine should be free to all. He eventually won the very first Nobel Prize in Physics in 1901 for the discovery, donating the 50,000 Swedish krona prize to further research at the University of Würzburg.

Röntgen’s discovery might have rocked the physics and medical worlds, but it didn’t have much of an effect on astronomers for another fifty years. Max von Laue’s discovery that Röntgen’s rays were a type of light might have planted the idea of observing the sky with X-rays in astronomers’ minds, but it was far from feasible. Thankfully for life on this planet, Earth’s atmosphere blocks the majority of harmful X-rays from outer space from reaching us down on the surface (unlike visible light and some radio waves, which make it through no problem). Great news for us: bad news for budding X-ray astronomers in the early twentieth century.

The atmosphere makes the process of detecting X-rays from objects in space harder than detecting optical light or radio waves. You can’t just cobble together the parts for a telescope on the university’s spare patch of land. Instead, you have to launch your telescope, along with an X-ray detector, up above the Earth’s atmosphere. That sounds fairly easy now to you and me, who are used to even private space companies launching satellites, spacecraft or perhaps the odd electric car into space nearly every single day. But back in the early twentieth century, the idea of X-ray astronomy was considered by most astronomers as just too much of a faff.

Not so for Riccardo Giacconi, though, who having seen the leaps and bounds in knowledge made in the physics world thanks to X-rays, made X-ray astronomy his mission. Giacconi was an Italian-American astronomer who, after completing his PhD at the University of Milan in 1954, jumped ship to the USA on a Fulbright Scholarship.57 Giacconi had been captivated by earlier efforts to detect X-rays at ever-higher altitudes using balloons. But the time of the balloon was over; the time of the rocket-based X-ray observatory had come.

In a technique that would be used until the early 1970s, rockets with X-ray detectors attached would make short flights into the upper reaches of the Earth’s atmosphere and back down again, recording any detections on the way. Giacconi’s experiments using this technique revealed that there were X-rays peppering the night sky, appearing to come from areas where there were no known visible objects. The question on everyone’s lips was, what could possibly be generating these X-rays?

People were stumped, because there aren’t a lot of processes that have enough energy to produce X-rays. X-rays are an extremely short wavelength of light, so they’re very energetic. They’re only given off when something is extraordinarily hot (or moving very fast, like the electrons in a cathode ray tube). Not even the surface of the Sun, at 5,700°C, is hot enough to produce X-rays. The Sun’s upper atmosphere (the corona), on the other hand, is millions of degrees and is plenty hot enough to give us X-rays (remember that the wavelength of light given off depends on the temperature).58 The Sun’s X-rays were discovered in 1949 by American X-ray astronomer Herbert Friedman during a rocket flight, and although the Sun is the brightest source of X-rays in our sky, that’s only because it’s so close. The Sun is not a very powerful source of X-rays, unlike the X-rays found strewn across the sky in Giacconi’s experiments.
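That link between temperature and the wavelength given off can be sketched with Wien’s displacement law (peak wavelength ≈ b/T). The temperatures below are illustrative round numbers rather than precise solar measurements:

```python
# Wien's displacement law: the hotter a body, the shorter the wavelength
# at which its thermal glow peaks. Temperatures are illustrative round numbers.
WIEN_B = 2.898e-3  # Wien's displacement constant, in metre-kelvins

def peak_wavelength_nm(temp_kelvin):
    """Peak wavelength of thermal emission, in nanometres."""
    return WIEN_B / temp_kelvin * 1e9

print(peak_wavelength_nm(5_800))      # Sun's surface: ~500 nm, visible light
print(peak_wavelength_nm(2_000_000))  # corona: ~1.4 nm, X-ray territory
```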

In 1962, Giacconi detected one of the strongest sources of X-ray light in the sky using the rocket-based method, coming from the direction of the constellation Scorpius.59 Given the technology of the X-ray detector on board the rocket at the time, that was about as precise a location as you could get – though it was enough to show the X-rays were definitely not coming from the Moon. It was announced to the world as the first X-ray detection from outside the Solar System. With further rocket flights the location was narrowed down to a star called V818 Scorpii, and the X-ray source, being the first discovered in the constellation of Scorpius, was dubbed Scorpius X-1. This led astronomers to debate whether other stars might be giving off more X-rays from the hot corona that surrounds them, as our Sun does, and this remained the most likely explanation for a few years.

That was until 1967, when the Soviet astronomer (born in what is now Ukraine) Iosif Shklovsky argued that explanation couldn’t possibly be right. He said that stars just didn’t have enough energy to produce that many high-energy X-rays; they weren’t hot enough. Shklovsky was a big name at the time, in both the scientific world and the public eye, having published a book in 1962 on intelligent life in the Universe, in his native Russian, that was then reissued in English in 1966 with Carl Sagan as a co-author.60 Shklovsky was one of the five giants who pioneered the scientific search for intelligent life beyond Earth, along with Sagan, Italian physicist Giuseppe Cocconi and American astronomers Philip Morrison and Frank Drake (of Drake equation fame, and who was supervised by Cecilia Payne-Gaposchkin).

By 1967, Shklovsky had had a thirty-year career focusing on high-energy astrophysics phenomena (from supernova remnants like the Crab Nebula to the Sun’s corona emitting X-rays) while dabbling in the orbits of the moons of Mars and extra-terrestrial life. So, when he posed a new explanation for Scorpius X-1, people took note, even if at the time it seemed pure theoretical fancy. He concluded the only process that would have enough energy to produce the X-rays seen would be an accreting (accretion is a fancy physics word that means to grow gradually in mass) dense object, like a neutron star. Shklovsky published his paper in April of 1967, seven months before Jocelyn Bell Burnell would spot that bit of scruff in her data that marked the discovery of the first neutron star.

So how did Shklovsky make this leap in understanding? From the maths of how fluids (i.e. liquids and gases) behave, physicists had long known that gas moving incredibly quickly would heat up to equally incredible temperatures. Similarly, if that gas was all moving in one direction, perhaps orbiting something, they knew that it would form a disk shape, like how a ball of pizza dough can flatten out into a pizza shape when it’s set spinning overhead (at least, by a very talented chef; mine always ends up on the floor). He suggested that the only scenario that could explain the energies of the X-rays detected was if Scorpius X-1 was a very dense object in orbit around, and also stealing matter from, the star V818 Scorpii. He argued that only a neutron star could be accreting this matter, accelerating it to huge speeds to form what’s known as an accretion disk around it, and therefore heating it to extreme temperatures so that it gave off X-rays.

With the discovery of pulsars by Jocelyn Bell Burnell, and their eventual explanation as neutron stars, Shklovsky’s hypothesis about Scorpius X-1 became all the more attractive, and the idea was eventually accepted by the scientific community at the start of the 1970s. The 1970s then saw a huge leap in the field of X-ray astronomy: space telescopes. Instead of launching rockets, scientists could launch a satellite into orbit with an X-ray detector on board. The very first was Uhuru61 in December 1970, which surveyed the entire sky, marking the locations of X-ray sources and discovering many more sources that coincided with normal stars (including Cygnus X-1, the very first candidate black hole that we heard about in chapter 5) and with newly discovered radio sources like pulsars.

One source of note was Centaurus X-3 (the third X-ray source found in the constellation of Centaurus in the southern hemisphere sky), which was detected first in X-rays and later found to be a pulsar, its X-rays pulsing with a regular period as it orbits a normal star called Krzemiński’s star (after its discoverer, Polish astronomer Wojciech Krzemiński). Centaurus X-3, along with many other X-ray sources like it, left no doubt in people’s minds that these X-rays were powered by accretion, just as Shklovsky had suggested. In Centaurus X-3’s case, the compact object is a neutron star – it’s very clearly detected as a pulsar. But in some cases, like in Cygnus X-1, the X-ray energies were so huge, much larger than those seen coming from accreting neutron stars, that the only explanation was an object more massive than the Tolman–Oppenheimer–Volkoff limit for the maximum mass of a neutron star. Cygnus X-1 could only be powered by an accreting black hole.

So, in the early 1970s, Russian astrophysicists Nikolai Shakura, Rashid Sunyaev and Igor Novikov, together with American theoretical physicist Kip Thorne, first modelled how gas orbiting a black hole would heat up to anywhere from 10,000 to 10,000,000 kelvin depending on how massive the black hole (or other compact object) was. This accretion process essentially converts mass into energy (remember, because they are equivalent) in the form of light, which is also how nuclear fusion inside stars can be described. Accretion, though, is much more efficient than nuclear fusion. If 1 kilogram of hydrogen were to undergo nuclear fusion inside a star, only 0.7 per cent of that mass would be released as radiation. Whereas if 1 kilogram of hydrogen were accreted by a black hole, 10 per cent of that mass would be released as light as it spiralled towards the black hole in the accretion disk. That’s the key thing here – the light is released from the accretion disk around the black hole, which is much further out from the event horizon, so we can still detect it.
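Those two efficiencies can be put side by side with E = mc². A minimal sketch, using the roughly 0.7 per cent fusion and 10 per cent accretion figures for 1 kilogram of hydrogen:

```python
# Comparing mass-to-light efficiencies via E = mc^2:
# fusion releases ~0.7% of the mass as energy, accretion ~10%.
c = 2.998e8   # speed of light, m/s
mass_kg = 1.0

fusion_energy = 0.007 * mass_kg * c**2     # ~6.3e14 joules per kg of hydrogen
accretion_energy = 0.10 * mass_kg * c**2   # ~9.0e15 joules per kg of hydrogen
print(accretion_energy / fusion_energy)    # accretion is ~14 times more efficient
```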

It’s these detections of X-rays that allow us to know that black holes, dead stars, are hiding out there among the stars of the Milky Way. Contrary to what you might first think, black holes are terrible at staying hidden; they make the material around them light up like a Christmas tree. Because of accretion, black holes are not ‘black’ at all; they end up powering some of the brightest objects in the entire Universe. So you’re not reading a book about Robert H. Dicke’s ‘black holes’, but one about blindingly bright mountains.