Philadelphia, 6 May 1953
Imagine for a moment that you are a mechanical engineer (if you’re already a mechanical engineer this won’t take much effort). A particularly awkward client has offered you a vast sum of money to design a machine. This is the brief: ‘I want a small device, about the size of my fist, which can pump high-pressure liquid at a rate of 5 litres a minute – although it will be required to adjust rapidly and automatically to deliver up to 20 litres a minute when needed. It must have no moving parts, and be capable of running for 80 years without interruption or maintenance of any kind.’ You would be compelled to turn down the job: despite years of effort, nobody has ever succeeded in building a pump so reliable. What makes this all the more frustrating is that such a perfect machine already exists: the human heart.
In a lifetime of 80 years the average heart beats around 3 billion times. Mine has done about half of that total so far, pumping the equivalent of 110 million litres of blood (enough to fill about 44 Olympic swimming pools). In 40 years, the closest it’s come to misbehaving is the occasional ectopic (mistimed) beat.fn1 This sheer reliability is a marvel, but of course it is no accident; the organ has evolved this way for a reason. Our tissues are constantly in need of oxygen, and the 5 litres of blood that pass through our arteries every minute meet that demand. So what happens if the pump stops and the blood supply dries up? Surprisingly, death is not instantaneous. Some parts of the body can survive for half an hour or more: severed fingers can be successfully reattached several hours later.
The brain is more demanding. Such is its need for oxygen that the circulation only has to be interrupted for two minutes before the onset of brain damage, and at normal temperatures most of us will be dead after six minutes. As soon as surgeons began to contemplate repairing heart defects they realised this was a major obstacle to their ambitions. In 1936 the American cardiologist Samuel Levine wrote: ‘Until a satisfactory artificial circulation is developed that will nourish the systemic organs and even the coronary arteries, any lengthy operative procedures on the inside of the heart will be difficult or impossible.’1 Opening the heart meant interrupting the circulation, and this could only be done if some way could be found to keep the delicate tissues of the brain and major organs alive while the heart was unable to provide them with fresh blood. For decades this seemed an insuperable problem.
By the end of the 1940s three congenital heart defects were being treated surgically with great success. Robert Gross had shown that it was possible to cure a patent ductus arteriosus; Clarence Crafoord had pioneered the treatment of coarctation of the aorta; and Albert Blalock’s Blue Baby operation had dramatically improved the condition of hundreds of cyanotic infants. But the most common congenital condition of all remained beyond the reach of the surgeon. Thousands of babies are born every year with a hole in the heart – a defect in the septum, the tissue wall dividing the two sides of the organ. If the hole lies between the two upper chambers (atria) this is known as an atrial septal defect; if between the two lower chambers (ventricles) it’s called a ventricular septal defect.fn2 The effects of both are similar: red, oxygenated blood from the left side of the heart mixes with blue, unoxygenated blood from the right side. Because the left atrium and ventricle operate at higher pressure than the right, oxygenated blood is forced into the right side of the heart and pumped back to the lungs, causing the heart to work harder and putting undue pressure on the pulmonary blood vessels. Medics call this a left-to-right shunt: it does not cause cyanosis, because there is no reduction in blood oxygenation. This is the opposite of the right-to-left shunt caused by tetralogy of Fallot, in which unoxygenated blue blood is pushed into the systemic circulation.
A hole in the heart is a simple defect with potentially fatal consequences, and surgeons were deeply frustrated at their inability to treat it. They knew that if they could only gain access to the cardiac interior, smaller defects could be closed with a few stitches, while larger ones could be patched with some suitable material. Actually opening the organ seemed out of the question, so they came up with a number of creative ways of making interventions inside it without doing so.
In 1948 the Canadian surgeon Gordon Murray operated on four children using a clever method that employed connective tissue, the fascia lata, from the thigh. Using a large needle, he passed two or three strips of this material through the heart from front to back so that they lay in front of the defect, and then pulled them taut. His intention was that they would interweave, forming an impermeable barrier across the hole. Murray called this the ‘living suture’ technique, since the material used was not silk or catgut but the patient’s own tissue.2 Although ingenious, the method was unsatisfactory since the defects were at best imperfectly closed. One child died during the operation, and the other three were not greatly improved as a result.
At least a dozen methods of solving the problem were tested over the next few years, but only a few went any further than the animal laboratory.3 The most successful was that of Robert Gross, who came up with a truly daring way of gaining access to the interior of the heart, an idea based on simple physics. He realised that if he could somehow attach a funnel to the exterior of the heart and then make a hole in the cardiac wall, blood would not spurt out in great volumes but would simply rise a few centimetres up the funnel. He could then insert his fingers through this pool of blood into the chambers of the heart and perform simple procedures, albeit by touch alone.
After successful animal experiments he designed a 15-centimetre-high rubber funnel, which he called the atrial well. Its lower orifice was 5 centimetres in diameter, the upper aperture 13 centimetres. The base of the funnel was sewn to the cardiac wall, and a 5-centimetre incision made through it to open the heart. This was a dramatic moment, as bright red blood rushed up into the atrial well. Gross was worried that if the pressure were too high it might overflow, but in practice the pool of blood only rose a few centimetres into the funnel. Immersing his fingers in this scarlet reservoir he was able to feel the defect and then repair it. Several of his earliest patients died, but he also had some spectacular successes. One girl of fourteen was transformed: four months after her operation Gross reported that ‘she likes to play tennis and engage in other strenuous sports – all of which are entirely new experiences for her.’4
This was valuable progress, but it required skill that only a few possessed. Operating blindly in a blood-filled heart, using the two or three fingers that could fit through the atrial well, was a perilous business. A more sophisticated solution was needed, and by the time Gross reported his work in 1953 one had already been found. In September the Philadelphia surgeon John Gibbon revealed his successful use of a machine that would later make open-heart surgery routine.5 This was a historic moment, but it went virtually unnoticed: Gibbon presented his results at a minor local medical conference, and few of his colleagues knew it had even taken place. Strangely, the inventor of the heart-lung machine performed only a handful of operations before abandoning cardiac surgery for ever.
Despite early aspirations to be a poet, it was almost inevitable that John Heysham Gibbon Jr – known as Jack – would become a doctor. His father was a distinguished surgeon, one of the first in the US to suture a heart wound,6 and three earlier generations of Gibbons had also been medics.7 Shortly after becoming a fellow in surgery at Harvard, Gibbon had an idea that would preoccupy him for the next quarter of a century, prompted by his experiences one night in February 1931. A woman who had been admitted to hospital for routine surgery suddenly developed severe chest pain; she had a pulmonary embolism, a life-threatening blood clot in her pulmonary artery. Gibbon’s superior, Edward D. Churchill, had her moved to the operating theatre, where a large surgical team spent a sleepless night watching her life ebb away. At eight o’clock the following morning her pulse stopped and an emergency operation was performed, but to no avail. Many years later, Gibbon recalled his frustration during this midnight vigil:
During that long night, helplessly watching the patient struggle for life as her blood became darker and her veins more distended, the idea naturally occurred to me that if it were possible to remove continuously some of the blue blood from the patient’s swollen veins, put oxygen into that blood and allow carbon dioxide to escape from it, and then to inject continuously the now-red blood back into the patient’s arteries, we might have saved her life.8
This is a concise description of the modern heart-lung machine: a device that removes blood from the patient’s body, exchanges its carbon dioxide for fresh oxygen, and pumps it back into the body – temporarily doing the job of the patient’s own heart and lungs. Gibbon’s early development of such a machine was primarily intended to enable life-saving surgery on patients with pulmonary embolism; it did not immediately occur to him that it might also be a useful aid to cardiac surgery.
When Gibbon began to design his machine two years later he received little encouragement. Dr Churchill was unenthusiastic but reluctantly allowed him to go ahead, while other colleagues were actively hostile to the idea: they were convinced that it would not work, and that it would devour the department’s limited resources indefinitely. The only unwavering support came from his research assistant Mary (known as Maly), a talented laboratory technician who also happened to be his wife. Her knowledge, imagination and meticulous experimental technique would be essential to the many years of hard work that followed.
The two main components Gibbon needed were obvious: an artificial lung to oxygenate the blood, and a pump to propel it through the machine and around the patient’s body. This simple description conceals a labyrinth of engineering problems. The first of these was replicating the function of the human lung, which allows the red blood cells to swap waste carbon dioxide for fresh oxygen. The organ is highly efficient at gas exchange thanks to its vast network of tiny air pockets, known as alveoli, which give a single lung an internal surface area of around 50 square metres9 – about the same as the floor space in the average one-bedroom flat. An artificial oxygenator would somehow have to emulate this efficiency without being unwieldy. Then there was the problem of air embolism: bubbles of gas entering the bloodstream. The artificial lung would work by exposing the blood to pure oxygen; if a single bubble of unabsorbed gas reached the body it might enter the vessels of the brain or heart muscle and cause catastrophic damage. The pump presented another challenge: red blood cells are fragile objects, tiny bags of fluid enclosed in a delicate membrane. If roughly handled they can burst, a phenomenon known as haemolysis. Designing a device which combined the necessary delicacy with the power to propel blood to every extremity of the body would be a formidable task. Finally, Gibbon also needed to find a way to prevent the blood clotting, as it normally does when it leaves the body and comes into contact with air.
When he retired to the library to read up on these problems, Gibbon found that many of them had already been investigated by earlier researchers. As early as 1666 the English natural scientist Robert Hooke had suggested that it might be possible to oxygenate blood outside the body. In a series of experiments on dogs, Hooke used bellows to blow air through the lungs. He discovered that inflating and deflating them was not sufficient to keep the animal alive; the air had to be constantly refreshed. In a paper read before the Royal Society, Hooke wondered whether exposing blood to fresh air in a container outside the body might be enough to sustain life; although he proposed an experiment to test this idea, he seems never to have carried it out.10
Hooke’s contemporary Richard Lower – the first person to attempt a blood transfusion – also investigated the mechanism of respiration. In his meticulous experiments he showed that blood was dark in colour immediately before it entered the lungs, and became bright red as a result of passing through them. This disproved a theory, fashionable at the time, that suggested the heart heated the blood and thereby altered its colour. Lower was in no doubt that blood changed its hue as a result of its exposure to air, and showed that this could be done outside the body by vigorously shaking dark venous blood in a dish until it turned a vivid scarlet.11
Nobody seems to have thought of putting this discovery to practical use until the nineteenth century. The French physiologist Julien Jean César Legallois was fascinated by the difference between life and death, and in 1813 he speculated about the possibility of reanimating dead animals. He suggested that if the function of the heart were replaced by continuous injection of arterial blood, ‘whether natural, or artificially formed’, it would be possible to maintain life indefinitely, or even to perform ‘complete resurrection’ of a corpse.12 He even speculated that artificial circulation might keep a decapitated head alive, although his attempts to do so in rabbits failed because the blood clotted.
The problem of clotting was partially solved in 1821, when two other French researchers, Prevost and Dumas, showed that beating fresh blood with a whisk removed fibrin, a clot-promoting protein.13 In the 1850s Charles Brown-Séquard continued the work of Legallois with far greater success, whisking blood to oxygenate it before injecting it into the severed head of a dog: its eyes and muzzle moved, and he concluded that he had restored it to life, albeit briefly.14 His most celebrated (and gruesome) experiment took place on 18 June 1851, when he attended the execution of a criminal in Paris. The guillotine fell at 8 a.m.; Brown-Séquard then sat beside the headless corpse for the rest of the day, waiting for it to stiffen. By 9 p.m. rigor mortis had set in, and Brown-Séquard began his investigation by carefully amputating the dead man’s arm. He later recalled: ‘As I wished to inject fresh human blood, and as I could not obtain any from the hospitals at such an hour, I was obliged to make use of my own.’ Two friends who had accompanied him to watch the spectacle removed a third of a pint of blood from a vein in Brown-Séquard’s left arm, beat it vigorously and filtered it through a cloth, before injecting it into the detached limb. The blood immediately started to ooze out of the severed ends of its arteries and veins, so Brown-Séquard collected and reinjected it, doing so continuously for the next half-hour. At the end of this gory process he found that the muscles of the hand lost their cadaveric rigidity and would once again contract if stimulated.15
Further motivation for the development of an artificial circulation was provided by late nineteenth-century research into the functions of the internal organs. Scientists wanted to study the kidneys and liver working in isolation outside the body, which entailed perfusing (continuously injecting) them with oxygenated blood. In 1868 two German researchers, Ludwig and Schmidt, put blood in a balloon filled with oxygen and shook it, before pumping it through isolated organs.16 They were able to demonstrate that perfused livers continued to secrete bile, and lungs excreted carbon dioxide.17 A more sophisticated way of oxygenating the blood was invented in 1882 in Strasbourg by Waldemar von Schröder, who bubbled oxygen through venous blood. Bright red blood was produced, but the bubbles created large amounts of foam, which rendered it unusable – a problem that would remain unsolved until the 1950s.18
Another device constructed in 1885 by the Austrian researchers Max von Frey and Max Gruber has a good claim to be described as the world’s first fully functional heart-lung machine – although it was never used on a living animal. Their apparatus pumped blood into a tilted glass cylinder filled with oxygen, which constantly rotated so that the blood spread out into a thin film covering the inner surface, maximising its surface area.19 By the time the blood emerged from the bottom of the cylinder it had been oxygenated, and was then pumped into the experimental subject. Von Frey and Gruber never planned to use the device to keep an animal alive, but instead successfully employed it to perfuse the kidneys and hind legs of dogs that had already been killed. The apparatus was simple – and strikingly similar in conception to the machine John Gibbon would invent more than half a century later.20 Several other artificial oxygenators were constructed around the turn of the twentieth century; most used either the bubble or film techniques, but one invented by the American physiologist Donald Hooker (the uncle of the Hollywood star Katharine Hepburn) employed a rotating flat disc to increase the surface area of the blood, a scheme which would be imitated by later investigators.21
When Gibbon began to design his own heart-lung machine in 1933 he had no idea that a Russian scientist had already been working on the problem for a decade. Sergei Sergeyevitch Brukhonenko’s interest in the subject was prompted by his experiences as an army doctor during the First World War. He saw many soldiers die from injuries to their major organs which could not be treated because there was no way of doing so without interrupting the circulation. He began his research in 1923, and two years later designed a machine that he called the ‘autojector’.22 It used two pumps to replicate the action of the heart, but the first version of the device did not contain an artificial oxygenator; instead he used the lungs of a recently killed dog. These were placed in a dish and attached to bellows to simulate breathing. The blood that passed through them was collected and used to perfuse a living animal.
Brukhonenko used his apparatus to perform a number of impressive and macabre feats. On 18 September 1925 he gave a demonstration to colleagues in which the autojector pumped oxygenated blood through the severed head of a dog, which responded to stimuli and appeared to be aware of its surroundings.fn3 When this research became known more widely it caused a sensation; some speculated that the technology might be used to keep humans alive indefinitely. The playwright George Bernard Shaw declared: ‘I am greatly tempted to have my head cut off so that I may continue to dictate plays and books independently of any illness, without having to dress or undress, or eat, or do anything at all except to produce masterpieces of dramatic art and literature.’23
On 1 November 1926 Brukhonenko became the first to achieve total cardiopulmonary bypass. He attached the autojector to the major blood vessels of a dog before stopping its heart. For the next two hours the machine took over the functions of the dog’s heart and lungs, keeping it alive until unexpected bleeding put an end to the demonstration (and the life of the unfortunate dog). In his account of this experiment, Brukhonenko suggested that in future the autojector might be used ‘for certain operations on the arrested heart’ – a far-sighted remark.24 By the late 1930s he had replaced the donor lungs with a bubble oxygenator, but his ambition to apply the machine to human patients was frustrated by the outbreak of war, which brought a halt to his research. In his experiments Brukhonenko used a recently discovered drug, suramin, to prevent the blood clotting, but it was not terribly effective. John Gibbon was lucky in having access to an alternative that was far superior, and which had only just become available: heparin.
One of the surgeons I spoke to while researching this book described heparin as the discovery that made heart surgery possible. Yet this important advance took place in curiously understated circumstances, and the man who made it died in obscurity. In 1915 Jay McLean enrolled at Johns Hopkins Medical School in Baltimore. At twenty-four, he was older than most freshmen, having had a difficult start to life: his father died when he was four, and the family home was then destroyed in the fires that followed the San Francisco earthquake of 1906. When his stepfather refused to fund him through medical school, McLean spent several years working in gold mines and on oil rigs to pay for his education. Such was his determination to become a doctor that he then travelled the 3,800 miles from California to Baltimore and presented himself at Johns Hopkins, even though he had already been rejected by the institution. Impressed by his tenacity, the dean found him a place. Doing nothing to dispel any impression that he was a rather wilful young man, McLean then announced that he would spend his first year in laboratory research, rather than following the standard curriculum.25 And so he found himself working in the laboratory of Dr William Howell, the world authority on the coagulation of blood.
Howell was investigating a substance extracted from the brain, cephalin, which he believed to be partly responsible for blood clotting. He asked McLean to identify the active ingredient of this mixture, and if possible to refine it. McLean believed the chemical might be found in greater concentrations in other organs, and so also made preparations from hearts and livers. He was curious to find out how long its effect lasted, and so continued to test his samples for some weeks after preparing them. To his surprise, after a while the substance extracted from livers began to inhibit clotting rather than promote it. When he was satisfied that this was not an experimental error he went to tell Howell that he had discovered a powerful anticoagulant. His boss was sceptical, and so McLean gave a simple demonstration: taking a beaker of cat’s blood, he stirred in a small amount of the substance. ‘I placed this on Dr Howell’s laboratory table’, wrote McLean forty years later, ‘and asked him to tell me when it clotted. It never did clot.’26
Howell decided to call the substance heparin, from the Greek word for liver. Although McLean had made one of the most important medical discoveries of the twentieth century, his subsequent career was one of surprising mediocrity: beset by financial and personal problems, he held a few minor academic positions before ending up as a struggling general practitioner, and was never again involved in meaningful medical research.27
Nobody at first understood the significance of the discovery, which had little obvious utility: researchers were trying to understand why blood clotted, not prevent it from doing so. William Howell reported McLean’s findings in a paper published in 1918, noting that when dogs were injected with heparin their blood lost its ability to coagulate, an effect which lasted for several hours.28 Eleven years later a physiologist in Toronto, Charles Best, decided to investigate further. A professor at thirty, Best was already well known as one of the scientists who had discovered insulin in 1921. He and his colleague Gordon Murray succeeded in producing much purer samples of heparin, and proving its effect on dogs.29 They extracted it at first from beef liver, and then lungs, but when the pet-food industry began to buy these items in bulk they were forced to move to beef intestines, which slowed down the pace of their research. The extraction process resulted in some appalling smells, prompting the team’s relocation to a farm a safe distance from the university campus.30
In May 1935 Murray injected heparin into a patient and found that it increased the clotting time of her blood from 8 to 30 minutes.31 One of the first to realise the importance of this work was Clarence Crafoord, the Swedish surgeon who had pioneered the treatment of aortic coarctation. Having been frustrated in his attempts to treat blood clots on the lungs, he quickly realised that this drug might be the answer to the problem.32 By 1939 heparin was deemed so valuable that when a consignment was ordered by doctors in England it was delivered by a navy destroyer rather than being risked on a merchant convoy.33
John Gibbon knew of Best and Murray’s research, and when he designed his first artificial lung in 1934 he was able to obtain heparin supplies from Toronto. Injecting his experimental animals with the drug would ensure that their blood did not clot as soon as it left their bodies. His first device used a vertical rotating glass cylinder as an oxygenator: blood trickling into this container was spread into a thin film and exposed to oxygen before being collected at the bottom and pumped back into the body.34 Because the artificial lung was too small to use on a large animal, Gibbon chose to experiment on cats. Obtaining them was not difficult: Philadelphia was overrun with strays, and the local authorities were killing 30,000 animals a year. Armed with a piece of tuna and a sack, Gibbon took to the streets at night, returning to the laboratory each time with a fresh supply of feral cats.35
The research was laborious, intricate and time-consuming. Mary Gibbon did much of the work, starting early in the morning to prepare the equipment for each day’s experiment, a process that took several hours. When it was all sterilised and assembled, she would anaesthetise the cat and connect it to an artificial respirator, before opening its thorax to reveal the heart. Cannulas (tubes) were then inserted into two vessels to convey blood to and from the heart-lung machine. Finally, heparin was injected into the bloodstream to prevent coagulation, and the pulmonary artery was compressed to simulate obstruction by a clot. At this point the heart-lung machine was turned on, and John and Mary began their observations.36
After many frustrations and refinements to the equipment, one day in 1935 John and Mary succeeded in their aim. As Gibbon tightened a clamp around the cat’s pulmonary artery to prevent blood flowing from its heart to the lungs, the machine came to life and began to take over the functions of both organs. Recalling this moment of elation near the end of his life, he wrote: ‘My wife and I threw our arms around each other and danced around the laboratory.’37 Later that year they succeeded in maintaining the life of a cat for almost four hours while its pulmonary artery was obstructed; without the heart-lung machine this would have killed it in a matter of minutes.38 The animals often developed complications after the procedure, and some died; but in 1939 Gibbon was able to announce to a surgical conference in Los Angeles that four cats kept alive by the heart-lung machine for up to twenty minutes had all made complete recoveries.39 One of the surgeons present likened the Gibbons’ achievement to ‘Jules Verne’s dreamlike visions, regarded as impossible at the time but later actually accomplished’.40
Like Brukhonenko in Russia, Gibbon was unable to continue his research in wartime, when (to the surprise and alarm of his family) he insisted on volunteering for service. When he returned to civilian life in 1945, his priority was to build a larger machine with the ability to oxygenate enough blood to keep a human alive. He realised he needed specialist engineering help for such a major undertaking, and through one of his medical students gained an introduction to Thomas J. Watson, the chairman of the International Business Machines Corporation, who agreed to construct a machine at the company’s expense.41 In later years IBM would become better known as a manufacturer of computers, but in the 1940s its engineers had wide-ranging expertise in designing everything from punched-card machines to armaments. The first IBM machine was delivered in 1946, a larger and considerably more sophisticated version of Gibbon’s prototype. When Time magazine interviewed Gibbon three years later he revealed that he had been able to keep dogs alive on this machine for up to forty-six minutes.42 He also made clear that he now saw this as a means of enabling open-heart surgery: Dwight Harken’s wartime feats had demonstrated how robust the organ was, and convinced the profession that it might be possible to go even further by stopping the heartbeat and opening it.43
Gibbon knew that the cylinder being used to oxygenate the blood was still not big enough to use on humans; to do so he calculated he would need to construct a machine seven storeys high – an obviously impractical proposition.44 Then two workers in his laboratory noticed that the rate of oxygenation increased if an obstruction was placed in the bloodstream to produce turbulence. After experimenting with various ways of doing this they replaced the revolving cylinder with a number of screens made from stainless-steel mesh. Six of these screens were suspended in parallel; blood trickled down them in an oxygen-rich atmosphere and was collected at the bottom. This gave the artificial lung a surface area of over 8 square metres, but contained in a unit small enough to fit into an operating theatre.45
Perfecting the oxygenator was the most difficult aspect of constructing a machine for artificial circulation; what, then, of the pump, the component which was to replace the heart? Shortly after beginning his research Gibbon had started to use a simple mechanism that propelled the blood without touching it. It had only one moving part: a rotating wheel with three rollers spaced along its circumference. A loop of flexible tubing fitted snugly around the outside of this wheel so that as it rotated the rollers at its edge swept along the tubing, compressing it and moving the blood inside it forwards. Known as a roller pump, this mechanism proved ideal, as it produced a smooth flow without harming the delicate blood cells. Oxygenators have continued to evolve, but in the eighty years since Gibbon first adopted it the roller pump has remained virtually unaltered, and it is still a common sight in the operating theatre.
The question of who invented this simple device is a curiously vexed one, and gives some insight into the egos and rivalries of early heart surgery. Michael DeBakey had patented an almost identical pump in 1935 as a tool to facilitate blood transfusions,46 and later took credit for the invention, pointing out that he had sent Gibbon an early model.47 But roller pumps were nothing new; DeBakey’s version was merely an improvement on an existing design dating back to 1855. In fact, no fewer than eleven similar pumps had been patented before DeBakey’s, including several intended for surgical use. This chronology was painstakingly laid out by Denton Cooley in an article published in 1987;48 a cynic questioning his motives might point out that he was embroiled in a messy thirty-year feud with DeBakey at the time.
By the early 1950s Gibbon felt he was close to being able to use his machine on humans. But he no longer had the field to himself: stimulated by his research, others were trying to build an artificial heart-lung. In April 1951 the Minnesota surgeon Clarence Dennis made the first attempt to use bypass to operate inside the heart, on a little girl diagnosed with an atrial septal defect. After successfully attaching her to the machine, Dennis discovered that this defect was far bigger than expected – so large, in fact, that it proved impossible to close, and she could not be saved.49 Later that year two successful procedures using similar machines took place in Italy and the US, though in neither case was the surgeon attempting to operate on the heart.50
John Gibbon waited until he was absolutely satisfied with his heart-lung machine before risking its use on a patient. The first operation took place in February 1953, and like Clarence Dennis’s first attempt it was thwarted by an error in diagnosis. The patient was a fifteen-month-old girl with a failing heart, thought to be caused by a large atrial septal defect. When Gibbon opened the heart he found the organ enlarged but otherwise normal. He was forced to abandon the operation, and shortly afterwards the child died.51 Such occurrences were not uncommon: before the invention of more sophisticated imaging techniques, cardiologists were largely reliant on X-rays and the stethoscope, and diagnosis was an inexact science. Surgeons often cut open a patient only to find an entirely different problem from the one expected.
Three months after this dispiriting experience Gibbon met a young woman who would become his most famous patient, an eighteen-year-old student named Cecelia Bavolek. After a healthy childhood she had suddenly started to have unexplained episodes of breathlessness and an irregular heartbeat. In March 1953 she developed alarming symptoms including a fever and coughing up blood, and was admitted to hospital.52 After weeks of tests the cardiologists agreed that she had a significant atrial septal defect. Gibbon believed that he could repair the hole in Cecelia’s heart, and discussed it with the girl and her parents, explaining that he had never before attempted the operation on a human.53 Despite the risks, Cecelia and her family agreed, and the procedure was scheduled for the following month, giving Gibbon time to prepare his equipment.
The heart-lung machine was a large and elaborate device, about the size of a grand piano. The IBM engineers had equipped it with electronics to monitor every aspect of its operation, including the temperature and pressure of the blood. These circuits, which resembled the innards of an early computer, were housed in a large cabinet filled with nitrogen to prevent flammable anaesthetic gases from entering and causing an explosion.54 But after the machine had been assembled and carefully checked, one vital ingredient was still required.
Early on the morning of 6 May a queue of bleary-eyed medical students lined the corridor outside Gibbon’s operating theatre. They were there to give blood, large volumes of which were required to prepare the heart-lung machine for use. This process, known as ‘priming’, was necessary because there needed to be blood already in the device at the moment when it was attached to the patient.55 Once the last donor had left, heparin was added to prevent clotting, and the machine was turned on: when blood had been thoroughly circulated through its tubes and reservoirs it was ready for use.
When John Gibbon began to operate that morning he was not entirely sure what he and his three assistants would find inside Cecelia’s chest. He was confident that she had a hole in her heart, but it was not clear whether this lay between its upper chambers (the atria) or the lower, pumping chambers (the ventricles). He began by making a long incision across her chest from one armpit to the other, curving it beneath her breasts to minimise the post-operative scar. He then opened her chest cavity in the space between her fourth and fifth ribs. With a retractor spreading the ribs apart, Gibbon had a clear view of the heart. The right ventricle was horribly swollen, and the pulmonary artery was so enlarged that it vibrated with every heartbeat. To find out what was wrong, Gibbon made a small incision in the right atrial appendage, a muscular pouch attached to the right atrium. Through this opening he was able to insert a finger and explore the inside of Cecelia’s heart. Immediately he could feel a hole between the two atria – ‘as large as a silver dollar’, as he later recorded in his operative notes.
With the diagnosis confirmed they could attach Cecelia to the heart-lung machine. She was given heparin, and plastic tubes were inserted into her blood vessels. The first of these was attached to the left subclavian artery, just above the aortic arch; the second was pushed through the right atrial appendage until it reached the venae cavae, the two main veins which return deoxygenated blood to the heart. The machine was now switched on, and began to assist Cecelia’s circulation for the first time, though some blood continued to flow through her heart. Almost immediately they noticed a serious problem: blood was leaking from the artificial lung. Insufficient heparin had been added to the donor blood, and clots had started to form on some of the oxygenating screens, impeding flow through the device. After a hurried discussion with his assistant Frank Allbritten, Gibbon decided to continue, tightening a pair of ligatures around the venae cavae to prevent any blood from entering the heart. His machine was now standing in for Cecelia’s heart and lungs.
Gibbon made a large incision in the right atrium, revealing a gaping defect in the septum. He intended to close this hole using a patch of tissue taken from the pericardial sac, but Allbritten suggested that it might be easier simply to sew its edges together. Gibbon agreed, and shortly afterwards the final stitch was in place. Once the heart had been closed and sutured, the pump was turned off and the tubes in Cecelia’s blood vessels removed. The operation had taken a little over five hours; for twenty-six minutes, Gibbon’s heart-lung machine had been the only thing keeping her alive. She was already beginning to come round as Gibbon inserted the last stitch in her chest, and an hour later was talking to her nurses on the ward.56
Cecelia made a rapid recovery, and was allowed home within a fortnight. When she returned to hospital in July her doctors found that the operation had been a total success: the septal defect was closed, and she was now able to climb stairs without becoming breathless. Though she was delighted with her improved health, she was also upset by the attention she had received: her name had appeared prominently in newspaper reports of the operation, and journalists had been pestering her and her parents incessantly. Unprepared for such attention, they changed their phone number and refused all requests for interviews.57 Apart from a few appearances in support of the American Heart Association in the 1960s she avoided public attention, living quietly until her death in 2000.
The operation also took an emotional toll on John Gibbon. When he left the operating theatre he usually wrote a short account of the procedure; but this time he gave the job to his juniors, unwilling to relive the almost unbearable stress it had entailed. For years afterwards he felt a pang of anxiety every time he opened a surgical journal and saw an article about open-heart surgery.58
There was another reason for his unease. In July 1953 he operated on two further patients: both were five years old, and neither survived. The first went into cardiac arrest before she had even been attached to the heart-lung machine and died on the operating table; the second had a complex defect which could not be repaired, and succumbed shortly afterwards.59 This was mere bad luck, but Gibbon was devastated: he had spent twenty years working on a single problem, and had lost three of his first four patients. He announced that he was suspending all cardiac surgery for a year, pending improvements in the equipment. In fact he never operated on the human heart again, handing responsibility for the entire programme to one of his juniors.60 One could interpret this as the reaction of a disillusioned man, but some of Gibbon’s closest colleagues suggest that he had achieved all he wanted to. Two decades working on one problem was enough; it was time to hand the baton to younger surgeons.
The operation on Cecelia Bavolek was by any standards a historic one, but Gibbon was curiously absent from his own triumph. When Time interviewed him he explained open-heart surgery vividly (‘like drying out a well to do some work at the bottom of it’), but refused to pose for pictures with his machine.61 Such achievements are normally documented in the international scholarly press, but Gibbon wrote only a short report for the journal of the Minnesota Medical Association. This is one reason for the general sense of anticlimax that followed Gibbon’s operation, but there are others. Progress in medicine is not defined by a single success, but by a series of them: one patient cured was not enough to convince the surgical world that the procedure was safe. Many surgeons also believed that Gibbon’s research was a dead end, since other, apparently less cumbersome, methods of operating inside the heart were already being developed and were yielding results. Gibbon, it transpired, was not even the first surgeon to perform a successful open-heart operation. That milestone was achieved in September 1952 by John Lewis, a surgeon from the University of Minnesota, who closed an atrial septal defect in a five-year-old girl using a technique completely different from Gibbon’s. Instead of maintaining the patient’s circulation artificially, he stopped it entirely. Conventional wisdom suggested that this should have fatal consequences, but Lewis had a new surgical weapon at his disposal: hypothermia.62
Whereas Gibbon spent twenty years developing his heart-lung machine, hypothermia was being used clinically within five years of the first experiments. The idea was simple: cooling the human body lowers the oxygen requirements of its tissues, so less blood is needed to keep them alive. This increases the length of time for which the circulation may be safely interrupted, giving the surgeon precious minutes in which to operate.
The first person to think of cold as a possible aid to heart surgery was a Canadian surgeon called Wilfred Bigelow. The freezing winters of his native country provided the stimulus for this interest in 1941, when he found himself amputating the fingers of a young man who had fallen prey to frostbite. He was surprised to find that little was known about why frostbitten extremities so often turned gangrenous, and spent some time looking into the problem. A few years later he went to Baltimore to work for Alfred Blalock, and took part in a number of Blue Baby operations. Watching Blalock operate around the beating heart, he realised that surgeons would never be able to cure more complex conditions while the organ was still pumping. And then an idea occurred to him: ‘One night I awoke with a simple solution to the problem, and one that did not require pumps and tubes – cool the whole body, reduce the oxygen requirements, interrupt the circulation, and open the heart.’63
When Bigelow returned to Toronto in 1947 he wasted no time in beginning his research. Surprisingly few people had investigated the effects of prolonged cooling on the human body. In the late 1930s a Philadelphia neurosurgeon, Temple Fay, noticed that cooling cancer cells appeared to inhibit their growth and division. He suspected this might have therapeutic benefit, and began to use refrigeration to treat patients with advanced malignancies. At first he used iced water to chill only the area of the tumour, but then he began to apply it to the entire body, reducing the patient’s temperature from the usual 37°C to around 30°C.64 The results were disappointing, but this study of over 100 patients nevertheless broke new ground: before these experiments hypothermia was generally believed to cause irreversible tissue damage, but Fay had shown that it could be endured for days at a time.
Bigelow’s central assumption was that cooling the body’s tissues would reduce their oxygen consumption. To test this hypothesis he cooled dogs while monitoring the oxygen levels in their blood. The first experiments took place in a giant freezer cooled to −9°C, but Bigelow and his colleagues soon tired of having to dress up as polar explorers, and instead simply placed the dog in a bath of iced water. The results were surprising: rather than decreasing the dogs’ oxygen requirements, cooling seemed to increase them. Eventually they realised that this anomaly was caused by shivering, which greatly sped up the dogs’ metabolism. Once they had succeeded in suppressing shivering through careful use of anaesthetics, they found that oxygen consumption did decline steadily with a reduction in temperature, exactly as predicted.65
Over the next few years, Bigelow and his team thoroughly investigated the physiological effects of hypothermia, and found that they could safely cool dogs to 20°C; any lower than this and there was a risk of fibrillation, a dangerous state in which the muscle fibres of the heart stop contracting in unison and begin to spasm chaotically. As the temperature dropped, so did the heartbeat: dogs cooled to 30°C needed only 50 per cent of their usual oxygen intake, and at 20°C this figure fell to 20 per cent.66 By 1949 they felt they knew enough about the body’s response to cold to attempt an open-heart procedure on a dog. The animals were anaesthetised and then cooled to 20°C – at this temperature no further anaesthesia was necessary, since hypothermia alone was enough to keep them unconscious. The chest was opened, and clamps placed on the main cardiac veins, stopping the circulation. Finally an incision was made into the organ and the interior of a living heart revealed for the first time. ‘What a thrill to look inside a beating heart!’ Bigelow wrote thirty years later. ‘What a dynamic, powerful organ.’67
He was eager to translate this experience into operations on human patients, but to his frustration did not get a chance to do so: local cardiologists were reluctant to let their patients undergo an untested procedure.68 Bigelow and his colleagues were crestfallen when they learned that John Lewis had already performed the first open-heart operation using hypothermia, with a Philadelphia surgeon, Charles Bailey, achieving a second success a few months later.
Bigelow’s experimental work laid the foundations for cardiac surgery using hypothermia, but the figure who showed its full potential was Henry Swan. Like Bigelow he had served with distinction as an army surgeon in France, and his experience in treating soldiers with terrible injuries to the blood vessels was invaluable preparation for heart surgery. Swan was familiar with John Gibbon’s work, but felt that the advantages of the heart-lung machine were outweighed by its cost, complexity and dangers. Hypothermia was simpler, required no complex equipment and entailed minimal expense.69 At his hospital in Denver, Swan began a spectacularly successful series of operations using the technique. He believed hypothermia could enable surgery on a range of conditions, and deliberately chose cases that were unapproachable by any other method – the only patients on whom he could justify using a relatively unproven procedure. His first fifteen patients included some with an atrial septal defect, but he also attempted to correct cases of pulmonary stenosis – a narrowing of the opening of the pulmonary artery from the heart. Thirteen were improved by their operations, and only one died during surgery.70
Hypothermic heart surgery was an extraordinary spectacle, as dramatic as anything seen in an operating theatre since the days when leg amputations were performed without anaesthetic. When the patient had been put to sleep and connected to monitoring equipment and a respirator, they were lowered into a tepid bath to begin the process of refrigeration. Ice cubes were then added to accelerate the cooling, and the patient was removed while their temperature was still a few degrees above the target of 28°C, since it would continue to drop for some time afterwards. They would then be thoroughly dried and transferred to the operating table. When the chest had been opened and the surgeon was ready to open the heart, artificial respiration would be stopped and blood prevented from flowing into the heart with clamps.71 The surgeon would then have between eight and ten minutes to complete the procedure before the patient’s life would be in danger. In Bigelow’s operating theatre the anaesthetist gave a minute-by-minute time check, building to a nerve-shredding climax as the crucial tenth minute approached.72 Completing a complex repair in these conditions required speed, superlative skill and unwavering self-confidence. Even Swan, who put himself through this experience over a hundred times, admitted he was ‘scared to death’ every time he operated.73
Swan’s operations on a wide range of heart conditions represented a new frontier in surgery, and in his article announcing his first fifteen cases he pointed out that he had disproved Stephen Paget’s assertion that it would never be safe to operate on the organ: ‘Given half a chance,’ he wrote, ‘the heart keeps beating.’74 But many were sceptical: it seemed strange to induce one life-threatening condition in order to treat another. Russell Brock, while impressed by Swan’s results, declared: ‘I cannot bring myself to believe that such a procedure can have a permanent place in surgery,’ adding fastidiously that the sight of a bath of iced water in an operating theatre was ‘both aesthetically and surgically unattractive’.75
The opinion of a respected figure like Brock carried much weight, but criticising a new technique on aesthetic grounds was rather missing the point. A more substantive objection to hypothermia was the fact that it afforded only ten minutes for the surgeon to perform his task. The Amsterdam surgeon Ite Boerema found a way to extend this safe period to as much as half an hour by performing hypothermic surgery in a high-pressure chamber similar to those used to decompress deep-sea divers. When patients breathed air at three times normal atmospheric pressure their tissues became saturated with oxygen, radically increasing their ability to withstand circulatory arrest.76 Several patients were successfully operated on, but the surgeons hated working in a glorified tin can: the high pressure caused ear and sinus pain, and they had to endure slow depressurisation after each procedure had been completed.77 A few other pressure chambers were built, but their cost and inconvenience outweighed their benefits.
Surgeons uncertain which open-heart technique to disapprove of were spoilt for choice. One arguably even more dubious method was being used – with surprising success – by a colleague of John Lewis in Minnesota. C. Walton Lillehei was another distinguished war veteran who had treated casualties in Algeria and Tunisia before taking part in the Allied invasions of Sicily and mainland Italy in 1943, spending several months encamped at the beachhead at Anzio.78 He assisted Lewis in the first open-heart procedure in 1952, but soon became convinced that the ten-minute time limit to operations using hypothermia was a fatal flaw. Later that year, two researchers in England discovered that a dog could survive for half an hour with the main blood vessels into its heart clamped shut, as long as a small vein, the azygos, was left open.79 The blood flowing through this vein represented about 10 per cent of the heart’s usual output. The fact that the major organs could survive for so long with so little blood was entirely unexpected, and became known as the azygos flow principle. It considerably simplified the quest for a reliable means of heart-lung bypass, because it reduced the volume that needed to be oxygenated and pumped through the patient per minute.
Lillehei was made aware of this discovery by Morley Cohen, a researcher in his laboratory. Cohen’s wife was pregnant at the time, and it occurred to him that the relationship between her and their unborn child was a perfect model for what he was trying to do. In the womb the foetus, unable to breathe for itself, receives oxygenated blood from its mother via the placenta. So, he wondered, could they mimic this arrangement by connecting a large animal to a small one, making one creature breathe and pump blood for both?80
Joining the circulations of two animals, known as cross-circulation, was nothing new: in the late eighteenth century the French anatomist Marie François Xavier Bichat connected the blood vessels of two dogs in such a way that the heart of each dog propelled oxygenated blood into the head of the other.81 In the late 1940s two researchers at Mount Sinai Hospital in New York even used a pump to connect a donor dog and a ‘patient’ animal, on which they attempted open-heart procedures.82 Cohen and Lillehei repeated these experiments using the azygos flow principle, comparing the results with those achieved using a heart-lung machine. To their surprise the dogs on cross-circulation, which received only 10 per cent of their usual blood supply, not only recovered more quickly but were more likely to survive than those on the heart-lung machine. With results using other methods proving disappointing, they felt they were justified in testing the idea on human patients.83
The method would only work when operating on children, since it was necessary for the donor to be bigger than the patient. Lillehei proposed to use parents as donors, given that their blood would usually be compatible with that of their children, and they would almost certainly agree to the procedure. Obtaining approval from the hospital authorities was more fraught; the procedure entailed putting a completely healthy individual – the donor – at risk, and there was considerable resistance to the idea. After much debate a faculty committee eventually gave the go-ahead, and Lillehei operated on his first case on 24 March 1954.
The patient was a thirteen-month-old boy, Gregory Glidden, who had a ventricular septal defect. His father volunteered to act as the donor, and on the morning of the operation both he and his son were given a general anaesthetic. They were placed on adjacent tables in the operating theatre, and tubes were inserted into their major blood vessels. When the boy’s chest had been opened and Lillehei was ready to begin, the venous blood returning to Gregory’s heart was diverted into his father’s body through a vein in the leg. It then passed through the father’s heart and lungs before being pumped back into the boy’s aorta. For the next twelve and a half minutes the father breathed for his son, as Lillehei opened the ventricle and sutured the defect closed. Technically, the operation passed off without a hitch, and the father was unharmed by the experience; but eleven days later the boy died, succumbing to post-operative pneumonia.84
Though disappointed, Lillehei and his team were not to be deterred. The following week they operated on another two patients, aged three and five, again using the fathers as donors. These patients made complete recoveries – the first time that anybody had succeeded in closing a hole between the ventricles of the heart. Nineteen of their first twenty-six patients survived, and Lillehei was so confident in the technique that in August he decided to operate on a child with tetralogy of Fallot.85 Even Blalock and Taussig, the architects of the Blue Baby operation, thought that total correction of such a serious condition was impossible, but Lillehei and his team proved them wrong. Between March 1954 and July the following year, Lillehei operated on forty-five patients using cross-circulation, many of them young children. All had conditions which were previously incurable, and the results were staggering: two-thirds survived, and almost half were still alive more than thirty years later.86
Despite the obvious success of cross-circulation, most surgeons had grave reservations about the technique. When Lillehei presented the results of his first three cases at a medical conference, John Gibbon spoke from the floor to articulate the fears he shared with many of his colleagues: ‘We are still convinced that it is preferable to perform operations involving an open cardiotomy [opening the heart] by some procedure which does not involve another healthy person. There must be some risk to the donor in a cross circulation.’87 Indeed there was: although it went unreported, several surgeons with an inadequate grasp of the technique attempted to use it, with disastrous results.88 In Britain, cross-circulation was deemed so dangerous that it was illegal even to attempt it.89 One of John Gibbon’s junior surgeons, George J. Haupt, remarked that it was the only known procedure which had a potential mortality of 200 per cent.90
All forty-five of Lillehei’s donors survived, although one nearly died when her heart stopped shortly after the procedure had been completed. The surgeons immediately opened her chest, intending to massage the organ to stimulate a heartbeat, but were hugely relieved when it resumed spontaneously.91 This near-miss, and the fact that cross-circulation was not suitable for use on adults, persuaded Lillehei to try other methods. Two of his departmental colleagues, Gilbert Campbell and Norman Crisp, had been experimenting with an idea first employed by Sergei Brukhonenko in the original version of his autojector: using animal lungs as an oxygenator. Campbell and Crisp used dog lungs to enable heart-lung bypass in operations on children, but most of their patients died.92 In early 1955 a thirteen-year-old called Calvin Richmond arrived at Lillehei’s hospital having been crushed by a truck, an accident which had caused a traumatic atrial septal defect. The boy’s family were offered a choice between cross-circulation and the use of a dog lung, and chose the latter. A large dog was anaesthetised and one of its lungs removed, placed in a plastic cylinder and ventilated with pure oxygen. Calvin’s blood was pumped through the lung for twenty minutes while Lillehei closed three openings between the chambers of his heart. Remarkably, the boy made a complete recovery.93
This success was a rare exception. The leading exponent of the method was William Mustard at the Hospital for Sick Children in Toronto, just over the road from Bigelow’s department. In 1952 he had operated on seven children with congenital abnormalities using a pump to circulate their blood through a pair of lungs taken from a rhesus monkey: for over an hour these young patients breathed through the lungs of another species.94 None survived for longer than a few hours,95 and when he finally abandoned the method a few years later, only three of the twenty-one patients he had treated were still alive and well.96
By 1955 several methods had thus been used to enable open-heart surgery, and all seemed to have significant drawbacks. By the end of the decade, however, there was no doubt about which was the best. Although Gibbon had given up using his heart-lung machine, there were others keen to carry on where he had left off. In Sweden, Viking Björk developed an artificial lung which used rotating discs to pick up a thin film of blood and oxygenate it.97 And other Americans pursued their own research: John Kirklin from the Mayo Clinic in Minnesota visited Gibbon and with his blessing (and the aid of his engineers) built a replica of his device. An ascetic and exacting individual, Kirklin paid extraordinary attention to detail, and his careful work paid off: twenty-four of his first forty patients survived, and he concluded that the heart-lung machine was a ‘reliable and safe’ clinical tool.98
While Kirklin did much to convince the surgical profession of the merits of the heart-lung machine, the man who turned it from an expensive novelty into an affordable and practical device was Richard DeWall. He had joined Lillehei’s laboratory in early 1954 and operated the pumps for many of the early cross-circulation procedures. At Lillehei’s request, DeWall looked into the possibilities of artificial bypass, and eventually came up with a scheme radically different from that used by Gibbon.99 His oxygenator consisted of a 60-centimetre vertical plastic cylinder; blood was pumped upwards through this column, while oxygen bubbles were injected into it through eighteen hypodermic syringes at the bottom. Earlier investigators had been wary of this method of oxygenation, because of the major threat posed to the human body by gas bubbles. DeWall found a simple way of eliminating this risk: a reservoir at the top of the oxygenating column trapped any foam, and then the blood descended through a spiral of tubing, shedding any residual bubbles as it did so.100
The simplicity of DeWall’s device was revolutionary. The tubing – and the antifoaming agent with which it was treated – came from the dairy industry, and were designed for use with milk rather than blood.101 None of the parts, in fact, was specially manufactured, and DeWall estimated that the entire apparatus cost no more than $15 to make. Other early heart-lung machines took two full days to prepare, with over 450 glass and metal components to be thoroughly cleaned;102 DeWall’s had no moving parts, could be easily assembled and sterilised, and did not need an army of engineers to maintain it. The first operation using the new oxygenator took place in May 1955, and by the following year he and Lillehei had used it on ninety-four patients, treating a large range of conditions. The results were so good that DeWall’s report of these cases stated confidently that ‘on the basis of this experience it is predictable that reparative surgery in the open heart is destined to become a major field of endeavour.’103
This was the kick-start that open-heart surgery needed: surgeons from all over the world came to watch Lillehei and DeWall at work before returning home to construct their own devices. Soon an even simpler version of DeWall’s oxygenator was available: made from two heat-sealed plastic sheets, it cost a few cents to manufacture, could be easily mass-produced, and was intended to be disposed of after a single use.104 A significant improvement was unveiled shortly afterwards by Willem Kolff, the Dutch-American pioneer of artificial dialysis. Exposing the blood directly to oxygen causes damage to its cells, and Kolff invented a new type of oxygenator in which blood was kept separate from the gas by a semi-permeable membrane. Gas exchange could take place through this protective layer with great efficiency.105 Kolff’s membrane oxygenator was not perfected until the 1970s, but thereafter it became the standard technology seen in operating theatres.
Once the safety of bypass had been demonstrated, the technology was taken up with breathtaking rapidity. Medical circles were abuzz with discussion of the device, referred to in surgical slang as ‘the pump’. In 1955 there had been only two hospitals in the world where open-heart surgery was taking place; two years later, when a major conference took place in Chicago to discuss the heart-lung machine, more than 220 clinicians attended.106 Several different types of device were discussed at that gathering; the following year Russell Brock observed that there were now ‘nearly as many varieties of machine as there are of motor-cars’.107
A clear consensus in favour of artificial bypass had finally emerged; John Gibbon’s dream had come true. But although cross-circulation quickly became redundant, hypothermia did not. Swan continued to use the technique on its own with great success until the early 1960s, but others found that it was a valuable adjunct to the heart-lung machine. The DeWall oxygenator’s output was lower than that of the heart, so patients were slowly being starved of oxygen while they were connected to the device, and surgeons had only a limited window of time in which to operate. Hypothermia offered a way of prolonging it. Two Italian surgeons were the first to suggest combining the two techniques in 1953,108 and five years later a team from Duke University in North Carolina did so in operations on forty-nine patients. Initially the patients were cooled with ice packs or refrigerated blankets, but controlling the temperature in this way proved difficult. The surgeons then adopted a much better method: the blood was cooled by a heat exchanger to around 30°C as it passed through the heart-lung machine, so that the patient’s body was chilled from within. Reducing the temperature to the desired level took only a few minutes, and rewarming the blood afterwards was just as quick. The results were promising, with a survival rate of 75 per cent for critically ill patients.109
Another new use for hypothermia was discovered by Norman Shumway, a surgeon then working at Stanford, whose work would later lay the foundations for heart transplantation. In 1959 he showed that cooling the heart locally with saline at 4°C made it safe to operate for as long as an hour.110 Topical or general hypothermia – or a combination of the two – eventually gained general acceptance and remains in common use today.
A further weapon entered the surgeon’s armoury in the 1950s, one which would enable far more radical intervention: stopping the heart. The first suggestion of this drastic measure can be traced back to 1851. A Scottish surgeon, James Wardrop, noted that it was possible to revive apparently dead animals by artificial respiration, and proposed that this might have therapeutic application: ‘How far it may ever be expedient to imitate such experiments, and extinguish human life for a while, in order to cure diseases, and restore it again after the disease is subdued, is a matter of grave consideration.’111
This was prophetic. A stopped heart would be much easier to operate on than one that kept beating. A little over a century later one of the leading British heart surgeons, Denis Melrose, elaborated on the idea: ‘A most valuable contribution … to the whole problem of intracardiac surgery would be made if the heart could be arrested and restarted at will.’112 Building on the observation that potassium salts caused the heart muscle to stop contracting, Melrose experimented on a veritable menagerie of isolated animal hearts: those of dogs, rabbits, a guinea pig, a kitten and a puppy. He found that an injection of potassium citrate into the aorta found its way into the coronary arteries feeding the heart muscle, which quickly stopped beating. A normal heartbeat was restored without further intervention once the chemical had dispersed.113
Shortly afterwards Melrose became the first person to deliberately cause cardiac arrest in a human patient. Once the patient was on bypass the potassium solution was injected, and the heart stopped. The aorta was clamped to prevent more blood from entering the heart muscle, allowing Melrose to operate on an organ that was perfectly still and bloodless. When he had finished a few minutes later the clamp was removed, and to the relief of the team the heartbeat resumed spontaneously.114
This technique is known as cardioplegia, the Greek for ‘heart paralysis’. It was at first regarded with considerable suspicion, with many surgeons concerned that the chemicals would cause lasting damage to the heart muscle. Only a few ventured to use it until a major development in the early 1970s, the consequence of a young biochemist, David Hearse, being invited to watch open-heart surgery at St Thomas’ Hospital in London. He was shocked by the primitive measures taken to protect the heart muscle during the procedure, which entailed perfusing the coronary arteries with cooled blood.115 He decided to look into possible alternatives, and after a few years of research developed an elaborate combination of drugs which greatly improved the ability of the heart to withstand total arrest for ninety minutes or even longer.116 Hearse’s detailed study of the microscopic changes to the cardiac muscle induced by cardioplegia, and the biochemical mechanisms that caused them, showed that the method was safe and indicated the best way of going about it.117 Variations on the drug cocktail he devised, known as St Thomas’ solution, have been a mainstay of the cardiac operating theatre ever since.
Even now, artificial heart-lung bypass is not without its problems. Some of the earliest surgeons to use the pump realised that bringing blood into contact with plastic and other man-made materials inevitably altered its properties. In the 1960s they started to notice a pattern of untoward symptoms displayed by patients who had undergone bypass, which was termed ‘post-perfusion syndrome’. In a landmark paper published in 1983 the British surgeon Stephen Westaby stated that ‘the damaging effects of cardiopulmonary bypass … have contributed considerably to the morbidity and mortality of cardiac surgery over the past 30 years.’118 This problem still has not been solved: one surgeon I spoke to observed that, paradoxically, any patient kept alive by the heart-lung machine is also being ‘slowly killed’ by the device. Today bypass is employed for the shortest time possible, and for some conditions new techniques have been invented which avoid its use entirely. But it has become an essential piece of kit: over a million open-heart operations take place worldwide every year.
Despite its enduring shortcomings, John Gibbon’s invention was the great leap forward that heart surgery needed, the single greatest advance in the history of the discipline. In 1950 it was possible to use the scalpel to treat only a handful of cardiac conditions. Ten years later the era of open-heart surgery had begun, and there was scarcely a type of heart disease that surgeons had not attempted to correct. In a very short time the heart-lung machine became indispensable. But being able to open the heart was only the beginning: now surgeons had to learn what to do when they got there.