Bloody Marvellous
In the early 1920s, a troubled Soviet scientist wandered around Moscow with big visions for the future of humanity.
The man's name was Alexander Bogdanov, and he was a writer, philosopher, physician, and committed communist – not just the kind who's afraid of ending up in Siberia, but the kind who would make even the proudest comrades blush. Inspired by his own sci-fi novels, his political ideals and his studies of single-celled organisms, Bogdanov was convinced that humans ought to share blood with each other. This would be a necessary step towards the ideal communist society, and Bogdanov suspected it would double as a cure for ageing. Ever a man of action, he used his political sway in the Kremlin and was soon given the opportunity to found an institute for blood transfusions in Moscow. Bogdanov wasted no time and started conducting transfusions right away, of course using himself as one of the test subjects.
In the beginning, everything went according to plan. Bogdanov participated in ten blood transfusions over two years and considered them a success. One friend even remarked that he thought Bogdanov looked ten years younger than his actual age. Eventually, though, Bogdanov's luck ran out, and his eleventh blood transfusion went horribly wrong. To this day, we still don't know exactly what happened. Bogdanov's transfusion partner had both malaria and tuberculosis; Bogdanov may have had an immune reaction to the blood itself; and all of this took place in a country where political figures made a virtue of murdering each other in the most creative and imaginative ways possible.
Whatever happened, Bogdanov died two weeks after the blood transfusion at the age of fifty-four, following complications of the kidneys and the heart.
* * *
Alexander Bogdanov was far from the first scientist to experiment with blood sharing. And in fact, his level of eccentricity wasn't that unusual in the field either. The experiments go all the way back to 1864, when French scientist Paul Bert thought it would be a good idea to sew two mice together – probably, at least in part, to show that he could. This unsavoury experiment paid off, in that Bert discovered that the circulatory systems of the mice would fuse of their own accord after the operation, meaning the conjoined animals began to share blood. This peculiar phenomenon was dubbed parabiosis, and over the next few decades, other scientists occasionally ventured into the field as well. Among other things, their experiments helped clear the way for successful organ transplantations.
Despite the many eccentric people involved, though, it took almost 100 years from Bert's initial experiments before scientists researched the use of parabiosis to combat ageing. American researcher Clive McCay was among the first when he tried stitching together pairs of old and young mice to see how they would affect each other. These experiments never went far, though, and soon faded into obscurity. But then in 2005, the concept resurfaced in a research group at Stanford University. Once again, the scientists sewed together two mice of different ages. They found that the pairing increased the regenerative ability of the old mouse – rejuvenated it – while simultaneously weakening the young mouse. In other words, the two mice seemed to converge towards each other's physical states when sharing blood.

Such a finding might make sense in a fantasy novel about vampires, but the scientists were quite puzzled. How could blood somehow transfer regenerative ability? Some believed youthful stem cells would travel from the young mouse and take up residence in the old mouse. Those young stem cells might then explain why the old mouse suddenly fared better. However, that turned out not to be the case. The regeneration actually comes from the old mouse's own stem cells. It seems young blood can somehow make old stem cells lighten up and start acting young again. The effect has nothing to do with blood cells either, as studies show all that is needed for rejuvenation is blood plasma – the blood minus its cells.

The remaining fluid is full of all kinds of hormones and nutrients, as well as various proteins. We already know that the composition of blood plasma changes as we age, but many scientists used to believe this was just a downstream effect of ageing. The parabiosis experiments offer a clue that the arrow of causation might point the other way as well: maybe changes to blood plasma contribute to ageing rather than just track it.
* * *
The story of rejuvenation through young blood has not been lost on entrepreneurs. After all, it would be pretty easy to pay some young people to donate blood and then sell that blood at high margins to elderly millionaires. Blood transfusions are a common medical procedure, so it wouldn't be hard to find qualified staff either. A US company with this exact business plan, called Ambrosia (not the custard brand), opened its doors in 2016 – but closed again after the Food and Drug Administration issued a warning notice. We simply don't know enough about this stuff yet to declare any sort of medical benefit. Claims about 'immortality' didn't help the company's credibility, either.
Fortunately, other companies are using this research in more rigorous ways. These companies hope to identify which factors in young blood are responsible for the rejuvenating effect seen in old mice. We know it can’t be the cells, so most likely it is some kind of soluble protein. If we’re lucky, it’s a single protein, or just a few. If we’re unlucky, this is one of those impossible biological labyrinths where everything affects everything else. If that’s the case, the solution might be to stick with blood plasma rather than trying to narrow it down any further. There are currently clinical trials investigating both these approaches. A few have even concluded and published their results. For instance, there was one trial in which Alzheimer’s patients received blood plasma from young people. Drumroll . . . it didn’t work.
Research into young blood is still ongoing, but new studies cast doubt on what exactly explains the rejuvenating effect. It's certainly possible that young blood contains what we could call 'anti-ageing factors' – molecules that keep us young. But it turns out the composition of old blood might be more important. You see, studies show it's not actually necessary to replace old blood with young blood to rejuvenate old mice. You can get the same effect by replacing the blood with a simple saline solution containing a little protein. That is, old mice are equally rejuvenated if you simply draw a little blood and replace the fluid with some protein-containing salt water. This suggests what really matters in these experiments is not what you add but what you take away. Old blood must contain 'pro-ageing factors' that burden the mice and that it is beneficial to remove.
This finding is especially interesting because we know of a natural experiment in humans with which we can compare it: blood donation. In a typical blood donation, you lose approximately half a litre of blood. Initially, your body replaces the lost volume with fluid from the rest of the body, and over the following weeks, it replenishes the blood cells and various other blood components. That means blood donors go through something quite similar to the old mice in the saline experiments. If occasional removal of some blood has any kind of life-extending effect, we should be able to detect it in blood donors. A Danish study has looked for just this effect – and found it. It turns out blood donors actually live longer than other people. This effect persists even when you account for the fact that blood donors might be healthier at baseline – after all, they were well enough to be allowed to start donating blood in the first place. And interestingly, the effect gets stronger the more donations a donor makes. Admittedly, the effect is moderate – you're not going to live forever just because you start donating blood. But given that it would be a good thing to do anyway, it's worth considering.
The connection between bloodletting and health is not new. For much of history, bloodletting was a common medical practice – though for some reason often performed by barbers. It used to be normal to visit your barber for a haircut and then have a little blood drawn afterwards. In fact, the red stripe on barbershop poles represents the blood that used to be drawn there. At the time, people ascribed all sorts of health benefits to regular bloodletting, but the belief was based on folk wisdom, not scientific research. As a result, bloodletting was used for everything. Even gunshot wounds.
So where could the health benefits of donating blood come from? One possibility is good old hormesis. Losing half a litre of blood is a stress factor to the body – and one it’s easy to imagine we’ve evolved to handle. Nowadays, losing blood is rare, but people used to have all sorts of bloodsucking intestinal parasites, as well as a tendency to fight each other with various sharp objects. However, as we’ve discussed, it is also possible that old blood contains ‘pro-ageing factors’ – certain molecules that somehow promote ageing and that we benefit from getting rid of. If that’s the case, there are thousands of possible culprits. But one of the interesting ones is iron.
It works like this: when you donate blood, you lose a lot of red blood cells. These are the cells that transport oxygen from the lungs around the body. They carry oxygen using a protein called haemoglobin, and inside every haemoglobin protein sit iron atoms. In fact, it is iron that gives red blood cells – and by extension blood itself – their red colour. Thus, when donating blood, you lose a lot of iron-containing red blood cells, which have to be replaced. As you make new red blood cells, you draw on the iron stored in your cells to make haemoglobin, and in that way blood donation lowers your iron levels.
Now, losing a lot of iron might not sound particularly healthy. After all, we usually warn people about getting too little iron. But iron actually shows up in some pretty ghastly circumstances. For instance, people with Alzheimer's and Parkinson's disease have abnormal amounts of iron in the diseased areas of the brain, and Alzheimer's progresses more rapidly in those with particularly high brain-iron levels. Similarly, there are abnormal amounts of iron in the plaques that accumulate in blood vessels with age and that can cause heart attacks and strokes. There was even a randomised controlled trial in which doctors lowered people's cancer risk by reducing their iron levels through blood draws. The trial had 1,300 participants, divided into two groups: one periodically had blood drawn while the other did not. When the trial ended, cancer cases were thirty-five per cent lower among those who had regularly had blood drawn. And those in the blood-drawing group who did get cancer were sixty per cent more likely to survive it.
Genetic studies also back up the association between iron metabolism and longevity. Do you remember Genome-Wide Association Studies (GWAS) from earlier? These are studies in which scientists identify which genetic variants underlie our different traits. We learned that genetic variants affecting the immune system, growth, metabolism and the generation of zombie cells are implicated in ageing. But these studies actually implicate iron as well. At least, people genetically prone to higher iron levels seem to die earlier than others. This finding is backed up by actual blood measurements. In a study of 9,000 Danes, scientists looked at a protein called ferritin, which is responsible for storing iron in our bodies. The more iron you have in your body, the higher your ferritin levels will be. And in the Danish study, researchers found that high ferritin levels were associated with a greater risk of an early death – especially among men.
Now, all this doesn't mean that low iron levels aren't dangerous, too. They very much are, especially for premenopausal women, who lose a little blood – and thereby iron – every month. But the danger of excess iron exposes a flaw in how we often think about health: more is better. People take all kinds of supplements, because why not get a little extra of everything? That's the reasoning behind taking multivitamins, too. Maybe we're deficient, so better get some more of everything. Unfortunately, biology just doesn't work like that. A good example of the faults in this approach comes from a large study called the Iowa Women's Health Study. Here, scientists followed 39,000 women and found, among other things, that those taking iron supplements had a higher risk of dying early than those who didn't. The same was true for those taking a multivitamin pill – which, of course, contains iron.
To be fair, the reason the 'more is better' approach doesn't cause problems more often is that our bodies do a pretty good job of regulating most nutrients and vitamins. In many cases, your body can excrete a substance if you get too much of it. But iron is one of the exceptions. Your body has no system for excreting excess iron. You passively lose a little through sweat, dead cells and bleeding, but there is no dedicated mechanism for pumping out iron if you suddenly have too much. The reason is probably that iron excess never used to be a problem in the past, owing to lower dietary intakes, blood-sucking intestinal parasites and more frequent bleeding. Today, though, it's another story, and men especially are prone to accumulating iron with age. An extreme example is the genetic disease hereditary haemochromatosis, which makes those affected absorb more iron than usual from their food. If not diagnosed and treated, people with haemochromatosis eventually end up with sky-high iron levels. As a result, they usually die early from cancer or heart complications, and before that they start suffering from all kinds of maladies, such as diabetes, fatigue and joint pain. Unless, that is, they have their iron levels lowered using blood draws, in which case the condition is harmless.
The Celtic curse or the Viking disease?
Hereditary haemochromatosis (HH) is almost exclusively found in Europeans. It was once nicknamed 'the Celtic Curse' because there's a particularly high frequency of the disease in Ireland. Another theory is that the disease was spread by the Vikings. There's a high frequency of HH in Scandinavia, too, and scientists have noted that disease frequency tends to be high in areas raided and settled by the Vikings. Like many other genetic diseases, the development of HH requires that you inherit a mutated version of the implicated gene from both parents. If you inherit only one HH genetic variant, you'll be fine. HH is obviously not evolutionarily advantageous, but scientists suspect the genetic variant might have become common anyway because there can be benefits to carrying a single copy. That is, perhaps the HH genetic variant persisted because those with a single copy fared better than the average person, even though those with two copies fared worse. The benefit in question could be helping farmers survive on grain-heavy diets that are low in iron. But there are other possibilities, too. The mechanism could be that slightly higher iron levels lead to a higher volume of red blood cells and thus greater aerobic capacity.
For instance, one study found that eighty per cent of medal-winning French athletes at world-class competitions carry a single copy of the HH genetic variant – a far higher proportion than in the general French population. And other studies have shown that carrying one copy of the HH genetic variant is associated with improved physical endurance compared to non-carriers.
There must be a reason why excess iron shows up in all the wrong places. One possibility is that iron promotes the formation of free radicals – it is well known to spur on our metaphorical bull in a china shop. Yes, we've learned that free radicals are not as big an issue as scientists once thought. In low doses, they're even beneficial, working through hormesis. However, as always, hormesis is about the dose: if a stressor exceeds the level of damage the body can repair, it becomes net damaging and shortens lifespan.
But there's another possibility that can explain the iron–longevity connection, too: microorganisms love iron. Iron is necessary for all living creatures, and microbes such as bacteria and fungi are no exception. In fact, iron works almost like fertiliser for the growth of bacteria. The difference between a harmless and a life-threatening infection can come down to how good a bacterium is at procuring iron for itself – or how much iron is available. This has caused trouble in developing countries, where many children are iron-deficient. Growing up with iron deficiency can stunt growth and cognitive development, so the World Health Organization recommends iron supplements to combat the deficiency. However, iron supplements can come with the downside of increasing children's risk of getting malaria and various bacterial infections – and they can also increase the severity of disease once a child is infected.
Evolution has actually already built this knowledge into our bodies. Access to iron is one of the most important battlefields when fighting infections. If your immune system detects an infection, the body immediately turns up the production of the iron-storage protein ferritin. That way, iron can be locked away in what is essentially a molecular cage so that microbes can’t get to it. Similarly, infections also make your body increase the production of a protein called hepcidin, which blocks iron uptake from your food. So perhaps it’s time we take a closer look at the world of microbes.