The Compatibility Gene

  On 7 September 1969, at the end of a week-long meeting of the British Science Association, a few weeks after Armstrong and Aldrin walked on the moon, Peter Medawar, as that year’s president, gave a reading at Exeter Cathedral – as part of a tradition at the time for the Science Association to participate in an annual religious service. As he read from the Wisdom of Solomon – ‘For Wisdom is more mobile than any motion; because of her pureness she pervades and penetrates all things . . .’ – his voice slurred suddenly. He slumped into his seat and fell unconscious. His wife Jean later recalled that in a flash ‘he had fallen from a pinnacle of achievements into a state very near death’. She knew instantly that he’d had a stroke.

  For at least a year after his severe right-sided cerebral haemorrhage, Medawar was incapacitated. The head office of the Medical Research Council thought they should now replace him with a younger, fully fit leader for their prestigious institute. Many in the institute, including those close to Peter, agreed that this would be sensible. A young scientist, Liz Simpson, who had recently joined Medawar’s team as a vet, had already taken over some of the day-to-day running of Medawar’s projects, which continued to be important. They were testing, for example, whether or not giving drugs to suppress the immune system and aid transplantation success would have the side effect of allowing cancer to develop. Even Simpson thought it would be better for Medawar to step down from his headship. But Peter and Jean were both stubborn about the issue. Jean especially fought the Medical Research Council, and Peter was kept on as head of the institute for another two years, even though he was paralysed down his left side, with his useless arm in a sling and his left leg in a splint. Eventually, under pressure from the Medical Research Council, he did step down, and moved in 1972 to head a transplantation biology department in a new clinical research centre at Northwick Park Hospital.

  Even after two more debilitating strokes in the mid-1980s, it was clear that Medawar’s compulsion to work remained undimmed. In a 1984 interview for New Scientist magazine, he remarked: ‘I do nothing but work – ever . . . I’m not going to retire.’40 Doctors looking at Peter’s brain scan were astonished – they couldn’t understand how he was able to have any life at all, let alone write books and work at Northwick Park Hospital each weekday. One positive outcome of his illness was that he was more often available for discussion in his laboratory41 and became more accessible to his children at home.42 One of his four children, Charles Medawar, said to me in 2010 that he has far more memories of his father after his stroke than before it.43

  The eminent evolutionary scientist and writer Stephen Jay Gould remarked that Medawar ‘lived far longer and better with half a body than the vast majority of people could ever hope to survive with all systems functional’.44 And indeed, following his first stroke, Peter and Jean had eighteen productive years together, including the publication of two books as co-authors. Liz Simpson, who helped run things at the clinical research centre while Peter was ill, recalled that ‘even 10 per cent of his mind was better than 100 per cent of most other people’s’.45

  Medawar died on 2 October 1987. His obituary in Nature, written by his protégé Avrion Mitchison, called him ‘the most distinguished British biologist of his generation’.46 To this day, Mitchison, a major scientific figure in his own right, lights up at the mention of Medawar, referring to him as ‘magical’.47 The primary importance of Medawar’s scientific work is a given, but it is these testimonies and many others like them, as well as the stream of books he published, that sealed the legend of Medawar. Oxford University and University College London have buildings named after Medawar. C. P. Snow, the novelist and physicist, proclaimed that ‘If he [Medawar] had designed the world, it would be a better place.’48

  Medawar – with his co-workers Billingham and Brent – had made the glorious discovery that transplantation tolerance could be achieved for any cells present in the foetus stage of development – so-called ‘acquired tolerance’. Drawing on Gorer’s research, they also knew that a genetic component was important in controlling transplantation compatibility. But they did not have a clear idea about what our compatibility genes really did. All that was apparent was that they were important for transplantation and that somehow transplant rejection was linked to the immune system. Towards the end of Medawar’s life, a deeper understanding of compatibility genes in our immune system was emerging, but he died one week before another trio of scientists, this time at Harvard University, published an atomic-scale picture that vividly revealed how our compatibility genes work. Medawar would have loved it.

  The day before his first stroke, Medawar ended his lecture with a quotation from the seventeenth-century philosopher Thomas Hobbes. Hobbes’s writing struck a chord with Medawar in proclaiming that life is like a race and the most important thing is to be in it, to be fully engaged, ambitious and go-getting, to improve the world. Eighteen years later, that same quotation, ‘There can be no contentment but in proceeding’, was engraved on his headstone.49 Jean died in 2005 and is buried next to him.

  Medawar could not have known the full impact of his work, reaching far beyond transplantation and immunology. Yet it has also become clear that many problems in medicine are not scientific; they are social, ethical and even economic. His son Charles established an organization, the Social Audit, which evolved into a significant force aiming to hold pharmaceutical companies to account. In its heyday in the 1990s, Charles’s web pages had a million visitors per year50 and brought attention to problems such as how some drugs were being marketed unnecessarily in the Global South.

  Billingham died in 2002, the last years of his life made miserable by Parkinson’s disease.51 Brent is the last surviving member of the ‘holy trinity’ – the only domino still standing, as he puts it.52 In his mid-eighties, he still actively pursues transplantation research, working within a large European consortium of labs looking at new ways to suppress immune responses in kidney transplantation, something that remains a considerable issue today: about 85 per cent of people in the UK needing an organ transplant are waiting for a kidney.53 Brent had started his long career by performing experiments that led to a Nobel Prize for his PhD supervisor, a prize Medawar shared with the Australian Macfarlane Burnet, who developed theories independently that ended up being vindicated by the holy trinity’s experiments. It’s to Burnet that we need to turn next. From the other side of the world, his ideas deepened our understanding of the holy trinity’s experiments and gave a new answer to why we are ever so slightly and ever so importantly different from each other.

  2

  Self / Non-self

  It is well established that electrons and protons whirl in every atom; packs of atoms assemble in every molecule; societies of molecules create cells; and your body is a metropolis of cells. So, are we all essentially the same? No. Medawar’s story of graft rejection showed that my body can tell my cells apart from yours. Recall that his patients’ bodies could only accept skin grafted from elsewhere on their own bodies; skin taken from the bodies of others, even relatives, was rejected. How can this be? What molecular substance gives each of us our individuality and how could our bodies distinguish it? And this is where Frank Macfarlane Burnet moves things forward – by asking: how does our body know its tissues and cells as its own? Or, put another way, how does the human body discriminate self from non-self?

  Burnet was an introvert; ‘a fairly humourless dry old stick who wouldn’t let his hair down – the opposite of Medawar’, Leslie Brent recalls.1 But he is also one of the greatest thinkers there has ever been in human biology. In 1937, aged thirty-eight, Burnet formulated the idea that discriminating between what’s you and what’s not you is the immune system’s raison d’être, that recognizing and destroying substances that are non-self is precisely what the immune system must do. And from this Burnet realized that the problem of how our body recognizes disease is part and parcel of understanding how our body knows its own cells and tissues.

  This huge step forward in understanding how our immune system works descends directly from the simple fact that disease can be caused by germs. Beyond its obvious practical importance, knowledge of germs helped us to understand that disease is caused by something outside of us, something non-self. Although we all now know that germs cause disease, this fact took millennia to establish. Indeed, the history of how humans have struggled to understand disease is important in illustrating how revolutionary Burnet’s ideas really were.

  The Greek philosopher and physician Hippocrates, born around 460 BCE, is considered the first to have suggested that disease is not a direct act of God, or an outcome from some superstitious belief, but that instead it has a natural cause. Greek physicians, and later the Romans, took as fact that disease came about from an excess or deficiency of one of four ‘humours’ – black bile, yellow bile, phlegm and blood – each of which had to be present at the right levels for us to be healthy. This view endured, essentially unchanged, for two millennia.2

  A description of disease is not mere semantics: past misunderstandings have brought out the worst in human behaviour. When the Black Death arrived in Europe in 1347, a true understanding of disease was still centuries away, and the beliefs of the age had grave consequences. Estimates put deaths caused by the plague at anywhere between 75 and 200 million, slashing Europe’s population by at least a third, and possibly half. It would return in waves – though never again to such catastrophic effect – for the next 400 years. Inevitably, crowded cities were worst hit: half the populations of Paris and London perished. Chroniclers of the time said the living were scarcely able to bury the dead; that the devastation seemed more final than Noah’s flood.3 Doctors had only opinions, not facts, to explain what was going on. Most people believed that humanity was being punished by God, while astrologers asserted that the horror was caused by an alignment of the planets Mars, Saturn and Jupiter (even though this could not explain why only some people succumbed to the plague).

  A belief that the plague was caused by sins against God twisted into a desire to kill the enemies of Christ. One common belief was that the Black Death was spread by Jews and other non-Christians. Jews were accused of poisoning water wells in an attack against Christianity, and often confessed to this under torture. In vengeance, thousands were murdered in cities across France, Austria and Germany. The sentiment helped seed the following century’s Spanish Inquisition. A lack of understanding about the nature of disease played a role in allowing European leaders to force religious conversion and burn people at the stake. The painful irony is that a contemporary understanding of disease reveals that human genetic variation is central to our immune defences.

  A modern view of disease begins in the nineteenth century, the giants of the era being Charles Darwin and the French microbiologist Louis Pasteur. The two legends never met face-to-face, alas, though it would have been possible. Today, Pasteur gets his name onto almost every packet of cheese, while Darwin is revered, sometimes cursed, for his supposed slaying of God. Pasteur first showed that living cells were essential for making wine and then that a similar budding and multiplying of cells occurred in soured milk. At the time, it was hotly disputed whether fermentation was some kind of mechanical breakdown of chemicals or a biological process. Pasteur clarified that minuscule living organisms, unseen by the human eye, were at the heart of these phenomena. But his brilliance was in realizing that we, too, must be exposed to this new-found world of invisible organisms. Since unseen microbes can cause dramatic changes to the nature of things – as in fermentation – he postulated that these unseen microbes might also underlie human disease. Many thought this a ridiculous idea: how could something too small to be seen kill something so much larger and more powerful than us?

  Pasteur’s ideas about microscopic organisms highlighted a major problem: at the time, nobody knew where minuscule living organisms came from. Could minute life-forms arise from spontaneous chemical reactions when milk goes sour, or when maggots appear in rotting meat, or does life really only ever arise from pre-existing life? For the prestigious French Académie des Sciences, this was the most pressing issue of the day. Pasteur settled the debate with an ingeniously simple experiment.

  He took a glass flask and shaped its neck into a thin tube bent into an S-shaped curve. To this so-called swan-neck flask he added a clear broth, similar to a soup base, which had been heated to kill off all living things. Although the broth was exposed to the air through the S-shaped neck, nothing would grow in the liquid – microbes and dust particles from the air would collect in the curve of the flask’s neck and not reach the broth. But after Pasteur then broke off the curved neck, the broth would turn cloudy – things now started to grow. Microbes had fallen into the broth from dust in the air. So, life does not spontaneously arise in the broth, it falls in from the air. But another, more subtle implication was that minute organisms are all around us.

  That such minute organisms can cause disease in humans was finally established in 1876 by the German scientist and medical doctor Robert Koch, son of a mining engineer. Koch set up a makeshift laboratory in his four-room flat while working as the district medical officer in Wollstein, Western Poland, isolated from libraries and other scientists and without financial support for research, simply using equipment he purchased himself – apart from his microscope, which was a present from his wife. By day he saw his medical patients; out of hours he worked on mice, infecting them with anthrax bacteria that he obtained from the spleens of dead farm animals.

  It was already known that organs or blood from an infected animal could pass on the disease. But one of Koch’s brilliant experiments was to culture some of the rod-shaped anthrax bacteria in the fluid from an ox’s eye, and to demonstrate that these cultured, isolated bacteria could still give mice the disease. In this way Koch established, once and for all, that bacteria can cause disease. In fact, we now know there are about 5 × 10³² bacteria on earth. It no longer seems ridiculous that minuscule unseen germs could harm us. Now the more astonishing thing is that our immune system is, more often than not, actually able to protect us.

  Koch’s and Pasteur’s discoveries complement each other perfectly but personally they were arch-enemies. For much of their careers, they fired off at each other vicious patriotic claims for their own discoveries, mirroring the Franco-German political disputes of the time.4 Koch, younger by twenty years, suggested that Pasteur could not obtain microbes as pure as he could, and that Pasteur’s experiments were usually meaningless. At a meeting in Geneva in 1882, Pasteur, by then aged sixty, directed a barbed observation at Koch, who was seated in the front row. Describing his latest experiments with chicken cholera, which showed that the disease-causing bacteria could be attenuated and used as a vaccine, Pasteur then noted, ‘However blazingly clear the demonstrated truth, it has not always had the privilege of being easily accepted.’ Just to make absolutely clear who he was talking about, he continued: ‘Dr Koch, who finds nothing remarkable in this experiment . . . does not believe that I operated as I said I did, with eighty chickens . . . because that would have cost too much money.’ Sitting with his students, Koch listened unmoved to Pasteur’s nationalist punchline: ‘But in view of establishing this great fact . . . my government allowed me not to worry about the expense.’5

  The following year, an editorial in the Boston Medical and Surgical Journal reflected on the debacle with a timeless wisdom that can be transposed to any number of disputes:

  It is to be regretted that abstract questions of scientific truth or error cannot be divorced from the personalities of discoverers and wrangling over priority, and that such anger should possess celestial minds. The expanse of the unknown is broad enough for all voyagers to pursue their way without collision.6

  But perhaps these words are naive. Pioneers in science, or anything else, must be strong-willed enough to travel in a new direction and thick-skinned enough to withstand criticism from guardians of the prevailing dogma. A level of inner confidence that gets very close to arrogance is often of benefit to any trailblazer; self-belief is as critical as talent.

  To relate to this kind of almost stereotyped conflict between scientists, it’s important to remember that, while artists can delight in their output being uniquely their own, scientists never really produce anything unique. They can only be first in uncovering information that otherwise would have been discovered by somebody else later. In the end both Pasteur and Koch, as well as many others, contributed to the discovery that germs cause disease. Koch won the Nobel Prize in 1905, but Pasteur had died six years before the first Nobel Prizes were awarded. Both have major institutes named after them today.

  The concept of germs is so deeply implanted in us today that it takes effort to appreciate that the idea that so small a thing could be so harmful was initially thought ridiculous. It had to be explicitly proven that disease was not caused by the wrath of evil spirits, or an imbalance of black bile, yellow bile, phlegm and blood, or a poisonous vapour from decaying matter (as in the so-called miasma theory of the Middle Ages). Distinct diseases do have different origins, but many are caused by minuscule microbes, and realizing this is undoubtedly one of the greatest triumphs of the second millennium.
