  Candida albicans, the fungus behind thrush, was until the 1950s found only in the mouth and genitals, but it now sometimes invades the deeper body, where it can grow on the heart and other organs, like mold on fruit. Similarly, Cryptococcus gattii was for decades known to exist in British Columbia in Canada, mostly on trees or in the soil around them, and it never harmed a human. Then, in 1999, it developed a sudden virulence, causing serious lung and brain infections among a scattering of victims in western Canada and the United States. Exact figures are impossible to come by because the disease is often misdiagnosed and, remarkably, is not reportable in California, one of the main sites of occurrence, but something over three hundred cases in western North America have been confirmed since 1999, with about a third of victims dying.

  Rather better reported are the figures for coccidioidomycosis, more commonly known as valley fever. It occurs almost entirely in California, Arizona, and Nevada, infecting about ten thousand to fifteen thousand people a year and killing about two hundred, though the actual numbers are probably higher because the disease can be mistaken for pneumonia. The fungus is found in soils, and the number of cases rises whenever soils are disturbed, as by earthquakes and dust storms. Altogether, fungi are thought to be responsible for about a million deaths globally every year, so they are hardly inconsequential.

  Finally, protists. A protist is anything that isn’t obviously plant, animal, or fungus; it is a category reserved for all those life-forms that don’t fit anywhere else. Originally, in the nineteenth century, all single-celled organisms were called protozoa. It was assumed that they were all closely related, but over time it became evident that bacteria and archaea constitute separate kingdoms. Protists form a huge category that includes amoebas, paramecia, diatoms, slime molds, and many others that are mostly obscure to all but people working in biological fields. From a human health perspective, the most notable protists are those of the genus Plasmodium. They are the evil little creatures that transfer from mosquitoes into us and give us malaria. Protists are also responsible for toxoplasmosis, giardiasis, and cryptosporidiosis.

  * * *


  There is, in short, an astounding array of microbes all around us, and we have barely begun to understand their effects on us, for good and ill. A most arresting illustration of that arose in 1992 in the old mill town of Bradford, West Yorkshire, in the north of England, when Timothy Rowbotham, a government microbiologist, was sent to track down the source of an outbreak of pneumonia. In a sample of water he took from a cooling tower, he found a microbe unlike anything he or anyone else had ever seen before. He tentatively identified it as a new bacterium, not because it was particularly bacterial in nature but because it couldn’t be anything else. He dubbed it the Bradford coccus, for want of a better term. Though he had no idea of it, Rowbotham had just changed the world of microbiology.

  Rowbotham saved the samples in a freezer for six years before sending them on to colleagues when he took early retirement. Eventually, they came into the hands of Richard Birtles, an English biochemist working in France. Birtles realized that the Bradford coccus was not a bacterium but a virus—but one that didn’t fit any definitions of what viruses should be. For a start, this one was massively bigger—by a factor of more than a hundred—than any virus previously known. Most viruses have only a dozen or so genes. This one had over a thousand. Viruses aren’t considered living things, but its genetic code contained a stretch of sixty-two letters that has been found in all living things since the dawn of creation, making it not only arguably alive but as ancient as anything else on Earth.*2

  Birtles named the new virus mimivirus, for “microbe-mimicking.” When Birtles and his colleagues wrote up their findings, they couldn’t at first find a journal that would publish them; the results were simply too bizarre. The cooling tower was knocked down in the late 1990s, and it appears that the only colony of this odd and ancient virus was lost with it.

  Since then, however, other colonies of even more enormous viruses have been found. In 2013, a team of researchers led by Jean-Michel Claverie of Aix-Marseille University in France (the institution to which Birtles was attached when he characterized mimivirus) found a new giant virus that they called pandoravirus, which contains no fewer than twenty-five hundred genes, 90 percent of which are found nowhere else in nature. They then found a third group, pithovirus, which is even bigger and at least as strange. Altogether, as of this writing, there are now five groups of giant viruses, all of which are not only different from everything else on Earth but also very different from one another. Such strange and foreign bioparticles, it has been argued, are evidence for the existence of a fourth domain of life, in addition to bacteria, archaea, and eukaryotes, the last of which include complex life like us. Where microbes are concerned, we are really just at the beginning.

  III

  WELL INTO THE modern age, the idea that something as small as a microorganism could cause us serious harm was thought self-evidently preposterous. When the German microbiologist Robert Koch reported in 1884 that cholera was wholly caused by a bacillus (a rod-shaped bacterium), an eminent but skeptical colleague named Max von Pettenkofer was so vehemently offended by the thought that he made a great show of swallowing a vial of the bacilli to prove Koch wrong. This would be a much better anecdote if Pettenkofer had thereupon fallen gravely ill and recanted his ill-founded objections, but in fact he didn’t become ill at all. Sometimes that happens. It is now believed that Pettenkofer had suffered from cholera earlier in his life and enjoyed some residual immunity. What is less well publicized is that two of his students also drank cholera extract and both grew very ill. At all events, the episode served to delay even further general acceptance of the germ theory, as it was known. In a sense, it didn’t matter terribly much what caused cholera or many other common maladies, because there weren’t any treatments for them anyway.*3

  Before penicillin, the closest thing to a wonder drug was Salvarsan, developed by the German immunologist Paul Ehrlich in 1910, but Salvarsan was effective against only a few things, principally syphilis, and had a lot of drawbacks. For a start, it was made from arsenic, and so was toxic, and treatment consisted of injecting roughly a pint of solution into the patient’s arm once a week for fifty weeks or more. If it wasn’t administered exactly right, fluid could seep into muscle, causing painful and sometimes serious side effects, including the need for amputation. Doctors who could administer it safely became celebrated. Ironically, one of the most highly regarded was Alexander Fleming.

  The story of Fleming’s accidental discovery of penicillin has been told many times, but hardly any two versions are quite the same. The first thorough account of the discovery was not published until 1944, a decade and a half after the events it describes, by which time details were already blurring, but as best can be established the story seems to be this: In 1928, while Alexander Fleming was away on holiday from his job as a medical researcher at St. Mary’s Hospital in London, some spores of mold from the genus Penicillium drifted into his lab and landed on a petri dish that he had left unattended. Thanks to a sequence of chance events—that Fleming hadn’t cleaned up his petri dishes before departing on holiday, that the weather was unusually cool that summer (and thus good for spores), and that Fleming remained away long enough for the slow-growing mold to act—he returned to find that bacterial growth in the petri dish had been conspicuously inhibited.

  It is often written that the type of fungus that landed on his dish was a rare one, making the discovery practically miraculous, but this appears to have been a journalistic invention. The mold was in fact Penicillium notatum (now called Penicillium chrysogenum), which is very common in London, so it was hardly extraordinary that a few spores should drift into his lab and settle on his agar. It has also become a commonplace that Fleming failed to exploit his discovery and that years passed before others finally converted his findings into a useful medicine. That is, at the very least, an ungenerous interpretation. First, Fleming deserves credit for perceiving the significance of the mold; a less alert scientist might simply have tossed the whole lot out. Moreover, he dutifully reported his discovery, and even noted its antibiotic implications, in a respected journal. He also made some effort to turn the discovery into a usable medicine, but it was a technically tricky proposition—as others would later discover—and he had more pressing research interests to pursue, so he didn’t stick with it. It is often overlooked that Fleming was already a distinguished and busy scientist. He had in 1922 discovered lysozyme, an antimicrobial enzyme found in saliva, mucus, and tears as part of the body’s first line of defense against invading pathogens, and was still preoccupied with exploring its properties. He was hardly foolish or slapdash, as is sometimes implied.

  In the early 1930s, researchers in Germany produced a group of antibacterial drugs known as sulfonamides, but they didn’t always work well and often had serious side effects. At Oxford, a team of biochemists led by the Australian-born Howard Florey began searching for a more effective alternative and in the process rediscovered Fleming’s penicillin paper. A principal member of the team was an eccentric German émigré named Ernst Chain, who bore an uncanny resemblance to Albert Einstein (right down to the bushy mustache) but had a far more challenging disposition. Chain had grown up in a wealthy Jewish family in Berlin but had decamped to England with the rise of Adolf Hitler. He was gifted in many fields and had considered a career as a concert pianist before settling on science. But he was also a difficult man, with a volatile temperament and slightly paranoid instincts, though it seems fair to say that if there was ever a time when a Jew might be excused paranoia it was the 1930s. He was an unlikely candidate for the work, too, in that he had a pathological fear of being poisoned in the lab. Despite his dread, he persevered and found to his astonishment that penicillin not only killed pathogens in mice but had no evident side effects. It appeared to be the perfect drug: one that could devastate its target without wreaking collateral damage. The problem, as Fleming had seen, was that it was very hard to produce penicillin in clinically useful quantities. Under Florey’s command, Oxford gave over a significant amount of resources and research space to growing mold and patiently extracting tiny amounts of penicillin from it.

  By early 1941, they had just enough to trial the drug on a policeman named Albert Alexander, who was a tragically ideal demonstration of how vulnerable humans were to infections before antibiotics. While pruning roses in his garden, Alexander had scratched his face on a thorn. The scratch had grown infected and spread. Alexander had lost an eye and now was delirious and close to death. The effect of penicillin was miraculous. Within two days, he was sitting up and looking almost back to normal. But supplies quickly ran short. In desperation the scientists filtered and reinjected all they could from Alexander’s urine, but after four days the supplies were exhausted. Poor Alexander relapsed and died.

  With Britain preoccupied by World War II and the United States not yet in it, the quest to produce bulk penicillin moved to a U.S. government research facility in Peoria, Illinois. Scientists and other interested parties all over the Allied world were secretly asked to send in soil and mold samples. Hundreds responded, but nothing they sent proved promising. Then, two years after testing had begun, a lab assistant in Peoria named Mary Hunt brought in a cantaloupe from a local grocery store. It had a “pretty golden mold” growing on it, she recalled later. That mold proved to be two hundred times more potent than anything previously tested. The name and location of the store where Mary Hunt shopped are now forgotten, and the historic cantaloupe itself was not preserved: after the mold was scraped off, it was cut into pieces and eaten by the staff. But the mold lived on. Every bit of penicillin made since that day is descended from that single random cantaloupe.

  Within a year, American pharmaceutical companies were producing 100 billion units of penicillin a month. The British discoverers found to their chagrin that the production methods had been patented by the Americans and that they were now required to pay royalties to make use of their own discovery.

  Alexander Fleming didn’t become famous as the father of penicillin until the closing days of the war, some seventeen years after his serendipitous discovery, but then he became very famous indeed. He received 189 honors of all types from around the world, and even had a crater on the moon named for him. In 1945, he shared the Nobel Prize in Physiology or Medicine with Ernst Chain and Howard Florey. Florey and Chain never enjoyed the popular acclaim they deserved, partly because they were much less gregarious than Fleming and partly because his story of accidental discovery made better copy than their story of dogged application. Chain, despite sharing the Nobel Prize, became convinced that Florey had not given him sufficient credit, and their friendship, such as it was, dissolved.

  As early as 1945, in his Nobel acceptance speech, Fleming warned that microbes could easily evolve resistance to antibiotics if they were carelessly used. Seldom has a Nobel speech been more prescient.

  IV

  THE GREAT VIRTUE of penicillin—that it scythes its way through all manner of bacteria—is also its elemental weakness. The more we expose microbes to antibiotics, the more opportunity they have to develop resistance. What you are left with after a course of antibiotics, after all, are the most resistant microbes. By attacking a broad spectrum of bacteria, you stimulate lots of defensive action. At the same time, you inflict unnecessary collateral damage. Antibiotics are about as nuanced as a hand grenade. They wipe out good microbes as well as bad. Increasing evidence shows that some of the good ones may never recover, to our permanent cost.

  Most people in the Western world, by the time they reach adulthood, have received between five and twenty courses of antibiotics. The effects, it is feared, may be cumulative, with each generation passing on fewer microorganisms than the one before. Few people are more aware of this than an American scientist named Michael Kinch. In 2012, when he was director of the Yale Center for Molecular Discovery in Connecticut, Kinch’s twelve-year-old son, Grant, developed severe abdominal pains.

  “He’d been at the first day of a summer camp and he’d eaten some cupcakes,” Kinch recalls, “so we thought at first it was just a combination of excitement and overindulgence, but the symptoms got worse.” Eventually, Grant ended up in Yale New Haven Hospital, where a number of alarming things happened quickly. It was found that he had a ruptured appendix and that his intestinal microbes had escaped into the abdomen, giving him peritonitis. Then the infection developed into septicemia, which meant it had spread to his blood and could go anywhere in his body. Dismayingly, four of the antibiotics Grant was given didn’t have any effect on the marauding bacteria.

  “That was really astounding,” Kinch recalls now. “This was a kid who had been on antibiotics just once in his life, for an ear infection, and yet he had gut bacteria that were resistant to antibiotics. That shouldn’t have happened.” Fortunately, two other antibiotics did work and Grant’s life was saved.

  “He was lucky,” Kinch says. “The day is fast approaching when the bacteria inside us may not be resistant to two-thirds of the antibiotics we hit them with, but to all of them. Then we really are in trouble.”

  Today Kinch is the director of the Center for Research Innovation in Business at Washington University in St. Louis. He works in a once derelict, now stylishly renovated telephone factory that is part of a neighborhood salvation project undertaken by the university. “This used to be the best place in St. Louis to score crack,” he says with a hint of ironic pride. A cheerful man of early middle years, Kinch was brought to Washington University to foster entrepreneurship, but one of his central passions remains the future of the pharmaceutical industry and where new antibiotics will come from. In 2016, he wrote an alarming book on the matter, A Prescription for Change: The Looming Crisis in Drug Development.

  “From the 1950s through the 1990s,” he says, “roughly three antibiotics were introduced into the U.S. every year. Today it’s roughly one new antibiotic every other year. The rate of antibiotic withdrawals—because they don’t work anymore or have become obsolete—is twice the rate of new introductions. The obvious consequence of this is that the arsenal of drugs we have to treat bacterial infections has been going down. There is no sign of it stopping.”

  What makes this much worse is that a great deal of our antibiotic use is simply crazy. Almost three-quarters of the forty million antibiotic prescriptions written each year in the United States are for conditions that cannot be cured with antibiotics. According to Jeffrey Linder, professor of medicine at Northwestern University, antibiotics are prescribed for 70 percent of acute bronchitis cases, even though guidelines explicitly state that they are of no use there.

  Even more appallingly, in the United States 80 percent of antibiotics are fed to farm animals, mostly to fatten them. Fruit growers can also use antibiotics to combat bacterial infections in their crops. In consequence, most Americans consume secondhand antibiotics in their food (including even some foods labeled as organic) without knowing it. Sweden banned the agricultural use of antibiotics in 1986. The European Union followed in 1999. In 1977, the Food and Drug Administration ordered a halt to the use of antibiotics for purposes of fattening farm animals, but backed off when there was an outcry from agricultural interests and the congressional leaders who supported them.

 

Add Fast Bookmark
Load Fast Bookmark
Turn Navi On
Turn Navi On
Turn Navi On
Scroll Up
Turn Navi On
Scroll
Turn Navi On
183