The Body, page 6
In 1945, the year that Alexander Fleming won the Nobel Prize, a typical case of pneumococcal pneumonia could be knocked out with forty thousand units of penicillin. Today, because of increased resistance, it can take more than twenty million units per day for many days to achieve the same result. Against some diseases, penicillin now has no effect at all. In consequence, the death rate for infectious diseases has been climbing and is back to the level of about forty years ago.
Bacteria really are not to be trifled with. They not only have grown steadily more resistant but have evolved into a fearsome new class of pathogen commonly known, with scarcely a hint of hyperbole, as superbugs. Staphylococcus aureus is a microbe found commonly on human skin and in nostrils. Generally it does no harm, but it is an opportunist, and when the immune system is weakened, it can slip in and wreak havoc. By the 1950s, it had evolved resistance to penicillin, but luckily another antibiotic called methicillin had become available and it stopped S. aureus infections in their tracks. But just two years after methicillin’s introduction, two people at the Royal Surrey County Hospital in Guildford, near London, developed S. aureus infections that would not respond to methicillin. S. aureus had, almost overnight, evolved a new drug-resistant form. The new strain was dubbed methicillin-resistant Staphylococcus aureus, or MRSA. Within two years, it had spread to mainland Europe. Soon after that, it leaped to the United States.
Today, MRSA and its cousins kill an estimated 700,000 people around the world annually. Until recently a drug called vancomycin was effective against MRSA, but resistance to it has now begun to emerge. At the same time, we are facing the formidable-sounding carbapenem-resistant Enterobacteriaceae (CRE) infections, which are immune to virtually everything we can throw at them. CRE kills about half of all those it sickens. Luckily, so far, it doesn’t usually infect healthy people. But watch out if it does.
Yet as the problem has grown, the pharmaceutical industry has retreated from trying to create new antibiotics. “It’s just too expensive for them,” Kinch says. “In the 1950s, for the equivalent of a billion dollars in today’s money, you could develop about ninety drugs. Today, for the same money, you can develop on average just one-third of a drug. Pharmaceutical patents last only for twenty years, but that includes the period of clinical trials. Manufacturers usually have just five years of exclusive patent protection.” In consequence, all but two of the eighteen largest pharmaceutical companies in the world have given up the search for new antibiotics. People take antibiotics for only a week or two. Much better to focus on drugs like statins or antidepressants that people can take more or less indefinitely. “No sane company will develop the next antibiotic,” Kinch says.
The problem needn’t be hopeless, but it does need to be addressed. At the current rate of spread, antimicrobial resistance is forecast to lead to ten million preventable deaths a year—that’s more people than die of cancer now—within thirty years, at a cost of perhaps $100 trillion in today’s money.
What nearly everyone agrees on is that we need a more targeted approach. One interesting possibility would be to disrupt bacteria’s lines of communication. Bacteria never mount an attack until they have assembled sufficient numbers—what is known as a quorum—to make it worthwhile to do so. The idea would be to produce quorum-sensing drugs that wouldn’t kill all bacteria but would just keep their numbers permanently below the threshold, the quorum, that triggers an attack.
Another possibility is to enlist bacteriophages, a kind of virus, to hunt down and kill harmful bacteria for us. Bacteriophages—often shortened to just phages—are not well known to most of us, but they are the most abundant bioparticles on Earth. Virtually every surface on the planet, including us, is covered in them. They do one thing supremely well: each one targets a particular bacterium. That means clinicians would have to identify the offending pathogen and select the right phage to kill it, a more costly and time-consuming process, but it would make it much harder for bacteria to evolve resistance.
What is certain is that something must be done. “We tend to refer to the antibiotics crisis as a looming one,” Kinch says, “but it is not that at all. It’s a current crisis. As my son showed, these problems are with us now—and it is going to get much worse.”
Or as a doctor put it to me, “We are looking at a possibility where we can’t do hip replacements or other routine procedures because the risk of infection is too high.”
The day when people die once again from the scratch of a rose thorn may not be far away.
*1 According to Dr. Anna Machin of Oxford University, something you are doing when you are kissing another person is sampling his or her histocompatibility genes, which are involved in immune response. Though it may not be the matter uppermost on your mind at that moment, you are essentially testing whether the other person would make a good mate from an immunological perspective.
*2 For the record: GTGCCAGCAGCCGCGGTAATTCAGCTCCAATAGCGTATATTAAAGTTGCTGCAGTTAAAAAG.
*3 Koch’s discoveries are of course extremely well known, and he is justly celebrated for them. What is often overlooked, however, is what a difference small, incidental contributions can make to scientific progress, and nowhere was that better illustrated than in Koch’s own productive lab. Culturing lots and lots of different bacterial samples took up a great deal of lab space and raised the constant risk of cross-contamination. But luckily Koch had a lab assistant named Julius Richard Petri who devised the shallow dish with a protective lid that bears his name. Petri dishes took up very little space, provided a sterile and uniform environment, and effectively eliminated the risk of cross-contamination. But there was still a need for a growing medium. Various gelatins were tried, but all proved unsatisfactory. Then Fanny Hesse, the American-born wife of another junior researcher, suggested that they try agar. Fanny had learned from her grandmother to use agar to make jellies because it didn’t melt in the heat of an American summer. Agar worked perfectly for lab purposes, too. Without these two developments, Koch might have taken years longer, or possibly never succeeded, in making his breakthroughs.
4 THE BRAIN
The brain is wider than the sky,
For, put them side by side,
The one the other will include
With ease, and you beside.
—EMILY DICKINSON
THE MOST EXTRAORDINARY thing in the universe is inside your head. You could travel through every inch of outer space and very possibly nowhere find anything as marvelous and complex and high functioning as the three pounds of spongy mass between your ears.
For an object of pure wonder, the human brain is extraordinarily unprepossessing. It is, for one thing, 75 to 80 percent water, with the rest split mostly between fat and protein. Pretty amazing that three such mundane substances can come together in a way that allows us thought and memory and vision and aesthetic appreciation and all the rest. If you were to lift your brain out of your skull, you would almost certainly be surprised at how soft it is. The consistency of the brain has been variously likened to tofu, soft butter, or a slightly overcooked Jell-O pudding.
The great paradox of the brain is that everything you know about the world is provided to you by an organ that has itself never seen that world. The brain exists in silence and darkness, like a dungeoned prisoner. It has no pain receptors, literally no feelings. It has never felt warm sunshine or a soft breeze. To your brain, the world is just a stream of electrical pulses, like taps of Morse code. And out of this bare and neutral information it creates for you—quite literally creates—a vibrant, three-dimensional, sensually engaging universe. Your brain is you. Everything else is just plumbing and scaffolding.
Just sitting quietly, doing nothing at all, your brain churns through more information in thirty seconds than the Hubble Space Telescope has processed in thirty years. A morsel of cortex one cubic millimeter in size—about the size of a grain of sand—could hold two thousand terabytes of information, enough to store all the movies ever made, trailers included, or about 1.2 billion copies of this book. Altogether, the human brain is estimated to hold something on the order of two hundred exabytes of information, roughly equal to “the entire digital content of today’s world,” according to Nature Neuroscience.*1 If that is not the most extraordinary thing in the universe, then we certainly have some wonders yet to find.
* * *
—
The brain is often depicted as a hungry organ. It makes up just 2 percent of our body weight but uses 20 percent of our energy. In newborn infants, it’s no less than 65 percent. That’s partly why babies sleep all the time—their growing brains exhaust them—and have a lot of body fat, to use as an energy reserve when needed. Your muscles actually use even more of your energy, about a quarter, but you have a lot of muscle; per unit of matter, the brain is by far the most expensive of our organs. But it is also marvelously efficient. Your brain requires only about four hundred calories of energy a day—about the same as you get in a blueberry muffin. Try running your laptop for twenty-four hours on a muffin and see how far you get.
Unlike other parts of the body, the brain burns its four hundred calories at a steady rate no matter what you are doing. Hard thinking doesn’t help you slim. In fact, it doesn’t seem to confer any benefit at all. An academic at the University of California at Irvine named Richard Haier used positron emission tomography scanners to find that the hardest-working brains are usually the least productive. The most efficient brains, he found, were those that could solve a task quickly and then go into a kind of standby mode.
For all its powers, nothing about your brain is distinctively human. We use exactly the same components—neurons, axons, ganglia, and so on—as a dog or hamster. Whales and elephants have much larger brains than we have, though of course they also have much larger bodies. But even a mouse scaled up to the size of a human would have a brain just as big, and many birds would do even better. It also turns out that the human brain is a little less imposing than we had long assumed. For years, it was written that it has 100 billion nerve cells, or neurons, but a careful assessment by the Brazilian neuroscientist Suzana Herculano-Houzel in 2015 found that the number is more like 86 billion—a pretty substantial demotion.
Neurons are not like other cells, which are typically compact and spherical. Neurons are long and stringy, the better to pass on electrical signals from one to another. The main strand of a neuron is called an axon; branching from the cell body are tree-like extensions called dendrites, as many as 400,000 of them, which receive signals from other neurons. The tiny gap between nerve cell endings is called a synapse. Each neuron connects with thousands of other neurons, giving trillions and trillions of connections—as many connections “in a single cubic centimeter of brain tissue as there are stars in the Milky Way,” to quote the neuroscientist David Eagleman. It is in all that complex synaptic entanglement that our intelligence lies, not in the number of neurons, as was once thought.
What is surely most curious and extraordinary about our brain is how largely unnecessary it is. To survive on Earth, you don’t need to be able to write music or engage in philosophy—you really only need to be able to outthink a quadruped—so why have we invested so much energy and risk in producing mental capacity that we don’t really need? That is just one of the many things about your brain that your brain won’t tell you.
* * *
—
As the most complex of our organs, the brain not surprisingly has more named features and landmarks than any other part of the body, but essentially it divides into three sections. At the top, literally and figuratively, is the cerebrum, which fills most of the cranial vault and is the part that we normally think of when we think of “the brain.” The cerebrum (from the Latin word for “brain”) is the seat of all our higher functions. It is divided into two hemispheres, each of which is principally concerned with one side of the body, but for reasons unknown the wiring is crossed, so that the right side of the cerebrum controls the left side of the body and vice versa. The two hemispheres are connected by a band of fibers called the corpus callosum (meaning “tough material” or literally “calloused body” in Latin). The brain is wrinkled by deep fissures known as sulci and ridges called gyri, which give it more surface area. The exact pattern of grooves and ridges in brains is distinctive to each individual—as distinctive as your fingerprints—but whether it has anything to do with your intelligence or temperament or anything else that defines you is unknown.
Each hemisphere of the cerebrum is further divided into four lobes: frontal, parietal, occipital, and temporal—each broadly specializing in certain functions. The parietal lobe manages sensory inputs like touch and temperature. The occipital lobe processes visual information, and the temporal lobe principally manages auditory information, though it also helps with processing visual information. It has been known for some years that six patches on the temporal lobe, known as face patches, become excited when we look at another face, though which parts of my face excite which of your patches is still largely uncertain, it seems. The frontal lobe is the seat of the higher functions of the brain—reasoning, forethought, problem solving, emotional control, and so on. It is the part responsible for personality, for who we are. Ironically, as Oliver Sacks once noted, the frontal lobes were the last parts of the brain to be deciphered. “Even in my own medical student days, they were called ‘the silent lobes,’ ” he wrote in 2001. That’s not because they were thought to lack functions but because those functions do not reveal themselves.
Beneath the cerebrum, at the very back of the head about where it meets the nape of the neck, is the cerebellum (Latin for “little brain”). Although the cerebellum occupies just 10 percent of the cranial cavity, it has more than half the brain’s neurons. It has a lot of neurons not because it does a great deal of thinking but because it controls balance and complex movements, and that requires an abundance of wiring.
At the base of the brain, descending from it rather like an elevator shaft connecting the brain to the spine and the body beyond, is the oldest part of the brain, the brain stem. It is the home of our more basic operations: sleeping, breathing, keeping the heart going. The brain stem doesn’t get a lot of attention in the popular consciousness, but it is so central to our existence that “brain-stem death” is the fundamental measure of deadness in humans in the United Kingdom.
Scattered through the brain rather like nuts in a fruitcake are many smaller structures—hypothalamus, amygdala, hippocampus, telencephalon, septum pellucidum, habenular commissure, entorhinal cortex, and a dozen or so others—which are collectively known as the limbic system (from the Latin limbus, meaning “peripheral”). It’s easy to go a lifetime without hearing a word about any of these components unless they go wrong. The basal ganglia, for instance, play an important part in movement, language, and thought, but it is only when they degenerate and lead to Parkinson’s disease that they normally attract attention to themselves.
Despite their obscurity and modest dimensions, the structures of the limbic system have a fundamental role in our happiness by controlling and regulating basic processes like memory, appetite, emotions, drowsiness and alertness, and the processing of sensory information. The concept of the limbic system was invented in 1952 by an American neuroscientist, Paul D. MacLean. Not all of today’s neuroscientists agree that the components form a coherent system. Many think they are just lots of disparate parts connected only by the fact that they are concerned with bodily performance rather than with thinking.
The most important component of the limbic system is a little powerhouse called the hypothalamus, which isn’t really a structure at all but just a bundle of neural cells. The name describes not what it does but where it is: under the thalamus. (The thalamus, meaning “inner chamber,” is a kind of relay station for sensory information and is an important part of the brain—there isn’t any part of the brain that isn’t important, obviously—but is not a component of the limbic system.) The hypothalamus is curiously unimposing. Though only about the size of a peanut and weighing barely a tenth of an ounce, it controls much of the most important chemistry of the body. It regulates sexual function, controls hunger and thirst, monitors blood sugar and salts, decides when you need to sleep. It may even play a part in how slowly or rapidly we age. A large measure of your success or failure as a human being is dependent on this tiny thing in the middle of your head.
The hippocampus is central to the laying down of memories. (The name comes from the Greek for “sea horse” because of its supposed resemblance to that creature.) The amygdala (Greek for “almond”) specializes in handling intense and stressful emotions—fear, anger, anxiety, phobias of all types. People whose amygdalae are destroyed are left literally fearless, and often cannot even recognize fear in others. The amygdala grows particularly lively when we are asleep, and thus may account for why our dreams are so often disturbing. Your nightmares may simply be the amygdalae unburdening themselves.*2
* * *
—
Considering how exhaustively the brain has been studied, and for how long, it is remarkable how much elemental stuff we still don’t know or at least can’t universally agree upon. Like what exactly is consciousness? Or what precisely is a thought? It is not something you can capture in a jar or smear on a microscope slide, and yet a thought is clearly a real and definite thing. Thinking is our most vital and miraculous talent, yet in a profound physiological sense we don’t really know what thinking is.
Much the same could be said of memory. We know a good deal about how memories are assembled and how and where they are stored, but not why we keep some and not others. It clearly has little to do with actual value or utility. I can remember the entire starting lineup of the 1964 St. Louis Cardinals baseball team—something that has been of no importance to me since 1964 and wasn’t actually very useful then—and yet I cannot recollect the number of my own cell phone, or where I parked my car in any large parking lot, or what was the third of three things my wife told me to get at the supermarket, or any of a great many other things that are unquestionably more urgent and necessary than remembering the starting players for the 1964 Cardinals (who were, incidentally, Tim McCarver, Bill White, Julian Javier, Dick Groat, Ken Boyer, Lou Brock, Curt Flood, and Mike Shannon).