The Body, page 7
So there is a huge amount we have left to learn and many things we may never learn. But equally some of the things we do know are at least as amazing as the things we don’t. Consider how we see—or, to put it slightly more accurately, how the brain tells us what we see.
Just look around you now. The eyes send a hundred billion signals to the brain every second. But that’s only part of the story. When you “see” something, only about 10 percent of the information comes from the optic nerve. Other parts of your brain have to deconstruct the signals—recognize faces, interpret movements, identify danger. In other words, the biggest part of seeing isn’t receiving visual images; it’s making sense of them.
For each visual input, it takes a tiny but perceptible amount of time—about two hundred milliseconds, one-fifth of a second—for the information to travel along the optic nerves and into the brain to be processed and interpreted. One-fifth of a second is not a trivial span of time when a rapid response is required—to step back from an oncoming car, say, or to avoid a blow to the head. To help us deal better with this fractional lag, the brain does a truly extraordinary thing: it continuously forecasts what the world will be like a fifth of a second from now, and that is what it gives us as the present. That means that we never see the world as it is at this very instant, but rather as it will be a fraction of a moment in the future. We spend our whole lives, in other words, living in a world that doesn’t quite exist yet.
The brain tricks you in a lot of ways for your own good. Sound and light reach you at very different speeds—a phenomenon we experience every time we hear a plane passing overhead and look up to find the sound coming from one part of the sky and a plane moving silently through another. In the more immediate world around you, your brain normally irons out these differences, so that you sense all stimuli as reaching you simultaneously.
In a similar way, the brain manufactures all the components that make up our senses. It is a strange, nonintuitive fact of existence that photons of light have no color, sound waves no sound, olfactory molecules no odors. As James Le Fanu has put it, “While we have the overwhelming impression that the greenness of the trees and the blueness of the sky are streaming through our eyes as through an open window, yet the particles of light impacting on the retina are colourless, just as the waves of sound impacting on the eardrum are silent and scent molecules have no smell. They are all invisible, weightless, subatomic particles of matter travelling through space.” All the richness of life is created inside your head. What you see is not what is but what your brain tells you it is, and that’s not the same thing at all. Consider a bar of soap. Has it ever struck you that soap lather is always white no matter what color the soap is? That isn’t because the soap somehow changes color when it is moistened and rubbed. Molecularly, it’s exactly as it was before. It’s just that the foam reflects light in a different way. You get the same effect with crashing waves on a beach—greeny-blue water, white foam—and lots of other phenomena. That is because color isn’t a fixed reality but a perception.
You have probably at some time or other encountered one of those illusion tests that require you to stare for fifteen or twenty seconds at a red square, then shift your vision to a blank sheet of paper, and for a few moments you will see a ghostly square of greenish blue on the white paper. This “afterimage” is a consequence of tiring some of the photoreceptors in your eyes by making them work extra intently, but what is relevant is that the greenish-blue color is not there and has never existed anywhere but in your imagination. In a very real sense, that is true of all colors.
Your brain is also extraordinarily good at finding patterns and determining order in chaos, as these two well-known illusions show:
In the first illustration, most people see only random smudges until it is pointed out to them that the picture contains a dalmatian dog; then suddenly for nearly everyone the brain fills in the missing edges and makes sense of the whole composition. The illusion dates from the 1960s, but no one seems to have kept a record of who first created it.
The second illustration does have a known history. It is called a Kanizsa triangle, after the Italian psychologist Gaetano Kanizsa, who created it in 1955. There is of course no actual triangle in the picture, except for the one your brain puts there.
Your brain does all these things for you because it is designed to help you in every way it can. Yet paradoxically it is also strikingly unreliable. Some years ago, a psychologist at the University of California at Irvine, Elizabeth Loftus, discovered that it is possible through suggestion to implant entirely false memories in people’s heads—to convince them that they were traumatically lost in a department store or shopping mall when they were small or that they were hugged by Bugs Bunny at Disneyland—even though these things never happened. (Bugs Bunny is not a Disney character and has never been at Disneyland.) She could show many people pictures of themselves as a child in which the image had been manipulated to make them look as if they were in a hot-air balloon, and often the subjects would suddenly remember the experience and excitedly describe it, even though in each case it was known that it had never happened.
Now, you might think that you could never be that suggestible, and you would probably be right—only about one-third of people are that gullible—but other evidence shows that we all sometimes completely misrecall even the most vivid events. In 2001, immediately after the 9/11 disaster at the World Trade Center in New York, psychologists at the University of Illinois took detailed statements from seven hundred people about where they were and what they were doing when they learned of the event. One year later, the psychologists asked the same question of the same people and found that nearly half now contradicted themselves in some significant way—put themselves in a different place when they learned of the disaster, believed that they had seen it on TV when in fact they had heard it on the radio, and so on—but without being aware that their recollections had changed. (I, for my part, vividly recall watching the events live on television in New Hampshire, where we were then living, with two of my children, only to learn later that one of those children was in fact in England at the time.)*3
Memory storage is idiosyncratic and strangely disjointed. The mind breaks each memory into its component parts—names, faces, locations, contexts, how a thing feels to the touch, even whether it is living or dead—and sends the parts to different places, then calls them back and reassembles them when the whole is needed again. A single fleeting thought or recollection can fire up a million or more neurons scattered across the brain. Moreover, these fragments of memory move around over time, migrating from one part of the cortex to another, for reasons entirely unknown. It’s no wonder we get details muddled.
The upshot is that memory is not a fixed and permanent record, like a document in a filing cabinet. It is something much more hazy and mutable. As Elizabeth Loftus told an interviewer in 2013, “It’s a little more like a Wikipedia page. You can go in there and change it, and so can other people.”
* * *
—
Memories are categorized in many different ways, and no two authorities seem to use quite the same terminology. The most frequently cited divisions are long-term, short-term, and working (for duration) and procedural, conceptual, semantic, declarative, implicit, autobiographical, and sensory (for type). Fundamentally, however, memories come in two principal varieties: declarative and procedural. Declarative memory is the kind you can put into words—the names of state capitals, your date of birth, how to spell “ophthalmologist,” and everything else you know as fact. Procedural memory describes the things you know and understand but couldn’t so easily put into words—how to swim, drive a car, peel an orange, identify colors.
Working memory is where short-term and long-term memories combine. Say you are presented with a mathematical problem to solve. The problem resides in short-term memory—you won’t, after all, need to remember the problem months from now—but the skills necessary to make the computation are kept in long-term memory.
Researchers also sometimes find it useful to distinguish between recall memory, which is what you can remember spontaneously—the kinds of things you know when you do a general knowledge quiz—and recognition memory, which is where you are a bit hazy on the substance but can recall the context. Recognition memory explains why so many of us struggle to remember the contents of a book but can often recall where we read the book, the color or design of the cover, and other seeming irrelevancies. Recognition memory is actually useful because it doesn’t clutter the brain with unnecessary details but does help us to remember where we can find those details if we should need them again.
Short-term memory is really short—no more than half a minute or so for things like addresses and phone numbers. (If you can still remember something after half a minute, it is no longer technically a short-term memory. It’s long term.) Most people’s short-term memory is pretty abysmal. Six random words or digits is about all that most of us can reliably retain for more than a few moments.
On the other hand, with effort we can train our memories to perform the most extraordinary stunts. Every year the United States has a national memory championship, and the feats performed there are truly astounding. One memory champion could recall 4,140 random digits after looking at them for just thirty minutes. Another was able to remember twenty-seven randomly shuffled decks of cards in the same time period. Yet another could recall a single deck of cards after thirty-two seconds of study. That may not be the most worthwhile use of the human mind, but it is certainly a demonstration of its incredible powers and versatility. Most of the memory champions, by the way, are not spectacularly intelligent. They are just motivated enough to train their memories to do some extraordinary tricks.
It used to be thought that every experience is stored permanently as memory somewhere in the brain but that most of it is locked away beyond our power of immediate recall. The idea arose principally from a series of experiments in Canada from the 1930s to the 1950s by the neurosurgeon Wilder Penfield. While carrying out surgical procedures at the Montreal Neurological Institute, Penfield discovered that when he touched a probe to patients’ brains, it often evoked powerful sensations—vivid smells from childhood, feelings of euphoria, sometimes a recollection of a forgotten scene from very early life. From this it was concluded that the brain records and stores every conscious event in our lives, however trivial. Now, however, it is thought that the stimulation was mostly providing the sensation of memory and that what the patients were experiencing was more like a hallucination than a recalled event.
What is certainly true is that we retain a great deal more than we can easily summon to mind. You may not recollect much of a neighborhood you lived in when you were small, but if you went back and walked around it, you would almost certainly remember very particular details you hadn’t thought about for years. With sufficient time and prompting, we would probably all be astonished at how much we have stored away inside us.
The person from whom we learned a good deal of what we know about memory was, ironically, a man who had very little of it himself. Henry Molaison was an amiable and good-looking young man of twenty-seven in Connecticut who suffered from crippling episodes of epilepsy. In 1953, inspired by the efforts of Wilder Penfield in Canada, a surgeon named William Scoville drilled into Molaison’s head and removed half of the hippocampus from each side of his brain and most of the amygdalae. The procedure greatly reduced Molaison’s seizures (though it didn’t entirely eliminate them) but at the tragic cost of robbing him of the ability to form new memories—a condition known as anterograde amnesia. Molaison could recall events from his distant past but had almost no capacity to form new memories. Someone who left the room would be immediately forgotten. Even a psychiatrist who saw him almost daily for years was a new person to him each time she came through the door. Molaison always recognized himself in the mirror but was often astounded at how old he had become. Occasionally, and mysteriously, he was able to lay down just a few memories. He could recall that John Glenn was an astronaut and Lee Harvey Oswald an assassin (though he couldn’t recall whom Oswald had assassinated) and learned the address and layout of his new house when he moved. But beyond that he was locked in an eternal present that he could never understand. Poor Henry Molaison’s plight was the first scientific intimation that the hippocampus has a central role in laying down memories. But what scientists learned from Molaison was not so much how memory works as how difficult it is to understand how it works.
* * *
—
What is surely the most striking feature of the brain is that all its higher processes—thinking, seeing, hearing, and so on—happen right at the surface, in the four-millimeter-thick sheath of the cerebral cortex. The person who first mapped this area was the German neurologist Korbinian Brodmann (1868–1918). Brodmann was one of the most brilliant and least appreciated of modern neuroscientists. In 1909, while working at a research institute in Berlin, he painstakingly identified forty-seven distinct regions of the cerebral cortex, which have been known ever since as Brodmann areas. “Rarely in the history of neuroscience has a single illustration been as influential,” wrote Karl Zilles and Katrin Amunts in Nature Neuroscience a century later.
Painfully shy, Brodmann was repeatedly overlooked for promotions despite the importance of his work and struggled for years to secure an adequate research position. His career was further sidetracked with the outbreak of World War I, when he was sent to work at a mental asylum in Tübingen. Finally, in 1917, at the age of forty-eight, his luck turned. He landed an important job as head of the Department of Topographical Anatomy at an institute in Munich. At last he had the economic security to get married and have a child, both of which he did in short order. Brodmann enjoyed not quite a year of unaccustomed serenity. In the summer of 1918, eleven and a half months after his marriage, two and a half months after the birth of his child, and at the very height of his happiness, he contracted a sudden infection and within five days was dead. He was forty-nine years old.
The area that Brodmann mapped, the cerebral cortex, is the brain’s celebrated gray matter. Beneath it is the much greater volume of white matter, which is so called because the neurons are sheathed in a pale fatty insulator called myelin, which greatly accelerates the speed at which signals are transmitted. Both white matter and gray matter are misleadingly named. Gray matter isn’t terribly gray in life, but has a pinkish blush. It only becomes conspicuously gray in the absence of blood flow and with the addition of preservatives. White matter is also a posthumous attribute because the pickling process turns the myelin coatings on its nerve fibers a luminous white.
Incidentally, the idea that we use only 10 percent of our brains is a myth. No one knows where the idea came from, but it has never been true or close to true. You may not use it all terribly sensibly, but you employ all your brain in one way or another.
* * *
—
The brain takes a long time to form completely. A teenager’s brain is only about 80 percent finished (which may not come as a great surprise to the parents of teenagers). Although most of the growth of the brain occurs in the first two years and is 95 percent completed by the age of ten, the synapses aren’t fully wired until a young person is in his or her mid- to late twenties. That means that the teenage years effectively extend well into adulthood. In the meantime, the person in question will almost certainly have more impulsive, less reflective behavior than his or her elders and will also be more susceptible to the effects of alcohol. “The teenage brain is not just an adult brain with fewer miles on it,” Frances E. Jensen, a neurology professor, told Harvard Magazine in 2008. It is, rather, a different kind of brain altogether.
The nucleus accumbens, a region of the forebrain associated with pleasure, grows to its largest size in one’s teenage years. At the same time, the body produces more dopamine, the neurotransmitter that conveys pleasure, than it ever will again. That is why the sensations you feel as a teenager are more intense than at any other time of life. But it also means that seeking pleasure is an occupational hazard for teenagers. The leading cause of death among teenagers is accidents—and the leading cause of accidents is simply being with other teenagers. When more than one teenager is in a car, for instance, the risk of an accident multiplies by 400 percent.
Everybody has heard of neurons, but not so many are familiar with the other main brain cells, glia or glial cells, which is a little odd because they outnumber neurons by ten to one. Glia (the word means “glue” or “putty”) are the cells that support neurons in the brain and central nervous system. For a long time, they were assumed to be not too important—their role was thought to be principally to provide a kind of physical support, or extracellular matrix as anatomists put it, for neurons—but now it is known that they engage in a lot of important chemistry, from producing myelin to clearing away wastes.