The body, p.8


  * * *

  —

  There is quite a lot of disagreement over whether the brain can make new neurons. A team at Columbia University led by Maura Boldrini announced in early 2018 that the brain’s hippocampi definitely produce at least some new neurons, but a team at the University of California at San Francisco came to precisely the opposite conclusion. The difficulty is that there is no certain way of telling whether neurons in the brain are new or not. What is beyond doubt is that even if we do make some new neurons, it is nothing like enough to offset the kind of loss you get from general aging, never mind stroke or Alzheimer’s. So either literally or to all intents and purposes, once you pass early childhood, you have all the brain cells you are ever going to have.

  On the plus side, the brain is able to compensate for quite severe loss of mass. In one case cited by the British doctor James Le Fanu in his book Why Us?, doctors scanning the brain of a middle-aged man of normal intelligence were astounded to discover that two-thirds of the space inside his skull was occupied by a giant benign cyst that he had evidently had since infancy. All of his frontal lobes and some of his parietal and temporal lobes were missing. The remaining third of his brain had simply taken on the duties and functions of the missing two-thirds and had done it so well that neither he nor anyone else had ever suspected that he was operating at a much-reduced capacity.

  * * *

  —

  For all its marvels, the brain is a curiously undemonstrative organ. The heart pumps, the lungs inflate and deflate, the intestines quietly ripple and gurgle, but the brain just sits pudding-like, giving away nothing. Nothing in its structure outwardly suggests that this is an instrument of higher thinking. As Professor John R. Searle of Berkeley once put it, “If you were designing an organic machine to pump blood you might come up with something like a heart, but if you were designing a machine to produce consciousness, who would think of a hundred billion neurons?”

  So it is hardly surprising that our understanding of how the brain functions was slow in coming and largely inadvertent. One of the great (and, it must be said, most written about) events in early neuroscience occurred in 1848 in rural Vermont when a young railroad builder named Phineas Gage was packing blasting powder into a rock and the charge exploded prematurely, shooting a tamping rod more than three feet long through his left cheek and out the top of his head before it clattered back to Earth about fifty feet away. The rod removed a perfect core of brain about an inch in diameter. Miraculously, Gage survived and appears not even to have lost consciousness, though he did lose his left eye and his personality was forever transformed. Previously happy-go-lucky and popular, he was now moody, argumentative, and given to profane outbursts. He was just “no longer Gage,” as one old friend reported sadly. As often happens to people with frontal lobe damage, he had no insight into his condition and didn’t understand that he had changed. Unable to settle, he drifted from New England to South America and on to San Francisco, where he died aged thirty-six after falling prey to seizures.

  Gage’s misfortune was the first proof that physical damage to the brain could transform personality, but over the following decades others noticed that when tumors destroyed or impinged upon parts of the frontal lobes, the victims sometimes became curiously placid and serene. In the 1880s, in a series of operations, a Swiss physician named Gottlieb Burckhardt surgically removed eighteen grams of brain from a disturbed woman, in the process turning her (in his own words) from “a dangerous and excited demented person to a quiet demented one.” He tried the process on five more patients, but three died and two developed epilepsy, so he gave up. Fifty years later, in Portugal, a professor of neurology at the University of Lisbon, Egas Moniz, decided to try again and began experimentally cutting the frontal lobes of schizophrenics to see if that might quiet their troubled minds. It was the invention of the frontal lobotomy (though it was then often called a leukotomy, particularly in Britain).

  Moniz provided an almost perfect demonstration of how not to do science. He undertook operations without having any idea what damage they might do or what the outcomes would be. He conducted no preliminary experiments on animals. He didn’t select his patients with particular care and didn’t monitor outcomes closely afterward. He didn’t actually perform any of the surgeries himself, but supervised his juniors—though freely took credit for any successes. The practice did actually work up to a point. People with lobotomies generally became less violent and more tractable, but they also routinely suffered massive, irreversible loss of personality. Despite the many shortcomings of the procedure and Moniz’s lamentable clinical standards, he was feted around the world and in 1949 received the ultimate accolade of a Nobel Prize.

  In the United States, a doctor named Walter Jackson Freeman heard of Moniz’s procedure and became his most enthusiastic acolyte. Over a period of almost forty years, Freeman traveled the country performing lobotomies on almost anyone brought before him. On one tour, he lobotomized 225 people in twelve days. Some of his patients were as young as four years old. He operated on people with phobias, on drunks picked up off the street, on people convicted of homosexual acts—on anyone, in short, with almost any kind of perceived mental or social aberration. Freeman’s method was so swift and brutal that it made other doctors recoil. He inserted a standard household ice pick into the brain through the eye socket, tapping it through the skull bone with a hammer, then wriggled it vigorously to sever neural connections. Here is his breezy description of the procedure in a letter to his son:

  I have been…knocking them out with a shock and while they are under the “anesthetic” thrusting an ice pick up between the eyeball and the eyelid through the roof of the orbit actually into the frontal lobe of the brain and making the lateral cut by swinging the thing from side to side. I have done two patients on both sides and another on one side without running into any complications, except a very black eye in one case. There may be trouble later on but it seemed fairly easy, although definitely a disagreeable thing to watch.

  Indeed. The procedure was so crude that an experienced neurologist from New York University fainted while watching a Freeman operation. But it was quick: patients generally could go home within an hour. It was this quickness and simplicity that dazzled many in the medical community. Freeman was extraordinarily casual in his approach. He operated without gloves or a surgical mask, usually in street clothes. The method caused no scarring but also meant that he was operating blind without any certainty about which mental capacities he was destroying. Because ice picks were not designed for brain surgery, sometimes they would break off inside the patient’s head and have to be surgically removed, if they didn’t kill the patient first. Eventually, Freeman devised a specialized instrument for the procedure, but it was essentially just a more robust ice pick.

  What is perhaps most remarkable is that Freeman was a psychiatrist with no surgical certification, a fact that horrified many other physicians. About two-thirds of Freeman’s subjects received no benefit from the procedure or were worse off. Two percent died. His most notorious failure was Rosemary Kennedy, sister of the future president. In 1941, she was twenty-three years old, a vivacious and attractive girl but headstrong and with a tendency to mood swings. She also had some learning difficulties, though these seem not to have been nearly as severe and disabling as has sometimes been reported. Her father, exasperated by her willfulness, had her lobotomized by Freeman without consulting his wife. The lobotomy essentially destroyed Rosemary. She spent the next sixty-four years in a care home in the Midwest, unable to speak, incontinent, and bereft of personality. Her loving mother did not visit her for twenty years.

  Gradually, as it became evident that Freeman and others like him were leaving trails of human wreckage behind them, the procedure fell out of fashion, especially with the development of effective psychoactive drugs. Freeman continued to perform lobotomies well into his seventies before finally retiring in 1967. But the effects that he and others left in their wake lasted for years. I can speak with some experience here. In the early 1970s, I worked for two years at a psychiatric hospital outside London where one ward was occupied in large part by people who had been lobotomized in the 1940s and 1950s. They were, almost without exception, obedient, lifeless shells.*4

  * * *

  —

  The brain is one of our most vulnerable organs. Paradoxically, the very fact that the brain is so snugly encased in its protective skull leaves it susceptible to damage when it swells from infection or when fluid accumulates inside it, as with a bleed, because the additional material has nowhere to go. The result is compression of the brain, which can be fatal. The brain is also easily injured by being dashed against the skull by sudden violence, as in a car crash or fall. A thin layer of cerebrospinal fluid within the meninges, the membranes that envelop the brain, provides a bit of cushioning, but only a bit. Such injuries, known as contrecoup injuries, appear on the opposite side of the brain from the point of impact because the brain is flung against its own protective (or in this case not so protective) casing.

  Above all, the brain is vulnerable to its own internal storms. Strokes and seizures are peculiarly human frailties. Most other mammals never suffer strokes, and for those that do, it is a rare event. But for humans, it is the second most common cause of death globally, according to the World Health Organization. Why this should be is something of a mystery. As Daniel E. Lieberman observes in The Story of the Human Body, we have an excellent blood supply to the brain to minimize stroke and yet we get strokes.

  Epilepsy likewise is a perennial mystery, but with the additional burden that sufferers have been shunned and demonized throughout history. Well into the twentieth century, it was commonly believed by medical authorities that seizures were infectious—that just watching someone have a seizure could provoke a seizure in others. Epileptics were often treated as mental defectives and confined to institutions. As recently as 1956, it was illegal in seventeen U.S. states for epileptics to marry; in eighteen states, epileptics could be involuntarily sterilized. The last of these laws was repealed only in 1980. In Britain, epilepsy remained on the statute books as grounds for annulment until 1970. As Rajendra Kale put it in the British Medical Journal some years ago, “The history of epilepsy can be summarised as 4,000 years of ignorance, superstition and stigma followed by 100 years of knowledge, superstition and stigma.”

  Epilepsy isn’t really a single disease but a collection of symptoms that can range from a brief lapse of awareness to prolonged convulsions, all caused by misfiring neurons in the brain. Epilepsy can be brought on by illness or head trauma, but very often there is no clear precipitating event, just a sudden, frightening seizure from out of the blue. Modern drugs have greatly reduced or eliminated seizures for millions of sufferers, but about 20 percent of epileptics do not respond successfully to medications. Every year about one epileptic in a thousand dies during or just after a seizure in a condition known as sudden unexpected death in epilepsy. As Colin Grant noted in A Smell of Burning: The Story of Epilepsy, “No one knows what causes it. The heart just stops.” (An additional one in a thousand epileptics dies tragically each year from losing consciousness in unfortunate circumstances—in the bath, say, or by striking their head badly in a fall.)

  The inescapable fact is that the brain is an unnerving place as well as a marvelous one. There seems to be an almost limitless number of curious or bizarre syndromes and conditions associated with neural disorders. Anton-Babinski syndrome, for instance, is a condition in which people are blind but refuse to believe it. In Riddoch syndrome, victims cannot see objects unless they are in motion. Capgras syndrome is a condition in which sufferers become convinced that those close to them are impostors. In Klüver-Bucy syndrome, the victims develop an urge to eat and fornicate indiscriminately (to the understandable dismay of loved ones). Perhaps the most bizarre of all is Cotard delusion, in which the sufferer believes he is dead and cannot be convinced otherwise.

  Nothing about the brain is simple. Even being unconscious is a complicated matter. As well as being asleep, anesthetized, or concussed, you can be in a coma (eyes closed and wholly unaware), a vegetative state (eyes open but unaware), or minimally conscious (occasionally lucid but mostly confused or unaware). Locked-in syndrome is different again. It is being fully alert but paralyzed and often able to communicate only with eye blinks.

  No one knows how many people are alive but minimally conscious or worse, but Nature Neuroscience suggested in 2014 that the number globally is probably in the hundreds of thousands. In 1997, Adrian Owen, then a young neuroscientist working in Cambridge, England, discovered that some people thought to be in a vegetative state are in fact fully aware but powerless to indicate the fact to anyone.

  In his book Into the Gray Zone, Owen discusses the case of a patient named Amy who suffered a serious head injury in a fall and for years lay in a hospital bed. Using an fMRI scanner and carefully watching Amy’s neural responses as they asked her a series of questions, researchers were able to determine that she was fully conscious. “She had heard every conversation, recognised every visitor, and listened intently to every decision being made on her behalf.” But she was unable to move a muscle—to open her eyes, scratch an itch, express any desire. Owen believes that something in the region of 15 to 20 percent of people thought to be in a permanent vegetative state are in fact fully aware. Even now the only certain way to tell if a brain is working is if its owner says it is.

  Perhaps nothing is more unexpected about our brains than that they are much smaller today than they were ten thousand or twelve thousand years ago, and by quite a lot. The average brain has shrunk from 1,500 cubic centimeters then to 1,350 cubic centimeters now. That’s equivalent to scooping out a portion of brain about the size of a tennis ball. That’s not at all easy to explain, because it happened all over the world at the same time, as if we agreed to reduce our brains by treaty. The common presumption is that our brains have simply become more efficient and able to pack more performance into a smaller space, rather like cell phones, which have grown more sophisticated as they have contracted in size. But no one can prove that we haven’t simply grown dimmer.
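  The tennis-ball comparison holds up to a quick back-of-the-envelope check. A sketch of the arithmetic, assuming a regulation tennis ball of roughly 6.7 centimeters in diameter (a figure not given in the text):

```python
import math

# Average brain volumes given in the text, in cubic centimeters
brain_then_cc = 1500  # roughly 10,000-12,000 years ago
brain_now_cc = 1350   # today
lost_cc = brain_then_cc - brain_now_cc  # 150 cc of brain "scooped out"

# A regulation tennis ball is about 6.54-6.86 cm across; take 6.7 cm
radius_cm = 6.7 / 2
ball_cc = (4 / 3) * math.pi * radius_cm ** 3  # volume of a sphere

print(f"Brain volume lost: {lost_cc} cc")
print(f"Tennis ball volume: {ball_cc:.0f} cc")
```

The sphere works out to about 157 cubic centimeters, close enough to the 150 cc lost to justify the comparison.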

  Over roughly the same period, our skulls have also become thinner. No one can really explain that either. It may be simply that a less robust and active lifestyle means that we don’t need to invest in skull bone in the way we used to. But then again it may simply be that we aren’t what we once were.

  And with that sobering thought to reflect upon, let’s look at the rest of the head.

  *1 I am much indebted to Dr. Magnus Bordewich, director of research in the Department of Computer Science at Durham University, for some of these calculations.

  *2 You have two of each, one in each hemisphere, so really they ought to be referred to in the plural (thalami, hippocampi, amygdalae, and so on), but they seldom are.

  *3 Another extraordinary example of imaginary memories occurred in an experiment at an unidentified university in Canada where sixty volunteer students were confronted with the accusation that during adolescence they had committed a crime involving theft or assault for which they had been arrested. None of this had actually happened, but after three sessions with a kindly but manipulative interviewer, 70 percent of the volunteers confessed to these imaginary incidents, often adding vivid incriminating details—entirely imaginary but sincerely believed.

  *4 In surely its most questionable entry, the 2001 Oxford Companion to the Body says, “For many people the term ‘lobotomy’ conjures up images of disturbed beings whose brains have been damaged or mutilated extensively, leaving them at best in a vegetative state without a personality or feelings. This was never true.” Actually, it was.

  5 THE HEAD

  This was not merely an idea, but a flash of inspiration. At the sight of that skull, I seemed to see all of a sudden, lighted up as a vast plain under a flaming sky, the problem of the nature of the criminal.

  —CESARE LOMBROSO

  WE ALL KNOW that you can’t live without your head, but how long exactly is a question that received rather a lot of attention in the late eighteenth century. It was a good time to wonder because the French Revolution gave inquiring minds a steady supply of freshly lopped heads to examine.

  A decapitated head will still have some oxygenated blood in it, so loss of consciousness may not be instantaneous. Estimates of how long the brain can keep working range from two seconds to seven, and that is assuming a clean removal, which was by no means always the case. Heads don’t come off easily even with stout blows from a specially sharpened ax wielded by an expert. As Frances Larson notes in her fascinating history of decapitation, Severed, Mary, Queen of Scots, needed three hearty whacks before her head hit the basket, and hers was a comparatively delicate neck.

  Many observers at executions claimed to have witnessed evidence of consciousness from newly separated heads. Charlotte Corday, guillotined in 1793 for the murder of the radical leader Jean-Paul Marat, was said to wear a look of fury and resentment when the executioner held her head up to the cheering crowd. Others, as Larson notes, were reported to have blinked or moved their lips as if trying to speak. A man named Terier was said to have turned his gaze to a speaker some fifteen minutes after being separated from his body. But how much of this was reflex, or exaggerated in the retelling, no one could say. In 1803, two German researchers decided to bring some scientific rigor to the matter. They pounced on the heads as they fell and examined them immediately for any sign of alertness, shouting, “Do you hear me?” None responded, and the investigators concluded that loss of consciousness was immediate or at least too swift to measure.

 
