It was an unfairness that ended up saving lives, for in 1943, still a student, Schatz followed a hunch that soil microbes might provide an additional antibiotic to put alongside the new drug penicillin, which, for all its value, didn’t work against the large class of bacteria known as Gram-negative and was equally useless against the microbe responsible for tuberculosis. Schatz patiently tested hundreds of samples and in just under a year came up with streptomycin, the first drug to vanquish Gram-negative bacteria and the tuberculosis bacterium alike. It was one of the most important microbiological breakthroughs of the twentieth century.
Schatz’s supervisor, Selman Waksman, immediately saw the potential of Schatz’s discovery. He took charge of the clinical trials of the drug and, in the process, had Schatz sign an agreement ceding patent rights to Rutgers. Soon afterward, Schatz discovered that Waksman was taking full credit for the discovery and keeping Schatz from being invited to meetings and conferences where he would have received praise and attention. With the passage of time, Schatz also discovered that Waksman had not relinquished patent rights himself, but was pocketing a generous share of profits, which were soon running into millions of dollars a year.
Unable to get any satisfaction, Schatz eventually sued Waksman and Rutgers, and won. In settlement, he was given a portion of the royalties and credit as co-discoverer, but the lawsuit ruined him: it was considered very bad form to sue a superior in academia in those days. For many years, the only work Schatz could find was at a small agricultural college in Pennsylvania. His papers were repeatedly rejected by leading journals. When he wrote an account of the discovery of streptomycin as it had really happened, the only publication he could find that would accept it was the Pakistan Dental Review.
In 1952, in one of the supreme injustices of modern science, Selman Waksman was awarded the Nobel Prize in Physiology or Medicine. Albert Schatz received nothing. Waksman continued to take the credit for the discovery for the rest of his life. He didn’t mention Schatz in his Nobel acceptance speech or in his 1958 autobiography, in which he merely noted in passing that he had been assisted in his discovery by a graduate student. When Waksman died in 1973, he was described in many obituaries as “the father of antibiotics,” which he most assuredly was not.
Twenty years after Waksman’s death, the American Society for Microbiology made a somewhat belated attempt at amends by inviting Schatz to address the society on the occasion of the fiftieth anniversary of streptomycin’s discovery. In recognition of his achievements, and presumably without giving the matter a lot of thought, it bestowed on him its highest award: the Selman A. Waksman medal. Life sometimes really is very unfair.
If there is a hopeful moral to the story, it is that medical science progresses anyway. Thanks to thousands and thousands of mostly unsung heroes like Albert Schatz, our armory against assaults of nature has grown stronger and stronger with every passing generation—a fact happily reflected in dramatically improved life spans across the planet.
By one reckoning, life expectancy on Earth improved by as much in the twentieth century as in the whole of the preceding eight thousand years. The average life span for an American male went from 46 years in 1900 to 74 by century’s end. For American women, the improvement was better still—from 48 to 80. Elsewhere, the improvements have been little short of breathtaking. A woman born in Singapore today can expect to live for 87.6 years, more than double what her great-grandmother could have counted on. Across the planet as a whole, life expectancy grew from 48.1 years for men in 1950 (which is as far back as global records reliably go) to 70.5 today; for women the rise was from 52.9 to 75.6 years. In more than two dozen countries, life expectancy today is over 80 years. At the top is Hong Kong at 84.3 years, closely followed by Japan at 83.8 and Italy at 83.5. The United Kingdom does quite well at 81.6 years, while the United States, for reasons that will be discussed below, comes in at a decidedly mediocre life expectancy of 78.6 years. Globally, however, the story is one of success, with most countries, even in the developing world, recording improvements of 40 to 60 percent in life spans in just a generation or two.
Nor do we die as we used to. Consider the lists below of principal causes of death in 1900 and now. (The accompanying numbers indicate deaths per 100,000 of population in each category.)
The most striking difference between the two eras is that nearly half of deaths in 1900 were from infectious diseases compared with just 3 percent now. Tuberculosis and diphtheria have disappeared from the modern top ten, replaced by Alzheimer’s and diabetes. Accidents as a cause of death have jumped from seventh place to fifth, not because we have grown clumsier, but because other causes have been eliminated from the top tier. In the same way, heart disease in 1900 killed 137.4 people per 100,000 per year, while today it kills 192.9 per 100,000, a 40 percent increase, but that’s almost entirely because other things used to kill people first. The same goes for cancer.
There are, it must be said, problems with life expectancy figures. All death lists are in some measure arbitrary, particularly with respect to the elderly, who may have lots of debilitating conditions, any one of which may finish them off and all of which are bound to contribute. In 1993, two American epidemiologists, William Foege and Michael McGinnis, wrote a famous paper for The Journal of the American Medical Association arguing that the leading causes of death recorded on mortality tables—heart attacks, diabetes, cancer, and so on—were very often outcomes of other conditions and that the real causes were factors like smoking, poor diet, illicit use of drugs, and other behaviors overlooked on death certificates.
A separate problem is that deaths in the past were often recorded in strikingly vague and imaginative terms. When the writer and traveler George Borrow died in England in 1881, to cite one example, the cause of death was listed as “decay of nature.” Who can say what that might have been? Others were recorded as being carried off by “nervous fevers,” “stagnation of the fluids,” “sore teeth,” and “fright,” among many other causes of a wholly uncertain nature. Such ambiguous terms make it nearly impossible to produce reliable comparisons between causes of death now and in the past. Even for the two lists above, there is no telling how much correspondence may exist between senility in 1900 and Alzheimer’s disease today.
It is also important to bear in mind that historic life expectancy figures were always skewed by childhood deaths. When we read that life expectancy was forty-six years for American men in 1900, that doesn’t mean that most men got to forty-six and then keeled over. Life expectancies were short because so many children died in infancy, and that dragged the average down for everyone. If you got past childhood, the chances of living to a reasonably advanced age weren’t bad. Lots of people died early, but it was by no means a cause of wonder when people lived into old age. As the American academic Marlene Zuk has put it, “Old age is not a recent invention, but its commonness is.” The most heartening advance of recent times, however, is the striking improvement in mortality rates for the very young. In 1950, 216 children in every thousand, more than one in five, died before the age of five. Today the figure is just 38.9 early childhood deaths per thousand, less than a fifth of what it was seventy years ago.
Even allowing for all the uncertainties, there is no question that early in the twentieth century people in the developed world began to enjoy much better prospects for living longer lives in better health. As the Harvard physiologist Lawrence Henderson famously remarked, “At some point between 1900 and 1912, a random patient with a random disease, consulting a doctor chosen at random, had for the first time in history a better than fifty-fifty chance of profiting from the encounter.” The more or less universal consensus among historians and academics was that medical science somehow turned a corner when it entered the twentieth century and just kept getting better and better as the century progressed.
Any number of reasons have been proposed for the improvement. The rise of penicillin and other antibiotics like Albert Schatz’s streptomycin had an obvious and significant impact on infectious diseases, but other medicines flooded the market, too, as the century proceeded. By 1950, half of the medicines available for prescription had been invented or discovered in just the previous ten years. Another huge boost can be attributed to vaccines. In 1921, America had about 200,000 cases of diphtheria; by the early 1980s, with vaccination, that had fallen to just 3. In roughly the same period, whooping cough and measles infections fell from about 1.1 million cases a year to just 1,500. Before vaccines, 20,000 Americans a year got polio. By the 1980s, that had dropped to 7 a year. According to the British Nobel laureate Max Perutz, vaccinations might have saved more lives in the twentieth century even than antibiotics.
The one thing no one doubted was that practically all the credit for the great advances lay securely with medical science. But then, in the early 1960s, a British epidemiologist named Thomas McKeown (1912–88) looked at the records again and noted some curious anomalies. Deaths from a large number of maladies—tuberculosis, whooping cough, measles, and scarlet fever notably—had begun to decline well before effective treatments had become available. Tuberculosis deaths in Britain dropped from four thousand per million in 1828 to twelve hundred in 1900, and to just eight hundred per million in 1925—a fall of 80 percent in a century. Medicine could account for none of that. Childhood scarlet fever deaths went from twenty-three per ten thousand in 1865 to just one per ten thousand in 1935, again without vaccines or other effective medical interventions. All told, McKeown suggested, medicine could account for no more than perhaps 20 percent of the improvements. All the rest were the result of improved sanitation and diet, healthier lifestyles, and even things like the rise of the railways, which improved food distribution, bringing fresher meat and vegetables to city dwellers.
McKeown’s thesis attracted a good deal of criticism. Opponents maintained that McKeown was carefully selective in the diseases he used to illustrate his thesis and that he underplayed the role of improved medical care. Max Perutz, one of his critics, argued persuasively that hygiene standards in the nineteenth century hadn’t advanced at all, but were continually eroded by the hordes of people crowding into newly industrializing cities and living in squalid conditions. The quality of drinking water in New York City, for one, declined steadily, and dangerously, in the nineteenth century—so much so that by 1900 residents of Manhattan were being instructed to boil all water before using it. The city didn’t get its first filtration plant until just before World War I. It was the same in almost every other major urban area in America as growth outpaced municipalities’ abilities or willingness to provide safe water and efficient sewerage.
However we decide to apportion the credit for our improved life spans, the bottom line is that nearly all of us are better able today to resist the contagions and afflictions that commonly sickened our great-grandparents, while having massively better medical care to call on when we need it. In short, we have never had it so good.
Or at least we have never had it so good if we are reasonably well-off. If there is one thing that should alarm and concern us today, it is how unequally the benefits of the last century have been shared. British life expectancies might have soared overall, but as John Lanchester noted in an essay in the London Review of Books in 2017, males in the East End of Glasgow today have a life expectancy of just fifty-four years—nine years less than a man in India. In exactly the same way, a thirty-year-old black male in Harlem, New York, is at much greater risk of dying than a thirty-year-old male Bangladeshi—and not, as you might think, from drugs or street violence but from stroke, heart disease, cancer, or diabetes.
Climb aboard a bus or subway train in almost any large city in the Western world and you can experience similarly vast disparities within a short journey. In Paris, travel five stops on the RER B line from Port-Royal to La Plaine–Stade de France and you will find yourself among people who have an 82 percent greater chance of dying in a given year than those just down the line. In London, life expectancy drops reliably by one year for every two stops traveled eastward from Westminster on the District Line of the Underground. In St. Louis, Missouri, make a twenty-minute drive from prosperous Clayton to the inner-city Jeff-Vander-Lou neighborhood and life expectancy drops by one year for every minute of the journey, a little over two years for every mile.
Two things can be said with confidence about life expectancy in the world today. One is that it is really helpful to be rich. If you are middle-aged, exceptionally well-off, and from almost any high-income nation, the chances are excellent that you will live into your late eighties. Someone who is otherwise identical to you but poor—exercises as devotedly, sleeps as many hours, eats a similarly healthy diet, but just has less money in the bank—can expect to die between ten and fifteen years sooner. That’s a lot of difference for an equivalent lifestyle, and no one is sure how to account for it.
The second thing that can be said with regard to life expectancy is that it is not a good idea to be an American. Compared with your peers in the rest of the industrialized world, even being well-off doesn’t help you here. A randomly selected American aged forty-five to fifty-four is more than twice as likely to die, from any cause, as someone from the same age-group in Sweden. Just consider that. If you are a middle-aged American, your risk of dying before your time is more than double that of a person picked at random off the streets of Uppsala or Stockholm or Linköping. It is much the same when other nationalities are brought in for comparison. For every 400 middle-aged Americans who die each year, just 220 die in Australia, 230 in Britain, 290 in Germany, and 300 in France.
These health deficits begin at birth and go right on through life. Children in the United States are 70 percent more likely to die in childhood than children in the rest of the wealthy world. Among rich countries, America is at or near the bottom for virtually every measure of medical well-being—for chronic disease, depression, drug abuse, homicide, teenage pregnancies, HIV prevalence. Even sufferers of cystic fibrosis live ten years longer on average in Canada than in the United States. What is perhaps most surprising is that all these poorer outcomes apply not just to underprivileged citizens but to prosperous white college-educated Americans when compared with their socioeconomic equivalents abroad.
This is all a touch counterintuitive when you consider that America spends more on health care than any other nation—two and a half times more per person than the average for all the other developed nations of the world. One-fifth of all the money Americans earn—$10,209 a year for every citizen, $3.2 trillion altogether—is spent on health care. It is the nation’s sixth-largest industry and provides one-sixth of its employment. You can’t get health care any higher on a national agenda without putting everyone in a white coat or uniform.
Yet despite the generous spending, and the undoubted high quality of American hospitals and health care generally, the United States comes just thirty-first in global rankings of life expectancy, behind Cyprus, Costa Rica, and Chile, and just ahead of Cuba and Albania.
How to explain such a paradox? Well, to begin with, and most inescapably, Americans lead more unhealthy lifestyles than most other people, and that is true at all levels of society. As Allan S. Detsky observed in The New Yorker, “Even wealthy Americans are not isolated from a lifestyle filled with oversized food portions, physical inactivity, and stress.” The average Dutch or Swedish citizen consumes about 20 percent fewer calories than the average American, for instance. That doesn’t sound massively excessive, but it adds up to 250,000 calories over the course of a year. You would get a similar boost if you sat down about twice a week and ate an entire cheesecake.
Life in America is also much riskier, especially for young people. A U.S. teenager is twice as likely to be killed in a car accident as a young person in a comparable country abroad and is eighty-two times more likely to be killed by a gun. Americans drink and drive more often than almost anybody else and wear seat belts less devotedly than everyone in the rich world but the Italians. Nearly all advanced nations require helmets for all motorcyclists and passengers. In America, 60 percent of states don’t. Three states have no helmet requirements at any age, and sixteen others require them only for riders aged twenty or under. Once citizens of those states come of age, they can let the wind, and all too often the pavement, run through their hair. A helmeted rider is 70 percent less likely to suffer a brain injury and about 40 percent less likely to die in a crash. In consequence of all these factors, the United States records a really quite spectacular 11 traffic deaths per 100,000 people every year, compared with 3.1 in the United Kingdom, 3.4 in Sweden, and 4.3 in Japan.
Where America really differs from other countries is in the colossal costs of its health care. An angiogram, a survey by The New York Times found, costs an average of $914 in the United States, $35 in Canada. Insulin costs about six times as much in America as it does in Europe. The average hip replacement costs $40,364 in America, almost six times the cost in Spain, while an MRI scan in the United States is, at $1,121, four times more than in the Netherlands. The entire system is notoriously unwieldy and cost-heavy. America has about 800,000 practicing physicians but needs twice that number of people to administer its payments system. The inescapable conclusion is that higher spending in America doesn’t necessarily result in better medicine, just higher costs.
One commonly accepted yardstick for quality of health care is five-year cancer survival rates, and here there are great disparities. For colon cancer, five-year survival rates are 71.8 percent in South Korea and 70.6 percent in Australia, but just 64.9 percent in the United States. For cervical cancer, Japan comes out on top at 71.4 percent, closely followed by Denmark at 69.1 percent, with the United States at a middling 67 percent. For breast cancer, the United States tops the world rankings with 90.2 percent of victims still alive after five years, just ahead of Australia at 89.1 percent and considerably ahead of Britain at 85.6 percent. It is worth noting that overall survival figures can mask a lot of troubling ethnic disparities. For cervical cancer, for instance, white women in the United States have a 69 percent five-year survival rate, which puts them near the top of world rankings, while black women have just a 55 percent survival rate, leaving them close to the bottom. (That is all black women, rich and poor alike.)