Uncertainty, p.1




























  He is the God of order and not of confusion.

  – Isaac Newton

  Chaos was the law of nature; order was the dream of man.

  – Henry Adams


  If science is the attempt to extract order from confusion, then in early 1927 it veered onto an unexpected path. In March of that year, Werner Heisenberg, a physicist only twenty-five years old but already of international renown, set down a piece of scientific reasoning that was in equal measure simple, subtle, and startling. Heisenberg himself could hardly claim he knew exactly what he had done. He struggled to find an apt word to capture the sense of it. Most of the time he used a German word readily translated as “inexactness.” In a couple of places, with a slightly different intention, he tried “indeterminacy.” But under the irresistible pressure of his mentor and sometime taskmaster Niels Bohr, Heisenberg grudgingly added a postscript that brought a new word onto the stage: uncertainty. And so it was that Heisenberg’s discovery became indelibly known as the uncertainty principle.

  It’s not the best word. Uncertainty was hardly new to science in 1927. Experimental results always have a little slack in them. Theoretical predictions are only as good as the assumptions behind them. In the endless back-and-forth between experiment and theory, it’s uncertainty that tells the scientist how to proceed. Experiments probe ever finer details. Theories undergo adjustment and revision. When scientists have resolved one level of disagreement, they move down to the next. Uncertainty, discrepancy, and inconsistency are the stock-in-trade of any lively scientific discipline.

  So Heisenberg didn’t introduce uncertainty into science. What he changed, and profoundly so, was its very nature and meaning. It had always seemed a vanquishable foe. Starting with Copernicus and Galileo, with Kepler and Newton, modern science evolved through the application of logical reasoning to verifiable facts and data. Theories, couched in the rigorous language of mathematics, were meant to be analytical and precise. They offered a system, a structure, a thorough accounting that would replace mystery and happenstance with reason and cause. In the scientific universe, nothing happens except that something makes it happen. There is no spontaneity, no whimsy. The phenomena of nature might be inordinately complicated, but at bottom science must reveal order and predictability. Facts are facts, laws are laws. There can be no exceptions. The mills of science, like those they replaced, would grind exceeding small. And just as perfectly.

  For a century or two, the dream seemed realizable. If scientists of one generation, building on the work of the last, could see that they had yet to achieve their ideal, they could equally believe that those who came after them would finish the job. The power of reason implied the ineluctability of progress. Science would become more grandiose, more encompassing in scope, yet at the same time more detailed, more scrupulous. Nature was knowable—and if it was knowable then one day, necessarily, it would be known.

  This classical vision, springing from the physical sciences, became in the nineteenth century the dominant model for science of all kinds. Geologists, biologists, even the first generation of psychologists, pictured the natural world in its entirety as an intricate but inerrant machine. All sciences aspired to the ideal that physics offered. The trick was to define your science in terms of observations and phenomena that lent themselves to precise description—reducible to numbers, that is—and then to find mathematical laws that tied those numbers into an inescapable system.

  No doubt the task was hard. If ever scientists were daunted by their ambitions, it was because of the sheer complexity of the machine they were trying to tease apart. Perhaps the laws of nature would be too vast for their brains to fathom. Perhaps scientists would find they could write down the laws of nature only to discover they lacked the analytical and calculational firepower to work out the consequences. If the project of absolute scientific comprehension were to falter, it would be because the human mind wasn’t up to the task, not because nature itself was intractable.

  And that’s why Heisenberg’s argument proved so unsettling. It targeted an unsuspected weakness in the edifice of science—in the substructure, so to speak, a part of the foundation that had gone unexamined because it had seemed so self-evidently secure.

  Heisenberg took no issue with the perfectibility of the laws of nature. Instead, it was in the very facts of nature that he found strange and alarming difficulties. His uncertainty principle concerned the most elementary act of science: How do we acquire knowledge about the world, the kind of knowledge that we can subject to scientific scrutiny? How, in the particular example Heisenberg took, do we know where some object is and how fast it is moving? It was a question that would have baffled Heisenberg’s predecessors. At any time, a moving object has some speed and position. There are ways of measuring or observing these things. The better your observation, the more accurate the result. What else is there to say?

  Plenty more, Heisenberg discovered. His conclusion, so revolutionary and esoteric, has been expressed in words that have become almost commonplace. You can measure the speed of a particle, or you can measure its position, but you can’t measure both. Or: the more precisely you find out the position, the less well you can know its speed. Or, more indirectly and less obviously: the act of observing changes the thing observed.

  The bottom line, at any rate, seems to be that facts are not the simple, hard things they were supposed to be. In the classical picture of the natural world as a great machine, it had been taken for granted that all the working parts of the machinery could be defined with limitless precision and that all their interconnections could be exactly understood. Everything had its place, and there was a place for everything. This had seemed both fundamental and essential. To have a hope of comprehending the universe, you had first to assume that you could find out, piece by piece, what all the components of the universe were and what they were doing. Heisenberg, it seemed, was saying that you couldn’t always find out what you wanted to know, that your ability even to describe the natural world was circumscribed. If you couldn’t describe it as you wished, how could you hope to reason out its laws?

  The implications of Heisenberg’s discovery were obscure. And it came on the heels of an equally remarkable, equally perplexing insight that Heisenberg had delivered just two years earlier, when in a visionary flash he saw how to build the theory that became known as quantum mechanics. While the rest of the physics world struggled to keep up, Heisenberg, with a young man’s purity of vision, was eager to forge ahead, rewriting the fundamental rules of physics in an abstruse new theoretical language that even he could not yet claim he fully grasped. But Niels Bohr, a man given to slow and sometimes exasperatingly careful reflection, saw the need to assimilate the new to the old. The difficult but essential task, he saw, was to make sense of the new quantum physics without throwing overboard the hard-won successes of the previous era. He and Heisenberg wrangled painfully over how best to portray the emerging, still controversial science.

  Another voice came into the argument. By the time Heisenberg announced his principle, Albert Einstein was close to fifty. He was the old man of science, respected, revered, but no longer always attended to. Younger scientists were doing the important work. Einstein occupied the role of lofty commentator. He too, in his day, had been a revolutionary. In his great year of 1905, with his theory of relativity, he had overthrown the old Newtonian idea of absolute space and time. Events that one observer saw as simultaneous might seem to another to happen in sequence, one after the other. A third observer might see that sequence reversed. Heisenberg loosely adduced Einstein’s revolutionary principle in support of his own: different observers see the world differently.

  But this, to Einstein, was a monstrous misrepresentation of his own greatest achievement. Relativity, to be sure, allowed for differing perspectives, but the whole point of his theory was that it allowed apparently contradictory observations to be reconciled in a way that all observers could accept. In Heisenberg’s world, as far as Einstein could see, the very idea of a true fact seemed to crumble into an assortment of irreconcilable points of view. And that, said Einstein, was unacceptable, if science was to mean anything reliable. Here was another fierce intellectual struggle, Heisenberg and Bohr this time joining arms against the old master.

  From this shifting, three-way debate there eventually emerged a practical, workaday definition of the uncertainty principle that most physicists continue to find convenient and at least moderately comprehensible—as long as they choose not to think too hard about the still unresolved philosophical or metaphysical difficulties it throws up. Reluctantly, Einstein conceded the technical correctness of the system Heisenberg and Bohr laid out. But he could never accept that it was the last word. To him, the new physics remained until his dying day an unsatisfactory compromise, an interim measure that must eventually be supplanted by a theory resting on the old principles he cherished. Heisenberg’s uncertainty, Einstein stoutly insisted, was a sign of human inability to comprehend the physical world, not an indication of something strange and inaccessible about the world itself.

  Einstein’s profound distaste for the kind of physics that Bohr and Heisenberg were forging blossomed into what was indeed a struggle for the soul of science. Now that the battle is over, that phrase may seem melodramatic. But in the 1920s, when this new physics was emerging, it was all too evident that the foundations of physical science had come under an unprecedented scrutiny. And cracks showed. With Bohr overseeing the task, the foundations were rebuilt—or, as Einstein might have said, propped up—while the superstructure remained more or less as it was. This remarkable rehabilitation forms the core of the story this book tells. Among the principals there were no neutral voices. Nor was it a matter of one side being clearly delineated against another. Allegiances shifted. Views changed. And even now, Einstein’s skeptical spirit lingers over the ostensible victory claimed by Bohr and his adherents.

  This central story has both an afterword and a preface.

  The uncertainty principle has become a catchphrase for the general difficulty, not just in science, of establishing untainted knowledge. When journalists admit that their own views can influence the stories they are reporting, or when anthropologists lament how their presence disrupts the behavior of the cultures they are examining, Heisenberg’s principle is not far away: The observer changes the thing observed. When literary theorists assert that a text offers a variety of meanings, according to the tastes and prejudices of different readers, Heisenberg lurks in the background: The act of observation determines what is and isn’t observed.

  Does this have anything to do with basic physics? Hardly! Why, then, has Heisenberg’s principle been so enthusiastically appropriated by other disciplines? This curious annexation of an esoteric idea arises, I suggest later, not so much because journalists, anthropologists, literary critics, and the like are eager to find dubious scientific justification for their own assertions, but rather because the uncertainty principle makes scientific knowledge itself less daunting to nonscientists and more like the slippery, elusive kind of knowing we daily grapple with.

  To get to that part of the story, however, we must first understand where Heisenberg’s uncertainty came from. Scientific revolutions, like any other kind, do not arrive out of thin air. They have roots and antecedents. Uncertainty represents the culmination of quantum mechanics, which by 1927 had already overturned many of the old convictions of classical, nineteenth-century physics. But quantum mechanics was itself a response to problems that the older physics could not handle. Certainty, in science, has always been a fraught issue, and although quantum theory and Heisenberg’s uncertainty are unquestionably products of the twentieth century, their earliest glimmerings appeared almost one hundred years earlier. So it is that the tale begins in the opening decades of the nineteenth century.

  Chapter 1


  Robert Brown, son of a Scottish clergyman, was the archetypal self-made scholar, sober, diligent, and careful to the point of fanaticism. Born in 1773, he trained in medicine at Edinburgh, then served for some years as a surgeon’s assistant in a Fifeshire regiment. There he put his spare time to worthy use. Rising early, he taught himself German (nouns and their declensions before breakfast, his diary records, conjugation of auxiliary verbs afterward) so that he could master the considerable German literature on botany, his chosen subject. On a visit to London in 1798, the young Scotsman met and so impressed the great botanist Sir Joseph Banks, president of the Royal Society, that on Banks’s recommendation he sailed three years later on a long voyage to Australia, returning in 1805 with close to four thousand exotic plant specimens neatly stowed on his ship. These he spent the next several years assiduously describing, classifying, and cataloging, serving meanwhile as Banks’s librarian and personal assistant. Brown’s remarkable Australian trove, along with Banks’s own equally notable collection, became the heart of the botanical department of the British Museum, of which Brown became the first professional curator. He was, said a visitor to Banks’s London house, “a walking catalogue of every book in the world.”

  Charles Darwin, before he was married, passed many a Sunday with the learned Robert Brown. In his autobiography Darwin describes a contradictory man, vastly knowledgeable but powerfully inclined to pedantry, generous in some ways, crabbed and suspicious in others. “He seemed to me to be chiefly remarkable for the minuteness of his observations and their perfect accuracy. He never propounded to me any large scientific views in biology,” Darwin writes. “He poured out his knowledge to me in the most unreserved manner, yet was strangely jealous on some points.” Brown was notorious, Darwin adds, for refusing to lend out specimens from his vast collection, even specimens that no one else possessed and which he knew he would never make any use of himself.

  It is ironic, then, that this dry, cautious man should be commemorated now mainly as the observer of a curious phenomenon, Brownian motion, that represented the capricious intrusion of randomness and unpredictability into the elegant mansion of Victorian science. It was indeed the very scrupulousness of Brown’s observations that made the implications of Brownian motion so grave.

  In June 1827, Brown began a study of pollen grains from Clarkia pulchella, a wildflower, popular today with gardeners, that had been discovered in Idaho in 1806 by Meriwether Lewis but named by him for his co-explorer William Clark. Characteristically, he intended to scrutinize minutely the shape and size of pollen particles, hoping that this would shed light on their function and on the way they interacted with other parts of the plant to fulfill their reproductive role.

  Brown had acquired a microscope of recent and improved design. Its compound lenses largely banished the rainbow-hued fringes of color that afflicted the borders of objects seen in more primitive instruments. Under Brown’s eye the ghostly shapes of the pollen grains sprang clearly into view, their edges neatly delineated. Even so, the images were not perfect. The pollen grains wouldn’t stay still. They moved about, jiggled endlessly this way and that; they shimmered and stuttered; they drifted with strange erratic grace across the microscope’s field of view.

  This incessant motion complicated Brown’s planned investigations, but it was not so very surprising. More than a century and a half earlier Antony van Leeuwenhoek, a draper from Delft, Holland, had astonished and delighted the scientific world when he described tiny “animalcules” of strange and myriad form that his crude microscope revealed in droplets of pond water, in scrapings from the unbrushed teeth of old men, and even in a suspension of ordinary household pepper crushed into plain water. “The motion of most of these animalcules in the water was so swift, and so various, upwards, downwards, and round about, that ’twas wonderful to see,” the entranced Leeuwenhoek wrote. His discovery not only spurred further scientific investigation but also led well-to-do citizens to purchase microscopes for their parlors and drawing rooms, where they could amaze their guests with this new wonder of nature.

  Some animalcules had tiny hairs or finny extensions that enabled them to swim about. Others wriggled like little eels. It was easy to imagine that their meanderings were purposeful in some rudimentary way. Pollen grains, on the other hand, were simple in shape and had no moving parts. Still, they were undeniably organic. It seemed to Brown not unreasonable to suppose that pollen grains—especially as they were the male parts of a plant’s reproductive equipment—might possess some vital spirit that impelled them to move in their amusing but inscrutable fashion.
