The waltz of reason, p.16

  Figure 6.14. An angle and two horns.

  Indeed, let P be a point on a line g, and consider circles tangent at P to this line (Figure 6.14). The angles of such circles with the line (Euclid called them “horns”) are a well-ordered set of magnitudes (the larger the horn, the smaller the radius). Each such horn is smaller than each “ordinary” angle at P, which is obtained by intersecting g with a line. Every nonzero angle, small as it may be, is larger than any horn. Add a thousand horns and you will still not have an angle. Horns, therefore, may be viewed as the infinitesimals of angles.

  For the length of segments on a line, however, the Archimedean property holds. More precisely, it belongs to those axioms required for Euclidean geometry that belatedly were added by Hilbert. What the Archimedean property means is that for any real number x, there is a natural number n that is larger. This property is assumed to hold, by fiat. It is convenient, although there is no logical necessity for it. Once this property is accepted as an axiom, infinitesimals are excluded from the game.

  Robinson showed, however, that there is no inconsistency in dropping this “axiom of Archimedes” and assuming instead that there exist some strange numbers that are larger than any natural number n. They are said to be infinitely large. Their inverses are the infinitesimals: these are positive numbers larger than 0 and yet smaller than 1/n, for every natural number n. (Thus, it is strange that Cantor, who was unfazed by an infinite ordinal such as ω, had rejected the infinitesimals in harsh terms.) The real numbers, together with the infinitely large and the infinitely small numbers, form the hyperreals. They prove responsive to all the customary activities of analysis. In particular, they make sense of everything that Leibniz claimed for his infinitesimals. So, the infinitesimals have returned. People just had to get used to them.

  For some time, there was even hope that a new generation of mathematicians would be trained without having to undergo the rites of epsilontics. In the end, this did not happen. But nonstandard analysis helped discover and prove new results in standard analysis. Better still, it provided a belated justification for the baroque adventurers who, advancing by bold faith, had created infinitesimal calculus, unshaken by the dialectical quirks of sophistry.

  7

  Probability

  A Random Walk to St. Petersburg

  Leaving Little to Chance

  It must have been the most enjoyable of all encounters between mathematics and philosophy. It certainly was the most rewarding ever. In 1728, at a banquet in Paris, an up-and-coming young philosophe and homme de lettres who had named himself Voltaire came to sit next to Monsieur de la Condamine, an accomplished mathematician, not yet thirty years of age.

  Figure 7.1. Voltaire (1694–1778).

  Figure 7.2. Charles-Marie de la Condamine (1701–1774).

  The conversation of Voltaire and Condamine soon turned to the newest scheme for a state lottery. The French government, perennially short of money, was trying to encourage their subjects to acquire bonds. The treasury offered, as an incentive, a lottery: bondholders could participate by buying a ticket for the price of one thousandth of the current value of their bond. The winner would be paid the nominal value of the bond (which at that time of depression was much higher than the current market value) and, in addition, a huge jackpot—half a million livres. More than enough to live in state for the rest of the century, the luxurious Siècle des Lumières, a time when it really paid to be rich.

  Something was wrong with that lottery scheme, said Condamine. The size of the jackpot was always the same, but the price of the ticket varied with the value of the bond. Some bonds were high-priced, others next to worthless. This was a typical example of an unfair game.

  Voltaire quickly grasped the point. The ministry of finance had made a serious blunder indeed. Voltaire could denounce it publicly—or else exploit it on the sly. By that time, he had abundantly proved that he was not afraid of making fun of the French authorities. His mordant wit had brought him a year-long imprisonment in the Bastille, and two years of exile in England. He could keep on goading the government and ridicule the French treasury for its inability to do its sums properly. But would it not be more profitable to keep mum and turn the blunder of the state officials to his own benefit? Voltaire liked to say that he knew far too many penniless writers to ever wish to add to their ranks. Now Condamine showed him how to get rich quick and land a great coup by using a bit of mathematics.

  Roughly speaking, the trick was to buy many of the low-price lottery tickets. Really many, ideally all of them! This would provide their owner with a high chance of winning the lottery, and this for a relatively modest cost.

  The basic idea was simple, the execution extremely complicated. One had to set up a syndicate, use straw men and sham deals, and ensure, one way or another, the good will of the right officials. Fortunately, Voltaire was a peerless networker, known all over France.

  The plan succeeded beyond all hopes. Month after month, the state lottery was repeated, with hefty winnings for all who were part of the scheme. It took two years before the authorities smelled a rat. Condamine and Voltaire were taken to court. The hoax could have cost them dear. But in the end, they got away with it: they were able to prove that they had done nothing illegal. They had simply exploited the rules of the game. The lottery came to an end, and the hapless contrôleur général des finances was fired.

  Voltaire and Condamine had become wealthy—stinking rich, to use a term from our less enlightened century. What happened next? Condamine, shortly after having eluded jail, was elected to the French Academy of Science. He used his fortune to undertake large-scale scientific expeditions, an Alexander von Humboldt avant la lettre. First, a spell in the Levant, in the company of the most illustrious freebooter of France, and then ten years in South America, including an unheard-of trip across the Amazon Basin. Later in life, Condamine championed a Europe-wide campaign for inoculation against smallpox. Approaching death at age seventy-three, he volunteered to test a new type of surgery against hernia. It succeeded, according to the doctors; but this time luck turned against Condamine. He died of wound fever.

  And Voltaire? He went on multiplying his newfound wealth and acquired an extraordinary, almost princely degree of independence. His Lettres Philosophiques, published a few years after his lottery scam, acted as “the first bomb thrown at the ancien régime.” Voltaire became the figurehead of the Enlightenment, the top celebrity of his time. And the “calculus of chance”—probability theory—had played an essential part in his stellar career.

  This theory is often labeled as the mathematics of randomness. But Voltaire denied randomness. Chance does not exist, he said. We only speak of it when we know the effect while ignoring the cause. Everything, however, has its cause. Nothing is due to chance.

  This phrase was not meant as a sly allusion to his uncanny winnings at the lottery. It was the settled opinion of all modern philosophers. The worldview had turned strictly deterministic, at the very latest with Newton. As a matter of fact, it had not been all that different in the centuries before—except that then everything was ordained by the will of God, and now by the laws of science. (In the last hundred years, views on causality and determinism have undergone another shift, once more due to physics: randomness plays an almost impenetrable role in quantum mechanics. We shall keep away from that minefield.)

  Playing with Chance

  Chance is notoriously hard to define. Is it the force that causes something to happen without any known reason for doing so? Something that happens when several causes intermingle? Something that can be, but also not be? This is just a small sample (a random sample) of attempts to explain the word chance. Mathematicians, however, do not try to define chance. They want to reckon with it. This is more modest, and at the same time more ambitious.

  The calculus of probability emerged rather late in the history of mathematics. Its birth is commonly ascribed to an exchange of letters between the philosopher and mathematician Blaise Pascal and the lawyer and mathematician Pierre Fermat. But the ideas were in the air, toward the middle of the seventeenth century, and attracted thinkers such as Galilei, Newton, Leibniz, and Huygens—the Who’s Who of exact thinking in their age. Within a few years, it all crystallized. By 1660, probability theory was established.

  It began with games of luck. They provide a limitless supply of mathematical problems. This makes it all the more puzzling that ancient mathematics knows no theory of probability. The Greeks, for instance, considered themselves the inventors of dice games—allegedly they hit upon them as a pastime during their siege of Troy. In actual fact, thousands of years before that time, Egyptians had played with dice. Some dice were found in graves from the First Dynasty. However, it is likely that the ancient Greeks invented coins, and one may safely assume that “Heads or Tails” was played not long afterward. Card games date from the Middle Ages. When Johannes Gutenberg opened shop, he duly printed the Bible first, but—in the very same year—a set of Tarot cards, too. Lotteries emerged in Renaissance Italy. The French Enlightenment provided us with roulette (the invention of a police officer, so it seems). Mechanization brought gaming machines, and digitalization an endless stream of apps for gambling. Humans simply love to toy with chance.

  It is all the more surprising that we are utterly inept in estimating probabilities. Mathematics is spiced with paradoxical results, but in the calculus of probability, they really clog up.

  An example, maybe? Let us assume that you live in a region where the probability of being infected by a certain virus amounts to one in a thousand. Imagine a test that infallibly recognizes the virus, but yields a false positive with a probability of 5 percent. And now imagine that you have tested positive. How probable is it that you really carry the virus?

  Take your time before you answer. The test is not completely precise, as we have seen. It can yield the wrong result. How likely is it in your case? Please consider before hazarding a guess. The most common answer is: “I am infected with a probability of 95 percent.” This answer is wrong. The probability is less than 2 percent, as we shall see in a moment.
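  For readers who wish to check the arithmetic, here is a short Python sketch (an added illustration, not part of the original text) applying Bayes’ rule to the numbers above:

```python
# An added sketch: Bayes' rule with the numbers from the text.
prior = 0.001          # one person in a thousand carries the virus
sensitivity = 1.0      # the test never misses an actual infection
false_positive = 0.05  # a healthy person tests positive with probability 5%

# Probability that a randomly chosen person tests positive at all:
p_positive = prior * sensitivity + (1 - prior) * false_positive

# Bayes' rule: probability of infection, given a positive test.
p_infected = prior * sensitivity / p_positive
print(p_infected)  # about 0.0196, i.e. less than 2 percent
```

  The positive tests are dominated by the 5 percent of false alarms among the 999 in 1,000 healthy people, which is why the answer falls so far below 95 percent.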

  Another example? Two firms have developed drugs against that viral infection. Now comes the time for clinical trials. First, people will be tested who are less than sixty-five years old, and hence do not belong to the high-risk group. Drug Alpha helps 90 people out of the 240 who try it. Drug Beta helps 20 out of a sample of 60. Since 90 out of 240 is larger than 20 out of 60, we may conclude that Alpha works better than Beta, for people less than sixty-five years old. Next, the drug is tested in the high-risk group, on people who are sixty-five years old or older. A random sample of 60 senior citizens tries Alpha, and 30 of them feel relief. Beta is used on 240 elderly people, and helps 110 of them. Since 110 out of 240 is less than 30 out of 60, Alpha does better than Beta again. The health department orders huge amounts of Alpha.

  But wait a minute, warns an expert. Let us look at the sums. Altogether 300 people have been tested with Alpha, and 300 with Beta. Alpha has provided relief to 120, and Beta to 130. Doesn’t this indicate that Beta is better? Very confusing! Let us check our numbers again. Alpha does better for both the younger and the older samples—but less well in toto. Can that be right?
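  The totals can be verified in a few lines of Python (a sketch of my own, merely restating the figures above):

```python
# An added sketch: the clinical-trial figures, as (helped, sample size).
helped = {
    ("Alpha", "young"): (90, 240), ("Beta", "young"): (20, 60),
    ("Alpha", "old"): (30, 60),    ("Beta", "old"): (110, 240),
}

def rate(drug, group):
    h, n = helped[(drug, group)]
    return h / n

def total_rate(drug):
    h = sum(helped[(drug, g)][0] for g in ("young", "old"))
    n = sum(helped[(drug, g)][1] for g in ("young", "old"))
    return h / n

# Alpha wins in each age group taken separately...
print(rate("Alpha", "young"), rate("Beta", "young"))  # 0.375 vs 0.333...
print(rate("Alpha", "old"), rate("Beta", "old"))      # 0.5 vs 0.458...
# ...but loses overall. This reversal is known as Simpson's paradox.
print(total_rate("Alpha"), total_rate("Beta"))        # 0.4 vs 0.433...
```

  The reversal arises because Alpha was tried mostly on the younger group, where everyone does worse, while Beta was tried mostly on the older group.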

  The Dice Are Cast

  The theory of probability got kickstarted with a few riddles about dice. Here are two of them.

  When we throw two dice, we can obtain the sum 9 in two ways, as 3 + 6 or 4 + 5. Similarly, we can obtain the sum 10 in two ways, as 4 + 6 or 5 + 5. Hence, “sum 10” should be as likely as “sum 9,” and they should occur with the same frequency. Yet, this is not the case: 11.1 percent of all throws yield sum 9, and only 8.3 percent sum 10. Why is 10 less likely than 9?

  The simplest way to explain it is to paint one die red and one white, which makes it easier to tell them apart. The sum 4 + 5 can be obtained in two ways: if the red die shows number 4 and the white number 5, or if white shows 4 and red 5. On the other hand, the sum 5 + 5 is obtained when both red and white show number 5. There is no second way. Hence, 4 + 5 is twice as likely as 5 + 5. The outcome 3 + 6 is, by the same argument, just as likely as 4 + 6—or 4 + 5, of course.

  To make this more explicit: The result of a throw is “red shows number x, and white shows number y,” which we write as (x, y). The red die is equally likely to land on each of its sides (for symmetry reasons, as mathematicians like to say). Thus, “red shows x” has probability 1/6. Similarly, “white shows y” has probability 1/6, no matter what red shows. Hence, the probability of “event” (x, y) is always the same, namely 1/6 × 1/6 = 1/36.

  Figure 7.3. The thirty-six possibilities for (x, y).

  Four of these events yield the sum 9, namely (3, 6), (6, 3), (4, 5), and (5, 4). Accordingly, the probability for this is 4/36 = 1/9. But only three events, namely (4, 6), (6, 4), and (5, 5), yield the sum 10; its probability is therefore 3/36 = 1/12, which is smaller.
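  Enumerating all thirty-six pairs by brute force confirms the count. The following Python sketch (mine, not the book’s) uses exact fractions:

```python
from fractions import Fraction

# All thirty-six equally likely outcomes (x, y) of the red and white die.
outcomes = [(x, y) for x in range(1, 7) for y in range(1, 7)]

p9 = Fraction(sum(1 for x, y in outcomes if x + y == 9), 36)
p10 = Fraction(sum(1 for x, y in outcomes if x + y == 10), 36)
print(p9, p10)  # 1/9 versus 1/12: sum 9 is indeed the more likely
```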

  We might try the same with three dice, painted red, white, and blue. The sum 9 and the sum 10 can each be obtained in six different ways—but now 10 is more frequent than 9!
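  The “six different ways” count unordered partitions; once the dice are told apart, sum 10 pulls ahead, as a short Python check (an added illustration) shows:

```python
from itertools import product

# All 216 ordered triples for the red, white, and blue dice.
triples = list(product(range(1, 7), repeat=3))
n9 = sum(1 for t in triples if sum(t) == 9)    # 25 ordered triples
n10 = sum(1 for t in triples if sum(t) == 10)  # 27 ordered triples
print(n9, n10)  # with three dice, sum 10 beats sum 9
```

  The partition 3 + 3 + 3 of 9 can be rolled in only one way, whereas every partition of 10 uses at least two distinct numbers and thus admits several orderings.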

  Here comes the second teaser. Pierre and Blaise roll dice repeatedly, until one of them has won thrice. (If both throw the same number, the round does not count.) Each player has staked six doubloons into the pool. The game is well underway, with lucky Pierre leading 2 to 1. Unexpectedly, Cardinal Richelieu passes by and orders the game to stop—immediately, messieurs. No more rolls.

  Pierre pockets the pool, all twelve doubloons of it. “I was ahead, I am the winner.”

  “Not at all,” protests Blaise. “The game did not come to the prescribed end, and hence has not taken place. Each of us gets his six doubloons back.”

  “This is hardly fair practice,” says Pierre. “I have made twice as many points as you. Hence, I am entitled to receive twice as much as you from the pool, namely two thirds. If my arithmétique serves me well, that makes eight doubloons for me, and four for you.”

  “Well then, I’ll accept that,” replies Blaise. This answer raises dark suspicions in Pierre’s mind, and he starts computing. How likely is it that Blaise would have won the game? And it becomes clear that the probability is merely 25 percent: the only way for Blaise to win overall is to win the next two rounds, and the probability for that is 1/2 × 1/2 = 1/4. Pierre’s chance to win is three times greater. Hence, the fair way to split is for Pierre to pocket nine doubloons and let honest Blaise keep the remaining three.
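  The fair split can be confirmed with exact fractions in Python (a sketch added for illustration):

```python
from fractions import Fraction

# Blaise wins overall only by winning the next two rounds,
# each with probability 1/2 (tied rounds do not count).
p_blaise = Fraction(1, 2) * Fraction(1, 2)  # 1/4
p_pierre = 1 - p_blaise                     # 3/4

pool = 12  # doubloons staked in total
print(pool * p_pierre, pool * p_blaise)  # 9 and 3 doubloons
```

  Splitting the pool in proportion to each player’s probability of winning, rather than to the points already scored, is precisely the insight of the Pascal–Fermat correspondence.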

  Before taking our leave from the dice, one last little game. Let us take three dice (white, gray, and black, say). This time, we will not number their six sides with 1 to 6, as usual, but number the eighteen sides from 1 to 18, as shown in Figure 7.4. (The sides of the black die are marked with the numbers 18, 10, 9, 8, 7, and 5; the white die with 17, 16, 15, 4, 3, and 2; and the gray one with 14, 13, 12, 11, 6, and 1.)

  Figure 7.4. Three dice to play Rock-Paper-Scissors with.

  It seems that the numbers have been distributed fairly. Indeed, the sum of the numbers on each die is the same, namely 57. Now, you pick up one die, I pick up another, and each tries to roll the higher number. There is, of course, no possibility for a draw.

  You may have noticed that I let you choose your die first. This is not out of mere politeness. It gives me an advantage. Though I certainly may lose some rounds, I will draw invariably ahead, if we keep playing for long enough. I win each round with a probability of 58 percent.

  Let us start a fresh game. This time, you pick the die that has proved so lucky for me. I take the third one, and lo and behold, I am winning again. The odds are once more in my favor. So, let us start afresh. You pick the die that served me so well in the second match. I pick the one that served you so poorly in our first match. And what happens? I win again. It is as if the dice were playing Rock-Paper-Scissors: black beats white, white beats gray, gray beats black, always with a likelihood of 58 percent. Can you see why?
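  One way to see it is to count, for each pair of dice, the thirty-six outcomes in which the first beats the second. Here is a Python sketch (added for illustration, using the side numbers from Figure 7.4):

```python
from fractions import Fraction

# The side numbers from Figure 7.4.
black = [18, 10, 9, 8, 7, 5]
white = [17, 16, 15, 4, 3, 2]
gray  = [14, 13, 12, 11, 6, 1]

def p_win(a, b):
    """Probability that die a rolls a higher number than die b."""
    wins = sum(1 for x in a for y in b if x > y)
    return Fraction(wins, 36)

print(sum(black), sum(white), sum(gray))  # each die sums to 57
print(p_win(black, white), p_win(white, gray), p_win(gray, black))  # each 7/12
```

  Each matchup is won 21 times out of 36, i.e. with probability 7/12, roughly the 58 percent claimed in the text; the relation “beats” runs in a cycle and is not transitive.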

  In due time, the calculus of probability left gambling saloons to reach out for more serious fields. First, it was applied by insurance companies. What is the fair price of life insurance? Obviously, this depends on life expectancy. It is not by chance that the first demographic tables are about as old as the baroque brainteasers about dicing. Later, probability took a star role in statistical mechanics, and later still (but not much later) in genetics. Today, physics, chemistry, economics, and biology are inconceivable without probability theory. James Clerk Maxwell touted it as “the true logic of the world,” and Pierre-Simon Laplace as dealing with “the most important questions of life.” Admittedly, Albert Einstein claimed that “God does not play dice” (some wit added: “But if He did, He’d win”). Yet quantum physics sees chance everywhere. Erwin Schrödinger stated that “chance is the common root of the strict causality in physics,” and Jacques Monod viewed chance as “the foundation of the wonderful edifice of evolution.” And the philosopher Bertrand Russell hit the bull’s-eye when he said: “Probability is the most important concept of modern science, especially as nobody has the slightest notion what it means.”

  The Two Sides of Probability

  All reasoning with probabilities starts with possibilities, or more precisely, with the set of all possible outcomes. Mathematicians are fond of calling this set Ω (omega), its subsets events, and its members—the possible outcomes—elementary events. If you throw two dice, the set Ω consists of all pairs (x, y) of integers between 1 and 6, and thus has thirty-six elements. The event “the sum is 10” consists of the three pairs (5, 5), (6, 4), and (4, 6)—each of these outcomes is an elementary event.

  Let us assume, to start with, that there are only finitely many outcomes. Each elementary event is ascribed a number—the probability of the outcome. The numbers are nonnegative and sum up to 1. It is as if each element has a weight, and the total weight of Ω is 1. The probability of the subset A (i.e., the event A) is defined as the sum of the weights of all members making up A.
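  This definition translates directly into a few lines of Python (an added sketch, using the two-dice space as Ω):

```python
from fractions import Fraction

# Omega for two dice: all pairs (x, y); each elementary event weighs 1/36.
omega = [(x, y) for x in range(1, 7) for y in range(1, 7)]
weight = {o: Fraction(1, 36) for o in omega}

def prob(event):
    """The probability of an event: the sum of the weights of its members."""
    return sum(weight[o] for o in event)

sum_is_10 = [(x, y) for (x, y) in omega if x + y == 10]
print(prob(sum_is_10), prob(omega))  # 1/12, and 1 for the whole of Omega
```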

 
