Lecture 3 of David Kung's Mind-Bending Math is called "Probability Paradoxes." In this lecture, Kung discusses some of the most well-known probability paradoxes in mathematics.
Let me start with a little something about probability. Statistics (the subject of tomorrow's lecture) and probability make up one of the major strands of Common Core Math. The Integrated Math courses incorporate a little stats and probability into each course, but even the traditional pathway includes some probability -- according to the Common Core documents, probability should be included in the traditional Geometry course. Yet as we have seen so far, very few Geometry courses actually include probability, and it doesn't appear on the PARCC Geometry tests at all.
The probability paradoxes mentioned in Kung's lecture are so well-known that I'd rather just provide links to them than discuss them in full detail. Kung attributes the first of these to Martin Gardner, the 20th-century American whose name was synonymous with recreational math. For decades he wrote the "Mathematical Games" column in Scientific American. Kung says that he will mention Gardner several times throughout this course.
Three Prisoners Problem (Kung names the prisoners Abel, Bertrand, and Cantor. All three of these refer to mathematicians, and indeed I've mentioned two of them here on the blog. Abel was one of the two who proved that there is no quintic formula, and Cantor is associated with the middle thirds set, a special fractal. Kung mentions Bertrand himself later in today's lecture.)
Boy or Girl Paradox
But the best known probability paradox is the Monty Hall Paradox -- named after the Canadian host of the game show Let's Make a Deal. (He is still alive at 94 years of age.) Here's how Kung describes the problem:
Suppose that you're on a game show and you're given the choice of three doors. Behind one door is a car. Behind the others are goats, or something you don't want. You pick a door -- door number 1 -- and the host, who knows what's behind the doors, opens another door -- door number 3, which has a goat behind it. Then he asks, "Would you like to switch to door number 2?"
As it turns out, it's actually much better to switch. The paradox is that even though there are two doors left, these doors don't have an equal probability of 1/2. Instead, your door has only a 1/3 chance of hiding the car, and switching to the other door gives you a 2/3 chance of winning.
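The 1/3-versus-2/3 claim is easy to verify empirically. Here's a quick simulation sketch (the function name is mine):

```python
import random

def monty_hall(switch, trials=100_000):
    """Simulate the Monty Hall game; return the fraction of games won."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)        # door hiding the car
        pick = random.randrange(3)       # contestant's first choice
        # Monty opens a door that is neither the pick nor the car
        goats = [d for d in range(3) if d != pick and d != car]
        opened = random.choice(goats)
        if switch:
            # switch to the one remaining unopened door
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials
```

Running monty_hall(True) should come out around 0.667, while monty_hall(False) stays around 0.333.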
The Monty Hall Paradox was discussed by another famous columnist -- Marilyn vos Savant of Parade magazine, also known as the woman with the world's highest IQ (around 200). Unlike Gardner, who passed away a few years ago, vos Savant is still alive and still writes for Parade (in her most recent column, she explains why calendars start on Sunday). About 25 years ago, her column on the Monty Hall Paradox sparked a heated debate:
Monty Hall Paradox
One way to dramatize why it's much better to switch is to imagine that there are a thousand doors instead of three. Kung describes the situation where you choose door 816, and Monty reveals a goat behind every door except yours and door 142. There was only a one-in-1000, or 0.1%, chance that your original choice of 816 was correct, so switching now wins with probability 99.9%.
Here's a way to resolve the Monty Hall Paradox: once you make your choice, nothing Monty does can raise your probability of winning unless something could also have happened to lower it. For example, suppose the problem is changed so that after you choose, Monty sometimes opens the door hiding the car rather than a goat. If he does, your chances of winning drop to zero -- and in that version, the appearance of a goat really would raise your chances to 1/2. But in the original problem, Monty knows where the car is and always reveals a goat. Since the probability can never go down, it can never go up either: you start with a 1/3 chance of winning, and that probability remains 1/3 throughout the game.
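We can check the careless-Monty variant by simulation: have Monty open one of the other two doors at random, and condition on the games where he happens to reveal a goat (function name mine):

```python
import random

def careless_monty(trials=200_000):
    """Variant: Monty opens one of the two other doors at random, sometimes
    revealing the car; those games are discarded. Conditioned on a goat
    appearing, switching should win only about half the time."""
    switch_wins = goat_shown = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        opened = random.choice([d for d in range(3) if d != pick])
        if opened == car:
            continue  # Monty accidentally revealed the car
        goat_shown += 1
        other = next(d for d in range(3) if d not in (pick, opened))
        switch_wins += (other == car)
    return switch_wins / goat_shown
```

Here careless_monty() comes out near 0.5, in contrast to the 2/3 of the standard game -- the difference is entirely in whether Monty's goat was guaranteed.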
There was a similar game show, Deal or No Deal, in which a million dollars is hidden inside one of twenty-six cases, so you start with a 1/26 chance of winning. Throughout the game, you choose cases to open, and when only two cases remain -- one of them containing the million -- the host, Howie Mandel, asks whether you would like to switch cases. So what is the probability of winning now? The real question is whether it's possible at any stage of the game for the winning probability to go down. The answer is a resounding yes -- the player is the one who chooses which 24 cases to open, so it's possible that one of them contains the million, reducing the chances of winning to zero. The fact that this hasn't happened means that the winning probability really has gone up to 1/2.
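A simulation of the Deal or No Deal endgame confirms this (function name mine): condition on the games where the player's 24 random openings happen to miss the million, and the player's own case holds it about half the time.

```python
import random

def deal_or_no_deal(trials=200_000):
    """Player keeps one of 26 cases and opens 24 of the rest at random.
    Among games where the million never appeared, return the fraction
    where the player's own case holds it."""
    wins_own = survived = 0
    for _ in range(trials):
        million = random.randrange(26)
        own = random.randrange(26)
        others = [c for c in range(26) if c != own]
        random.shuffle(others)
        opened = others[:24]          # the 24 cases the player opens
        if million in opened:
            continue  # million revealed early; no switching dilemma
        survived += 1
        wins_own += (own == million)
    return wins_own / survived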
But notice that even though the winning probability is 1/2 -- so it makes no difference whether you switch cases or not -- almost no one ever switches. I believe the reason is psychological: players feel much worse giving away a million they once held than never having held it at all. The same psychology compounds the original Monty Hall problem -- even if the probability really did change to 1/2, players would be so afraid of choosing the car and then giving it away that they'd likely avoid switching. And since the true probability of winning without switching is only 1/3, the game show ends up giving away only half as many cars as it would if players followed the perfect strategy of always switching.
But we observe that the original Monty Hall problem no longer applies to the modern version of Let's Make a Deal, since the show no longer uses animals as Zonks. The subchannel BUZZR TV airs classic game shows, including episodes from the classic Monty Hall era -- I actually watched one of those episodes last night.
One thing I noticed about the real Monty Hall show is that the three doors only appear at the end of the show -- when it's time for the Big Deal. At that point there are no goats or other Zonks, and the players are never offered a chance to switch doors -- the opportunity to switch (usually to give up a prize behind a curtain for cash) appears only in smaller deals well before the Big Deal is played. On the particular episode I watched, everyone made the correct switch decision except for a lady near the end of the show who kept Monty's Cash Box, which contained $160, instead of a furniture package behind the curtain worth about ten times as much.
On the modern version of Let's Make a Deal, there is a sub-game in which the Monty Hall Paradox actually is relevant. The game is called "Three of a Kind." There are six cards -- say three aces and three kings. You choose three cards, and if they all match, you win the car. Notice that there are 20 possible hands (six choose three, or 6 nCr 3) and only two of them win (the three aces and the three kings), so the probability of winning is only 2/20, or 10%. The current host, Wayne Brady, then reveals that two of your cards match (say they're aces) and that two of the cards you didn't choose also match (kings), and offers you the opportunity to switch to a smaller prize. Notice that at least two of your chosen cards must match no matter what, which means the reveal can never make the winning probability go down -- and therefore it can never go up either. That last card has only a 10% chance of being an ace and a 90% chance of being a king -- but of course, players don't take the smaller prize anywhere near 90% of the time.
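The 10% figure is a simple counting argument, and a quick simulation sketch confirms it (function name mine):

```python
import random
from math import comb

def three_of_a_kind(trials=100_000):
    """Deal yourself 3 of the 6 cards (3 aces, 3 kings) at random;
    return the fraction of hands where all three match."""
    deck = ["A", "A", "A", "K", "K", "K"]
    wins = 0
    for _ in range(trials):
        hand = random.sample(deck, 3)
        wins += (hand[0] == hand[1] == hand[2])
    return wins / trials

# Exact count: 2 winning hands (AAA, KKK) out of C(6,3) = 20 hands.
exact = 2 / comb(6, 3)   # 0.1
```

Both the simulation and the exact count land on 2/20 = 10%, and since the reveal is guaranteed to happen, it carries no information that could move that number.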
The Quick Conundrum for today is about metronomes -- place three of them on a platform that is free to move, and the vibrations they send one another through the platform gradually bring them into sync. The final paradox for the day is called Bertrand's Paradox (yes, that's the Bertrand I mentioned earlier). It's actually a great paradox to use in a Geometry class, at some point after area is taught:
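Bertrand's Paradox asks for the probability that a random chord of a circle is longer than a side of the inscribed equilateral triangle -- and the paradox is that the answer depends on how "random chord" is defined. Here's a simulation sketch of the three classic chord-picking methods (method names are mine):

```python
import math
import random

def bertrand(method, trials=100_000):
    """Estimate P(random chord of the unit circle is longer than sqrt(3),
    the side of the inscribed equilateral triangle)."""
    longer = 0
    for _ in range(trials):
        if method == "endpoints":
            # chord between two uniform random points on the circle
            a, b = random.uniform(0, 2 * math.pi), random.uniform(0, 2 * math.pi)
            length = 2 * abs(math.sin((a - b) / 2))
        elif method == "radial":
            # chord at a uniform random distance from the center
            d = random.uniform(0, 1)
            length = 2 * math.sqrt(1 - d * d)
        else:  # "midpoint": chord with uniform random midpoint in the disk
            x, y = random.uniform(-1, 1), random.uniform(-1, 1)
            while x * x + y * y > 1:
                x, y = random.uniform(-1, 1), random.uniform(-1, 1)
            length = 2 * math.sqrt(1 - (x * x + y * y))
        longer += (length > math.sqrt(3))
    return longer / trials
```

The three methods converge to three different answers -- about 1/3, 1/2, and 1/4 respectively -- even though each one sounds like a perfectly fair way to pick a "random" chord.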
And now we move on to Lesson 1-9, the Triangle Inequality. This is what I wrote last year:
Now finally we can prove the big one, the Triangle Inequality. This proof comes from Dr. M -- but Dr. M writes that his proof goes all the way back to Euclid. Here is the proof from Euclid, where he gives it as his Proposition I.20:
Here is the two-column proof as given by Dr. M. His proof has eight steps, but I decided to add two more steps near the beginning. Step 1 is the Given, and Step 2 involves extending a line segment, so that it's similar to Step 2 of the Unequal Sides proof. Indeed, the proofs of Unequal Sides and the Triangle Inequality are similar in several aspects:
Triangle Inequality Theorem:
The sum of the lengths of two sides of any triangle is greater than the length of the third side.
Given: Triangle ABC
Prove: AC + BC > AB
1. Triangle ABC                               1. Given
2. Identify point D on ray BC with CD = AC    2. On a ray, there is exactly one point at a given distance from an endpoint.
3. angle CAD = angle CDA                      3. Isosceles Triangle Theorem
4. angle BAD = angle BAC + angle CAD          4. Angle Addition Postulate
5. angle BAD > angle CAD                      5. Equation to Inequality Property
6. angle BAD > angle CDA                      6. Substitution (step 3 into step 5)
7. BD > AB                                    7. Unequal Angles Theorem
8. BD = BC + CD                               8. Betweenness Theorem (Segment Addition)
9. BD = BC + AC                               9. Substitution (step 2 into step 8)
10. BC + AC > AB                              10. Substitution (step 9 into step 7)
To help my student out back when I was tutoring, I also included another indirect proof in the exercises. We are given a triangle with two sides of lengths 9 cm and 20 cm, and we are asked whether the 9 cm side must be the shortest side. So we assume that it isn't the shortest side -- that is, that the third side is even shorter than 9 cm. Then the sum of the two shorter sides is less than 9 + 9, or 18 cm, and so by the Triangle Inequality, the longest side must be shorter than 18 cm. But this contradicts the fact that it is 20 cm long. Therefore the shortest side must be the 9 cm side. QED
Notice that the U of Chicago text probably expects an informal reason from the students. A full indirect proof can't be given because this question comes from Lesson 1-9, while indirect proofs aren't given until Chapter 13.
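As an aside, the Triangle Inequality test itself is easy to express in code. Here's a minimal sketch (function name mine) that checks all three pairings of sides:

```python
def is_triangle(a, b, c):
    """Can segments of lengths a, b, c form a nondegenerate triangle?
    The Triangle Inequality must hold for all three pairs of sides."""
    return a + b > c and b + c > a and a + c > b
```

For the exercise's triangle: is_triangle(9, 20, 8) is False (since 9 + 8 = 17 < 20, matching the contradiction in the indirect proof), while is_triangle(9, 20, 12) is True.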
Let's conclude today's post with another paradox from Kung. If you were born on an odd day of the month, flip a coin 200 times and record the results. If you were born on an even day, write down 200 fake coin flips instead. Here's how we can tell whether a list is real: if it contains no run of six heads or six tails in a row, it was probably faked by someone born on an even day -- real sequences of 200 flips almost always contain such a run, while fakers instinctively avoid long streaks.
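A quick simulation (function names mine) shows just how often a run of six or more actually turns up in 200 fair flips:

```python
import random

def longest_run(flips):
    """Length of the longest run of identical outcomes in a sequence."""
    best = cur = 1
    for prev, nxt in zip(flips, flips[1:]):
        cur = cur + 1 if nxt == prev else 1
        best = max(best, cur)
    return best

def prob_run_of_six(trials=10_000, n=200):
    """Estimate the probability that n fair flips contain a run of >= 6."""
    hits = sum(longest_run([random.randrange(2) for _ in range(n)]) >= 6
               for _ in range(trials))
    return hits / trials
```

The estimate comes out well above 90% -- so a submitted sequence with no run of six is strong evidence of a fake.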