Time In

Start the clock

A modest proposal for improving football: the ‘time-in’

If you’ve ever noticed that football games slow to a predictable crawl at the end of each half, the time-in is the rule for you. The idea is simple: When the clock is stopped, for whatever reason, a coach could call a “time-in,” and force the clock to start up again. Think of it as the antimatter version of the timeout.

The time-in is so powerful that I recommend it be strictly rationed: each team would get only one time-in per season. The possibility of a sudden time-in would loom large in every coach’s mind at the most tense points in the game, introducing just enough concern and uncertainty to make the game different. Timeworn clock-management strategies would no longer be a given. And yet, for the average viewer on a Sunday, the game on the field would still be your father’s football.

Of course, this assumes that the time-in is used in that game. If it hasn’t been used yet, it affects the game in a different, subtler way: the opposing team will simply have to assume that it might be. Coaches would enter the realm of game theory: how do we calculate which game is the best one in which to use it? And what if the other team is expecting us to think this way?

Mistakes? What Were the Odds of That?

What’s luck got to do with it? The maths of gambling

He wasn’t on a lucky streak, he was using his knowledge of mathematics to understand, and beat, the odds.

“Beat the odds” isn’t quite as bad as “defies the laws of physics,” I think. But exploiting knowledge of the odds to win isn’t beating the odds. Beating the odds is winning when you shouldn’t — drawing to an inside straight and hitting it to win a hand is beating the odds. Exploiting the situation to make the odds go in your favor — making it so you should win more than you lose — is not.

A spin of the roulette wheel is just like the toss of a coin. Each spin is independent, with a 50:50 chance of the ball landing on black or red.

Well, no. A roulette wheel has 37 or 38 slots, depending on where you play, with 36 of them being black or red. The others are green — 0 and 00 (Europe has one, the US has both. Sort of.) That’s why the house makes a profit offering “even money” on black or red bets on a US wheel; the probability of winning is slightly less than 50%. (They also make money on the single-number payouts, at 35:1.) All of the bets on a 00 roulette wheel have a house advantage of at least 5.26%; single-0 wheels have a smaller house advantage, but they also seem to be correlated with higher-stakes limits. The previous link also has a section debunking the “doubling down” method for roulette. Winning at roulette truly is “beating the odds,” since the house always has an advantage.
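To put numbers on that house advantage, here’s a quick expected-value check (a back-of-the-envelope sketch, not tied to any particular casino’s rules):

```python
# House edge on a double-zero (US) wheel: 38 slots, 18 red, 18 black, 2 green.
from fractions import Fraction

# Even-money bet on red: win 1 unit with probability 18/38, lose 1 with 20/38.
even_money_ev = Fraction(18, 38) * 1 + Fraction(20, 38) * (-1)
print(even_money_ev, float(even_money_ev))    # -1/19, about -5.26%

# Single-number bet paying 35:1: win 35 with probability 1/38, lose 1 otherwise.
straight_up_ev = Fraction(1, 38) * 35 + Fraction(37, 38) * (-1)
print(straight_up_ev, float(straight_up_ev))  # also -1/19, about -5.26%

# Same even-money bet on a single-zero (European) wheel: 37 slots.
euro_ev = Fraction(18, 37) * 1 + Fraction(19, 37) * (-1)
print(euro_ev, float(euro_ev))                # -1/37, about -2.70%
```

Nearly every bet on the 00 wheel works out to that same -1/19; the five-number bet is a bit worse, which is why the advantage above is quoted as “at least” 5.26%.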

So please don’t follow the advice here. But note this:

For what it is worth, the sum of all the numbers in roulette is 666.

Think Ahead

Does closing roads cut delays?

Yes, because people do the wrong analysis.

The authors give a simple example of how this could play out: Imagine two routes to a destination, a short but narrow bridge and a longer but wider highway. Let’s also imagine that the combined travel times of all the drivers is shortest if half take the bridge and half take the highway. But because each driver is selfishly trying to seek the shortest route for himself, this doesn’t happen. At first, everyone will go for the bridge because it’s shorter. But then, as the bridge becomes backed up, more drivers start taking the highway, until the congestion on the bridge starts to clear up. At that point more drivers go back to the bridge, which then becomes backed up again. Eventually, the traffic flow settles into what’s called the Nash equilibrium (named for the beautifully minded mathematician), in which each route takes the same amount of time. But in this equilibrium the travel time is actually longer than the average time it would take if half of the drivers took each route.

Note that this still happens even if – indeed, especially if – all the drivers have perfect information about what all the other drivers are doing, such as with a GPS that gives real-time traffic updates.
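To put rough numbers on the quoted bridge-and-highway example, here’s a toy calculation with made-up cost functions (my own, chosen for simplicity): the narrow bridge’s travel time grows with the fraction of drivers on it, while the wide highway takes a fixed hour regardless.

```python
# Toy version of the quoted example, with made-up cost functions.  A fraction x
# of the drivers take the narrow bridge; the rest take the wide highway.
def bridge_time(x):     # narrow bridge: travel time grows with the load on it
    return x            # hours

def highway_time(x):    # wide highway: effectively immune to congestion
    return 1.0          # hours

def average_time(x):    # average trip time over all drivers for a given split
    return x * bridge_time(x) + (1 - x) * highway_time(x)

# Coordinated optimum: scan for the split that minimizes the average trip time.
best_x = min((i / 1000 for i in range(1001)), key=average_time)
print(best_x, average_time(best_x))     # 0.5 and 0.75 hours

# Selfish (Nash) equilibrium: drivers pile onto the bridge until it is no faster
# than the highway, which here happens only when everyone is on it (x = 1).
print(average_time(1.0))                # 1.0 hours, longer than the 0.75 average
```

In the selfish equilibrium everyone ends up spending the full hour, while the coordinated 50/50 split would average 45 minutes, which is the quoted point about the equilibrium being slower than the best achievable average.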

The problem here is similar to one of feedback, as anyone who has designed and tested gain/feedback circuitry can attest. There is an oscillation to the signal — an ebb and flow of traffic density. There is a delay between the signal and the feedback, and at some point the feedback ends up 180° out of phase with the signal, so it adds to the problem rather than subtracting from it.

Note that the last quoted sentence is actually incorrect — the real-time traffic update information tells you where traffic is, not where it is going. If you knew that a lot of drivers were heading to the bridge and would be there in 15 minutes — about when you would arrive — you wouldn’t take the bridge. But all you know is how many are on the bridge right now. The information you are missing is how many drivers have made the decision to use the bridge.
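The feedback analogy is easy to demonstrate with the same toy network, tweaked so the highway is a bit faster and the equilibrium is an interior split of 75% on the bridge. In this sketch (again my own invention, not the article’s model), drivers drift toward whichever route looked faster, but the congestion reading they react to is a couple of time steps stale:

```python
# Same bridge/highway toy, but with a slightly faster highway (0.75 hours) so
# the equilibrium split is 75% of drivers on the bridge.  Drivers adjust toward
# whichever route *looked* faster, based on a reading that is `lag` steps old.
def simulate(gain=0.8, lag=2, steps=25):
    history = [0.9] * (lag + 1)          # start with 90% of drivers on the bridge
    for _ in range(steps):
        stale_x = history[-1 - lag]      # the out-of-date reading drivers react to
        bridge = stale_x                 # bridge time ~ load on it
        highway = 0.75                   # fixed highway time
        x = history[-1] + gain * (highway - bridge)   # drift toward the "faster" route
        history.append(min(1.0, max(0.0, x)))         # a fraction stays in [0, 1]
    return history

print(["%.2f" % x for x in simulate()])          # overshoots 0.75 and keeps swinging
print(["%.2f" % x for x in simulate(lag=0)])     # no delay: settles smoothly onto 0.75
```

With the stale reading the split overshoots the 0.75 equilibrium and keeps swinging; with lag=0 the very same adjustment settles smoothly. The delay, not the reaction itself, produces the oscillation.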

Death to Exponential Growth

Your body wasn’t built to last: a lesson from human mortality rates

What do you think are the odds that you will die during the next year? Try to put a number to it — 1 in 100? 1 in 10,000? Whatever it is, it will be twice as large 8 years from now.

This startling fact was first noticed by the British actuary Benjamin Gompertz in 1825 and is now called the “Gompertz Law of human mortality.” Your probability of dying during a given year doubles every 8 years. For me, a 25-year-old American, the probability of dying during the next year is a fairly miniscule 0.03% — about 1 in 3,000. When I’m 33 it will be about 1 in 1,500, when I’m 42 it will be about 1 in 750, and so on. By the time I reach age 100 (and I do plan on it) the probability of living to 101 will only be about 50%. This is seriously fast growth — my mortality rate is increasing exponentially with age.
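The quoted rule is just an exponential in disguise, and it’s easy to tabulate from the stated starting point (about 1 in 3,000 at age 25, doubling every 8 years); the ages in this little sketch are only illustrative:

```python
def mortality_rate(age, q0=1/3000, age0=25, doubling_years=8):
    """The quoted rule of thumb: the yearly risk doubles every 8 years."""
    return q0 * 2 ** ((age - age0) / doubling_years)

for age in (25, 33, 41, 49, 65, 81):
    q = mortality_rate(age)
    print(age, "roughly 1 in", round(1 / q))
# 25: 1 in 3000, 33: 1 in 1500, 41: 1 in 750, 49: 1 in 375, 65: 1 in 94, 81: 1 in 23
```

A hazard that grows exponentially like this is what makes the survival curve fall off so sharply at the high end.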

Cause and Effect

Dean Dad asks

Why do so many states require only two years of math in high school?

[…]

We have anecdotal evidence that suggests that students who actually take math for all four years of high school do better in math here than those who don’t. We also have anecdotal evidence that bears crap in the woods. Why the hell do the high schools only require two years of math?

And there is follow-up at Uncertain Principles

There is a lot of discussion, so I may have missed someone raising the following point:

People who take four years of math and do well are probably good at math. Whatever the distribution of students who took math for two years, I’d bet it’s not the same as the distribution of those who took it for four. I’ll bet the players who go out for (pick your sport) do better at that sport in gym class than those who don’t, because you tend not to pursue and enjoy an activity if you suck at it.
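A quick toy simulation makes the selection effect concrete. Every number in it is invented for illustration: aptitude is a made-up score, the decision to take four years depends only on aptitude, and the final test score has no course-taking term at all, yet the four-year group still comes out ahead:

```python
import math
import random

random.seed(1)

# Invented numbers, purely to illustrate the selection effect.
students = []
for _ in range(100_000):
    aptitude = random.gauss(0, 1)                       # underlying ability
    p_four = 1 / (1 + math.exp(-2 * aptitude))          # stronger students opt in more often
    takes_four_years = random.random() < p_four
    score = 500 + 100 * aptitude + random.gauss(0, 50)  # note: no term for course-taking
    students.append((takes_four_years, score))

four_year = [s for took, s in students if took]
two_year = [s for took, s in students if not took]
print(round(sum(four_year) / len(four_year)))   # noticeably higher than...
print(round(sum(two_year) / len(two_year)))     # ...this, with zero causal effect built in
```

The gap comes entirely from who chooses to take the classes, which is the point about the two distributions not being the same.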

The discussion seems to deal more with the other reasons why schools don’t require four years of math. I can set those aside for a moment and assume an ideal case, one not limited by the availability of teachers or by bureaucracy. Even then, the proposed solution embedded in the rhetorical question is not, to me, the head-slap-obvious conclusion.

Giving the Devil His Due

I read The Devil Is in the Digits, an analysis of the Iranian voting results, and something doesn’t feel quite right about it. (And it’s not that the two political science students who wrote it are being touted as mathematicians in some of the blogs linking to the story.) Disclaimer: there seem to be lots of reasons to question the vote. I’m not addressing anything but the rigor of this analysis.

Now, I could be wrong about this, because anything past basic probability gives me trouble — I’m not particularly skilled (my lowest math grades were on probability exams. What are the odds of that?), and those feeble skills have atrophied for most anything beyond simple dice-rolling and poker calculations.

But I do recall that when you multiply probabilities together, it needs to be for independent events. And I question what’s going on here.

We find too many 7s and not enough 5s in the last digit. We expect each digit (0, 1, 2, and so on) to appear at the end of 10 percent of the vote counts. But in Iran’s provincial results, the digit 7 appears 17 percent of the time, and only 4 percent of the results end in the number 5. Two such departures from the average — a spike of 17 percent or more in one digit and a drop to 4 percent or less in another — are extremely unlikely. Fewer than four in a hundred non-fraudulent elections would produce such numbers.

OK, the premise seems fine. You expect each digit to show up 10% of the time, but you can deviate from that and still have a random distribution. But the digit counts are not independent of one another — if you have too many 7s, you must have fewer of the other digits! So what I want to know is how they arrived at the four-in-a-hundred result.

Let me illustrate with an example that’s easier to see, and one I can work through: coin tosses. If you toss a coin twice, there are three outcomes: two heads (25% of the time), a head and a tail (50%), and two tails (25%). So while the expected, average result is one head, it only happens half the time — a result of either two heads or two tails isn’t evidence of anything fishy; we don’t have enough trials. But here’s the biggie: what is the probability of getting two heads and no tails? It’s still 25%, because “two heads” and “no tails” are not independent results. They have the maximum amount of correlation you can get, and since they aren’t independent, you wouldn’t multiply the probabilities together to find the answer.

I found an analysis someone did using random numbers, and their simulation gives the odds of a digit appearing 5 or fewer times as about 20%, and of one appearing more than 20 times as about 11%. But the odds of both shouldn’t simply be the product of the two, because the results are correlated in some fashion that’s more involved than the coin-tossing example.

So I wonder how they arrived at 4%. It’s not at all clear.
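One way to get the joint probability without assuming independence is to simulate the whole thing. Here’s a sketch; the sample size is my assumption (116 vote counts, as in 29 provinces times 4 candidates), since the excerpt doesn’t state it:

```python
import random

random.seed(0)

N = 116           # assumed number of vote counts; my guess, not stated in the excerpt
TRIALS = 100_000

hits = 0
for _ in range(TRIALS):
    counts = [0] * 10
    for _ in range(N):
        counts[random.randrange(10)] += 1
    # the joint event from the column: some digit at 17% or more of the counts
    # AND some digit at 4% or less
    if max(counts) >= 0.17 * N and min(counts) <= 0.04 * N:
        hits += 1

print(hits / TRIALS)
```

Whatever that prints is the chance of seeing both deviations together in honest, uniform digits, with the negative correlation between the counts automatically included, and that is the quantity the “fewer than four in a hundred” claim needs.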

The second part of their analysis looks at the last two digits, and whether they are adjacent (or identical) numbers or not, e.g. 54 (adjacent) vs. 59 (not).

To check for deviations of this type, we examined the pairs of last and second-to-last digits in Iran’s vote counts. On average, if the results had not been manipulated, 70 percent of these pairs should consist of distinct, non-adjacent digits.

Not so in the data from Iran: Only 62 percent of the pairs contain non-adjacent digits.

Aha! They assume that the digits are uniformly distributed, and we know the last digits are not; I didn’t see any mention of the second-to-last digit. So one has to wonder whether this analysis holds. I can certainly think of some examples where it fails: say the second-to-last digits are all 5, and the last digits are all 4, 5 or 6. In that unlikely result, there would be zero pairs that were non-adjacent, rather than 70%. So I have to wonder how far the assumption holds and how badly it fails. And if these odds depend on the distribution, then the digits and the pairings are not independent of each other, so multiplying the probabilities won’t give the right answer.
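Both the 70% baseline and the extreme counterexample are easy to check directly. In this sketch I’m guessing at the adjacency convention: counting 0 and 9 as adjacent gives exactly 70%, and without that wraparound it comes out at 72%.

```python
from itertools import product

def non_adjacent_fraction(tens_weights, ones_weights):
    """Chance the last two digits are distinct and non-adjacent, for given
    (independent) digit distributions.  0 and 9 count as adjacent here."""
    total = 0.0
    for (t, wt), (o, wo) in product(enumerate(tens_weights), enumerate(ones_weights)):
        adjacent_or_same = min(abs(t - o), 10 - abs(t - o)) <= 1
        if not adjacent_or_same:
            total += wt * wo
    return total

uniform = [0.1] * 10
print(non_adjacent_fraction(uniform, uniform))      # 0.70

# The extreme case from the text: second-to-last digit always 5, last digit
# always 4, 5 or 6: nothing in sight is non-adjacent.
tens = [0, 0, 0, 0, 0, 1, 0, 0, 0, 0]
ones = [0, 0, 0, 0, 1/3, 1/3, 1/3, 0, 0, 0]
print(non_adjacent_fraction(tens, ones))            # 0.0
```

Under that wraparound convention, skewing only the last digit while the second-to-last stays uniform doesn’t move the 70% figure at all; what matters is how the two digits vary together, which is exactly the independence question here.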

That’s what my gut and some basic probability math, dredged up from the recesses of my brain, tell me. Perhaps someone who does math for a living can confirm that I’m right, or tell me that I’m wrong and should stick to my day job. (Or that I’m right and I should still stick to my day job.)

Mathworld

Some random thoughts on coordinate systems, math-y terminology and the real world.

I went into Bed, Bath & Beyond the other day and wondered, “What’s beyond Bed, Bath & Beyond?” The name does not specify how far beyond bed and bath it goes.

As you drive through northern Pennsylvania there’s a sign that reads “Endless Mountains, next 6 exits.” What? Only 6? They’re endless, right? It must be that they’re endless in some other dimension, and I’m just going through with some perpendicular component.

Back when I lived in Orlando, there was a novelty/gift store called The Infinite Mushroom. One day I stopped off and there was a sign on the door explaining that they had moved. But if they were infinite, how could you tell? To make matters even more confusing, they later expanded. Obviously they could not be infinite in all dimensions. I started to refer to them as the semi-infinite mushroom after that.

Curse You, Nonlinearity!

Too complex to exist

Complexity and connectivity, and the problems they can cause. And the notion that “too big to fail” might mean “too big to be allowed to exist.”

Much like the power grid, the financial system is a series of complex, interlocking contingencies. And in such a system, the biggest risk of all – that the system as a whole might fail – is not related in any simple way to the risk profiles of its individual parts. Like a downed tree, the failure of one part of the system can trigger an unpredictable cascade that can propagate throughout the entire system.

[…]

[W]e tend to overlook a fact that should be obvious – that once everything is connected, problems can spread as easily as solutions, sometimes more so. Thanks to globally connected transportation systems, epidemics of disease like SARS, avian influenza, and swine flu can spread farther and faster than ever before. Thanks to the Internet, e-mail viruses, nasty rumors, and embarrassing truths can spread to colleagues, loved ones, or even around the world before remedial action can be taken to stop them. And thanks to globally connected financial markets, a drop in real-estate prices in California can hurt the retirement benefits of civil servants in the UK.
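A stripped-down cascade model shows how the failure of one part can take down a system whose individual parts all look safe. This is my own toy, not anything from the article: a line of transmission links, each with 50% headroom over its normal load, where a failed link dumps its load onto the next surviving one.

```python
# A minimal cascade: a line of N links, each carrying 1 unit of load and each
# rated for 1.5 units, i.e. every link individually has 50% headroom.
# When a link fails, its load is dumped onto the next surviving link.
N = 20
load = [1.0] * N
capacity = [1.5] * N
alive = [True] * N

alive[0] = False              # the "downed tree": link 0 fails
overflow = load[0]            # its load has to go somewhere

while overflow > 0:
    nxt = next((i for i in range(N) if alive[i]), None)
    if nxt is None:           # nothing left standing
        break
    load[nxt] += overflow
    overflow = 0
    if load[nxt] > capacity[nxt]:
        alive[nxt] = False    # this link is now overloaded and trips too
        overflow = load[nxt]  # ...and now *its* load is stranded as well

print(sum(alive), "of", N, "links survive")   # prints: 0 of 20 links survive
```

Every link is individually conservative, yet the collective outcome is a total blackout, which is the sense in which the risk of the whole isn’t related in any simple way to the risk profiles of the parts.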