
Originally Posted by Sinfix_15
Okay. He threw for 316 yards. Let's assume the yardage could have been anywhere from 0 to 1000. That makes the odds of throwing exactly 316 yards about 1 in 1000.
The average of 31.6 yards per play is just that same number divided by 10, so really you're saying he made 10 plays. Let's say he could have made anywhere from 0 to 100 plays; the odds of exactly that number of plays are 1 in 100.
And the TV ratings: let's assume they could land anywhere from 0 to 100, measured to three significant figures. So the odds of the rating coming out at exactly 31.6 are 1 in 1000.
So multiply those together:
1/1000 * 1/100 * 1/1000 = 1/100,000,000
1 in 100 million, clearly nothing interesting or out of the ordinary about that.
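The multiplication above can be checked in a few lines of Python. Note that the uniform ranges (0 to 1000 yards, 0 to 100 plays, a three-figure rating) are the post's own assumptions, not a real probability model:

```python
from fractions import Fraction

# Assumed uniform ranges from the post above:
p_yards = Fraction(1, 1000)    # 316 yards out of roughly 1000 possible totals
p_plays = Fraction(1, 100)     # 10 plays out of roughly 100 possible counts
p_rating = Fraction(1, 1000)   # a 31.6 rating out of 1000 three-figure values

# Multiplying the three "probabilities" together:
combined = p_yards * p_plays * p_rating
print(combined)  # 1/100000000, i.e. 1 in 100 million
```

Of course, multiplying them like this treats the three numbers as independent, which they aren't (the yards-per-play figure is literally derived from the yardage), so the real "odds" of the coincidence are nowhere near that small.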