# Rearranging the Law of Large Numbers

The Law of Large Numbers is completely meaningless to me when it is phrased as "the probability that the sum of results, divided by the number of independent samples, is close to the expected value gets very high as the number of independent samples goes to infinity": mu ≈ (sum of ys)/n, in which mu is the expected value, the ys are the results of independent samples, and n is the number of samples. (Strictly, the sample mean is almost never exactly equal to mu; the theorem says it lands within any tolerance you pick, with probability approaching one.) Why should anything converge to the expected value?

But it is very meaningful to me when it is rephrased as "the probability that the expected value times the number of independent samples is close to the sum of the results gets very high as the number of independent samples goes to infinity": mu*n ≈ sum of ys.
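You can watch this happen in a quick simulation. The sketch below (my own illustration, not from any particular source) rolls a fair six-sided die, whose expected value is 3.5, and prints the sample mean for growing sample sizes; it should drift toward 3.5 as n grows.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

MU = 3.5  # expected value of a fair six-sided die: (1+2+3+4+5+6)/6

for n in [10, 1_000, 100_000]:
    ys = [random.randint(1, 6) for _ in range(n)]  # n independent rolls
    mean = sum(ys) / n
    print(f"n = {n:>7}: sample mean = {mean:.4f}  (mu = {MU})")
```

The small-n means bounce around; the large-n means hug 3.5. Equivalently, sum(ys) hugs 3.5 * n, which is the rearranged form above.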

Yes, as the number of samples gets high, you know better and better exactly what your aggregate results will be.

As you multiply your expected value by larger and larger sample sizes, it goes from being totally fictitious and unhelpful to something completely real. If I get \$100 with a 50% chance and zero otherwise, it is meaningless to tell me that my expected value is \$50. I will never have \$50. But if you tell me that I will face this chance 1000 times, then I can tell you with great confidence that I will end up with close to \$50 times 1000, which is \$50,000.