A script that visualizes basic probability theory concepts.
julia -e "include(\"ProbTheory.jl\");
LLN();
CLT();
BE_binom_heatmap(tight = false);
BE_binom_heatmap(tight = true);
BE_binom_slice()"
Law of Large Numbers (n = 600) [Ref]
This is sampling from a Uniform(-100000, 100000), so the true mean is 0 and the running sample mean should settle near 0 as n grows.
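A minimal sketch of how such a running-mean plot could be produced, assuming Distributions.jl and Plots.jl; the function name `lln_sketch` is illustrative and not necessarily how `LLN()` is implemented in ProbTheory.jl:

```julia
using Distributions, Plots, Random

# Illustrative sketch: running sample mean of Uniform(-100000, 100000) draws.
# By the Law of Large Numbers the running mean should settle near the true mean (0).
function lln_sketch(n::Int = 600; seed::Int = 42)
    Random.seed!(seed)
    d = Uniform(-100_000, 100_000)
    xs = rand(d, n)
    running_means = cumsum(xs) ./ (1:n)
    plot(1:n, running_means, label = "running mean", xlabel = "n", ylabel = "mean")
    hline!([mean(d)], label = "true mean (0)", linestyle = :dash)
end

# lln_sketch(600) mirrors the n = 600 setting shown above.
```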
Berry-Esseen Theorem - Binomial Case [Ref (Thm 1)]
Theorem 1 (paraphrased here; see the reference for the verbatim statement and the exact constants):

Let $X_1, \dots, X_n$ be i.i.d. Bernoulli($p$) with $q = 1 - p$, let $S_n = \sum_{i=1}^n X_i \sim \mathrm{Binomial}(n, p)$, and let $F_n$ denote the CDF of the standardized sum $(S_n - np)/\sqrt{npq}$. Then

$$\sup_{x \in \mathbb{R}} \left| F_n(x) - \Phi(x) \right| \le C \cdot \frac{p^2 + q^2}{\sqrt{npq}}$$

for an explicit constant $C$. In the case $p \in [1/3, 2/3]$ (the middle third), the theorem gives a second, tighter bound.
This is a specific version of the Berry-Esseen Theorem, which states that the distribution of the standardized sample mean of i.i.d. random variables with a finite third absolute moment converges to the standard normal at a rate of order $1/\sqrt{n}$.
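In its classical i.i.d. form (with an absolute constant $C$): if $X_1, \dots, X_n$ are i.i.d. with mean $\mu$, variance $\sigma^2 > 0$, and finite third absolute moment $\rho = \mathbb{E}|X_1 - \mu|^3$, and $F_n$ is the CDF of the standardized sample mean, then

$$\sup_{x \in \mathbb{R}} \left| F_n(x) - \Phi(x) \right| \le \frac{C\,\rho}{\sigma^3 \sqrt{n}}.$$

For Bernoulli($p$) variables, $\rho/\sigma^3 = (p^2 + q^2)/\sqrt{p(1-p)}$ with $q = 1 - p$, which is where the binomial-specific ratio in the bound above comes from.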
This Binomial variant gives two bounds: the first holds for all values of p, while the second is tighter but applies only when p lies in the middle third (roughly p ∈ [1/3, 2/3]).
The next two plots depict the "error" term between the Binomial and the Standard Normal, i.e. how much slack the bound leaves over the observed maximum CDF difference. Essentially, they numerically back Schulz's Theorem.
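To make the quantity being plotted concrete, here is a minimal sketch of computing the observed Kolmogorov distance between a standardized Binomial(n, p) and the standard normal, assuming Distributions.jl; the name `kolmogorov_distance` and this exact formulation are illustrative and not necessarily what the heatmap functions in ProbTheory.jl do:

```julia
using Distributions

# Observed sup-distance between the CDF of the standardized Binomial(n, p)
# and the standard normal CDF. The binomial CDF is a step function, so the
# supremum is attained at (or just before) one of the jump points k = 0..n.
function kolmogorov_distance(n::Int, p::Float64)
    μ, σ = n * p, sqrt(n * p * (1 - p))
    binom, normal = Binomial(n, p), Normal()
    worst = 0.0
    for k in 0:n
        z = (k - μ) / σ
        upper = abs(cdf(binom, k) - cdf(normal, z))      # value at the jump
        lower = abs(cdf(binom, k - 1) - cdf(normal, z))  # left limit at the jump
        worst = max(worst, upper, lower)
    end
    return worst
end
```

Evaluating this over a grid of p values (for fixed n) and subtracting it from the corresponding bound yields the heatmaps and the slice shown below.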
Binomial BE Heatmap (n = 500, tight = false)
Binomial BE Heatmap (n = 500, tight = true)
Binomial BE Slice (n = 20, tight = true)
We can also check that none of the tight errors are negative in the middle third, while some are in the outer two thirds:
julia -e "include(\"ProbTheory.jl\"); BE_binom_bounds(1000)"
For n = 1000 on p = [0.001, 0.999], there are 183 negative differences.
For n = 1000 on p = [0.333, 0.666], there are 0 negative differences.
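A sketch of how such a count could be produced, reusing the `kolmogorov_distance` sketch above; the bound function is left as a hypothetical argument since its exact constants come from the referenced theorem, and this is not necessarily how `BE_binom_bounds` is implemented:

```julia
# Count how many p values in the grid give a negative difference
# bound(n, p) - observed distance; zero negatives on the middle third
# matches the output reported above. Requires kolmogorov_distance from
# the earlier sketch; `tight_bound` is a stand-in for the theorem's bound.
function count_negative_differences(n::Int, ps, tight_bound)
    count(p -> tight_bound(n, p) - kolmogorov_distance(n, float(p)) < 0, ps)
end

# Hypothetical usage:
# count_negative_differences(1000, 0.001:0.001:0.999, my_tight_bound)
# count_negative_differences(1000, 0.333:0.001:0.666, my_tight_bound)
```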
Using a Uniform Distribution: