From: OsherD on
From Osher Doctorow

The variance and standard deviation of a random variable (the former
is the square of the latter) are the usual measures of Uncertainty, as
in Schrödinger's "proof" of the HUP (Heisenberg's Uncertainty
Principle), which requires a few assumptions.

As a second derivation of the "Coincidental" nature of the HUP, notice
that variance is an "aggregated" or "mean" quantity since it is the
expectation of (X - E(X))^2 where E(X) is the mean of X. That is:

1) Var(X) = E[(X - E(X))^2] = E(X^2) - [E(X)]^2
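
As a quick numerical check of (1) (not part of the argument, just an
illustration), both forms of the variance can be estimated from a
large sample, here in Python with NumPy:

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.gamma(shape=2.0, scale=1.5, size=1_000_000)  # any distribution will do

    mean = x.mean()
    var_direct   = ((x - mean) ** 2).mean()     # E[(X - E(X))^2]
    var_shortcut = (x ** 2).mean() - mean ** 2  # E(X^2) - [E(X)]^2
    print(var_direct, var_shortcut)             # the two estimates agree up to sampling error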

However, the Non-Aggregated variables and measures are always more
accurate than the Aggregated ones, so consider the Probabilistic
analogs of Var(X):

2) P(a <= X <= b), where [a, b] is any proper subset of the range of
X, possibly unbounded (a and/or b can be -/+ infinity in the proper
order).

The HUP claims that:

3) Var(X)Var(Y) >= k > 0, where X is position, Y is momentum, and k is
a positive constant determined by h (Planck's constant), explicitly
k = (h/(4π))^2.

It is relatively easy to prove that the analogous claim for (2) is
false when X and Y are continuous random variables, since:

4) P(a <= X <= b) = P(X <= b) - P(X <= a) for X continuous

and similarly for Y. If we choose X >= 0 (for example, the Gamma or F
distributions) and take a = 0, so that F(a) = P(X <= 0) = 0, we get
from (4):

5) P(a <= X <= b) = P(X <= b) = F(b) (by definition)

But for X continuous, its cumulative distribution function F_X
(written F here) is continuous (if [a, b] is unbounded we can truncate
it to make it bounded, keeping the truncated distribution arbitrarily
close to the original one). Since F(a) = F(0) = 0 and F is continuous
at a, F takes arbitrarily small positive values just to the right of
a, so for any k > 0 there exists a d < b such that:

6) F(d) = P(a <= X <= d) < k, with k > 0 and a < d < b.
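
To see (6) concretely: for a Gamma-distributed X (so F(0) = 0), the
inverse CDF yields an explicit d with F(d) below any prescribed k. A
minimal sketch, again with SciPy:

    from scipy.stats import gamma

    shape = 2.0
    k = 1e-6                       # any positive bound, e.g. one built from Planck's constant
    d = gamma.ppf(k / 2, shape)    # choose d so that F(d) = k/2 < k
    print(d, gamma.cdf(d, shape))  # F(d) is indeed below k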

Likewise for Y: writing its cumulative distribution function as G and
the analog of d as d', we have:

7) F(d)G(d') < k (with a proper choice of constants and relabelling)
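
The same device gives (7) directly: choose d and d' so that each
factor is below sqrt(k), making the product at most k/4. A sketch with
X Gamma-distributed and Y F-distributed (the two examples named
above), parameters chosen only for illustration:

    from scipy.stats import gamma, f

    k = 1e-6
    shape, dfn, dfd = 2.0, 5, 10         # illustrative parameters
    d  = gamma.ppf(k ** 0.5 / 2, shape)  # F(d)  = sqrt(k)/2
    dp = f.ppf(k ** 0.5 / 2, dfn, dfd)   # G(d') = sqrt(k)/2
    print(gamma.cdf(d, shape) * f.cdf(dp, dfn, dfd))  # product = k/4 < k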

This contradicts the analogous HUP-type claim that:

8) P(a <= X <= d) P(c <= Y <= d') >= k (with proper relabelling).

Osher Doctorow