From Osher Doctorow

Earlier in this thread, I defined:

1) P_IND(A-->B) = 1 + P(A)P(B) - P(A)
2) P(A-->B) = 1 + P(AB) - P(A)
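
As a quick numerical illustration (not from the earlier posts; the
function names and the example values P(A) = 0.5, P(B) = 0.4,
P(AB) = 0.3 are assumed purely for illustration), definitions (1) and
(2) can be evaluated in a few lines of Python:

def p_implies(p_a, p_ab):
    # Definition (2): P(A-->B) = 1 + P(AB) - P(A)
    return 1 + p_ab - p_a

def p_ind_implies(p_a, p_b):
    # Definition (1): P_IND(A-->B) = 1 + P(A)P(B) - P(A)
    return 1 + p_a * p_b - p_a

# Assumed example probabilities: P(A) = 0.5, P(B) = 0.4, P(AB) = 0.3
print(p_implies(0.5, 0.3))       # 0.8
print(p_ind_implies(0.5, 0.4))   # 0.7

Note that both quantities lie in [0, 1] whenever the inputs are
genuine probabilities with P(AB) <= P(A).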

We can now define the "Independence" Operator IND as:

3) IND(P(A-->B)) = P_IND(A-->B) = 1 + P(A)P(B) - P(A) = (in LISP-like
notation) (1, P(A)P(B), P(A)).

Then "Statistical Dependence", which is E. Lehmann's "Positive/
Negative Quadrant Statistical Dependence" of the 1960s cited
previously in this thread, symbolically DEP(A, B), is:

4) DEP(A, B) = P(AB) - P(A)P(B) (positive, negative, or nil/zero)

in terms of (random) sets/events, and from (3) and (2) this is:

5) DEP(A, B) = P(A-->B) - P_IND(A-->B) = P(A-->B) - IND(P(A-->B)) =
[1 + P(AB) - P(A)] - [1 + P(A)P(B) - P(A)] = P(AB) - P(A)P(B), which
(in LISP-like notation) is (1, P(A-->B), IND(P(A-->B))) = (1, AB,
P(A)P(B)) if P(P(A)P(B)) is defined as P(A)P(B).
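
The cancellation of the ones in (5) can be checked with exact
rational arithmetic; the sketch below is only an illustration with
assumed example probabilities (P(A) = 1/2, P(B) = 2/5, P(AB) = 3/10),
but the assertion holds for any values, since the constants cancel
algebraically:

from fractions import Fraction as F

# Assumed example probabilities: P(A) = 1/2, P(B) = 2/5, P(AB) = 3/10
p_a, p_b, p_ab = F(1, 2), F(2, 5), F(3, 10)

p_arrow     = 1 + p_ab - p_a         # (2): P(A-->B)
p_ind_arrow = 1 + p_a * p_b - p_a    # (1), (3): P_IND(A-->B) = IND(P(A-->B))
dep_direct  = p_ab - p_a * p_b       # (4): DEP(A, B)

# Identity (5): DEP(A, B) = P(A-->B) - P_IND(A-->B)
assert dep_direct == p_arrow - p_ind_arrow
print(dep_direct)                    # 1/10 for these numbers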

Recall from a few posts ago that, in LISP-like notation, (1, A, B) =
1 + P(AB) - P(A) or 1 + P(B) - P(A) (the latter for P(B) <= P(A)),
depending on whether (1, A, B) refers to P(A-->B) or P'(A-->B)
respectively.
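
Here is a minimal sketch of how that tuple convention could be
evaluated; the helper name eval_triple, its keyword argument, and the
example numbers are assumed here and are not part of the earlier
posts:

def eval_triple(p_a, p_b, p_ab, primed=False):
    # Evaluate the LISP-like triple (1, A, B) under the two readings:
    #   primed=False -> P(A-->B)  = 1 + P(AB) - P(A)
    #   primed=True  -> P'(A-->B) = 1 + P(B) - P(A), for P(B) <= P(A)
    if primed:
        assert p_b <= p_a
        return 1 + p_b - p_a
    return 1 + p_ab - p_a

print(eval_triple(0.5, 0.4, 0.3))               # 0.8, the P(A-->B) reading
print(eval_triple(0.5, 0.4, 0.3, primed=True))  # 0.9 up to floating-point rounding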

Osher Doctorow