From Osher Doctorow

We have seen from the last few subsections and sections that the
INTERSECTION of A and B, denoted here by:

1) AB

appears to play a critical role in physics for (random) set/events A,
B.

Giorgi Japaridze of Villanova University, Pennsylvania, USA, in
"Separating the basic logics of the basic recurrences," 18 pages,
arXiv:1007.1324v2 [cs.LO], 9 Jul 2010, carries the "Computability
Logic" analogs or partial analogs of this intersection further, in
diverse directions.

Computability Logic (CoL) is definable as:

2) CoL = a constructive, game-semantics redevelopment of logic.

The INTERSECTION is replaced in CoL by ^ (parallel conjunction), which
recurs in key results. Three logical operators are isolated as being
of basic importance: one is defined in terms of the infinite
conjunction A ^ A ^ ..., another, somewhat more complicated, involves
duplicating or "splitting" the current position of A (game A) into two
games, and a third involves counting only countably many wins of
games of the previous type.

I will let readers look at the paper before (hopefully) returning for
more details.

Osher Doctorow
From Osher Doctorow

Notice the importance of the intersection AB (the intersection of A
and B) for (random) set/events in Probable Causation/Influence and in
other schools or applications of Probability/Statistics.

1) If you know P(AB) for A = {w: X(w) <= x}, B = {w: Y(w) <= y} for
continuous random variables X, Y, for all appropriate values x, y of X
and Y, then you "completely know" the probability distributions of X
and Y. P(AB) is called the Joint Cumulative Distribution Function
(Joint cdf) of X and Y, denoted F(x, y) or FX,Y(x, y), where in the
latter X, Y are subscripts of F. (See the first sketch after this
list.)

2) There are analogs of (1) for discrete random variables and so on.

3) Discovery of a new joint cdf, or of its multivariate generalization
to more than 2 random variables, is an important result in
mathematical probability/statistics and often has important
applications outside it. Even univariate or "marginal" cdfs F(x) or
FX(x) = P{w: X(w) <= x} are of considerable importance, including in
engineering reliability theory, physics, pure and applied probability
and statistics, etc.

4) Conditional Probability P(B|A) = P(AB)/P(A) for P(A) not 0.

5) Probable Causation/Influence (PI) P(A-->B) = 1 + P(AB) - P(A) and
P'(A-->B) = 1 + P(B) - P(A) for P(B) <= P(A). (Items 4, 5, and 7 are
illustrated numerically in the second sketch after this list.)

6) P(AB) = 0 and P(AB) = 2P(A) - 1 have been shown in recent threads
here to play a key role in the theory of separating and relating
fundamental attractive interactions and their repulsive analogs
including Gravitation vs Repulsion.

7) P(AB) - P(A)P(B) is E. Lehmann's (late 1960s) Positive or Negative
Statistical Quadrant Dependence (according to its sign), or, if it
equals 0, Statistical Independence, both key in deep
probability/statistics research.
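
Here is a minimal numerical sketch in Python of item (1) above. It is
not from the original discussion: the bivariate normal sample, the
correlation 0.6, and the threshold x = 0.5 are hypothetical choices,
used only to show that the joint cdf P(AB), with A = {X <= x} and
B = {Y <= y}, recovers the marginal cdf of X as y grows large.

# Hypothetical illustration of item (1): the joint cdf determines the
# marginals, since F_X(x) = lim as y -> infinity of F(x, y).
import numpy as np

rng = np.random.default_rng(0)
# Two correlated continuous random variables X, Y (hypothetical choice).
cov = [[1.0, 0.6], [0.6, 1.0]]
X, Y = rng.multivariate_normal([0.0, 0.0], cov, size=200_000).T

def joint_cdf(x, y):
    """Empirical joint cdf F(x, y) = P(AB), A = {X <= x}, B = {Y <= y}."""
    return np.mean((X <= x) & (Y <= y))

x = 0.5
print(joint_cdf(x, 1e9))   # letting y grow large: approx F_X(0.5), about 0.69
print(np.mean(X <= x))     # direct marginal estimate, should agree closely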
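
A second, self-contained sketch (again with hypothetical parameters,
not from the original discussion) for items (4), (5), and (7):
conditional probability P(B|A) = P(AB)/P(A), the PI quantity
P(A-->B) = 1 + P(AB) - P(A), which equals P(not-A) + P(AB) since not-A
and AB are disjoint, and Lehmann's quadrant-dependence quantity
P(AB) - P(A)P(B).

# Hypothetical illustration of items (4), (5), (7) on simulated events.
import numpy as np

rng = np.random.default_rng(1)
# Correlated continuous X, Y; the events are A = {X <= 0}, B = {Y <= 0}.
X, Y = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=200_000).T
A, B = (X <= 0), (Y <= 0)

P_A, P_B, P_AB = A.mean(), B.mean(), (A & B).mean()

cond_prob = P_AB / P_A        # (4) P(B|A) = P(AB)/P(A), requires P(A) not 0
pi        = 1 + P_AB - P_A    # (5) P(A-->B) = 1 + P(AB) - P(A)
quad_dep  = P_AB - P_A * P_B  # (7) > 0 positive, < 0 negative quadrant
                              #     dependence; = 0 statistical independence
print(cond_prob, pi, quad_dep)   # roughly 0.70, 0.85, 0.10 for these choices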

Osher Doctorow