From: Robert Israel on
David Bernier <david250(a)videotron.ca> writes:


> Williams writes:
> "Intuitively, Brownian bridge with values in R,
> BB(R), is BM(R) with time-parameter set [0,1]
> conditioned to be at 0 at times 0 and 1.
> Rigorously, there is a unique measure W^{0,0}
> on C[0, 1] with [...] "
>
> [ BM(R) is Brownian motion with values in R.]
>
> I think he means standard Brownian
> motion, say x(t), where t is a real number
> in [0, oo), x(0) = 0, and x(t) is a
> Gaussian r.v. with mean zero and variance t.
> So in particular x(1), the position at
> time t=1, is a r.v. whose variance is 1.
>
> I have some understanding of BM on an
> intuitive level. I can't explain precisely
> how one "conditions [Brownian motion]
> to be at 0 at times t=0 and t=1", or
> how the measure W^{0,0} on C[0, 1]
> is defined or constructed.

Given a standard Brownian motion W(t), one nice way of
constructing a Brownian bridge is to take Z(t) = W(t) - t W(1)
for 0 <= t <= 1. This is not the same thing as conditioning
W(t) to be 0 at t=1, but it constructs a process with the required
properties.
By construction, Z(0) = W(0) = 0 and Z(1) = 0,
E[Z(t)] = E[W(t)] - t E[W(1)] = 0, and for 0 <= s <= t <= 1,
E[Z(s) Z(t)] = E[(W(s) - s W(1))(W(t) - t W(1))]
= E[W(s) W(t)] - t E[W(s) W(1)] - s E[W(t) W(1)] + s t E[W(1)^2]
= s - t s - s t + s t          (since E[W(u) W(v)] = min(u, v))
= s (1 - t).
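
A quick numerical check of this construction, as a minimal Python sketch
(the grid size and seed are arbitrary choices):

import numpy as np

rng = np.random.default_rng(0)
n = 1000                                     # grid points on [0, 1]
t = np.linspace(0.0, 1.0, n + 1)
dW = rng.normal(0.0, np.sqrt(1.0 / n), size=n)
W = np.concatenate(([0.0], np.cumsum(dW)))   # discretized W with W(0) = 0
Z = W - t * W[-1]                            # bridge: Z(0) = Z(1) = 0 exactly

# Averaging Z(s) Z(t) over many such paths (for fixed s <= t, e.g. s = 0.3,
# t = 0.7) approximates the covariance s (1 - t) = 0.09 derived above.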
--
Robert Israel israel(a)math.MyUniversitysInitials.ca
Department of Mathematics http://www.math.ubc.ca/~israel
University of British Columbia Vancouver, BC, Canada
From: ArtflDodgr on
In article <hvv41f018pn(a)news6.newsguy.com>,
David Bernier <david250(a)videotron.ca> wrote:

> Suppose we have 500,000 red chips and 500,000 green chips of the
> same size. They are put in a pouch.
>
> We mark y=0 at t=0 on graph paper.
> At time t = 1, we choose a chip at random from the pouch.
> If it's green, it counts for +1. If it's red, it counts
> for -1.
>
> y at t=1, or y(1) = +1 if the chip is green, and -1 if the
> chip is red.
>
> Say we have chosen 999 chips (without replacement). Next, we
> choose the 1000th chip from the pouch.
>
> Then y(1000) = y(999) +1 if the 1000th chip from the pouch is green,
> and y(1000) = y(999) -1 if the 1000th chip from the pouch is red.
>
> We continue choosing chips at random until the pouch is empty.
> So in the end we remove 1 million chips. With equal numbers
> of red and green chips, y(1,000,000) = 0.
>
> So y(0) = y(10^6) = 0.
>
> My question is: as we increase the number of chips, and after
> scaling down the graphs, can we get something that
> approaches a Brownian bridge?

If the number of chips is N, and you scale time by 1/N and space by
1/sqrt{N}, then you get precisely the (standard) Brownian bridge as the
distributional limit for large N.
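
For concreteness, here is a minimal Python sketch of that rescaling,
assuming equal numbers of +1 (green) and -1 (red) chips; N and the seed
are arbitrary:

import numpy as np

rng = np.random.default_rng(1)
N = 1_000_000
chips = np.concatenate((np.ones(N // 2), -np.ones(N // 2)))
rng.shuffle(chips)                               # drawing without replacement
y = np.concatenate(([0.0], np.cumsum(chips)))    # y(0) = y(N) = 0
t = np.arange(N + 1) / N                         # time scaled by 1/N
bridge_path = y / np.sqrt(N)                     # space scaled by 1/sqrt(N)
# For large N the path (t, bridge_path) looks like a sample path of the
# standard Brownian bridge on [0, 1].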

A good place to convince yourself of this is Billingsley's book
"Convergence of Probability Measures". Brownian Bridge is discussed in
Chapter 2, and the convergence asserted above is an immediate
consequence of Theorem 24.1 (page 209). [My references here are to the
first edition of CoPM.]


> There are simulations of Brownian bridges with initial and final values
> of 0.556 and 1.000 here:
>
> < http://demonstrations.wolfram.com/BrownianBridge/ >
>
> It can be worthwhile searching for:
> David Williams Brownian bridge xi
>
> For a Brownian bridge on [0, 1] with initial and final values 1,
> with vertical scaling fixed once and for all,
> we can let
> X = excursion of the Brownian bridge B(t)
> X = sup_{0<=t<=1} B(t) - inf_{0<=t<=1} B(t) .
>
> Then X is a random variable depending on a sample Brownian bridge.
>
> Then according to Williams,
>
> E[ X^z ]/2 = xi(z), where
>
> xi is related to Riemann zeta as follows:
> xi(s) = (1/2) s (s-1) pi^(-s/2) Gamma(s/2) zeta(s).
>
> This gives xi(2) = (1/2) 2 (1) (1/pi) (1) pi^2/6 = pi/6.
> xi(2) = 0.523598775598298873077..........
>
> A Monte Carlo simulation gives:
>
> E[ X^2 ]/2 is approximately:
>
> [....]
> xi(2.000000 + I*0.000000) ~= 0.522339 + I*0.000000
> xi(2.000000 + I*0.000000) ~= 0.521997 + I*0.000000
> xi(2.000000 + I*0.000000) ~= 0.521565 + I*0.000000
> xi(2.000000 + I*0.000000) ~= 0.521978 + I*0.000000
> xi(2.000000 + I*0.000000) ~= 0.521417 + I*0.000000
>
>
> David Bernier
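
For reference, a minimal Monte Carlo sketch in Python along these lines,
assuming the vertical scaling is X = sqrt(2/pi) * (sup B - inf B) for a
standard Brownian bridge B (the exact scaling is not spelled out above;
the path count and step size are arbitrary):

import numpy as np

rng = np.random.default_rng(2)
n_steps, n_paths = 2000, 5000
t = np.linspace(0.0, 1.0, n_steps + 1)
dW = rng.normal(0.0, np.sqrt(1.0 / n_steps), size=(n_paths, n_steps))
W = np.concatenate((np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)), axis=1)
B = W - t * W[:, -1:]                      # bridges with B(0) = B(1) = 0
X = np.sqrt(2.0 / np.pi) * (B.max(axis=1) - B.min(axis=1))
print(np.mean(X**2) / 2.0)    # roughly 0.52; compare xi(2) = pi/6 ~= 0.5236

The discrete grid slightly underestimates the true sup and inf, which is
consistent with the simulated values sitting a bit below pi/6.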

--
A.
From: BURT on
On Jun 29, 3:20 pm, ArtflDodgr <artfldo...(a)aol.com> wrote:
> [quoted text trimmed]

I tried to walk randomly once but I didn't get anywhere.