From: WizardOfOzzz on 6 Jul 2010 13:04

Hey all,

I have a problem where I have two 1D distributions represented with
mean/variance. I need to calculate a third distribution that is the
minimum of a random choice from both.

For example, if the distributions aren't overlapping you can just take the
lower distribution, but as overlap is added there will be a greater and
greater chance of the distribution with the higher mean still producing a
lower result.

I also need to represent the resulting distribution using only
mean/variance, even if that isn't the best approximation.

Any ideas? I figure there should be a closed-form solution from the
original means/variances to calculate the minimum mean/variance.

Thanks in advance!!
From: Robert Israel on 6 Jul 2010 21:37

WizardOfOzzz <eric_penner(a)hotmail.com> writes:

> Hey all,
>
> I have a problem where I have two 1D distributions represented with
> mean/variance. I need to calculate a third distribution that is the
> minimum of a random choice from both.

If X and Y are independent random variables with CDFs F_X and F_Y
respectively, then Z = min(X,Y) has CDF

  F_Z(z) = 1 - (1 - F_X(z))(1 - F_Y(z)).

If X and Y have densities f_X and f_Y, then Z has density

  f_Z(z) = (F_Z)'(z) = f_X(z) (1 - F_Y(z)) + f_Y(z) (1 - F_X(z)).

> I also need to represent the resulting distribution using only
> mean/variance, even if that isn't the best approximation.
>
> Any ideas? I figure there should be a closed-form solution from the
> original means/variances to calculate the minimum mean/variance.

No, the mean and variance of Z do not depend only on the means and
variances of X and Y.

But by "represented with mean/variance" I suppose you're assuming
distributions from a specific two-parameter family, e.g. normal
distributions. In that case the mean m_z and variance v_z of Z are
rather complicated functions of the means and variances m_x, v_x,
m_y, v_y of X and Y, involving some non-elementary integrals.

--
Robert Israel              israel(a)math.MyUniversitysInitials.ca
Department of Mathematics  http://www.math.ubc.ca/~israel
University of British Columbia           Vancouver, BC, Canada
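[The CDF and density identities above are easy to check numerically. A minimal sketch in Python, not from the thread — the choice of normal distributions and the parameter values are purely illustrative, and only the standard library is used:]

```python
import math
import random

def norm_cdf(x, mu=0.0, sigma=1.0):
    """Normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def norm_pdf(x, mu=0.0, sigma=1.0):
    return math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2)) / (sigma * math.sqrt(2.0 * math.pi))

def min_cdf(z, mu_x, sx, mu_y, sy):
    """F_Z(z) = 1 - (1 - F_X(z))(1 - F_Y(z)) for independent X, Y."""
    return 1.0 - (1.0 - norm_cdf(z, mu_x, sx)) * (1.0 - norm_cdf(z, mu_y, sy))

def min_pdf(z, mu_x, sx, mu_y, sy):
    """f_Z(z) = f_X(z)(1 - F_Y(z)) + f_Y(z)(1 - F_X(z))."""
    return (norm_pdf(z, mu_x, sx) * (1.0 - norm_cdf(z, mu_y, sy))
            + norm_pdf(z, mu_y, sy) * (1.0 - norm_cdf(z, mu_x, sx)))

# Monte Carlo cross-check of F_Z at one point (parameters are arbitrary).
random.seed(1)
mu_x, sx, mu_y, sy, z0 = 0.0, 1.0, 0.5, 1.5, 0.2
n = 200_000
hits = sum(min(random.gauss(mu_x, sx), random.gauss(mu_y, sy)) < z0
           for _ in range(n))
print(hits / n, min_cdf(z0, mu_x, sx, mu_y, sy))  # should agree to ~1e-2
```

[The density can also be checked against a numerical derivative of `min_cdf`, which is a useful sanity test when adapting this to other families.]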
From: Robert Israel on 6 Jul 2010 22:02

Robert Israel <israel(a)math.MyUniversitysInitials.ca> writes:

> In that case the mean m_z and variance v_z of Z are rather complicated
> functions of the means and variances m_x, v_x, m_y, v_y of X and Y,
> involving some non-elementary integrals.

For simplicity, suppose m_x = 0 and v_x = 1 (to which the general case
can be reduced by linear transformation). Then a series expansion around
m_y = 0, v_y = 1 is

  m_z = -1/sqrt(pi) - (v_y - 1)/(4 sqrt(pi)) + m_y/2
        + (v_y - 1)^2/(32 sqrt(pi)) - m_y^2/(4 sqrt(pi))
        + m_y^2 (v_y - 1)/(16 sqrt(pi)) - (v_y - 1)^3/(128 sqrt(pi))
        - 3 m_y^2 (v_y - 1)^2/(128 sqrt(pi))
        + 5 (v_y - 1)^4/(2048 sqrt(pi)) + m_y^4/(96 sqrt(pi)) + ...

  v_z = 1 - 1/pi + (pi - 1)(v_y - 1)/(2 pi) + (pi - 2) m_y/(2 sqrt(pi))
        + m_y (v_y - 1)/(2 sqrt(pi)) + m_y (v_y - 1)^2/(8 sqrt(pi))
        - m_y^4/(24 pi) + m_y^3 (v_y - 1)/(24 sqrt(pi))
        - 3 m_y (v_y - 1)^3/(64 sqrt(pi)) + ...

and this should provide a fairly good approximation if m_y is close to 0
and v_y close to 1.

--
Robert Israel              israel(a)math.MyUniversitysInitials.ca
Department of Mathematics  http://www.math.ubc.ca/~israel
University of British Columbia           Vancouver, BC, Canada
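[At the expansion point itself (m_y = 0, v_y = 1, i.e. X and Y both standard normal) the series collapses to its constant terms, m_z = -1/sqrt(pi) and v_z = 1 - 1/pi, which can be sanity-checked by simulation. A sketch, not part of the thread:]

```python
import math
import random
import statistics

# Constant terms of the series at m_y = 0, v_y = 1 (X, Y both N(0,1)).
m_z_series = -1.0 / math.sqrt(math.pi)   # about -0.5642
v_z_series = 1.0 - 1.0 / math.pi         # about  0.6817

# Monte Carlo estimate of mean/variance of Z = min(X, Y).
random.seed(42)
n = 400_000
zs = [min(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]
m_z_mc = statistics.fmean(zs)
v_z_mc = statistics.pvariance(zs)

print(m_z_series, m_z_mc)  # both near -0.5642
print(v_z_series, v_z_mc)  # both near 0.6817
```

[With 400,000 samples the standard error of the mean is about 0.0013, so agreement to two decimal places is expected.]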
From: Ray Vickson on 6 Jul 2010 22:24

On Jul 6, 7:02 pm, Robert Israel <isr...(a)math.MyUniversitysInitials.ca>
wrote:

> For simplicity, suppose m_x = 0 and v_x = 1 (to which the general case
> can be reduced by linear transformation). Then a series expansion around
> m_y = 0, v_y = 1 is
> [series for m_z and v_z snipped]
> and this should provide a fairly good approximation if m_y is close to 0
> and v_y close to 1.

How did you get these expressions?

R.G. Vickson
From: Chip Eastham on 7 Jul 2010 00:28

On Jul 6, 5:04 pm, WizardOfOzzz <eric_pen...(a)hotmail.com> wrote:

> I have a problem where I have two 1D distributions represented with
> mean/variance. I need to calculate a third distribution that is the
> minimum of a random choice from both.

I suspect what you mean is that you have two random variables X, Y taking
real values whose distributions are known, and you want to get the
distribution of the derived random variable min(X,Y). However, one needs
more information to answer that question, as the joint distribution of X
and Y is not determined in general by their separate (marginal)
distributions. To proceed one might assume X and Y are independent.

The distribution of min(X,Y) is perhaps most easily approached by way of
the cumulative distribution function, since (assuming independence):

  Pr( min(X,Y) < z ) = 1 - Pr( min(X,Y) >= z )
                     = 1 - Pr( X >= z )*Pr( Y >= z )
                     = 1 - (1 - Pr( X < z ))*(1 - Pr( Y < z ))
                     = Pr( X < z ) + Pr( Y < z ) - Pr( X < z )*Pr( Y < z )

regards, chip
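[The last line is inclusion-exclusion on the complement: 1 - (1-a)(1-b) = a + b - ab. A quick sketch, not from the post, checking the identity and the min(X,Y) interpretation with independent Uniform(0,1) variables, for which Pr(min < z) = 1 - (1 - z)^2 exactly:]

```python
import random

def min_cdf_from_marginals(a, b):
    """Pr(min(X,Y) < z) for independent X, Y, given a = Pr(X < z), b = Pr(Y < z)."""
    return a + b - a * b

# Algebraic identity: 1 - (1-a)(1-b) == a + b - a*b for any a, b.
for a in (0.0, 0.3, 0.9):
    for b in (0.1, 0.5, 1.0):
        assert abs((1 - (1 - a) * (1 - b)) - min_cdf_from_marginals(a, b)) < 1e-12

# Independent Uniform(0,1) check: Pr(X < z) = Pr(Y < z) = z,
# so Pr(min(X,Y) < z) = 2z - z^2 = 1 - (1 - z)^2.
random.seed(0)
z = 0.4
n = 200_000
freq = sum(min(random.random(), random.random()) < z for _ in range(n)) / n
print(freq, min_cdf_from_marginals(z, z))  # both near 0.64
```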