From: Nitesh on 24 Jun 2010 05:17

"John D'Errico" <woodchips(a)rochester.rr.com> wrote in message <hvsvq4$ifc$1(a)fred.mathworks.com>...
> "Nitesh " <nitesh.prabhu(a)yahoo.co.in> wrote in message <hvsr28$cfl$1(a)fred.mathworks.com>...
> > "us " <us(a)neurol.unizh.ch> wrote in message <hvsqdk$1c0$1(a)fred.mathworks.com>...
> > > "Nitesh " <nitesh.prabhu(a)yahoo.co.in> wrote in message <hvsq0g$4pj$1(a)fred.mathworks.com>...
> > > > There is a linear function of two variables that I am trying to minimize under an equality constraint. But the constraint is nonlinear in the variables. Is there any technique to solve this, or can I use approximations and linearize the constraint?
> > >
> > > don't double post(!)...
> > > rather, show CSSM what you came up with so far...
> > >
> > > us
> >
> > I'm sorry, I didn't mean to double post.
> >
> > I haven't made any progress on this. I still have the linear objective function in two variables and a nonlinear equality constraint. Is it possible to use Lagrange multipliers to solve this?
>
> Surely it depends on the specific problem how I might attack this.
>
> Depending on the specific equality constraint, I might choose to work in the 1-manifold induced by the equality constraint. Then it could be written as an fminbnd problem, or perhaps just solved by hand.
>
> So if the constraint is
>
> x^2 + y^2 = r^2
>
> where I wish to minimize the objective u = y - x, then I can reduce the problem to minimizing the objective
>
> u = r*sin(theta) - r*cos(theta) = r*(sin(theta) - cos(theta))
>
> Since r is fixed, it is trivial to minimize this expression as a function of theta, then recover x and y from
>
> x = r*cos(theta), y = r*sin(theta)
>
> Just as easily, one can use Lagrange multipliers on the problem. Minimize the augmented objective function
>
> (y - x) + (x^2 + y^2 - r^2)*lambda
>
> Differentiate wrt x and then y, and set the derivatives equal to zero.
>
> -1 + 2*lambda*x = 0
> 1 + 2*lambda*y = 0
>
> Add and combine:
>
> lambda*(x + y) = 0
>
> So either lambda is zero (impossible here, since lambda = 0 would make -1 = 0), or x + y = 0. If x + y = 0, then y = -x. We can now return to the constraint:
>
> x^2 + (-x)^2 = r^2
>
> therefore
>
> x = r/sqrt(2)
>
> or
>
> x = -r/sqrt(2)
>
> so now we know the solution must lie at one of these points:
>
> (x,y) = (+/- r/sqrt(2), -/+ r/sqrt(2))
>
> Thus y - x must be minimized at
>
> (r/sqrt(2), -r/sqrt(2))
>
> WTP?
>
> John

I understood your method.

My objective function to be minimized is u = x*a + y*b, and the constraint is

( (x+y)^(x+y+2) ) * ( (x-y)^(x-y) ) * ( 2^(-y) ) * ( y^(-1) ) * ( x^(-x) ) = 1.

I also know that x > y.

If the constraint equation were linear or even quadratic, it would have been possible to solve this using nonlinear programming. I tried to approximate the constraint, but to no avail.
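[Editor's note: the worked circle example above is easy to check numerically. The sketch below is Python rather than MATLAB so it runs without any toolbox; the radius r = 2 is an arbitrary choice, and a dense scan over theta stands in for the fminbnd call John mentions.]

```python
import math

# Check of the circle example: minimize u = y - x subject to
# x^2 + y^2 = r^2, parameterized as x = r*cos(theta), y = r*sin(theta).
# r = 2 is an arbitrary fixed radius for the illustration.
r = 2.0

def u(theta):
    return r * (math.sin(theta) - math.cos(theta))

# A dense scan over theta stands in for the fminbnd call.
thetas = [2 * math.pi * k / 100000 for k in range(100000)]
best = min(thetas, key=u)
x_best, y_best = r * math.cos(best), r * math.sin(best)

# Lagrange-multiplier answer from the post: (r/sqrt(2), -r/sqrt(2)).
print(x_best, y_best)  # close to (1.41421, -1.41421) for r = 2
```

The scan lands on theta = -pi/4 (mod 2*pi), i.e. exactly the Lagrange-multiplier point.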
From: Johan Löfberg on 24 Jun 2010 07:05

The global solver in the YALMIP toolbox can be used, if you have a nonlinear solver such as fmincon installed, and an LP solver.

In YALMIP, it is recommended to apply a logarithm to the constraint, and to introduce some extra variables to make the problem more sparse and structured:

sdpvar x y s t
Objective = -x-2*y
Constraints = [t == x+y, s == x-y, (2+t)*log(t)+s*log(s) == log(y)+x*log(x)]
Constraints = [Constraints, 100 > x > y > 1e-4]
solvesdp(Constraints,Objective,sdpsettings('solver','bmibnb'))

 53 : -7.613E-001  0.59  -7.717E-001  2
+ 53 Finishing. Cost: -0.7613 Gap: 0.58837%

ans =

    yalmiptime: 0.2030
    solvertime: 10.5940
          info: 'No problems detected (BMIBNB)'
       problem: 0
        dimacs: [NaN NaN 0 0 NaN NaN]

>> x
Linear scalar (real, 1 variable, current value : 0.26633)
>> y
Linear scalar (real, 1 variable, current value : 0.24749)

"Nitesh " <nitesh.prabhu(a)yahoo.co.in> wrote in message <hvv7qi$ncl$1(a)fred.mathworks.com>...
> My objective function to be minimized is
> u = x*a + y*b and the constraint is
> ( (x+y)^(x+y+2) ) * ( (x-y)^(x-y) ) * ( 2^(-y) ) * ( y^(-1) ) * ( x^(-x) ) = 1.
> I also know that x > y.
From: John D'Errico on 24 Jun 2010 07:41

"Nitesh " <nitesh.prabhu(a)yahoo.co.in> wrote in message <hvv7qi$ncl$1(a)fred.mathworks.com>...
> I understood your method.
>
> My objective function to be minimized is
> u = x*a + y*b and the constraint is
> ( (x+y)^(x+y+2) ) * ( (x-y)^(x-y) ) * ( 2^(-y) ) * ( y^(-1) ) * ( x^(-x) ) = 1.
> I also know that x > y.
>
> If the constraint equation were linear or even quadratic, it would have been possible to solve this using nonlinear programming.
>
> I tried to approximate the constraint, but to no avail.

Suppose you logged the constraint? In the form you have written it, this will be impossible to work with in terms of floating point numbers.

(x+y+2)*log(x+y) + (x-y)*log(x-y) - y*log(2) - log(y) - x*log(x) = 0

At least it is now easier to write, and also a bit better to work with numerically. Your expression would have been pure hell in terms of numerical issues.

Even this form points out a few important things: not only must we have x > y, but it also matters that x and y are both positive.

Even in the logged form, this is still nasty. I've had a bit of a rough time working with it to even plot the solution manifold. With a little effort, I was able to plot it though. It is a curve in the (x,y) plane, with 0 < x, y < 1.

My temptation would probably be to generate the points on your solution locus, then fit a parametric cubic spline through them, parametric in arc length. Then the minimization problem becomes a simple call to fminbnd, as a function of the parameter.

HTH,
John
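[Editor's note: John's recipe — log the constraint, trace the solution locus, then minimize along it — can be sketched with the standard library alone. A grid-plus-bisection trace stands in for the spline-plus-fminbnd step, and the objective coefficients a = -1, b = -2 are assumed values for illustration; Nitesh's actual a and b were never given.]

```python
import math

log = math.log

# Logged form of the constraint, as suggested above:
#   (x+y+2)*log(x+y) + (x-y)*log(x-y) - y*log(2) - log(y) - x*log(x) = 0
# defined for x > y > 0.
def g(x, y):
    return ((x + y + 2) * log(x + y) + (x - y) * log(x - y)
            - y * log(2) - log(y) - x * log(x))

def roots_in_y(x, n=1000):
    """Scan y over (0, x), bracket sign changes of g(x, .), bisect each."""
    ys = [x * (k + 0.5) / (n + 1) for k in range(n)]
    roots = []
    for lo, hi in zip(ys, ys[1:]):
        if g(x, lo) * g(x, hi) <= 0:
            for _ in range(60):  # bisection refinement
                mid = 0.5 * (lo + hi)
                if g(x, lo) * g(x, mid) <= 0:
                    hi = mid
                else:
                    lo = mid
            roots.append(0.5 * (lo + hi))
    return roots

# Trace the solution locus over a grid of x, then minimize the sample
# objective u = a*x + b*y along it (a = -1, b = -2 are assumed values).
a, b = -1.0, -2.0
locus = []
for i in range(1, 200):
    x = i / 200.0
    for y in roots_in_y(x):
        locus.append((x, y))
x_opt, y_opt = min(locus, key=lambda p: a * p[0] + b * p[1])
print(x_opt, y_opt, a * x_opt + b * y_opt)
```

Every traced point satisfies 0 < y < x, consistent with the constraints John notes, and the locus is indeed confined to a bounded curve.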
From: Johan Löfberg on 24 Jun 2010 08:35

"Johan Löfberg" <loefberg(a)control.ee.ethz.ch> wrote in message <hvve5h$c85$1(a)fred.mathworks.com>...
> Constraints = [t == x+y, s == x-y, (2+t)*log(t)+s*log(s) == log(y)+x*log(x)]

Whoops, that constraint dropped the y*log(2) term. It should be

Constraints = [t == x+y, s == x-y, (2+t)*log(t)+s*log(s)-y*log(2) == log(y)+x*log(x)]
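[Editor's note: the corrected constraint is exactly the logarithm of the original product-form constraint, which a quick stdlib Python check confirms; the test points are arbitrary values with x > y > 0.]

```python
import math

# The original product-form constraint:
#   (x+y)^(x+y+2) * (x-y)^(x-y) * 2^(-y) * y^(-1) * x^(-x) = 1
# and the corrected logged form, moved to one side:
#   (x+y+2)*log(x+y) + (x-y)*log(x-y) - y*log(2) - log(y) - x*log(x) = 0

def product_form(x, y):
    return ((x + y) ** (x + y + 2) * (x - y) ** (x - y)
            * 2.0 ** (-y) / y * x ** (-x))

def logged_form(x, y):
    return ((x + y + 2) * math.log(x + y) + (x - y) * math.log(x - y)
            - y * math.log(2) - math.log(y) - x * math.log(x))

# log(product) agrees with the logged form at arbitrary points with
# x > y > 0, so the two constraints define the same solution set.
for (x, y) in [(0.3, 0.1), (0.9, 0.5), (2.0, 0.25)]:
    assert abs(math.log(product_form(x, y)) - logged_form(x, y)) < 1e-9
```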
From: Gene on 24 Jun 2010 09:39
"Nitesh " <nitesh.prabhu(a)yahoo.co.in> wrote in message <hvsq0g$4pj$1(a)fred.mathworks.com>...
> There is a linear function of two variables that I am trying to minimize under an equality constraint. But the constraint is nonlinear in the variables. Is there any technique to solve this, or can I use approximations and linearize the constraint?

Nitesh:

Perhaps a little geometry will be useful in attacking your problem. The cost functional can be interpreted as an inner product

u(x, y) = <[a; b], [x; y]>

Here (I assume) a and b are data and x and y are unknowns. To visualize the maximization process, first construct the perpendicular to [a; b] through the origin. In linear-algebra speak this is the subspace orthogonal to the given vector, and for any point on this line (in 2D; more generally it's a plane of co-dimension one) the value of the inner product is zero. Translating the line along [a; b], you can convince yourself that the value of the inner product is the same for any point on the translated line (a linear variety).

Now you have to work on the geometry of your constraint set. As I understand it, you have x >= y and y >= 0 along with your f(x, y) = 0. The first two define a cone - the 'lower half' of the positive quadrant. This leaves the problem of understanding the locus f(x, y) = 0.

The idea is to 'slide' the orthogonal subspace along the [a; b] vector until it 'separates' the plane such that no feasible point is 'above' the linear variety.

Note there is no guarantee that the problem has a solution - the locus f(x, y) = 0 need not be bounded. I have not looked at any details for your 'f'.

Good luck

gene
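[Editor's note: Gene's key observation — the inner product is constant along any translate of the line orthogonal to [a; b] — is easy to confirm numerically. In this Python sketch the vector [3; 4] and the offset t = 1.7 are arbitrary illustrative choices.]

```python
import math

# With c = [a; b], the cost is the inner product u(p) = <c, p>.
# Along any line orthogonal to c, u is constant; translating the line
# along c shifts that constant.  [3; 4] and t = 1.7 are arbitrary.
a, b = 3.0, 4.0
norm = math.hypot(a, b)            # ||[a; b]|| = 5
c = (a / norm, b / norm)           # unit vector along [a; b]
perp = (-c[1], c[0])               # orthogonal direction

def u(p):
    return a * p[0] + b * p[1]

t = 1.7                            # translation distance along c
line = [(t * c[0] + s * perp[0], t * c[1] + s * perp[1])
        for s in (-5.0, -0.3, 0.0, 2.0, 40.0)]
values = [u(p) for p in line]
print(values)  # every entry equals t * norm = 8.5
```

Sliding the line (increasing t) raises this constant linearly, which is exactly the separating picture Gene describes.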