From: Petros on
I try to run a one-variable optimization of a function that makes use of the heaviside function.

I have:
Q={700 if x(1)<5, 600 if 5<x(1)<10, 200 if 10<x(1)} or using heaviside functions,
Q=700-100*heaviside(x(1)-5)-400*heaviside(x(1)-10)

also I have
P=100+100*heaviside(x(1)-6)+400*heaviside(x(1)-13)

bound: x(1)>0

I want to minimize f=Q-P

I always get:
Optimization completed because the objective function is non-decreasing in
feasible directions, to within the default value of the function tolerance,
and constraints were satisfied to within the default value of the constraint tolerance.

and the solution I get is just the starting point I supplied. Could the problem be the Heaviside functions?

Thanks.
From: Roger Stafford on
"Petros " <p3tris(a)gmail.com> wrote in message <hv34vv$91u$1(a)fred.mathworks.com>...
> I try to run a one-variable optimization of a function that makes use of the heaviside function.
>
> I have:
> Q={700 if x(1)<5, 600 if 5<x(1)<10, 200 if 10<x(1)} or using heaviside functions,
> Q=700-100*heaviside(x(1)-5)-400*heaviside(x(1)-10)
>
> also I have
> P=100+100*heaviside(x(1)-6)+400*heaviside(x(1)-13)
>
> bound: x(1)>0
>
> I want to minimize f=Q-P
>
> I always get:
> Optimization completed because the objective function is non-decreasing in
> feasible directions, to within the default value of the function tolerance,
> and constraints were satisfied to within the default value of the constraint tolerance.
>
> and I get as solution the point I give as starting point. I believe the problem is the heavisides?
>
> Thanks.

It is not appropriate to use the standard optimization routines for step functions like this. These optimization routines typically depend heavily on their input functions being continuous, and preferably on their possessing continuous derivatives, which is most certainly not true of the Heaviside function.

In any case you are making a very hard job out of an easy one. Just evaluate Q-P on either side of each step (x = 5, 6, 10, 13) and find the minimum of these values. What could be simpler?
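The enumeration described above can be sketched as follows. (The thread concerns MATLAB, but for illustration here is the same idea in Python/NumPy; the step definitions and breakpoints {5, 6, 10, 13} are taken directly from the original post.)

```python
import numpy as np

def Q(x):
    # Q = 700 - 100*H(x-5) - 400*H(x-10): 700 for x<5, 600 for 5<x<10, 200 for x>10
    return 700 - 100 * (x > 5) - 400 * (x > 10)

def P(x):
    # P = 100 + 100*H(x-6) + 400*H(x-13)
    return 100 + 100 * (x > 6) + 400 * (x > 13)

def f(x):
    # objective from the original post
    return Q(x) - P(x)

# candidate points: just above the lower bound, and just below/above each step
eps = 1e-6
breaks = [5, 6, 10, 13]
candidates = [eps] + [b - eps for b in breaks] + [b + eps for b in breaks]

values = [f(x) for x in candidates]
best = candidates[int(np.argmin(values))]
print(best, f(best))   # the minimum sits in the region x > 13, where f = 200 - 600 = -400
```

Since the objective is constant between steps, checking one point per region is exhaustive; no gradient-based solver is needed at all.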

Roger Stafford
From: Walter Roberson on
Petros wrote:
> I try to run a one-variable optimization of a function that makes use of
> the heaviside function.

> I always get:
> Optimization completed because the objective function is non-decreasing
> in feasible directions, to within the default value of the function
> tolerance,
> and constraints were satisfied to within the default value of the
> constraint tolerance.

Which minimizer are you using? A number of the minimizers assume that
the derivative of the function is continuous, which is not the case
for Heaviside. Or to put it another way, your graph is piecewise flat,
and most minimizers need a nonzero slope to work with.

You don't need a full minimizer for functions such as that: you only
have to test both sides of each boundary condition.
From: Petros on
Thanks for replying. Actually the function is not really that one; the real function is a two-variable function with quadratic parts and Heaviside functions. I just wanted to start with something simpler. Unfortunately I cannot avoid the Heaviside functions, because they represent decisions made by a controller. So the functions are

P={a if 0<x1<5; b if 5<x1} and
Q={c if 0<x2<19; d if 19<x2} and
L=(Q-P-Q0)*FINE(Q-P-Q0) where
FINE(Q-P-Q0)={e if 0<(Q-P-Q0)<100; f if 100<(Q-P-Q0)<200}

And I try to maximize (Q-P-L) or minimize -(Q-P-L).
It's not something I can do by hand for repeated problems, and I thought MATLAB could help me out.

Thanks again.
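One way to handle the two-variable problem follows from the earlier advice: within each rectangle delimited by the steps (x1 = 5, x2 = 19), P, Q, and hence FINE are constant, so the objective is smooth there; one can optimize each piece separately and keep the best result. Below is a hedged sketch of that strategy in Python/SciPy. The constants a, b, c, d, e, f, Q0, the quadratic part g, and the box upper bound 40 are all hypothetical placeholders, since the thread does not give their actual values.

```python
import numpy as np
from scipy.optimize import minimize

# --- hypothetical quadratic part (the real one is not given in the thread) ---
def g(x1, x2):
    return 0.01 * (x1 - 3) ** 2 + 0.01 * (x2 - 20) ** 2

# --- hypothetical constants (placeholders, not from the thread) ---
a, b = 100, 200          # P levels
c, d = 600, 700          # Q levels
e_, f_ = 0.1, 0.2        # FINE levels
Q0 = 50

def P(x1):  return a if x1 < 5 else b
def Qf(x2): return c if x2 < 19 else d

def FINE(u):
    # bands from the thread; fallback outside them is unspecified, assumed f_
    return e_ if 0 < u < 100 else f_

def neg_objective(x):
    # minimize -(Q - P - L) plus the hypothetical smooth part
    x1, x2 = x
    u = Qf(x2) - P(x1) - Q0
    L = u * FINE(u)
    return -(Qf(x2) - P(x1) - L) + g(x1, x2)

# four smooth pieces cut by the steps at x1 = 5 and x2 = 19
eps, hi = 1e-6, 40.0     # hi is an assumed box bound
pieces = [((eps, 5 - eps), (eps, 19 - eps)),
          ((eps, 5 - eps), (19 + eps, hi)),
          ((5 + eps, hi),  (eps, 19 - eps)),
          ((5 + eps, hi),  (19 + eps, hi))]

results = []
for b1, b2 in pieces:
    x0 = [0.5 * (b1[0] + b1[1]), 0.5 * (b2[0] + b2[1])]
    results.append(minimize(neg_objective, x0, bounds=[b1, b2]))
best = min(results, key=lambda r: r.fun)
print(best.x, -best.fun)
```

Because the steps never move inside a piece, each sub-problem is as smooth as the quadratic part, and an fmincon-style solver (here L-BFGS-B) behaves normally on it.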
From: Petros on
If I use an approximation like heaviside(x) = 1/(1+exp(-50000*x)), which gives a good approximation of the Heaviside function, will it be OK to use with fmincon?
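A quick numeric check (sketched here in Python, though the arithmetic is identical in MATLAB) suggests why so steep a logistic may not help fmincon: with k = 50000 the derivative k*s*(1-s) is numerically zero even a tiny distance from the step, so a gradient-based solver still sees a flat function; a much milder slope gives usable gradients at the cost of rounding the steps.

```python
import numpy as np

def smooth_heaviside(x, k):
    # logistic approximation 1/(1+exp(-k*x)) to the Heaviside step
    return 1.0 / (1.0 + np.exp(-k * x))

def dsmooth(x, k):
    # analytic derivative of the logistic: k * s * (1 - s)
    s = smooth_heaviside(x, k)
    return k * s * (1.0 - s)

# at x = 0.01, k = 50000: exp(-500) is negligible next to 1, so s rounds
# to exactly 1.0 in double precision and the derivative is exactly 0
print(dsmooth(0.01, 50000))   # 0.0: numerically flat, no slope for the solver
print(dsmooth(0.01, 10))      # a usable, nonzero gradient
```

The trade-off is between gradient information (small k) and fidelity to the step (large k); whether an intermediate k is acceptable depends on how far the true breakpoints may be smeared.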