From: Matt J on
"Antony " <mutang.bing(a)gmail.com> wrote in message <i2irns$94i$1(a)fred.mathworks.com>...

> I have another problem. Maybe we cannot solve it directly, and I think the result might be more complex than in my former problem. The problem is:
> if g(X) = ||KX-B||^0.6, with all the other settings the same as in the former problem, what is \partial{g}/\partial{X}?
========

This is equivalent to (||KX-B||^2)^0.3

So you can use your original result, with one more step of the chain rule leading to

Gradient = 0.3*(||KX-B||^2)^(-.7) * 2*K'*(K*X-B)
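
If you want to sanity-check this formula numerically, here is a quick sketch (arbitrary random K, X, B; the central-difference comparison is just an illustration):

K = randn(5,3); B = randn(5,1); X = randn(3,1);
f = @(x) norm(K*x - B)^0.6;                       % objective, = (||K*x-B||^2)^0.3
g = 0.3*(norm(K*X-B)^2)^(-0.7) * 2*K'*(K*X-B);    % gradient from the chain rule
h = 1e-6;  gfd = zeros(size(X));                  % finite-difference gradient
for i = 1:numel(X)
    e = zeros(size(X));  e(i) = h;
    gfd(i) = (f(X+e) - f(X-e))/(2*h);             % central difference in coordinate i
end
disp(norm(g - gfd))                               % should be ~1e-8 or smaller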
From: Antony on
"Matt J " <mattjacREMOVE(a)THISieee.spam> wrote in message <i2it4t$73b$1(a)fred.mathworks.com>...
> "Antony " <mutang.bing(a)gmail.com> wrote in message <i2irns$94i$1(a)fred.mathworks.com>...
>
> > I have another problem. Maybe we cannot solve it directly, and I think the result might be more complex than in my former problem. The problem is:
> > if g(X) = ||KX-B||^0.6, with all the other settings the same as in the former problem, what is \partial{g}/\partial{X}?
> ========
>
> This is equivalent to (||KX-B||^2)^0.3
>
> So you can use your original result, with one more step of the chain rule leading to
>
> Gradient = 0.3*(||KX-B||^2)^(-.7) * 2*K'*(K*X-B)

Why not write it as Gradient = 0.3*2*K'*(K*X-B)*(||KX-B||^2)^(-.7) according to the chain rule? Is it because (||KX-B||^2)^(-.7) is a scalar, so there is no difference between the two orderings? Thank you!
From: Matt J on
"Antony " <mutang.bing(a)gmail.com> wrote in message <i2isv7$pi7$1(a)fred.mathworks.com>...

> But if I apply the chain rule directly to f(X)=||KX-B||^0.6, do I obtain the derivative 0.6*K.'*(K*X-B)^{-0.4}?
======================

No, this wouldn't be the correct expression. From my last post, I get, after some simplification

Gradient = 0.6*K.'*(K*X-B)/||K*X-B||^(1.4)
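
If you want to convince yourself that this is the same as the expression in my previous post, a quick numerical check (a sketch with arbitrary random data) is:

K = randn(5,3); B = randn(5,1); X = randn(3,1);
r = K*X - B;
g1 = 0.3*(norm(r)^2)^(-0.7) * 2*K'*r;   % chain-rule form from the previous post
g2 = 0.6*K'*r / norm(r)^1.4;            % simplified form above
disp(norm(g1 - g2))                     % essentially zero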


>This result seems rather complex for numerical optimization.
=======================

Well, your objective function f(X)=||KX-B||^0.6 is unusually complex...

For one thing, this function is not differentiable at points where K*X=B, which means that if the minimum lies there, you cannot use gradient-based approaches to find it.
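
To see the problem in the simplest case, take K = 1 and B = 0, so the objective is |x|^0.6 and its derivative is 0.6*|x|^(-0.4)*sign(x), which grows without bound as x approaches 0. A rough illustration:

x = 10.^(-(1:6))';                      % 0.1, 0.01, ..., 1e-6
dfdx = 0.6*abs(x).^(-0.4).*sign(x);     % derivative of |x|^0.6
disp([x dfdx])                          % derivative blows up as x -> 0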
From: Antony on
"Matt J " <mattjacREMOVE(a)THISieee.spam> wrote in message <i2itv4$rr8$1(a)fred.mathworks.com>...
> "Antony " <mutang.bing(a)gmail.com> wrote in message <i2isv7$pi7$1(a)fred.mathworks.com>...
>
> > But if I apply the chain rule directly to f(X)=||KX-B||^0.6, do I obtain the derivative 0.6*K.'*(K*X-B)^{-0.4}?
> ======================
>
> No, this wouldn't be the correct expression. From my last post, I get, after some simplification
>
> Gradient = 0.6*K.'*(K*X-B)/||K*X-B||^(1.4)
>
>
> >This result seems rather complex for numerical optimization.
> =======================
>
> Well, your objective function f(X)=||KX-B||^0.6 is unusually complex...
>
> For one thing, this function is not differentiable at points where K*X=B, which means that if the minimum lies there, you cannot use gradient-based approaches to find it.

Dear Matt, thanks a lot for your time on my question. I appreciate your help! I now understand the difficulties of this type of optimization problem. This might be why papers always work out other efficient solutions to this type of non-convex problem. Thanks again!

Also, thanks a lot to everyone else for their kind and patient help, especially Roger Stafford and Brian Borchers.

Antony