From: Bruno Luong on
"Antony " <mutang.bing(a)gmail.com> wrote in message <i20kpf$8sh$1(a)fred.mathworks.com>...
> Brian Borchers <borchers.brian(a)gmail.com> wrote in message <b4dd9eeb-e56c-4448-9e52-717916ce8465(a)y4g2000yqy.googlegroups.com>...
> >
> > If possible, put your equation in text format. If that's not
> > possible, most readers of this group understand LaTeX notation.
>
> OK. Thank you! I'll try to write the inverse problem as below:
>
> min_x { ||Ax - b||^2 + \lambda_1 ||x|| + \lambda_2 ||gradient(x)|| }
> s.t. x > 0
>
> where Ax - b is reformulated from Ax = b, ||x|| is the Tikhonov regularizer, and ||gradient(x)|| is the smoothness regularizer.
>

I assume you meant s.t. (x >= 0), since minimization won't work on an open set. Have you tried LSQNONNEG, or QUADPROG with the LargeScale option?
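
For instance, a minimal sketch, assuming A and b are already in your workspace (the regularization terms are left out here; they can be folded in by stacking extra rows onto A and b):

x = lsqnonneg(A, b);   % solves min ||A*x - b|| subject to x >= 0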

Bruno
From: Bruno Luong on
"Antony " <mutang.bing(a)gmail.com> wrote in message <i20kpf$8sh$1(a)fred.mathworks.com>...
> Brian Borchers <borchers.brian(a)gmail.com> wrote in message <b4dd9eeb-e56c-4448-9e52-717916ce8465(a)y4g2000yqy.googlegroups.com>...
> >
> > If possible, put your equation in text format. If that's not
> > possible, most readers of this group understand LaTeX notation.
>
> OK. Thank you! I'll try to write the inverse problem as below:
>
> min_x { ||Ax - b||^2 + \lambda_1 ||x|| + \lambda_2 ||gradient(x)|| }
> s.t. x > 0
>
> where Ax - b is reformulated from Ax = b, ||x|| is the Tikhonov regularizer, and ||gradient(x)|| is the smoothness regularizer.
>

Also, did you mean

min_x { ||Ax - b||^2 + \lambda_1 ||x||^2 + \lambda_2 ||gradient(x)||^2 }

The latter is standard Tikhonov regularization (note that you can group the 2-norm and the gradient semi-norm into an H1 norm).
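
In that squared form the whole problem can also be posed as one stacked nonnegative least-squares problem,

min_x || [A; sqrt(\lambda_1)*I; sqrt(\lambda_2)*D] x - [b; 0; 0] ||^2   s.t. x >= 0

with D a discrete gradient matrix. A rough MATLAB sketch, assuming A, b, lambda1 and lambda2 are given and taking D as a simple first-difference matrix:

n = size(A, 2);
D = diff(speye(n));                          % (n-1)-by-n first-difference (gradient) matrix
As = [A; sqrt(lambda1)*speye(n); sqrt(lambda2)*D];
bs = [b; zeros(n,1); zeros(n-1,1)];
x = lsqnonneg(full(As), bs);                 % LSQNONNEG wants a dense matrix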

Bruno
From: Antony on
"Bruno Luong" <b.luong(a)fogale.findmycountry> wrote in message <i20obs$ie7$1(a)fred.mathworks.com>...
...
>
> I assume you meant s.t. (x >= 0), since minimization won't work on an open set. Have you tried LSQNONNEG, or QUADPROG with the LargeScale option?
>
> Bruno

Thanks Bruno. Yes, I mean positive values of x. But it seems that LSQNONNEG cannot work with constraints. I also have no idea how to use QUADPROG, because it targets the form min_x {x^T H x + f^T x}. Do I need to rewrite the 2-norm minimization into this form?
From: Antony on
"Bruno Luong" <b.luong(a)fogale.findmycountry> wrote in message <i20p2c$1ft$1(a)fred.mathworks.com>...
>
> Also, did you mean
>
> min_x { ||Ax - b||^2 + \lambda_1 ||x||^2 + \lambda_2 ||gradient(x)||^2 }
>
> The latter is standard Tikhonov regularization (note that you can group the 2-norm and the gradient semi-norm into an H1 norm).
>
> Bruno

Thanks a lot for your advice! I realize I have often made this kind of mistake before... Thanks again!

Antony
From: Bruno Luong on
"Antony " <mutang.bing(a)gmail.com> wrote in message <i20v93$lkr$1(a)fred.mathworks.com>...
> "Bruno Luong" <b.luong(a)fogale.findmycountry> wrote in message <i20obs$ie7$1(a)fred.mathworks.com>...
> ...
> >
> > I assume you meant s.t. (x >= 0), since minimization won't work on an open set. Have you tried LSQNONNEG, or QUADPROG with the LargeScale option?
> >
> > Bruno
>
> Thanks Bruno. Yes, I mean positive values of x. But it seems that LSQNONNEG cannot work with constraints.

Not sure what you meant by that; LSQNONNEG is specifically designed for this constraint.

> I also have no idea how to use QUADPROG, because it targets the form min_x {x^T H x + f^T x}. Do I need to rewrite the 2-norm minimization into this form?

Yes, you can write your function as a quadratic form:

1/2 ||Ax - b||^2 = 1/2 x'*H*x + g'*x + c

with H = A'*A, g' = -b'*A, and c = 1/2*||b||^2 (independent of x). You can do the same with the regularization terms and add everything together.
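
A rough MATLAB sketch of that, with both regularizers folded into H (A, b, lambda1 and lambda2 assumed given; D taken as a first-difference gradient matrix):

n = size(A, 2);
D = diff(speye(n));                           % discrete gradient operator
H = A'*A + lambda1*speye(n) + lambda2*(D'*D); % Hessian of the full objective
g = -A'*b;                                    % linear term (note the minus sign)
lb = zeros(n, 1);                             % lower bound enforcing x >= 0
x = quadprog(H, g, [], [], [], [], lb);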

Bruno