From: Kaba on
Hi,

Here's an interesting problem:

Let
P = [p_1, ..., p_n] in R^{m x n}, and
R = [r_1, ..., r_n] in R^{m x n}

Find a matrix A in R^{m x m} with det(A) > 0, such that
the least-squares error
E = sum_{i = 1}^n || Ap_i - r_i ||^2
is minimized.

The non-oriented version of this problem has solution
A = (RP^T)(PP^T)^-1. However, the orientation requirement
seems to make the problem harder.
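For the non-oriented version, the closed-form solution above is easy to check numerically. A minimal numpy sketch, assuming P P^T is invertible (variable names here are illustrative, not from the post):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 10
P = rng.standard_normal((m, n))
A_true = rng.standard_normal((m, m))
R = A_true @ P                          # data generated by an exact linear map

# Unconstrained least-squares solution A = (R P^T)(P P^T)^{-1}
A = (R @ P.T) @ np.linalg.inv(P @ P.T)
```

With exact data as above, A recovers A_true up to rounding; with noisy R it minimizes E without any constraint on det(A).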

Can you see a way to approach this problem?

--
http://kaba.hilvi.org
From: Kaba on
Kaba wrote:
> Hi,
>
> Here's an interesting problem:
>
> Let
> P = [p_1, ..., p_n] in R^{m x n}, and
> R = [r_1, ..., r_n] in R^{m x n}
>
> Find a matrix A in R^{m x m} with det(A) > 0, such that
> the least-squares error
> E = sum_{i = 1}^n || Ap_i - r_i ||^2
> is minimized.
>
> The non-oriented version of this problem has solution
> A = (RP^T)(PP^T)^-1. However, the orientation requirement
> seems to make the problem harder.
>
> Can you see a way to approach this problem?

Aah, I see it:

Let

1) <x> = <x, x> = ||x||^2
2) B be a matrix that minimizes E, without restrictions on orientation.
3) Q be a reflection, i.e. Q^T Q = I, Q^2 = I, and det(Q) = -1.
4) B have wrong orientation.
5) A = QB

Then

E = sum_{i = 1}^n <Ap_i - r_i>
= sum_{i = 1}^n [<Ap_i> - 2<Ap_i, r_i> + <r_i>]
= sum_{i = 1}^n [<QBp_i> - 2<QBp_i, r_i> + <r_i>]
= sum_{i = 1}^n [<Bp_i> - 2<QBp_i, r_i> + <r_i>],

since <QBp_i> = <Bp_i> by the orthogonality of Q.

We thus see that to minimize E we need to maximize
E' = sum_{i = 1}^n <QBp_i, r_i>.

However, I already know how to solve this problem: it is the oriented
rigid linear least-squares problem.
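That subproblem, maximizing sum_i <Q x_i, y_i> over orthogonal Q with a prescribed determinant sign, has the standard SVD solution (orthogonal Procrustes / Kabsch). A sketch under that assumption, with illustrative names:

```python
import numpy as np

def best_rotation(X, Y):
    """Q maximizing sum_i <Q x_i, y_i> subject to Q^T Q = I, det(Q) = +1.

    Standard SVD construction; for the det(Q) = -1 variant, force the
    last diagonal entry of D to -1 instead.
    """
    M = Y @ X.T                          # M = sum_i y_i x_i^T
    U, _, Vt = np.linalg.svd(M)
    d = np.sign(np.linalg.det(U @ Vt))   # correct the orientation if needed
    D = np.eye(M.shape[0])
    D[-1, -1] = d
    return U @ D @ Vt

# Sanity check on synthetic data: recover a known rotation
rng = np.random.default_rng(1)
X = rng.standard_normal((3, 8))
c, s = np.cos(0.3), np.sin(0.3)
Q_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
Q = best_rotation(X, Q_true @ X)
```

Here one would take x_i = Bp_i and y_i = r_i; whether the resulting A = QB actually minimizes E under the determinant constraint is a separate question (see below in the thread).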

--
http://kaba.hilvi.org
From: Kaba on
Kaba wrote:
> Aah, I see it:
>
> Let
>
> 1) <x> = <x, x> = ||x||^2
> 2) B be a matrix that minimizes E, without restrictions on orientation.
> 3) Q be a reflection, i.e. Q^T Q = I and Q^2 = I.

Or maybe not: the relation between A and B need not be a
reflection...

--
http://kaba.hilvi.org
From: Robert Israel on
Kaba <none(a)here.com> writes:

> Hi,
>
> Here's an interesting problem:
>
> Let
> P = [p_1, ..., p_n] in R^{m x n}, and
> R = [r_1, ..., r_n] in R^{m x n}
>
> Find a matrix A in R^{m x m} with det(A) > 0, such that
> the least-squares error
> E = sum_{i = 1}^n || Ap_i - r_i ||^2
> is minimized.
>
> The non-oriented version of this problem has solution
> A = (RP^T)(PP^T)^-1. However, the orientation requirement
> seems to make the problem harder.
>
> Can you see a way to approach this problem?

If a solution of the non-oriented version has det(A) > 0, then
that solves the oriented problem as well. Otherwise, there is no
minimizer with det(A) > 0.
But you might want to make the constraint det(A) >= 0, in which case
you might get a solution with det(A) = 0 using a Lagrange multiplier.
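A small numpy illustration of this point, with hypothetical data built from an orientation-reversing map:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 3, 12
P = rng.standard_normal((m, n))
A_flip = np.diag([1.0, 1.0, -1.0])       # orientation-reversing target map
R = A_flip @ P

# Unconstrained minimizer B = (R P^T)(P P^T)^{-1}
B = (R @ P.T) @ np.linalg.inv(P @ P.T)
# det(B) < 0 here, so no A with det(A) > 0 attains the minimum of E;
# under the relaxed constraint det(A) >= 0, the constrained minimizer
# lies on the boundary det(A) = 0.
```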
--
Robert Israel israel(a)math.MyUniversitysInitials.ca
Department of Mathematics http://www.math.ubc.ca/~israel
University of British Columbia Vancouver, BC, Canada
From: Robert Israel on
Kaba <none(a)here.com> writes:

> Kaba wrote:
> > Hi,
> >
> > Here's an interesting problem:
> >
> > Let
> > P = [p_1, ..., p_n] in R^{m x n}, and
> > R = [r_1, ..., r_n] in R^{m x n}
> >
> > Find a matrix A in R^{m x m} with det(A) > 0, such that
> > the least-squares error
> > E = sum_{i = 1}^n || Ap_i - r_i ||^2
> > is minimized.
> >
> > The non-oriented version of this problem has solution
> > A = (RP^T)(PP^T)^-1. However, the orientation requirement
> > seems to make the problem harder.
> >
> > Can you see a way to approach this problem?
>
> Aah, I see it:
>
> Let
>
> 1) <x> = <x, x> = ||x||^2
> 2) B be a matrix that minimizes E, without restrictions on orientation.
> 3) Q be a reflection, i.e. Q^T Q = I and Q^2 = I.
> 4) B have wrong orientation.
> 5) A = QB

No, this will almost never work.
The matrix B is likely to be invertible, and if det(B) < 0
the constrained minimizer must have det(A) = 0: the objective
is convex in A, so any local minimum is a global minimum.
--
Robert Israel israel(a)math.MyUniversitysInitials.ca
Department of Mathematics http://www.math.ubc.ca/~israel
University of British Columbia Vancouver, BC, Canada