From: John D'Errico on
Josh <josh.boys(a)gmail.com> wrote in message <a09c953c-55eb-4e94-b973-71c1e7f00e2b(a)g19g2000yqc.googlegroups.com>...

> The task that I am undertaking is reproducing the work of Agarwal and
> Triggs outlined in their paper "Recovering 3D Human Pose from
> Monocular Images". Their paper presents an approach for recovering
> pose (55D vectors) from silhouette images (100D vectors) using ridge
> regression.
>
> Yet, you say this is a hopeless task. Can you please explain the
> discrepancy?

One thing that you have never bothered to tell us is
what the order of your model is.

You talk about using ridge regression on a 4th order
polynomial model. But you have never said whether the
goal is to build a 4th order model, a simple linear
one, or something close to that.

Perhaps if you explain your problem clearly, you might
have more success. I don't have access to that paper,
nor do I have any reason to wish to read it.

Anyway, the fact that someone published a paper on
the subject means nothing. I recall many papers
published about cold fusion too.

And finally, as I said before, you might want to spend
some time learning about regression modeling, rather
than trying to jump into a huge problem with no
knowledge of what you are doing.

John
From: Bruno Luong on
J B <trifinite84(a)googlemail.com> wrote in message <e1e9378c-dc91-4381-b967-e8c027284bbe(a)d16g2000yqb.googlegroups.com>...
> Hello,
>
> I wish to use ridge regression to learn a mapping between data that is
> of 100 dimensions and another that is of 22 dimensions. Before I can
> do this, I need to gain a better understanding of ridge regression and
> how it is used within matlab, so I am trying to use ridge regression
> in the simple case of polynomial fitting.
>
> In my test example, I am using the function y = sin(2*pi*x), where x
> is 11 points evenly spaced in the range [0.0, 1.0] (incrementing from
> 0.0 by 0.1).
>
> Given the help manual, the ridge function is defined as follows: B1 =
> ridge(Y, X, K). However, the manual goes on to state that for making
> predictions, B0 = ridge(Y, X, K, 0) is better suited.
>
> Firstly, given the above function and values of x, is it correct that
> my input variables should be defined as follows:
>

If you want the constant term, you need a column of all 1s in the matrix X. Polynomials in MATLAB are also written in reverse order (highest power first):

x = [0:0.1:1]'                  % 11 points in [0, 1]
y = sin(2*pi*x)

X = bsxfun(@power,x,[4:-1:0])   % columns: x.^4, x.^3, x.^2, x, 1
b = inv(X'*X)*X'*y % regression via normal equations, badly done, for educational purposes only
% y = b(1)*x^4 + b(2)*x^3 + b(3)*x^2 + b(4)*x + b(5)
plot(x,polyval(b,x))
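
For reference (just an aside): the backslash operator gives the same least-squares fit without forming inv(X'*X) explicitly:

b = X \ y   % same solution as inv(X'*X)*X'*y, numerically better behaved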

Bruno
From: Bruno Luong on

> plot(x,polyval(b,x))
>

Note that polyval(b,x) is equal to X*b.
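
A quick sketch to verify that, repeating the setup from my earlier post:

x = [0:0.1:1]';
y = sin(2*pi*x);
X = bsxfun(@power,x,[4:-1:0]);
b = X \ y;
norm(polyval(b,x) - X*b)   % zero up to rounding error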

Bruno
From: Peter Perkins on
On 6/27/2010 2:08 PM, J B wrote:
> Given the help manual, the ridge function is defined as follows: B1 =
> ridge(Y, X, K). However, the manual goes on to state that for making
> predictions, B0 = ridge(Y, X, K, 0) is better suited.

> Given the polynomial w0 + w1*x + w2*x^2 + ... are these coefficients
> starting at w0 or w1?

It depends on whether you called RIDGE with that fourth input set to 0
or not. With your X and Y, this:

B0 = ridge(Y, X, K, 0)
x = X(:,1);
y = [ones(size(X,1),1) X] * B0;
plot(x,y,'-o');

evaluates the fitted polynomial at the x values. I confess that I've
never heard of someone using ridge regression on a polynomial like this,
but presumably you're just trying something simple. Perhaps too simple.
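
For concreteness, here is one way the pieces might fit together for your sin(2*pi*x) example. The columns of X and the value of K are just my guesses, not something you have shown:

x = [0:0.1:1]';
Y = sin(2*pi*x);
X = [x x.^2 x.^3 x.^4];             % predictors, WITHOUT a column of ones
K = 0.01;                           % ridge parameter, picked arbitrarily here
B0 = ridge(Y, X, K, 0);             % fourth input 0 -> original scale, intercept first
yfit = [ones(size(X,1),1) X] * B0;
plot(x, Y, 'o', x, yfit, '-');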

Hope this helps.
From: Bruno Luong on
Peter Perkins <Peter.Perkins(a)MathRemoveThisWorks.com> wrote in message <i0a9ie$1af$1(a)fred.mathworks.com>...

> I confess that I've
> never heard of someone using ridge regression on a polynomial like this,
> but presumably you're just trying something simple. Perhaps too simple.

It should be fine to use ridge regression for polynomial fitting; the user just has to pay close attention to the regularization matrix, which should be appropriately constructed, for example proportional to the Gram matrix of the polynomial basis and/or its derivatives. Otherwise, with a plain identity penalty on the columns of X as above, garbage out is guaranteed.
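
To make that concrete, here is a rough sketch of the kind of construction I mean; the basis, lambda, and the Gram matrix on [0,1] are only illustrative choices:

x = [0:0.1:1]';
y = sin(2*pi*x);
p = [4:-1:0];
X = bsxfun(@power,x,p);                 % columns x.^4 ... x.^0
G = 1./(bsxfun(@plus,p',p)+1);          % Gram matrix of the monomials on [0,1]
lambda = 1e-3;
b = (X'*X + lambda*G) \ (X'*y);         % Tikhonov: minimize ||X*b-y||^2 + lambda*b'*G*b
plot(x, y, 'o', x, X*b, '-')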

Bruno