From: Evan Ruzanski on
Hello,

I'm trying to find linear regression coefficients based on maximizing the correlation coefficient between a set of observed predictors (X) and observations (y) (instead of minimizing the LSE as is done with standard linear regression, i.e., b = [(X^TX)^-1]X^Ty). In other words, I'm trying to find an optimum set of predictor coefficients based on minimizing a different cost function.

For example (using randn just for illustration but I intend to use real data),
>> X = randn(5,2)

X =

-0.0301 -0.8637
-0.1649 0.0774
0.6277 -1.2141
1.0933 -1.1135
1.1093 -0.0068

>> y = randn(5,1)

y =

1.5326
-0.7697
0.3714
-0.2256
1.1174

>> tf = @(b)(1-corrcoef(X*b,y));
>> y_est = fminsearch(tf,[1;1]);
??? Subscripted assignment dimension mismatch.

Error in ==> fminsearch at 205
fv(:,1) = funfcn(x,varargin{:});

Can someone please tell me the right/best way to do this (or even if this is a well-posed problem)???

Many thanks!
From: Steven Lord on

"Evan Ruzanski" <ruzanski.02(a)engr.colostate.edu> wrote in message
news:i1hcqf$gph$1(a)fred.mathworks.com...
> Hello,
>
> I'm trying to find linear regression coefficients based on maximizing the
> correlation coefficient between a set of observed predictors (X) and
> observations (y) (instead of minimizing the LSE as is done with standard
> linear regression, i.e., b = [(X^TX)^-1]X^Ty). In other words, I'm trying
> to find an optimum set of predictor coefficients based on minimizing a
> different cost function.
>
> For example (using randn just for illustration but I intend to use real
> data),

*snip code indicating X is 5-by-2 and y is 5-by-1*

>>> tf = @(b)(1-corrcoef(X*b,y));

What size is the output of this function when evaluated at the initial guess
you passed in to FMINSEARCH?

What size of output does FMINSEARCH expect the objective function you pass
into it to return? [This is listed in the first paragraph in HELP
FMINSEARCH.]

You will need to modify your objective function so that the two sizes that
answer those two questions agree.
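As a minimal sketch of one such modification (illustrative only: CORRCOEF
returns a 2-by-2 correlation matrix, so a small wrapper can return a single
off-diagonal entry, which gives FMINSEARCH the scalar it expects; the name
negcorr and the starting point below are just example choices):

function f = negcorr(b, X, y)   % e.g. saved as negcorr.m
% Objective for FMINSEARCH: 1 minus the correlation between X*b and y.
R = corrcoef(X*b, y);    % 2-by-2 correlation matrix
f = 1 - R(1,2);          % scalar value, as FMINSEARCH requires
end

b0 = X\y;                                       % e.g. start from the least-squares fit
b_hat = fminsearch(@(b) negcorr(b, X, y), b0);

Note that corrcoef(X*b,y) does not change when b is multiplied by a positive
constant, so a maximizer of the correlation is only determined up to scale.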

--
Steve Lord
slord(a)mathworks.com
comp.soft-sys.matlab (CSSM) FAQ: http://matlabwiki.mathworks.com/MATLAB_FAQ
To contact Technical Support use the Contact Us link on
http://www.mathworks.com


From: Evan Ruzanski on
Hi Steven,

Thank you for the reply. In this simple example, the variable b is 2-by-1 and the output of the objective function tf is 1-by-1. Can you tell me how to modify the objective function so that the two sizes that answer those two questions agree? I don't see the answer in the first paragraph of HELP FMINSEARCH:

>> help fminsearch
FMINSEARCH Multidimensional unconstrained nonlinear minimization (Nelder-Mead).
X = FMINSEARCH(FUN,X0) starts at X0 and attempts to find a local minimizer
X of the function FUN. FUN is a function handle. FUN accepts input X and
returns a scalar function value F evaluated at X. X0 can be a scalar, vector
or matrix.

"Steven Lord" <slord(a)mathworks.com> wrote in message <i1hqrr$ngq$1(a)fred.mathworks.com>...
> *snip quoted reply*
From: John D'Errico on
"Evan Ruzanski" <ruzanski.02(a)engr.colostate.edu> wrote in message <i1hcqf$gph$1(a)fred.mathworks.com>...
> Hello,
>
> I'm trying to find linear regression coefficients based on maximizing the correlation coefficient between a set of observed predictors (X) and observations (y) (instead of minimizing the LSE as is done with standard linear regression, i.e., b = [(X^TX)^-1]X^Ty). In other words, I'm trying to find an optimum set of predictor coefficients based on minimizing a different cost function.


DON'T do this.

To start with, you do not even know how to
solve the regression problem. This is a TERRIBLE
line of code:

b = inv(X'*X)*X'*y;

It uses a matrix inverse instead of backslash. It
squares the condition number, making the
problem more ill-conditioned than it should be.

The correct way to solve that problem is:

b = X\y;
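
As a rough illustration of that conditioning
point (toy data, not the problem above): a matrix
with nearly dependent columns has a large
condition number, and forming A'*A roughly
squares it.

A = randn(100,5);
A(:,5) = A(:,4) + 1e-6*randn(100,1);   % nearly dependent columns
cond(A)       % large
cond(A'*A)    % roughly the square of cond(A)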

If that fails, then the use of fminsearch to solve
your problem is (I'm sorry to say this) laughable.
Fminsearch will not be able to resolve a poorly
conditioned least squares problem more accurately
than backslash.

If you still have problems, then the next thing
to do is to learn to use rescaling or another
transformation of your problem to improve the
conditioning. Better yet, recognize when the problem
is simply a result of terribly generated data that
will never be adequate for estimation as it is.
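
A minimal sketch of the sort of rescaling meant
here (simple column normalization; whether this
is appropriate depends on the data):

s = sqrt(sum(X.^2,1));          % 1-by-n column norms
Xs = bsxfun(@rdivide, X, s);    % columns rescaled to unit norm
b = (Xs\y)./s';                 % solve, then undo the scaling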

John
From: Evan Ruzanski on
Hi John,

Thank you for your reply. Your posts have helped me before and you consistently give outstanding advice to this newsgroup. Many thanks for sharing your wisdom.

Let me clarify and give you the "whole story" behind my original post. Perhaps you could kindly provide input...

I gave a degenerate example in my post here just to understand the interface for fminsearch. I absolutely agree with everything you said with regard to the technical content (or lack thereof) of my post; I was using a very simple numerical example only to figure out how to call fminsearch, and I thought more people would reply to a simpler example!

The real problem I am trying to solve is fitting a linear regression model to some real data, but instead of minimizing the mean square error (as is traditionally done with linear regression, multiple linear regression in my case, which gives a very simple solution), I wish to minimize the dispersion component of the RMSE (i.e., to reduce phase errors), given by

RMSE_disp = sqrt(2*std(p_hat)*std(p)*(1-cc)),

where ccm = corrcoef(p_hat,p) and cc = ccm(2,1). Also, I parameterize p_hat = X*beta. Here p (m x 1) and X (m x n) are observed, and beta (n x 1) is the coefficient vector over which the minimization is carried out.

I did a quick analysis using a surface plot of the dispersion term as a function of std(p_hat)*std(p) and cc, and it appears to be convex. Rather than trudging through an analytical minimization using the derivative of the dispersion term w.r.t. beta, I thought I would use fminsearch to minimize it.
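
For concreteness, a sketch of that setup (the function name rmse_disp and the starting point beta0 are only illustrative, and seeding with the ordinary least-squares fit is just one reasonable choice):

function d = rmse_disp(beta, X, p)   % e.g. saved as rmse_disp.m
% Dispersion component of the RMSE between the prediction X*beta and p.
p_hat = X*beta;
R = corrcoef(p_hat, p);                       % 2-by-2 correlation matrix
d = sqrt(2*std(p_hat)*std(p)*(1 - R(2,1)));   % scalar objective for fminsearch
end

beta0 = X\p;                                            % least-squares starting point
beta_hat = fminsearch(@(beta) rmse_disp(beta, X, p), beta0);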

What do you think about such an approach???

Best regards,
Evan

"John D'Errico" <woodchips(a)rochester.rr.com> wrote in message <i1isd1$cgh$1(a)fred.mathworks.com>...
> "Evan Ruzanski" <ruzanski.02(a)engr.colostate.edu> wrote in message <i1hcqf$gph$1(a)fred.mathworks.com>...
> > Hello,
> >
> > I'm trying to find linear regression coefficients based on maximizing the correlation coefficient between a set of observed predictors (X) and observations (y) (instead of minimizing the LSE as is done with standard linear regression, i.e., b = [(X^TX)^-1]X^Ty). In other words, I'm trying to find an optimum set of predictor coefficients based on minimizing a different cost function.
>
>
> DON'T do this.
>
> To start with, you do not even know how to
> solve the regression problem. This is a TERRIBLE
> line of code:
>
> b = [(X'*X)^-1]X'*y);
>
> It uses a matrix inverse instead of backslash. It
> squares the condition number, making the
> problem more ill-conditioned than it should be.
>
> The correct way to solve that problem is:
>
> b = X\y;
>
> If that fails, then the use of fminsearch to solve
> your problem is (I'm sorry to say this) laughable.
> Fminsearch will not be able to resolve a poorly
> conditioned least squares problem more accurately
> than backslash.
>
> If you still have problems, then the next thing
> to do is to learn to use rescaling or another
> transformation of your problem to improve the
> conditioning. Better, recognize when the problem
> is simply a result of terribly generated data, that
> will never be adequate for estimation as it is.
>
> John