From: Eric Diaz on
Dear Matlab User Community,

The MathWorks has posted a simple example of how to perform separable nonlinear least squares using the LSQCURVEFIT function.

http://www.mathworks.com/support/solutions/en/data/1-18E03/index.html?product=OP&solution=1-18E03

This is a good solution in the sense that it provides a simple separable LSQCURVEFIT setup for a mono-exponential function such as y = A.*exp(B.*XDATA).
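
In rough terms, the pattern in that solution looks like this (a sketch from memory; sepfun and its signature are my own naming, and I pass ydata into the model through an anonymous function):

% Fit only the nonlinear parameter B; for any fixed B the linear
% parameter A is solved in closed form inside the model function
% (this is the "variable projection" idea).
Bstart = 1; % initial guess for B only
Best = lsqcurvefit(@(B,x) sepfun(B,x,ydata), Bstart, xdata, ydata(:));
[yfit, Aest] = sepfun(Best, xdata, ydata); % recover the linear coefficient A

function [yhat, A] = sepfun(B, xdata, ydata)
% For a fixed B, y = A*exp(B*xdata) is linear in A, so A has a
% closed-form linear least squares solution.
basis = exp(B.*xdata(:));
A = basis \ ydata(:); % linear LS solve for A given B
yhat = A.*basis;      % prediction returned to lsqcurvefit
end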

However, what if the user would like to use the 'Jacobian' option of OPTIMSET, which requires the objective function to return [y J]? The given example returns the linear coefficients as the second output of the function (rather than the Jacobian), which conflicts with that convention and is confusing.

In addition, what if the user would like to supply Jacobian information for both parameters, given that in the separable formulation LSQCURVEFIT optimizes over only one parameter, not all of them? Can the linear regression step use the Jacobian information for the A parameter? Can confidence intervals for the A parameter be computed? Can NLPREDCI still be used?

These are issues that concern me, but I have not yet been able to come up with a satisfactory solution.

Any discourse on these topics would be greatly appreciated and of practical value to me. Thank you in advance!

Eric Diaz
From: John D'Errico on
"Eric Diaz" <eric.diaz(a)gmail.com> wrote in message <hqv6ll$ltj$1(a)fred.mathworks.com>...
> Dear Matlab User Community,
>
> [...]
>
> In addition, what if the user would like to supply Jacobian information for both parameters, given that in the separable formulation LSQCURVEFIT optimizes over only one parameter, not all of them? Can the linear regression step use the Jacobian information for the A parameter? Can confidence intervals for the A parameter be computed? Can NLPREDCI still be used?
>
> [...]
>
> Eric Diaz

Yes, you can get confidence intervals, but it will
take some work on your part.

I would suggest the best way to do it is to create
the Jacobian matrix for all of the parameters. This
is easily done here, or for any such problem. Then
form estimates of the parameter variances as I show
in my Optimization Tips and Tricks.
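
In rough terms, assuming you have the residual
vector res, the parameter estimates parEst, and
the full Jacobian J at the solution, it looks
something like this sketch:

m = numel(res);                    % number of data points
p = numel(parEst);                 % number of parameters
mse = sum(res.^2)/(m - p);         % residual variance estimate
covP = mse*inv(full(J)'*full(J));  % approximate parameter covariance
se = sqrt(diag(covP));             % standard errors
% 95% confidence intervals (tinv is in the Statistics Toolbox)
ci = [parEst(:) - tinv(0.975, m-p)*se, ...
      parEst(:) + tinv(0.975, m-p)*se];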

John
From: Eric Diaz on
"John D'Errico" <woodchips(a)rochester.rr.com> wrote in message
>
> Yes, you can get confidence intervals, but it will
> take some work on your part.
>
> I would suggest the best way to do it is to create
> the Jacobian matrix for all of the parameters. This
> is easily done here, or for any such problem. Then
> form estimates of the parameter variances as I show
> in my Optimization Tips and Tricks.
>
> John

Dear John,

Thank you for your reply. I know that you are an expert in this area. I already have the Jacobian matrix for all of the parameters and am currently returning [y J] as the output of my function.

The way that I currently compute my confidence intervals is with NLPREDCI, but I also compute my standard errors as sqrt(diag(covariance)), where covariance = mse.*inv(J'*J).

That being said, my goal is to enhance my current approach by using variable projection, yet that seems to be confounded by the questions I originally posted.

If there is any way you could be a bit more concrete, I would appreciate it.
I have provided a setup below, for which I would like to implement separable nonlinear least squares (similar to the example) while still providing the Jacobian to LSQCURVEFIT and being able to predict the confidence intervals.

Setup:

p0 = [A0 B0]; %define these yourself
lb = [0 0];
ub = [inf inf];

options = optimset( 'Display' , 'none','Diagnostics','off',...
'MaxIter' , 10,'MaxFunEvals', 500,...
'TolFun' , 1e-12,'TolX' ,1e-12,...
'Jacobian' , 'on','LargeScale','on');

[parEst,sse,res,exitflag,output,lambda,J] = lsqcurvefit(@myfunc,p0,xdata,ydata,lb,ub,options);

% estimate standard error
DOF = numel(xdata)-numel(p0); % Degrees of Freedom

[mse, cov, parSE] = estimateSE(DOF,sse,J);

function [mse, cov, parSE] = estimateSE(DOF,sse,J)
% This function is called after lsqcurvefit
mse = sse/DOF;                    % mean square error
cov = mse.*inv(full(J)'*full(J)); % estimated covariance matrix of coeff estimates
parSE = sqrt(diag(cov));          % standard errors of the parameters
end

% This section predicts the CI for the nonlinear fit
time = 0:0.01:xdata(end);
FitCurve = myfunc(parEst,time);
[ypred, delta] = nlpredci(@myfunc,time,parEst,res,'Jacobian',full(J)); % full() because lsqcurvefit returns a sparse Jacobian


function [Sig, Jacob] = myfunc(p,xdata)
% This function is called by lsqcurvefit.
% p is a vector which contains the coefficients of the
% equation. xdata is the data set passed to lsqcurvefit.

A = p(1);
B = p(2);

Sig = A.*exp(-xdata./B);
if nargout > 1 % build the Jacobian only when it is requested
    Jp1 = exp(-xdata./B);                     % dSig/dA
    Jp2 = A.*(xdata./(B.^2)).*exp(-xdata./B); % dSig/dB
    Jacob = [Jp1(:) Jp2(:)];                  % one column per parameter, for row or column xdata
end
end

Thank you again in advance,

Eric
From: Eric Diaz on
Bump... geez, there has been a lot of spam on here! I hope something can be done about these spammers and their spam; my posts are getting lost in the mix...
From: Eric Diaz on
Bumpity bump bump... wow, I thought more people would be interested in this topic. John's answer was woefully vague and inadequate. Anyone else interested, e.g., TMW?