From: Anton on
Hello, I need to fit a curve very accurately. The function is

n = (k * sqrt(1 + x^2 / h^2) + phi) / pi

where k, h, phi are fitting parameters
and the data is
n =
2
3
4
5
6
7

x =
3.7910
4.0640
4.2850
4.4930
4.6750
4.8440

I tried a nonlinear least squares fit, but it comes up with a fitted curve that is essentially straight, so the residuals go up and then come back down rather than being randomly distributed.
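
For reference, the fit was set up along these lines (a sketch; lsqcurvefit and the starting guesses p0 here are just stand-ins for whatever solver and initial values are actually used):

x = [3.7910; 4.0640; 4.2850; 4.4930; 4.6750; 4.8440];
n = (2:7)';

% model: n = (k*sqrt(1 + x.^2/h.^2) + phi)/pi, with p = [k, h, phi]
model = @(p,x) (p(1)*sqrt(1 + x.^2./p(2).^2) + p(3))/pi;
p0 = [1; 1; 0];                       % placeholder starting guesses
p  = lsqcurvefit(model, p0, x, n);    % nonlinear least squares
res = n - model(p,x);                 % these go up and then come back down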

Can you suggest a better method?

Thank you,

Antonmaz
From: John D'Errico on
"Anton " <anton.mazurenko(a)gmail.com> wrote in message <hqnmt4$1bj$1(a)fred.mathworks.com>...
> Hello, I need to fit a curve very accurately. The function is
>
> n = (k * sqrt(1 + x^2 / h^2) + phi) / pi
>
> where k, h, phi are fitting parameters
> and the data is
> n =
> 2
> 3
> 4
> 5
> 6
> 7
>
> x =
> 3.7910
> 4.0640
> 4.2850
> 4.4930
> 4.6750
> 4.8440
>
> I tried a nonlinear least squares fit, but it comes up with a fitted curve that is essentially straight, so the residuals go up and then come back down rather than being randomly distributed.
>
> Can you suggest a better method?

Sigh. You need to fit this curve very accurately.

Of course, the model that you pose does not look
even remotely like the data that you have. You
cannot fit a model simply by wanting very much
for it to fit. Prayer won't help either.

Since you apparently have no idea of a good model
for this data, we cannot help you. The best I can
offer is the simple polyfit.

P = polyfit(x,n,2);

At least it fits quite nicely. And anything of a higher
order than a quadratic polynomial is pretty much a
waste of CPU cycles.
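
If you want to check it, something like the following
will do (a sketch; the n - 3 degrees-of-freedom choice
for the residual standard deviation is just one
reasonable convention):

x = [3.7910; 4.0640; 4.2850; 4.4930; 4.6750; 4.8440];
n = (2:7)';

P   = polyfit(x,n,2);                 % quadratic coefficients, highest power first
res = n - polyval(P,x);               % residuals of the quadratic fit
s   = sqrt(sum(res.^2)/(numel(n)-3)); % residual standard deviation (n - 3 dof)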

John
From: James Phillips on
This data set fits extremely well to the standard Steinhart-Hart equation:

n = coeff_A + coeff_B * ln(x) + coeff_C * ln(x)^3

and appears to both interpolate and extrapolate smoothly. I found this by using the 'Function Finder' at http://zunzun.com

Wikipedia link to Steinhart-Hart equation: http://en.wikipedia.org/wiki/Steinhart%E2%80%93Hart_equation
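
Since the model is linear in coeff_A, coeff_B and coeff_C, it can also be fit with an ordinary linear least squares. A sketch in MATLAB (this is not how zunzun.com computes it, just one way to reproduce the fit):

x = [3.7910; 4.0640; 4.2850; 4.4930; 4.6750; 4.8440];
n = (2:7)';

% ln(x) is log(x) in MATLAB
M = [ones(size(x)), log(x), log(x).^3];   % design matrix for [A; B; C]
c = M \ n;                                % linear least squares solve
res = n - M*c;                            % residuals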

James Phillips
zunzun(a)zunzun.com
http://zunzun.com

"Anton " <anton.mazurenko(a)gmail.com> wrote in message <hqnmt4$1bj$1(a)fred.mathworks.com>...
> Hello, I need to fit a curve very accurately. The function is
>
> n = (k * sqrt(1 + x^2 / h^2) + phi) / pi
>
> where k, h, phi are fitting parameters
> and the data is
> n =
> 2
> 3
> 4
> 5
> 6
> 7
>
> x =
> 3.7910
> 4.0640
> 4.2850
> 4.4930
> 4.6750
> 4.8440
>
> I tried a nonlinear least squares fit, but it comes up with a fitted curve that is essentially straight, so the residuals go up and then come back down rather than being randomly distributed.
>
> Can you suggest a better method?
>
> Thank you,
>
> Antonmaz
From: John D'Errico on
"James Phillips" <zunzun(a)zunzun.com> wrote in message <htb944$fa7$1(a)fred.mathworks.com>...
> This data set fits extremely well to the standard Steinhart-Hart equation:
>
> n = coeff_A + coeff_B * ln(x) + coeff_C * ln(x)^3
>
> and appears to both interpolate and extrapolate smoothly. I found this by using the 'Function Finder' at http://zunzun.com
>
> Wikipedia link to Steinhart-Hart equation: http://en.wikipedia.org/wiki/Steinhart%E2%80%93Hart_equation
>
> James Phillips
> zunzun(a)zunzun.com
> http://zunzun.com

Sorry, but this is just silly. Ridiculous in the extreme.

Note that the model you pose results in a fit that,
while accurate, is in fact poorer than the fit you
get from a simple quadratic polynomial.

The model you pose has a residual standard deviation
of 0.012987. The R^2 is 0.99996.

While these are reasonably good numbers for a curve
fit, note that had you tried a simple quadratic
polynomial, you would have found that it yields a
lower residual error and a higher R^2.

The quadratic model has a residual standard deviation
of 0.011293, with R^2 = 0.99996. And the last time I
checked, quadratic polynomials also extrapolate fairly
smoothly. Note that the quadratic has the same number
of coefficients (three) as the model that you pulled
out of a hat.
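
One way to make the comparison yourself (a sketch; the
degrees-of-freedom convention may differ from the one
behind the numbers above):

x = [3.7910; 4.0640; 4.2850; 4.4930; 4.6750; 4.8440];
n = (2:7)';

% Steinhart-Hart style model (linear in its three coefficients)
M  = [ones(size(x)), log(x), log(x).^3];
r1 = n - M*(M\n);

% quadratic polynomial (also three coefficients)
P  = polyfit(x,n,2);
r2 = n - polyval(P,x);

s1 = sqrt(sum(r1.^2)/(numel(n)-3));            % residual std, log model
s2 = sqrt(sum(r2.^2)/(numel(n)-3));            % residual std, quadratic
R2 = @(r) 1 - sum(r.^2)/sum((n-mean(n)).^2);   % R^2 for each fit: R2(r1), R2(r2)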

My point is that throwing a canned routine at your
data to automatically test a variety of models until
you find one that fits is a remarkably fatuous strategy,
especially if you can't be bothered to compare the
results to a simple quadratic polynomial fit on the
same data. Yes, you are proud of your code. But next
time, try showing some common sense when you post a
response like this.

John
From: James Phillips on
"John D'Errico" <woodchips(a)rochester.rr.com> wrote in message <htbbvv$jtc$1(a)fred.mathworks.com>...
>
> Sorry, but this is just silly. Ridiculous in the extreme.

I apologize for posing a well-understood alternative physical model that might help explain the underlying physics of the data set, even though you did not. Better?

Note the number of digits of precision in the data set.

James