From: Joerg Roesgen on
Is there any straightforward way to do Orthogonal Distance Regression (ODR) in Mathematica? If not, this would be a nice feature to have in Mathematica.

Background:

Regular nonlinear least-squares fitting minimizes the distance between the data and some function along a single dimension; ODR minimizes it in all dimensions simultaneously. This is desirable, e.g., when the data have errors in more than one dimension, or when parametric equations are used. The basic approach is similar to nonlinear least squares, except that one additional step is performed in each iteration on top of minimizing the distance between the function and the data: each data point has a corresponding point on the fitted function, and the distance between these two points is minimized by moving the corresponding point along the function. So, if nn is the number of fitting parameters in nonlinear least squares and nd is the number of data points, then ODR has nn+nd fitting parameters in 2D.
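For concreteness, the 2-D setup described above can be sketched directly with NMinimize (a minimal illustration, not a built-in feature; the model a Exp[b x] and the data below are invented for the example):

```mathematica
(* ODR sketch in 2-D: fit y = a Exp[b x] by minimizing the total
   squared orthogonal distance. Each data point k gets its own
   foot-point abscissa t[k] on the curve, so NMinimize sees
   nn + nd unknowns, as described above. *)
data = {{0., 1.1}, {1., 2.6}, {2., 7.8}, {3., 19.9}};
f[x_, a_, b_] := a Exp[b x];
feet = Array[t, Length[data]];   (* one foot point per datum *)
obj = Total[MapThread[
    (#1[[1]] - #2)^2 + (#1[[2]] - f[#2, a, b])^2 &,
    {data, feet}]];
NMinimize[obj, Join[{a, b}, feet]]
```

With two fitting parameters and four data points this minimizes over 2 + 4 = 6 unknowns, matching the nn+nd count above.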

Thanks,
Joerg

From: Tomas Garza on
I don't know what you mean by "straightforward", but perhaps you'd care to
look at a demonstration I submitted some time ago to the Wolfram Demonstrations Project:



http://demonstrations.wolfram.com/OrdinaryRegressionAndOrthogonalRegressionInThePlane/



Tomas

> Date: Wed, 19 May 2010 20:13:14 -0400
> From: biophys.hershey(a)me.com
> Subject: Orthogonal Distance Regression available?
> To: mathgroup(a)smc.vnet.net

From: Joerg Roesgen on
Thanks, but if I understand this demonstration correctly, it is about a linear problem. I am particularly interested in non-linear problems.

Joerg


On May 20, 2010, at 2:21 PM, Tomas Garza wrote:


From: Valeri Astanoff on
On 20 May, 02:11, Joerg Roesgen <biophys.hers...(a)me.com> wrote:

Good day,

Though I know less than epsilon about ODR
(orthogonal distance regression), I had fun coding
this little amateur ODR fitting module, which could
be useful for fitting implicit curves to
experimental data.
I mimicked the syntax of FindFit, just adding
an optional list of constraints on the parameters
to avoid nonsensical solutions.

In[1]:=
ODRFit[data_List?MatrixQ, expr_, pars_List, vars_List,
  constraints_List : {}] :=
 Module[{nbVars, coords, nn, slots, fexpr, nbPoints, pts, dx, xn,
   xx0, xx, estimates, deviates, distances, allVars, constr,
   allConstr, mini},
  nbVars = Length[vars]; nn = Range[nbVars];
  nbPoints = Length[data];
  coords = Table[Unique["x"], {nbVars}];
  slots = Array[Slot, {nbVars}];
  fexpr = Evaluate[expr /. Thread[vars -> slots]] &;
  (* dx[n]: mean data spacing along dimension n; xx0[n]: original
     coordinates; xx[n]: the movable foot-point coordinates *)
  Scan[(dx[n_] := dx[n] = (xn = data[[All, n]];
       Mean[Rest[xn - RotateRight[xn]]]);
     xx0[n_] := xx0[n] = data[[All, n]];
     xx[n_] := xx[n] = Array[coords[[n]], nbPoints]), nn];
  pts = Array[xx, nbVars] // Transpose;
  estimates = Flatten /@ ({pts, (fexpr @@ #) & /@ pts} // Transpose);
  deviates = estimates - data;
  distances = #.# & /@ deviates;
  allVars = Join[Sequence @@ pts, pars];
  (* keep each foot point within one mean spacing of its datum *)
  Do[constr[n] = Thread[xx0[n] - Abs@dx[n] <= xx[n] <=
      xx0[n] + Abs@dx[n]], {n, 1, nbVars}];
  allConstr = DeleteCases[Join[constraints,
     Sequence @@ Array[constr, nbVars]], False];
  mini = NMinimize[{Tr@distances, allConstr}, allVars];
  Print["ODR min = ", mini[[1]]];
  mini[[2]]];


Just to test a trivial case:

In[2]:= a x + b /. ODRFit[{{0, 1}, {1, 3}, {2, 5}}, a x + b, {a, b}, {x}]
During evaluation of In[2]:= ODR min = 7.21337*10^-21
Out[2]= 1. + 2. x

In[3]:= a x + b /. FindFit[{{0, 1}, {1, 3}, {2, 5}}, a x + b, {a, b}, {x}]
Out[3]= 1. + 2. x



Fitting an implicit parabola (function value = 0):
In[4]:= pointsOffParabola = {{-3, 7.5, 0}, {-2.75, 6, 0},
{-2.6, 8, 0}, {-2.3, 4, 0}, {-2.3, 6.5, 0}, {-1.8, 4.5, 0},
{-1.7, 2, 0}, {-1.3, 1.1, 0}, {-1.3, 2.8, 0}, {-1, 0.3, 0},
{-0.7, 1.1, 0}, {0, -1, 0}, {0, 1, 0}, {0.7, 1.1, 0},
{1, 0.3, 0}, {1.3, 1.1, 0}, {1.3, 2.8, 0}, {1.7, 2, 0},
{1.8, 4.5, 0}, {2.3, 4, 0}, {2.3, 6.5, 0}, {2.6, 8, 0},
{2.75, 6, 0}, {3, 7.5, 0}};

In[5]:= implicit = a x^2 + b y + c;

In[6]:= implicit /. ODRFit[pointsOffParabola, implicit, {a, b, c}, {x, y}]
During evaluation of In[6]:= ODR min = 3.56096*10^-22
Out[6]= 1.63375*10^-12 - 3.4106*10^-12 x^2 + 3.04125*10^-12 y

Without any constraint one gets essentially 0 + 0 x^2 + 0 y, just like FindFit:

In[7]:= implicit /. FindFit[pointsOffParabola, implicit, {a, b, c}, {x, y}]
Out[7]= 0. + 0. x^2 + 0. y

Now, if we add a constraint on a, b, c, we get something more plausible:
In[8]:= implicit /. ODRFit[pointsOffParabola, implicit, {a, b, c}, {x, y},
  {a^2 + b^2 + c^2 > 1}]
During evaluation of In[8]:= ODR min = 2.6778
Out[8]= -0.0997602 + 0.71637 x^2 - 0.690552 y

(roughly the parabola y = x^2, which I used to compute the data)
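As a quick sanity check (my own arithmetic, using the coefficients printed in Out[8]), one can solve the fitted implicit equation for y; since an implicit equation can be rescaled freely, this shows how close the result is to the generating parabola:

```mathematica
(* Divide out the y coefficient of the fitted implicit curve
   from Out[8] to put it in explicit form for comparison. *)
Solve[-0.0997602 + 0.71637 x^2 - 0.690552 y == 0, y]
(* approximately {{y -> -0.1445 + 1.0374 x^2}}, close to y = x^2 *)
```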

I'm aware this work is quite tentative, and any critique is welcome.

--
Valeri Astanoff