From: Yngve Munck-Lindblom on
I have a 3D magnetic resonance image (192x256x256 voxels) for which I know the coordinates of each voxel in some reference coordinate system (RCS). I also have a smaller 3D image (10x10x30 voxels) whose voxel coordinates are known in the same RCS. The two images may have an arbitrary orientation with respect to each other, and both are rotated with respect to the RCS axes.

What I want to do is reslice the large image along the directions of the small image, i.e. find what the large image looks like within the volume of the small image, expressed in the small image's coordinate system.

I have been playing around with interpolation using interp3, but I'm not sure this is a good approach. In particular, I'm afraid it is far too slow (I know a few faster routines are available on the File Exchange). General advice here is appreciated.

But a specific problem: when the image frame coordinates are projected onto the RCS axes, they lose the perfect linearity they would normally have from linspace, so interp3 complains that my coordinates were not produced by meshgrid. They are very close to linear (only differences in the last three decimals of the double-precision numbers); it is yet another numerical issue.

How do I resolve this issue (truncation, re-linearizing, etc.)?
From: Matt J on
"Yngve Munck-Lindblom" <yngvechr(a)fys.ku.dk> wrote in message <hvkqa1$n96$1(a)fred.mathworks.com>...

> But a specific problem: when the image frame coordinates are projected onto the RCS axes, they lose the perfect linearity they would normally have from linspace, so interp3 complains that my coordinates were not produced by meshgrid. They are very close to linear (only differences in the last three decimals of the double-precision numbers); it is yet another numerical issue.
===================

Sounds like you are passing your coordinates to the wrong function arguments. When you call interp3 as follows

interp3(X,Y,Z,V, XI,YI,ZI)

it is X,Y,Z that must be plaid/monotonic, as if produced by meshgrid. However, your projected coordinates should be passed to XI,YI,ZI. There is no such requirement on them.
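
For example, a rough sketch of such a call (the names xg, yg, zg, Xq, Yq, Zq below are only placeholders of mine, not taken from your post) might be:

% xg, yg, zg (placeholder names): the regular grid on which the large volume V
% is defined -- this is the part that must be plaid, as if produced by meshgrid
[X,Y,Z] = meshgrid(xg, yg, zg);

% Xq, Yq, Zq (placeholder names): the small image's rotated sample locations
% expressed in the large image's frame -- these need not be plaid or monotonic
Vq = interp3(X, Y, Z, V, Xq, Yq, Zq, 'linear');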
From: Roger Stafford on
"Yngve Munck-Lindblom" <yngvechr(a)fys.ku.dk> wrote in message <hvkqa1$n96$1(a)fred.mathworks.com>...
> I have a 3D magnetic resonance image (192x256x256 voxels) for which I know the coordinates of each voxel in some reference coordinate system (RCS). I also have a smaller 3D image (10x10x30 voxels) whose voxel coordinates are known in the same RCS. The two images may have an arbitrary orientation with respect to each other, and both are rotated with respect to the RCS axes.
>
> What I want to do is reslice the large image along the directions of the small image, i.e. find what the large image looks like within the volume of the small image, expressed in the small image's coordinate system.
>
> I have been playing around with interpolation using interp3, but I'm not sure this is a good approach. In particular, I'm afraid it is far too slow (I know a few faster routines are available on the File Exchange). General advice here is appreciated.
>
> But a specific problem: when the image frame coordinates are projected onto the RCS axes, they lose the perfect linearity they would normally have from linspace, so interp3 complains that my coordinates were not produced by meshgrid. They are very close to linear (only differences in the last three decimals of the double-precision numbers); it is yet another numerical issue.
>
> How do I resolve this issue (truncation, re-linearizing, etc.)?
- - - - - - - - - -
This sounds very much like a problem for the MATLAB Statistics Toolbox 'procrustes' function. This has been discussed in many threads on this newsgroup in the past. For example, at:

http://www.mathworks.com/matlabcentral/newsreader/view_thread/169096

Here is one sentence from MathWorks' description of this function: "d = procrustes(X,Y) determines a linear transformation (translation, reflection, orthogonal rotation, and scaling) of the points in matrix Y to best conform them to the points in matrix X."

Since both your images are mapped onto a common coordinate system, you appear to have an approximate correspondence between the 10x10x30 points of the small image and a like number of points in the large image, which is what procrustes needs in order to determine the transformation you want.

There may be a need here for interp3 in making this correspondence more accurate. That is, you may want to use interpolation to find a set of hypothetical 10x10x30 points in the large image that best correspond to those in the small image.
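
As a rough sketch of the call (Xbig and Ysmall are placeholder names for N-by-3 lists of corresponding RCS coordinates, not anything you have defined):

% Xbig, Ysmall (placeholder names):
%   Xbig   -- N-by-3 RCS coordinates of the matching points in the large image
%   Ysmall -- N-by-3 RCS coordinates of the small image's voxel centers
[d, Zfit, tform] = procrustes(Xbig, Ysmall);

% tform.T (rotation/reflection), tform.b (scale) and tform.c (translation)
% give the fitted map:  Zfit = tform.b * Ysmall * tform.T + tform.c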

Roger Stafford
From: Yngve Munck-Lindblom on
"Matt J " <mattjacREMOVE(a)THISieee.spam> wrote in message <hvleod$cvc$1(a)fred.mathworks.com>...
> "Yngve Munck-Lindblom" <yngvechr(a)fys.ku.dk> wrote in message <hvkqa1$n96$1(a)fred.mathworks.com>...
>
> > But a specific problem: when the image frame coordinates are projected onto the RCS axes, they lose the perfect linearity they would normally have from linspace, so interp3 complains that my coordinates were not produced by meshgrid. They are very close to linear (only differences in the last three decimals of the double-precision numbers); it is yet another numerical issue.
> ===================
>
> Sounds like you are passing your coordinates to the wrong function arguments. When you call interp3 as follows
>
> interp3(X,Y,Z,V, XI,YI,ZI)
>
> it is X,Y,Z that must be plaid/monotonic, as if produced by meshgrid. However, your projected coordinates should be passed to XI,YI,ZI. There is no such requirement on them.

I think you're right. The projected coordinates were just so close to being plaid/monotonic that it seemed easier to make that small correction. Now I'm going to solve the problem from the coordinate system of the large image instead, which implies a change of basis for the vectors describing the small image.
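
Roughly, the change of basis I have in mind looks like this (all the names below are just made up for illustration; the columns of Rsmall and Rbig are assumed to be the two image frames' axis directions expressed in the RCS, and osmall and obig their origins):

% P, Rsmall, osmall, Rbig, obig are placeholder names;
% P is an M-by-3 list of sample points given in the small image's own frame
Prcs = bsxfun(@plus, P * Rsmall', osmall);   % small-image frame -> RCS
Pbig = bsxfun(@minus, Prcs, obig) * Rbig;    % RCS -> large-image frame

% the columns of Pbig can then be passed to interp3 as the XI, YI, ZI query points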
From: Yngve Munck-Lindblom on
"Roger Stafford" <ellieandrogerxyzzy(a)mindspring.com.invalid> wrote in message <hvlhm6$ber$1(a)fred.mathworks.com>...
> "Yngve Munck-Lindblom" <yngvechr(a)fys.ku.dk> wrote in message <hvkqa1$n96$1(a)fred.mathworks.com>...
> > I have a 3D magnetic resonance image (192x256x256 voxels) for which I know the coordinates of each voxel in some reference coordinate system (RCS). I also have a smaller 3D image (10x10x30 voxels) whose voxel coordinates are known in the same RCS. The two images may have an arbitrary orientation with respect to each other, and both are rotated with respect to the RCS axes.
> >
> > What I want to do is reslice the large image along the directions of the small image, i.e. find what the large image looks like within the volume of the small image, expressed in the small image's coordinate system.
> >
> > I have been playing around with interpolation using interp3, but I'm not sure this is a good approach. In particular, I'm afraid it is far too slow (I know a few faster routines are available on the File Exchange). General advice here is appreciated.
> >
> > But a specific problem: when the image frame coordinates are projected onto the RCS axes, they lose the perfect linearity they would normally have from linspace, so interp3 complains that my coordinates were not produced by meshgrid. They are very close to linear (only differences in the last three decimals of the double-precision numbers); it is yet another numerical issue.
> >
> > How do I resolve this issue (truncation, re-linearizing, etc.)?
> - - - - - - - - - -
> This sounds very much like a problem for the MATLAB Statistics Toolbox 'procrustes' function. This has been discussed in many threads on this newsgroup in the past. For example, at:
>
> http://www.mathworks.com/matlabcentral/newsreader/view_thread/169096
>
> Here is one sentence from MathWorks' description of this function: "d = procrustes(X,Y) determines a linear transformation (translation, reflection, orthogonal rotation, and scaling) of the points in matrix Y to best conform them to the points in matrix X."
>
> Since both your images are mapped onto a common coordinate system, you appear to have an approximate correspondence between the 10x10x30 points of the small image and a like number of points in the large image, which is what procrustes needs in order to determine the transformation you want.
>
> There may be a need here for interp3 in making this correspondence more accurate. That is, you may want to use interpolation to find a set of hypothetical 10x10x30 points in the large image that best correspond to those in the small image.
>
> Roger Stafford

This is a very interesting function. However, I can see that my problem description was incomplete in some respects. My tasks are related to MRI and MRS (magnetic resonance spectroscopy), where the problem of aligning/coregistering images with each other is quite common. One good tool for this is the SPM toolbox.

The bad news for me is that SPM probably can't do everything I need, so I'm using a somewhat odd combination of MATLAB and SPM to do the job. SPM has found the correct relative position of the images; I only need to cut out slices at the right angles. You might wonder why I don't coregister with the small image as the reference for the final directions, but that has to do with other issues related to the way I receive data from the MR scanner.

Anyway, procrustes might be handy another time.