From: Tim on 15 Mar 2010 15:23

Ok, here is the situation. I am trying to perfectly match up some aerial photographs with each other and with Google Earth, so that a small group of pixels can be compared day-to-day and month-to-month. The problem is that the airplanes don't fly in perfectly straight lines or at a perfectly stable elevation, so the pictures can be distorted. The photos also contain hyperspectral information (which is why I need MATLAB), so I would like each pixel to stay intact. My theory is to divide each image into columns and then shift each column to align with the other columns. The problem is that I've never done any image work in MATLAB before. I'm not really looking for you to write code for me (although that would be great if you did), but more for theoretical "how would you go about this sort of problem?" input.

I posted some pictures so you can get a better idea of what I am talking about. Please help.

Google Earth (reference for alignment):
http://i793.photobucket.com/albums/yy215/hursty_4/1.jpg?t=1268679743

My aerial photograph (distorted):
http://i793.photobucket.com/albums/yy215/hursty_4/2.jpg?t=1268679741

My theory to match them up (dividing into columns, 1 to a few pixels wide, then shifting each column to match up):
http://i793.photobucket.com/albums/yy215/hursty_4/3.jpg?t=1268680711
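[A minimal, untested sketch of the strip-and-shift idea Tim describes, assuming 'ref' (the Google Earth reference) and 'img' (the aerial photo) are already loaded as same-sized grayscale arrays, and assuming the Signal Processing Toolbox for xcorr; every variable name here is made up for illustration:

colWidth = 4;                          % strip width in pixels (assumed)
nStrips  = floor(size(img,2) / colWidth);
shifts   = zeros(1, nStrips);          % estimated vertical shift per strip
aligned  = img;

for k = 1:nStrips
    c1 = (k-1)*colWidth + 1;
    c2 = k*colWidth;

    % Compare the vertical brightness profiles of the two strips
    refProf = mean(double(ref(:, c1:c2)), 2);
    imgProf = mean(double(img(:, c1:c2)), 2);
    refProf = refProf - mean(refProf);
    imgProf = imgProf - mean(imgProf);

    % Peak of the cross-correlation gives an integer vertical offset
    [xc, lags] = xcorr(refProf, imgProf);
    [~, idx]   = max(xc);
    shifts(k)  = lags(idx);            % sign convention may need checking

    % Integer circshift moves whole pixels, so nothing gets blended
    aligned(:, c1:c2) = circshift(img(:, c1:c2), [shifts(k) 0]);
end

Whether a per-strip vertical shift is enough depends on how the aircraft actually moved; this will not correct rotation or scale changes within a strip.]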
From: Ashish Uthama on 15 Mar 2010 15:39

These might help:
http://www.mathworks.com/products/image/demos.html (scroll down to 'image registration')

It's not clear to me what you mean by 'pixel to stay intact'; maybe 'nearest' neighbor interpolation is what you are looking for?
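[The registration demos boil down to picking matching landmarks in both images and fitting a geometric transform. A rough sketch of that route using the cp2tform/imtransform functions current at the time, with nearest-neighbor interpolation so whole pixels are copied rather than blended; the file names and the choice of a projective transform are assumptions:

base  = imread('google_earth_ref.jpg');   % hypothetical file names
unreg = imread('aerial_photo.jpg');

% Pick matching landmarks (road intersections, field corners, ...) by hand
[input_points, base_points] = cpselect(unreg, base, 'Wait', true);

% Fit a geometric transform to the point pairs
t = cp2tform(input_points, base_points, 'projective');

% 'nearest' copies whole pixels instead of interpolating between them
registered = imtransform(unreg, t, 'nearest', ...
                         'XData', [1 size(base,2)], ...
                         'YData', [1 size(base,1)], ...
                         'Size',  [size(base,1) size(base,2)]);]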
From: Tim on 15 Mar 2010 15:50

> It's not clear to me what you mean by 'pixel to stay intact'; maybe
> 'nearest' neighbor interpolation is what you are looking for?

Just realized that I had a spelling error in the title... *alignment. Anyway, what I meant by having each pixel stay intact was that rotating the image, or doing anything that blends two pixels together, would mess up my data, because each pixel also has a spectrum associated with it. So if I did some image adjustment that severely altered a pixel's orientation or anything like that, the corresponding spectrum wouldn't work out so well. That is why I thought I should stick to just translating pixels in one direction at a time.
From: ImageAnalyst on 15 Mar 2010 16:03

Tim:
Each spectral measurement is just another monochrome image to be warped. You may have slight changes to the spectrum at non-integer pixel locations, but I doubt they'd be severe; I think they'd be slight. Plus, how would you know anyway? You don't have data for that location, so the warping/interpolation is as good a guess as any.
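[In code, that just means estimating the transform once and then applying it to every band of the cube. A sketch, assuming a rows-by-columns-by-bands array 'cube' and a transform 't' from a registration step like the one above (both names are assumptions):

[nRows, nCols, nBands] = size(cube);
regCube = zeros(nRows, nCols, nBands, class(cube));

for b = 1:nBands
    % Same transform and same nearest-neighbor resampling for every band,
    % so a pixel's whole spectrum moves together
    regCube(:,:,b) = imtransform(cube(:,:,b), t, 'nearest', ...
                                 'XData', [1 nCols], 'YData', [1 nRows], ...
                                 'Size',  [nRows nCols]);
end]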
From: Tim on 15 Mar 2010 17:36
So do you think that my approach sounds like it will work? I know that each wavelength would make its own monochrome image too, but I was unsure of how to shift the hyperspectral bands at the same time as the visible-spectrum picture without using MATLAB. I was just trying to anticipate someone responding with "use Photoshop, not MATLAB" or something along those lines.
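[If the strip-and-shift route from the first sketch is used instead of a fitted transform, the same point applies: the integer shifts estimated from the visible image can be applied to the whole cube in one pass, so the picture and the spectra never get out of step. Assuming the 'shifts' and 'colWidth' values from the earlier sketch and the same hypothetical 'cube' array:

alignedCube = cube;
for k = 1:numel(shifts)
    c1 = (k-1)*colWidth + 1;
    c2 = k*colWidth;
    % Shift all bands of the strip together; each pixel keeps its spectrum
    alignedCube(:, c1:c2, :) = circshift(cube(:, c1:c2, :), [shifts(k) 0 0]);
end]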