Undistorting Hazcam images?
lupine
Hi,

I'm new here, but in my lurking I've seen a lot of great stuff, and it looks like this might be the right place to ask this question.

I'm using Gennery's CAHVORE method in some computer vision research with a wide-angle lens. Performing the calibration and using the parameters in my work is not a problem, but I'd like to perform an undistortion of the images. I've found some vague details while looking up the Hazcam info, but there aren't enough concrete details there to work it out.

I was wondering whether anybody here knows how to linearize/undistort Hazcam imagery.

Any help would be greatly appreciated.
algorimancer
I have some experience using the CAHVOR (not CAHVORE) model to convert pixels to real-space vectors (and the reverse), which requires both the CAHVOR model and its inverse. The inverse for the CAHVORE model is a bit tricky to implement, as I recall, and I decided (in the context of developing my rangefinder/photogrammetry tool http://www.clarkandersen.com/RangeFinder.htm) that it was more trouble than it was worth, as the hazcams don't show much of interest. If I recall correctly, linearizing would be a matter of converting pixels to vectors, then projecting the vectors onto a plane orthogonal to the camera sight-vector. I've messed about with projecting pixels onto the ground plane, but I'm sure there must be a better technique than the one I used. Let me know if I can be of assistance.
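To make that projection idea a bit more concrete, here is a minimal sketch in Python/NumPy, assuming you already have a unit ray for each pixel and an orthonormal camera frame. The variable names, focal length and image centre below are my own placeholders, not the actual CAHVOR parameters; you'd take the equivalents from whatever linearized camera you choose:

import numpy as np

def linearized_pixel(ray, axis, h_dir, v_dir, focal, centre):
    # ray:    unit direction vector for the pixel
    # axis:   unit camera sight-vector
    # h_dir, v_dir: unit image-plane directions, orthogonal to axis
    # focal, centre: focal length (pixels) and principal point (hypothetical)
    depth = np.dot(ray, axis)          # component along the sight-vector
    if depth <= 0:
        return None                    # ray points behind the image plane
    x = centre[0] + focal * np.dot(ray, h_dir) / depth
    y = centre[1] + focal * np.dot(ray, v_dir) / depth
    return x, y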
djellison
It doesn't help you much in terms of the workflow, but as a reference point, all the images that get dumped onto the PDS are also provided in an undistorted form...

http://anserver1.eprsl.wustl.edu/navigator...FF69GJP1275L0M1

for example.


Doug
algorimancer
Perhaps this will be helpful.

http://www-mipl.jpl.nasa.gov/vicar/vicar29...p/marscahv.html
Ant103
You can use the IRIS software. At its core it's a powerful piece of software for astronomical image processing, but it has a number of options, including, in our case, rectilinearization of fish-eye-like images.
http://www.astrosurf.com/buil/us/iris/iris.htm
And the examples:
http://www.astrosurf.com/buil/iris/new530/new530_us.htm

There is also a GIMP plug-in for doing it. I think it's part of the Panorama Tools.
lupine
Thanks for the replies so far.

algorimancer: Yes, that's about the most useful document I've found so far too. I think I'm starting to understand the geometry of the system, but I'll need to have a play with it later to see if I really understand it. I've seen hints in various places that the actual distortion correction performed is only an approximation designed for 3D vision, not a "rigorous" method like the one I'm after.

It took me the better part of several months to implement and verify the calibration method in Matlab, so I imagine it would be a pretty significant undertaking in a more "traditional" language like C++ (I've thought about porting my code, but there doesn't seem to be any real reason to). The reference text for the least-squares algorithm (Mikhail's "Observations and Least Squares") is pretty complex too, and it doesn't match up too well with the CAHVORE paper (there are a few additions that didn't make sense at first).

Ant103: Thanks for the link. I'd discovered this a while back through these forums too, but I had to discard it pretty quickly, since I work on a Mac and would prefer not to need external tools to convert our images.

I had been thinking that I should just continue the projection of each pixel's incoming ray through to the image plane, which should give the desired image results ("straight lines must be straight"). I guess I can try both these methods out, and compare them to an existing undistortion algorithm that works on an entirely different principle.

I'm starting with some actual camera calibration this week (after many simulations), and I'll be sure to report back here if I discover anything that might be of interest. I'm also happy to talk about the implementation of the calibration method if anybody is interested.

Cheers!
lupine
I thought I'd come back and give a quick update as to what I found.

I can now successfully undistort HazCam images using a slight variation on the methods given by Gennery in his paper, plus the help from the MARSCAHV program. At the moment it projects points to and from a planar surface, rather than the ovoid shape that's mentioned. I had the most difficulty trying to compute theta (the angle from the shifted optical centre c' to the actual point in 3D space) -- in the end, I decided to discard this and use the initial approximation for theta, since the Newton's method solution doesn't seem to converge for all points (I haven't checked yet, but I suspect it's points towards the edges of the image that are causing the most problems).
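For reference, the fallback I mean is nothing fancier than a bounded Newton loop that gives up and returns the initial approximation. The residual and derivative below are placeholders for the actual theta equation in Gennery's paper, which I won't reproduce here:

import math

def solve_theta(theta0, residual, derivative, tol=1e-10, max_iter=20):
    # residual(theta) and derivative(theta) stand in for the real function
    # and its derivative from the CAHVORE paper (placeholders).
    theta = theta0
    for _ in range(max_iter):
        f = residual(theta)
        if abs(f) < tol:
            return theta               # converged
        df = derivative(theta)
        if df == 0 or not math.isfinite(df):
            break                      # Newton step undefined; give up
        theta -= f / df
    return theta0                      # fall back to the initial approximation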

You can see the results of this in the image below:


The implementation is reasonably simple (and surprisingly quick for 1024x1024 pixel destination images; a rough sketch of the loop follows the list):
  • For each pixel in your corrected image, calculate the vector r from section 6.2
  • Then, compute the corresponding pixel coordinate in the source image using the calculations for x_hat and y_hat in sections 5 and 3.
  • You can then bilinearly interpolate to get the pixel value in the new image.
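In case it saves someone some time, here is a stripped-down sketch of that loop in Python/NumPy (the names are mine). The ray construction for a linear CAHV output camera and the bilinear interpolation are written out; cahvore_project is a placeholder for the x_hat/y_hat calculations of sections 5 and 3, which I won't reproduce here:

import numpy as np

def pixel_to_ray(x, y, A, H, V):
    # Ray direction for pixel (x, y) of the linearized CAHV output camera.
    # From x = r.H / r.A and y = r.V / r.A, the direction is orthogonal to
    # both (H - x*A) and (V - y*A).
    r = np.cross(H - x * A, V - y * A)
    if np.dot(r, A) < 0:
        r = -r                          # keep the ray in front of the camera
    return r / np.linalg.norm(r)

def bilinear(img, x, y):
    # Bilinearly interpolate the source image at fractional position (x, y).
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    if x0 < 0 or y0 < 0 or x0 + 1 >= img.shape[1] or y0 + 1 >= img.shape[0]:
        return 0.0
    fx, fy = x - x0, y - y0
    top = (1 - fx) * img[y0, x0] + fx * img[y0, x0 + 1]
    bot = (1 - fx) * img[y0 + 1, x0] + fx * img[y0 + 1, x0 + 1]
    return (1 - fy) * top + fy * bot

def undistort(src, out_shape, A, H, V, cahvore_project):
    # A, H, V: vectors of the chosen linearized output camera.
    # cahvore_project: maps a ray direction to (x_hat, y_hat) in the source
    # image -- a placeholder for sections 5 and 3 of Gennery's paper.
    out = np.zeros(out_shape)
    for yi in range(out_shape[0]):
        for xi in range(out_shape[1]):
            r = pixel_to_ray(xi, yi, A, H, V)   # section 6.2
            xs, ys = cahvore_project(r)         # sections 5 and 3
            out[yi, xi] = bilinear(src, xs, ys)
    return out

Vectorising the two loops with NumPy is straightforward, but the plain version above shows the structure more clearly.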
For me, these results are good enough -- I'm using them primarily to compare with another calibration method. Hopefully this will be of some use to others who are interested in using the HazCam images -- apparently this sort of linearization is necessary for the stereo depth computations...
Stu
Well done lupine, your hard work and dedication sure look to have paid off :)