JunoCam projection code aimed at mosaicing
ramana
Hello!
I wanted to share some code I've been working on to project and mosaic JunoCam images. It is written in Python and uses the SPICE library: https://github.com/ramanakumars/JunoCamProjection. I thought people on this forum might be interested in using it or comparing it with existing pipelines. The goal is to generate mosaics by stacking multiple images from a given perijove pass with minimal manual effort. Examples of both the projection and the mosaicing process are in the examples folder.
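
For anyone curious about the general idea of the SPICE-based projection step, here is a minimal sketch (not the repository's actual API) using spiceypy to intersect a camera-frame look vector with Jupiter's ellipsoid and read off planetocentric latitude and longitude. The metakernel path, the epoch, and the boresight look vector below are all placeholders; a real pipeline builds a look vector per pixel from the camera and distortion model in the instrument kernel.

CODE
import numpy as np
import spiceypy as spice

# Placeholder metakernel: replace with the actual kernels for the perijove of interest
spice.furnsh("juno_metakernel.tm")

et = spice.str2et("2020-04-10T12:00:00")  # placeholder mid-exposure time

# Unit look vector in the JunoCam frame (placeholder: the boresight)
look_vec = np.array([0.0, 0.0, 1.0])

# Intersect the look vector with Jupiter's reference ellipsoid
spoint, trgepc, srfvec = spice.sincpt(
    "Ellipsoid", "JUPITER", et, "IAU_JUPITER", "LT+S", "JUNO", "JUNO_JUNOCAM", look_vec
)

# Convert the surface intercept to planetocentric lat/lon (degrees)
radius, lon, lat = spice.reclat(spoint)
print(np.degrees(lon) % 360.0, np.degrees(lat))

spice.kclear()
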
Here is an example of a mosaic from PJ27 data:

[PJ27 mosaic image]
Ramana
Antdoghalo
That's not bad! The color is a little too heavy on the blue, though; I'm not sure whether it's the program or the raw images causing that.
mcaplinger
QUOTE (ramana @ Dec 15 2021, 12:08 PM) *
I wanted to share a code that I've been working on to project and mosaic JunoCam images.

Very nice, the code is very clean and readable.

It looks like you have a fair number of blending artifacts in the mosaic, which is only to be expected. I haven't examined your blending method in detail; it works very well at hiding gross seams, but I think there is some feature doubling at small scales. This is a tough problem, and I'm not sure what the best approach is that isn't very labor-intensive.
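
For reference, a common approach is distance-weighted ("feathered") averaging of the overlapping maps. This is only a generic sketch, not the code under discussion, but it illustrates why misregistered overlaps show up as small-scale feature doubling rather than hard seams.

CODE
import numpy as np
from scipy.ndimage import distance_transform_edt

def feather_blend(maps, masks):
    """Blend projected maps (same lat/lon grid) with distance-to-edge weights.

    maps  : list of 2D arrays, each already projected onto a common grid
    masks : list of boolean arrays marking valid pixels in each map
    """
    num = np.zeros_like(maps[0], dtype=float)
    den = np.zeros_like(maps[0], dtype=float)
    for img, mask in zip(maps, masks):
        # weight is smallest at the edge of each footprint, so seams fade out...
        w = distance_transform_edt(mask)
        num += w * np.where(mask, img, 0.0)
        den += w
    # ...but any misregistration between overlapping maps gets averaged in,
    # which is exactly the small-scale "feature doubling" effect
    return np.where(den > 0, num / den, np.nan)
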
ramana
Thank you!!

QUOTE
Though the color is a little too heavy on the blue, I am not sure if it is the program or the raw images causing that.

Yeah, that's an artifact of the color correction and histogram equalization. The code outputs a "raw" mosaic, which then goes through a color-correction function. It should be possible to get better-looking results by post-processing the raw mosaic directly rather than taking the output of the color-correction step. I use the color-correction function to de-haze the mosaic because I want uniform image representations across multiple perijoves, so my use case is less about aesthetics.
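
To illustrate the kind of step I mean (this is a generic sketch, not the repository's actual color-correction function), a per-channel histogram equalization like the one below stretches contrast and de-hazes, but it also tends to exaggerate a channel that starts out flat, which can push the balance toward blue. The function names are placeholders.

CODE
import numpy as np

def equalize_channel(channel, nbins=1024):
    """Histogram-equalize one float image channel with values scaled to 0..1."""
    valid = np.isfinite(channel)
    hist, bin_edges = np.histogram(channel[valid], bins=nbins, range=(0.0, 1.0))
    cdf = np.cumsum(hist).astype(float)
    cdf /= cdf[-1]  # normalize the CDF so the output also spans 0..1
    out = np.full_like(channel, np.nan)
    out[valid] = np.interp(channel[valid], bin_edges[:-1], cdf)
    return out

def equalize_rgb(mosaic):
    """Apply equalization independently to the R, G, B planes of an (ny, nx, 3) mosaic."""
    return np.dstack([equalize_channel(mosaic[..., c]) for c in range(3)])
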

QUOTE
Seems like you have a lot of blending artifacts in the mosaic, only to be expected.

Yes, and it's much more noticeable near the poles. I've been trying to figure out a way around this. I fixed a lot of the artifacts a while ago with a jitter correction (fitting the limb of Jupiter in each image to find the start-time offset), but that might not be accurate enough at polar latitudes. I also think part of it is the changing resolution with latitude... I don't have a clear fix for that right now, other than doing a lower-resolution reconstruction.
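
For readers who haven't seen the limb-fit trick: treat the framelet start time as a free parameter, predict where Jupiter's limb should fall for a candidate time offset, and pick the offset that best matches the limb detected in the image. Below is a heavily simplified, hypothetical sketch; `predicted_limb_pixels` and `detected_limb_pixels` are placeholders standing in for the SPICE-side limb prediction and the image-side edge detection.

CODE
import numpy as np
from scipy.optimize import minimize_scalar

def limb_residual(dt, framelet, nominal_et):
    """Mean distance between the observed limb and the limb predicted at nominal_et + dt."""
    predicted = predicted_limb_pixels(nominal_et + dt)   # (N, 2) pixel coords from SPICE geometry
    observed = detected_limb_pixels(framelet)            # (M, 2) pixel coords from an edge detector
    # for each predicted limb point, distance to the nearest observed limb point
    d = np.linalg.norm(predicted[:, None, :] - observed[None, :, :], axis=-1)
    return d.min(axis=1).mean()

def fit_time_offset(framelet, nominal_et, max_offset=0.5):
    """Search for the start-time offset (seconds) that best aligns the predicted limb."""
    result = minimize_scalar(limb_residual, bounds=(-max_offset, max_offset),
                             args=(framelet, nominal_et), method="bounded")
    return result.x
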
mcaplinger
QUOTE (ramana @ Dec 16 2021, 12:06 PM) *
I fixed a lot of the artifacts a while ago by doing jitter correction (by fitting the limb of Jupiter in each image to find the start time offset). That might not be accurate enough for polar latitudes. I also think it's due to the changing resolution with latitude... I don't have a clear fix for that right now, except for doing lower resolution reconstruction.

It's always tough to debug these sorts of things. You might compare what your code produces for a single framelet against what ISIS produces, if you can invest the time to install and understand ISIS. I would expect that if you are using reconstructed SPKs and all the correct, up-to-date kernel files (spacecraft clock and leapsecond kernels are very important for C-kernel use), the residual errors would be maybe 2-3 pixels?

As far as I could tell from a quick skim, you weren't applying the 61.88 msec start time bias described in https://naif.jpl.nasa.gov/pub/naif/JUNO/ker..._junocam_v03.ti, but your limb offset should fix that if it's done correctly.
Bjorn Jonsson
At least in my experience, 2-3 pixels is typical for residual errors.

This looks very promising. Do I understand correctly from the code/examples that in the PJ27 mosaic above you are using FFT to do the illumination correction?
ramana
QUOTE
You might compare what your code produces for a single framelet against what ISIS produces

That's a good point! I have worked with the ISIS package (and pysis) before, but dropped it because keeping the kernels updated got a little cumbersome. I'll check what the differences are.

QUOTE
As far as I could tell in a quick skim, you weren't applying the 61.88 msec start time bias

I do add this when calculating the spacecraft ET for each framelet; see, for example, https://github.com/ramanakumars/JunoCamProj...ojector.py#L254. The time_bias of 61.88 ms is pulled from the SPICE kernel pool.
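
For context, the relevant quantities live in the JunoCam instrument kernel and can be read from the kernel pool. The sketch below shows roughly how a per-framelet ET could be computed; the `framelet_et` helper, the kernel-pool variable names, and the exact formula follow my reading of the IK rather than the pipeline's actual code, so treat them as assumptions.

CODE
import spiceypy as spice

def framelet_et(start_utc, frame_delay, frame_number):
    """Approximate start ET for one framelet.

    start_utc    : START_TIME from the image label
    frame_delay  : commanded interframe delay in seconds (from the label)
    frame_number : 0-based framelet index
    Assumes the IK variables below exist in the loaded kernel pool (juno_junocam_v03.ti).
    """
    # 61.88 ms start-time bias and the small extra interframe delta, from the IK (assumed names)
    time_bias = spice.gdpool("INS-61500_START_TIME_BIAS", 0, 1)[0]
    iframe_delta = spice.gdpool("INS-61500_INTERFRAME_DELTA", 0, 1)[0]

    start_et = spice.str2et(start_utc)
    return start_et + time_bias + frame_number * (frame_delay + iframe_delta)
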

QUOTE
At least in my experience, 2-3 pixels is typical for residual errors.

This is in the original framelet projection, right? Is this with jitter correction?

QUOTE
Do I understand correctly from the code/examples that in the PJ27 mosaic above you are using FFT to do the illumination correction?

Yup, that's correct.
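
For anyone curious what an FFT-based illumination correction looks like in outline, here is a generic high-pass sketch (not necessarily what the repository does): the large-scale illumination gradient lives at low spatial frequencies, so suppressing those while keeping the DC term flattens the lighting without changing the overall brightness.

CODE
import numpy as np

def fft_illumination_correction(img, cutoff=0.01):
    """Remove large-scale illumination gradients by attenuating low spatial frequencies.

    img    : 2D float array (one channel of the mosaic), NaNs filled beforehand
    cutoff : frequency (cycles/pixel) below which the spectrum is treated as illumination
    """
    F = np.fft.fft2(img)
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    radius = np.hypot(fx, fy)

    # Gaussian high-pass: keep the DC term (overall brightness) but suppress
    # the slowly varying illumination component
    highpass = 1.0 - np.exp(-(radius / cutoff) ** 2)
    highpass[0, 0] = 1.0
    return np.fft.ifft2(F * highpass).real
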
This is a "lo-fi" version of our main content. To view the full version with more information, formatting and images, please click here.
Invision Power Board © 2001-2024 Invision Power Services, Inc.