QUOTE (mcaplinger @ May 20 2016, 06:31 PM)
Can be done for the engineering cameras, does not exist for Mastcam/MAHLI/MARDI because the amount of smear is vastly smaller with their interline sensors than for the frame transfer sensors of the engineering cameras.
That explains a lot. Thanks!
I'd note that, with a Bayer pattern, even a little smear can make good debayering challenging. Once red pixels start really saturating here and there, it can be a real pain to generate a good green channel.
Then again, my primary experience with that issue was trying to process the descent frames, and the 0-second exposures I wished for above wouldn't have worked for them anyway, due to the motion.
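To illustrate what I mean (a toy sketch of my own, not anything from the actual pipeline): one workaround is to simply exclude saturated samples when interpolating green. Assuming an RGGB mosaic and a 12-bit saturation level:

```python
import numpy as np

def green_channel(raw, sat=4095):
    """Bilinear green interpolation for an RGGB mosaic that skips
    saturated samples. Hypothetical sketch, not MMM's pipeline.
    raw: 2-D Bayer frame; sat: saturation DN (12-bit assumed)."""
    h, w = raw.shape
    green = np.zeros((h, w), float)
    weight = np.zeros((h, w), float)
    # Green sites in RGGB: (even row, odd col) and (odd row, even col)
    gmask = np.zeros((h, w), bool)
    gmask[0::2, 1::2] = True
    gmask[1::2, 0::2] = True
    valid = gmask & (raw < sat)       # drop saturated green samples
    green[valid] = raw[valid]
    weight[valid] = 1.0
    # average the unsaturated 4-neighbour green samples at red/blue sites
    out = green.copy()
    num = np.zeros_like(green)
    den = np.zeros_like(weight)
    for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        num += np.roll(green, (dy, dx), (0, 1))
        den += np.roll(weight, (dy, dx), (0, 1))
    fill = ~gmask & (den > 0)
    out[fill] = num[fill] / den[fill]
    return out  # sites with no valid neighbours stay 0 in this toy
```

Naive bilinear would happily average a blown-out 4095 into the green estimate at every red site next door; masking at least contains the damage, though real demosaicers have to do something smarter.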
That makes me think of another wishlist item, though for PDS, not for the dailies.
I don't fully understand what goes into producing the ground calibration files, but the flat IMGs' data doesn't take the same format as the EDRs', so at some point prior to publication there must have been a set of raw sensor data that was processed into the flat IMG. The data in the flat IMG looked like it had undergone some sort of desmear, but I couldn't find any information on how exactly the file was produced. I remember finding notes along the lines of 'we imaged a uniform sphere at such-and-such' or whatever, but nothing on the steps that turned raw sensor data into the published IMG.

I believed I had a way to perform a better desmear, but if the published flat IMG had already had a desmear applied to its source data, I couldn't really test my theory unless I managed to un-desmear the flat file, which I actually tried to do for a while before just giving up on the whole thing. It's more than a little possible I'd chased a school of red herrings far into troubled waters, but at any rate, for future missions it would be nice to have the raw data that was used to produce the flat IMGs, along with some explanation of the processing that produced them.
(I'm so needy, I know)

QUOTE (pgrindrod @ May 23 2016, 05:22 AM)
I should say that I was thinking about the daily release images - the PDS (or other archive) will have its own standards and deliverables, this idea was just related to the daily images in order to maximise what people might be able to do before archived data are available if they have the right information.
In that case, two things come to mind.
1. LBLs for the dailies would be very helpful. I could have sworn there was a time when I could see LBLs somewhere for the dailies. Looking into it a little, it seems I can for some cameras, just not the Malin cameras. Using the JSON API, I can still get timestamps, CAHV(OR(E)) data, and the vector and quaternion from the site frame to the rover nav frame for most of those, which is a lot better than nothing. If they threw solar elevation and azimuth in there, it would save me a lot of time. The JSON API started doing this thing a while back where it gives you 403s (Forbidden) if you go back more than a few dozen sols, which is unfortunate; it would be nice if it at least covered the range that hasn't made it into PDS yet, though frankly, I'm just thankful it exists at all.
2. PNGs (I can always dream). I guess what my dream list for dailies boils down to is... as close to what's going to be in PDS as they're willing to give us.
I guess a third thing that comes to mind would be the most recently measured optical depth, which would be helpful for certain applications. I guess publishing that with the dailies would probably rob some people of some papers, so I'll not hold my breath for that; I don't think manual respiration is possible while you're dreaming, anyway.
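For anyone wanting to actually use the vector and quaternion from item 1, here's a sketch of applying them, assuming the usual scalar-first quaternion convention of the PDS-style ORIGIN_ROTATION_QUATERNION / ORIGIN_OFFSET_VECTOR keywords (check the conventions in the actual labels before trusting signs):

```python
import numpy as np

def rotate(q, v):
    """Rotate vector v by unit quaternion q = (s, x, y, z), scalar
    first. Standard active (Hamilton) rotation: v' = v + 2u x (u x v + s v).
    Sketch only; verify the label's convention before trusting signs."""
    s, x, y, z = q
    u = np.array([x, y, z])
    v = np.asarray(v, float)
    return v + 2.0 * np.cross(u, np.cross(u, v) + s * v)

def rover_to_site(q, offset, p_rover):
    """Map a point from rover nav frame to site frame, given the
    rotation quaternion q and offset vector from the label (keyword
    names per the usual PDS convention; verify against real labels)."""
    return np.asarray(offset, float) + rotate(q, p_rover)
```

With that plus the CAHV(OR(E)) vectors, you can place a daily image's camera in the site frame, which is most of what you need for stereo and mosaicking before the archived products show up.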