I’ve finished up the color images from the most recent PDS release. They can be found at
www.lyle.org/~markoff or in MMB (update:advanced update images:update images from:lyle.org). Because of changing conditions on Mars during this period, I had to adjust my processing to deal with a dominant hue overwhelming the scenes. Here’s a brief explanation of what changed in this release.
As a part of my processing, I have been applying the
CIECAM02 chromatic adaptation transform. This transform uses the white point of the source images to map the image colors to the display white point. For this release, I've used a new source white point for each rover, one which better describes the way our eyes would chromatically adapt to the scene if we were there to see it ourselves.
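For the curious, the core of such a transform is a von Kries-style scaling in a sharpened cone space. Below is a minimal sketch in Python using the published CAT02 matrix from CIECAM02. The full model also includes a degree-of-adaptation factor, which I'm omitting here for simplicity, and the white points in the example are hypothetical placeholders, not the actual values used for the rover images.

```python
import numpy as np

# CAT02 matrix from CIECAM02: maps XYZ to sharpened cone-like responses.
M_CAT02 = np.array([
    [ 0.7328,  0.4296, -0.1624],
    [-0.7036,  1.6975,  0.0061],
    [ 0.0030,  0.0136,  0.9834],
])

def adapt(xyz, src_white, dst_white):
    """Map XYZ colors seen under src_white to their appearance under
    dst_white (von Kries scaling, full adaptation assumed)."""
    rgb_src_w = M_CAT02 @ src_white      # cone responses of the source white
    rgb_dst_w = M_CAT02 @ dst_white      # cone responses of the display white
    scale = rgb_dst_w / rgb_src_w        # per-channel von Kries gains
    rgb = M_CAT02 @ xyz                  # image color in cone space
    return np.linalg.inv(M_CAT02) @ (rgb * scale)

# Illustrative only: a reddish "dusty sky" white point mapped to D65.
src_white = np.array([1.05, 1.00, 0.60])         # hypothetical Martian illuminant
d65 = np.array([0.95047, 1.00000, 1.08883])      # standard display white

# The source white itself lands exactly on the display white.
print(np.round(adapt(src_white, src_white, d65), 5))
```

By construction, any color that matched the source white point comes out as the display white, which is exactly why the choice of source white point matters so much for the final look of the images.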
The source white point I used for every previous release was based on the white and gray rings of a sundial shortly after landing. If complete fidelity to the perceptual model were the goal, the source white point would be different each day, even each time of day, but I chose to use a single white point for all images. The hope was that changes in illumination would remain discernible when comparing images taken under different conditions. If the chromatic adaptation were done perfectly for each individual image, it would be harder to tell a dusty day from a clear one, because the colors of the scene would be adjusted to remove such lighting effects. My hope was that by showing these changes, one could appreciate the images individually, with regard to the scene itself, and as a collection, as a chronicle of the changing conditions on Mars over the course of the mission.
Unfortunately, accomplishing both goals became difficult as dust levels rose. Around sol 350, Opportunity's images started becoming dominated by a red/yellow hue, presumably an indirect effect of the rising dust level on sky brightness and color. While it may be interesting to look at these images and say "wow, it was getting dusty at this time in the mission," appreciating the content of the images was becoming increasingly difficult, since they showed less and less hue variation.
This ‘reddening’ continued, for both rovers, through this release as well, so a reassessment of the source white point seemed in order. The calibration target is hard to trust for determining illumination this late into the mission; that would require knowing exactly how much dust has settled on its surface on a given sol. Instead, I used the content of the various scenes themselves to derive the source white point.
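As a rough illustration of the idea (not my exact procedure), the simplest scene-based illuminant estimate is the gray-world assumption: the average reflectance of a scene is taken to be achromatic, so the mean color of the pixels approximates the color of the illuminant. A sketch:

```python
import numpy as np

def gray_world_white(image):
    """Estimate an illuminant from scene content via the gray-world
    assumption: the mean pixel color approximates the illuminant.
    `image` is an (H, W, 3) array; the result is normalized so the
    middle channel equals 1."""
    mean = image.reshape(-1, 3).mean(axis=0)
    return mean / mean[1]

# Illustrative: a scene uniformly tinted red-orange yields a reddish estimate.
scene = np.full((4, 4, 3), [0.8, 0.5, 0.3])
print(gray_world_white(scene))  # → [1.6  1.   0.6]
```

The catch, of course, is that on Mars the scene average is red because the *surface* is red, not only because of the lighting, and that ambiguity is exactly what makes scene-derived white points a perceptual choice rather than a physical measurement.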
This method also addressed another issue that the original calibration-target-derived white point did not. The traditional treatment of chromatic adaptation fundamentally attempts to adjust images so that illumination effects are removed (white reflectors are always rendered white, no matter what the lighting). Recent models try more specifically to maintain human perception from the scene to the display environment. In reality, our brains have no idea what the illumination of a scene actually is, nor do they know which objects are white reflectors. The content of the scene itself determines how our brains chromatically adapt. This means that a scene dominated by a single hue will cause our brains to chromatically adapt even under perfect lighting. Mars is the case in point for this distinction. On Earth, landscape scenes often contain a wide variety of hues, which makes the lighting the most important determinant of how our brains adapt to the scene. On Mars, the color of the dust dominates every scene: indirectly, in that the dust-laden sky illuminates shadowed areas, but also directly, in that everywhere you could possibly look, you see the color of dust. So, by deriving the white point from the scenes themselves, I hope to better represent our actual perception of the scenes if we were there to see them ourselves.