QUOTE (propguy @ Jul 5 2016, 12:37 AM)
Kudos to Mike Caplinger and all of MSS for that approach movie.
Thank you. There hasn't been much discussion of how the movie was made. We took highly compressed RGB images once every 15 minutes for 17 days (every 30 minutes on day 1), from 12 June to 29 June, with a few multi-hour gaps. The decompressed and dark-subtracted images were processed through a pipeline I wrote in Python using the OpenCV toolkit, which finds the planet in each color band, subpixel-registers the colors to each other, rotates the image to north-up, attempts to mask out the planet, stretches the background harder so that the moons are visible, and then composites everything together. (No spacecraft attitude telemetry was used because we weren't sure when the C kernels would be available.)

Images where the planet was split across filter boundaries had to be fixed manually using a GUI I hacked together. Those frames were then handed off to my colleague Mike Ravine, who laboriously fixed all of the remaining stray light, noise pixels, color misregistration, etc., by hand. Those were handed off to JPL for production.
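For the curious, the find-the-planet and color-registration steps conceptually look something like the toy sketch below. This is not the actual pipeline code, just an illustration of the kind of OpenCV primitives involved; the function names and thresholds are made up for the example.

[code]
import cv2
import numpy as np

def find_planet_center(band):
    # Otsu-threshold a blurred band, then take the centroid of the
    # bright blob as the planet's position.
    blur = cv2.GaussianBlur(band, (5, 5), 0)
    _, mask = cv2.threshold(blur, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:
        return None  # no planet found in this band
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def register_band(band, reference):
    # Phase correlation gives a subpixel (dx, dy) shift of `band`
    # relative to `reference`; warp the band back by that amount.
    (dx, dy), _ = cv2.phaseCorrelate(np.float32(reference), np.float32(band))
    M = np.float32([[1, 0, -dx], [0, 1, -dy]])
    return cv2.warpAffine(band, M, (band.shape[1], band.shape[0]))

def composite_frame(red, green, blue):
    # Register green and blue onto red, then stack into an RGB frame.
    g = register_band(green, red)
    b = register_band(blue, red)
    return cv2.merge([red, g, b])
[/code]

The real pipeline of course also does the north-up rotation, planet masking, and background stretching, plus all the edge cases that had to be fixed by hand.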
Sorry about the lack of a raw data release. That decision was made above the pay grade of anybody at MSSS.