Full Version: Butterflow Algorithm - Motion Interpolation
Unmanned Spaceflight.com > EVA > Image Processing Techniques
scalbers
Greetings,

I just downloaded Dan Delany's Butterflow software package from the Himawari forum thread: https://github.com/dthpham/butterflow

I'm wondering a bit about how to use it, particularly what forms of input it accepts. A few questions:

1) Can it input a directory of image frames?

2) Can the associated Python code be called from other language interfaces, such as IDL and Fortran?

I'll take a look in the meantime at Dan's Javascript interface: https://github.com/dandelany/animate-earth/...ipeline/main.js

Thanks,
JohnVV
just now reading the code, starting with setup.py

The very first "help desk" questions are normally:
What is your operating system, and what version is it?
What exactly is the problem?

Do you have the prerequisites installed? It looks like OpenCV and ffmpeg are needed, plus Python tools like numpy.

In "butterflow-master/docs/", the "Install-From-Source-Guide.md" text file has the build instructions. At the END of the file is an option to run a TEST:
QUOTE
### Testing
You can run a suite of tests against Butterflow to ensure everything is set up
properly with `python setup.py test` or just `nosetests` or `nosetests2` if you
have [nose](https://nose.readthedocs.org/en/latest/) installed. Tests will fail
if you don't have OpenCL set up.


There is also "Example-Usage.md" in the same folder; the first thing it shows is the help option:
CODE
butterflow -h


But as to #1: it looks like an input video is needed. ffmpeg can convert a folder of images into an ogv or mpeg file.
Dan Delany
Hi! Just to clarify: Butterflow isn't my code. It's an open source tool someone else (dthpham) wrote; my only contributions so far have been bug reports. :) The scripts in my animate-earth repository you linked to just construct parameters for it and call it as one step of a pipeline. Anyway, to answer your questions:

#1 As JohnVV mentioned, it only takes video files - but it's easy to make a lossless video file from a folder of jpg's with ffmpeg:

CODE
ffmpeg -framerate 30 -pattern_type glob -i 'imgDir/*.jpg' -c:v libx264 -preset ultrafast -qp 0 -r 30 -pix_fmt yuv420p output.mp4


imgDir is your input path, and you can change the two 30s to the desired framerate of your video, though it doesn't matter too much since Butterflow can change the playback rate and framerate as well. Note that this command is optimized for speed, not file size, so files will be very big; you can replace 'ultrafast' with 'veryslow' to save space at the expense of encoding time.

As for #2, Butterflow is strictly a command-line tool that takes a video file and outputs an interpolated video file. So if your favorite scripting language can run a shell command, you can use it in that manner (as I am doing). It's not really a library/API with bindings to other languages, but there are a couple of possibilities if you want to explore this route. One is to fork Butterflow and modify it to do what you want; it's an open source git repository.
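To illustrate the shell-command route, here is a minimal Python sketch that builds a Butterflow command line and could run it with subprocess. The flag names (`-r` for playback rate, `-o` for output path) are assumptions from my reading of the docs and may differ by version; check `butterflow -h` before relying on them.

```python
import subprocess

def build_butterflow_cmd(input_path, output_path, fps=60):
    """Construct a Butterflow command line as a list of arguments.

    NOTE: the -r (playback rate) and -o (output path) flags are
    assumptions based on the project docs; verify with `butterflow -h`.
    """
    return ["butterflow", "-r", str(fps), "-o", output_path, input_path]

cmd = build_butterflow_cmd("input.mp4", "out.mp4", fps=60)
# Run the tool as a shell command, just as IDL's SPAWN or Fortran's
# EXECUTE_COMMAND_LINE would do:
# subprocess.run(cmd, check=True)  # uncomment to actually run Butterflow
```

The same pattern works from any language that can spawn a process, which is all the "integration" Butterflow really needs.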

Another is to go deeper and use the underlying algorithm in your own code instead. Butterflow is mostly a wrapper for OpenCV's implementation of the Farneback optical flow algorithm: it does all the annoying steps of decoding/encoding the video file, passing each pair of frames to the function with the right parameters, and so on. If you wanted to handle this yourself, you could write your own wrapper around that implementation, which is also open source. OpenCV is written in C++ and has bindings for Java, Python and MATLAB, with community support of varying quality for C#, Ruby, Perl, and JavaScript. No IDL or Fortran, though, I'm afraid.

For experimenting with Butterflow, I definitely recommend reading all the docs in that folder and following the "Install-From-Source-Guide" rather than using any kind of package manager. FFmpeg and OpenCL are the main external dependencies. I have had success installing it on newer Macs as well as Ubuntu 15.04.

And as for my scripts, there's a lot there you probably don't need. The main "interesting" thing I'm doing is accounting for missing images in the Himawari data. Butterflow can take a parameter listing segments that should be interpolated at different rates, so I look through each day's images, find the gaps, and construct a segment parameter that interpolates more frames during the gaps, according to how many frames are missing. I'm using a pretty rudimentary method which only works because my data is always timestamped neatly on the 10-minute marks, even after missing frames, so it's easy math to figure out how many frames are missing or expected. It wouldn't be too hard to generalize this to handle "messy" timed datasets, which may have a good average framerate but high variance in the timing between frames.
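The gap-counting idea can be sketched in a few lines of Python (this is my paraphrase of the approach, not Dan's actual script, and it assumes frames timestamped on a neat 10-minute cadence):

```python
from datetime import datetime, timedelta

def missing_frames(timestamps, step=timedelta(minutes=10)):
    """For each consecutive pair of timestamps, count how many expected
    frames fell in between.  Relies on timestamps landing neatly on the
    expected cadence, as the Himawari data does."""
    gaps = []
    for a, b in zip(timestamps, timestamps[1:]):
        expected = round((b - a) / step)  # whole intervals between the pair
        gaps.append(expected - 1)         # 0 when no frame is missing
    return gaps
```

Each nonzero entry marks a gap where a segment parameter could ask Butterflow for proportionally more interpolated frames.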

Anyway, hope that helps you and any others interested in giving it a try. Happy to answer more questions if you have them.
scalbers
As I sort of understand this, the "lossless" video file would still play back with the color sub-sampled, since the yuv420p pixel format stores chroma at reduced resolution. To play back a sequence of frames truly losslessly we might need a JavaScript HTML webpage set up for it? This may be a good topic for another thread.