This stuff is still in a rather early stage of development (or should I say "research"), but might already be useful in some use cases.
You should be able to get the latest version like this:
git clone git://github.com/guijemont/GstStabilizer.git
This software is distributed under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
A copy of the GNU Lesser General Public License and of the GNU General Public License (to which it refers) can be found respectively in the files COPYING.LESSER and COPYING.
- Python (only tested with 2.7) http://python.org/
- GStreamer 0.10 (only tested with rather recent versions; whatever a recent distro ships should do) http://gstreamer.freedesktop.org/
- gst-python (same as above) http://gstreamer.freedesktop.org/
- OpenCV >= 2.1.0, compiled with the "new style" Python bindings (the cv2 module) http://opencv.willowgarage.com/
For the elements to be recognized, you need to include in your GStreamer
plugin path (e.g. via the GST_PLUGIN_PATH environment variable) the directory
where you checked out GstStabilizer (that is, the directory that contains the
python directory, not the python directory itself).
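For example (a sketch, assuming a checkout under ~/src — adjust the path to wherever you actually cloned GstStabilizer):

```shell
# Add the checkout directory (the one containing the python/ subdirectory)
# to GStreamer's plugin search path:
export GST_PLUGIN_PATH="$HOME/src/GstStabilizer:$GST_PLUGIN_PATH"
```

After that, `gst-inspect-0.10 opticalflowfinder` should be able to find the element.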
gst-launch filesrc location=<my_shaky_video> ! decodebin ! tee name=tee \
    tee. ! ffmpegcolorspace ! opticalflowfinder ! opticalflowrevert name=mux \
    tee. ! ffmpegcolorspace ! mux. \
    mux. ! ffmpegcolorspace ! xvimagesink
Note that depending on the video and the options you give to
opticalflowfinder, live stabilisation might not always be doable. If it's
too laggy, you probably want to encode and save the stream instead of sending
it to a visualisation sink.
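For instance, a pipeline along these lines would write the stabilised stream to an Ogg/Theora file instead of displaying it (theoraenc, oggmux and filesink are standard GStreamer 0.10 elements; the file names are placeholders):

```shell
gst-launch filesrc location=<my_shaky_video> ! decodebin ! tee name=tee \
    tee. ! ffmpegcolorspace ! opticalflowfinder ! opticalflowrevert name=mux \
    tee. ! ffmpegcolorspace ! mux. \
    mux. ! ffmpegcolorspace ! theoraenc ! oggmux ! filesink location=<output_file>
```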
You want to have a look at the myriad of options that can be set on the
opticalflowfinder element.
The most important of them is the algorithm; the two currently implemented are:
- The faster one, good for typical video streams where there is little change from one frame to the next.
- Slower, but can handle big changes from one frame to the next. Very useful for time lapses taken from a moving camera (specially developed for a time lapse from a tethered helium balloon, see http://balloonfreaks.mooo.com/).
- Only works if the original stream always points towards the same area of interest, i.e. it does not support voluntary camera movements: they are all recognized as "shake" that should be compensated.
- Slower than it could be.
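To give an intuition of what the opticalflowfinder/opticalflowrevert pair computes, here is a plain-Python sketch: if the motion between consecutive frames is reduced to a 2-D translation of tracked points, stabilisation amounts to accumulating the inverse of each frame-to-frame displacement. All the names and the point-set representation of frames below are illustrative, not GstStabilizer's actual API.

```python
# Hypothetical sketch: frames are lists of tracked (x, y) points; the real
# elements work on video buffers and use OpenCV's optical flow instead.

def mean_translation(points_prev, points_cur):
    """Average displacement of tracked points between two frames."""
    n = len(points_prev)
    dx = sum(c[0] - p[0] for p, c in zip(points_prev, points_cur)) / n
    dy = sum(c[1] - p[1] for p, c in zip(points_prev, points_cur)) / n
    return dx, dy

def stabilise(frames):
    """Shift every frame by the accumulated inverse motion, so tracked
    points stay where they were in the first frame."""
    acc_x = acc_y = 0.0
    out = [frames[0]]
    for prev, cur in zip(frames, frames[1:]):
        dx, dy = mean_translation(prev, cur)
        acc_x += dx
        acc_y += dy
        out.append([(x - acc_x, y - acc_y) for x, y in cur])
    return out

frames = [[(10, 10), (20, 20)],
          [(12, 11), (22, 21)],   # camera shook by (+2, +1)
          [(11, 13), (21, 23)]]   # then by (-1, +2)
# Every frame maps back onto the first frame's point positions:
print(stabilise(frames))
```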