timothyb89

JFFmpeg

JFFmpeg is a Java-based wrapper library for FFmpeg, designed to accommodate video editing in Java applications. It's a part of JMEdit.

It provides a command-based interface to FFmpeg that uses no native libraries and is version independent. In other words, it's intended to be as portable as possible, and only depends on the system having an available FFmpeg install.
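To illustrate the command-based approach (no JNI bindings, just the system's `ffmpeg` binary), here is a minimal hypothetical sketch of how such a wrapper can build an argument list and hand it to `ProcessBuilder`. The class and method names here are illustrative, not JFFmpeg's actual API:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: instead of binding to libavcodec/libavformat natively,
// a command-based wrapper builds an argument list and runs the system's
// ffmpeg binary as an external process.
class FFmpegCommand {
    private final List<String> args = new ArrayList<>();

    FFmpegCommand(String binary) {
        args.add(binary);
    }

    FFmpegCommand input(String path) {
        args.add("-i");
        args.add(path);
        return this;
    }

    FFmpegCommand output(String path) {
        args.add(path);
        return this;
    }

    // The full command line that would be handed to ProcessBuilder.
    List<String> build() {
        return new ArrayList<>(args);
    }

    Process start() throws java.io.IOException {
        return new ProcessBuilder(build()).redirectErrorStream(true).start();
    }
}
```

Because only the command line changes between FFmpeg versions, this style of wrapper stays portable across installs, which is the point of the design described above.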

The API mainly focuses on the features required by JMEdit (that is, rendering to dumped frames) and ease of use. Who knew that you could render to a video in under 10 lines of code?

:::java
// get an FFmpeg instance
FFmpeg ffmpeg = new FFmpeg();

// get the media file to annotate
MediaFile file = ffmpeg.getMediaFile("/path/to/somevideo.webm");
// dump audio (optional, but probably wanted)
file.dumpAudio();
// dump the frames in the file to disk as jpgs
file.dumpFrames();

// create annotation
Annotation a = new Annotation("Some annotation text here!", file);
// set default values for the style
a.buildStyle().defaults();
// render over the specified range of frames
a.render(file.getFramesAt(1, 6)); // from 1 second to 6 seconds

// save as a new file
file.rebuild(new File("/path/to/annotated/file.webm"));

A unique feature of JFFmpeg is its advanced 2D rendering and animation support. It includes a lightweight GUI library designed specifically for use in videos and 2D animation. Modeled after Swing, it makes keyframe animation of components and custom animations extremely easy. These "components" can be anything: text, images, or even another video entirely. Check out the demos below to see what sort of animations can be done with JFFmpeg.
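The keyframe-with-easing idea can be sketched in a few lines. This is a hypothetical illustration of the concept, not JFFmpeg's actual animation classes; it interpolates a single property between two keyframes using smoothstep ease-in-out:

```java
// Hypothetical sketch of keyframe interpolation with easing, in the spirit
// of JFFmpeg's animation support (names are illustrative, not the real API).
class Keyframe {
    final int frame;      // frame number the keyframe applies to
    final double value;   // animated property value (e.g. x position)
    Keyframe(int frame, double value) { this.frame = frame; this.value = value; }
}

class EasedTrack {
    private final Keyframe start, end;
    EasedTrack(Keyframe start, Keyframe end) { this.start = start; this.end = end; }

    // Smoothstep ease-in-out between the two keyframes.
    double valueAt(int frame) {
        if (frame <= start.frame) return start.value;
        if (frame >= end.frame) return end.value;
        double t = (frame - start.frame) / (double) (end.frame - start.frame);
        double eased = t * t * (3 - 2 * t); // smoothstep easing curve
        return start.value + (end.value - start.value) * eased;
    }
}
```

A component's renderer would then query the track once per frame while dumping frames, giving smooth acceleration and deceleration instead of a linear slide.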

Demonstrations

  • Animation of an Image component: Demonstrates keyframe animation of a single component, as well as the built-in easing support.
  • Embedding a Video: Uses the same animations as in the previous video, but uses another video in place of a static image.
  • Seamless Video Embedding: Embeds another video, but automatically filters and removes the background to make the embedding seamless.

For all available demonstrations, take a look at this YouTube playlist.

Main Features

  • Can easily get detailed metadata from FFmpeg, including stream info such as video framerate, codec, and format info.
  • Allows developers to create video effects that render onto videos as if they were a normal graphics environment.
  • Easy rendering and animation on top of existing videos.
  • Embedding of other videos inside a preexisting video.
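As a concrete example of the metadata point above, stream details can be scraped from the text FFmpeg prints for `-i`. The following is a hypothetical sketch (JFFmpeg's real parsing is more thorough, and these class names are illustrative) that pulls the codec name and framerate out of a video stream line:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch of scraping stream metadata from ffmpeg's "-i" output.
class StreamInfo {
    final String codec;
    final double fps;
    StreamInfo(String codec, double fps) { this.codec = codec; this.fps = fps; }

    // Matches lines like: "Stream #0:0: Video: vp8, yuv420p, 1280x720, 30 fps, ..."
    private static final Pattern VIDEO_LINE =
        Pattern.compile("Video: (\\w+).*?([\\d.]+) fps");

    static StreamInfo parse(String line) {
        Matcher m = VIDEO_LINE.matcher(line);
        if (!m.find()) throw new IllegalArgumentException("not a video stream line");
        return new StreamInfo(m.group(1), Double.parseDouble(m.group(2)));
    }
}
```

Since the command-based design reads FFmpeg's own output rather than a native struct, the same scraping works regardless of which FFmpeg build is installed.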

Are there any limitations?

Yes. Currently it covers only a small fraction of the features available through FFmpeg. There are also some other outstanding issues:

  • Video combination: Limited support. Frames can be inserted at any point in a video; however, this can cause audio issues (see below).
  • Audio manipulation: Currently, audio is only preserved. As FFmpeg wasn't really designed for this, it isn't very good at some simple tasks, like splicing audio or generating tones/silence. Something like SoX may be used in the future. At present, the easiest method is to modify the audio track manually in an editor like Audacity and merge it back with the video.
  • Special parsers need to be written for each container format. Currently supported formats include WebM, with preliminary support for AVI and MP4. Actual video codecs (like VP8) should all work, and their info is grabbed from FFmpeg at runtime. On a positive note, writing parsers is trivial given a couple of example info blocks.

There are plenty of others as well. Essentially, the current (alpha) version does what we need for JMEdit and not much else.
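The per-container parser limitation above suggests a simple plugin shape: one parser per container, looked up by file extension. This is a hypothetical sketch of that idea (names are illustrative, not JFFmpeg's actual classes):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: each container format gets its own parser,
// registered and looked up by file extension.
interface ContainerParser {
    String describe(); // e.g. which container this parser understands
}

class ParserRegistry {
    private final Map<String, ContainerParser> parsers = new HashMap<>();

    void register(String extension, ContainerParser parser) {
        parsers.put(extension.toLowerCase(), parser);
    }

    ContainerParser forFile(String filename) {
        int dot = filename.lastIndexOf('.');
        String ext = dot >= 0 ? filename.substring(dot + 1).toLowerCase() : "";
        ContainerParser p = parsers.get(ext);
        if (p == null) throw new UnsupportedOperationException("no parser for ." + ext);
        return p;
    }
}
```

Adding AVI or MP4 support in such a design means registering one more parser, which matches the claim that new parsers are trivial to write given example info blocks.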