Capture Live Video Feeds from JMonkeyEngine


Written by:

Robert McIntyre

1 The Problem

So you've made your cool new JMonkeyEngine3 game and you want to create a demo video to show off your hard work. Screen capturing is the most straightforward way to do this, but it can slow down your game and produce low-quality video as a result. A better way is to record a video feed directly from the game while it is running.

In this post, I'll explain how you can alter your JMonkeyEngine3 game to output video while it is running. The main trick is to alter the pace of JMonkeyEngine3's in-game time: we allow the engine as much time as it needs to compute complicated in-game events and to encode video frames. As a result, the game appears to speed up and slow down as the computational demands shift, but the end result is perfectly smooth video output at a constant framerate.

2 Video recording requires a steady framerate

2.1 The built-in Timer rushes to keep up.

Standard JME3 applications use a Timer object to manage time in the simulated world. Because most JME3 applications (e.g. games) are supposed to happen “live”, the built-in Timer requires simulated time to match real time. This means that the application must rush to finish all of its calculations on schedule: the more complicated the calculations, the more the application is obligated to rush. And if the workload becomes too much to handle on schedule, Timer forces the application to cut corners: it demands fast, approximate answers instead of careful, accurate ones. Although this policy sometimes causes physically impossible glitches and choppy framerates, it ensures that the user will never be kept waiting while the computer stops to make a complicated calculation.

Now, the built-in Timer values speed over accuracy because real-time applications require it. On the other hand, if your goal is to record a glitch-free video, you need a Timer that will take its time to ensure that all calculations are accurate, even if they take a long time. In the next section, we will create a new kind of Timer—called IsoTimer—which slows down to let the computer finish all its calculations. The result is a perfectly steady framerate and a flawless physical simulation.

2.2 IsoTimer records time like a metronome

The easiest way to achieve this special timing is to create a new Timer that advances in-game time by the same fixed amount every frame, no matter how much real time each frame took to compute.


package com.aurellem.capture;

import com.jme3.system.Timer;

/**
 * A standard JME3 application that extends SimpleApplication or
 * Application tries as hard as it can to keep in sync with
 * user-time. If a ball is rolling at 1 game-mile per game-hour in the
 * game, and you wait for one user-hour as measured by the clock on
 * your wall, then the ball should have traveled exactly one
 * game-mile. In order to keep sync with the real world, the game
 * throttles its physics engine and graphics display. If the
 * computations involved in running the game are too intense, then the
 * game will first skip frames, then sacrifice physics accuracy. If
 * there are particularly demanding computations, then you may only
 * get 1 fps, and the ball may tunnel through the floor or obstacles
 * due to inaccurate physics simulation, but after the end of one
 * user-hour, that ball will have traveled one game-mile.
 *
 * When we're recording video or audio, we don't care if the game-time
 * syncs with user-time, but instead whether the time in the recorded
 * video (video-time) syncs with user-time. To continue the analogy,
 * if we recorded the ball rolling at 1 game-mile per game-hour and
 * watched the video later, we would want to see 30 fps video of the
 * ball rolling at 1 video-mile per user-hour. It doesn't matter how
 * much user-time it took to simulate that hour of game-time to make
 * the high-quality recording.  If an Application uses this IsoTimer
 * instead of the normal one, we can be sure that every call to
 * simpleUpdate, for example, corresponds to exactly (1 / fps) seconds
 * of game-time. This lets us record perfect video and audio even on
 * a slow computer.
 *
 * @author Robert McIntyre
 */
public class IsoTimer extends Timer {

        private long framerate;
        private int ticks;

        public IsoTimer(float framerate){
                this.framerate = (long) framerate;
                this.ticks = 0;
        }

        public long getTime() {
                return ticks;
        }

        public long getResolution() {
                return framerate;
        }

        public float getFrameRate() {
                return framerate;
        }

        public float getTimePerFrame() {
                return (float) (1.0f / framerate);
        }

        public void update() {this.ticks++;}

        public void reset() {this.ticks = 0;}
}


If an Application uses this IsoTimer instead of the normal one, we can be sure that every call to simpleUpdate, for example, corresponds to exactly \((\frac{1}{fps})\) seconds of game-time.
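To see that arithmetic concretely, here is a tiny JME3-free sketch (the class name IsoTimerSketch is made up for illustration; it is not part of the library) showing that with a metronome-style timer, elapsed game-time depends only on the number of update calls, not on how long each frame took to compute:

```java
// A minimal, self-contained sketch of the metronome idea: each
// update() advances time by exactly one tick, so elapsed game-time
// is always ticks / framerate, regardless of real elapsed time.
public class IsoTimerSketch {
    private final long framerate;
    private int ticks = 0;

    public IsoTimerSketch(float framerate) {
        this.framerate = (long) framerate;
    }

    // called once per frame, like Timer.update() in JME3
    public void update() { ticks++; }

    // seconds of game-time that have elapsed so far
    public float getTimeInSeconds() { return (float) ticks / framerate; }

    public static void main(String[] args) {
        IsoTimerSketch timer = new IsoTimerSketch(30);
        for (int frame = 0; frame < 90; frame++) {
            timer.update();  // one simpleUpdate() call per frame
        }
        // 90 frames at 30 fps is exactly 3 seconds of game-time
        System.out.println(timer.getTimeInSeconds());  // prints 3.0
    }
}
```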

3 VideoRecorder manages video feeds in JMonkeyEngine.

3.1 AbstractVideoRecorder provides a general framework for managing videos.

Now that the issue of time is solved, we just need a function that writes each frame to a video. We can put this function somewhere where it will be called exactly once per frame.

The basic functions that a VideoRecorder should support are recording, starting, stopping, and possibly a cleanup step where it finalizes the recording (e.g. by writing headers for a video file).

An appropriate interface describing this behavior could look like this:



import java.awt.image.BufferedImage;

public interface VideoRecorder{

        /**
         * Write this image to video, disk, etc.
         * @param image the image to write
         */
        void record(BufferedImage image);

        /**
         * Stop recording temporarily.  The recording can be started again
         * with start().
         */
        void pause();

        /**
         * Start the recording.
         */
        void start();

        /**
         * Closes the video file, writing appropriate headers, trailers, etc.
         * After this is called, no more recording can be done.
         */
        void finish();
}

JME3 already provides exactly the class we need: the SceneProcessor class can be attached to any viewport and the methods defined therein will be called at the appropriate points in the rendering process.

However, it is also important to properly close the video stream and write headers and such. Even though SceneProcessor has a .cleanup() method, it is only called when the SceneProcessor is removed from the RenderManager, not when the game shuts down (when the user presses ESC, for example). To obtain reliable shutdown behavior, we also have to implement AppState, which provides a .cleanup() method that is called on shutdown.

Here is an AbstractVideoRecorder class that takes care of the details of setup and teardown.



import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import java.nio.ByteBuffer;

import com.jme3.app.Application;
import com.jme3.app.state.AppState;
import com.jme3.app.state.AppStateManager;
import com.jme3.post.SceneProcessor;
import com.jme3.renderer.Camera;
import com.jme3.renderer.RenderManager;
import com.jme3.renderer.ViewPort;
import com.jme3.renderer.queue.RenderQueue;
import com.jme3.texture.FrameBuffer;
import com.jme3.util.BufferUtils;
import com.jme3.util.Screenshots;

/**
 * <code>VideoRecorder</code> copies the frames it receives to video. 
 * To ensure smooth video at a constant framerate, you should set your 
 * application's timer to a new <code>IsoTimer</code>.  This class will 
 * auto-determine the framerate of the video based on the time difference 
 * between the first two frames it receives, although you can manually set 
 * the framerate by calling <code>setFps(newFramerate)</code>.  Be sure to 
 * place this processor *after* any other processors whose effects you want 
 * to be included in the output video. You can attach multiple 
 * <code>VideoProcessor</code>s to the same <code>ViewPort</code>.
 * For example,
 * <code>
 * someViewPort.addProcessor(new VideoProcessor(file1));
 * someViewPort.addProcessor(someShadowRenderer);
 * someViewPort.addProcessor(new VideoProcessor(file2));
 * </code>
 * will output a video without shadows to <code>file1</code> and a video 
 * with shadows to <code>file2</code>.
 *
 * @author Robert McIntyre
 */
public abstract class AbstractVideoRecorder 
        implements SceneProcessor, VideoRecorder, AppState{

        final File output;
        Camera camera;
        int width;
        int height;
        String targetFileName;
        FrameBuffer frameBuffer;
        Double fps = null;
        RenderManager renderManager;
        ByteBuffer byteBuffer;
        BufferedImage rawFrame;
        boolean isInitialized = false;
        boolean paused = false;

        public AbstractVideoRecorder(File output) throws IOException {
                this.output = output;
                this.targetFileName = this.output.getCanonicalPath();
        }

        public double getFps() {return this.fps;}

        public AbstractVideoRecorder setFps(double fps) {
                this.fps = fps;
                return this;
        }

        // methods from SceneProcessor
        public void initialize(RenderManager rm, ViewPort viewPort) {
                Camera camera = viewPort.getCamera();
                this.width = camera.getWidth();
                this.height = camera.getHeight();
                rawFrame = new BufferedImage(width, height,
                                BufferedImage.TYPE_4BYTE_ABGR);
                byteBuffer = BufferUtils.createByteBuffer(width * height * 4);
                this.renderManager = rm;
                this.isInitialized = true;
        }

        public void reshape(ViewPort vp, int w, int h) {}

        public boolean isInitialized() {return this.isInitialized;}

        public void preFrame(float tpf) {
                if (null == this.fps){
                        this.setFps(1.0 / tpf);
                }
        }

        public void postQueue(RenderQueue rq) {}

        public void postFrame(FrameBuffer out) {
                if (!this.paused){
                        renderManager.getRenderer().readFrameBuffer(out, byteBuffer);
                        Screenshots.convertScreenShot(byteBuffer, rawFrame);
                        record(rawFrame);
                }
        }

        // this single cleanup() satisfies both SceneProcessor and AppState
        public void cleanup(){
                this.pause();
                this.finish();
        }

        public void pause(){
                this.paused = true;
        }

        public void start(){
                this.paused = false;
        }

        // methods from AppState
        public void initialize(AppStateManager stateManager, Application app) {}

        public void setEnabled(boolean active) {
                if (active) {this.start();}
                else {this.pause();}
        }

        public boolean isEnabled() {
                return !this.paused;
        }

        public void stateAttached(AppStateManager stateManager) {}

        public void stateDetached(AppStateManager stateManager) {}

        public void update(float tpf) {}
        public void render(RenderManager rm) {}
        public void postRender() {}
}

3.2 There are many options for handling video files in Java

If you want to generate video from Java, a great option is Xuggle. It takes care of everything related to video encoding and decoding and runs on Windows, Linux and Mac. Out of all the video frameworks for Java, I personally like this one the best.

Here is a VideoRecorder that uses Xuggle to write each frame to a video file.


import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import java.util.concurrent.TimeUnit;

import com.xuggle.mediatool.IMediaWriter;
import com.xuggle.mediatool.ToolFactory;
import com.xuggle.xuggler.IRational;

/**
 * Handles writing video files using Xuggle.
 *
 * @author Robert McIntyre
 */
public class XuggleVideoRecorder extends AbstractVideoRecorder{

        IMediaWriter writer;
        BufferedImage frame;
        int videoChannel = 0;
        long currentTimeStamp = 0;
        boolean videoReady = false;

        public XuggleVideoRecorder(File output) 
                throws IOException {super(output);}

        public void initVideo(){
                this.frame = new BufferedImage(
                                width, height,
                                BufferedImage.TYPE_3BYTE_BGR);
                this.writer = ToolFactory.makeWriter(this.targetFileName);
                writer.addVideoStream(videoChannel, 
                                0, IRational.make(fps), 
                                width, height);
                this.videoReady = true;
        }

        public void record(BufferedImage rawFrame) {
                if (!this.videoReady){initVideo();}
                // convert the Image into the form that Xuggle likes
                this.frame.getGraphics().drawImage(rawFrame, 0, 0, null);
                writer.encodeVideo(videoChannel, 
                                frame,
                                currentTimeStamp, TimeUnit.NANOSECONDS);
                currentTimeStamp += (long) (1000000000.0 / fps);
        }

        public void finish() {
                writer.close();
        }
}

With this, we are able to record video!

However, it can be hard to properly install Xuggle. If you would rather not use Xuggle, here is an alternate class that uses Werner Randelshofer's excellent pure Java AVI file writer.



import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;

import ca.randelshofer.AVIOutputStream;

public class AVIVideoRecorder extends AbstractVideoRecorder{

    AVIOutputStream out = null;
    boolean videoReady = false;
    BufferedImage frame;

    public AVIVideoRecorder(File output) throws IOException {
        super(output);
        this.out = new 
            AVIOutputStream(output, AVIOutputStream.VideoFormat.RAW, 24);
    }

    public void initVideo (){
        frame = new BufferedImage(
                                  width, height,
                                  BufferedImage.TYPE_INT_RGB);
        out.setFrameRate((int) Math.round(this.fps));
        out.setVideoDimension(width, height);
        this.videoReady = true;
    }

    public void record(BufferedImage rawFrame) {
        if (!videoReady){initVideo();}
        this.frame.getGraphics().drawImage(rawFrame, 0, 0, null);
        try {out.writeFrame(frame);}
        catch (IOException e){e.printStackTrace();}
    }

    public void finish() {
        try {out.close();} 
        catch (IOException e) {e.printStackTrace();}
    }
}


This AVIVideoRecorder is more limited than the XuggleVideoRecorder, but requires fewer external dependencies.

Finally, for those of you who prefer to create the final video from a sequence of images, there is the FileVideoRecorder, which records each frame to a folder as a sequentially numbered image file. Note that you have to remember the FPS at which you recorded the video, as this information is lost when saving each frame to a file.
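As a small illustration of what does (and does not) end up in the output directory, here is a hypothetical, standalone sketch of the zero-padded naming scheme (the class and method names are made up for this example); note that the framerate appears nowhere in the names, which is why you have to remember it separately:

```java
// Sketch of sequential frame naming: zero-padding makes the files
// sort correctly, but no fps information survives in the names.
public class FrameNaming {
    public static String frameName(int current, String formatName) {
        return String.format("%07d.%s", current, formatName);
    }

    public static void main(String[] args) {
        System.out.println(frameName(0, "png"));   // prints 0000000.png
        System.out.println(frameName(42, "png"));  // prints 0000042.png
    }
}
```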



import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

public class FileVideoRecorder extends AbstractVideoRecorder{

    int current = 0;
    File outDir;
    String formatName = "png";

    public FileVideoRecorder(File output) throws IOException {
        super(output);
        if (output.exists() 
            && output.isDirectory() 
            && (0 == output.listFiles().length)){
            // good: an empty directory already exists
        }
        else if (!output.exists()){
            output.mkdir();
        }
        else {
            throw new IOException("argument must be either an empty " + 
                                  "directory or a nonexistent one.");
        }
        this.outDir = output;
    }

    public void record(BufferedImage rawFrame) {
        String name = String.format("%07d.%s" , current++, formatName);
        File target = new File(outDir, name);
        try {ImageIO.write(rawFrame, formatName, target);}
        catch (IOException e) {e.printStackTrace();}
    }

    public void finish() {}
}

4 How to record videos yourself

4.1 Include this code.

No matter how complicated your application is, it's easy to add support for video output with just a few lines of code.

And although you can use VideoRecorder to record advanced split-screen videos with multiple views, in the simplest case you want to capture a single view: exactly what's on screen. In this case, the following simple captureVideo method will do the job:

public static void captureVideo(final Application app, 
                                final File file) throws IOException{
    final AbstractVideoRecorder videoRecorder;

    // select the appropriate recorder for the given file type
    if (file.getCanonicalPath().endsWith(".avi")){
        videoRecorder = new AVIVideoRecorder(file);}
    else if (file.isDirectory()){
        videoRecorder = new FileVideoRecorder(file);}
    else { videoRecorder = new XuggleVideoRecorder(file);}

    Callable<Object> thunk = new Callable<Object>(){
        public Object call(){
            ViewPort viewPort = 
                app.getRenderManager()
                .createPostView("aurellem record", app.getCamera());
            viewPort.setClearFlags(false, false, false);
            // get GUI node stuff
            for (Spatial s : app.getGuiViewPort().getScenes()){
                viewPort.attachScene(s);
            }
            app.getStateManager().attach(videoRecorder);
            viewPort.addProcessor(videoRecorder);
            return null;
        }
    };
    app.enqueue(thunk);
}

This method selects the appropriate VideoRecorder class for the file type you specify, and instructs your application to record video to the file.
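The selection rule can be sketched in isolation. Here is a hypothetical, JME3-free distillation of that if/else chain (the class and method names are made up for illustration): ".avi" files go to the pure-Java AVI writer, directories get one image file per frame, and anything else is handed to Xuggle.

```java
// Distilled dispatch rule: which recorder handles which target.
// Note the order matters: the ".avi" check runs before the
// directory check, just as in captureVideo above.
public class RecorderChoice {
    public static String choose(String canonicalPath, boolean isDirectory) {
        if (canonicalPath.endsWith(".avi")) return "AVIVideoRecorder";
        if (isDirectory)                    return "FileVideoRecorder";
        return "XuggleVideoRecorder";
    }

    public static void main(String[] args) {
        System.out.println(choose("/tmp/record.avi", false)); // prints AVIVideoRecorder
        System.out.println(choose("/tmp/frames", true));      // prints FileVideoRecorder
        System.out.println(choose("/tmp/record.flv", false)); // prints XuggleVideoRecorder
    }
}
```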

Now that you have a captureVideo method, you use it like this:

1. Establish an IsoTimer and set its framerate.
   For example, if you want to record video with a framerate of 30 fps,
   include the following line of code somewhere in the initialization of
   your application:

   this.setTimer(new IsoTimer(30));

2. Choose the output file.
   If you want to record from the game's main ViewPort to a file called
   /home/r/record.flv, include the following line of code somewhere
   before you call app.start():

   Capture.captureVideo(app, new File("/home/r/record.flv"));

4.2 Simple example

This example will record video from the ocean scene from the JMonkeyEngine test suite.

File video = File.createTempFile("JME-water-video", ".avi");
captureVideo(app, video);

I've added support for this under a class called com.aurellem.capture.Capture. You can get it here.

4.3 Hello Video! example

I've taken ./jme3/src/test/jme3test/helloworld/ and augmented it with video output as follows:


package com.aurellem.capture.examples;

import java.io.File;
import java.io.IOException;

import jme3test.helloworld.HelloLoop;

import com.jme3.app.Application;

import com.aurellem.capture.Capture;
import com.aurellem.capture.IsoTimer;

/** Recording Video from your Application is simple.  If all you want
 *  to do is record Video, then follow the following steps.
 *
 *  1.) Set the Application's timer to an IsoTimer.  The framerate of
 *  the IsoTimer will determine the framerate of the resulting video.
 *
 *  2.) Call Capture.captureVideo(yourApplication, target-file) before
 *  calling yourApplication.start().
 *
 *  That's it!  If you have any comments/problems, please PM me on the
 *  jMonkeyEngine forums.  My username is bortreb.
 *
 *  @author Robert McIntyre
 */
public class HelloVideoRecording {

    public static void main(String[] ignore) throws IOException {
        Application app = new HelloLoop();
        File video = File.createTempFile("JME-simple-video", ".avi");
        app.setTimer(new IsoTimer(60));
        Capture.captureVideo(app, video);
        app.start();
    }
}
The videos are created in the hello-video directory:

du -h hello-video/*
932K    hello-video/hello-video-moving.flv
640K    hello-video/hello-video-static.flv

And they can be immediately uploaded to YouTube.

5 Showcase of recorded videos

I encoded most of the original JME3 Hello demos for your viewing pleasure, all using the Capture and IsoTimer classes.

5.1 HelloTerrain

5.2 HelloAssets

5.3 HelloEffects

5.4 HelloCollision

5.5 HelloAnimation

5.6 HelloLoop

Author: Robert McIntyre

Created: 2015-04-19 Sun 07:04

Emacs 24.4.1 (Org mode 8.3beta)