OpenGL done.

Oliver Rode committed Mar 26, 2018
1 parent 5574697 commit 68283ff
Showing 14 changed files with 167 additions and 117 deletions.
19 changes: 13 additions & 6 deletions README.md
@@ -1,10 +1,13 @@
# Frankenstein VR
Free Video Converter for Virtual Reality and 3D based on [FFmpeg](https://ffmpeg.org) and [OpenCV](http://www.opencv.org/releases.html) for Java.
It's a small, open-source platform for OpenCV-based video filtering, where custom filters can be simply added into the processing pipeline:
Video Stream Analysis and Manipulation Framework for Java, where custom filters can be simply added into the processing pipeline.

<img src="doc/pipeline.png" width="100%">
The tool supports:
* [OpenCV](http://www.opencv.org/releases.html) - computer vision and machine learning
* [JogAmp (OpenGL, OpenCL, and OpenAL)](http://jogamp.org) - 3D Graphics, Multimedia and Processing
* [FFmpeg](https://ffmpeg.org) - record, convert and stream audio and video.
* [VLC](https://www.videolan.org/vlc/) - video stream recording

For developers: When you work on custom filters, you can concentrate on manipulating images with the OpenCV library. For a list of available filters, see [SegmentFilters](https://github.com/olir/Frankenstein/blob/master/doc/SegmentFilters.md).
<img src="doc/pipeline.png" width="100%">

The pipeline accepts input from a video file, camera, network stream, or pictures stored as left/right 3D slides (e.g. from a Nikon camera).

@@ -14,14 +17,15 @@ VR videos appear like displayed on a virtual 160-inch curved 3D display in front

3D is optional. Hence, Frankenstein VR can be used solely for classic video processing. It focuses on filters that are not part of common tools.


## Screenshots
<img src="doc/config.png" width="45%"> <img src="doc/processing.png" width="45%" />


## Samples
I have uploaded some samples to Vimeo: <a href="https://vimeo.com/user68089135"><img src="doc/vimeo.png"/></a>

## Features
## Basic Features
Frankenstein VR is an experimental video converter with some video filters/features:
- Virtual Reality side-by-side converter (projection, padding, shrinking)
- Anaglyph (e.g. red/blue) to grayscale side-by-side converter
@@ -40,9 +44,9 @@ For current or full status see [Release Notes](https://github.com/olir/Frankenst

# HOWTO run it
Install the Pre-Requisites first (see below); then you have three options to start it:
* _From Maven:_ For the current version, use maven and run it in app folder with **mvn -pl .,app clean package exec:exec**
* _Jar execution:_ Download and execute the jar file from the release (see section below)
* _Java Webstart:_ You can execute releases with Java Webstart (see section below)
* _From Maven:_ For the current version, use maven and run it in app folder with **mvn -pl .,app clean package exec:exec**

## Pre-Requisites
- [FFmpeg 3.1.1+](https://ffmpeg.org) installed. Select its path at first startup (it is stored in frankenstein.ini in your user home)
@@ -66,5 +70,8 @@ FFMPEG build contains H264 encoder based on the OpenH264 library, that should be
- e.g.: [Pre-Release 0.1](https://github.com/olir/Frankenstein/releases/download/0.1/launch.jnlp)
3. Accept warnings and execute.

## For developers
When you work on custom filters, you can concentrate on manipulating images with the OpenCV library. For more details, read [SegmentFilters](https://github.com/olir/Frankenstein/blob/master/doc/SegmentFilters.md).
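The core idea behind the pipeline is that each filter's output frame becomes the next filter's input, exactly as `MovieProcessor` does in its `for (VideoFilter filter : filters)` loop. A minimal, self-contained sketch of that chaining idea — the `UnaryOperator<String>` stand-ins are illustrative only, not the project's `VideoFilter` API:

```java
import java.util.List;
import java.util.function.UnaryOperator;

// Sketch of the filter-pipeline chaining used by MovieProcessor:
// each filter receives the previous filter's output.
public class PipelineSketch {
    public static void main(String[] args) {
        // Stand-ins for video filters; a String label replaces a real Mat frame.
        List<UnaryOperator<String>> filters = List.of(
                s -> s + " -> grayscale",
                s -> s + " -> sideBySide");
        String frame = "frame";
        for (UnaryOperator<String> f : filters) {
            frame = f.apply(frame);   // output becomes the next filter's input
        }
        System.out.println(frame);    // frame -> grayscale -> sideBySide
    }
}
```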



45 changes: 8 additions & 37 deletions app/src/main/java/de/screenflow/frankenstein/MovieProcessor.java
@@ -28,6 +28,7 @@
import de.screenflow.frankenstein.task.Task;
import de.screenflow.frankenstein.task.TaskHandler;
import de.screenflow.frankenstein.task.TimeTaskHandler;
import de.screenflow.frankenstein.vf.DefaultFilterContext;
import de.screenflow.frankenstein.vf.FilterContext;
import de.screenflow.frankenstein.vf.FilterElement;
import de.screenflow.frankenstein.vf.SegmentVideoFilter;
@@ -65,7 +66,7 @@ public class MovieProcessor {
private File ffmpeg;

private SegmentVideoFilter previewFilter=null;

public MovieProcessor(Configuration configuration) {
this.ffmpegPath = configuration.getFFmpegPath();
this.tempPath = new File(configuration.getTempPath());
@@ -92,13 +93,7 @@ public void init(ProcessingListener l) {
}
}

FilterContext context = new FilterContext() {
HashMap<String,Object> valMap = new HashMap<String,Object>();
@Override
public Object getValue(String key) {
return valMap.get(key);
}
};
FilterContext context = new DefaultFilterContext();
newFrame = frame;
for (VideoFilter filter : filters) {
System.out.println("MovieProcessor process " + filter.getClass().getName());
@@ -142,13 +137,7 @@ public void processStreamFrame(ProcessingListener l) {

frame = configuration.getSource().getFrame();
if (frame != null && !frame.empty()) {
FilterContext context = new FilterContext() {
HashMap<String,Object> valMap = new HashMap<String,Object>();
@Override
public Object getValue(String key) {
return valMap.get(key);
}
};
FilterContext context = new DefaultFilterContext();
Mat newFrame = frame;
for (VideoFilter filter : filters) {
// System.out.println("MovieProcessor processStreamFrame " +
@@ -230,13 +219,7 @@ public boolean process(ProcessingListener l) {
}
if (frame != null && !frame.empty()) {
if (!filters.isEmpty()) {
FilterContext context = new FilterContext() {
HashMap<String,Object> valMap = new HashMap<String,Object>();
@Override
public Object getValue(String key) {
return valMap.get(key);
}
};
FilterContext context = new DefaultFilterContext();
newFrame = frame;
for (VideoFilter filter : filters) {
// System.out.println("MovieProcessor
@@ -247,13 +230,7 @@ public Object getValue(String key) {
newFrame = frame;
}
if (localFilters != null && !localFilters.isEmpty()) {
FilterContext context = new FilterContext() {
HashMap<String,Object> valMap = new HashMap<String,Object>();
@Override
public Object getValue(String key) {
return valMap.get(key);
}
};
FilterContext context = new DefaultFilterContext();
for (FilterElement element : localFilters) {
if (element.filter != null) {
if (element.r.start <= i && i < element.r.end) {
@@ -471,13 +448,7 @@ public void seek(final ProcessingListener l, int frameId) {
currentPos = configuration.getSource().seek(frameId, l);
frame = configuration.getSource().getFrame();
if (frame != null && !frame.empty()) {
FilterContext context = new FilterContext() {
HashMap<String,Object> valMap = new HashMap<String,Object>();
@Override
public Object getValue(String key) {
return valMap.get(key);
}
};
FilterContext context = new DefaultFilterContext();
Mat newFrame = frame;
for (VideoFilter filter : filters) {
// System.out.println("MovieProcessor process "+filter.getClass().getName());
@@ -580,6 +551,6 @@ public void handleLine(String line) {
}

public void setPreviewFilter(SegmentVideoFilter selectedFilter) {
previewFilter = selectedFilter;
previewFilter = selectedFilter;
}
}
@@ -0,0 +1,20 @@
package de.screenflow.frankenstein.vf;

import java.util.HashMap;

public class DefaultFilterContext implements FilterContext {
private HashMap<String,Object> valMap = new HashMap<String,Object>();

public DefaultFilterContext() {
}

@Override
public Object getValue(String key) {
return valMap.get(key);
}

@Override
public Object putValue(String key, Object value) {
return valMap.put(key, value);
}
}
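The new `DefaultFilterContext` replaces the anonymous `FilterContext` instances deleted throughout `MovieProcessor`. A standalone usage sketch — the interface and class are reproduced from this commit with the package declaration dropped so the file compiles on its own; the demo `main` is illustrative:

```java
import java.util.HashMap;

// Interface and implementation as introduced in this commit (package dropped).
interface FilterContext {
    Object getValue(String key);
    Object putValue(String key, Object value);
}

class DefaultFilterContext implements FilterContext {
    private final HashMap<String, Object> valMap = new HashMap<>();

    @Override
    public Object getValue(String key) {
        return valMap.get(key);
    }

    @Override
    public Object putValue(String key, Object value) {
        return valMap.put(key, value);   // HashMap semantics: returns previous value
    }
}

public class ContextDemo {
    public static void main(String[] args) {
        FilterContext ctx = new DefaultFilterContext();
        System.out.println(ctx.putValue("brightness", 50));  // null (no previous value)
        System.out.println(ctx.putValue("brightness", 75));  // 50 (previous value)
        System.out.println(ctx.getValue("brightness"));      // 75
    }
}
```

Note that `putValue` follows `HashMap.put` and hands back the previous value, which lets a filter detect whether a key was already set by an earlier pipeline stage.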
@@ -2,4 +2,5 @@

public interface FilterContext {
	Object getValue(String key);
	Object putValue(String key, Object value);
}
@@ -2,24 +2,17 @@

import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;
import java.awt.image.DataBufferInt;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

import javax.imageio.ImageIO;

import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.MatOfByte;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.imgproc.Imgproc;

import com.jogamp.opengl.DefaultGLCapabilitiesChooser;
import com.jogamp.opengl.GL2;
@@ -29,18 +22,30 @@
import com.jogamp.opengl.GLProfile;
import com.jogamp.opengl.fixedfunc.GLMatrixFunc;
import com.jogamp.opengl.util.awt.AWTGLReadBufferUtil;
import com.jogamp.opengl.util.gl2.GLUT;

import de.screenflow.frankenstein.vf.FilterContext;

public class GLExampleFilter extends DefaultSegmentFilter {
public class GLExampleFilter extends NativeSegmentFilter<GLExampleConfigController> {

private final static String JNI_FILTER_CLASS = "de.screenflow.frankenstein.vf.jni.MatBlender";

private final Method jniProxyProcessMethod;

Mat glFrame;
GLProfile glp;
GLCapabilities caps;


@SuppressWarnings("unchecked")
public GLExampleFilter() {
super("glexample");

super("glexample", JNI_FILTER_CLASS);
try {
jniProxyProcessMethod = getJniProxyClass().getMethod("process", Object.class, int.class, Object.class,
Object.class);
} catch (NoSuchMethodException | SecurityException | IllegalArgumentException e) {
throw new RuntimeException("jni wrapper creation failed", e);
}

glp = GLProfile.getDefault();
caps = new GLCapabilities(glp);
caps.setHardwareAccelerated(true);
@@ -55,54 +60,36 @@ public GLExampleFilter() {

@Override
public Mat process(Mat sourceFrame, int frameId, FilterContext context) {
System.out.println("c = "+sourceFrame.cols()+" r = "+sourceFrame.rows());
// System.out.println("c = "+sourceFrame.cols()+" r =
// "+sourceFrame.rows());
if (glFrame == null || glFrame.cols() != sourceFrame.cols() || glFrame.rows() != sourceFrame.rows()) {
glFrame = sourceFrame.clone();
}

GLAutoDrawable drawable = init(glFrame.cols(), glFrame.rows());
BufferedImage bufImg = render(drawable, glFrame.cols(), glFrame.rows());
BufferedImage bufImg = render(drawable, glFrame.cols(), glFrame.rows(), frameId);

BufferedImage image = new BufferedImage(bufImg.getWidth(), bufImg.getHeight(), BufferedImage.TYPE_4BYTE_ABGR);
image.getGraphics().drawImage(bufImg, 0, 0, null);
glFrame = new Mat(image.getHeight(), image.getWidth(), CvType.CV_8UC4);
byte[] data = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();
glFrame.put(0, 0, data);

return sourceFrame;

/*
try {
return BufferedImage2Mat(image);
}
catch (IOException e) {
jniProxyProcessMethod.invoke(getJniProxy(), sourceFrame, frameId, context, glFrame);
} catch (IllegalAccessException | IllegalArgumentException | InvocationTargetException e) {
e.printStackTrace();
return sourceFrame;
}
*/
}

public static Mat BufferedImage2Mat(BufferedImage image) throws IOException {
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
ImageIO.write(image, "png", byteArrayOutputStream);
byteArrayOutputStream.flush();
return Imgcodecs.imdecode(new MatOfByte(byteArrayOutputStream.toByteArray()), Imgcodecs.CV_LOAD_IMAGE_UNCHANGED);
return sourceFrame;
}

public static BufferedImage Mat2BufferedImage(Mat matrix)throws IOException {
MatOfByte mob=new MatOfByte();
Imgcodecs.imencode(".png", matrix, mob);
return ImageIO.read(new ByteArrayInputStream(mob.toArray()));
}

@Override
protected void initializeController() {
// getConfigController(). ...
}

static int numPoints = 100;
static Random r = new Random();

public GLAutoDrawable init(int width, int height) {
public GLAutoDrawable init(int width, int height) {
GLDrawableFactory factory = GLDrawableFactory.getFactory(glp);

GLAutoDrawable drawable = factory.createOffscreenAutoDrawable(factory.getDefaultDevice(), caps,
@@ -112,25 +99,7 @@ public GLAutoDrawable init(int width, int height) {
return drawable;
}

private BufferedImage render(GLAutoDrawable drawable, int width, int height) {

List<Float> data = new ArrayList<Float>(numPoints * 2);

// simulate some data here
for (int i = 0; i < numPoints; i++) {
float x = r.nextInt(width);
float y = r.nextInt(height);
data.add(x);
data.add(y);
}

// x and y for each point, 4 bytes for each
FloatBuffer buffer = ByteBuffer.allocateDirect(numPoints * 2 * 4).order(ByteOrder.nativeOrder())
.asFloatBuffer();
for (Float d : data) {
buffer.put(d);
}
buffer.rewind();
private BufferedImage render(GLAutoDrawable drawable, int width, int height, int frameId) {

GL2 gl = drawable.getGL().getGL2();

@@ -142,16 +111,16 @@ private BufferedImage render(GLAutoDrawable drawable, int width, int height) {

gl.glOrtho(0d, width, height, 0d, -1d, 1d);
gl.glPointSize(4f);
gl.glColor3f(1f, 0f, 0f);
gl.glColor3f(0.8f, 0.8f, 0.8f);

gl.glEnableClientState(GL2.GL_VERTEX_ARRAY);
gl.glVertexPointer(2, GL2.GL_FLOAT, 0, buffer);
gl.glDrawArrays(GL2.GL_POINTS, 0, numPoints);
gl.glDisableClientState(GL2.GL_VERTEX_ARRAY);
GLUT glut = new GLUT();
String text = ""+frameId;
gl.glRasterPos3d(111, 111, 0);
glut.glutBitmapString(GLUT.BITMAP_TIMES_ROMAN_24, text);

BufferedImage im = new AWTGLReadBufferUtil(drawable.getGLProfile(), true)
.readPixelsToBufferedImage(drawable.getGL(), 0, 0, width, height, true);

return im;
}
}
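The `render`/`process` path reads GL pixels into a `TYPE_4BYTE_ABGR` `BufferedImage` and then copies its backing byte array into an OpenCV `Mat` via `glFrame.put(0, 0, data)`. A standalone sketch of just the `BufferedImage`-to-raw-bytes step — the `Mat` copy is omitted here since it needs native OpenCV:

```java
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;

// TYPE_4BYTE_ABGR stores each pixel as four bytes in A, B, G, R order,
// which is why the filter allocates the Mat as CV_8UC4.
public class PixelDemo {
    public static void main(String[] args) {
        BufferedImage image = new BufferedImage(2, 2, BufferedImage.TYPE_4BYTE_ABGR);
        image.setRGB(0, 0, 0xFFFF0000);   // opaque red at (0,0)

        // Grab the raster's backing byte array, as GLExampleFilter.process does.
        byte[] data = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();
        System.out.println(data.length);  // 2*2 pixels * 4 bytes = 16

        // First pixel in ABGR byte order: alpha, blue, green, red.
        System.out.println((data[0] & 0xFF) + "," + (data[1] & 0xFF) + ","
                + (data[2] & 0xFF) + "," + (data[3] & 0xFF));   // 255,0,0,255
    }
}
```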
@@ -21,19 +21,18 @@
import java.net.MalformedURLException;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.PropertyResourceBundle;
import java.util.ResourceBundle;

import de.screenflow.frankenstein.fxml.FxMain;
import de.screenflow.frankenstein.vf.SegmentVideoFilter;

public abstract class NativeSegmentFilter<C> extends DefaultSegmentFilter implements SegmentVideoFilter {
private static URLClassLoader loader = null;

@SuppressWarnings("rawtypes")
private final Class jniProxyClass;
private final Object jniProxy;
private final Method jniProxyInitMethod;

@SuppressWarnings("unchecked")
protected NativeSegmentFilter(String identifier, String proxyClassName) {
super(identifier);

@@ -64,6 +63,7 @@ static synchronized URLClassLoader getLoader() throws MalformedURLException {
return loader;
}

@SuppressWarnings("rawtypes")
protected Class getJniProxyClass() {
return jniProxyClass;
}
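`NativeSegmentFilter` subclasses such as `GLExampleFilter` look the JNI proxy's `process` method up once by reflection in the constructor and then invoke it per frame. A self-contained sketch of that lookup-then-invoke pattern — `FakeProxy` is a stand-in for the real JNI proxy class, which lives in a separately loaded jar (hence the reflection):

```java
import java.lang.reflect.Method;

// Reflective method lookup and invocation, mirroring the
// getJniProxyClass().getMethod(...) / jniProxyProcessMethod.invoke(...) pattern.
public class ReflectDemo {
    // Stand-in for the JNI proxy; the real class is resolved via a URLClassLoader.
    public static class FakeProxy {
        public String process(Object src, int frameId, Object ctx, Object dst) {
            return "processed frame " + frameId;
        }
    }

    public static void main(String[] args) throws Exception {
        Object proxy = new FakeProxy();
        // Same parameter shape as in this commit: (Object, int, Object, Object).
        Method m = proxy.getClass().getMethod("process",
                Object.class, int.class, Object.class, Object.class);
        Object result = m.invoke(proxy, "src", 42, "ctx", "dst");  // 42 is auto-boxed
        System.out.println(result);   // processed frame 42
    }
}
```

Caching the `Method` once in the constructor (as the commit does) avoids repeating the comparatively expensive `getMethod` lookup on every frame.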
@@ -47,7 +47,7 @@ public Mat process(Mat rgbaImage, int frameId, FilterContext context) {

VideoEqualizerConfigController c = ((VideoEqualizerConfigController)getConfigController());
try {
jniProxyProcessMethod.invoke(getJniProxy(), mHsvMat, frameId, c.getBrightness(), c.getContrast(), c.getSaturation());
jniProxyProcessMethod.invoke(getJniProxy(), mHsvMat, frameId, context, c.getBrightness(), c.getContrast(), c.getSaturation());
} catch (IllegalAccessException | IllegalArgumentException | InvocationTargetException e) {
e.printStackTrace();
}
