Requires openFrameworks 0.8.0 or higher: http://www.openframeworks.cc/setup/raspberrypi/
The SD Cards/content distributed at resonate.io used a pre-release version of openFrameworks. If you need the version of this repo that is compatible with the pre-release you can find that here: https://github.com/andreasmuller/RaspberryPiWorkshop/releases/tag/resonate
Hello world, openFrameworks style!
The multi-screen applications need a little bit of setup. In the data/Settings/ folder there are 3 files to know about:
If this file exists, that computer will act as the server. It will still draw things to screen.
This contains the information about which screen this client is.
<Settings>
    <ScreenIndex>0</ScreenIndex>
</Settings>
This contains information needed when acting as a server. The ServerSendHost should be the multicast address of your network. On OSX you can find this out by typing ifconfig into the terminal.
By specifying the multicast address for the network, the clients will automatically find the server and no IP information is needed by each client.
<Settings>
    <ServerSendHost>10.255.255.255</ServerSendHost>
    <ServerSendPort>7778</ServerSendPort>
    <ServerReceivePort>7777</ServerReceivePort>
</Settings>
A simple standalone server app that does no drawing of its own; it only sends out the server time to each client. This is used by MultiScreenSimpleSync.
Takes the synced time from StandaloneTimingServer and displays a rotating circle and pulses the background. If the multicast address is set up correctly, this animation will be the same for any client node that is started.
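The idea behind the synced animation can be sketched as a pure function of the server time. This is an illustrative sketch, not the project's actual code: the struct, function names, and the 4-second period are assumptions. The point is that every client computes identical state from the same synced clock, so no further communication is needed per frame.

```cpp
#include <cmath>

// Hypothetical helper: every client derives the same animation state from the
// synced server time, so all screens stay in step.
struct SyncedFrame {
    float angleDegrees; // rotation of the circle
    float background;   // background brightness, 0..255
};

SyncedFrame frameForTime(double serverTimeSeconds) {
    const double period = 4.0; // assumed: one full rotation every 4 seconds
    const double twoPi  = 6.283185307179586;
    double phase = std::fmod(serverTimeSeconds, period) / period; // 0..1
    SyncedFrame f;
    f.angleDegrees = static_cast<float>(phase * 360.0);
    // Pulse the background with a sine wave remapped from [-1,1] to [0,255].
    f.background = static_cast<float>((std::sin(phase * twoPi) * 0.5 + 0.5) * 255.0);
    return f;
}
```

In an openFrameworks app, draw() would call something like this once per frame with the latest time received from StandaloneTimingServer.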
A basic demo of an application that draws things over multiple screens.
Here we use the same program to act as both the server and the client, as dictated by the existence of IsServer.txt in data/Settings. Only one computer on the network should act as the server.
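The role check itself is simple: the node is the server if and only if the marker file exists. A minimal sketch, with the path handling simplified (openFrameworks would resolve the path via ofToDataPath or use ofFile::doesFileExist):

```cpp
#include <fstream>
#include <string>

// Sketch of the server/client role check: this node acts as the server only
// if <settingsDir>/IsServer.txt exists. Function name is illustrative.
bool shouldActAsServer(const std::string& settingsDir) {
    std::ifstream f(settingsDir + "/IsServer.txt");
    return f.good(); // file exists and is readable -> act as the server
}
```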
This program will spawn particles that will travel across the screens. How many screens the server assumes it is creating content for can be set by the variable screenAmount in the update() function.
The screen a client node is rendering is read from ClientSettings.xml, but you can press the number keys on your keyboard to change this at runtime.
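One way to picture the multi-screen rendering: the server generates content in one wide virtual canvas, and each client shifts that canvas left by its screen index before drawing. The names below are illustrative, not from the project; in openFrameworks the shift would be an ofTranslate() at the top of draw().

```cpp
// Assumed convention: screen 0 shows world x in [0, w), screen 1 shows
// [w, 2w), and so on, where w is the width of one screen in pixels.
struct ScreenOffset { float x; float y; };

ScreenOffset offsetForScreen(int screenIndex, float screenWidth) {
    // Shift the whole world left so this screen's slice lands at local x = 0.
    return ScreenOffset{ -screenIndex * screenWidth, 0.0f };
}
```

For example, a particle at world x = 1000 on 800-pixel-wide screens draws at local x = 200 on screen 1 (1000 plus an offset of -800).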
These network techniques are obviously great for writing programs that create graphics over multiple screens, but there are many other uses as well. In this program each Pi will do some computer vision on a piece of video (a live stream in a real world scenario), process the result into an outline and send it off to a central server.
Here we simply display the results on the central server, but this technique could be useful for merging the data from a large number of Pis and cameras, for instance to track users in a large space.
Indeed, decoding 1080p video and running shaders on the result!
The projects whose names begin with ShaderExample are various examples of shaders that work with the new GLES2 Renderer in OF.
This example is an OpenGL ES 2 compatible version of the alphaMaskingShaderExample found in openFrameworks/examples/gl/alphaMaskingShaderExample. It is a good project for comparing the differences in GLSL syntax, as shaders written for the RPi will not work on the desktop (or vice versa).
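To give a flavour of those syntax differences, here is a minimal texture-sampling fragment shader in both dialects. The uniform and varying names are illustrative, not taken from the example:

```glsl
// OpenGL ES 2 (Raspberry Pi): requires a default precision qualifier and
// uses the older built-ins: varying, texture2D(), gl_FragColor.
precision mediump float;

uniform sampler2D tex0;
varying vec2 texCoordVarying;

void main() {
    gl_FragColor = texture2D(tex0, texCoordVarying);
}

// The desktop GLSL 1.50 equivalent: no precision qualifier, 'in' instead of
// 'varying', a user-declared output instead of gl_FragColor, and texture()
// instead of texture2D():
//
//   #version 150
//   uniform sampler2D tex0;
//   in vec2 texCoordVarying;
//   out vec4 outputColor;
//   void main() { outputColor = texture(tex0, texCoordVarying); }
```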
A port of openFrameworks/examples/gl/billboardExample that contains both desktop- and RPi-compatible shaders.
A nice feature of the new OpenGL ES 2 renderer is that the shaders use the same version of GLSL that works with WebGL. The following projects use shaders that were ported from ShaderToy. https://www.shadertoy.com/
This example is a good starting point for writing your own shader. The project is set up to load an empty vertex and fragment shader from the bin/data folder.
The Sony PS3 Eye is a popular USB camera in both the OpenCv and openFrameworks community. It is fairly inexpensive and works well on the RPi at 320x240@60fps. openFrameworks video capture works with the Pi out of the box, but with some simple extra commands you can offload some of the colorspace conversion to the hardware.
This example allows you to access the depth and RGB streams of an Asus Xtion Pro Live depth camera using OpenNI. It uses the newer OpenNI2, which has a much simpler syntax than the previous version. The project also works on the Mac desktop, which is useful for recording .oni files that the project can also play back. http://www.asus.com/Multimedia/Xtion_PRO/
This project uses SPI, which Raspbian disables by default. You must edit the file /etc/modprobe.d/raspi-blacklist.conf and comment out or remove the 2 lines (tutorial here: http://www.skpang.co.uk/blog/archives/575). If you are using an SD Card from the Resonate workshop this was already done for you.
This example needs to be run as root to access the GPIO features. Once you have compiled the project, use the command sudo make run to run the program.
wiringPi (https://projects.drogon.net/raspberry-pi/wiringpi/) is a great library that makes working with the RPi's GPIO very Arduino-like. This example shows you how to use it with OF and an analog-to-digital converter (ADC) to change the playback speed of a sound.
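As a sketch of the idea, here is how a 10-bit ADC reading could be decoded and mapped to a playback speed. The MCP3008 chip and the 0.5x-2.0x speed range are assumptions for illustration (the text only says "an ADC"); the wiringPi SPI transaction is shown in comments, and the decoded value would ultimately feed something like ofSoundPlayer::setSpeed():

```cpp
#include <cstdint>

// Assumed chip: MCP3008 10-bit ADC over SPI. With wiringPi the transaction
// for a single-ended read of one channel would look like:
//   wiringPiSPISetup(0, 1000000);                         // SPI channel 0, 1 MHz
//   uint8_t buf[3] = { 1, uint8_t((8 + channel) << 4), 0 };
//   wiringPiSPIDataRW(0, buf, 3);                          // full-duplex transfer
// The reply bytes then decode like this:
int decodeMCP3008(const uint8_t buf[3]) {
    return ((buf[1] & 0x03) << 8) | buf[2]; // 10-bit value, 0..1023
}

// Map the raw reading to a playback speed of 0.5x..2.0x (range is
// illustrative); pass the result to the sound player each frame.
float adcToSpeed(int raw) {
    return 0.5f + (raw / 1023.0f) * 1.5f;
}
```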