Liquid Galaxy Project Ideas Page
A place to list all those cool or wacky ideas you have for Liquid Galaxy rigs. If you have any ideas, please send them to the mailing list.
New Control Devices
Most Liquid Galaxy rigs are controlled by a SpaceNavigator and a touchscreen. We'd like to see other input devices control Google Earth. Obvious examples include the Wii Remote, Microsoft Kinect, Android phones and tablets (using their accelerometers or GPS), head and eye trackers, or other novel input devices.
- Head tracking on a curved screen - University of Texas ECE Senior Design Competition entry
- How about using the LeapFrog interactive toy globe, or another electronic globe, as a controller?
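On Linux, many of these devices (the SpaceNavigator included) show up as evdev nodes under /dev/input, so a first experiment can be a plain event reader. A minimal sketch, assuming the 64-bit struct input_event layout; the device path is illustrative and will differ per machine:

```python
import struct

# Linux input_event layout on 64-bit: struct timeval (two longs),
# then type (u16), code (u16), value (s32).
EVENT_FORMAT = "llHHi"
EVENT_SIZE = struct.calcsize(EVENT_FORMAT)

# Constants from <linux/input-event-codes.h>.
EV_REL = 0x02          # relative-axis events (what a SpaceNavigator reports)
REL_AXES = {0: "x", 1: "y", 2: "z", 3: "rx", 4: "ry", 5: "rz"}

def parse_event(buf):
    """Decode one raw input_event into (kind, axis-name-or-code, value)."""
    _sec, _usec, etype, code, value = struct.unpack(EVENT_FORMAT, buf)
    if etype == EV_REL:
        return ("rel", REL_AXES.get(code, code), value)
    return ("other", code, value)

def read_events(path="/dev/input/event5"):  # device node is an assumption
    """Yield decoded events from a device node (requires read permission)."""
    with open(path, "rb") as dev:
        while True:
            buf = dev.read(EVENT_SIZE)
            if len(buf) < EVENT_SIZE:
                break
            yield parse_event(buf)
```

From here, a controller bridge would translate the six relative axes into camera motion commands for Google Earth.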
Panoramic Content Production
Because Liquid Galaxy uses commodity hardware, we're generally limited to flat display technologies. This means we're displaying multiple flat planar (rectilinear) views arranged in a cylinder. However, most panoramic content publishers use spherical or cylindrical (curvilinear) projections. Displaying this type of content on a Liquid Galaxy requires conversion to multiple planar views. We'd like to see this process automated as much as possible.
- How can we get content (live or otherwise) from spherical or panoramic cameras, e.g. the Point Grey Ladybug?
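As a sketch of the projection math involved in that conversion: each pixel of a flat (rectilinear) view corresponds to a ray, and that ray's longitude and latitude index into the equirectangular source image. The function below (names are illustrative) computes the mapping; a converter would loop over the output raster and sample the source at the returned coordinates:

```python
import math

def planar_to_equirect(px, py, width, height, heading_deg, hfov_deg,
                       src_w, src_h):
    """Map a pixel in a flat (rectilinear) view to source coordinates in a
    360x180-degree equirectangular panorama.  heading_deg is the view's
    centre azimuth; hfov_deg its horizontal field of view."""
    # Focal length in pixels for the requested field of view.
    f = (width / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)
    # Ray through the pixel: camera looks down +z, x right, y down.
    x = px - width / 2.0
    y = py - height / 2.0
    lon = math.radians(heading_deg) + math.atan2(x, f)
    lat = -math.atan2(y, math.hypot(x, f))
    # Equirectangular source: longitude maps linearly to u, latitude to v.
    u = (math.degrees(lon) % 360.0) / 360.0 * src_w
    v = (0.5 - lat / math.pi) * src_h
    return u, v
```

Running this per planar view, with each view's heading offset by its screen position, yields the multi-screen slices a Liquid Galaxy needs.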
Another way of producing 360-degree panoramas and/or point clouds is structure from motion. Possible projects in this field include:
- Dockerize OpenSfM or Bundler
- Create a simple web GUI for managing image sequences and results
- Add a point-cloud viewer using CesiumJS, or any other web viewer for point clouds (optional)
System, Network and Caching Performance Monitoring
We'd like better insight into each level of the Liquid Galaxy stack, especially the multiple levels of caching. A detailed near-real-time performance monitoring solution could help diagnose bottlenecks and configure the Squid HTTP cache for better performance. Metrics from each system should be collected and displayed immediately, including disk usage, networking, CPU and GPU utilization, HTTP cache hits and misses, etc. What is an optimal cache size (or content age) for the Google content?
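As one small piece of such monitoring, cache hit/miss rates can be scraped from Squid's access.log. A minimal sketch, assuming the default native log format in which the fourth whitespace-separated field is the result code (e.g. TCP_HIT/200 or TCP_MISS/200):

```python
def squid_hit_ratio(lines):
    """Compute the cache hit ratio from Squid native-format access.log
    lines.  Assumes the default 'squid' logformat, where the fourth field
    is the result code, e.g. TCP_HIT/200 or TCP_MISS/200."""
    hits = total = 0
    for line in lines:
        fields = line.split()
        if len(fields) < 4:
            continue  # skip blank or truncated lines
        code = fields[3].split("/")[0]
        total += 1
        if "HIT" in code:  # covers TCP_HIT, TCP_MEM_HIT, TCP_IMS_HIT, ...
            hits += 1
    return hits / total if total else 0.0
```

Feeding this a tail of the live log every few seconds would give the near-real-time cache view described above; the same loop could also bucket by URL pattern to separate imagery tiles from other Google content.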
Add or adapt Networked View Synchronisation to other Applications
Google Earth is certainly a "killer app" for the Liquid Galaxy platform, but there are many applications that could easily be enhanced to coordinate multiple instances, each rendering a portion of a panoramic view. Here are a few ideas we have...
- MPlayer has already been patched to enable coordinated playback for PanoramicVideo. However, the existing patch works only for video, not audio. Modify MPlayer to send UDP master packets even when no video frames are available to be rendered, i.e. during audio-only playback. Alternatively, modify VLC to account for bezels when using the "wall" filter for immersive ultra-widescreen movie display, e.g. Cinerama epics!
- Adapt some of the virtual world clients to provide a networked immersive experience on Liquid Galaxy - OpenSimulator, KataSpace, Cave SecondLife, VastPark, MOSES, Delta3D.
- Enable more open-source video games and/or game engines to run in Liquid Galaxy, for example Ogre3D, Alien Arena, the Spring RTS engine, Irrlicht, Panda3D, FlightGear, or Scorched3D. Quake 3 and Cube 2: Sauerbraten could be used as examples.
- Document setups for commercial simulation apps that support multi-machine clusters, such as FSX and X-Plane; surely there are some racing sims too?
- Use/adapt Google Art Project.
- Add viewsync to Cesium.
- Set up scientific data visualisation applications that could be used in a Liquid Galaxy. Some apps may be fairly straightforward; others may require complex configuration or software development. Here are some examples: ParaView, Processing, Xdmx, Chromium (the GL library, not the Google browser), EqualiserGraphics, CalVR suite, COVISE, FreeVR, Mechdyne CAVElib, CSIRO VizStack, Avizo VSG, OssimPlanet, VMD/NAMD, Blender, KDE Marble, OpenInventor, VR Juggler, vrkit, SAGE, Syzygy/Aszgard, Coin3D, Amira (VR option), Stellarium, Celestia, BallView, Microsoft Worldwide Telescope
- A multi-screen YouTube launcher: synchronise playlists across all the displays, or, when doing a video search, show one video on each screen. A Chrome/Firefox extension to open windows/tabs on specific LG screens may be useful here.
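A common thread in these ideas is that every display renders the same camera pose rotated by a fixed yaw offset. A minimal sketch of that geometry, assuming a cylindrical array of identical screens (function and parameter names are illustrative):

```python
def slave_heading(master_heading, screen_index, total_screens, screen_fov):
    """Heading in degrees for one display in a cylindrical multi-screen
    array.  Screens are indexed left to right; with an odd count, the
    centre screen (index total_screens // 2) shows the master view
    unrotated.  screen_fov is each display's horizontal field of view."""
    centre = (total_screens - 1) / 2.0
    return (master_heading + (screen_index - centre) * screen_fov) % 360.0
```

Any application that exposes its camera (a game engine, a video player, a virtual-world client) can be "galaxified" by broadcasting the master pose and having each instance apply this offset before rendering.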
Touch Screen Control Enhancements
- Application (Earth/MPlayer/Sauerbraten) "selection" buttons using xdotool to do window searches and map/unmap, or similar.
- Tour Control (not likely feasible).
- Load/Unload KML from touchscreen.
- Control standard Google Earth features, e.g. toggling layers, the grid, Sun mode, etc.
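The selection-button idea above can shell out to xdotool. A hedged sketch, assuming xdotool's chained-command syntax where `%@` applies the action to every window the search matched; the window titles are illustrative and will need adjusting to your rig:

```python
import subprocess

def selection_commands(show, hide):
    """Build the xdotool invocations that unmap every window whose title
    matches a name in hide, then map and activate windows matching show.
    (In xdotool's chained syntax, %@ means 'all search results'.)"""
    cmds = [["xdotool", "search", "--name", name, "windowunmap", "%@"]
            for name in hide]
    cmds.append(["xdotool", "search", "--name", show, "windowmap", "%@"])
    cmds.append(["xdotool", "search", "--name", show, "windowactivate", "%@"])
    return cmds

def select_app(show, hide=("MPlayer", "Sauerbraten")):
    """Run the commands; wire this to a touchscreen button per app."""
    for cmd in selection_commands(show, hide):
        subprocess.run(cmd)
```

Splitting the command construction from execution keeps the logic testable without an X display.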
Google Earth Networking & Collaboration Enhancements
- A contributed script called "viewsyncrelay" can act as the recipient for the ViewSync packets sent by the Google Earth master. The script broadcasts the packets to the slave nodes in a Galaxy setup. As a middleman, the script can potentially alter the values, execute scripts on the clients, collect statistics, trigger sound effects, etc. Help this script grow into a more functional and extensible tool. Requires Perl (or similar) and familiarity with UDP network communication.
- Connect several LG rigs together for shared virtual tours and Google Earth-based field trips. Can probably be achieved by adapting viewsyncrelay.pl and some ViewSync->KML->ViewSync glue.
- Immersive multi-screen Google+ 'Hangout' video conferencing with friends on a Liquid Galaxy! See NTT t-Room as an example. EVO or AccessGrid may be options; WebRTC and/or the Hangouts API may also help here.
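A relay along these lines boils down to receiving datagrams, optionally rewriting fields, and re-sending to each slave. A minimal sketch (in Python rather than the script's Perl), assuming a comma-separated ViewSync layout with heading as the fifth field; the exact field order is an assumption, so check what your Google Earth build actually emits:

```python
import socket

def offset_heading(packet, degrees):
    """Rewrite the heading field of a ViewSync datagram.  Assumes a
    comma-separated layout of counter, lat, lon, alt, heading, tilt,
    roll, ... with heading at index 4."""
    fields = packet.decode("ascii").split(",")
    fields[4] = str((float(fields[4]) + degrees) % 360.0)
    return ",".join(fields).encode("ascii")

def relay(listen_port, slaves, transform=lambda p: p):
    """Receive ViewSync packets from the master and re-send them,
    optionally transformed, to each (host, port) in slaves.  Runs
    forever; statistics or sound triggers would hook in here too."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", listen_port))
    while True:
        packet, _addr = sock.recvfrom(4096)
        out = transform(packet)
        for slave in slaves:
            sock.sendto(out, slave)
```

The same transform hook is where inter-rig bridging would go: forward the packets to another rig's relay instead of (or as well as) the local slaves.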
GigaPan Viewer on Adobe AIR
- Add UDP broadcast/receive to GigaPan Desktop Viewer
- Update the GigaPan Desktop viewer to the AIR 2.5 namespace so that it can be packaged as an Android .apk
- Extend the AIR usage further to allow Android-controlled (or not) gaming on a Galaxy setup.
Liquid Galaxy System Deployment Automation
Presently, Liquid Galaxy systems are complex to deploy, requiring several hours of one or more experienced Linux system administrators' time. This wiki does have installation instructions, but the process could benefit from better automation, testing and documentation. Features like GUI configuration and automated (even dynamic) personality assignment would put Liquid Galaxy much nearer the reach of enthusiasts.
Bring the desktop Google Earth user-interface experience to Liquid Galaxy
Liquid Galaxy setups are fantastic platforms for showcasing Google layers and datasets. However, it is difficult for users to load their own KML and datasets, or to interact with some features of Google Earth, e.g. turning specific layers on and off. If LG were as easy to interact with as desktop Google Earth, a whole community of education and scientific users would thank you! There are open-source tools that may help here, e.g. Synergy, Sikuli, and Input Director (Windows only).
- HowTos for running Liquid Galaxy on non-Linux platforms.
- Trigger location-based sound effects using a database of geo-located audio, for example the British Library UK Sound Map, the Global Soundscape Network, or UrbanRemix: a water splash when diving into the ocean, the sound of surf near the coast, etc. Can Pumilio (an open-source soundscape manager) export KML for use in Google Earth?
- Investigate the potential of Mumble, TeamSpeak or Ventrilo for surround/3D positional audio within a rig, as well as for inter-rig voice communication.
- Construct a 3D model for calibrating LG rigs: essentially a cylinder with a test pattern and the angle of orientation written around it.
- Develop, and link to, a HowTo guide for geography teachers and school computer clubs.
- Further develop real-time in-world Google Earth avatars, currently prototyped using ViewSync->KML. Would be cool for classrooms.
- Develop a way of benchmarking overall Liquid Galaxy rig performance.
- Investigate BoyGrouping for clustered displays when using the vvvv real-time vis toolkit.
- Check multi-machine rendering with Most Pixels Ever and Polycode.
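The location-based sound-effect idea above needs a proximity test between the camera position and each geo-located clip. A minimal sketch using the haversine great-circle distance (the database layout and names are illustrative):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points,
    using a mean Earth radius of 6371 km."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def sounds_in_range(camera_lat, camera_lon, sound_db, radius_m=1000.0):
    """Return the clips whose trigger point lies within radius_m of the
    current camera position.  sound_db is a list of (lat, lon, clip)
    tuples, e.g. loaded from a KML export."""
    return [clip for lat, lon, clip in sound_db
            if haversine_m(camera_lat, camera_lon, lat, lon) <= radius_m]
```

Hooked up to the ViewSync stream (which carries the camera's lat/lon), this check could run on every packet and start or stop clips as the user flies around.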