First off, I would like to say that this application is a prototype-turned-commercial-application-turned-open-source-project. While this history does not excuse it, the code can be slightly messy in places, and I hope to improve this over time (with your assistance, if you have the time).
Nonetheless, thank you for taking the time to read this document, and for your possible interest in helping out. Let me know if anything is unclear. The easiest ways to reach me are the following...
- I've opened an #onoffice channel in the AFrame Slack group.
- You can reach me via a direct message on Twitter at @rvdleun.
- Finally, you can also email me at firstname.lastname@example.org.
To create a virtual environment where Windows, Mac and Linux users can work in peace using any HMD that supports WebVR.
On/Office can promote the pros of building a VR application with standard browser technology. For years, we've been building apps that work on nearly any device, from mobile to desktop. WebVR offers this as well, letting us build something that can work on anything from a Cardboard to a Vive.
This is why it's key that every feature in On/Office should be accessible on any device, regardless of whether it is 3DOF or 6DOF, or has controllers. The Electron application allows us to develop an interface that the user can access to control their environment.
However, this doesn't mean we should ignore hardware with more capabilities. If a feature can be enhanced with controller input, it should be. My personal preference, though, is to first get something working on less capable hardware and then move upwards.
## What can I contribute?
The README.md file contains a roadmap of features I would like to implement. Please check the issues page for progress on a feature and indicate if you want to put some work into it.
## Frameworks and links
- The project is divided into two parts: the desktop application (which runs on the computer and acts as a server) and the WebVR client that runs in a browser.
- The desktop application is created with Electron, with its code developed in Angular. The client files are served using Express.
- The VR client is developed with Vue.js for the splash screen and AFrame for the WebVR part.
- The Angular2-Electron-Boilerplate was used as a starting point.
- Communication between the application and the client is done via socket.io and WebRTC.
Clone this repository locally:

```shell
git clone https://github.com/stokingerl/Angular2-Electron-Boilerplate.git
```
New builds are automatically generated after a new commit in the master branch, using Codeship. The results are uploaded to an Amazon S3 bucket.
At the moment, I have to admit that I do most of my development and testing on my MacBook. I try to test on Linux and Windows as thoroughly as possible, but not as much as I should or want to.
Due to some issues with running the standard commands (in particular, the build being 3x as big as it should be), a separate project in the `build` directory is set up to facilitate new builds.
Take the following steps to create a new build...
- Enter the `build` directory and run `sh prepare-build.sh`. This will prepare the source files in the directory.
- Next, run `package-windows` to build a version for your platform.
- The result can be found in the
The application presents a user interface that lets the user set up a 360° panoramic image, pick which screen is going to be displayed in the virtual environment, and set a numeric pincode for additional security.
Once the user starts streaming, an Express web server starts serving the files for the client. After the user has logged in, a user interface pops up that lets them resize and center the screen in the virtual world.
### To start development
Run `npm start-windows` to build and run the project.
This will open the application locally. There is currently no hot-reloading available, so you will need to close the application and rerun the command to see your desktop changes.
- src/app - Angular source files
- src/assets - Assets (images, fonts, etc.)
- src/electron - Electron source files
The app has only a single page, implemented with two separate screens. The `MainPageComponent` is responsible for which screen is displayed and for playing the right animation when switching.
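The idea behind the screen switching can be sketched in plain JavaScript. This is not the actual Angular code; the class, the screen names, and the animation naming scheme are all illustrative.

```javascript
// Illustrative sketch of a single page that tracks which of two screens
// is visible and which animation to play when switching between them.
class MainPage {
  constructor() {
    this.screen = 'settings'; // the first screen the user sees
  }

  // Switch to the other screen and return the animation to play.
  toggle() {
    const from = this.screen;
    this.screen = from === 'settings' ? 'streaming' : 'settings';
    return `${from}-to-${this.screen}`;
  }
}

const page = new MainPage();
console.log(page.toggle()); // 'settings-to-streaming'
console.log(page.toggle()); // 'streaming-to-settings'
```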
Contains all components required for the settings screens, the first screen that the user sees.
Contains all components required when the user is actively sharing their desktop.
Contains services that are required by multiple components.
Contains the main.js Electron script that initializes the application.
Contains component files, each one responsible for a certain feature within the application.
The client is the browser application that the user visits on their headset to view the Virtual Office.
### To start development
- Run `npm start-windows` to open the Electron app.
- Press 'Start Streaming'
- Open a browser and connect to
Note that you won't need to rerun the start command. Any changes made in the `client` directory will be reflected immediately when you refresh the page.
- client/assets - Assets (images, fonts, etc.)
- client/components - AFrame components
- client/splash-screen - Vue scripts that handle the splash screen
- client/systems - AFrame systems
- client/vendor - All vendor files
- When adding `?no-source` to the URL, no video will be streamed and the splash screen is automatically removed once the client has finished connecting. This was added to ease development.
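Detecting such a flag client-side is straightforward with the standard WHATWG URL API (available in browsers and Node alike). The helper name below is ours, not the project's.

```javascript
// Sketch: check whether the ?no-source flag is present in a page URL.
function hasNoSourceFlag(href) {
  return new URL(href).searchParams.has('no-source');
}

console.log(hasNoSourceFlag('http://localhost:8080/?no-source')); // true
console.log(hasNoSourceFlag('http://localhost:8080/'));           // false
```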
This section covers how a connection is set up between the desktop and the VR headset. In the code (and in this section), the desktop is referred to as the host, and the headset connection as the client.
- (host) After the user has pressed 'Start Streaming', `setWebServerActive` is called in webserver.component.js. This boots the Express server to serve the client's files, starts socket.io and initialises the PeerJS server.
- (client) Once the user has opened the client in the browser, it first initialises the VR scene and then connects to the host via socket.io. This is coded in splash-screen.js.
- (client) After receiving a `client_message` (indicating that the host has approved and wants to start streaming), the `setup()` function in webrtc.system.js is called.
- (client) A connection to the PeerJS server is made. Once it has been opened, the client receives a random ID to identify itself. This ID is sent to the host.
- (host) The `setupConnection()` function in stream.service.ts handles the client ID, along with the sources it has to stream. At the moment, streaming only one display is supported.
- (host) A stream is created. The host also connects to PeerJS at this point. Once this is done, it uses `call()` to connect to the client ID.
- (client) Back in webrtc.system.js, the call is received and answered. This results in the stream being available in the