Browse the web in VR by live streaming a web page into Aframe using PhantomJS and ffmpeg.
I am using:
- Node v7.2.0
- PhantomJS v2.1.1
- Chrome v54
- ffmpeg v3.2
- OSX v10.12.1
This setup continuously renders a web page in PhantomJS and streams it into an Aframe VR scene, where it is drawn onto a canvas. User events (e.g. click, keypress) are forwarded back to PhantomJS, allowing the user to interact with the "browser". Not surprisingly, the performance is shit. This is just a proof of concept. My original idea was to port existing 2D code editing or text editing web applications into Aframe.
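To give a sense of the event forwarding, here's a sketch of the browser-side coordinate mapping. A click on the Aframe canvas has to be translated into coordinates on the page PhantomJS is rendering before it can be replayed there (PhantomJS exposes `page.sendEvent('click', x, y)` for exactly this). The function name and the simple scaled-rectangle assumption are mine for illustration, not code from this repo:

```javascript
// Sketch: map a click on the streamed canvas back to coordinates on the
// page PhantomJS is rendering, assuming both are plain scaled rectangles.
// mapClickToPage is a hypothetical helper, not part of this repo.
function mapClickToPage(clickX, clickY, canvasWidth, canvasHeight, pageWidth, pageHeight) {
  return {
    x: Math.round((clickX / canvasWidth) * pageWidth),
    y: Math.round((clickY / canvasHeight) * pageHeight),
  };
}

// The resulting coordinates would then travel over the WebSocket and be
// replayed in PhantomJS with page.sendEvent('click', coords.x, coords.y).
```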
Going forward I would like to look into SlimerJS instead of Phantom and even using virtual machines such as VirtualBox.
First, install PhantomJS with `npm install phantomjs-prebuilt -g` or `yarn global add phantomjs-prebuilt`. You can check that it's installed with `phantomjs --version`.
For Mac users, install ffmpeg with `brew install ffmpeg`. You can check that it's installed with `ffmpeg -version`. I'm not sure how you'd install ffmpeg on Windows or Linux, so you're on your own.
Next, install the Node dependencies with `npm install` or `yarn install`. This will automatically run `browserify public/packages.js > public/packages.combined.js` after it installs all of the Node dependencies - see the `postinstall` script in `package.json`.
This is a little complex to run. You'll need to run:
- A static file server to host the Aframe scene. You can start this with `npm start` or `yarn start`. These are just shortcuts for `node app.js`. It runs on port 3000.
- The WebSocket server that streams our mpeg data to the browser, adapted from jsmpeg. I've put the command for this into a script so you can just run `sh scripts/start-streaming-server.sh`. This listens for data from PhantomJS on port 8082 and allows the browser to connect via a WebSocket on port 8084. Once the browser connects, it sends all of the mpeg data to it.
- A script that runs PhantomJS, pipes the rendered PNG output to ffmpeg, and streams the resulting mpeg output to the server from step 2. I've put the command for this into a script as well.
Once all three are running, open `http://localhost:3000` in your browser. It will probably take a few seconds for streaming to start. If the 3D web page doesn't show up, try refreshing.
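The heart of the streaming relay in step 2 is just a fan-out: every chunk of mpeg data arriving from ffmpeg on port 8082 gets pushed to all currently connected WebSocket clients on port 8084. Here's a dependency-free sketch of that core logic; the real server (adapted from jsmpeg's stream-server.js) wires these callbacks up to actual HTTP and WebSocket connections, and `makeBroadcaster` is a name I've made up for illustration:

```javascript
// Sketch of the relay's fan-out: each chunk from ffmpeg goes to every
// connected client. In the real server the `send` callbacks would write
// to WebSocket connections; here they are plain functions.
function makeBroadcaster() {
  const clients = new Set();
  return {
    // Register a client; returns a function that disconnects it.
    addClient(send) {
      clients.add(send);
      return () => clients.delete(send);
    },
    // Called for each chunk of mpeg data received from ffmpeg.
    broadcast(chunk) {
      for (const send of clients) send(chunk);
    },
  };
}
```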
The PhantomJS script writes to two logfiles:
- `logs/main`, which shows general PhantomJS log messages
- `logs/page_errors`, which shows errors from the rendered page
I'd recommend tailing these logs as you work, with `tail -f logs/main` and `tail -f logs/page_errors`. The reason for logging to files instead of using the usual `console.log` is that we can't write anything to stdout except the raw PNG data, since stdout is being piped into ffmpeg.