Teleport: a real-time network protocol for VR #356
Comments
How will you approach publishing test code, specs and designs?
The current draft spec is at https://docs.teleportvr.io/protocol.html. The reference implementation is described at https://docs.teleportvr.io/reference/index.html, and the test code is at https://github.com/simul/Teleport.
Status of this? I believe you said you took this convo to another channel? cc @fire
I'm not aware of another channel - I will try to join the Wednesday 22nd meeting if there are any questions/comments.
Pulling in @technobaboo.
Dr. Kennedy spoke about the effort/intent during today's call. @fire has ideas for a Godot engine implementation.
I tagged others in the #metaverse-traversal channel. If this channel doesn't suffice, we can spin up a #omi-portals-pro-tem channel.
This sounds very much like https://xrfragment.org (which I'm contributing to), but IIRC Teleport is more of a server-client thing?
Teleport: a real-time network protocol for VR
Contributors
Summary
The protocol arose out of our work at Simul for game development studios. I became interested specifically in the challenge of opposing forces in VR: application complexity versus comfort (e.g. high-quality graphics versus lightweight headsets). It seemed that streaming might provide an answer, except for the latency problem. We investigated hybrid streaming: foregrounds rendered locally on the headset from streamed geometry and materials, and backgrounds (e.g. beyond 10 metres) rendered remotely on the server and streamed as video. This gets around many of the latency issues, because head motion, hand motion, etc. are handled locally, while more abstract logic is done server-side.
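To make the split concrete, here is a minimal sketch of the per-object routing decision a server might make. The names and the use of a simple distance threshold are illustrative assumptions, not part of the Teleport spec:

```cpp
// Hypothetical sketch of the hybrid-streaming split described above.
// Near objects are streamed as geometry/materials and rendered on the
// headset; distant objects are rendered server-side into the streamed
// video background. Names and the 10 m threshold are illustrative only.
#include <cmath>

struct Vec3 { float x, y, z; };

float Distance(const Vec3& a, const Vec3& b)
{
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

enum class RenderPath
{
    LocalGeometry, // stream mesh + materials; client renders, so head and
                   // hand motion stay in the local loop without latency
    RemoteVideo    // server renders into the streamed video background
};

RenderPath ChoosePath(const Vec3& objectPos, const Vec3& viewerPos,
                      float backgroundDistance = 10.0f /* metres */)
{
    return Distance(objectPos, viewerPos) < backgroundDistance
               ? RenderPath::LocalGeometry
               : RenderPath::RemoteVideo;
}
```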
This became a more general approach to the question: if there really were a spatial internet, what would the transport-level network protocol look like? The result we've prototyped, and which I want to propose to you, is this: a real-time protocol capable of generically streaming geometry, materials, video and audio from server to client, and control inputs from client to server. This type of protocol can support a very wide array of applications.
Example Use Cases
In typical usage, the user launches a thin-client application on their device and chooses or types a URL to connect to. They will be in the application within a few seconds.
For example, the app could be a game. The server sends a list of desired control mappings (to be matched via OpenXR to the user's hardware). It also sends "pose mappings", e.g. mapping the 3D controller objects to the OpenXR poses of the device controllers, so these move in the local loop without latency. The client is sent a lobby area, with high scores and instructions displayed as 3D text objects within the environment. Then play begins: the lobby is removed and the play area is sent. Object positions are sent as hierarchies of "poses" with attached mesh identifiers, and motion is sent as updates to these poses. This is typical game networking, only more generic.
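As a sketch of what such a pose-hierarchy update might look like: the type and field names below are hypothetical, not the actual Teleport wire format (which is defined in the draft spec linked above):

```cpp
// Hypothetical sketch of the pose-hierarchy updates described above: object
// positions are hierarchies of "poses" with attached mesh identifiers, and
// motion is sent as updates to those poses. All names are illustrative.
#include <cstdint>
#include <vector>

struct Vec3f      { float x, y, z; };
struct Quaternion { float x, y, z, w; };

struct Pose
{
    Vec3f      position;    // relative to the parent node
    Quaternion orientation;
};

struct NodeState
{
    uint64_t nodeId;   // identifies this node in the scene hierarchy
    uint64_t parentId; // 0 for a root node
    uint64_t meshId;   // mesh attached to this node; 0 if none
    Pose     pose;     // local pose within the parent's frame
};

// A per-frame update need only contain the nodes whose poses changed.
struct PoseUpdateMessage
{
    uint64_t               serverTimestampUs; // lets the client reorder and interpolate
    std::vector<NodeState> nodes;
};
```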
Another application could be a factory-floor setting. The factory's digital twin would be sent as geometry, and instructions for a training task would be sent as text objects for display. The user would attempt to press buttons or levers, pick up and move objects, etc. When an object is "picked up", it would be remapped to the user's local hierarchy, allowing it to be "carried" without latency (the server would receive updates on these positions, and remains the final arbiter of state).
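The pick-up handoff could be sketched as a reparenting command. Again, the names and the `PoseAuthority` flag are assumptions for illustration, not the protocol's actual messages:

```cpp
// Hypothetical sketch of the "pick up" handoff described above. On a grab,
// the server reparents the object's node from the world hierarchy to the
// client's local hand node, so the client can move it without round-trip
// latency; the server keeps receiving pose updates and remains the final
// arbiter of state. Names are illustrative only.
#include <cstdint>

enum class PoseAuthority : uint8_t
{
    Server, // server simulates and streams this node's pose
    Client  // client drives the pose locally and reports it upstream
};

struct ReparentCommand
{
    uint64_t      nodeId;      // the object being picked up
    uint64_t      newParentId; // e.g. the client's hand-controller node
    PoseAuthority authority;   // hand control of the pose to the client
};

// Server side: on a validated grab, hand the node to the client's hierarchy.
ReparentCommand MakeGrabCommand(uint64_t objectNode, uint64_t handNode)
{
    return ReparentCommand{ objectNode, handNode, PoseAuthority::Client };
}

// On release, the server would send the inverse command (authority back to
// Server, parent back to the world hierarchy) after validating final state.
```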
The above case is interesting from a deployment perspective: there would be no need to install a new version of the training application on each headset when it is updated. Like a website, only the server needs updating.
Implementation
The current version of the protocol is operational and supports a wide variety of behaviours. It supports one-way video streams, two-way audio streams, and one-way streams of geometry, materials and textures. It supports multiple clients per server, and is implemented on PC via Direct3D and OpenXR, and on Meta Quest via Vulkan and OpenXR.
The protocol operates four main connections between a client and server:
- a control connection carrying inputs and commands from client to server;
- a one-way video stream from server to client;
- a two-way audio stream;
- a one-way stream of geometry, materials and textures from server to client.
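As an illustrative sketch only (the enum values, direction flags and reliability choices are assumptions, not the spec), these four connections might be described to a client at session setup like this:

```cpp
// Hypothetical stream descriptors for the four connections listed above.
// Kinds, directions and reliability flags are illustrative assumptions.
#include <cstdint>

enum class StreamKind : uint8_t
{
    Control,  // client -> server: inputs, commands, acknowledgements
    Video,    // server -> client: rendered background layer
    Audio,    // bidirectional: voice and spatialised sound
    Geometry  // server -> client: meshes, materials, textures
};

enum class Direction : uint8_t { ClientToServer, ServerToClient, TwoWay };

struct StreamDescriptor
{
    StreamKind kind;
    Direction  direction;
    bool       reliable; // geometry favours reliability; video/audio favour latency
};

// A plausible default session layout, matching the streams described above.
constexpr StreamDescriptor kDefaultStreams[] = {
    { StreamKind::Control,  Direction::ClientToServer, true  },
    { StreamKind::Video,    Direction::ServerToClient, false },
    { StreamKind::Audio,    Direction::TwoWay,         false },
    { StreamKind::Geometry, Direction::ServerToClient, true  },
};
```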
What remains to get to a 1.0 is larger-scale use-case testing and an extensive examination of the structure and theory of the system, looking for potential roadblocks or scaling issues. As it has been some time since I last published scientific work, and that was in engineering rather than computer science, I would greatly appreciate some help with drafting and locking down the underpinnings of the spec.