Technical question #30
If you have a well-tessellated world, it should be possible to apply the distortion in the vertex shader. Bear in mind that this implies changing the current pipeline (e.g., `CardboardDistortionRenderer_renderEyeToDisplay()` should no longer be called). The GVR library only supported the same post-process distortion that Cardboard is doing now.
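Moving the distortion into the vertex shader means evaluating the lens's radial polynomial per vertex instead of per pixel in a post pass. A minimal sketch of that math (the `k1`/`k2` coefficients here are illustrative placeholders, not values from any real lens profile):

```python
import math

def distort_radius(r, k1=0.34, k2=0.55):
    """Radial barrel-distortion polynomial of the kind Cardboard uses:
    r' = r * (1 + k1*r^2 + k2*r^4). k1/k2 are illustrative, not real
    device coefficients."""
    r2 = r * r
    return r * (1.0 + k1 * r2 + k2 * r2 * r2)

def distort_vertex(x, y):
    """What a per-vertex distortion shader would do with a projected
    position: scale it by the distortion factor for its radius."""
    r = math.hypot(x, y)
    if r == 0.0:
        return (x, y)
    factor = distort_radius(r) / r
    return (x * factor, y * factor)

# The viewport center is unaffected; points further from the center
# are pushed outward progressively more (barrel distortion).
print(distort_vertex(0.0, 0.0))  # (0.0, 0.0)
print(distort_vertex(0.5, 0.0))
```

This is also why the world needs to be well tessellated: the distortion is only exact at the vertices, and long edges between sparse vertices would be interpolated linearly and visibly miss the curve.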
How does GVR handle async reprojection, then?
As mentioned here, async reprojection involves adjusting the position of the rendered frame just before it is seen by the user. This requires a special hook from the phone's display driver, ideally per-scanline. On Daydream-ready phones, Google VR Services is able to subscribe to this hook; all other phones and other apps do not have access to it.
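The core correction itself is simple rotational math: take the head pose the frame was rendered with, sample a newer pose right before scanout, and rotate the already-rendered image by the delta between the two. A sketch of that delta computation, reduced to yaw-only rotations with illustrative angles:

```python
import numpy as np

def yaw_matrix(theta):
    """Rotation about the vertical axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

# Head pose when the frame was rendered, and the newer pose sampled
# just before scanout (angles are illustrative).
render_pose = yaw_matrix(0.10)
scanout_pose = yaw_matrix(0.13)

# Corrective rotation: undo the render-time pose, apply the latest one.
# A reprojection pass rotates the finished frame by this delta.
delta = scanout_pose @ render_pose.T

# The delta recovers the 0.03 rad of head motion since render time.
recovered_angle = np.arctan2(delta[0, 2], delta[0, 0])
print(round(recovered_angle, 4))  # 0.03
```

The hard part is not this math but the timing: applying the delta warp late enough to matter, which is exactly what the display-driver hook enables.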
Hello, I managed to get half-screen warp running on select phones without the GVR service, but the main issue is that Android does not expose the exact timing information of the display, so I have to use the Choreographer workaround.
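The Choreographer workaround mentioned above amounts to estimating the refresh period from frame-callback timestamps and extrapolating when the next vsync will land. A rough sketch of that estimation, with synthetic nanosecond timestamps standing in for the values a `Choreographer.FrameCallback` would deliver on a ~60 Hz display:

```python
# Synthetic frame-callback timestamps (nanoseconds) for a ~60 Hz panel,
# standing in for what Android's Choreographer reports per frame.
timestamps = [0, 16_666_667, 33_333_333, 50_000_000, 66_666_667]

# Average the deltas between successive callbacks to smooth out jitter.
deltas = [b - a for a, b in zip(timestamps, timestamps[1:])]
period_ns = sum(deltas) / len(deltas)

# Predict the next vsync by extrapolating from the last callback.
next_vsync_ns = timestamps[-1] + period_ns

print(round(period_ns))      # ~16.67 ms -> 60 Hz
print(round(next_vsync_ns))
```

Because the callbacks fire on the UI thread after scheduling delays, this only approximates the true scanout time, which is why it is a workaround rather than a substitute for the driver hook.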
I am wondering: assuming you are only placing a 2D canvas (for example, a UI element) in VR 3D space, is it possible to use projective texturing to map that canvas onto the distortion mesh from the view of the 'HeadSpaceFromStartSpace' translation matrix?
Then you could create a simple shader that takes a 2D surface and a translation matrix for the surface's position in 3D space, and does all the distortion correction in one render pass.
I assume gvr_lib does something similar, but obviously it is not open source.
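The projective-texturing idea above reduces to this: treat the canvas like a projector, transform each distortion-mesh vertex by the canvas's view-projection matrix, and remap the resulting clip-space coordinates to UVs for sampling the 2D surface. A minimal sketch under those assumptions (the identity-view orthographic "projector" covering a 2x2 canvas is illustrative, not anything from the Cardboard SDK):

```python
import numpy as np

def project_to_uv(world_point, canvas_view_proj):
    """Projective texturing: transform a mesh vertex by the canvas's
    view-projection matrix, then remap clip space [-1, 1] to UV [0, 1].
    A shader would do the same per vertex or per fragment."""
    p = canvas_view_proj @ np.append(world_point, 1.0)
    ndc = p[:3] / p[3]  # perspective divide
    return (ndc[0] * 0.5 + 0.5, ndc[1] * 0.5 + 0.5)

# Illustrative projector: identity view looking down -Z with an
# orthographic projection covering a 2x2 canvas at the origin.
ortho = np.array([[1.0, 0.0,  0.0, 0.0],
                  [0.0, 1.0,  0.0, 0.0],
                  [0.0, 0.0, -1.0, 0.0],
                  [0.0, 0.0,  0.0, 1.0]])

print(project_to_uv(np.array([0.0, 0.0, -1.0]), ortho))  # center -> (0.5, 0.5)
print(project_to_uv(np.array([1.0, 1.0, -1.0]), ortho))  # corner -> (1.0, 1.0)
```

Mesh vertices whose projected UVs fall outside [0, 1] would simply not sample the canvas, so a single pass over the pre-distorted mesh could indeed composite the 2D surface with the lens correction baked in.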