
Technical question #30

Open
Consti10 opened this issue Apr 21, 2020 · 4 comments
Labels
question Further information is requested

Comments

@Consti10

I am wondering:
Assuming you are only placing a 2D canvas (for example a UI element) in VR 3D space, is it possible to use projective texturing to map that canvas onto the distortion mesh from the view of the 'HeadSpaceFromStartSpace' matrix?
Then you could write a simple shader that takes a 2D surface and a transformation matrix for the position of that surface in 3D space and does all the distortion correction in one render pass.
I assume the gvr lib does something similar, but obviously it is not open source.
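A rough sketch of what I have in mind (GLSL ES embedded in C++; the attribute, uniform, and matrix names are mine, not from the SDK): draw the distortion mesh, and instead of sampling an eye texture, projectively sample the 2D canvas.

```cpp
// Sketch only: projective texturing of a 2D canvas onto the distortion mesh.
// "uCanvasFromEye" is a hypothetical matrix (canvas projection * canvas pose
// relative to the eye); it is not part of the Cardboard/GVR API.
static const char* kVertexShader = R"(
  attribute vec2 aScreenPos;       // distortion-mesh vertex in clip space
  attribute vec2 aUndistortedUv;   // undistorted (tan-angle) coords for that vertex
  uniform mat4 uCanvasFromEye;     // projects an eye-space ray onto the canvas plane
  varying vec4 vCanvasCoord;
  void main() {
    gl_Position = vec4(aScreenPos, 0.0, 1.0);
    // Rebuild the eye-space view ray and project it onto the canvas.
    vCanvasCoord = uCanvasFromEye * vec4(aUndistortedUv, -1.0, 1.0);
  }
)";

static const char* kFragmentShader = R"(
  precision mediump float;
  uniform sampler2D uCanvasTexture;  // the 2D UI element
  varying vec4 vCanvasCoord;
  void main() {
    // Perspective divide turns the projected coordinate into UVs in [0,1].
    vec2 uv = (vCanvasCoord.xy / vCanvasCoord.w) * 0.5 + 0.5;
    // Outside the canvas: output transparent instead of clamping.
    if (uv.x < 0.0 || uv.x > 1.0 || uv.y < 0.0 || uv.y > 1.0) {
      gl_FragColor = vec4(0.0);
    } else {
      gl_FragColor = texture2D(uCanvasTexture, uv);
    }
  }
)";
```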

@jballoffet added the feature request (New feature or request) and question (Further information is requested) labels on May 28, 2020
@jballoffet
Member

If you have a well-tessellated world, it could be possible to distort using the vertex shader. Bear in mind that this implies changing the current pipeline (e.g., CardboardDistortionRenderer_renderEyeToDisplay() should not be called anymore).

The GVR library only supported the same post-process distortion that Cardboard does now.
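In case it helps future readers, a minimal sketch of what vertex-shader distortion could look like (a radial scale applied after projection; the coefficient names and values are placeholders, not taken from the Cardboard device parameters):

```cpp
// Sketch only: approximate inverse lens distortion applied per vertex.
// Requires a well-tessellated scene, and for simplicity ignores the
// per-eye lens-center offset (distortion is applied around the viewport center).
static const char* kDistortingVertexShader = R"(
  attribute vec4 aPosition;
  uniform mat4 uMvp;
  uniform vec2 uInverseDistortionK;  // placeholder k1, k2 of an r^2 polynomial
  void main() {
    vec4 clip = uMvp * aPosition;
    // Work in normalized device coordinates of this eye's viewport.
    vec2 ndc = clip.xy / clip.w;
    float r2 = dot(ndc, ndc);
    float scale = 1.0 + uInverseDistortionK.x * r2
                      + uInverseDistortionK.y * r2 * r2;
    clip.xy = ndc * scale * clip.w;
    gl_Position = clip;
  }
)";
```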

@Consti10
Author

How does GVR handle async reprojection then?

@chaosemer removed the feature request (New feature or request) label on Oct 8, 2020
@agalbachicar-gg
Contributor

As mentioned here, async reprojection involves adjusting the position of the rendered frame just before it is seen by the user. This requires a special hook from a phone's display driver, ideally per-scanline. On Daydream-ready phones, Google VR Services is able to subscribe to this hook; other phones and other apps do not have access to it.
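The geometric part (independent of the display hook) conceptually amounts to re-sampling the already rendered frame with the rotation delta between render time and scanout time. A rough sketch of that rotation-only correction, using GLM and made-up names (not something taken from the GVR sources):

```cpp
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

// Sketch only: build a texture-space correction matrix for rotational reprojection.
// render_pose  = head orientation (head-to-world) the frame was rendered with.
// scanout_pose = freshest head orientation just before the pixels are scanned out.
// eye_projection = the eye's projection matrix used for rendering.
// Applying the result to a clip-space position of the new (scanout) view, followed
// by a perspective divide, gives the lookup position in the already rendered frame.
// Positional (translation) changes are not corrected here.
glm::mat4 ReprojectionMatrix(const glm::quat& render_pose,
                             const glm::quat& scanout_pose,
                             const glm::mat4& eye_projection) {
  // Direction seen in the new view, expressed in the old (render-time) eye space.
  glm::quat render_from_scanout = glm::inverse(render_pose) * scanout_pose;
  return eye_projection * glm::mat4_cast(render_from_scanout) *
         glm::inverse(eye_projection);
}
```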

@Consti10
Author

Hello,
My question was more about how the gvr shaders do the 'position adjustment', since there is a lot of complex math involved. Unfortunately, this interesting part was not open-sourced, even though you can obviously check out John Carmack's original work:
repo

I managed to get half-screen warp running on select phones without the GVR service, but the main issue is that Android does not expose the exact timing information of the display, so I have to use the Choreographer workaround.
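For reference, that workaround looks roughly like this (NDK AChoreographer, API level 24+; the assumed vsync period and all names are placeholders, not measured values):

```cpp
#include <android/choreographer.h>  // requires API level 24+
#include <atomic>
#include <cstdint>

// Sketch only: estimate the next vsync from Choreographer callbacks, since
// Android exposes no exact per-scanline display timing to normal apps.
// kAssumedVsyncPeriodNs is a placeholder for the display's actual refresh period.
constexpr int64_t kAssumedVsyncPeriodNs = 16666667;  // ~60 Hz

std::atomic<int64_t> g_last_vsync_ns{0};

static void OnVsync(long frame_time_nanos, void* /*data*/) {
  g_last_vsync_ns.store(frame_time_nanos, std::memory_order_relaxed);
  // Re-register for the next frame; Choreographer callbacks are one-shot.
  AChoreographer_postFrameCallback(AChoreographer_getInstance(), OnVsync, nullptr);
}

// Must be called on a thread that has an ALooper (e.g., the UI thread).
void StartVsyncEstimation() {
  AChoreographer_postFrameCallback(AChoreographer_getInstance(), OnVsync, nullptr);
}

// Rough guess of when the next scanout starts, used to decide when to kick
// off the second half-screen warp pass.
int64_t PredictNextVsyncNs(int64_t now_ns) {
  const int64_t last = g_last_vsync_ns.load(std::memory_order_relaxed);
  const int64_t periods = (now_ns - last) / kAssumedVsyncPeriodNs + 1;
  return last + periods * kAssumedVsyncPeriodNs;
}
```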
