[Rendering] Use 16bit data types on the GPU where possible #356

Closed · 3 tasks · Tracked by #259
kanerogers opened this issue Sep 20, 2022 · 1 comment

kanerogers commented Sep 20, 2022

Background

Hotham's renderer is primarily memory-bandwidth bound on the Quest 2. Dropping to 16-bit data types would not only reduce the amount of data moved, but in general also lets the GPU operate on those values with a single instruction rather than two. This could be a very significant performance increase.

TODO

  • Modify the Vertex struct
  • Modify the asset importer to convert assets to the correct precision
  • Modify the shaders (a rough sketch covering all three items follows below)
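
Since the work lands in Hotham's Rust codebase (which drives Vulkan through ash), here is a minimal sketch of how the three items could fit together. The Vertex fields, attribute locations, and the use of the half crate are assumptions for illustration only; Hotham's actual Vertex struct and the changes in #415 may look quite different.

```rust
// Sketch only — field names and layout are assumptions, not Hotham's actual
// Vertex definition. Uses the `half` crate for f32 -> f16 conversion and
// `ash` for the Vulkan vertex input description.
use ash::vk;
use half::f16;

/// Hypothetical 16-bit vertex layout: position and UVs stored as half floats.
/// Padding the position to 4 components keeps it on R16G16B16A16_SFLOAT,
/// which has mandatory vertex-buffer support, unlike the 3-component format.
#[repr(C)]
#[derive(Clone, Copy)]
struct Vertex {
    position: [f16; 4], // xyz + 1.0 pad — 8 bytes instead of 12 for an f32 vec3
    uv: [f16; 2],       // 4 bytes instead of 8
}

/// Asset-importer side: narrow full-precision source data down to f16.
fn make_vertex(position: [f32; 3], uv: [f32; 2]) -> Vertex {
    Vertex {
        position: [
            f16::from_f32(position[0]),
            f16::from_f32(position[1]),
            f16::from_f32(position[2]),
            f16::from_f32(1.0),
        ],
        uv: [f16::from_f32(uv[0]), f16::from_f32(uv[1])],
    }
}

/// Pipeline side: the vertex input attributes must advertise 16-bit formats
/// so the vertex fetch reads the narrower data correctly.
fn vertex_attributes() -> [vk::VertexInputAttributeDescription; 2] {
    [
        vk::VertexInputAttributeDescription {
            location: 0,
            binding: 0,
            format: vk::Format::R16G16B16A16_SFLOAT,
            offset: 0,
        },
        vk::VertexInputAttributeDescription {
            location: 1,
            binding: 0,
            format: vk::Format::R16G16_SFLOAT,
            offset: 8, // directly after the 8-byte half-float position
        },
    ]
}

fn main() {
    let v = make_vertex([0.5, 1.0, -2.0], [0.25, 0.75]);
    // 12 bytes per vertex here vs. 20 bytes with an f32 position + f32 UV.
    println!("vertex size: {} bytes", std::mem::size_of_val(&v));
    let _attrs = vertex_attributes();
}
```

One design note: 16-bit vertex input formats are decoded by the fixed-function vertex fetch, so the vertex shader can keep reading ordinary 32-bit floats; doing arithmetic in half precision inside the shaders themselves is a separate step that requires the corresponding Vulkan 16-bit float features to be enabled on the device.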
kanerogers changed the title from "Use 16bit data types where possible" to "[Rendering] Use 16bit data types on the GPU where possible" on Sep 20, 2022
kanerogers (Collaborator, Author) commented:

Closed in #415
