
Rendering into RGIntegerFormat textures doesn't work #27686

Closed
Aloalo opened this issue Feb 5, 2024 · 2 comments · Fixed by #27688

@Aloalo
Contributor

Aloalo commented Feb 5, 2024

Description

When creating a WebGLRenderTarget with RGIntegerFormat, a WebGL error occurs: WebGL: INVALID_VALUE: texImage2D: invalid internalformat (or GL_INVALID_OPERATION: Fragment shader output type does not match the bound framebuffer attachment type.). I managed to fix the issue by adding the following code to this function:

if ( glFormat === _gl.RG_INTEGER ) {

	if ( glType === _gl.UNSIGNED_BYTE ) internalFormat = _gl.RG8UI;
	if ( glType === _gl.UNSIGNED_SHORT ) internalFormat = _gl.RG16UI;
	if ( glType === _gl.UNSIGNED_INT ) internalFormat = _gl.RG32UI;
	if ( glType === _gl.BYTE ) internalFormat = _gl.RG8I;
	if ( glType === _gl.SHORT ) internalFormat = _gl.RG16I;
	if ( glType === _gl.INT ) internalFormat = _gl.RG32I;

}

I'm not sure if this is the only place where this change is needed?
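For reference, the mapping proposed above can be sketched as a standalone function. This is an illustration only, not three.js code: it uses the raw WebGL2 enum values (copied from the WebGL2 spec) so it can run outside a GL context, and the function name rgIntegerInternalFormat is hypothetical.

```javascript
// WebGL2 enum values, as defined in the WebGL2/OpenGL ES 3.0 headers.
const GL = {
	RG_INTEGER: 0x8228,
	BYTE: 0x1400,
	UNSIGNED_BYTE: 0x1401,
	SHORT: 0x1402,
	UNSIGNED_SHORT: 0x1403,
	INT: 0x1404,
	UNSIGNED_INT: 0x1405,
	RG8I: 0x8237,
	RG8UI: 0x8238,
	RG16I: 0x8239,
	RG16UI: 0x823A,
	RG32I: 0x823B,
	RG32UI: 0x823C,
};

// Sketch of the patch logic: pick a sized internal format for an
// RG_INTEGER texture based on the component type. Returns null for
// unsupported types.
function rgIntegerInternalFormat( glType ) {

	switch ( glType ) {

		case GL.UNSIGNED_BYTE: return GL.RG8UI;
		case GL.UNSIGNED_SHORT: return GL.RG16UI;
		case GL.UNSIGNED_INT: return GL.RG32UI;
		case GL.BYTE: return GL.RG8I;
		case GL.SHORT: return GL.RG16I;
		case GL.INT: return GL.RG32I;
		default: return null;

	}

}
```

These (format, type, internal format) triples match the sized-format combinations table in the OpenGL ES 3.0 specification, which is why each of the six integer types needs its own sized RG format.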

Reproduction steps

  1. Create a WebGLRenderTarget with RGIntegerFormat and RG8UI internal format.
  2. Render into it.

Code

jsfiddle

Live example

jsfiddle

Screenshots

No response

Version

r160

Device

No response

Browser

No response

OS

No response

@Mugen87
Collaborator

Mugen87 commented Feb 5, 2024

Would you mind filing a PR with your fix? The addition looks correct; getInternalFormat() is the only place where something is missing for supporting RGIntegerFormat.

BTW: In your fiddle, you are not creating the WebGLCubeRenderTarget instance correctly. The constructor has only two parameters, not three. Besides, position shader attributes are of type vec3, not vec4.

In any event, adding your code makes the fiddle work without errors.

@Aloalo
Contributor Author

Aloalo commented Feb 5, 2024

Ok, thanks, I'll create a PR.

Fixed the WebGLCubeRenderTarget construction in the fiddle for posterity: https://jsfiddle.net/h3e4zncb/1/
The position attribute is a vec4; I'm using a RawShaderMaterial. Even if the underlying buffer data is vec3, WebGL will automatically set the 4th component to 1.0.

From https://registry.khronos.org/OpenGL/specs/es/3.0/es_spec_3.0.pdf:

> Unspecified y and z components are implicitly set to 0.0 for floating-point array types and 0 for integer array types. Unspecified w components are implicitly set to 1.0 for floating-point array types and 1 for integer array types.
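The completion rule quoted above can be illustrated in plain JavaScript (this is not three.js or GL code, just a model of what the GPU does when a vec4 attribute is backed by a buffer with fewer components):

```javascript
// Model of OpenGL ES 3.0 vertex attribute completion for a float vec4
// attribute: unspecified y/z default to 0.0, unspecified w defaults to 1.0.
function completeAttribute( components ) {

	const [ x, y = 0.0, z = 0.0, w = 1.0 ] = components;
	return [ x, y, z, w ];

}

// A vec3 position buffer entry such as [ 2, 3, 5 ], read through a
// vec4 attribute, becomes:
completeAttribute( [ 2, 3, 5 ] ); // [ 2, 3, 5, 1 ]
```

This is why declaring `attribute vec4 position` in a RawShaderMaterial works even though the position buffer stores three floats per vertex.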
