
GLTFExporter: Ensure joints are uint8 or uint16. #13609

Merged: 2 commits into mrdoob:dev, Mar 21, 2018

Conversation

@donmccurdy (Collaborator)

I'm seeing some models loaded from other formats with e.g. Float32Array types for skinIndex, so we may want to convert them to the correct type before export.

/cc @looeee
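For context, a minimal sketch of the pattern being described, assuming a loader that stores skinIndex as floats (the geometry and values here are made up; glTF does not allow a float type for JOINTS_0):

// Hypothetical loader output: skinIndex stored as a float attribute.
var geometry = new THREE.BufferGeometry();
var skinIndices = new Float32Array( [ 0, 1, 2, 3 ] ); // one vec4 of bone indices per vertex

geometry.addAttribute( 'skinIndex', new THREE.BufferAttribute( skinIndices, 4 ) );

// With this PR, GLTFExporter converts such an attribute to an unsigned integer type
// before writing JOINTS_0.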

@donmccurdy (Collaborator, Author)

Might be good to enforce the rest of the types here too, I suppose.

@looeee (Collaborator) commented Mar 16, 2018

Seems like it's just the AssimpLoader, ColladaLoader, FBXLoader and SEA3DLoader that do this. Perhaps we should update those instead?

if ( attributeName === 'JOINTS_0' &&
	! ( attribute instanceof THREE.Uint16BufferAttribute ) &&
	! ( attribute instanceof THREE.Uint8BufferAttribute ) ) {

	attribute = new THREE.Uint16BufferAttribute( new Uint16Array( attribute.array ), attribute.itemSize, attribute.normalized );

}
@takahirox (Collaborator) commented Mar 16, 2018

I think, just in case, we need a warning here if we downsize from Uint32 or Float32, because the data can be corrupted if it has values over 0xFFFF (probably super rare, though). Checking the max value would be another option, though it's a bit expensive.
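A minimal sketch of the max-value check suggested here (illustrative only, not part of the PR as merged):

// Warn if any joint index would overflow after conversion to Uint16.
var array = attribute.array;

for ( var i = 0; i < array.length; i ++ ) {

	if ( array[ i ] > 0xFFFF ) {

		console.warn( 'GLTFExporter: skinIndex contains values above 0xFFFF; they will be corrupted by the Uint16 conversion.' );
		break;

	}

}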

@looeee (Collaborator)

Is there a max range for indices in WebGL? If not, is there any reason to prefer Uint16 over Uint32 here (besides memory)?

@takahirox (Collaborator) commented Mar 16, 2018

> Is there a max range for indices in WebGL?

Three.js handles skinning on the GPU. Bone matrices can be a texture or a uniforms array in the shader, and either limits the bone index range. So anything over 0xFFFF is definitely too large.

> is there any reason to prefer Uint16 over Uint32 here (besides memory)?

IMO, no. But we decided to aggressively check/verify parameters in the exporter, so this check plus a console warning is good to me.

Oops, I misread your question; I thought you asked about preferring Uint32 over Uint16. Anyway, yes: in glTF, skinIndex must be Uint8 or Uint16.

@takahirox (Collaborator)

> Perhaps we should update those instead?

IMO, we should update both the exporter and the loaders.

Regarding the exporter: we decided to aggressively check/verify parameters in exporters, so this check is good. Three.js doesn't restrict the skinIndex type (yet), so users can manually set Uint32 or Float32.

Regarding the loaders: a max bone count of 0xFFFF is large enough in general (I've never seen a model with over 1K bones), so using Uint16 is reasonable. Checking the max value in the array and choosing the appropriate type, as BufferGeometry.setIndex() does, would be even more robust and less problematic, though it'd be a bit more expensive.
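A sketch of the "choose the smallest sufficient type" idea mentioned above, similar in spirit to BufferGeometry.setIndex() (illustrative only; createJointAttribute is a made-up helper, not actual loader code):

// Pick Uint8Array or Uint16Array depending on the largest bone index present.
function createJointAttribute( skinIndices, itemSize ) {

	var max = 0;

	for ( var i = 0; i < skinIndices.length; i ++ ) {

		if ( skinIndices[ i ] > max ) max = skinIndices[ i ];

	}

	var TypedArray = ( max <= 0xFF ) ? Uint8Array : Uint16Array;

	return new THREE.BufferAttribute( new TypedArray( skinIndices ), itemSize );

}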

@looeee (Collaborator) commented Mar 16, 2018

Doing a bit more reading around, it seems like the max number of bones in WebGL is probably 128, and may be less on some devices. I guess that may increase over time, but it's unlikely to be anything like 65535 anytime soon, so Uint16 is more than sufficient.

@takahirox (Collaborator)

Where does 128 come from? Anyway, Uint16 should be OK in general.

Also, this PR is good to me because of the exporter policy of aggressively ensuring valid data.

@looeee (Collaborator) commented Mar 16, 2018

I presume it's because of this:

https://webglstats.com/webgl/parameter/MAX_VERTEX_UNIFORM_VECTORS

128 is the max number of vertex uniforms that you can assume a WebGL 1 device will be able to handle.
As far as I know, each bone takes up one uniform.

@takahirox (Collaborator) commented Mar 16, 2018

We use a texture for boneMatrices in the shader by default if the browser supports float textures. In that case the max bone count can be bigger. I've actually rendered models with 140 and 200 bones in Three.js.

I found that Three.js seems to limit the max bone count to 1024 when using a bone texture. Search for allocateBones in three.js.
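Roughly, the uniform-based branch of that logic derives the bone budget like this (a paraphrased sketch from memory, not the exact three.js source):

var renderer = new THREE.WebGLRenderer();

// Maximum vertex uniform vectors reported by the device, e.g. 128 on low-end WebGL 1.
var nVertexUniforms = renderer.capabilities.maxVertexUniforms;

// Reserve roughly 20 vectors for other uniforms; each bone matrix needs 4 vectors.
var maxBonesWithoutTexture = Math.floor( ( nVertexUniforms - 20 ) / 4 );

// When float vertex textures are supported, a fixed cap (1024) is used instead.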

So Uint16 should be good enough. The loaders should use Uint8/Uint16 for skinIndex, and I think it's also OK to enforce this in the exporter because of our policy, as I mentioned.

@looeee (Collaborator) commented Mar 16, 2018

Ah ok, good to know.
In any case I agree with strict data validation in the exporter.

@looeee (Collaborator) commented Mar 16, 2018

Wait though (and sorry, I know this is a bit off topic now): aren't the exact same devices that only support 128 vertex uniforms also likely not to support float textures?
Checking support for those, it's only around 70%: https://webglstats.com/search?query=oes_texture_float

So we are still back to having to assume that the limit for a low-powered device may be 128.

In any case, the upper limit for a modern GPU with float texture support will be much higher, so Uint16 seems reasonable.

@takahirox (Collaborator) commented Mar 16, 2018

That can be true for a low-powered (and maybe old) device.

But the exporter should care about the upper limit, rather than the lower limit, so that it doesn't lose data. Especially with glTF, optimization for such devices would be handled by a post-processing tool later in the pipeline.

And yeah, for the loaders Uint16 is reasonable.

@donmccurdy (Collaborator, Author) commented Mar 16, 2018

In addition to device limits, glTF disallows uint32 (because that many bones could not be consistently supported, I think). E.g. UE4 has a max of 65536 total bones, or 255 per mesh.

Only specific types are allowed for each attribute type: https://github.com/KhronosGroup/glTF/blob/master/specification/2.0/README.md#meshes
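For reference, a sketch of the JOINTS_0 accessor that ends up in the exported file; the spec allows only the two unsigned integer component types noted in the comments (the values below are examples, not the exporter's actual output):

// Example glTF accessor for JOINTS_0.
var jointsAccessor = {
	bufferView: 0,       // example index into bufferViews
	componentType: 5123, // 5123 = UNSIGNED_SHORT; 5121 = UNSIGNED_BYTE is the only other allowed type
	count: 1024,         // number of skinned vertices (example value)
	type: 'VEC4'
};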

@mrdoob (Owner) commented Mar 16, 2018

THREE.Uint16BufferAttribute already does a Uint16Array() inside.

@takahirox (Collaborator)

> In addition to device limits, glTF disallows uint32

Right. I misread @looeee's question and might have made you all a bit confused. I've updated my answer to that question above.

@takahirox (Collaborator)

> THREE.Uint16BufferAttribute already does a Uint16Array() inside.

Yeah, we can replace it with:

new THREE.Uint16BufferAttribute( attribute.array, attribute.itemSize, attribute.normalized );

@donmccurdy (Collaborator, Author)

Ok, removed the extra Uint16Array copy and added a warning to the console.

@mrdoob (Owner) commented Mar 17, 2018

This may be more robust...?

var array = attribute.array;
if ( attributeName === 'JOINTS_0' &&
	! ( array instanceof Uint16Array ) &&
	! ( array instanceof Uint8Array ) ) {

	console.warn( 'GLTFExporter: Attribute "skinIndex" converted to type UNSIGNED_SHORT.' );
	attribute = new THREE.BufferAttribute( new Uint16Array( array ), attribute.itemSize, attribute.normalized );

}

@donmccurdy (Collaborator, Author)

Done ✅

@mrdoob added this to the r92 milestone Mar 21, 2018
@mrdoob merged commit 40cf507 into mrdoob:dev Mar 21, 2018
@mrdoob (Owner) commented Mar 21, 2018

Thanks!

@donmccurdy deleted the bug-gltfexporter-uint-joints branch July 9, 2021