GLTFExporter: Ensure joints are uint8 or uint16. #13609
Conversation
Might be good to enforce the rest of the types here, too, I suppose.
Seems like it's just the JOINTS_0 attribute for now.
if ( attributeName === 'JOINTS_0' &&
	! ( attribute instanceof THREE.Uint16BufferAttribute ) &&
	! ( attribute instanceof THREE.Uint8BufferAttribute ) ) {

	// Convert the joints attribute to the widest type glTF allows.
	attribute = new THREE.Uint16BufferAttribute( new Uint16Array( attribute.array ), attribute.itemSize, attribute.normalized );

}
I think, just in case, we need a warning here if we downsize from Uint32 or Float32, because data can be broken if it has values over 0xFFFF (probably that's super rare, though). Checking the max value would be another option, though it's a bit expensive.
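To illustrate the "check the max value" option: a minimal sketch, using a hypothetical helper that is not part of the exporter, which scans the data before downcasting and warns when values would not survive conversion to Uint16:

// Hypothetical helper (not exporter code): scan the attribute data and
// only allow a downcast to Uint16 when every value fits in 16 bits.
function fitsInUint16( attribute ) {

	var array = attribute.array;

	for ( var i = 0; i < array.length; i ++ ) {

		// 0xFFFF (65535) is the largest value a Uint16Array can hold.
		if ( array[ i ] > 0xFFFF ) return false;

	}

	return true;

}

// Usage sketch: warn instead of silently corrupting data.
if ( ! fitsInUint16( attribute ) ) {

	console.warn( 'GLTFExporter: skinIndex values exceed 0xFFFF and will not survive conversion to UNSIGNED_SHORT.' );

}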
Is there a max range for indices in WebGL? If not, is there any reason to prefer Uint16 over Uint32 here (besides memory)?
Is there a max range for indices in WebGL?
Three.js handles skinning on the GPU. Bone matrices can be a texture or a uniforms array in the shader, and those can limit the bone index range. So I know anything over 0xFFFF is too large.
is there any reason to prefer Uint16 over Uint32 here (besides memory)?
IMO, no. But we decided to aggressively check/verify parameters in the exporter, so this check plus a console warning is good with me.
Oops, I misread your question. I thought you meant "prefer Uint32 over Uint16". Anyways, yes: in glTF, skinIndex must be Uint8 or Uint16.
IMO, we should update both the exporter and the loaders. Regarding the exporter, we decided to aggressively check/verify parameters in exporters, so this check is good. Three.js itself doesn't restrict the skinIndex type. Regarding the loaders, a max bone count of 0xFFFF is large enough in general (I've never seen a model with over 1K bones), so using Uint16 there is fine.
Doing a bit more reading around, it seems like the max number of bones in WebGL is probably 128, and may be less on some devices. I guess that may increase over time, but it's unlikely to be anything like 65535 anytime soon, so Uint16 is more than sufficient.
Where is 128 from? Anyways, this PR is good to me because of the exporter policy of aggressively ensuring valid data.
I presume it's because of this: https://webglstats.com/webgl/parameter/MAX_VERTEX_UNIFORM_VECTORS. 128 is the max number of vertex uniforms that you can assume a WebGL 1 device will be able to handle.
We use a texture for boneMatrices in the shader by default if the browser supports float textures. In that case the max bone number can be bigger. Actually I render models with 140 and 200 bones on Three.js. I found that Three.js seems to limit the max bone number to 1024 with the bone texture, so Uint16 is more than enough.
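To put numbers on that: a rough sketch, assuming the bone-texture layout where one mat4 of bone data occupies 4 RGBA float pixels, of how many bones fit in a square float texture:

// One bone matrix (mat4 = 16 floats) occupies 4 RGBA pixels in the
// float bone texture, so a square texture of side `size` holds:
function maxBonesForTexture( size ) {

	return ( size * size ) / 4;

}

console.log( maxBonesForTexture( 16 ) ); //   64 bones
console.log( maxBonesForTexture( 32 ) ); //  256 bones
console.log( maxBonesForTexture( 64 ) ); // 1024 bones, matching the limit mentioned above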
Ah ok, good to know.
Wait though (and sorry, I know this is a bit off topic now): aren't the exact same devices that only support 128 vertex uniforms also likely to not support float textures? So we are still back to having to assume that the limit for a low-powered device may be 128. In any case, the upper limit for a modern GPU with float texture support will be much higher, so Uint16 seems reasonable.
That can be true for a low-powered (and maybe old) device. But the exporter should care about the upper limit rather than the lower limit, so as not to lose data. Especially with glTF, optimization for such devices would be done by a post-processing pipeline tool. And yeah, for the loaders Uint16 is reasonable.
In addition to device limits, glTF disallows uint32 (because that many bones could not be consistently supported, I think). E.g. UE4 has a max of 65536 total bones, or 255 per mesh. Only specific types are allowed for each attribute: https://github.com/KhronosGroup/glTF/blob/master/specification/2.0/README.md#meshes
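For quick reference, a small illustrative snippet (not exporter code; the constant name below is made up) of the componentType values that spec section permits for JOINTS_0:

// JOINTS_0 must be a VEC4 accessor whose componentType is one of:
var VALID_JOINTS_COMPONENT_TYPES = {

	5121: 'UNSIGNED_BYTE',  // Uint8Array
	5123: 'UNSIGNED_SHORT'  // Uint16Array

};

// 5125 (UNSIGNED_INT / Uint32Array) and 5126 (FLOAT / Float32Array) are
// allowed for some other attributes, but not for JOINTS_0.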
Right. I misread @looeee's question and might have made you all a bit confused. I've updated the answer to the question.
Yeah, we can replace it with that.
(force-pushed from 905f339 to 655fc34)
Ok, removed the extra Uint16Array copy and added a warning to the console.
This may be more robust...?

var array = attribute.array;

if ( attributeName === 'JOINTS_0' &&
	! ( array instanceof Uint16Array ) &&
	! ( array instanceof Uint8Array ) ) {

	console.warn( 'GLTFExporter: Attribute "skinIndex" converted to type UNSIGNED_SHORT.' );
	attribute = new THREE.BufferAttribute( new Uint16Array( array ), attribute.itemSize, attribute.normalized );

}
Done ✅
Thanks!
Seeing some models loaded from other formats with e.g. Float32Array types for skinIndex, so we may want to convert that to the correct type before export. /cc @looeee
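A minimal sketch of what that pre-export conversion could look like, assuming a hypothetical helper name, a recent BufferGeometry API, and values that already fit in 16 bits:

// Hypothetical pre-export step: rebuild a skinIndex attribute that was
// loaded as Float32Array into the Uint16 form glTF expects.
function convertSkinIndexToUint16( geometry ) {

	var attribute = geometry.getAttribute( 'skinIndex' );

	if ( attribute && attribute.array instanceof Float32Array ) {

		geometry.setAttribute( 'skinIndex', new THREE.BufferAttribute(
			new Uint16Array( attribute.array ), attribute.itemSize, attribute.normalized
		) );

	}

}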