Improve handling of duplicate resources #9
I like the duplicate detection, but it could be slow, so a flag would be interesting here too. In general, finding duplicate resources in a glTF and simplifying it could be a function of the toolkit. As for performing LOD merging first, we could invert the order, but that would also mean making changes in all affected LODs when packing, which can be a little tricky; I like having LOD merging be the last step. I think my ideal solution would be to pack LOD zero, cache a hash of each resource when adding it to the final glTF, and use those hashes to de-dupe when packing the other LODs. More specifically:
That way we won't have dupes, and we only pack once, at the cost of hashing each file when packing (an optional parameter would be the map<resource hash, index of the final resource>). What do you think @erikdahlstrom?
Sounds good to me. I think the merger needs a bit more work to handle the case where only some resources are shared, especially with respect to getting the right offsets for all the dependent fields in the glTF JSON. I have the addition of the command-line flag on a branch, https://github.com/erikdahlstrom/glTF-Toolkit/tree/add_material_flag, and I'd be happy to submit it as a pull request if you want.
Thanks! I think you should send it as a PR once we have the hashing described above.
Closed by #14
Given a set of glTF input LODs that have the same materials for each LOD:
Some possible solutions: