[ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Got invalid dimensions for input #808
I've tried other Tile Size Targets, with no effect. It seems to lack some more parameters.
We don't support things that take in parameters. I don't even know what this model is or what it's for. The only kinds of models officially supported are super-resolution models.
@joeyballentine these models are for generating PBR textures (basecolor, height, normal, occlusion, roughness).
It would be where we actually run the ONNX model. Though I don't know how to actually set the other parameters, as I have not tried to do so before.
As far as I understand, this is just the output size.
It's missing two parameters.
Yes, output size (width, height, default: 2048, 2048).
kha.Image.fromBytes is not where it actually runs the model.
The code is quite confusing, scattered over many subprojects.
Yes. https://github.com/armory3d/armorlab/blob/a3fbea595478637de59efac7f7f38c746e57a08f/Sources/arm/node/brush/PhotoToPBRNode.hx#L64
The issue is actually just with how we get the number of input channels. chaiNNer expects models to use dynamic axes, where the shape is ["batch_size", 3 (or whatever int value for the number of channels), "height", "width"], the values in quotes being string literals. So the check that determines the shape order (bchw vs. bhwc) currently relies on finding which index (1 or 3) contains the int, which, because of the dynamic axes, is assumed to be in_nc. Since this model uses static axes, all values are ints, and the code returns 1 (at index 0) as the number of input channels instead of 3. Hardcoding 3 allowed this model to run in my testing.

Basically, to get something like this to work, we need another way to get in_nc that doesn't assume dynamic axes. This may be resolvable by just checking that index 1 is an int and is <= 4; we probably won't get a model with a static height or width that low. I don't know if there's any order that would allow batch to be at index 1, but we currently don't account for that regardless.
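A minimal sketch of the heuristic described above (function and variable names are illustrative, not chaiNNer's actual code). Dynamic-axes models report dims like ["batch_size", 3, "height", "width"], while static-axes models report all ints, e.g. [1, 3, 2176, 2176]:

```python
def get_input_channels(shape) -> int:
    """Infer in_nc from an ONNX input shape given as a list whose entries
    are either ints (static dims) or strings (dynamic dim names).

    Channels sit at index 1 (bchw) or index 3 (bhwc). A real channel
    count is a small int (1-4), while a static height/width would be
    far larger, so "int and <= 4" distinguishes the two.
    """
    for idx in (1, 3):
        dim = shape[idx]
        if isinstance(dim, int) and dim <= 4:
            return dim
    raise ValueError(f"Could not infer channel count from shape {shape}")
```

With this, the dynamic bchw case, a static bchw model like this one (1, 3, 2176, 2176), and a static bhwc model all resolve to 3 channels without hardcoding.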
Oh, that's interesting. I assumed it was like the Topaz models that take in other parameters (which we can't support currently). So I guess we just need a way to check whether it has static axes instead of always assuming dynamic? Do you think that's possible?
As I edited in above (probably after you read it), we may just be able to check whether index 1 is an int <= 4, because I think it's unlikely that we will get a model with a static height of 4. I don't know what all the other orders are, though.
I see. Batch would always come first, I assume. bhwc and bchw are the two standard ones I know of. Anything else would be nonstandard.
So I think your idea would work. I don't think any model would be for that small of an image size.
@theflyingzamboni thanks!
@joeyballentine If you can't think of an issue, I can go ahead and PR that change. (Side note: really weird that this model hard-codes a size of 2176x2176. What uses those dims, and why so limited?)
@theflyingzamboni I just know that in ArmorLab you can choose 2K, 4K, 8K, or 16K.
It might be trained for that resolution specifically? Idk, maybe the arch is such that it needs to have that dimension.
in_nc -> 3? If so, then the model starts but does not work correctly (photo_to_base, photo_to_normal).
2k is 2048 though. 2176 just seems like such an unusual number to target. Not that it really matters, except you can't pick a tile size, since the axes aren't dynamic.
Hmm, are you sure something isn't going wrong there? Those are definitely not the base colors of that texture; the chaiNNer output is much closer to the original in color. I also ran all the other models, and they look basically like I would expect them to, I think (though I'm not positive; the normal is a much darker blue than I'm used to).
How did you get access to this? Their download page just says "In construction".
Yes. The base color should not have shadows and highlights; it is "flat" color. Normals... wait a sec...
It needs to be built from source, or I can upload the finished build.
Yeah, that looks more like what I'd expect from a normal. If it requires more than just us getting correct dimensions, this may or may not be something we can reasonably implement. Will need to look into it more.
Maybe it works in a linear color space?
ArmorLabRelease.zip (7zip LZMA2)
I'm reasonably sure that this is not going to be possible to use in chaiNNer. From what I can tell, the program performs additional data transformations beyond just running the model, which places it outside the scope of chaiNNer.
Ah yes, I was careless, sorry.
Closing this for now as these models seem outside chaiNNer's scope.
Yeah, they are definitely doing stuff other than just purely running the model.
For every pixel: (pixel_channel / 255 - 0.5) / 0.5 = result in "linear space". @joeyballentine, is there a way to iterate over each pixel like this?
@RunDevelopment How do you store a color image in f32?
You can't iterate over the pixels with chaiNNer nodes alone, but this is fairly simple with numpy array operations.
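A minimal numpy sketch of that mapping, assuming an HxWxC uint8 image (function names are illustrative): the formula above maps [0, 255] to [-1, 1] as float32, vectorized so no per-pixel loop is needed.

```python
import numpy as np

def to_linear(img: np.ndarray) -> np.ndarray:
    # (pixel / 255 - 0.5) / 0.5 maps uint8 [0, 255] -> float32 [-1, 1]
    return (img.astype(np.float32) / 255.0 - 0.5) / 0.5

def from_linear(arr: np.ndarray) -> np.ndarray:
    # Inverse transform back to uint8 for display/saving
    return np.clip((arr * 0.5 + 0.5) * 255.0, 0, 255).astype(np.uint8)
```

Storing the color image in f32 is just the `astype(np.float32)` cast; the whole array stays in float until converted back.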
Models: https://github.com/armory3d/armorlab_models/releases/tag/2022.8
Simple project: armorlabtest.zip