
[ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Got invalid dimensions for input #808

Closed
0x4E69676874466F78 opened this issue Aug 22, 2022 · 36 comments

@0x4E69676874466F78

An error occurred in an ONNX Upscale Image node:

[ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Got invalid dimensions for input: input for the following indices
 index: 1 Got: 1 Expected: 3
 Please fix either the inputs or the model.

Input values (partial):
• Image: RGB image 2176x2176
• Tile Size Target: 0

Models: https://github.com/armory3d/armorlab_models/releases/tag/2022.8
Simple project: armorlabtest.zip

@0x4E69676874466F78 0x4E69676874466F78 added the bug Something isn't working label Aug 22, 2022
@0x4E69676874466F78
Author

I've tried other Tile Size Targets, with no effect. It seems to be missing some additional parameters.

@joeyballentine
Member

We don't support models that take in extra parameters. I don't even know what this model is or what it's for. The only kinds of models we officially support are super-resolution models.

@0x4E69676874466F78
Author

@joeyballentine these models are for generating PBR textures (basecolor, height, normal, occlusion, roughness).
Where in the code could I set these parameters myself?

@joeyballentine
Member

It would be where we actually run the ONNX model. Though I don't know how to set the other parameters, as I haven't tried to do so before.

@0x4E69676874466F78
Author

As far as I understand, this is just the output size.

@joeyballentine
Member

It's missing two parameters

@0x4E69676874466F78
Author

[image]

@joeyballentine
Member

kha.Image.fromBytes is not where it actually runs the model

@theflyingzamboni
Collaborator

theflyingzamboni commented Aug 22, 2022

The issue is actually just with how we get the number of input channels. chaiNNer expects models to use dynamic axes, where the shape is ["batch_size", 3 (or whatever int value for the number of channels), "height", "width"], the values in quotes being string literals. So the check to get the shape order (bchw vs. bhwc) currently relies on determining which index (1 or 3) contains the int (which, because of the dynamic axes, is assumed to be in_nc). Since this model uses static axes, all the values are ints, and the code returns 1 (the value at index 0) as the number of input channels instead of 3. Hardcoding 3 allowed this model to run in my testing.

Basically, to get something like this to work, we need another way to get in_nc that doesn't assume dynamic axes. This may be resolvable by just checking that index 1 is an int and is <= 4. We probably won't get a model with a static height or width that low. I don't know if there's any order that would allow batch to be at index 1, but we currently don't account for that regardless.
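Something like that check could look roughly like this (a rough sketch, not chaiNNer's actual code; guess_in_nc is a hypothetical helper):

```python
# Rough sketch of the proposed check, not chaiNNer's actual code.
# With onnxruntime, the 4D input shape comes from session.get_inputs()[0].shape;
# dynamic dims are strings (e.g. "batch_size"), static dims are ints.

def guess_in_nc(shape):
    """Guess the number of input channels from a 4D ONNX input shape."""
    dim1, dim3 = shape[1], shape[3]
    # A small int (<= 4) at index 1 is almost certainly channels (bchw);
    # no realistic model has a static height or width that small.
    if isinstance(dim1, int) and dim1 <= 4:
        return dim1
    # Otherwise assume bhwc and check index 3.
    if isinstance(dim3, int) and dim3 <= 4:
        return dim3
    raise ValueError(f"Cannot determine input channels from shape {shape!r}")

print(guess_in_nc(["batch_size", 3, "height", "width"]))  # dynamic axes -> 3
print(guess_in_nc([1, 3, 2176, 2176]))                    # static axes  -> 3
```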

[image]

@joeyballentine
Member

Oh, that's interesting. I assumed it was like the Topaz models that took in other parameters (which we can't currently support)

So I guess we just need a way to check whether it has static axes instead of always assuming dynamic? Do you think that's possible?

@theflyingzamboni
Collaborator

theflyingzamboni commented Aug 22, 2022

As I edited in above (probably after you read it), we may just be able to check whether index 1 is an int <= 4, because I think it's unlikely that we'll get a model with a static height of 4. I don't know what other orders there might be, though.

@joeyballentine
Member

joeyballentine commented Aug 22, 2022

I see. Batch would always come first, I assume. bhwc and bchw are the two standard orders I know of; anything else would be nonstandard.

@joeyballentine
Member

So I think your idea would work. I don't think any model would be for that small of an image size

@0x4E69676874466F78
Author

@theflyingzamboni thanks!

@theflyingzamboni
Collaborator

theflyingzamboni commented Aug 22, 2022

@joeyballentine If you can't think of an issue, I can go ahead and PR that change.

(Side note, it's really weird that this model hard-codes a size of 2176x2176. What uses those dims, and why so limited?)

@0x4E69676874466F78
Author

@theflyingzamboni I just know that in Armorlab you can choose 2K, 4K, 8K, or 16K.

[image]

@joeyballentine
Member

joeyballentine commented Aug 22, 2022

What uses those dims, and why so limited?

It might be trained for that resolution specifically? Idk, maybe the arch is such that it needs to have that dimension.

@0x4E69676874466F78
Author

0x4E69676874466F78 commented Aug 22, 2022

@theflyingzamboni

Hardcoding 3 allowed this model to run in my testing.

        return convenient_upscale(
            img,
            in_nc,
            lambda i: self.upscale(i, session, split_factor, change_shape),
        )

in_nc -> 3?

If so, then the model starts but does not work correctly (photo_to_base, photo_to_normal).

@0x4E69676874466F78
Author

basecolor from armorlab:
[image]

@theflyingzamboni
Collaborator

theflyingzamboni commented Aug 22, 2022

@theflyingzamboni I just know that in Armorlab you can choose 2K, 4K, 8K, or 16K.

2K is 2048, though. 2176 just seems like such an unusual number to target. Not that it really matters, except that you can't pick a tile size, since the axes aren't dynamic.

basecolor from armorlab

Hmm, are you sure something isn't going wrong there? Those are definitely not the base colors of that texture; the chaiNNer output is much closer to the original in color. I also ran all the other models, and they look basically like I would expect them to, I think (though I'm not positive; the normal is a much darker blue than I'm used to).

@theflyingzamboni
Collaborator

How did you get access to this? Their download page just says "In construction".

@0x4E69676874466F78
Author

@theflyingzamboni

Hmm, are you sure something isn't going wrong there?

Yes. The base color should not have shadows and highlights; it is "flat" color. Normals... wait a sec...

How did you get access to this?

It needs to be built from source, or I can upload the finished build.

@0x4E69676874466F78
Author

@theflyingzamboni

[image]

[image]

@theflyingzamboni
Collaborator

Yeah, that looks more like what I'd expect from a normal. If it requires more than just getting the correct dimensions, this may or may not be something we can reasonably implement. Will need to look into it more.

@0x4E69676874466F78
Author

Maybe it works in a linear color space?

@0x4E69676874466F78
Author

0x4E69676874466F78 commented Aug 22, 2022

ArmorLabRelease.zip (7zip LZMA2)
Drop the models into the data/models folder.

  1. Drag and drop a texture onto the canvas.
  2. Connect color to color.
  3. Run.
  4. Switch to the 2D tab.
  5. In the menu: Mode -> Base color / Normal.
  6. Click on the Photo to PBR node.

@theflyingzamboni
Collaborator

theflyingzamboni commented Aug 22, 2022

https://github.com/armory3d/armorlab/blob/a3fbea595478637de59efac7f7f38c746e57a08f/Sources/arm/node/brush/PhotoToPBRNode.hx#L63-L76

I'm reasonably sure that this is not going to be possible to use in chaiNNer. From what I can tell here, the program does additional data transformations beyond just running the model, which places it outside the scope of chaiNNer.

@0x4E69676874466F78
Author

0x4E69676874466F78 commented Aug 22, 2022

Ah yes, I was careless, sorry.
But I'll ask the author what kind of magic he's doing there.

@joeyballentine
Member

Closing this for now as these models seem outside chaiNNer's scope

@0x4E69676874466F78
Author

[image]
[image]

@joeyballentine
Member

Yeah they definitely are doing stuff other than just purely running the model

@0x4E69676874466F78
Author

For every pixel: (pixel_channel / 255 - 0.5) / 0.5 = result in "linear space".
if pixel_channel = 0, then result = -1
if pixel_channel = 128, then result ≈ 0.00392...
if pixel_channel = 255, then result = 1
And then it does the reverse transformation: (result * 0.5 + 0.5) * 255.

@joeyballentine, is there a way to iterate over each pixel in this way?

@0x4E69676874466F78
Author

0x4E69676874466F78 commented Aug 24, 2022

@RunDevelopment How do you store a color image in f32?

@joeyballentine
Member

You can't iterate over the pixels with chaiNNer nodes alone. This is fairly simple with numpy array operations, though.
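For example (a rough sketch of the transformation described above; the placeholder img array and the model call are stand-ins, not actual chaiNNer code):

```python
import numpy as np

# Hypothetical stand-in for the input image: uint8 RGB, shape (H, W, 3).
img = np.zeros((2176, 2176, 3), dtype=np.uint8)

# (pixel_channel / 255 - 0.5) / 0.5 maps 0..255 -> -1..1 for every pixel at once.
normalized = (img.astype(np.float32) / 255.0 - 0.5) / 0.5

# ... run the model on `normalized` here ...

# Reverse transformation: (result * 0.5 + 0.5) * 255, back to uint8.
restored = ((normalized * 0.5 + 0.5) * 255.0).clip(0, 255).astype(np.uint8)
```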
