
Missing 'InstanceNormalization' operator #18

Closed
gnsmrky opened this issue Dec 3, 2018 · 7 comments
Labels
enhancement (New feature or request), good first issue (Good for newcomers), help wanted (Extra attention is needed), operator (Related to one or more ONNX operators)

Comments

@gnsmrky

gnsmrky commented Dec 3, 2018

Lately Instance Normalization (IN) has become popular. Just curious whether there are plans to add IN? Would love to see it for the wasm backend.

ONNX.js operator list:
https://github.com/Microsoft/onnxjs/blob/master/docs/operators.md

BTW, the ONNX runtime does support IN:
https://github.com/onnx/onnx/blob/master/docs/Operators.md#InstanceNormalization
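
For reference, the ONNX spec defines InstanceNormalization as y = scale * (x - mean) / sqrt(variance + epsilon) + B, with the mean and variance computed per channel, per instance, over the spatial axes. A minimal NumPy sketch of that formula (the function name and the NCHW shape assumption are illustrative):

```python
import numpy as np

def instance_norm(x, scale, bias, epsilon=1e-5):
    # x: (N, C, H, W); scale, bias: (C,)
    # Mean/variance are taken per channel, per instance, over the spatial dims.
    mean = x.mean(axis=(2, 3), keepdims=True)
    var = x.var(axis=(2, 3), keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + epsilon)
    return scale.reshape(1, -1, 1, 1) * x_hat + bias.reshape(1, -1, 1, 1)
```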

gnsmrky changed the title from "Add support for 'InstanceNormalization' operator" to "Missing 'InstanceNormalization' operator" on Dec 3, 2018
fs-eire added the operator, enhancement, help wanted, and good first issue labels on Dec 3, 2018
@JamesOscar

Any idea if/when this will be added? It would be really useful for a project I'm currently working on.

@dahuang37

Hi, I would love to take on this issue. Is anyone else working on this already?

@hariharans29
Member

@gnsmrky and @JamesOscar - Thanks for your interest in ONNX.js. #82 should add the IN operator for the wasm backend and the default cpu backend.

@yhung119 - I already have a simple implementation of this. Please feel free to take a look and see if it can be improved. Additionally, the IN operator for the webgl backend can be tackled as well :). Thanks for your interest in contributing to ONNX.js.

@gnsmrky
Author

gnsmrky commented Jan 30, 2019

@hariharans29 #82 has a few errors. Is it still good to use, or should I wait for the official stable release?

I actually worked around this issue by reimplementing InstanceNorm in PyTorch, and it now runs on ONNX.js.

@hariharans29
Member

hariharans29 commented Jan 30, 2019

@gnsmrky

Thanks for your comment. It's great to know that you have a work-around to unblock yourself.

Some comments/questions -

  1. You are more than welcome to wait for the next official release, but as far as I can see, #82 (operator: InstanceNormalization operator for wasm and cpu backends) builds successfully and passes all tests, including the IN-op-specific tests. The red crosses you are probably referring to on the PR page pertain to the fact that it is still pending review, and hence not yet ready to merge. Barring a few minor changes from PR feedback, I expect it to be merged to master (and therefore be available in the next release). I have requested a review and will await feedback.

  2. In your comment, you mentioned that you were interested in the wasm backend. Is that still the case? Just FYI, #82 (operator: InstanceNormalization operator for wasm and cpu backends) only adds the IN op for the wasm backend and the default cpu backend.

@gnsmrky
Author

gnsmrky commented Jan 31, 2019

@hariharans29, thanks a lot! For background, I rewrote InstanceNorm using basic ops (a sketch of the idea follows the list below). It turned out the performance is not bad, since most basic ops are WebGL accelerated.

  1. I will check whether the wasm IN op works.

  2. Yes, I am still interested in wasm. I will run some benchmarks on small-core Intel CPUs.
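
A minimal PyTorch sketch of that kind of workaround: InstanceNorm written with basic tensor ops so the exported ONNX graph contains ReduceMean/Sub/Mul/Div/Sqrt nodes instead of an InstanceNormalization node. The module name and shapes below are illustrative, not the actual code from this thread:

```python
import torch
import torch.nn as nn

class BasicOpsInstanceNorm(nn.Module):  # hypothetical name, for illustration
    def __init__(self, num_channels, eps=1e-5):
        super().__init__()
        self.eps = eps
        self.weight = nn.Parameter(torch.ones(num_channels))
        self.bias = nn.Parameter(torch.zeros(num_channels))

    def forward(self, x):
        # x: (N, C, H, W); normalize each channel of each instance independently
        mean = x.mean(dim=(2, 3), keepdim=True)
        var = ((x - mean) ** 2).mean(dim=(2, 3), keepdim=True)
        x_hat = (x - mean) / torch.sqrt(var + self.eps)
        return self.weight.view(1, -1, 1, 1) * x_hat + self.bias.view(1, -1, 1, 1)
```

Since the exported graph then uses only basic ops that ONNX.js already implements, the WebGL backend can presumably accelerate it, which matches the performance observation above.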

@hariharans29
Member

Closing this, as #82 has been merged to master and addresses the missing IN op in ONNX.js. Currently, the cpu and wasm backends support the IN op.
