Conversation

@gprateek93
Collaborator

This commit adds support for the aten.native_layer_norm operation. Here
the previous code for aten.layer_norm is tweaked a little to
accommodate the mean and variance values along with the layer norm
value. This commit also adds a decomposition of aten.layer_norm into
aten.native_layer_norm, which was previously getting lowered directly
to linalg.

Signed-off-by: Prateek Gupta <prateek@nod-labs.com>
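
For context, here is a minimal PyTorch sketch (not the torch-mlir lowering itself) of the relationship this commit encodes; the tensor shapes, weight/bias values, and eps below are illustrative assumptions:

```python
import torch

# Illustrative inputs (shapes, weight/bias, and eps are arbitrary choices for this sketch).
x = torch.randn(2, 3, 4)
normalized_shape = [4]
weight = torch.ones(4)
bias = torch.zeros(4)
eps = 1e-5

# aten.native_layer_norm returns the normalized output together with the
# statistics computed over the normalized dimensions (in upstream PyTorch,
# the mean and the reciprocal standard deviation, rstd).
out, mean, rstd = torch.ops.aten.native_layer_norm(
    x, normalized_shape, weight, bias, eps)

# aten.layer_norm decomposes into aten.native_layer_norm by keeping only
# the first result and dropping the statistics.
ref = torch.ops.aten.layer_norm(x, normalized_shape, weight, bias, eps)
assert torch.allclose(out, ref)
```

With such a decomposition in place, only aten.native_layer_norm needs a direct lowering to linalg.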

@gprateek93 force-pushed the prateek/aten-native-layer-norm branch from 238f10e to ef74c03 on December 10, 2021 13:14
@gprateek93 force-pushed the prateek/aten-native-layer-norm branch from ef74c03 to a8a8389 on December 10, 2021 13:15
@gprateek93 merged commit cfc8de3 into main on December 10, 2021
@gprateek93 deleted the prateek/aten-native-layer-norm branch on December 10, 2021 16:52
qedawkins pushed a commit to nod-ai/torch-mlir that referenced this pull request Oct 3, 2022

* Handle optional parameters in custom function translation

Signed-off-by: Ganesan Ramalingam <grama@microsoft.com>

* Modify function name generation

Signed-off-by: Ganesan Ramalingam <grama@microsoft.com>

* clang format

Signed-off-by: Ganesan Ramalingam <grama@microsoft.com>