RmsNorm
`RmsNorm` switches to a faster implementation when the input tensor is contiguous:
candle/candle-nn/src/layer_norm.rs
Lines 174 to 175 in 82b641f
But that implementation does not support the backward pass:
candle/candle-nn/src/ops.rs
Line 640 in 82b641f
Maybe it would be better to implement `ModuleT` rather than `Module` for `RmsNorm`, and use the faster implementation only when `train == false`?
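A minimal self-contained sketch of the proposed dispatch, assuming the `ModuleT`-style `forward_t(&self, xs, train)` shape: the fused kernel (assumed here to lack a backward pass) is taken only when `train == false` and the input is contiguous, otherwise a differentiable composed-ops fallback runs. The `Tensor` and `RmsNorm` types below are simplified stand-ins, not candle's actual types.

```rust
// Hypothetical illustration of train-flag dispatch; not candle's real API.
#[derive(Debug, Clone)]
struct Tensor {
    data: Vec<f32>,
    contiguous: bool,
}

trait ModuleT {
    fn forward_t(&self, xs: &Tensor, train: bool) -> Tensor;
}

struct RmsNorm {
    eps: f32,
}

impl RmsNorm {
    // Slow path: composed from primitive ops, so autograd would work.
    fn forward_slow(&self, xs: &Tensor) -> Tensor {
        let n = xs.data.len() as f32;
        let mean_sq = xs.data.iter().map(|v| v * v).sum::<f32>() / n;
        let scale = 1.0 / (mean_sq + self.eps).sqrt();
        Tensor {
            data: xs.data.iter().map(|v| v * scale).collect(),
            contiguous: xs.contiguous,
        }
    }

    // Fast path: stands in for a fused kernel with no backward pass.
    // Same math here; a real implementation would be a custom op.
    fn forward_fast(&self, xs: &Tensor) -> Tensor {
        self.forward_slow(xs)
    }
}

impl ModuleT for RmsNorm {
    fn forward_t(&self, xs: &Tensor, train: bool) -> Tensor {
        // Fused kernel only when no gradient is needed and layout allows it.
        if !train && xs.contiguous {
            self.forward_fast(xs)
        } else {
            self.forward_slow(xs)
        }
    }
}

fn main() {
    let norm = RmsNorm { eps: 1e-5 };
    let x = Tensor { data: vec![1.0, 2.0, 3.0], contiguous: true };
    let y = norm.forward_t(&x, false); // inference: fast path
    println!("{:?}", y.data);
}
```

Both paths compute the same values, so switching on `train` changes only which kernel runs, not the result.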