Closing this issue in favor of per-release issues.
# 0.9.0 - nightly conv nets & transformers

- `reshape` function #55
- `Optimizer::update()` causes runtime panic #67
- Comparison against pytorch (patch version bump)
- Misc other generic const exprs functions (patch version bump)
- `concatenate` function #43
- `flatten` layer/functions #14
- Released v0.5.1 - Mnist example with linear MLP
  - `tape.add_operation` to `tape.add_backward_op` #23
  - `tape_holder` to `tape` #24
- Released v0.5.2 - RL examples & save/load
  - `max_last_dim()` #29
  - `gather_last_dim()` #30
- Released v0.6.0 - transformers prep & other additions
  - `Repeated<T, N>` for repeating a module N times #36
  - `nn::DropoutOneIn<N>` #38
  - `nn::LayerNorm` #37
  - `*_broadcast_rhs_first` functions #51
  - `nn::Residual<T>` #53