When the input to Linear is 3-D, the Linear module calls flow.F.matmul.
Assume x is (N, C, T) and the weight is (C, D).
In oneflow/core/functional/impl/nn_functor.cpp:
// it will go this branch
if (a_shape->NumAxes() != b_shape->NumAxes()) {
  CHECK_EQ_OR_RETURN(b_shape->NumAxes(), 2)
      << "Not support number of dimensions of a being less than number of dimensions of b!";
  return OpInterpUtil::Dispatch<Tensor>(*bcast_matmul_op_, {a, b}, attrs);
}
Since the ranks of a and b differ, this dispatches to the broadcast matmul op.
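The rank-mismatch branch above means a 3-D activation times a 2-D weight goes through broadcast matmul rather than plain matmul. A minimal NumPy sketch of the same broadcasting rule (np.matmul treats the 2-D operand as stacked across the batch dimension; the shapes here are illustrative, not taken from the report):

```python
import numpy as np

# a plays the role of the 3-D activation, b the 2-D weight.
a = np.random.rand(4, 5, 3)   # batch of (5, 3) matrices
b = np.random.rand(3, 2)      # 2-D weight; rank differs from a

# np.matmul broadcasts the 2-D operand across the leading batch dim,
# analogous to what the bcast_matmul_op_ dispatch handles in OneFlow.
out = np.matmul(a, b)
print(out.shape)  # (4, 5, 2)

# Equivalent to multiplying each batch slice separately:
per_slice = np.stack([a[i] @ b for i in range(a.shape[0])])
print(np.allclose(out, per_slice))  # True
```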
In oneflow/core/autograd/gradient_funcs/matmul.cpp:
in_grads->resize(2);
if (ctx->requires_grad_b) {
  printf("requires grad b here!!!!!!");  // debug print added while investigating
  const auto& input_b = ctx->SavedTensors().at(ctx->b_index);
  in_grads->at(0) =
      JUST(OpInterpUtil::Dispatch<Tensor>(*grad_a_op_, {out_grads.at(0), input_b}, attrs_a));
}
Here the backward pass reads the wrong saved tensor and fails.
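The saved-tensor indexing matters because the two gradients need different saved inputs: the gradient w.r.t. a is built from b, and the gradient w.r.t. b is built from a. A hedged NumPy sketch of the broadcast-matmul backward (the formulas are standard calculus for out = a @ b with a broadcast 2-D b, not OneFlow's actual kernel):

```python
import numpy as np

a = np.random.rand(4, 5, 3)   # 3-D input
b = np.random.rand(3, 2)      # 2-D weight
g = np.random.rand(4, 5, 2)   # upstream gradient, same shape as a @ b

# grad w.r.t. a uses the saved tensor b: g @ b^T, batched.
grad_a = np.matmul(g, b.T)                  # shape (4, 5, 3)

# grad w.r.t. b uses the saved tensor a: sum of a^T @ g over the batch,
# because b was broadcast across the batch dimension in the forward.
grad_b = np.einsum('ntc,ntd->cd', a, g)     # shape (3, 2)

# Sanity check against an explicit per-slice accumulation.
check = sum(a[i].T @ g[i] for i in range(a.shape[0]))
print(np.allclose(grad_b, check))  # True
```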
example
When x.requires_grad = True, the code runs correctly; the failure shows up only when x does not require grad.
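One way the requires_grad flags can interact with saved-tensor indices: if a backward context only saves the tensors it will need, the index recorded for one input depends on whether the other was saved first. A toy Python sketch of that indexing hazard (the names Ctx/saved/a_index/b_index mimic the C++ capture logic above and are hypothetical, not OneFlow code):

```python
# Toy model of a capture context that saves tensors conditionally.
class Ctx:
    def __init__(self, requires_grad_a, requires_grad_b):
        self.saved = []
        # grad w.r.t. b needs a saved; grad w.r.t. a needs b saved.
        if requires_grad_b:
            self.a_index = len(self.saved); self.saved.append("a")
        if requires_grad_a:
            self.b_index = len(self.saved); self.saved.append("b")

# When both inputs require grad, the indices line up as expected.
both = Ctx(True, True)
print(both.saved[both.a_index], both.saved[both.b_index])  # a b

# When only the weight requires grad, b is never saved; any code path
# that still indexes into the saved tensors for b fails, as in the report.
only_b = Ctx(False, True)
print(hasattr(only_b, "b_index"))  # False
```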