the error of backpropagation #2906
xuhongxin started this conversation in Deep Learning
Replies: 0 comments
I was studying Section 2.5 (Automatic Differentiation), but when I run the following demo code, it reports the error below. What is the cause of this error?
The demo code:

```java
try (GradientCollector gc = Engine.getInstance().newGradientCollector()) {
    NDArray y = x.sum();
    gc.backward(y);
}
x.getGradient(); // Overwritten by the newly calculated gradient
```
The error:

```
ai.djl.engine.EngineException: MXNet engine call failed: MXNetError: Check failed: !AGInfo::IsNone(*i): Cannot differentiate node because it is not in a computational graph. You need to set is_recording to true or use autograd.record() to save computational graphs for backward. If you want to differentiate the same graph twice, you need to pass retain_graph=True to backward.
Stack trace:
  File "/Users/runner/work/djl/djl/src/imperative/imperative.cc", line 295
```
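For context, the MXNet message says the node being differentiated was never recorded into a computational graph. A likely cause, judging from the error text and the snippet above, is that `x` was never marked for gradient collection before the forward pass. The sketch below is a hedged guess at a working version, not a verified fix; it assumes `x` is created from an `NDManager` (the variable names and shapes here are illustrative, not from the original post):

```java
import ai.djl.engine.Engine;
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDManager;
import ai.djl.training.GradientCollector;

public class GradDemo {
    public static void main(String[] args) {
        try (NDManager manager = NDManager.newBaseManager()) {
            NDArray x = manager.arange(4f);
            // Mark x for gradient collection BEFORE the forward pass, so the
            // engine records operations on x into a computational graph.
            // Without this, backward() fails with "Cannot differentiate node
            // because it is not in a computational graph".
            x.setRequiresGradient(true);
            try (GradientCollector gc = Engine.getInstance().newGradientCollector()) {
                // The forward computation must also happen inside this scope,
                // which is what turns recording on.
                NDArray y = x.sum();
                gc.backward(y);
            }
            // Gradient of sum(x) with respect to x is a vector of ones.
            System.out.println(x.getGradient());
        }
    }
}
```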