
Gradient Descent, Signs of Healthy Training: Need to detach() tensor. #28

Open · ethman opened this issue May 24, 2021 · 0 comments
Labels: bug (Something isn't working)

ethman (Contributor) commented May 24, 2021

We need to detach a tensor in the "Signs of Healthy Training" section of the Gradient Descent page: the stored loss values are tensors that still require grad, so NumPy cannot convert them for plotting.

Here's the cell:

fig, ax = plt.subplots(1, 2, figsize=(15, 5))

for j, lr in enumerate(LEARNING_RATES):
    ax[0].plot(grad_norms[j], label=f'Learning rate: {lr}')
    ax[0].legend()
    ax[0].set_xlabel('Iteration')
    ax[0].set_ylabel('grad norm')
    ax[0].set_title('Gradient norm for each learning rate')
    
for j, lr in enumerate(LEARNING_RATES):
    ax[1].plot(np.log10(losses[j]), label=f'Learning rate: {lr}')
    ax[1].legend()
    ax[1].set_xlabel('Iteration')
    ax[1].set_ylabel('log(loss)')
    ax[1].set_title('Loss for each learning rate')
plt.show()

And the cell's current output:

---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-18-91a4bf5517c3> in <module>
      9 
     10 for j, lr in enumerate(LEARNING_RATES):
---> 11     ax[1].plot(np.log10(losses[j]), label=f'Learning rate: {lr}')
     12     ax[1].legend()
     13     ax[1].set_xlabel('Iteration')

/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/torch/tensor.py in __array__(self, dtype)
    619             return handle_torch_function(Tensor.__array__, (self,), self, dtype=dtype)
    620         if dtype is None:
--> 621             return self.numpy()
    622         else:
    623             return self.numpy().astype(dtype, copy=False)

RuntimeError: Can't call numpy() on Tensor that requires grad. Use tensor.detach().numpy() instead.
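As the error message suggests, one likely fix is to detach the loss tensors from the autograd graph before handing them to NumPy. A minimal sketch of the problem and the fix, assuming `losses[j]` is a list of scalar loss tensors appended directly from a training loop (the variable names here are illustrative, not the notebook's actual ones):

```python
import numpy as np
import torch

# Hypothetical loss history: scalar tensors that still carry grad history,
# as produced by appending `loss` (rather than `loss.item()`) each iteration.
losses_j = [torch.tensor(float(v), requires_grad=True) ** 2 for v in (3.0, 2.0, 1.0)]

# np.log10(losses_j) would raise:
#   RuntimeError: Can't call numpy() on Tensor that requires grad.
# Detach each tensor from the graph first, then convert:
loss_values = np.array([loss.detach().numpy() for loss in losses_j])
log_losses = np.log10(loss_values)
```

Alternatively, the training loop could append `loss.item()` (or `loss.detach()`) in the first place, so the plotting cell never sees tensors that require grad.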
@ethman ethman added the bug Something isn't working label May 24, 2021