
RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward. #31

Open
Bie401 opened this issue Mar 22, 2023 · 1 comment

Comments

Bie401 commented Mar 22, 2023

Getting the error below while running the backward pass:

RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward.

Code

# Forward pass
logits = xenc @ W
counts = logits.exp()
prob = counts / counts.sum(1, keepdim=True)
loss = -prob[torch.arange(5), ys].log().mean()
print(loss.item())

# Backward pass
W.grad = None
loss.backward()

# Update the weights
W.data += -0.1 * W.grad

Query:

Why are we performing the one-hot encoding for the input every time while iterating the forward pass?
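For what it's worth, this error usually appears when loss.backward() is called a second time without re-running the forward pass: PyTorch frees the graph's saved intermediate tensors after the first backward, so a second backward on the same graph fails unless retain_graph=True is passed. Below is a minimal sketch of a loop that rebuilds the graph each iteration; it assumes xs, ys, and W follow the bigram setup from the video (those names are my assumption, they are not defined in the snippet above).

import torch
import torch.nn.functional as F

# xs, ys: integer index tensors for the bigram examples (assumed)
# W: weight matrix created with requires_grad=True (assumed)
xenc = F.one_hot(xs, num_classes=27).float()  # xs is fixed, so this can be done once

for k in range(100):
    # forward pass -- rebuilds the autograd graph every iteration
    logits = xenc @ W
    counts = logits.exp()
    prob = counts / counts.sum(1, keepdim=True)
    loss = -prob[torch.arange(xenc.shape[0]), ys].log().mean()

    # backward pass
    W.grad = None
    loss.backward()

    # update the weights
    W.data += -0.1 * W.grad

On the query itself: the one-hot encoding does not strictly have to sit inside the loop, since xs never changes; if the video recomputes it there, that is presumably just to keep the whole forward pass together in one place. What does have to be repeated each iteration is everything from logits onward, because that is what builds the graph that backward() consumes.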

@JaredLevi18

Answering your question: no, in fact to begin with we are not using one-hot encoding. Based on what I've read, using embeddings is better, which is what is done in the video, and it happens when the vocabulary is built: the chars variable and the stoi and itos variables (if I remember their names correctly) are basically doing the embedding part. After that we iterate, first through the forward pass and then through the backward pass.
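For reference, here is a rough sketch (from memory, so names and details may differ from the notebook) of how the lookup tables and the one-hot encoding relate in the bigram example. As I understand it, stoi and itos only map characters to integer indices, and F.one_hot is what turns those indices into the vectors that get multiplied by W:

import torch.nn.functional as F

# build the character vocabulary and the two lookup tables
chars = sorted(list(set(''.join(words))))       # words: the list of training names (assumed)
stoi = {s: i + 1 for i, s in enumerate(chars)}  # character -> integer index
stoi['.'] = 0                                   # special start/end token
itos = {i: s for s, i in stoi.items()}          # integer index -> character

# the integer indices (xs) are one-hot encoded before being multiplied by W
xenc = F.one_hot(xs, num_classes=27).float()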
