Fix errors in beginner/blitz #79

Merged
merged 2 commits on May 17, 2017
+9 −9
@@ -36,9 +36,9 @@
 ``Variable`` and ``Function`` are interconnected and build up an acyclic
 graph that encodes a complete history of computation. Each variable has
-a ``.creator`` attribute that references a ``Function`` that has created
+a ``.grad_fn`` attribute that references a ``Function`` that has created
 the ``Variable`` (except for Variables created by the user - their
-``creator is None``).
+``grad_fn is None``).
 If you want to compute the derivatives, you can call ``.backward()`` on
 a ``Variable``. If ``Variable`` is a scalar (i.e. it holds a one element
@@ -61,8 +61,8 @@
 print(y)
 ###############################################################
-# ``y`` was created as a result of an operation, so it has a creator.
-print(y.creator)
+# ``y`` was created as a result of an operation, so it has a ``grad_fn``.
+print(y.grad_fn)
 ###############################################################
 # Do more operations on y
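The two hunks above track the autograd API rename: the attribute linking a ``Variable`` to the ``Function`` that produced it is now called ``grad_fn``. A minimal sketch of the renamed attribute in action, assuming the pre-0.4 ``Variable`` wrapper this tutorial targets:

    import torch
    from torch.autograd import Variable

    # A user-created Variable is a graph leaf, so its grad_fn is None.
    x = Variable(torch.ones(2, 2), requires_grad=True)
    print(x.grad_fn)  # None

    # A Variable produced by an operation references its creating Function.
    y = x + 2
    print(y.grad_fn)  # e.g. an AddBackward object

    # Calling .backward() on a scalar Variable populates x.grad.
    out = y.sum()
    out.backward()
    print(x.grad)  # d(out)/dx: a 2x2 tensor of ones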
@@ -157,15 +157,15 @@ def num_flat_features(self, x):
 # For example:
 output = net(input)
-target = Variable(torch.range(1, 10))  # a dummy target, for example
+target = Variable(torch.arange(1, 11))  # a dummy target, for example
 criterion = nn.MSELoss()
 loss = criterion(output, target)
 print(loss)
 ########################################################################
 # Now, if you follow ``loss`` in the backward direction, using its
-# ``.creator`` attribute, you will see a graph of computations that looks
+# ``.grad_fn`` attribute, you will see a graph of computations that looks
 # like this:
 #
 # ::
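The ``torch.range`` call also had to change because the two functions treat the end point differently: ``torch.range(1, 10)`` includes 10 and returns ten values, while ``torch.arange`` follows the Python convention and stops before the end, so matching the ten-element dummy target requires an end of 11. A quick size check, with ``torch.range`` shown only for comparison (it was deprecated in later releases):

    import torch

    a = torch.range(1, 10)   # inclusive end: 1.0, 2.0, ..., 10.0 (ten elements)
    b = torch.arange(1, 11)  # exclusive end: the same ten values
    print(a.size(), b.size())  # both torch.Size([10])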
@@ -181,9 +181,9 @@ def num_flat_features(self, x):
 #
 # For illustration, let us follow a few steps backward:
-print(loss.creator)  # MSELoss
-print(loss.creator.previous_functions[0][0])  # Linear
-print(loss.creator.previous_functions[0][0].previous_functions[0][0])  # ReLU
+print(loss.grad_fn)  # MSELoss
+print(loss.grad_fn.next_functions[0][0])  # Linear
+print(loss.grad_fn.next_functions[0][0].next_functions[0][0])  # ReLU
 ########################################################################
 # Backprop
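Beyond the rename, the traversal attribute changes too: ``next_functions`` on a backward ``Function`` is a tuple of ``(Function, input_index)`` pairs pointing to the functions that will receive its gradient. Instead of hand-chaining ``[0][0]`` lookups, the whole graph can be printed with a small recursive helper; ``walk`` below is a hypothetical sketch against the renamed API, not part of the tutorial:

    def walk(fn, depth=0):
        # Recursively print the backward graph reachable from a grad_fn.
        # Entries for leaf inputs that do not require grad appear as None.
        if fn is None:
            return
        print('  ' * depth + type(fn).__name__)
        for next_fn, _ in fn.next_functions:
            walk(next_fn, depth + 1)

    walk(loss.grad_fn)  # MSELoss -> Linear -> ReLU -> ... down to the leaves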