[Tutorial] Add BatchDotGrad to all-in-one example
f-dangel committed Aug 19, 2020
1 parent 15bfc65 commit da5fde9
Showing 1 changed file with 14 additions and 0 deletions.
14 changes: 14 additions & 0 deletions docs_src/examples/basic_usage/example_all_in_one.py
@@ -21,6 +21,7 @@
     KFLR,
     KFRA,
     PCHMP,
+    BatchDotGrad,
     BatchGrad,
     BatchL2Grad,
     DiagGGNExact,
@@ -91,6 +92,19 @@
     print(".grad.shape: ", param.grad.shape)
     print(".batch_l2.shape: ", param.batch_l2.shape)
 
+
+# %%
+# Dot products of individual gradients
+
+loss = lossfunc(model(X), y)
+with backpack(BatchDotGrad()):
+    loss.backward()
+
+for name, param in model.named_parameters():
+    print(name)
+    print(".grad.shape: ", param.grad.shape)
+    print(".batch_dot.shape: ", param.batch_dot.shape)
+
 # %%
 # It's also possible to ask for multiple quantities at once
 
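For context on the quantity this extension exposes: `BatchDotGrad` stores, for each parameter, the pairwise dot products of the individual (per-sample) gradients in the mini-batch, i.e. a Gram matrix of the N per-sample gradient vectors. The following is a minimal, framework-free sketch of that computation — `pairwise_grad_dots` and the toy values are hypothetical illustrations, not BackPACK code.

```python
def pairwise_grad_dots(individual_grads):
    """Return the N x N matrix of dot products between per-sample gradients.

    `individual_grads` is a list of N flat gradient vectors (lists of
    floats), one per sample in the mini-batch. Entry [i][j] of the result
    is the dot product of gradient i with gradient j.
    """
    return [
        [sum(a * b for a, b in zip(g_i, g_j)) for g_j in individual_grads]
        for g_i in individual_grads
    ]


# Toy example: N=2 samples, a 3-dimensional parameter (made-up numbers).
grads = [[1.0, 0.0, 2.0], [0.5, 1.0, 0.0]]
print(pairwise_grad_dots(grads))  # → [[5.0, 0.5], [0.5, 1.25]]
```

The diagonal of this matrix recovers the squared per-sample gradient norms (which is what `BatchL2Grad` in the example above provides), while the off-diagonal entries measure how aligned the gradients of different samples are.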
