
Better document how to return multiple values from forward #457

Merged
merged 4 commits into master on Jun 11, 2019

Conversation

3 participants
@BenjaminBossan (Collaborator) commented Apr 16, 2019

Add an elaborate example to the Advanced_Usage.ipynb
notebook. Furthermore, add a short entry in the FAQ that also links to
the notebook.

BenjaminBossan
Better document how to return multiple values from forward
Add an elaborate example to the Advanced_Usage.ipynb
notebook. Furthermore, add a short entry in the FAQ that also links to
the notebook.

@BenjaminBossan BenjaminBossan requested a review from ottonemo Apr 16, 2019

@BenjaminBossan BenjaminBossan self-assigned this Apr 16, 2019

@BenjaminBossan (Collaborator, Author) commented May 17, 2019

Fixes #428

@thomasjpfan (Member) left a comment

Small nits

One strategy would be to only use the decoded state for the loss and discard the encoded state. For this demonstration, we have a different plan: we would like the encoded state to be sparse. Therefore, we add an L1 loss of the encoded state to the reconstruction loss. This way, the net will try to reconstruct the input as accurately as possible while keeping the encoded state as sparse as possible.

To implement this, the right method to override is called `get_loss`, which is where `skorch` computes and returns the loss. It gets the prediction (our tuple) and the target as input, as well as some other things that we don't need to bother with. We create a subclass of `NeuralNetRegressor` that overrides said method and implements our idea for the loss.
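The override described in this cell can be illustrated without skorch or torch installed. In the following hypothetical, torch-free sketch, `BaseRegressor` stands in for skorch's `NeuralNetRegressor` (its `get_loss` computes a plain mean squared error over Python lists), and the penalty weight `lambda1 = 1e-3` is an assumed value:

```python
class BaseRegressor:
    """Hypothetical stand-in for skorch's NeuralNetRegressor:
    get_loss computes a plain mean squared error."""
    def get_loss(self, y_pred, y_true, **kwargs):
        return sum((p - t) ** 2 for p, t in zip(y_pred, y_true)) / len(y_true)

class SparseAutoencoderNet(BaseRegressor):
    """Overrides get_loss to add an L1 penalty on the encoded state."""
    lambda1 = 1e-3  # weight of the sparsity term (assumed value)

    def get_loss(self, y_pred, y_true, **kwargs):
        decoded, encoded = y_pred  # forward() returned a tuple
        loss_reconstruction = super().get_loss(decoded, y_true, **kwargs)
        loss_l1 = self.lambda1 * sum(abs(e) for e in encoded)
        return loss_reconstruction + loss_l1
```

With the real library, the subclass would derive from `NeuralNetRegressor` and compute the penalty with `torch.abs(encoded).sum()`; the tuple unpacking and the `super().get_loss(...)` call stay the same.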

@thomasjpfan (Member) commented May 18, 2019
"Don't need to bother with" is a bit aggressive? Maybe:

It gets the prediction (our tuple) and the target as input, as well as other arguments and keywords that we passthrough.

:func:`~skorch.net.NeuralNet.forward_iter` method (lazy).

For an example of how this works, have a look at this `notebook
<https://nbviewer.jupyter.org/github/dnouri/skorch/blob/master/notebooks/Advanced_Usage.ipynb#Multiple-return-values-from-forward>`_.

@thomasjpfan (Member) commented May 18, 2019
Suggested change
<https://nbviewer.jupyter.org/github/dnouri/skorch/blob/master/notebooks/Advanced_Usage.ipynb#Multiple-return-values-from-forward>`_.
<https://nbviewer.jupyter.org/github/skorch-dev/skorch/blob/master/notebooks/Advanced_Usage.ipynb#Multiple-return-values-from-forward>`_.
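The docs entry quoted above concerns a `forward` that returns multiple values. A hypothetical, torch-free stand-in (a real module would subclass `torch.nn.Module` and return tensors) sketches the pattern:

```python
class Autoencoder:
    """Hypothetical stand-in for a torch module whose forward returns
    two values: the reconstruction and the latent code."""
    def __init__(self, scale=0.5):
        self.scale = scale  # toy 'encoder' parameter (assumed)

    def forward(self, x):
        encoded = [xi * self.scale for xi in x]      # encoder step
        decoded = [e / self.scale for e in encoded]  # decoder step
        return decoded, encoded  # multiple return values, as a tuple

net = Autoencoder()
decoded, encoded = net.forward([2.0, 4.0])
```

As the FAQ entry describes, skorch passes such a tuple through: `predict` only uses the first of the returned values, while `forward` and the lazy `forward_iter` give access to all of them.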
@@ -37,7 +37,11 @@
" * [Accessing callback parameters](#Accessing-callback-parameters)\n",

@ottonemo (Collaborator) commented May 27, 2019

This cell does not reach the promised accuracy of 70%; maybe it would make sense to write an assertion so that such errors are caught directly?
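The suggested guard could be a hypothetical helper along these lines (`check_accuracy` is not part of the notebook; the 70% threshold mirrors the comment):

```python
def check_accuracy(y_pred, y_true, threshold=0.70):
    """Fail loudly if the notebook cell regresses below the promised accuracy."""
    accuracy = sum(p == t for p, t in zip(y_pred, y_true)) / len(y_true)
    assert accuracy >= threshold, (
        "expected at least {:.0%} accuracy, got {:.2%}".format(threshold, accuracy))
    return accuracy
```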



@BenjaminBossan (Author, Collaborator) commented Jun 7, 2019

👍


@ottonemo (Collaborator) commented May 27, 2019

Maybe instead of writing 'would raise an error', quote the actual error (makes it searchable).



@BenjaminBossan (Author, Collaborator) commented Jun 7, 2019

The error would be

AttributeError: 'tuple' object has no attribute 'size'

It is raised within PyTorch code. We could catch that, but this seems overly specific. We could think of another strategy to deal with that situation, but that would be out of scope for this PR.
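For context, the failure mode can be reproduced without PyTorch: loss functions such as `MSELoss` effectively call `.size()` on their input, and a plain tuple has no such attribute:

```python
# forward() returned a tuple, but the default criterion expects a tensor
preds = ([1.0, 2.0], [0.1, 0.2])
try:
    preds.size()  # roughly what a torch loss function does internally
except AttributeError as exc:
    message = str(exc)  # "'tuple' object has no attribute 'size'"
```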

@ottonemo (Collaborator) commented Jun 11, 2019

Yeah, I just thought that quoting the error in the notebook would make it searchable through Google if someone searches for "skorch AttributeError has no attribute size" or something similar.


@ottonemo (Collaborator) commented May 27, 2019

I would have liked a heatmap of the activations to add a bit of a visual element, but this is OK.


@BenjaminBossan (Author, Collaborator) commented Jun 7, 2019

The activations have shape (batch_size, 5), which doesn't lend itself well to visualization. Also, running the notebook currently doesn't require matplotlib; it's probably not worth adding that requirement just for this.

@BenjaminBossan (Author, Collaborator) commented Jun 7, 2019

@ottonemo pls review again

@ottonemo ottonemo merged commit 815ade3 into master Jun 11, 2019

2 checks passed:
Travis CI - Branch Build: Passed
Travis CI - Pull Request Build: Passed