
Modifications to work with PyTorch 0.4 #43

Closed
wants to merge 1 commit

Conversation

shawntan

@shawntan shawntan commented May 6, 2018

  • Removed Variable wrappers (Variable was merged into Tensor in PyTorch 0.4)
  • Modified embedding_dropout to use torch.nn.functional
  • Modified the loss computation in finetune.py
  • Modified repackage_hidden (see the sketch below)
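
For anyone reading along, here is a minimal sketch of what the 0.4-style repackage_hidden and embedded_dropout can look like. This is a simplified illustration of the pattern, not the exact diff in this commit:

```python
import torch
import torch.nn.functional as F

def repackage_hidden(h):
    # PyTorch 0.4 merged Variable into Tensor, so instead of
    # re-wrapping h.data in a new Variable we can detach directly.
    if isinstance(h, torch.Tensor):
        return h.detach()
    return tuple(repackage_hidden(v) for v in h)

def embedded_dropout(embed, words, dropout=0.1):
    # Drop whole embedding rows, then do the lookup through the
    # functional API instead of the old Embedding backend call.
    if dropout:
        mask = embed.weight.new_empty((embed.weight.size(0), 1)) \
                   .bernoulli_(1 - dropout).expand_as(embed.weight) / (1 - dropout)
        weight = mask * embed.weight
    else:
        weight = embed.weight
    return F.embedding(words, weight, embed.padding_idx, embed.max_norm,
                       embed.norm_type, embed.scale_grad_by_freq, embed.sparse)
```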

@salesforce-cla

salesforce-cla bot commented May 6, 2018

Thanks for the contribution! Before we can merge this, we need @shawntan to sign the Salesforce.com Contributor License Agreement.

@esvhd

esvhd commented Jun 1, 2018

When I tried to run the original code with PyTorch 0.4, in addition to the above I also saw a warning that torch.nn.functional.log_softmax() in splitcross.py should now explicitly specify dim=-1. Any chance we can add that to this PR too?
Thanks.
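
In case it helps, the fix is just passing the dimension explicitly. An illustrative snippet (the tensor shape here is made up, not the one in splitcross.py):

```python
import torch
import torch.nn.functional as F

logits = torch.randn(4, 10)  # e.g. a batch of class scores

# Before (0.4 emits an implicit-dimension deprecation warning):
# log_probs = F.log_softmax(logits)

# After: pass the softmax dimension explicitly.
log_probs = F.log_softmax(logits, dim=-1)
```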

@esvhd

esvhd commented Jun 2, 2018

We should also add dim=-1 to all torch.nn.functional.softmax() calls, e.g. in pointer.py. The same one-line change applies (again with hypothetical shapes, standing in for the scores in pointer.py):
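
```python
import torch
import torch.nn.functional as F

# Hypothetical attention scores, standing in for the ones in pointer.py.
scores = torch.randn(5, 100)

# Explicit dim=-1 silences the same implicit-dimension warning for softmax.
probs = F.softmax(scores, dim=-1)
```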

@keskarnitish
Contributor

keskarnitish commented Jun 13, 2018

@shawntan Thank you very much for this. 457a422 makes the repository PyTorch 0.4 compatible.
