
added accUpdate to nn.LookupTable #33

Merged
merged 1 commit into from
Jul 17, 2014
Conversation

nicholas-leonard (Member)

This PR adds the method accUpdateOnly() to nn.LookupTable, which allows the LookupTable to function without the gradWeight tensor. This is useful when you want to save memory (especially on GPU), as this module can use up a large proportion of your available memory. Unit tests are included.
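To illustrate the idea behind the PR, here is a minimal sketch in Python (the actual module is Torch/Lua, and all names below are illustrative, not the real API): in accUpdate-only mode, a lookup table never allocates a gradient buffer the same size as its weight matrix; instead, the gradient is applied directly to the rows selected in the last forward pass.

```python
# Sketch of the accUpdate idea: skip allocating gradWeight and apply the
# update directly to the looked-up embedding rows.  Pure Python for
# illustration only; names do not match the Torch/Lua implementation.

class LookupTable:
    def __init__(self, n_index, n_dim):
        # weight[i] is the embedding row for index i
        self.weight = [[0.1 * (i + j) for j in range(n_dim)]
                       for i in range(n_index)]
        self.grad_weight = None   # would normally mirror weight's size
        self._last_input = None

    def acc_update_only(self):
        """Drop gradWeight entirely, roughly halving the memory footprint."""
        self.grad_weight = None

    def forward(self, indices):
        self._last_input = indices
        return [list(self.weight[i]) for i in indices]

    def acc_update_grad_parameters(self, grad_output, lr):
        # Update only the selected rows, in place: w[i] -= lr * grad
        for idx, grad in zip(self._last_input, grad_output):
            row = self.weight[idx]
            for j, g in enumerate(grad):
                row[j] -= lr * g


lut = LookupTable(10, 4)
out = lut.forward([2, 5])
lut.acc_update_only()
lut.acc_update_grad_parameters([[1.0] * 4, [1.0] * 4], lr=0.1)
```

The memory saving comes from never materializing grad_weight: only the rows touched by the current batch are updated, which is exactly why this matters for large vocabularies on the GPU.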

nicholas-leonard (Member, Author)

Any news on this?

soumith (Member) commented Jul 11, 2014

I don't use this module much. Maybe the NLP guys can review it, @andresy @koraykv.

nicholas-leonard (Member, Author)

This PR changes nothing in the existing interface. It just allows switching gradWeight/gradBias off so as to save memory.

soumith (Member) commented Jul 15, 2014

Okay, cool. Why not use the .training = false flag that we use for the other modules?

nicholas-leonard (Member, Author)

It's not the same. training = false is for modules that behave differently during training vs. evaluation. This is for deleting gradWeight/gradBias to halve the memory footprint of the module.

koraykv added a commit that referenced this pull request Jul 17, 2014
added accUpdate to nn.LookupTable
koraykv merged commit adbd1ef into torch:master Jul 17, 2014

koraykv (Member) commented Jul 17, 2014

Thanks! This is useful.
