This repository has been archived by the owner on Nov 1, 2021. It is now read-only.

Add gradient for log1p #137

Merged
Merged 1 commit into twitter-archive:master on Jul 12, 2016

Conversation

bartvm (Contributor) commented Jul 10, 2016

No description provided.

alexbw (Collaborator) commented Jul 10, 2016

Lgtm, will merge shortly

alexbw merged commit 5a06a94 into twitter-archive:master on Jul 12, 2016
alexbw pushed a commit that referenced this pull request on Jul 13, 2016
* master:
  Add gradient for log1p (#137)
  add bmm and baddbmm (#136)
  Add scalar support to gradcheck (#135)
  Corrected some global assignments (#133)
  Update README.md
  Remove totem dependency (torch now has its own tester) (#124)
  lua-like error message
  prevent infinite recursion in number operation
  Fix for sigmoid and tanh
  Expose the nn object in wrapper
  missing abs sign
  restoring original var
  cherry picked new grad check
  Print error occurred when parsing generated code.
  Add missing pow operator in generated code.
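
For reference, the merged change adds a derivative for log1p: since log1p(x) = log(1 + x), its gradient is 1 / (1 + x), applied elementwise and scaled by the incoming gradient. The sketch below is illustrative only (plain torch tensor operations, not the PR's actual diff), with a finite-difference check of the formula:

require 'torch'

-- Illustrative sketch, not the PR's code: the gradient of log1p(x) = log(1 + x)
-- is g * 1 / (1 + x), applied elementwise to the incoming gradient g.
local function log1pGrad(g, x)
  return torch.cdiv(g, torch.add(x, 1))
end

-- Finite-difference check of the formula on a small random tensor.
local x = torch.rand(4)
local g = torch.ones(4)
local eps = 1e-6
local analytic = log1pGrad(g, x)
for i = 1, x:size(1) do
  local xp, xm = x:clone(), x:clone()
  xp[i] = xp[i] + eps
  xm[i] = xm[i] - eps
  local numeric = (torch.log1p(xp):sum() - torch.log1p(xm):sum()) / (2 * eps)
  print(string.format("dim %d: analytic %.6f  numeric %.6f", i, analytic[i], numeric))
end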