This repository has been archived by the owner on Nov 1, 2021. It is now read-only.

Remove totem dependency (torch now has its own tester) #124

Merged — 2 commits merged into master from no-totem on May 24, 2016

Conversation

@alexbw (Collaborator) commented May 23, 2016

Also fix contiguous grad
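The change the title describes — dropping the totem dependency in favor of torch's built-in test runner — would look roughly like the sketch below. This is a hypothetical illustration of the `torch.Tester` API, not code from the actual diff; the test name and tensors are invented.

```lua
-- Before this PR, tests went through totem; torch now ships its own
-- tester, so the totem require can be dropped entirely.
-- local totem = require 'totem'   -- no longer needed

local tester = torch.Tester()
local tests = torch.TestSuite()

-- Illustrative test only: checks that :contiguous() preserves values,
-- loosely echoing the "fix contiguous grad" note in this PR.
function tests.contiguousGrad()
   local a = torch.randn(3, 4)
   tester:assertTensorEq(a, a:contiguous(), 1e-8,
      'contiguous copy should preserve tensor values')
end

tester:add(tests)
tester:run()
```

Using `torch.Tester` keeps the test suite dependency-free within torch itself, which is presumably why totem could be removed.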

@alexbw alexbw merged commit ff0057f into master May 24, 2016
@alexbw (Collaborator, Author) commented May 24, 2016

Closes #123

alexbw pushed a commit that referenced this pull request on Jul 13, 2016
* master:
  Add gradient for log1p (#137)
  add bmm and baddbmm (#136)
  Add scalar support to gradcheck (#135)
  Corrected some global assignments (#133)
  Update README.md
  Remove totem dependency (torch now has its own tester) (#124)
  lua-like error message
  prevent infinite recursion in number operation
  Fix for sigmoid and tanh
  Expose the nn object in wrapper
  missing abs sign
  restoring original var
  cherry picked new grad check
  Print error occurred when parsing generated code.
  Add missing pow operator in generated code.
@alexbw alexbw deleted the no-totem branch July 13, 2016 16:47