THNN functional module conversions batch 1 #547
Conversation
andreaskoepf
commented
Dec 30, 2015
- ClassNLLCriterion
- DistKLDivCriterion
- HardShrink
- HardTanh
- L1Cost
- final cross-check test with old implementation
Fixed indentation in THNN.lua to 3-spaces.
For some reason ad1efee has broken clnn. It would be less of a mystery if it also broke nn or cunn at the same time :-) It seems like nn/Abs.lua now expects input to contain input.THNN, though how THNN gets into input is a mystery to me, for now.
Oh right. Totally forgot that this would break clnn, Hugh! Sorry about that.
:-D
function ClassNLLCriterion:__init(weights, sizeAverage)
   parent.__init(self)
   if sizeAverage ~= nil then
      self.sizeAverage = sizeAverage
some of this is 3-indented, and some is 4-indented :-P
I really hate 3-indentation by the way. 2-indentation is quite nice. 4-indentation is quite standard. 3-indentation is one of the things that makes me run away and write python wrappers instead :-P
@hughperkins Sorry for breaking clnn. Actually I wanted to ask @soumith about other backends two days ago, but after I remembered that cudnn uses completely different modules I felt this might be a stupid question to ask. All backends that keep the existing modules but use a different tensor type are affected by the THNN change. We are using the same 'metatable' trick to pass in the tensor-type-specific C-backend functions that was done previously in C, e.g. we register a table containing the new ffi C-functions under the name 'THNN' in the type-specific tensor metatables (e.g. the metatables of DoubleTensor, FloatTensor, CudaTensor).
Great to see that you already began working on mirroring the THNN functions in CL. Btw you can reach the THNN team via gitter here.
BTW sorry for the indentation errors, I will fix these. I am using 2-space indentation for my normal Lua and C code, and I am using an editor which afaik does not allow setting project-specific indentation rules. I was just too lazy to change the setting and did the indentation by hand, which is obviously not 100% ;-). Regarding 3-space indentation, I like the rumour that Mike Pall once said he would refuse to take anyone seriously who used three spaces.
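The 'metatable' trick described above can be sketched in plain Lua. This is not the actual Torch/THNN code: the tensor type, the backend table contents, and the function names are simplified stand-ins (in the real code the table registered under 'THNN' holds ffi-wrapped C functions):

```lua
-- Sketch only: DoubleTensorMT stands in for torch.getmetatable('torch.DoubleTensor')
local DoubleTensorMT = {}
DoubleTensorMT.__index = DoubleTensorMT

-- register the backend-function table under the name 'THNN' in the metatable
DoubleTensorMT.THNN = {
   Abs_updateOutput = function(input, output)
      for i, v in ipairs(input.data) do output.data[i] = math.abs(v) end
   end,
}

local function DoubleTensor(data)    -- toy tensor constructor
   return setmetatable({ data = data }, DoubleTensorMT)
end

local input  = DoubleTensor({ -1, 2, -3 })
local output = DoubleTensor({})

-- a module such as nn/Abs.lua can now reach the type-specific backend
-- through the input tensor itself, which is how THNN "gets into" input:
input.THNN.Abs_updateOutput(input, output)

print(output.data[1], output.data[2], output.data[3])  -- 1   2   3
```

Swapping the table registered under 'THNN' per tensor type is what lets the same module code dispatch to a double, float, or CUDA backend without any change to the module itself.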
Compared the converted modules against the non-THNN versions via xtest.lua. nn2 is based on the last commit before THNN was added.
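The cross-check idea can be sketched like this (hypothetical stand-ins, not the actual xtest.lua; `abs_thnn` and `abs_old` play the roles of a THNN-converted module and its pre-THNN nn2 counterpart):

```lua
-- Toy stand-ins for the two implementations being cross-checked.
local function abs_thnn(x)
   local r = {}
   for i, v in ipairs(x) do r[i] = math.abs(v) end
   return r
end

local function abs_old(x)
   local r = {}
   for i, v in ipairs(x) do r[i] = v < 0 and -v or v end
   return r
end

-- largest elementwise discrepancy between the two outputs
local function maxDiff(a, b)
   local d = 0
   for i = 1, #a do d = math.max(d, math.abs(a[i] - b[i])) end
   return d
end

local input = { -1.5, 0, 2.25 }
local diff  = maxDiff(abs_thnn(input), abs_old(input))
assert(diff < 1e-6, 'converted module diverges from the old implementation')
print('max abs diff:', diff)
```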
Thanks Andreas. This is coming along really well.
Hmmm, it seems by some miracle merging this specific pull didn't break the clnn build https://travis-ci.org/hughperkins/clnn/builds/99973441 . I'm not quite sure why that is .... Edit: tests pass too :-)
Edit2: I guess that there are three possible situations:
Edit3: hmmm, no, that's not quite a correct summary, since clnn provides an implementation of Abs. Hmmm... Edit4: maybe it's something like, if nn calls … Edit5: seems like edit4 is plausibly correct, and it's possible to fix any breaks in clnn without needing to implement everything in C/C++ for now, e.g. using an approach similar to https://github.com/hughperkins/clnn/blob/master/ClassNLLCriterion.lua#L5-L18 to monkey-patch things in. We shall see :-)
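The monkey-patching approach mentioned in Edit5 could look roughly like this. This is a hedged sketch with hypothetical names (`isCl`, the toy data shape), not the actual clnn code at the linked ClassNLLCriterion.lua:

```lua
-- Stand-in for an nn module class whose normal path goes through THNN.
local Abs = {}

function Abs:updateOutput(input)
   -- original THNN-backed path (simplified stand-in)
   local out = {}
   for i, v in ipairs(input.data) do out[i] = math.abs(v) end
   return out
end

-- monkey-patch: wrap the original method and divert ClTensor inputs to a
-- pure-Lua fallback, so clnn keeps working without a C/C++ port for now.
local originalUpdateOutput = Abs.updateOutput
function Abs:updateOutput(input)
   if input.isCl then                -- hypothetical "is this a ClTensor?" check
      local out = {}
      for i, v in ipairs(input.data) do out[i] = v >= 0 and v or -v end
      return out
   end
   return originalUpdateOutput(self, input)   -- everything else uses THNN
end

local result = Abs.updateOutput(Abs, { isCl = true, data = { -4, 5 } })
print(result[1], result[2])   -- 4   5
```

Because the patch only wraps the method, removing it later (once a real OpenCL kernel exists) restores the THNN path with no other changes.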