It would be nice to be able to load well-known pre-trained models like VGG and GoogLeNet into CGT data-structures.
There's a half-written script for loading Caffe prototxt files, called caffe2cgt.py.
For VGG, all of the operations are already implemented, whereas some of the other models involve "local response normalization" layers that are not yet implemented.
I think the general consensus is that local response normalization is sort of pointless. To the best of my knowledge people aren't really using this anymore. That said, if you want to reproduce existing models exactly I guess you have no choice but to implement it. It's just a bit unfortunate to be spending valuable development time on stuff like this :)
It's interesting to hear that LRN is no longer considered important; thanks for the pointer to the Lasagne code.
I was also thinking about implementing the cross-channel LRN with a matrix multiplication (after permuting the axes so that channels are last) and the 2D neighborhood version as im2col + matrix multiplication.
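The cross-channel variant can be sketched with NumPy to show the idea: square the activations, permute so channels are last, and multiply by a banded `C x C` matrix whose band of ones sums each channel's neighborhood. This is only a reference sketch, not CGT code; the function name `lrn_cross_channel` is made up here, and the parameter names (`n`, `k`, `alpha`, `beta`) follow Caffe's LRN defaults.

```python
import numpy as np

def lrn_cross_channel(x, n=5, k=2.0, alpha=1e-4, beta=0.75):
    """Cross-channel LRN via a banded matrix multiply (sketch only).

    x: activations of shape (N, C, H, W).
    Parameter names/defaults follow Caffe's LRN layer, an assumption here.
    """
    N, C, H, W = x.shape
    # Band matrix B with B[i, j] = 1 when |i - j| <= n // 2, so a matmul
    # with B sums squared activations over each channel's window.
    idx = np.arange(C)
    band = (np.abs(idx[:, None] - idx[None, :]) <= n // 2).astype(x.dtype)
    # Permute so channels are last; the matmul then contracts over channels.
    xt = np.transpose(x, (0, 2, 3, 1))          # (N, H, W, C)
    window_sums = (xt ** 2) @ band              # (N, H, W, C)
    scale = (k + (alpha / n) * window_sums) ** beta
    out = xt / scale
    return np.transpose(out, (0, 3, 1, 2))      # back to (N, C, H, W)
```

The same trick extends to the 2D within-channel variant: im2col over the spatial neighborhood turns the window sum into a single matrix multiplication as well.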