Tanh #116
Conversation
#include <algorithm>

namespace caffe {
You should create a SetUp method to check that there is only one bottom blob and only one top blob; or, if you also want this layer to work in-place, you could add this:

template <typename Dtype>
void TanHLayer<Dtype>::SetUp(const vector<Blob<Dtype>*>& bottom,
    vector<Blob<Dtype>*>* top) {
  NeuronLayer<Dtype>::SetUp(bottom, top);
}
Sorry for the back-and-forth, but one does not need to write a custom SetUp function if no special care needs to be taken. NeuronLayer has a virtual SetUp() function that handles the default construction and the in-place check:
https://github.com/BVLC/caffe/blob/master/src/caffe/layers/neuron_layer.cpp
So maybe simply omit the SetUp function...?
We need a setup for the situation in which tanh is not used in place. It should then call Reshape for (*top)[0].
https://github.com/BVLC/caffe/blob/master/src/caffe/layers/neuron_layer.cpp#L18 serves this purpose.
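For context, the linked check is roughly the following (a sketch from memory of neuron_layer.cpp at the time, not a verbatim copy): SetUp enforces the single-bottom/single-top invariant and only reshapes the top blob when the layer is not running in-place.

template <typename Dtype>
void NeuronLayer<Dtype>::SetUp(const vector<Blob<Dtype>*>& bottom,
    vector<Blob<Dtype>*>* top) {
  CHECK_EQ(bottom.size(), 1) << "Neuron Layer takes a single blob as input.";
  CHECK_EQ(top->size(), 1) << "Neuron Layer takes a single blob as output.";
  // In-place computation: if the top blob is the very same blob as the
  // bottom, there is nothing to allocate; otherwise shape it to match.
  if ((*top)[0] != bottom[0]) {
    (*top)[0]->Reshape(bottom[0]->num(), bottom[0]->channels(),
        bottom[0]->height(), bottom[0]->width());
  }
}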
I've added a SetUp function. I noticed that it's not present in ReLU or Sigmoid either. I've also merged all recent changes from master into this branch. One of my friends verified that the MKL version compiles and the CPU tests pass; he doesn't have a GPU on that machine.
@aravindhm sorry for the confusion, @Yangqing is right: you don't need to add a SetUp in this case, NeuronLayer will take care of that. @aravindhm, you are now adding new commits to this layer and merging it with master again, which will mix in unrelated commits. You will need to revert, or better, rebase again against the new master by doing the following (@shelhamer please correct me if I'm wrong). If your master differs from BVLC/master, first create a new branch, say diverged, to keep your changes.

Then fetch the latest changes from BVLC/master and reset your master to be identical to BVLC/master.

Now pick the commits from your tanh branch that you want to keep for the pull request by deleting or editing the lines in the interactive rebase window.

Finally, force-push to your repository to get a clean history. The whole sequence is sketched below.
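Putting those steps together, the commands would be roughly the following (a sketch; it assumes your BVLC remote is named bvlc, your fork's remote is origin, and the feature branch is tanh, so adjust the names to your setup):

git checkout master
git branch diverged             # side branch preserving your local changes
git fetch bvlc                  # grab the latest BVLC history
git reset --hard bvlc/master    # make master identical to BVLC/master
git rebase -i master tanh       # keep/drop commits in the interactive window
git push --force origin tanh    # rewrite the branch on your fork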
I've followed those instructions blindly after resetting back to the commit before the SetUp function was added. The tests all pass when I integrate it with boost-eigen, and the extra SetUp function is no longer there.
@aravindhm sorry for the confusion in sharing your contribution. I'm writing up a guide to help with such matters. Thanks for your work :)
Add TanH = hyperbolic tangent activation layer (popular for sparse autoencoders).
Merged.
Added a hyperbolic tangent activation function layer. This was developed in the boost-eigen branch, so it hasn't been tested/compiled against master (I don't have MKL). Please test/examine this and let me know if anything needs to be fixed.
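For anyone skimming the thread, here is a minimal sketch of what such a layer boils down to, assuming the Caffe layer API of that era (the signatures, the Dtype return values, and the pointer-to-vector top are recalled from the code of the time, not taken from this PR's diff). The forward pass applies tanh elementwise; the backward pass uses d/dx tanh(x) = 1 - tanh(x)^2, which can be computed directly from the already-stored top data.

// Hypothetical tanh_layer.cpp sketch; assumes the usual caffe headers.
#include <cmath>
#include <vector>

namespace caffe {

using std::vector;

template <typename Dtype>
Dtype TanHLayer<Dtype>::Forward_cpu(const vector<Blob<Dtype>*>& bottom,
    vector<Blob<Dtype>*>* top) {
  const Dtype* bottom_data = bottom[0]->cpu_data();
  Dtype* top_data = (*top)[0]->mutable_cpu_data();
  const int count = bottom[0]->count();
  for (int i = 0; i < count; ++i) {
    // std::tanh stays finite for large |x|, unlike the naive
    // (exp(2x) - 1) / (exp(2x) + 1) formulation, which can overflow.
    top_data[i] = std::tanh(bottom_data[i]);
  }
  return Dtype(0);
}

template <typename Dtype>
Dtype TanHLayer<Dtype>::Backward_cpu(const vector<Blob<Dtype>*>& top,
    const bool propagate_down, vector<Blob<Dtype>*>* bottom) {
  if (propagate_down) {
    const Dtype* top_data = top[0]->cpu_data();  // t = tanh(x) from forward
    const Dtype* top_diff = top[0]->cpu_diff();
    Dtype* bottom_diff = (*bottom)[0]->mutable_cpu_diff();
    const int count = (*bottom)[0]->count();
    for (int i = 0; i < count; ++i) {
      const Dtype t = top_data[i];
      // d/dx tanh(x) = 1 - tanh(x)^2, so reuse the forward output; this
      // also works in-place, since each element is read before written.
      bottom_diff[i] = top_diff[i] * (Dtype(1) - t * t);
    }
  }
  return Dtype(0);
}

}  // namespace caffe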