Concat layer #125
Conversation
What about not doing a memory copy? In the setup function, replace the bottom data with a contiguous chunk of memory and pack the pointers into the Blobs, so that the forward and backward functions just do nothing.
Might not be very kosher, because the bottom blobs are supposed to be [...]
@mavenlin I'm a bit confused by your comment. I'm not doing any memory copy in the setup, just setting up the top_blob.
@sguada Sorry, you are right, contiguous memory is not enough for concatenation along different dimensions.
@Yangqing Sure, it is not an elegant way. I guess some of the memory optimisations can be done at the network level, in a similar way to what Theano does.
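To make the memory-layout point in this thread concrete, here is a hypothetical sketch (a hand-rolled concat_channels over raw NCHW float buffers, not code from this PR). Concatenating along num would just lay the bottom buffers end to end, which is why a single contiguous allocation could make that case copy-free; concatenating along channels, by contrast, has to interleave a chunk from each bottom for every image:

```cpp
#include <cstring>

// Hypothetical sketch, not Caffe code: NCHW buffers a (n x ca x h x w)
// and b (n x cb x h x w) concatenated along channels into
// out (n x (ca + cb) x h x w). Unlike concatenation along num, which is
// just the two buffers laid back to back, each image's channels from a
// and b must be interleaved, so pre-packing the bottoms into one
// contiguous allocation cannot make this case copy-free.
void concat_channels(const float* a, const float* b, float* out,
                     int n, int ca, int cb, int h, int w) {
  const int spatial = h * w;
  for (int i = 0; i < n; ++i) {
    std::memcpy(out, a + i * ca * spatial, sizeof(float) * ca * spatial);
    out += ca * spatial;
    std::memcpy(out, b + i * cb * spatial, sizeof(float) * cb * spatial);
    out += cb * spatial;
  }
}
```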
Hey @sguada, is this still WIP? Let me know when I should test and possibly merge.
Rebased according to the new [...]
    ConcatLayer<TypeParam> layer(layer_param);
    GradientChecker<TypeParam> checker(1e-2, 1e-3);
    // it is too expensive to call curand multiple times, so we don't do an
    // exhaustive gradient check.
I think these comments are copied from some other layer; please remove them (unless this layer really does use random in some way that I'm missing).
I verified that the tests for this layer pass - could you remove the comments as noted above and clean up the lint errors?
This layer needs to be split into .cpp and .cu per #152.
@jeffdonahue Removed the comments and cleaned up the lint errors. I wonder if we should also split the test code into .cpp and .cu?
@sguada Good observation: everything will be split, but it's fine for now. Let's revisit all this post deadline.
Thanks for polishing @sguada! Merged.
@sguada Can you give us a more detailed example, so we know exactly how to use the concat layer? Thanks!
Added a layer to concatenate blobs along one dimension. For now it only allows concatenation along the num (concat_dim=0) or channels (concat_dim=1) dimension. This layer can take multiple blobs (at least two) and produce one that is their concatenation along that dimension; the other dimensions must agree. It adds

    optional uint32 concat_dim = 65 [default = 1];

to the list of params in caffe.proto. By default it concatenates blobs along channels.
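A rough C++ sketch of how the layer might be exercised, modeled on the test snippet quoted above. The blob shapes, the concat_example function, and the vision_layers.hpp include are assumptions for illustration, and the set_concat_dim accessor is only implied by the new caffe.proto field; the exact SetUp/Forward signatures depend on the Caffe version:

```cpp
#include <vector>
#include "caffe/blob.hpp"
#include "caffe/vision_layers.hpp"  // assumed location of ConcatLayer

using namespace caffe;

void concat_example() {
  // Two bottom blobs that agree on every dimension except channels.
  Blob<float> a(2, 3, 4, 4);
  Blob<float> b(2, 5, 4, 4);
  std::vector<Blob<float>*> bottom;
  bottom.push_back(&a);
  bottom.push_back(&b);

  Blob<float> c;
  std::vector<Blob<float>*> top;
  top.push_back(&c);

  LayerParameter layer_param;
  layer_param.set_concat_dim(1);  // concatenate along channels (the default)

  ConcatLayer<float> layer(layer_param);
  layer.SetUp(bottom, &top);    // top blob becomes 2 x 8 x 4 x 4
  layer.Forward(bottom, &top);  // writes both bottoms into the top blob
}
```

In a prototxt net definition the equivalent would be a layer with multiple bottoms, one top, and concat_dim set as needed.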