Blobs are N-D arrays (for N not necessarily equals 4) #1970
Conversation
shelhamer added the JL, ES, and ready for review labels on Feb 25, 2015
This was referenced Feb 28, 2015
I had added a commit intended to add matcaffe support for N-D array blobs, but decided to defer to #1913 to add the support, as I don't have much experience with matcaffe and it might require some special cases (e.g., I remembered that MATLAB doesn't support arrays with <2 axes, so I would need to figure out what to do in those cases -- maybe the MATLAB default of adding extra axes with dimension 1 would work fine, but I'm not sure). But if that commit might be helpful in the development of #1913, feel free to cherry-pick or just refer to it here (but note that I didn't test or even try to compile it...). So for now, matcaffe will continue to work fine for blobs with <= 4 axes, but will die on attempts to access blobs with >4 dimensions (per their use of …
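For context on the MATLAB caveat above: MATLAB arrays always carry at least two dimensions, so a 0-D or 1-D blob shape would need singleton padding before being handed to matcaffe. A minimal Python sketch of the padding idea Jeff mentions (the helper name `pad_shape` is mine, not from the PR):

```python
def pad_shape(shape, min_axes=2):
    """Pad a blob shape with trailing singleton dimensions.

    MATLAB arrays always have at least 2 axes, so a 0-D or 1-D blob
    shape would need extra axes of size 1, per the MATLAB default.
    """
    shape = tuple(shape)
    return shape + (1,) * max(0, min_axes - len(shape))

print(pad_shape(()))        # scalar blob
print(pad_shape((5,)))      # 1-D blob
print(pad_shape((2, 3)))    # already has 2 axes; unchanged
```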
Yangqing commented on the diff on Mar 2, 2015
```diff
@@ -2,13 +2,21 @@ syntax = "proto2";
 package caffe;
+// Specifies the shape (dimensions) of a Blob.
+message BlobShape {
+  repeated int64 dim = 1 [packed = true];
```
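The new `BlobShape` message replaces the fixed (num, channels, height, width) fields with a variable-length `dim` list; a blob's element count is then just the product of its dims, with an empty list describing a scalar of count 1. A hedged Python sketch of that convention (the function name is mine):

```python
from functools import reduce
from operator import mul

def blob_count(dims):
    """Total element count for a BlobShape-style list of dims.

    An empty dim list describes a 0-axis (scalar) blob with count 1,
    matching the product-over-empty-sequence convention.
    """
    return reduce(mul, dims, 1)

print(blob_count([64, 3, 224, 224]))  # a typical 4-axis image batch
print(blob_count([]))                 # scalar blob
```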
Yangqing (Contributor)
Oops, just realized that I've been commenting on commits again. Agree with …
Added my Python update... this should be ready pending your approval of that, Travis, and @shelhamer.

Thanks Jon! Your updated …
shelhamer commented on an outdated diff on Mar 3, 2015
```diff
@@ -15,7 +15,11 @@ void InnerProductLayer<Dtype>::LayerSetUp(const vector<Blob<Dtype>*>& bottom,
   const int num_output = this->layer_param_.inner_product_param().num_output();
   bias_term_ = this->layer_param_.inner_product_param().bias_term();
   N_ = num_output;
-  K_ = bottom[0]->count() / bottom[0]->num();
+  const int dim = this->layer_param_.inner_product_param().axis();
+  // Dimensions starting from "axis" are "flattened" into a single
+  // length K_ vector. For example, if bottom[0]'s shape is (N, C, H, W),
```
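The new `axis` parameter splits the bottom shape into a batch part and a flattened input part: the batch size M_ is the product of the axes before `axis`, and K_ = count(axis) is the product of the remaining axes. A small Python sketch of that split (names are mine, not Caffe's):

```python
from functools import reduce
from operator import mul

def prod(dims):
    return reduce(mul, dims, 1)

def inner_product_dims(shape, axis):
    """Split a blob shape at `axis` into (M, K) for the inner product.

    Axes before `axis` form the batch size M_; axes from `axis` on are
    flattened into the input dimension K_, i.e. K_ = count(axis).
    """
    return prod(shape[:axis]), prod(shape[axis:])

M, K = inner_product_dims((64, 3, 28, 28), axis=1)
print(M, K)  # 64 2352
```

With axis=1 this reproduces the old 4-axis behavior (K_ = C*H*W); other axes let the layer act on higher-dimensional blobs.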
@jeffdonahue this is sweet. My only comments were minor, so do what you want with them and merge!
shelhamer commented on the diff on Mar 3, 2015
```diff
@@ -35,6 +35,9 @@ void SoftmaxWithLossLayer<Dtype>::Reshape(
     const vector<Blob<Dtype>*>& bottom, const vector<Blob<Dtype>*>& top) {
   LossLayer<Dtype>::Reshape(bottom, top);
   softmax_layer_->Reshape(softmax_bottom_vec_, softmax_top_vec_);
+  softmax_axis_ = this->layer_param_.softmax_param().axis();
+  outer_num_ = bottom[0]->count(0, softmax_axis_);
+  inner_num_ = bottom[0]->count(softmax_axis_ + 1);
```
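In this diff, `outer_num_` is the product of the axes before the softmax axis (count(0, axis)) and `inner_num_` is the product of the axes after it (count(axis + 1)), so the loss loops over outer_num_ * inner_num_ independent softmax problems. A rough Python sketch of those two counts (helper names are mine):

```python
from functools import reduce
from operator import mul

def prod(dims):
    return reduce(mul, dims, 1)

def softmax_nums(shape, axis):
    """outer_num_ and inner_num_ as in SoftmaxWithLossLayer::Reshape.

    outer_num_ = count(0, axis)   -- product of axes before the softmax axis
    inner_num_ = count(axis + 1)  -- product of axes after it
    """
    return prod(shape[:axis]), prod(shape[axis + 1:])

# e.g. per-pixel classification over a (10, 5, 4, 4) score blob, axis=1:
outer, inner = softmax_nums((10, 5, 4, 4), axis=1)
print(outer, inner)  # 10 16
```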
shelhamer (Owner)
Hey Evan, thanks for the review! I made the additional commits above in response to your comments. Edit: I went ahead and squashed the fixups, but I saved the state before the squash in another branch, "tensor-blob-presquash", if you still want to look at only the additional changes I made after your review.
jeffdonahue and others added some commits on Nov 26, 2014
This should be ready to go -- I'll merge as soon as @shelhamer or @longjon quickly OKs the small fixups I made last night in response to @shelhamer's suggestions (you can see just those changes at jeffdonahue/caffe@eca584d...jeffdonahue:tensor-blob-presquash).
This will probably impact every new layer in the PR queue. How many new layers do we have on the waiting list? The merging order is not neutral, since it will shift the adaptation burden onto the developer of one PR rather than another.
Thanks for the fixups @jeffdonahue -- this all looks good to me. This PR can be a guide to converting in-progress and future layers to N != 4. I'm happy to see brewing transcend to higher dimensions. Next up: N-D convolution and pooling!
shelhamer added a commit that referenced this pull request on Mar 4, 2015 (85bb397)
shelhamer merged commit 85bb397 into BVLC:master on Mar 4, 2015 -- 1 check passed
ddetone commented on Mar 4, 2015:
@shelhamer Is there an existing branch in which someone has begun N-D convolution and pooling? I am currently working on my own 3-D convolution and pooling implementation.
jeffdonahue added a commit that referenced this pull request on Mar 4, 2015 (dec148e)
jeffdonahue deleted the jeffdonahue:tensor-blob branch on Mar 4, 2015
This was referenced Mar 5, 2015
shelhamer added a commit to shelhamer/caffe that referenced this pull request on Mar 7, 2015 (38ec847)
This was referenced Mar 7, 2015
BestSonny commented on d8c6aeb on Mar 7, 2015:
My data is 64x36x1x1 and my label is 64x1x1x1. Can you give some tips?

BestSonny replied on Mar 7, 2015:
Is there something wrong?

@jeffdonahue can you align https://github.com/BVLC/caffe/blob/master/docs/tutorial/net_layer_blob.md with the transcendence to N-D blobs sometime?
shelhamer referenced this pull request on Mar 8, 2015: Preserve shape of extracted features #1457 (Closed)
shelhamer added a commit that referenced this pull request on Mar 8, 2015 (fc35930)
jeffdonahue added a commit to jeffdonahue/caffe that referenced this pull request on Mar 9, 2015 (32e6e96)
This was referenced Mar 9, 2015
jeffdonahue added a commit to jeffdonahue/caffe that referenced this pull request on Mar 9, 2015 (a3e4604)
jeffdonahue added a commit to jeffdonahue/caffe that referenced this pull request on Mar 9, 2015 (3385d1c)
jeffdonahue referenced this pull request on Mar 9, 2015: FlattenLayer gets a FlattenParameter with an axis, end_axis #2082 (Merged)
jeffdonahue added a commit to jeffdonahue/caffe that referenced this pull request on Mar 9, 2015 (6d5c8b2)
jeffdonahue added a commit to jeffdonahue/caffe that referenced this pull request on Mar 9, 2015 (7a40f74)
jeffdonahue added a commit that referenced this pull request on Mar 9, 2015 (77ab8f6)
jeffdonahue referenced this pull request on Mar 10, 2015: Very simple version of ReshapeLayer #2088 (Closed)
qinhongwei pushed a commit to qinhongwei/caffe that referenced this pull request on Mar 12, 2015 (844bdb6)
jeffdonahue added a commit to jeffdonahue/caffe that referenced this pull request on Mar 26, 2015 (f10c43b)
jeffdonahue added a commit to jeffdonahue/caffe that referenced this pull request on Mar 26, 2015 (0afd1bf)
jeffdonahue added a commit to jeffdonahue/caffe that referenced this pull request on Mar 26, 2015 (a9d4adc)
jeffdonahue added a commit to jeffdonahue/caffe that referenced this pull request on May 15, 2015 (jeffdonahue + shelhamer, cf0e6b7)
shelhamer added a commit that referenced this pull request on May 15, 2015 (af224c1)
myfavouritekk added a commit to myfavouritekk/caffe that referenced this pull request on May 15, 2015 (2ba4fe4)
This was referenced May 17, 2015
ddetone added a commit to ddetone/caffe that referenced this pull request on Jun 26, 2015 (jeffdonahue + ddetone, 7841337)
matthiasplappert added a commit to matthiasplappert/caffe that referenced this pull request on Aug 10, 2015 (jeffdonahue + matthiasplappert, a7b2aab)
cbfinn added a commit to cbfinn/caffe that referenced this pull request on Aug 12, 2015 (shelhamer + cbfinn, 0bde346)
cbfinn added a commit to cbfinn/caffe that referenced this pull request on Aug 12, 2015 (jeffdonahue + cbfinn, d37d928)
cbfinn added a commit to cbfinn/caffe that referenced this pull request on Aug 12, 2015 (jeffdonahue + cbfinn, 3bd95a4)
This was referenced Aug 20, 2015
wangyida added a commit to wangyida/caffe that referenced this pull request on Sep 22, 2015 (jeffdonahue + wangyida, 233c981)
This was referenced Oct 7, 2015
dbcam commented on include/caffe/blob.hpp in 1434e87 on Jan 14, 2016:
This code implies that `index` can be negative; however, line 150 indexes a `std::vector<>`, which does not support negative indices. `Blob::ShapeEquals()` calls `LegacyShape(index)` with a negative index. We should probably drop the pseudo-support for negative indices (line 143: change -4 to 0; line 144: delete `|| index < -num_axes()`).
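For reference on the negative-index discussion above: Caffe resolves negative axis indices (e.g. -1 for the last axis) through `Blob::CanonicalAxisIndex` before any vector is indexed. A rough Python sketch of that bounds check and wrap-around (a sketch of the idea, not the actual C++ implementation):

```python
def canonical_axis_index(axis_index, num_axes):
    """Resolve a possibly negative axis index, as CanonicalAxisIndex does.

    -1 means the last axis, -num_axes the first; anything outside
    [-num_axes, num_axes) is rejected rather than silently wrapped.
    """
    if not -num_axes <= axis_index < num_axes:
        raise IndexError("axis %d out of range for %d-axis blob"
                         % (axis_index, num_axes))
    return axis_index + num_axes if axis_index < 0 else axis_index

print(canonical_axis_index(-1, 4))  # 3
print(canonical_axis_index(2, 4))   # 2
```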
dbcam replied on Jan 17, 2016:
You are correct, Sean. My mistake. I confused array indexing with function call syntax. Thank you.

I think this is ok since …
jeffdonahue commented Feb 25, 2015
Replaces #1486 -- this PR is to master instead of dev. This is rebased and ready for review, @longjon.