
Layer type is a string #1694

Merged
merged 13 commits into BVLC:dev from layer-type-str on Feb 6, 2015

Conversation

jeffdonahue
Contributor

addresses #1685 (thanks for the reminder @shelhamer)

Mostly works, but I still need to implement automagic upgrading of net prototxts before this is merged. Which is actually kind of hard, because type is currently an enum so the parse of an old prototxt will fail due to the lack of quotes around the enum value (which was why I'd suggested using a different name like class, but I didn't particularly like the name class either). One possible path that I think would probably work is to change the name of the layers field in NetParameter, but keep that field to mark the use of a "V1" prototxt (with this new one being "V2"). I've wanted to do this because it kind of annoys me that it's plural -- layers -- so if others agree this could be a good excuse to change the name to layer.

To do:

  • Change layers field name to layer as discussed above
  • Restore type() overrides
  • Update upgrade_net_proto_blah tool to also handle V1->V2 updates
  • Unit test(s)
  • Upgrade prototxts using upgrade_net_proto_text tool
  • Make the type case insensitive

@shelhamer
Member

> change the name of the layers field in NetParameter [...]
> I've wanted to do this because it kind of annoys me that it's plural -- layers -- so if others agree this could be a good excuse to change the name to layer

Fine by me. type and layer are both good names and that gives us an upgrade marker.

@sguada
Contributor

sguada commented Jan 8, 2015

I like layer and type too


@longjon
Contributor

longjon commented Jan 9, 2015

> Change layers field name to layer as discussed above

Hooray!

> Make the type case insensitive

Really?

@jeffdonahue
Contributor Author

After some discussion with @longjon I've concluded that I too think case-insensitive type strings would be overly user-friendly. Sorry for that brief lapse in judgment!

jeffdonahue force-pushed the layer-type-str branch 7 times, most recently from 39bfcf1 to d5839a9 on January 13, 2015 at 00:51
@jeffdonahue
Contributor Author

Any LayerParameter field names that we should seize the (kind of) one-time opportunity to change? I was thinking it might be good to standardize the 'parameter' blob names -- we currently have param (for the name), blobs_lr, and weight_decay. I could change them to param, param_lr, param_decay? (Other suggestions?)

I also plan to clean up all the numbers so that everything is in a nice order (not that it will stay that way, but can at least make it start with name, type, bottom, top, ...).
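
For reference, the current naming under discussion looks something like this in a prototxt (the layer and blob names are illustrative):

layers {
  name: "conv1"
  type: CONVOLUTION
  param: "conv1_w"    # name of the weight blob
  param: "conv1_b"    # name of the bias blob
  blobs_lr: 1         # learning rate multiplier for the weights
  blobs_lr: 2         # learning rate multiplier for the bias
  weight_decay: 1     # decay multiplier for the weights
  weight_decay: 0     # decay multiplier for the bias
}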

@ducha-aiki
Contributor

> param, param_lr, param_decay

To me these are less meaningful than the current names.

@longjon
Contributor

longjon commented Jan 14, 2015

Re: naming:

  • blobs_lr is not very meaningful... blobs is overly generic, mispluralized, and lr is a terse abbreviation. param_lr is more consistent, but consider also: just lr, or learning_rate, or rate, or (since it's not the overall learning rate being specified) {rate/lr/learning_rate}_{coef/mult/multiplier}.
  • weight_decay has the advantage of being a recognizable bigram, but weight is inconsistent with param, and it applies to parameters that are not normally considered weights. Consider also: decay, or [param_]decay_{coef/mult/multiplier}.

@jeffdonahue
Contributor Author

Thanks for the well thought out suggestions @longjon. I had a thought on another option that would make the param/blob prefix redundant:

message LayerParameter {
  ...
  // One spec per parameter blob of the layer.
  repeated ParamSpec param = 5;
  ...
}

message ParamSpec {
  optional string name = 1;                       // name of this parameter blob
  optional float lr_mult = 2 [default = 1.0];     // learning rate multiplier
  optional float decay_mult = 3 [default = 1.0];  // weight decay multiplier
  ...
}

I think having a separate ParamSpec message (suggestions for better names than ParamSpec are certainly welcome) would reduce bookkeeping in code (e.g., checks that the length is 0 or N for each field) and the need to specify default values for prototxt users. Agree/disagree?
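
For a sense of how this would read, a sketch of a prototxt using the proposed ParamSpec (names and multiplier values are illustrative):

layer {
  name: "conv1"
  type: "Convolution"
  # One ParamSpec per parameter blob; omitted multipliers fall back to
  # the defaults, so most layers needn't specify them at all.
  param { name: "conv1_w" lr_mult: 1 decay_mult: 1 }
  param { name: "conv1_b" lr_mult: 2 decay_mult: 0 }
}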

jeffdonahue force-pushed the layer-type-str branch 8 times, most recently from 07aa37e to 958e372 on January 15, 2015 at 08:15
@jeffdonahue
Contributor Author

Other than needing to run the updated automagic upgrade tool (upgrade_net_proto_text) on all the existing prototxts (which I don't particularly want to do yet, as it would only add hugely to the already review-discouraging +/- line count...), this is roughly ready for review. I went ahead and did the ParamSpec thing in my comment above (c647d9c), but I can switch it back if people don't like it (though I'll probably still try to unify the blobs_lr, weight_decay, etc. names a bit). Feedback is welcome (@longjon @shelhamer and anyone else).

@longjon
Contributor

longjon commented Jan 27, 2015

Took a pass, just the one issue noted. The upgrade code is nasty, but maybe it has to be that way, and it's not forever...

ParamSpec looks good to me. I'm happy to merge this soon if there is general agreement.

@jeffdonahue
Contributor Author

Thanks for looking over this @longjon!

> Aren't these names now stored in param_? Why not use a single implementation of this function for all layers? (I suppose I can imagine a future in which layers are initialized with something other than a LayerParameter, and this kind of introspective information is useful, however...)

Yeah, I realized this was kind of weird after doing it... But I rely on it for the CreateLayer (cb50d6d) and layer type upgrade (573e8c7) unit tests, which test that LayerFactory actually creates a layer of the type requested. (Without the explicit type implementation, we would have to settle for checking that the layer's LayerParameter is set with the correct type, which is weaker than checking that the Layer is of the correct subclass.) Maybe there's another way to do this that I didn't think of, or it's not a useful enough check to justify the overhead of having to write the per-class type()?

jeffdonahue merged commit 9e28b34 into BVLC:dev on Feb 6, 2015
@jeffdonahue
Contributor Author

Merged after discussion with @longjon and @shelhamer.


jeffdonahue deleted the layer-type-str branch on February 6, 2015 at 00:12
jsupancic added a commit to jsupancic/caffe-1 that referenced this pull request Feb 12, 2015
Introduced by Layer type is a string BVLC#1694
shelhamer pushed a commit that referenced this pull request Feb 16, 2015
Introduced by Layer type is a string #1694
jeffdonahue added a commit to jeffdonahue/caffe that referenced this pull request Mar 26, 2015
jeffdonahue added a commit to jeffdonahue/caffe that referenced this pull request Mar 26, 2015
jeffdonahue added a commit to jeffdonahue/caffe that referenced this pull request Mar 26, 2015
jeffdonahue added a commit to jeffdonahue/caffe that referenced this pull request May 15, 2015
shelhamer added a commit that referenced this pull request May 15, 2015
Update docs for ND blobs (#1970) and layer type is a string (#1694)
myfavouritekk added a commit to myfavouritekk/caffe that referenced this pull request May 15, 2015
matthiasplappert pushed a commit to matthiasplappert/caffe that referenced this pull request Aug 10, 2015
cbfinn pushed a commit to cbfinn/caffe that referenced this pull request Aug 12, 2015
wangyida pushed a commit to wangyida/caffe that referenced this pull request Sep 22, 2015