Refactor tensorflow import backend #499
Conversation
@haabhi Nice effort, can you address the comments
if activation_op.type == 'Relu':
    layer_type = 'ReLU'

if activation_op.type == 'LeakyRelu':
Why is this not in an elif?
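The reviewer's point can be sketched as follows: an op has exactly one type, so chaining the checks with `elif` guarantees at most one branch runs. This is a hypothetical, simplified version of the mapping under review, not the PR's actual code.

```python
# Simplified sketch of the activation-type mapping, with the
# independent `if` checks chained into one if/elif ladder as suggested.
def map_activation(op_type):
    if op_type == 'Relu':
        layer_type = 'ReLU'
    elif op_type == 'LeakyRelu':
        layer_type = 'LeakyReLU'
    elif op_type == 'Elu':
        layer_type = 'ELU'
    else:
        # Many TensorFlow op type strings already match the layer name.
        layer_type = op_type
    return layer_type
```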
    layer_params['alpha'] = 1
    layer_type = 'ELU'

elif activation_op.type == 'Softplus':
For all layers below this, can't we directly assign activation_op.type?
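One way to read "directly assign activation_op.type": keep a small rename table for the few op types whose layer name differs, and fall back to the op type itself for everything else. A hypothetical sketch, not the PR's code:

```python
# Only op types whose layer name differs need an explicit entry; all
# others (Softplus, Softsign, Sigmoid, ...) can reuse the op type
# string directly.
RENAMED_TYPES = {'Relu': 'ReLU', 'LeakyRelu': 'LeakyReLU', 'Elu': 'ELU'}

def layer_type_for(op_type):
    # Use the op type itself unless it needs renaming.
    return RENAMED_TYPES.get(op_type, op_type)
```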
conv2d_op = next((x for x in layer_ops if x.type == 'Conv2D'), None)

layer_params = {}
Remove newline
def import_conv2d(layer_ops):

    conv2d_op = next((x for x in layer_ops if x.type == 'Conv2D'), None)
Remove newline
def import_conv3d(layer_ops):
It would be good to have comments in every method
def import_deconvolution(layer_ops):
    deconv_op = next((x for x in layer_ops if x.type == 'Conv2DBackpropInput'), None)
Remove unnecessary newlines
layer_params = {}

if '3D' in depthwise_conv_op.type:
    return
We should throw some sort of error in such cases so that the user knows what the issue was.
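A hedged sketch of what that could look like — the helper name, exception type, and message are suggestions, not the PR's code:

```python
def check_importable(op_type):
    # Raise instead of silently returning None, so the caller can
    # report the unsupported layer to the user.
    if '3D' in op_type:
        raise ValueError(
            'Cannot import op %r: 3D depthwise convolution is not supported'
            % op_type)
```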
def import_batchnorm(layer_ops):
    layer_params = {}
Please follow a consistent style for newlines, i.e. add newlines according to some style guideline.
Force-pushed from d9a39bd to c3c7386
@haabhi while merging we will squash the commits anyway, so if all your changes are present in the PR then it shouldn't be an issue.
Force-pushed from 99abc64 to 8a23946
@Ram81 Made the changes. I have added docstrings and normal comments where things were not very clear. As for newlines, I have deleted many of them, grouped related lines tightly, and separated the groups with single blank lines.
@Ram81 Updated the layer addition tutorial to match these changes.
@haabhi nice work, can you please make the requested changes
def get_input_layers(layer_ops):
    ''' return the name of the layers directly preceeding the layer of layer_ops.
Can you move the comment to a new line?
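For reference, the requested style puts the summary on the line after the opening quotes. A minimal illustration (the wording of the docstring is the snippet's own, with the typo left aside; the empty body is only so the sketch runs stand-alone):

```python
def get_input_layers(layer_ops):
    '''
    Return the names of the layers directly preceding the layer of layer_ops.

    layer_ops is a list of all ops of the layer we want the inputs of.
    '''
```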
''' return the name of the layers directly preceeding the layer of layer_ops.
layer_ops is a list of all ops of the layer we want the inputs of.
'''
ret = []
Please rename the variable
placeholder_op = layer_ops[0]
layer_params = {}
layer_dim = [int(dim.size) for dim in placeholder_op.get_attr('shape').dim]
# make batch size 1 if it is -1
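The placeholder-import step quoted above can be sketched in isolation. Here `dim_sizes` stands in for the `[dim.size for dim in shape.dim]` values read from the placeholder's shape attribute; the helper name is hypothetical:

```python
def normalize_placeholder_shape(dim_sizes):
    # TensorFlow uses -1 for an unknown batch dimension; the importer
    # pins it to 1 so downstream layers get a concrete shape.
    layer_dim = [int(size) for size in dim_sizes]
    if layer_dim and layer_dim[0] == -1:
        layer_dim[0] = 1  # make batch size 1 if it is -1
    return layer_dim
```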
Add newlines where necessary
- Create a function named import_<layer_name> that takes one parameter, layer_ops, a list of all ops in the layer being imported.
- Get layer parameters from the operations in layer_ops and build a dictionary mapping parameter names to values.
- Get a list of input layers to the layer being processed using `get_input_layers(layer_ops)`.
- Create and return a json layer for your layer, calling `jsonLayer` with your layer_type, parameters, input layers and output_layers (optional).
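The four steps above can be sketched as a minimal importer. The `get_input_layers` and `jsonLayer` stubs and the `FakeOp` class exist only to make the sketch self-contained; their real counterparts live in the tutorial's codebase and behave differently:

```python
def get_input_layers(layer_ops):
    # Stub for illustration; the real helper inspects op inputs.
    return ['data']

def jsonLayer(layer_type, params, inputs, outputs=None):
    # Stub for illustration; the real helper builds the frontend format.
    return {'info': {'type': layer_type}, 'params': params,
            'input': inputs, 'output': outputs or []}

class FakeOp(object):
    # Hypothetical stand-in for a TensorFlow op with just a type field.
    def __init__(self, op_type):
        self.type = op_type

def import_relu(layer_ops):
    # Step 1: the function takes the list of ops in this layer.
    # Step 2: collect layer parameters into a dict (ReLU has none).
    layer_params = {}
    # Step 3: find the layers feeding into this one.
    input_layers = get_input_layers(layer_ops)
    # Step 4: build and return the json layer.
    return jsonLayer('ReLU', layer_params, input_layers)
```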
Change this to "Create and return a json layer for the new layer" — replace "your" with "new" in the statement.
LGTM
* Refactor tensorflow import code
* Remove unnecessary newlines and make code concise
* Update documentation for adding new layers to tensorflow import
* Minor changes
This PR provides a clean, modular code base for tensorflow import. I have split the import logic into two files, much like the keras and caffe imports.
All the examples inside example/tensorflow work exactly as they used to.