
Layer factory #21

Merged · 62 commits · Nov 5, 2018

Commits
cc567e4
Remove layer factory and use symbol.load_json
jnkm Sep 14, 2018
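The commit above replaces the bespoke layer factory with MXNet's `symbol.load_json`: the network is serialized to symbol JSON, edited as a plain dict of nodes, and reloaded. A minimal pure-Python sketch of that idea, with no MXNet dependency — the node layout mimics MXNet's symbol JSON, but the fields shown and the layer names are illustrative, not xfer's actual code:

```python
import json

# A miniature stand-in for an MXNet symbol JSON string, as produced by
# Symbol.tojson(). Real symbol dicts carry more fields (attrs, arg_nodes,
# node_row_ptr); this subset is illustrative only.
symbol_json = json.dumps({
    "nodes": [
        {"op": "null", "name": "data", "inputs": []},
        {"op": "FullyConnected", "name": "fc1", "inputs": [[0, 0]]},
        {"op": "SoftmaxOutput", "name": "softmax", "inputs": [[1, 0]]},
    ],
    "heads": [[2, 0]],
})

def layer_names(sym_json):
    """Return the operator layers (non-'null' nodes) in a symbol JSON string."""
    nodes = json.loads(sym_json)["nodes"]
    return [node["name"] for node in nodes if node["op"] != "null"]

print(layer_names(symbol_json))  # ['fc1', 'softmax']
```

With the graph available as JSON, layer edits become list operations on `nodes` followed by re-serialization, which is what makes a separate layer-factory abstraction redundant.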
1837ff1
Fix merge conflicts
jnkm Sep 14, 2018
b277960
Merge branch 'develop' of https://github.com/amzn/xfer into layer-fac…
jnkm Sep 20, 2018
1ba9f96
Print symbol dict
jnkm Sep 24, 2018
18b516a
Fix arg_nodes bug on drop_layer_bottom
jnkm Sep 24, 2018
cf96729
Replace layer_factory with mx.sym
jnkm Sep 24, 2018
acfb394
Replace layer_factory with mx.sym
jnkm Sep 24, 2018
01ba5d1
Replace references to layer factory and remove print statements
jnkm Sep 24, 2018
4aec078
Merge branch 'develop' into layer-factory
jnkm Sep 25, 2018
a7be424
Merge branch 'develop' into layer-factory
jnkm Sep 25, 2018
d53b636
Add download_resnet method
jnkm Oct 1, 2018
593a38b
Add support for drop_bottom_layer when two layers have data as input
jnkm Oct 2, 2018
1477742
Add support for n drops with one command in drop_layer_bottom
jnkm Oct 2, 2018
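Supporting n drops in one command amounts to repeating a single bottom-layer drop n times, rewiring each dropped layer's consumers back to the data node and shifting the remaining input indices. A hypothetical sketch under that assumption (node layout mimics MXNet's symbol dict; this is not xfer's actual implementation):

```python
def drop_layer_bottom(nodes, n=1):
    """Drop the bottom n operator layers from a symbol-dict node list."""
    nodes = [dict(node) for node in nodes]  # work on a copy
    for _ in range(n):
        # Index of the first non-input (operator) node: the current bottom layer.
        bottom = next(i for i, node in enumerate(nodes) if node["op"] != "null")
        del nodes[bottom]
        for node in nodes:
            # Rewire consumers of the dropped layer to the data node (index 0)
            # and shift indices that pointed past the removed node.
            node["inputs"] = [
                [0, 0] if i == bottom else [i - 1 if i > bottom else i, 0]
                for i, _ in node["inputs"]
            ]
    return nodes

nodes = [
    {"op": "null", "name": "data", "inputs": []},
    {"op": "Convolution", "name": "conv1", "inputs": [[0, 0]]},
    {"op": "FullyConnected", "name": "fc1", "inputs": [[1, 0]]},
    {"op": "SoftmaxOutput", "name": "softmax", "inputs": [[2, 0]]},
]
trimmed = drop_layer_bottom(nodes, n=2)
print([node["name"] for node in trimmed])  # ['data', 'softmax']
```

The later commits in this log (the "two layers have data as input" and "multiplayer drop" fixes) address exactly the cases this toy version glosses over: graphs where more than one node consumes the data node directly.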
0c424f1
Fix multilayer drop bugs in drop_bottom_layer
jnkm Oct 2, 2018
58e2c22
Add tests for split models
jnkm Oct 8, 2018
c96c4a0
Implement drop_layer_top for split model output
jnkm Oct 8, 2018
8758049
Merge branch 'develop' into layer-factory
jnkm Oct 8, 2018
fabb5c9
Remove whitespace and print statements
jnkm Oct 8, 2018
a390763
Merge branch 'layer-factory' of https://github.com/amzn/xfer into lay…
jnkm Oct 8, 2018
670b933
Add resnet test
jnkm Oct 8, 2018
8a08ed0
Remove prints and update comments
jnkm Oct 8, 2018
8d552b3
Add tests for updated MH
jnkm Oct 8, 2018
eb97277
Update test_update_inputs
jnkm Oct 9, 2018
f323a11
Clean add/drop layers
jnkm Oct 9, 2018
ff4672c
Add docstring and add support for case with two join layers
jnkm Oct 9, 2018
290a5a3
Remove prints
jnkm Oct 9, 2018
98e0d4b
Fix merge conflicts from develop
jnkm Oct 10, 2018
9580f94
Update ambiguous layer drop error message
jnkm Oct 10, 2018
82614b8
Abstract away disambiguating layer and removing join layer
jnkm Oct 10, 2018
d9a6b3f
Remove redundant methods
jnkm Oct 11, 2018
4a54ee5
Refactor get_join_idx
jnkm Oct 11, 2018
f02e2e1
Refactor update_inputs
jnkm Oct 11, 2018
92f0602
Refactor remove_join_layer_if_redundant
jnkm Oct 11, 2018
aec79fc
Add some missing tests and branch_to_keep->keep_branch_names
jnkm Oct 11, 2018
eebd3b8
Add get_layer_node_idx method
jnkm Oct 11, 2018
0520a20
Remove use of LayerType in get_layers_matching_type()
jnkm Oct 11, 2018
3f2275e
Handle input in form [i,0]
jnkm Oct 11, 2018
b05415d
Add comment about input form
jnkm Oct 11, 2018
98662a6
Add comment about head form
jnkm Oct 11, 2018
62f7182
Update LayerType in demos
jnkm Oct 11, 2018
e9ad6fc
Merge branch 'develop' into layer-factory
jnkm Oct 16, 2018
c84dd78
Remove references to Layer Factory
jnkm Oct 17, 2018
cee78d8
Use get_symbol_dict in add_layer_bottom
palindromik Oct 18, 2018
3b81805
Added argument names in ambiguous cases and added missing comments
jnkm Oct 18, 2018
cdbc545
Merge branch 'layer-factory' of https://github.com/amzn/xfer into lay…
jnkm Oct 18, 2018
b1c154c
Remove for loop from add_layer_bottom
jnkm Oct 29, 2018
8069fae
Concatenate nodes and then add to network in add_layer_top
jnkm Oct 29, 2018
4ee9e20
Add comments to add_layer
jnkm Oct 29, 2018
1028007
Remove debug code
jnkm Oct 29, 2018
3dbbd0b
Update xfer/model_handler/model_handler.py
palindromik Oct 29, 2018
748fb83
Update xfer/model_handler/model_handler.py
palindromik Oct 29, 2018
1f8e7f8
Update xfer/model_handler/model_handler.py
palindromik Oct 29, 2018
f3237ff
Update xfer/model_handler/model_handler.py
palindromik Oct 29, 2018
4230a05
Update xfer/model_handler/model_handler.py
palindromik Oct 29, 2018
27db2a5
ip->node_input
jnkm Oct 29, 2018
19633bb
Merge branch 'layer-factory' of https://github.com/amzn/xfer into lay…
jnkm Oct 29, 2018
15356e8
Do not pass data node to be input shifted
jnkm Oct 29, 2018
2b6d8ad
Update xfer/model_handler/model_handler.py
palindromik Oct 29, 2018
1a24409
Address reviewer comments
jnkm Oct 29, 2018
b1f717b
Merge branch 'layer-factory' of https://github.com/amzn/xfer into lay…
jnkm Oct 29, 2018
005dba7
Remove branching support
jnkm Oct 30, 2018
e9a55b6
Move symbol attribute updates out of for loop
jnkm Nov 1, 2018
99 changes: 0 additions & 99 deletions docs/add_to_layer_factory.rst

This file was deleted.

1 change: 0 additions & 1 deletion docs/api.rst
@@ -50,6 +50,5 @@ Model Handler

 .. Add/replace module names you want documented here
    xfer.model_handler.ModelHandler
-   xfer.model_handler.layer_factory
    xfer.model_handler.exceptions
    xfer.model_handler.consts
23 changes: 11 additions & 12 deletions docs/demos/xfer-custom-repurposers.ipynb
@@ -347,8 +347,8 @@
 " precision recall f1-score support\n",
 "\n",
 " 0 1.00 0.50 0.67 2\n",
-" 1 1.00 1.00 1.00 2\n",
-" 2 0.67 1.00 0.80 2\n",
+" 1 0.67 1.00 0.80 2\n",
+" 2 1.00 1.00 1.00 2\n",
 " 3 1.00 1.00 1.00 2\n",
 "\n",
 "avg / total 0.92 0.88 0.87 8\n",
@@ -417,14 +417,13 @@
 " model_handler.update_sym(target_symbol)\n",
 "\n",
 " # Add a fully connected layer (with nodes equal to number of target classes) and a softmax output layer on top\n",
-" fully_connected_layer1 = xfer.model_handler.layer_factory.FullyConnected(num_hidden=self.num_nodes, name='fc_rep')\n",
-" fully_connected_layer2 = xfer.model_handler.layer_factory.FullyConnected(num_hidden=self.target_class_count,\n",
-" name='fc_from_fine_tune_repurposer')\n",
-" softmax_output_layer = xfer.model_handler.layer_factory.SoftmaxOutput(name=train_iterator.provide_label[0][0].replace('_label', ''))\n",
+" fully_connected_layer1 = mx.sym.FullyConnected(num_hidden=self.num_nodes, name='fc_rep')\n",
+" fully_connected_layer2 = mx.sym.FullyConnected(num_hidden=self.target_class_count, name='fc_from_fine_tune_repurposer')\n",
+" softmax_output_layer = mx.sym.SoftmaxOutput(name=train_iterator.provide_label[0][0].replace('_label', ''))\n",
 " model_handler.add_layer_top([fully_connected_layer1, fully_connected_layer2, softmax_output_layer])\n",
 "\n",
 " # Get fixed layers\n",
-" conv_layer_names = model_handler.get_layer_names_matching_type(xfer.model_handler.consts.LayerType.CONVOLUTION)\n",
+" conv_layer_names = model_handler.get_layer_names_matching_type('Convolution')\n",
 " conv_layer_params = model_handler.get_layer_parameters(conv_layer_names)\n",
 " \n",
 " # Create and return target mxnet module using the new symbol and params\n",
@@ -488,10 +487,10 @@
 "\n",
 " 0 1.00 0.50 0.67 2\n",
 " 1 1.00 1.00 1.00 2\n",
-" 2 0.50 1.00 0.67 2\n",
-" 3 1.00 0.50 0.67 2\n",
+" 2 1.00 1.00 1.00 2\n",
+" 3 0.67 1.00 0.80 2\n",
 "\n",
-"avg / total 0.88 0.75 0.75 8\n",
+"avg / total 0.92 0.88 0.87 8\n",
 "\n"
]
}
@@ -503,7 +502,7 @@
 ],
 "metadata": {
 "kernelspec": {
-"display_name": "Python 3",
+"display_name": "Python [default]",
 "language": "python",
 "name": "python3"
 },
@@ -517,7 +516,7 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.5.4"
+"version": "3.6.1"
}
},
"nbformat": 4,
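As the notebook diff above shows, callers now pass the raw MXNet operator name (e.g. `'Convolution'`) to `get_layer_names_matching_type` instead of a `LayerType` enum. A hypothetical stand-in for that lookup over a symbol dict — the function body and the layer names are illustrative, not xfer's real code:

```python
def get_layer_names_matching_type(symbol_dict, op_name):
    """Return names of layers whose operator matches the given MXNet op name."""
    return [node["name"] for node in symbol_dict["nodes"] if node["op"] == op_name]

# A miniature symbol dict mimicking MXNet's JSON graph layout.
symbol_dict = {
    "nodes": [
        {"op": "null", "name": "data"},
        {"op": "Convolution", "name": "conv1"},
        {"op": "Convolution", "name": "conv2"},
        {"op": "FullyConnected", "name": "fc1"},
    ]
}
print(get_layer_names_matching_type(symbol_dict, "Convolution"))  # ['conv1', 'conv2']
```

Matching on the op-name string keeps the handler open to any MXNet operator, whereas the old enum only covered the layer types the factory happened to wrap.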