New model encoding version #749
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Conversation
… of the result array, change tests accordingly.
Codecov Report

    @@            Coverage Diff             @@
    ##           master     #749      +/-   ##
    ==========================================
    - Coverage   79.38%   78.82%   -0.57%
    ==========================================
      Files          48       49       +1
      Lines        7476     7650     +174
    ==========================================
    + Hits         5935     6030      +95
    - Misses       1541     1620      +79

Continue to review the full report at Codecov.
tests/flow/test_serializations.py (outdated)

    tf_model = load_file_content('graph.pb')
    con.execute_command('AI.MODELSTORE', key_name, 'TF', 'CPU', 'TAG', "TF_GRAPH_V2", 'BATCHSIZE', 4,
                        'MINBATCHSIZE', 2, 'MINBATCHTIMEOUT', 1000, 'INPUTS', 2, 'a', 'b', 'OUTPUTS', 1, 'mul',
                        'BLOB', tf_model)

Review comment: you don't need to do this, you just need to load the RDB.
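To make the command structure above concrete, here is a small, hypothetical sketch that parses an AI.MODELSTORE argument list (as used in the test) into a dict of model fields. The keyword names follow the RedisAI command syntax shown in the test; the parser itself is illustrative and is not RedisAI code.

```python
def parse_modelstore_args(args):
    """Parse [key, backend, device, <options...>] into a field dict.

    Illustrative only: mirrors the AI.MODELSTORE keywords used in the
    test above (TAG, BATCHSIZE, MINBATCHSIZE, MINBATCHTIMEOUT,
    INPUTS, OUTPUTS, BLOB)."""
    fields = {"key": args[0], "backend": args[1], "device": args[2]}
    i = 3
    while i < len(args):
        kw = args[i]
        if kw in ("TAG", "BLOB"):
            # single-value options
            fields[kw.lower()] = args[i + 1]
            i += 2
        elif kw in ("BATCHSIZE", "MINBATCHSIZE", "MINBATCHTIMEOUT"):
            # integer-valued options
            fields[kw.lower()] = int(args[i + 1])
            i += 2
        elif kw in ("INPUTS", "OUTPUTS"):
            # a count followed by that many names
            count = int(args[i + 1])
            fields[kw.lower()] = args[i + 2:i + 2 + count]
            i += 2 + count
        else:
            raise ValueError(f"unexpected keyword: {kw}")
    return fields

# The same argument list as in the TF test case above.
args = ["m{1}", "TF", "CPU", "TAG", "TF_GRAPH_V2", "BATCHSIZE", "4",
        "MINBATCHSIZE", "2", "MINBATCHTIMEOUT", "1000",
        "INPUTS", "2", "a", "b", "OUTPUTS", "1", "mul", "BLOB", b"..."]
fields = parse_modelstore_args(args)
print(fields["minbatchtimeout"])  # -> 1000
```

This makes it easy to see which field the v2 encoding adds: MINBATCHTIMEOUT sits alongside the existing batching options.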
tests/flow/test_serializations.py (outdated)

    torch_model = load_file_content('pt-minimal.pt')
    con.execute_command('AI.MODELSTORE', key_name, 'TORCH', 'CPU', 'TAG', "PT_MINIMAL_V2", 'BATCHSIZE', 4,
                        'MINBATCHSIZE', 2, 'MINBATCHTIMEOUT', 1000, 'BLOB', torch_model)

Review comment: you don't need to do this, you just need to load the RDB.
tests/flow/test_serializations.py (outdated)

    onnx_model = load_file_content('linear_iris.onnx')
    con.execute_command('AI.MODELSTORE', key_name, 'ONNX', 'CPU', 'TAG', "ONNX_LINEAR_IRIS_V2", 'BATCHSIZE', 4,

Review comment: you don't need to do this, you just need to load the RDB.
Introduce a new version of RDB encoding, v2. The change is in the RAI_Model type, where we now save to the RDB the minbatchtimeout of a model (in addition to the rest of the model fields). Models that were encoded with v1 will be decoded with the v1 "old" decode callback. The AI.MODELGET (META) command now returns the value of the model's minbatchtimeout field at the end of the meta-data array.
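The versioned-decode dispatch described above can be sketched as follows. This is a minimal, hypothetical illustration of the pattern (decoders keyed by encoding version, with the v1 callback defaulting the missing field), not RedisAI's internal C implementation; all names here are made up.

```python
def decode_model_v1(payload):
    # v1 layout predates minbatchtimeout; default it to 0 on load
    # so newer code can rely on the field being present.
    model = dict(payload)
    model["minbatchtimeout"] = 0
    return model

def decode_model_v2(payload):
    # v2 layout: minbatchtimeout is part of the serialized fields.
    return dict(payload)

# Dispatch table: each RDB encoding version keeps its own decode callback,
# so old RDB files remain loadable after the format changes.
DECODERS = {1: decode_model_v1, 2: decode_model_v2}

def decode_model(version, payload):
    return DECODERS[version](payload)

old = decode_model(1, {"backend": "TF", "batchsize": 4, "minbatchsize": 2})
new = decode_model(2, {"backend": "TF", "batchsize": 4, "minbatchsize": 2,
                       "minbatchtimeout": 1000})
print(old["minbatchtimeout"], new["minbatchtimeout"])  # -> 0 1000
```

Keeping the old callback around (rather than migrating v1 payloads in place) is what lets RDB files written before this PR load unchanged.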