added resnet50 with mlmodelscope #15
Conversation
@cli99 Cheng, I am going to close this and not merge the code into the code base, as we are all set on the models. Sound good?
I think this should go to the optional harness directory per previous discussions? What do you think?
@abduld I think it could land to
No, we support the TF/PT/ONNX/Caffe/Caffe2/MXNet/... ones.
The official TF/PT/ONNX models are not part of this PR, hence my question. I would suggest opening a new PR for these, but I defer to @jveejay here.
@jveejay Hi Vijay, can you please review this again? We have other MLPerf models integrated with MLModelScope and would like to offer them as an option for others to run MLPerf models. This is just the first model we'd like to try with this process. Thanks.
I don't see the official models being used either. This PR is from December; I don't think we had the official models at that time. Also, I see no loadgen being used.
Thanks. We will punt this to the 0.6 release. |
Closing this one - you must use the reference models.