[2024-01-08 17:38:03,398 log_parser.py:50 INFO] Sucessfully loaded MLPerf log from open/MLCommons/results/default-reference-cpu-onnxruntime-v1.16.3-default_config/bert-99/offline/accuracy/mlperf_log_detail.txt.
[2024-01-08 17:38:03,401 log_parser.py:50 INFO] Sucessfully loaded MLPerf log from open/MLCommons/results/default-reference-cpu-onnxruntime-v1.16.3-default_config/bert-99/offline/performance/run_1/mlperf_log_detail.txt.
[2024-01-08 17:38:03,403 log_parser.py:50 INFO] Sucessfully loaded MLPerf log from open/MLCommons/results/default-reference-cpu-onnxruntime-v1.16.3-default_config/bert-99/offline/performance/run_1/mlperf_log_detail.txt.
[2024-01-08 17:38:03,403 submission_checker.py:1803 INFO] Target latency: None, Latency: 15192000764, Scenario: Offline
[2024-01-08 17:38:03,404 submission_checker.py:1834 ERROR] open/MLCommons/results/default-reference-cpu-onnxruntime-v1.16.3-default_config/bert-99/offline/performance/run_1/mlperf_log_detail.txt Required minimum samples per query not met by user config, Expected=24576, Found=10
[2024-01-08 17:38:03,404 submission_checker.py:1849 ERROR] open/MLCommons/results/default-reference-cpu-onnxruntime-v1.16.3-default_config/bert-99/offline/performance/run_1/mlperf_log_detail.txt Test duration less than 600s in user config. expected=600000, found=0
[2024-01-08 17:38:03,404 submission_checker.py:2781 ERROR] open/MLCommons/results/default-reference-cpu-onnxruntime-v1.16.3-default_config/bert-99/offline/performance/run_1 has issues
[2024-01-08 17:38:03,404 submission_checker.py:3304 INFO] ---
[2024-01-08 17:38:03,404 submission_checker.py:3310 INFO] ---
[2024-01-08 17:38:03,404 submission_checker.py:3313 ERROR] NoResults open/MLCommons/results/default-reference-cpu-onnxruntime-v1.16.3-default_config/bert-99/offline
[2024-01-08 17:38:03,405 submission_checker.py:3313 ERROR] NoResults open/MLCommons/results/default-reference-cpu-pytorch-v2.2.0a0-default_config/bert-99/offline
[2024-01-08 17:38:03,405 submission_checker.py:3395 INFO] ---
[2024-01-08 17:38:03,405 submission_checker.py:3396 INFO] Results=0, NoResults=2, Power Results=0
[2024-01-08 17:38:03,405 submission_checker.py:3403 INFO] ---
[2024-01-08 17:38:03,405 submission_checker.py:3404 INFO] Closed Results=0, Closed Power Results=0
[2024-01-08 17:38:03,405 submission_checker.py:3409 INFO] Open Results=0, Open Power Results=0
[2024-01-08 17:38:03,405 submission_checker.py:3414 INFO] Network Results=0, Network Power Results=0
[2024-01-08 17:38:03,405 submission_checker.py:3419 INFO] ---
[2024-01-08 17:38:03,405 submission_checker.py:3421 INFO] Systems=0, Power Systems=0
[2024-01-08 17:38:03,405 submission_checker.py:3422 INFO] Closed Systems=0, Closed Power Systems=0
[2024-01-08 17:38:03,405 submission_checker.py:3427 INFO] Open Systems=0, Open Power Systems=0
[2024-01-08 17:38:03,405 submission_checker.py:3432 INFO] Network Systems=0, Network Power Systems=0
[2024-01-08 17:38:03,405 submission_checker.py:3437 INFO] ---
[2024-01-08 17:38:03,405 submission_checker.py:3439 ERROR] SUMMARY: submission has errors
CM error: Portable CM script failed (name = run-mlperf-inference-submission-checker, return code = 256)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Note that it may be a portability issue of a third-party tool or a native script
wrapped and unified by this portable CM script. In such case, please report this issue
with a full log at "https://github.com/mlcommons/ck". The CM concept is to collaboratively
fix such issues inside portable CM scripts to make existing tools and native scripts
more portable, interoperable and deterministic. Thank you!
Hi @whk6688, can you please share the source this command is taken from? It is missing the --execution-mode=valid flag, which is needed for a valid submission run (the default is a test run). Also, the --target_qps option should be avoided, or the actual target_qps must be supplied during a valid run. (When we do a test run, target_qps should be determined automatically by the workflow and later reused in the valid run.)
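Putting that advice together with the command quoted later in this thread, a valid submission run would look roughly like the sketch below. This is only an illustration built from the flags already mentioned here: the key change is adding --execution-mode=valid and dropping the hard-coded --target_qps; exact tags and defaults depend on your CM workflow version.

```shell
# Sketch of a valid submission run (not a test run), per the advice above:
# add --execution-mode=valid and drop --target_qps=1 so the workflow can
# reuse the QPS measured during the earlier test run.
cmr "run mlperf inference generate-run-cmds _submission" \
    --submitter="MLCommons" \
    --hw_name=default \
    --model=bert-99 \
    --implementation=reference \
    --backend=pytorch \
    --device=cpu \
    --scenario=Offline \
    --adr.compiler.tags=gcc \
    --execution-mode=valid \
    --category=edge \
    --division=open
```

In valid mode the run must satisfy the submission-checker constraints seen in the log above (minimum samples per query and the 600 s minimum duration), which a test run does not attempt to meet.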
I found the command in the MLCommons project; I will send it to you if I find it again. Besides, I saw the performance report.
I ran this script:
cm run script --tags=generate-run-cmds,inference,_find-performance,_all-scenarios --model=bert-99 --implementation=reference --device=cpu --backend=onnxruntime --category=edge --division=open --quiet --rerun
SUT name : PySUT
Scenario : Offline
Mode : PerformanceOnly
Samples per second: 0.670674
Result is : VALID
Min duration satisfied : Yes
Min queries satisfied : Yes
Early stopping satisfied: Yes
Then I ran:
cmr "run mlperf inference generate-run-cmds _submission" --submitter="MLCommons" --hw_name=default --model=bert-99 --implementation=reference --backend=pytorch --device=cpu --scenario=Offline --adr.compiler.tags=gcc --target_qps=1 --category=edge --division=open
Output: the same submission-checker log and CM error as in the issue description above.