Hi team, the following issues were observed while working hands-on with AML on the onnxrt-aio:1.8.0 image. Please advise whether these can be ignored or whether a workaround is needed.
1. Error while executing git clone --recursive https://github.com/AmpereComputingAI/ampere_model_library.git
...
Receiving objects: 100% (559/559), 1.30 MiB | 45.85 MiB/s, done.
Resolving deltas: 100% (337/337), done.
Cloning into '/home/azureuser/ampere_model_library/text_to_image/stable_diffusion/stablediffusion'... git@github.com: Permission denied (publickey).
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
fatal: clone of 'git@github.com:AmpereComputingAI/stablediffusion.git' into submodule path '/home/azureuser/ampere_model_library/text_to_image/stable_diffusion/stablediffusion' failed
Failed to clone 'text_to_image/stable_diffusion/stablediffusion' a second time, aborting
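The publickey failure happens because the stablediffusion submodule is pinned to an SSH URL (git@github.com:...), which requires a registered SSH key. A common workaround, sketched below, is to have git rewrite SSH-style GitHub URLs to HTTPS before cloning; this assumes the submodule repo allows anonymous HTTPS access:

```shell
# Rewrite SSH-style GitHub URLs to HTTPS so submodule clones need no SSH key
git config --global url."https://github.com/".insteadOf "git@github.com:"

# Then retry the recursive clone (network access required):
# git clone --recursive https://github.com/AmpereComputingAI/ampere_model_library.git
```

With the insteadOf mapping in place, git resolves git@github.com:AmpereComputingAI/stablediffusion.git to its HTTPS equivalent when fetching submodules.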
2. Error while executing bash setup_deb.sh
...
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
test-tube 0.7.5 requires torch>=1.1.0, which is not installed.
streamlit 1.28.0 requires altair<6,>=4.0, which is not installed.
streamlit 1.28.0 requires blinker<2,>=1.0.0, which is not installed.
streamlit 1.28.0 requires gitpython!=3.1.19,<4,>=3.0.7, which is not installed.
streamlit 1.28.0 requires pydeck<1,>=0.8.0b4, which is not installed.
streamlit 1.28.0 requires tenacity<9,>=8.1.0, which is not installed.
streamlit 1.28.0 requires toml<2,>=0.10.1, which is not installed.
streamlit 1.28.0 requires tzlocal<6,>=1.1, which is not installed.
streamlit 1.28.0 requires validators<1,>=0.2, which is not installed.
streamlit 1.28.0 requires watchdog>=2.1.5; platform_system != "Darwin", which is not installed.
...
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
ultralytics 8.0.75 requires seaborn>=0.11.0, which is not installed.
ultralytics 8.0.75 requires sentry-sdk, which is not installed.
ultralytics 8.0.75 requires torchvision>=0.8.1, which is not installed.
open-clip-torch 2.7.0 requires torchvision, which is not installed.
nnunet 1.7.0 requires dicom2nifti, which is not installed.
nnunet 1.7.0 requires sklearn, which is not installed.
batchgenerators 0.21 requires unittest2, which is not installed.
pytorch-lightning 1.9.1 requires torchmetrics>=0.7.0, but you have torchmetrics 0.6.0 which is incompatible.
nnunet 1.7.0 requires batchgenerators>=0.23, but you have batchgenerators 0.21 which is incompatible.
...
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
ultralytics 8.0.75 requires seaborn>=0.11.0, which is not installed.
ultralytics 8.0.75 requires sentry-sdk, which is not installed.
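The remaining conflicts can be re-listed after setup with pip's own checker. The commented install line is a hypothetical fix for the two packages reported as incompatible (rather than merely missing), using the minimum versions the error messages ask for; verify against setup_deb.sh before applying, since the script may pin these versions deliberately:

```shell
# Re-list all dependency conflicts pip's resolver reported (non-zero exit if any)
python3 -m pip check || true

# Hypothetical fix for the two "incompatible" packages from the log above:
# python3 -m pip install "torchmetrics>=0.7.0" "batchgenerators>=0.23"
```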
3. Attempt to get the default ONNX fp32 result:
# AIO_PROCESS_MODE=0 OMP_NUM_THREADS=16 python3 run.py -m resnet50_v1.onnx -p fp32 -f ort
FAIL: this model seems to be unsupported in a specified precision: fp32
It looks like the ort backend supports fp16 only in AML? https://github.com/AmpereComputingAI/ampere_model_library/blob/main/computer_vision/classification/resnet_50_v15/run.py#L187
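Given that precision check, one sketch of a retry is the same invocation with -p fp16 (guarded, since run.py only exists inside the resnet_50_v15 directory of an AML checkout; the model file itself may also need to be the fp16 export linked under item 6 rather than the fp32 zenodo model):

```shell
# Retry the benchmark requesting fp16, the precision the linked run.py
# appears to accept for the ort backend; a sketch only.
if [ -f run.py ]; then
  AIO_PROCESS_MODE=0 OMP_NUM_THREADS=16 python3 run.py -m resnet50_v1.onnx -p fp16 -f ort
else
  echo "run.py not found: run this from computer_vision/classification/resnet_50_v15"
fi
```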
4. Attempt to get the default ONNX fp16 result:
5. Attempt to get the onnx-aio fp16 result:
The ONNX model used is https://zenodo.org/record/2592612/files/resnet50_v1.onnx
6. Try downloading the ONNX Runtime model in fp16 precision described in README.md:
https://www.dropbox.com/s/r80ndhbht7tixn5/resnet_50_v1.5_fp16.onnx