Description
Describe the issue
We're trying to get onnxruntime 1.22.0 running on an NVIDIA Jetson Orin NX using Balena OS, which is Docker-based. We are using the Python interface to onnxruntime.
We cannot even import the module when running on the Jetson. However, the same setup on an x86_64 PC works fine.
To reproduce
I've been able to strip my Dockerfile down to this minimal example:
FROM ubuntu:noble
WORKDIR /usr/src
RUN apt-get update && apt-get install -y \
python3-venv
RUN python3 -m venv venv
RUN venv/bin/pip install \
numpy==2.3.0 \
onnxruntime==1.22.0 \
opencv-python-headless==4.11.0.86
CMD ["./venv/bin/python"]
This can be built with e.g. docker build --tag prove-it .
This installs a few packages into a Python virtual environment and then starts the Python REPL.
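For reference, a quick check inside that REPL (a sketch, not included in the transcripts below) can confirm which interpreter and architecture each container is actually running:

import platform
import sys

# Interpreter version and machine architecture:
# expected x86_64 on the PC and aarch64 on the Jetson.
print(sys.version)
print(platform.machine())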
If I run this on my x86_64 Linux PC, it works as expected, and I can import onnxruntime:
$ docker run -it --rm prove-it
Python 3.12.3 (main, May 26 2025, 18:50:19) [GCC 13.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import onnxruntime
>>>
When I run this on the Jetson, I cannot import onnxruntime:
Python 3.12.3 (main, May 26 2025, 18:50:19) [GCC 13.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import onnxruntime
/opt/rh/gcc-toolset-14/root/usr/include/c++/14/bits/stl_vector.h:1130: std::vector<_Tp, _Alloc>::reference std::vector<_Tp, _Alloc>::operator[](size_type) [with _Tp = unsigned int; _Alloc = std::allocator<unsigned int>; reference = unsigned int&; size_type = long unsigned int]: Assertion '__n < this->size()' failed.
Aborted
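To narrow down where the abort happens, one thing I can try is enabling faulthandler and importing the compiled extension directly (a sketch; the module path onnxruntime.capi.onnxruntime_pybind11_state is my assumption of where the wheel's pybind11 extension lives, and faulthandler should dump the Python traceback when SIGABRT is raised):

import faulthandler
faulthandler.enable()  # dump the Python traceback on SIGABRT/SIGSEGV

# If the assertion fires here, the crash occurs while the native
# library is being loaded, before any session is created.
import onnxruntime.capi.onnxruntime_pybind11_state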
Running pip freeze on both systems shows that the same package versions are installed:
# ./venv/bin/pip freeze
coloredlogs==15.0.1
flatbuffers==25.2.10
humanfriendly==10.0
mpmath==1.3.0
numpy==2.3.0
onnxruntime==1.22.0
opencv-python-headless==4.11.0.86
packaging==25.0
protobuf==6.31.1
sympy==1.14.0
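To rule out a wheel mismatch, I can also compare what pip recorded for the installed onnxruntime wheel on each machine (a sketch using importlib.metadata; I'm assuming the WHEEL metadata file in the .dist-info directory records the platform tag):

import platform
from importlib import metadata

# The WHEEL metadata should show the tag the installed wheel was built
# for (e.g. a manylinux x86_64 tag vs. an aarch64 tag).
dist = metadata.distribution("onnxruntime")
print(platform.machine())
print(dist.version)
print(dist.read_text("WHEEL"))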
I'm not sure where to go from here, or why the behaviour differs on ARM64 compared to x86_64.
Similar errors have been reported (e.g. #24473), but only when setting up an inference session; in my case the module cannot even be imported.
Urgency
No response
Platform
Linux
OS Version
Ubuntu 22.04 on desktop, Balena OS 6.4.0 on Jetson
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.22.0
ONNX Runtime API
Python
Architecture
ARM64
Execution Provider
Other / Unknown
Execution Provider Library Version
No response