Unable to use inference server: Illegal instruction #1120
Comments
Hi there, thanks for raising the issue. The problem is likely not with your setup, but with the inference server itself. I must admit I've never attempted to run it on a Raspberry Pi, so I have no clue what may be wrong. It may not be that easy, but could you try to build the image from the repository source using the following command:
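As a rough sketch only (the Dockerfile path and image tag below are assumptions; check the repository's docker/ directory for the exact names), a local source build could look like this:

git clone https://github.com/roboflow/inference.git
cd inference
# build a CPU server image locally; the Dockerfile name is an assumption
docker build -f docker/dockerfiles/Dockerfile.onnx.cpu -t roboflow-inference-server-cpu:local .
# run the locally built image on the default port
docker run --rm -p 9001:9001 roboflow-inference-server-cpu:local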
I am under the impression that the Raspberry Pi architecture may require a dedicated build, but I am not sure.
It's strange, because the docs say "Inference works on Raspberry Pi 4 Model B and Raspberry Pi 5 so long as you are using the 64-bit version of the operating system (if your SD Card is big enough, we recommend the 64-bit 'Raspberry Pi OS with desktop and recommended software' version)."
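As a quick check on that requirement, standard Linux commands can confirm whether the installed OS is actually 64-bit:

uname -m          # should print aarch64 on 64-bit Raspberry Pi OS
getconf LONG_BIT  # should print 64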
Well, I'm not sure why this happened; we need to debug it internally.
In the meantime, what are your suggestions for deploying a Roboflow object detection model on a Raspberry Pi? Is there an easy guide I can follow without difficulties?
@herobrine99dan could you give that a try? You may need to apt install a couple of things. The Docker image installs these:
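As an illustration only, the kind of system libraries commonly needed by OpenCV-based Python packages on Debian looks like this; these package names are assumptions, and the authoritative list is the one in the Dockerfile:

sudo apt update
sudo apt install -y ffmpeg libsm6 libxext6 libgl1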
I've installed the packages you mentioned; running import inference in a python3 shell doesn't raise any issue.
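Since an "Illegal instruction" crash usually comes from a native extension compiled for CPU instructions the processor doesn't support, it may also be worth importing the heavier native dependencies one by one; onnxruntime and OpenCV are named here only as assumptions about what the server pulls in:

python3 -c "import inference"
python3 -c "import onnxruntime; print(onnxruntime.get_device())"
python3 -c "import cv2; print(cv2.__version__)"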
Search before asking
Bug
Hello, I was following the instructions to install Inference on a Raspberry Pi 4. I had no issues installing all the packages with pip install inference and setting up the Docker container; nevertheless, a few seconds after the server starts, the Docker container exits, and the output I get from docker logs is the "Illegal instruction" error in the title.
Environment
Inference: v0.44.1
OS: Debian GNU/Linux 12 (bookworm) aarch64
Device: Raspberry Pi 4
Python: 3.11.2
Minimal Reproducible Example
I'm following the official guide:
pip install inference-cli
inference server start
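When the container exits right after this, its status and crash output can be inspected like so (the placeholder below should be replaced with whatever name or ID docker ps -a reports):

docker ps -a                          # find the exited inference server container
docker logs <container-name-or-id>    # shows the crash output, e.g. the illegal instruction message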
Additional
I've tried to install Inference on a Raspberry Pi 4, and then I tried a clean installation on a Raspberry Pi 400, getting the same issue in both cases. I've also tried using an older version of Docker. Nothing helped. Everything is up to date.
Are you willing to submit a PR?