
M.2 in docker #125

Closed
Sayyam-Jain opened this issue May 26, 2020 · 8 comments
Labels
help wanted (Extra attention is needed) · PCIe (Issue relating to our pcie modules) · research

Comments

@Sayyam-Jain

Hi, my device is a Jetson Nano connected to an M.2 Coral TPU.
I want to access the Edge TPU in Docker.
I've installed all the required packages but I'm still getting this error:
Failed to load delegate from libedgetpu.so.1

At first I thought it was because the PCIe device wasn't available inside Docker, so I added the apex group and mounted the device folder inside the container with the privileged flag, but I'm still getting the same error.

Kindly help me out. I'd be really thankful.
Thanks.

@Namburger

@Sayyam-Jain
FYI, Docker is not officially supported by us, but it definitely works with our USB Accelerator. I don't have enough info to offer much of a suggestion here, but how are you running the image?

Failed to load delegate from libedgetpu.so.1

This could mean something as simple as libedgetpu.so not being installed. There is also a caveat with adding the apex udev rules as described in step 4 here, which usually requires a reboot; but since you have root (in Docker), that really shouldn't be an issue.
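The two causes above can be checked quickly inside the container. A minimal diagnostic sketch, assuming the standard Coral Debian package names:

```shell
#!/bin/sh
# Check the two usual causes of "Failed to load delegate from libedgetpu.so.1":
# 1) the runtime library is not installed, 2) the apex device nodes are not
# visible (driver not loaded, or not passed into the container).
if ldconfig -p 2>/dev/null | grep -q libedgetpu.so.1; then
  echo "libedgetpu runtime: found"
else
  echo "libedgetpu runtime: MISSING (apt-get install libedgetpu1-std)"
fi

if ls /dev/apex_* >/dev/null 2>&1; then
  echo "apex device nodes: present"
else
  echo "apex device nodes: MISSING (driver not loaded or not passed through)"
fi
```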

There is also a concern that the docker image you pulled from may already come with the apex/gasket kernel modules, so you'll need to blacklist those before installing our drivers.
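A sketch of the usual modprobe blacklist approach (the file contents are written to a temp path here for illustration; the real location is /etc/modprobe.d/ and the steps need root):

```shell
#!/bin/sh
# Blacklist any pre-existing apex/gasket modules so the Coral-packaged driver
# is the one that loads. Written to /tmp here for safety; copy it to
# /etc/modprobe.d/ on the real host.
cat > /tmp/blacklist-apex.conf <<'EOF'
blacklist gasket
blacklist apex
EOF
cat /tmp/blacklist-apex.conf
# Then, as root on the host:
#   cp /tmp/blacklist-apex.conf /etc/modprobe.d/blacklist-apex.conf
#   modprobe -r apex gasket   # unload the old modules before installing ours
```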

Those are just some suggestions; since this isn't an issue with our driver, I'll mark it closed. Feel free to continue the conversation here, and I'll try my best to help!

Cheers

@Namburger Namburger added the PCIe (Issue relating to our pcie modules), help wanted (Extra attention is needed), and research labels May 26, 2020
@skyler1253

@Sayyam-Jain Were you able to get this to work inside Docker? I'm trying the same approach and would love to know if you found a workaround. Thanks!

@Sayyam-Jain
Author

Unfortunately no.
There was some problem with the PCIe port of my Jetson Nano.

Eventually I installed it in my laptop, replacing the WiFi card. It didn't work with the l4t docker image, so I left it there.

I'll try once again this week to check whether it's even possible in any Docker container, not just arm64.

I'll let you know about the results soon.

Outside Docker it's working fine.
If you want Docker, I'd suggest going with the USB Coral TPU.
It works by mounting the /dev/bus/usb folder with the privileged flag inside Docker.
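What Sayyam describes for the USB Accelerator can be sketched as the following invocation; the image name is a placeholder, and the whole USB bus is mounted because the Edge TPU re-enumerates as a different USB device after initialization:

```shell
# Pass the USB bus through and run privileged; replace my-coral-image with
# your own image name.
docker run -it --privileged -v /dev/bus/usb:/dev/bus/usb my-coral-image /bin/bash
```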

@skyler1253

Sounds good. I'll be trying this week too so I'll keep you posted as well.

@Namburger

Hmm, I have a feeling it won't be that straightforward. I just remembered that VMs normally need PCIe passthrough; I haven't tried it in Docker, but here is a reference link:
https://www.reddit.com/r/docker/comments/bfxbe7/passing_a_pci_device/?utm_medium=android_app&utm_source=share

@Namburger

Namburger commented Jun 10, 2020

Hi guys, I guess this has become quite a popular request. We haven't tried anything like this and don't have much hardware to test with due to WFH. However, if everything is installed correctly, our apex module will expose the device via /dev/apex_*, so you can try passing that through to Docker:

docker run -it  --device /dev/apex_0:/dev/apex_0 <image_name> /bin/bash
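If a host has more than one module (or you don't know the node index), the pass-through above can be generalized. This sketch only prints the resulting docker command for every /dev/apex_* node found, so you can inspect it before running; "coral" is a placeholder image name:

```shell
#!/bin/sh
# Collect every apex device node into --device flags and print the resulting
# docker command (printed rather than executed, so it is safe to inspect).
IMAGE=coral
DEVICE_FLAGS=""
for dev in /dev/apex_*; do
  [ -e "$dev" ] && DEVICE_FLAGS="$DEVICE_FLAGS --device $dev:$dev"
done
echo docker run -it$DEVICE_FLAGS "$IMAGE" /bin/bash
```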

@Namburger

Have you guys gotten a chance to try this yet?
I did it on the Coral Dev Board (which also interfaces with the TPU over PCIe) and it's working :)


# docker can be installed on the dev board following these instructions: 
# https://github.com/f0cal/google-coral/issues/32#issuecomment-571629174
# 1) create this dockerfile
# 2) build: docker build -t "coral" .
# 3) run: docker run -it --device /dev/apex_0:/dev/apex_0 coral /bin/bash
# 4) Try the classify_image demo:
# apt-get install edgetpu-examples
# python3 /usr/share/edgetpu/examples/classify_image.py --model /usr/share/edgetpu/examples/models/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite --label /usr/share/edgetpu/examples/models/inat_bird_labels.txt --image /usr/share/edgetpu/examples/images/bird.bmp

FROM arm64v8/debian:latest

WORKDIR /home
ENV HOME /home
RUN apt-get update
RUN apt-get install -y git nano python3-pip python-dev pkg-config wget usbutils curl

RUN echo "deb https://packages.cloud.google.com/apt coral-edgetpu-stable main" \
| tee /etc/apt/sources.list.d/coral-edgetpu.list
RUN curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | apt-key add -
RUN apt-get update

@arun-kumark

Hi Namburger,

I am using the Dockerfile mentioned above, but I'd like to run just the parrot inference example; could you help me refine it?
I am using an M.2 device connected to a remote x86 machine. I can only check logs over SSH to see whether the accelerator is working, along with some performance indicators like the speed over 10 inferences.

Could you let me know if a Dockerfile already exists that I can use to test whether the card works for inference?

I am building on your example; my unsuccessful attempt is below:

FROM ubuntu:18.04

WORKDIR /home
ENV HOME /home
RUN apt-get update
RUN apt-get install -y git vim python3-pip python-dev pkg-config wget usbutils curl

# The Coral repo must be added before the Edge TPU packages can be installed
RUN echo "deb https://packages.cloud.google.com/apt coral-edgetpu-stable main" \
| tee /etc/apt/sources.list.d/coral-edgetpu.list
RUN curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | apt-key add -
RUN apt-get update

RUN apt-get install -y libedgetpu1-legacy-std python3-edgetpu python3-opencv
RUN apt-get install -y python3-pycoral

RUN git clone https://github.com/google-coral/pycoral.git
RUN git clone https://github.com/google-coral/test_data.git

RUN wget https://dl.google.com/coral/python/tflite_runtime-2.1.0-cp36-cp36m-linux_x86_64.whl
RUN pip3 install tflite_runtime-2.1.0-cp36-cp36m-linux_x86_64.whl

RUN git clone https://github.com/google-coral/tflite.git

WORKDIR tflite/python/examples/classification

RUN bash ./install_requirements.sh

# Note: this model is the CPU version; the *_edgetpu.tflite variant is needed
# to actually exercise the accelerator.
ENTRYPOINT ["python3", "classify_image.py", "--model", "models/mobilenet_v2_1.0_224_inat_bird_quant.tflite", "--labels", "models/inat_bird_labels.txt", "--input", "images/parrot.jpg"]

Thank you very much!

Kind Regards,
Arun
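The headless check Arun describes, timing a fixed number of inferences over SSH, can be sketched as below; `run_once` is a placeholder to be replaced with the real classify_image.py invocation, and GNU date's `%N` is assumed for sub-second timing:

```shell
#!/bin/sh
# Time N back-to-back runs and report the average wall-clock milliseconds.
# run_once is a stand-in; swap in the real classifier command.
N=10
run_once() {
  true  # e.g. python3 classify_image.py --model ... --labels ... --input ...
}
start=$(date +%s%N)
i=0
while [ "$i" -lt "$N" ]; do
  run_once
  i=$((i + 1))
done
end=$(date +%s%N)
echo "average ms per inference: $(( (end - start) / N / 1000000 ))"
```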
