
Add Pineboards Hat AI! Dual Edge Coral TPU Bundle #648

Closed

geerlingguy opened this issue May 24, 2024 · 7 comments

@geerlingguy
Owner

Pineboards offers a Hat AI! Dual Edge Coral TPU Bundle for the Raspberry Pi 5, which connects a Dual Edge Coral TPU for AI/ML inference to the Pi 5 through a PCIe switch, so both of the module's PCIe lanes, and therefore both TPUs, are usable.

[Image: hat-pineboards-hatai-dual-edge-tpu]

Most other E-key PCIe HATs only support one PCIe lane, so if you installed a Dual Edge TPU, you would only have access to one of the two TPUs.

Pineboards also includes a Dual Edge TPU with this bundle, so you don't need to source your own from another vendor.

@geerlingguy
Owner Author

See existing issue about Coral M.2 Dual Edge TPU support on Pi: #318

@MidnightLink

MidnightLink commented May 31, 2024

I'm getting one of these in the next few days. I'm excited to test if it's possible to host a local AI server using something like CodeProject.AI on a Pi 5. I have a Blue Iris server running on my network, and having a central location to send AI requests to (instead of having to use a power-hungry GPU) would be fantastic. That, paired with one of the Waveshare PoE HATs, would make it a super easy and compact solution....if it works ;)

@MidnightLink

MidnightLink commented Jun 3, 2024

I've been running this HAT alongside a Waveshare PoE HAT for a few days now and it's been working without issue. I have a Blue Iris server running on a Win11 machine sending AI requests over the network to it.

Previously I was running GPU AI detection, which takes a ton of power on an RTX 3060. I tried to move to the Coral Dual Edge TPU using a PCIe adapter ( https://www.makerfabs.com/dual-edge-tpu-adapter.html ), but Windows was never fully happy with it and it often crashed, reverting to CPU/GPU AI detection and again using a ton of power.

Here's a quick setup guide if anyone wants to try to set up CodeProject.AI with this hat and a Coral Dual Edge TPU.

Notes:
This guide is for the rpi64 version, which is mostly limited to just the Coral TPU. If you would like to use more modules or have more customization, simply replace "rpi64" with "arm64" in the docker download/start lines.

A lot of the guide comes condensed from Pineboards' and CodeProject.AI's instructions. (And thanks to Jeff for the PCIe Gen 3.0 code ;) )


Start with a fresh install of Raspberry Pi OS Lite (64-bit) and connect using your SSH program of choice.

First boot update:

sudo apt update && sudo apt upgrade -y

Update the kernel to the latest version:

sudo rpi-update

Install docker:

sudo curl -sSL https://get.docker.com | sh

Add your user to the docker group:

sudo usermod -aG docker $USER
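
The group change only applies to new login sessions (the reboot later in this guide covers it too); if you want Docker to work without sudo right away, something like this should do it:

# Start a shell with the docker group active, then confirm the daemon is reachable
newgrp docker
docker ps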

Open the Pi's config file in Nano:

sudo nano /boot/firmware/config.txt

Add the following lines to the bottom of the file:

# Enable the external PCIe connector
dtparam=pciex1
# Use the 4KB page size kernel (kernel8.img) for Coral driver compatibility
kernel=kernel8.img
# Enable the Pineboards Hat AI overlay
dtoverlay=pineboards-hat-ai
# Upgrade the link to PCIe Gen 3.0
dtparam=pciex1_gen=3

Save and close the file by pressing CTRL+X, Y to confirm, and Enter to exit.

Reboot the Pi:

sudo reboot
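
After the reboot it's worth confirming that the Pi actually sees the Coral module on the PCIe bus before going any further. This is just an optional sanity check; lspci comes from the pciutils package if it isn't already installed:

# List PCIe devices; two Coral Edge TPU entries (often listed under Global Unichip Corp.) should appear, one per TPU
lspci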

Install rpi-source and use it to fetch the kernel headers:

sudo apt install git bc bison flex libssl-dev make libncurses5-dev
sudo wget https://raw.githubusercontent.com/jgartrel/rpi-source/master/rpi-source -O /usr/bin/rpi-source
sudo chmod +x /usr/bin/rpi-source
rpi-source --tag-update
rpi-source --default-config
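
rpi-source drops the kernel source in your home directory and points the standard build symlink at it; as a quick check that DKMS will find what it needs:

# The build symlink should point at the kernel source rpi-source just fetched
ls -l /lib/modules/$(uname -r)/build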

Add the Google Coral Edge TPU package repository and import the GPG key:

echo "deb https://packages.cloud.google.com/apt coral-edgetpu-stable main" | sudo tee /etc/apt/sources.list.d/coral-edgetpu.list && curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -

Update your package list:

sudo apt-get update

Install the necessary packages:

sudo apt-get install cmake libedgetpu1-std devscripts debhelper dkms dh-dkms

Clone the Gasket Driver repo:

git clone https://github.com/google/gasket-driver.git

Change into the directory and build the driver:

cd gasket-driver && sudo debuild -us -uc -tc -b

Go back to the parent directory and install the built package (the wildcard picks up whatever version debuild produced, e.g. gasket-dkms_1.0-18_all.deb):

cd .. && sudo dpkg -i gasket-dkms_*_all.deb
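
If you want to double-check that DKMS actually built the driver against the running kernel before rebooting, dkms status should list a gasket entry marked as installed:

# A healthy install shows a gasket entry for the running kernel with state "installed"
sudo dkms status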

Add a udev rule to manage device permissions:

sudo sh -c "echo 'SUBSYSTEM==\"apex\", MODE=\"0660\", GROUP=\"apex\"' >> /etc/udev/rules.d/65-apex.rules"

Create a new group and add your user to it:

sudo groupadd apex && sudo adduser $USER apex

Reboot your Pi:

sudo reboot
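
Once the Pi is back up, both halves of the Dual Edge TPU should be exposed as apex character devices; if only /dev/apex_0 shows up, only one PCIe lane is being used:

# Expect both /dev/apex_0 and /dev/apex_1 with the dual-edge module on this HAT
ls -l /dev/apex_*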

Download the latest version of the CodeProject.AI server (RPi64 image):

docker pull codeproject/ai-server:rpi64

Start the Docker container and set it to restart automatically on boot:

docker run --restart=always --name CodeProject.AI -d -p 32168:32168 \
 --privileged -v /dev/bus/usb:/dev/bus/usb codeproject/ai-server:rpi64
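
You can watch the container start up and confirm it finds the TPUs by tailing its logs (press Ctrl+C to stop following):

# Follow the CodeProject.AI server logs
docker logs -f CodeProject.AI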

Open the web interface of the newly set up CodeProject.AI server:

http://<your-pi-address>:32168

The server's status should now show "Started Multi-TPU (TF-Lite)", and you can start sending AI requests.
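
If you'd rather test from the command line than the dashboard, a request along these lines should return detected objects as JSON. This assumes the standard CodeProject.AI object detection route (/v1/vision/detection) and a test.jpg in the current directory; substitute your Pi's address and check the server's API reference if the route differs:

# Send a test image to the object detection endpoint
curl -s -X POST http://<your-pi-address>:32168/v1/vision/detection -F "image=@test.jpg"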

@geerlingguy
Owner Author

@MidnightLink awesome work! That's great to see it's been validated on the Pi 5. I have also broken mine out earlier today, and it was able to recognize both TPUs, so I think we can mark this as fixed/working, and if anyone has issues or further questions, feel free to add them here!

@geerlingguy
Owner Author

@MidnightLink - I just ran through your instructions on a fresh Pi OS 12 install, and it worked like a charm, thanks!

Follow-up question: Any idea if the CodeProject.AI Server will be adding support for the Hailo-8 / Hailo-8L?

@MidnightLink

@geerlingguy Funny enough, I actually just posted in their forum asking the same exact question after seeing your latest video :) They do support adding third-party modules already, but I'm hoping they'll be able to get something integrated natively soon, since I also just ordered one of the new Hailo Pi kits. There's a pretty detailed write-up in their docs on how to do so ( https://www.codeproject.com/ai/docs/devguide/module_examples/adding_new_modules.html ), but I haven't really had the need to add anything as of yet.

@xaquib666

xaquib666 commented Jul 4, 2024


Running a dual Edge TPU and one USB TPU with the Pi 5 and 9 camera streams. However, no hardware acceleration is working due to deprecated support. Has anyone had any success with hardware acceleration?
