Additional Accelerator Support? #3529
Replies: 3 comments 1 reply
-
For one, Coral stock has shown signs of increasing. Thousands of Corals came back in stock at RS Components in the UK, and many of our users have already confirmed delivery. The Intel NCS2 was likely the top contender for a USB device, but it has already been marked end of life, so the effort to support a dead product does not make sense. #2548 adds support for TensorRT and Jetson Nano devices, so Nvidia GPU support will be official in a future release (it already works using a user-provided Docker image). The ideal edge accelerator is one that processes tflite models natively and doesn't require an SDK or model conversion.
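To illustrate what "processes tflite natively" means in practice, here is a hedged sketch: with a Coral, the accelerator-specific step is just loading a delegate, while the model file stays a standard `.tflite` flatbuffer. The library path and function shape below follow the `tflite_runtime` package; the model path is a placeholder.

```python
# Sketch: the only Coral-specific code is the delegate; no separate SDK
# or offline model conversion is needed. Paths here are placeholders.
try:
    from tflite_runtime.interpreter import Interpreter, load_delegate
except ImportError:
    # tflite_runtime not installed; keep names defined so this sketch
    # imports cleanly even without the Coral runtime present.
    Interpreter = load_delegate = None

def make_interpreter(model_path: str, use_edgetpu: bool):
    """Build a TFLite interpreter, optionally bound to the Edge TPU."""
    delegates = [load_delegate("libedgetpu.so.1")] if use_edgetpu else []
    return Interpreter(model_path=model_path,
                       experimental_delegates=delegates)
```

An accelerator that instead requires converting models to a proprietary format adds a whole conversion-and-validation pipeline on top of this.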
-
Pleased to hear that Coral stock is increasing somewhere. In the US, the best price I can find is around $300, and most places still don't have stock; hopefully that changes soon. I understand that supporting another accelerator is real work, and having read through the Nvidia material, I don't have the technical skills to do the work myself. I will chat with the Kinara guys to see if I can get one of their engineers to engage, and I will keep looking for more affordable Coral solutions. Thanks.
-
I also have a Luxonis OAK-D and an OAK-D Lite, which are dual-camera devices for depth perception. Both run detection and inference onboard and claim to support running any custom TF Lite model. If either of these were supported, I would not need a Coral. Awesome project, by the way. Many thanks.
-
I am keen to use Frigate with my HA install. That said, it is really hard and expensive to get Coral TPUs right now, so I am delaying my deployment.
Has the team explored supporting different edge accelerators? I would like to better understand what would need to be done to the Frigate application to support an additional accelerator, and what would need to be done on the accelerator side (models, runtime, etc.).
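To make the application-side question concrete, here is a minimal sketch of what a pluggable detector interface could look like. The class and method names below are illustrative assumptions, not Frigate's actual API; supporting a new accelerator would roughly mean implementing one such class against the vendor's runtime, plus providing a model in a format that runtime accepts.

```python
# Hypothetical sketch (not Frigate's actual code): the shape of a
# pluggable detector abstraction. Each accelerator backend implements
# the same contract, so the rest of the pipeline stays unchanged.
from abc import ABC, abstractmethod

import numpy as np

class DetectionApi(ABC):
    """Detector contract: take an input tensor, return rows of
    [class_id, score, y_min, x_min, y_max, x_max]."""

    @abstractmethod
    def detect_raw(self, tensor_input: np.ndarray) -> np.ndarray:
        ...

class FakeAcceleratorDetector(DetectionApi):
    """Stand-in for a vendor runtime (e.g. a hypothetical Kinara SDK
    wrapper); a real backend would hand tensor_input to the device."""

    def detect_raw(self, tensor_input: np.ndarray) -> np.ndarray:
        # Return one dummy detection so the interface is demonstrable.
        return np.array([[17.0, 0.9, 0.1, 0.1, 0.5, 0.5]])

detector = FakeAcceleratorDetector()
frame = np.zeros((320, 320, 3), dtype=np.uint8)
detections = detector.detect_raw(frame)
print(detections.shape)  # → (1, 6)
```

On the accelerator side, the runtime would need a supported object-detection model (and any conversion tooling its format requires), which is typically the larger share of the work.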
I know the team at Kinara (https://kinara.ai/) and have talked with them about getting support from their side for this use case.
Thanks,
-Bill