Area of Concern
Describe the bug
Changing the model and model size for the Coral no longer works on CPAI 2.8 with Coral module 2.4.0; whatever you pick, you stay stuck on MobileNet Small.
It looks like it's working because the Info tab shows the selected model and model size, but the inferences remain identical to MobileNet Small. For example, even after choosing YOLOv5 Large, inferences still take ~13 ms, which is the typical return time for MobileNet Small.
Someone else has noticed this as well and has posted the bug on CPAI Discussions.
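
A quick way to see the symptom is to time a few requests against the server's detection endpoint before and after switching models in the UI. This is only a rough sketch, assuming the default port 32168, the standard `/v1/vision/detection` route, and a hypothetical local test image `test.jpg`; the field names in the JSON payload may differ by version.

```python
# Rough timing check (a sketch, not an official repro script).
# Assumes CodeProject.AI Server at the default port 32168 and a local
# test image at "test.jpg" (hypothetical path).
import time
import requests

URL = "http://localhost:32168/v1/vision/detection"

with open("test.jpg", "rb") as f:
    image_bytes = f.read()

# Run several inferences and report the average round-trip time.
timings = []
for _ in range(10):
    start = time.perf_counter()
    response = requests.post(URL, files={"image": image_bytes})
    response.raise_for_status()
    timings.append((time.perf_counter() - start) * 1000)
    # Many CPAI responses also report the server-side inference time;
    # adjust or drop this line if the field is absent in your version.
    print(response.json().get("inferenceMs"))

print(f"Average round trip: {sum(timings) / len(timings):.1f} ms")
```

If the average stays around the MobileNet Small figure (~13 ms) after selecting a larger model, the module has not actually switched.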
Expected behavior
Selecting a different model or model size should actually switch the Coral module to that model, and inference times and results should change accordingly.
Your System (please complete the following information):
- CodeProject.AI Server version: 2.8.0
- OS: Docker, Linux
- System RAM: 32 GB
- GPU (if available): Coral TPU (mini PCI-e and USB)