ARM support #2

Open
hoonkai opened this issue Mar 28, 2019 · 7 comments

@hoonkai

hoonkai commented Mar 28, 2019

Hi

Firstly, great work! Can I ask whether ARM is still unsupported even though OpenVINO now supports ARM? I see the line "Caution: It does not work on ARM architecture devices such as RaspberryPi / TX2." in the README, but it's followed by an update notice, and I'm not sure whether that notice implies Deeplab now supports ARM.

Thanks!

@PINTO0309
Owner

Let me explain what I meant:

  • OpenVINO officially supports armv7l (ARM).
  • My test program does not support the combination of NCS2 + ARM (RaspberryPi / TX2).
  • ArgMax and some other layers in the DeeplabV3+ model are not supported on the NCS2.
  • You need a tricky implementation to get DeeplabV3+ working properly (see the sketch below).
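
For illustration only, here is a minimal sketch of one possible workaround (not necessarily the one used in this repository): convert an IR of DeeplabV3+ with the final ArgMax layer removed, run that on the NCS2, and compute the ArgMax on the host with NumPy. File names are placeholders, and the Python API details differ between OpenVINO releases.

```python
import cv2
import numpy as np
from openvino.inference_engine import IENetwork, IECore

# Hypothetical IR converted WITHOUT the final ArgMax layer.
ie = IECore()
net = IENetwork(model="deeplab_v3_plus_no_argmax.xml",
                weights="deeplab_v3_plus_no_argmax.bin")
input_blob = next(iter(net.inputs))
out_blob = next(iter(net.outputs))
exec_net = ie.load_network(network=net, device_name="MYRIAD")  # NCS2

# Preprocess to the NCHW shape the IR expects.
n, c, h, w = net.inputs[input_blob].shape
frame = cv2.imread("input.jpg")
blob = cv2.resize(frame, (w, h)).transpose(2, 0, 1)[np.newaxis, :]

# The device returns per-class scores (1 x num_classes x H x W);
# the ArgMax over the class axis is done on the host CPU instead.
scores = exec_net.infer(inputs={input_blob: blob})[out_blob]
seg_map = np.argmax(scores, axis=1)[0].astype(np.uint8)
```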

@hoonkai
Author

hoonkai commented Mar 29, 2019

I see. Any idea whether your program works on a Celeron J1900, then? I'm still deciding whether or not to buy the NCS2...

@PINTO0309
Owner

PINTO0309 commented Mar 30, 2019

When using OpenVINO, my benchmark results show faster inference on an Intel CPU than on the NCS2. You would gain no major benefit from purchasing an NCS2.
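
If you want to compare for yourself, switching the target device in the OpenVINO Python API is a one-line change. A minimal sketch, with placeholder model file names:

```python
from openvino.inference_engine import IENetwork, IECore

ie = IECore()
net = IENetwork(model="model.xml", weights="model.bin")  # placeholder IR

# Same IR, two targets: the Intel CPU plugin vs. the NCS2 (MYRIAD).
exec_net_cpu = ie.load_network(network=net, device_name="CPU")
exec_net_ncs2 = ie.load_network(network=net, device_name="MYRIAD")
```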

LattePanda Alpha + OpenVINO + "CPU (Core m3) vs NCS1 vs NCS2", Performance comparison

Btw, if you want to get the best performance, I recommend buying the "Google Edge TPU Accelerator".
The following URL is my verification article:

I tested the operating speed of MobileNet-SSD v2 using Google Edge TPU Accelerator with RaspberryPi3 (USB2.0) and LaptopPC (USB3.1) (MS-COCO)

https://github.com/PINTO0309/TPU-MobilenetSSD.git
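
For reference, this is roughly how an Edge TPU model is driven through the TensorFlow Lite runtime delegate. A minimal sketch with a placeholder model path; the linked repository may use a different API.

```python
import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

# Load an Edge-TPU-compiled model and attach the Edge TPU delegate.
interpreter = Interpreter(
    model_path="mobilenet_ssd_v2_coco_quant_edgetpu.tflite",  # placeholder
    experimental_delegates=[load_delegate("libedgetpu.so.1")])
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy uint8 frame of the expected input size and run inference.
_, h, w, c = input_details[0]["shape"]
frame = np.zeros((1, h, w, c), dtype=np.uint8)
interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()

boxes = interpreter.get_tensor(output_details[0]["index"])
```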

@hoonkai
Author

hoonkai commented Mar 30, 2019

Thanks for the advice. But isn't Intel Celeron unsupported? According to https://software.intel.com/en-us/articles/OpenVINO-InferEngine the CPU target supports "Intel® Xeon® with Intel® AVX2 and AVX512, Intel® Core™ Processors with Intel® AVX2, Intel® Atom® Processors with Intel® SSE". Perhaps that's why Gemini91's Celeron result is about 5x slower than NCS2?

@PINTO0309
Owner

It has been confirmed to work with a Celeron. However, because the Celeron does not support the newer Intel instruction-set extensions, it seems that sufficient performance cannot be achieved. OpenVINO is designed to be optimized for Intel Atom-class or higher CPU architectures.
https://ncsforum.movidius.com/discussion/comment/4139/#Comment_4139

@Aaronreb

Aaronreb commented Dec 9, 2019

Does OpenVINO work on a Raspberry Pi without an NCS?

@PINTO0309
Owner

@Aaronreb
Yes.
