Is there a specific implementation limit? (multitasking models, cascaded models, or large models) #27
Hi @BICHENG,

Thanks, I will buy it from the official Hailo website.
Hello @nadaved1
May I ask what the online htop-like tool is that one can use to monitor the load on the Hailo device? I have the Hailo M.2 acceleration module, and I'm wondering how to monitor the AI load on it.
Hey Mustafa,
It's part of the hailortcli tool; you can launch it by running `hailortcli monitor` in a separate terminal from the one where you execute your inference.
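A minimal sketch of the two-terminal workflow described above, assuming a HailoRT 4.x install; `my_model.hef` is a placeholder file name, not something from this thread. The key point is that `HAILO_MONITOR=1` has to be in the environment of the inference process, not the monitor:

```shell
#!/bin/sh
# Sketch: monitor a running inference with hailortcli (4.x assumed).
# HAILO_MONITOR must be set in the environment of the *inference* process.

# --- Terminal 1: run inference with monitoring enabled ---
export HAILO_MONITOR=1
if command -v hailortcli >/dev/null 2>&1; then
    hailortcli run my_model.hef      # my_model.hef is a placeholder HEF
fi

# --- Terminal 2: attach the htop-like monitor while inference runs ---
if command -v hailortcli >/dev/null 2>&1; then
    hailortcli monitor
fi

# Sanity check: the variable is visible in this shell.
echo "HAILO_MONITOR=$HAILO_MONITOR"
```

The `command -v` guards only make the sketch safe to run on machines without the Hailo tools installed; on a real setup you would run the two `hailortcli` commands directly in separate terminals.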
Hi @nadaved1 Thanks for the quick reply!
I tried to use the `hailortcli monitor` command in one screen window while running inference with PyHailoRT in another window, but unfortunately it keeps throwing this warning:

Monitor did not retrieve any files. This occurs when there is no application currently running.
If this is not the case, verify that environment variable 'HAILO_MONITOR' is set to 1.

The environment variable was set to the expected value in both the monitor and inference windows:

screen -S <session_id> -X setenv HAILO_MONITOR 1
$ echo $HAILO_MONITOR
1

*OS*: Ubuntu 22.04.2 LTS
*HailoRT-CLI version*: 4.14.0
Hi Mustafa,
I simply do `export HAILO_MONITOR=1` and then execute the `hailortcli run` command.
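One likely pitfall in the `screen` setup, sketched below: `screen -X setenv` only affects windows created after it runs, and the monitor only finds data when `HAILO_MONITOR=1` was inherited by the inference process itself. Exporting the variable in the same shell that launches inference avoids this; `infer.py` here is a hypothetical name for the PyHailoRT inference script:

```shell
#!/bin/sh
# Sketch: make sure HAILO_MONITOR is inherited by the inference process.
# `screen -X setenv` only applies to windows created *afterwards*, so an
# already-open inference window never sees the variable.

export HAILO_MONITOR=1

# Launch inference from this same shell so the variable is inherited
# (infer.py is a placeholder for the PyHailoRT inference script).
if [ -f infer.py ] && command -v python3 >/dev/null 2>&1; then
    python3 infer.py &
fi

# Verify what a child process actually sees (single quotes defer the
# expansion to the child shell).
sh -c 'echo "child sees HAILO_MONITOR=$HAILO_MONITOR"'
```

Alternatively, re-check `echo $HAILO_MONITOR` inside the exact window that runs the inference, since a value set via `screen setenv` before the session existed will not appear in windows opened earlier.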
Hi, I have not applied for a Developer Zone account yet (will it be difficult to get full access?).
I wonder if the Hailo-8 chip can run several large models at the same time? Or can you tell me what the implementation limits are? E.g: