Nvidia Jetson ffmpeg + TensorRT support #6458
Conversation
✅ Deploy Preview for frigate-docs ready!
I'm still planning the right approach for adding "Community supported" boards in this repo. I think our approach will be to have official support for any boards that represent >1% of the docker pulls. Anything below that will be assigned a code owner, similar to how Home Assistant has approached integration maintenance in core. With that, it will be important to create cleaner lines between official and community supported builds. We won't merge this until that is established.
Well...that's frustrating. Do you want me to close this PR or leave it open?
Leave it open. Almost all of this is fine. I just don't want to sign up to make sure it continues to work unless a significant portion of the user base runs a Jetson device. I just need to move a few things around.
This comment was marked as resolved.
Nice. I think the way you have implemented this will fit nicely within how I have been thinking about community supported images. This may actually be clean enough to merge and then convert later. I do have a Jetson Xavier device to test with.
This comment was marked as resolved.
Looking forward to having an image which I can pull from the repo :)
Hi, I have a Jetson Orin Nano running HA + Frigate and I'm interested in testing, especially with ffmpeg acceleration. The Google Coral M.2 works well, but I can test other detectors if needed.
This comment was marked as outdated.
@madsciencetist how do I create a docker container from the source? I'm trying on a Jetson Nano 2GB and have already converted the models.
This comment was marked as outdated.
The build for my Orin Nano ends with this message. Not sure why.
This branch needs to be rebased on the latest dev |
Force-pushed from 2554ab0 to 35daaa4
Can confirm @bdherouville, the latest dev brings no error; also tested.
It has already been rebased.
Ok, I just realized that the changes are in the jetson_support branch @bdherouville, so I had to clone it this way.
This comment was marked as resolved.
@madsciencetist that is amazing. I have also been experimenting, stuck on add-apt-repository (needed for Python 3.9), but I see you did it manually, so I will try that. Edit: ok, I am on JetPack SDK 4.6.3
Hi, I built and imported on my Nano using the following command. Did I miss something? docker compose up
docker-compose.yaml
@bdherouville make sure you've installed and are using the nvidia container runtime. You'll need to |
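Separate from the truncated command above, the compose-level piece of that setup usually looks like the fragment below (a sketch; the service name and image tag are placeholders, and the NVIDIA container runtime must already be installed on the host):

```yaml
services:
  frigate:
    image: frigate-jetson   # placeholder image tag
    runtime: nvidia         # run this service under the NVIDIA container runtime
```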
Thanks @madsciencetist, we are progressing. Now I have this:
You need to run |
@madsciencetist |
@gjtjx what platform do you have? Was it a stock nvidia 4.6.4 image or something custom? If there is a bug with stock 4.6.4 jetpack on platforms that require it, that would be good to know |
@madsciencetist Jetson Xavier NX, the nvidia 4.6.4 image is official release |
I'm attempting to install and run on my 2gb Nano (jetpack 4.6.1) by following the preview instructions, supplemented by studying this thread. I'm nearly there, but something's not quite right. I can access the website at
Here's the docker command I'm using:
and my configuration file:
The Frigate log contains one warning, which I don't understand and think is irrelevant. Full Frigate log:
go2rtc looks happy:
And nginx seems mostly happy, apart from some errors at the start that might be because frigate isn't ready yet?
Thanks for any help you can offer. PS: The updated documentation really should explicitly mention the automatically generated model, and how to reference it
@kerryland the model width and height parameters need to match the model you're using, so for yolov7-320 you need to set both of those to 320. I'm wondering if the issues you're seeing are just performance, though? Run top and jtop and make sure that the CPU and GPU aren't saturated. You could try switching the model to yolov7-tiny-416 (which requires changing the model width and height parameters to 416).
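To make the pairing concrete, a detector config along these lines keeps the model file and its width/height in sync (a sketch; the model path is an assumption and depends on where your generated .trt files live):

```yaml
detectors:
  tensorrt:
    type: tensorrt

model:
  path: /trt-models/yolov7-320.trt  # assumed location of the generated model
  width: 320    # must match the model's input size
  height: 320
```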
Thank you. It seems to be working. |
@klutzzykitty I would suggest using the up to date frigate dev builds that are being done, I don't think that build is up to date |
@NickM-27 Thank you! It's the same behavior with it as well. (I removed the previous container files completely, along with the docker volumes.) This is how I'm starting the container:
PS: after a few restarts, sometimes everything works like in the reference images above.
@klutzzykitty I haven't encountered that. If it's failing to detect motion, but only when using tensorrt (verify that first), then the thread might be getting stuck. You could use something like |
Thanks for the hint @madsciencetist!
These are the 3 recurring behaviors (in random order) upon restarting the container, on the latest jp4 frigate image. So the first case is verified: failing to detect motion, but not limited to only when using tensorrt.
Can you shed more light on how I can take the @NickM-27 has this occurred at any time during testing? Any clue on why threads could be getting stuck? Logs when I used the TPU: nothing quirky noticeable in the logs.
At some point, if object detection is occurring, then motion detection has to have occurred. I must be honest: I look at the debug view pretty often and I've never seen an issue with the boxes not showing. With case Either way, I've not seen this issue at all running the main dev image and Corals.
@NickM-27 Yeah, I agree: first motion detection happens and those regions are passed for object detection. I tried with another Jetson Nano 4GB with an even newer image: dev-5658e5a-tensorrt-jp4. Here are some py-spy outputs for which I think there should have been more activity. I think the two below are the important ones.
Can any direction be provided so I can try to figure out what can be done? Currently I'm in completely open waters with this issue XD
If that's the case, how can it be checked? Can the video that comes to the debug view somehow be grabbed to verify?
I'll look later, but the debug view is not video, it's just an image frame hitting the debug image endpoint 5 times a second. |
Sure! Thanks Nick. |
@klutzzykitty I think you'll get to the bottom of this fastest by adding debug prints to the |
@madsciencetist Thanks for the hints. |
I can't seem to get my stack to run; I get no web interface. I'm running a Jetson Nano 4G with Portainer-CE to deploy the stack. I'm planning to add 1 or 2 cameras, maybe with a Coral (not ordered yet).
version: "3.9"
services:
frigate:
container_name: frigate
privileged: true # this may not be necessary for all setups
restart: unless-stopped
image: ghcr.io/blakeblackshear/frigate:dev-0858859-tensorrt-jp4
shm_size: "64mb" # update for your cameras based on calculation above
devices:
- /dev/bus/usb:/dev/bus/usb # passes the USB Coral, needs to be modified for other versions
- /dev/apex_0:/dev/apex_0 # passes a PCIe Coral, follow driver instructions here https://coral.ai/docs/m2/get-started/#2a-on-linux
# - /dev/dri/renderD128 # for intel hwaccel, needs to be updated for your hardware
volumes:
- /etc/localtime:/etc/localtime:ro
- /home/jetson-nano/frigate/config:/config
- /home/jetson-nano/frigate/storage:/media/frigate
- type: tmpfs # Optional: 1GB of memory, reduces SSD/SD Card wear
target: /tmp/cache
tmpfs:
size: 1000000000
ports:
- "5000:5000"
- "8554:8554" # RTSP feeds
- "8555:8555/tcp" # WebRTC over tcp
- "8555:8555/udp" # WebRTC over udp
environment:
      FRIGATE_RTSP_PASSWORD: "1"
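As an aside on the shm_size: "64mb" line above: Frigate's docs give a per-camera estimate of the minimum shared memory needed. A quick sketch of that arithmetic (formula taken from the Frigate docs; treat the result as a lower bound, not a guarantee):

```python
def min_shm_mb(cameras):
    """Estimate minimum shm size in MB for a list of (width, height) detect resolutions."""
    # Per the Frigate docs: width * height * 1.5 bytes/pixel * 9 frames, plus fixed overhead
    total_bytes = sum(w * h * 1.5 * 9 + 270480 for w, h in cameras)
    return total_bytes / 1048576

# One 720p detect stream needs roughly 12 MB, so the 64 MB default has headroom
print(round(min_shm_mb([(1280, 720)]), 1))  # prints 12.1
```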
You have to create the config file; I'd suggest reading the docs: https://deploy-preview-6262--frigate-docs.netlify.app/guides/getting_started
So I have to open a CLI text editor and write myself a full config file? I get no web page at all.
At least a minimal config file; then the built-in web UI config editor can be used.
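A minimal config in the spirit of the getting-started guide looks roughly like this (a sketch; the camera name, credentials, and URL are placeholders you must replace):

```yaml
mqtt:
  enabled: false

cameras:
  my_camera:            # placeholder name
    ffmpeg:
      inputs:
        - path: rtsp://user:pass@192.168.1.10:554/stream  # placeholder URL
          roles:
            - detect
```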
Since I was not able to figure it out on my own for a full day: you need to run
then paste something like
Is this still the correct image? : And if not, how would I find out what the latest version is? (I randomly guessed at Thanks!
I just threw that web address into a browser and
That's hysterical. Thanks! |
Is there any documentation about how to deploy frigate on jetson? PS: I have a Jetson Orin Nano 8G with Jetpack 5.1.1 |
Jetson image is not stable yet, you'll need to use the 0.13 beta |
Thank you for your reply; that's good to know.
I recently bought a new camera that's
As far as I can see, H.265 is supported on the Jetson Nano.
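If the camera stream is H.265, the hwaccel preset has to match the codec; on recent Frigate builds that would look something like this (preset names taken from the Frigate hardware-acceleration docs for Jetson; verify them against the version you run):

```yaml
cameras:
  camera-paarden:
    ffmpeg:
      hwaccel_args: preset-jetson-h265  # use preset-jetson-h264 for H.264 streams
```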
I'm talking about [camera-paarden], which is not working.
@the-master-r you have |
This PR creates a new `make jetson` target and a new `frigate-jetson` docker image, which is based on Ubuntu 20.04 with L4T (the supported OS for Jetsons) rather than on Debian 11. This enables a rather seamless replacement of the rpi-accelerated `ffmpeg` with a jetson-accelerated version, and enables the use of the `tensorrt` detector, utilizing the Jetson's GPU or DLA. Note that Ubuntu 20.04 defaults to Python 3.8, so I forced it to upgrade to Python 3.9. This is a smaller change than #2548 in that it still uses ffmpeg, rather than adding and using gstreamer.
The accelerated decoding and scaling cuts ffmpeg's CPU use by 80-90% and saves >1W of power per stream. Switching from the default mobilenet CPU detector to yolov7-tiny-416 on the GPU saves another watt, and running the same on the DLA saves yet another half watt.
Tested on a Jetson Xavier NX running Jetpack 5.0.2 (L4T 35.1) and Jetson Xavier AGX running Jetpack 4.6.1 (L4T 32.6.1). Theoretically works on most other Jetson platforms and versions.