ZED is very slow #242
Comments
Have you verified that the Jetson TX2 is in "performance" mode and not in "on demand"?
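For reference, on a stock JetPack install the power mode can be checked and switched with `nvpmodel`, and the clocks pinned with `jetson_clocks`. A sketch, assuming a TX2 with a recent JetPack (the mode numbering varies between JetPack versions, so verify which mode is MAXN on your board):

```shell
# Query the current power mode.
sudo nvpmodel -q

# Switch to the maximum-performance mode (mode 0 is typically MAXN on a TX2;
# confirm the mode table for your JetPack version before relying on this).
sudo nvpmodel -m 0

# Pin CPU/GPU clocks to their maximum for the selected mode.
sudo jetson_clocks
```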
Hi @Myzhar, thank you for the suggestion. Could you tell me how to check and set the camera resolution?
Check the file "zed_camera.launch": you will find two parameters, "resolution" and "framerate", and the possible setting combinations.
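For context, in the zed-ros-wrapper launch files these show up as launch arguments, roughly along these lines (argument names, defaults, and the resolution enum depend on the wrapper and SDK version, so treat this as a sketch):

```xml
<!-- Sketch of the relevant arguments in zed_camera.launch
     (exact names and defaults vary between wrapper versions). -->
<arg name="resolution" default="2" />  <!-- e.g. 0=2K, 1=1080p, 2=720p, 3=VGA -->
<arg name="frame_rate" default="30" /> <!-- must be a rate valid for the chosen resolution -->
```

Lower resolutions allow higher frame rates; the two must be a combination the SDK supports.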
In the launch file, gpu_id=-1. Should I change that to gpu_id=0, since I do have a GPU on my TX2?
gpu_id=-1 means that the first available GPU will be used. In your case it is like setting it to 0.
Thanks. After changing the resolution and frame rate, the image is much smoother. However, I found there was still some delay. Since I am going to use the images to train a deep learning network and drive the car, each image needs to be paired with control parameters, so the delay is a big problem for me. Is there a way to fix it? Also, since for this application I do not need the depth image, can I turn 3D reconstruction / depth sensing off? I want to use the ZED as just a normal camera.
The wrapper is "smart": it only performs processing for topics that have at least one subscriber. About latency, it is a "ROS issue", since the ZED SDK itself introduces only a very small latency.
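To illustrate the "smart" behavior (expensive processing such as depth extraction runs only when a topic has subscribers), here is a minimal stdlib sketch of the idea; this is a toy model, not the actual wrapper code:

```python
# Toy model of subscriber-gated publishing: the expensive producer
# function is only invoked when someone is actually listening.
class LazyPublisher:
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish_if_needed(self, produce):
        """Call produce() and fan out only if there are subscribers."""
        if not self.subscribers:
            return None  # skip the expensive work entirely
        msg = produce()
        for callback in self.subscribers:
            callback(msg)
        return msg


pub = LazyPublisher()
print(pub.publish_if_needed(lambda: "depth_map"))  # None: no subscribers

received = []
pub.subscribe(received.append)
print(pub.publish_if_needed(lambda: "depth_map"))  # depth_map
```

This is why unsubscribing from the depth and point-cloud topics reduces load: the corresponding computation is never triggered.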
I am using rosbag to save the images. Can I use UDP in rosbag? |
Hi, I am not sure what #1279 means. Is that already available in rosbag?
@ArtlyStyles this is a ROS-related issue. You can find more information on the ROS channels.
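As background on what UDP transport trades off: it is connectionless and unreliable, which lowers latency but allows dropped packets (in roscpp this is requested per-subscriber via transport hints). A minimal loopback round trip with Python's stdlib `socket` module, purely as an illustration of the transport itself, not of any ROS API:

```python
import socket

# Minimal UDP send/receive over loopback. UDP is connectionless and
# unreliable: lower latency than TCP, but datagrams can be lost or
# reordered, which is the trade-off streaming transports accept.
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))           # let the OS pick a free port
port = recv_sock.getsockname()[1]

send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_sock.sendto(b"frame-0001", ("127.0.0.1", port))

data, _addr = recv_sock.recvfrom(1024)
print(data)  # b'frame-0001'

recv_sock.close()
send_sock.close()
```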
Hi @Myzhar I would like to study more on the latency issue. Can I view ZED image topic in rviz using UDP? |
Sure you can
Please remember that the latency is due to ROS communication and Rviz processing: what you see in Rviz is not the real ZED latency, which can be measured at most as 3 or 4 frames on Ubuntu.
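To put "3 or 4 frames" into time units: at the camera's frame rate, N frames of delay is N/fps seconds. A quick helper:

```python
def frames_to_latency_ms(frames: int, fps: float) -> float:
    """Convert a delay measured in frames into milliseconds."""
    return frames / fps * 1000.0


# At 30 FPS, the 3-4 frame latency quoted above is roughly 100-133 ms.
print(frames_to_latency_ms(3, 30.0))            # 100.0
print(round(frames_to_latency_ms(4, 30.0), 1))  # 133.3
```

At 60 FPS the same frame count halves in wall-clock terms, which is one reason a higher frame rate (at lower resolution) can reduce perceived lag.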
Hi @Myzhar, thanks. I tried that, but the improvement was not very obvious. Since this is a ROS issue, does it mean that ROS is not a good choice for time-critical projects, such as autonomous driving, and that changing to another camera won't help?
This is one of the criticisms commonly made of ROS.
I am thinking of writing my own ROS node that talks to the ZED API directly and outputs driving commands. If I copy zed_wrapper_nodelet.cpp, modify it by putting my image-processing code in, and then output 2 floating-point numbers through a ROS publisher, there should not be any delay. Correct?
Not really correct, you will have a delay generated by the sum of two factors:
Yes, I will have these delays, but the total should be much shorter, since it takes about half a second just to get the image through ROS.
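One way to check where that half second actually goes is to time each stage of the loop separately. A stdlib sketch; `grab_image` and `infer` are hypothetical stand-ins for the real ZED SDK grab call and the network's forward pass:

```python
import time


def timed(fn, *args):
    """Run fn(*args) and return (result, elapsed milliseconds)."""
    t0 = time.perf_counter()
    out = fn(*args)
    return out, (time.perf_counter() - t0) * 1000.0


# Hypothetical stand-ins: replace with the ZED SDK grab() and your model.
def grab_image():
    return [0] * (640 * 360)


def infer(img):
    return (0.1, -0.2)  # e.g. the two floats: steering, throttle


img, t_grab = timed(grab_image)
cmd, t_infer = timed(infer, img)
print(cmd)  # (0.1, -0.2)
```

Comparing `t_grab` and `t_infer` against the end-to-end delay shows how much of the latency comes from acquisition and processing versus transport.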
I am closing the issue, but you can continue to post comments if you need further help.
Hello @Myzhar, I have been following this topic closely and have found myself in a similar situation. I am using ROS 2 Foxy on an NVIDIA AGX Orin. I thought it would be something I could solve by leveraging CUDA or tapping into the more powerful board, but nothing so far. Any suggestions on how I could proceed?
@CrossWax this is a closed issue. |
Apologies @Myzhar,
Hi,
I am running the ZED ROS wrapper on an MIT Racecar, a car-like robot. The node is started by zed_wrapper/zed.launch. While the car was driving, I monitored the left camera's rectified image through rviz. However, the image was not continuous: I got about 1 frame update every 5 seconds. The car has a Jetson TX2, so computation should not be a problem.
Actually, for my application I only need the raw images from both cameras to feed a deep-learning neural network. I don't need the depth image, rectified image, etc.
How can I improve the frame rate?