D435 - IR Cameras light wave length range data #1070
Hi @p0wdrdotcom
The stereo cameras on the D415 and D435 are sensitive to ALL wavelengths from visible to near IR, following the sensitivity curve of the CMOS sensors; there are no IR-blocking filters. This means specifically that they will see visible light, 850nm, and 940nm. The D415 has a Bayer-pattern filter to extract color, while the D435 is monochrome; however, Bayer-pattern filters do not block IR. You should be able to set the exposure low enough to make it work outdoors, but, as you say, you can also add an ND filter. Just be careful with any back-reflections from the projector. So ideally use two separate ND filters, but be sure they are of good optical quality and not angled, or the depth will be affected, as you say.
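The ND filter sizing discussed above can be put in numbers. ND filters are commonly rated by optical density (OD), where the transmitted fraction is T = 10^(-OD). A minimal sketch of the arithmetic (the function names are illustrative, not from any RealSense API):

```python
import math

def nd_optical_density(attenuation: float) -> float:
    """Optical density needed to cut incoming light by `attenuation` times.

    ND filters pass a fraction T = 10**(-OD) of the light,
    so the required OD is log10(attenuation).
    """
    if attenuation <= 1:
        raise ValueError("attenuation must be > 1")
    return math.log10(attenuation)

def transmission(od: float) -> float:
    """Fraction of light an ND filter of the given OD passes."""
    return 10 ** (-od)
```

For example, to cut the light by a factor of 8 you need OD ≈ 0.9 (a common "ND8" filter, passing about 12.5%); a factor of 100 needs OD 2. With two filters stacked (one per imager, as suggested above), their optical densities add.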
[Realsense Customer Engineering Team Comment] Is this question clarified? Do you still need any help?
I'd like to reopen this question, as it does not appear to have been directly answered, and I have the same query. I'm interested from a different perspective: I plan on using the camera to observe and track some fish. There is some data suggesting fish can see 850nm light but not 940nm. I do not see any mention in the documentation above of what the sensitivity range is for the RGB camera. I've looked up the CMOS chip that is used (OV2740) and cannot find a sensitivity curve for it either. Does anyone have any additional information?
You can use 940nm illumination. |
I'm curious about this as well. I'm doing some augmented reality projection mapping, and lo and behold, the projections are visible in the infrared stream. That means if I project anything onto the surfaces I'm reading depth from, it throws off the depth reading. I've never experienced this with a depth camera/projector combo before. How can I tune the camera to disregard visible light in the infrared streams? I read above that the infrared streams combine visible, near, and far IR? Why not limit it to near IR, or only the 850nm the IR projector casts? Can this be done in software? Can I buy a filter and glue it to the front? What are my options? Thank you!
@Sensebellum Depth should not be negatively affected by projecting visible patterns onto the scene, so I would like to understand what you are seeing. Visible light actually helps our depth sensors, but it sounds like you are seeing the opposite? You cannot tune the D435 in software to disregard visible light. However, you can place a physical optical filter (RGB band-pass or IR-cut) in front. There is a white paper on that: https://dev.intelrealsense.com/docs/optical-filters-for-intel-realsense-depth-cameras-d400
@agrunnet Thank you for chiming in. The depth is absolutely negatively affected by the projections (from the real projector, not the laser dot pattern). The camera uses the IR feed (which, as I found out, is not actually IR-only) as the basis for its stereo calculations. Since it is combining color into the IR feed for some reason, the projected images definitely mess with it. Please see the attached images: you can clearly see the File Explorer window in the IR feed. It throws off the depth image and introduces a bunch of artifacts. I cannot boost the laser projector power enough to overcome it, so I have to dim the real projector to about 25% power. (I apologize for the confusion of two important parts of this problem both being called a "projector".)

Why is this? Why would this be handy? Why would Intel call it IR when it actually covers a bunch of different frequencies? So confusing. All I want is the depth. My application is 1m away, and the dots are clearly visible, except when the projections are visible; then everything is thrown off. Is physically putting a filter in front of the camera (on both lenses) really the only option? Thank you for any tips.
Thanks for sharing the images, and the left stream in particular. The scene appears to be saturated. I would recommend trying manual exposure, and even turning the IR projector down further, or turning it off entirely to see if that helps.

Background: the key to good depth is that the imagers see good images, so we just need to make sure nothing is overexposed or underexposed.

Why are they called the "IR imagers"? You are absolutely correct in saying this is confusing. It is a remnant of legacy designs: some of our early-generation depth sensors were indeed IR-only. But as our algorithms improved, we could benefit more from RGB and ambient illumination, and we removed the IR-cut filters. In many cases, though, the designation of the imagers was never changed in software to the more correct "stereo RGB+IR" imagers. Also, some products like the D415 and D455 actually see color, while others like the D435 are monochrome; currently all are sensitive to IR+RGB. (The LR200 was IR-only.) I hope this helps.
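The saturation point above can be illustrated with a toy model, not the camera's actual response: assume a pixel value scales linearly with scene radiance, exposure, and gain, and clips at an 8-bit full scale. All constants below (the radiances and the 1e-3 scale factor) are made up for illustration:

```python
def pixel_value(radiance: float, exposure_us: float, gain: float,
                full_scale: int = 255) -> int:
    """Toy sensor response: linear in exposure and gain, clipped at
    full_scale. The 1e-3 scale constant is arbitrary, for illustration."""
    raw = radiance * exposure_us * gain * 1e-3
    return min(int(raw), full_scale)

# Two neighbouring texture points in a bright scene:
bright, dim = 2000.0, 1500.0

# Exposure left high: both points clip at 255, so the stereo matcher
# sees no contrast between them and depth is lost.
clipped = (pixel_value(bright, 20, 16), pixel_value(dim, 20, 16))

# Exposure turned down manually: both points stay below full scale
# and the contrast between them survives.
ok = (pixel_value(bright, 5, 16), pixel_value(dim, 5, 16))
```

With these made-up numbers, the high-exposure pair comes out (255, 255) while the low-exposure pair comes out (160, 120): the same scene, but only the unsaturated image preserves the texture that stereo matching needs.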
@agrunnet Thank you for the quick response. Please direct my frustration at the problem, not at you. Anyway: turning off the IR projector makes the whole scene a mess. Fundamentally it is overblown because RGB light is getting in there, in this case from the File Explorer. I feel like we are just missing each other on what the problem is. Changing the exposure or gain does not get the RGB light out of the picture; it just amplifies everything the same.

On a separate note: can I upload settings to the camera directly? Having to read from a JSON file is usually hit or miss, and it would be great to save settings to the camera itself. That would be super helpful when moving between machines. I digress.

I also appreciate the notes on the IR distinction. I feel it was a bad move to bake in the convergence of all these frequencies, name it something confusing, and then provide no easy way to undo it. Anyway, I really hope these IR-cut filters work tomorrow, or it's back to the Kinect v1!
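On the JSON settings question above: with librealsense, a JSON preset file is the usual transport for advanced-mode settings. A rough sketch of round-tripping such a file is below; the preset keys are illustrative only (not guaranteed to match librealsense's exact key names), and the commented pyrealsense2 calls are untested assumptions that require a connected camera:

```python
import json

# Hypothetical preset: these key names are illustrative only, not the
# exact keys librealsense emits in its advanced-mode JSON.
preset = {
    "controls-autoexposure-manual": "20",
    "controls-laserpower": "150",
}

def save_preset(path: str, settings: dict) -> None:
    """Write a settings dict to a JSON preset file."""
    with open(path, "w") as f:
        json.dump(settings, f, indent=2)

def load_preset(path: str) -> dict:
    """Read a settings dict back from a JSON preset file."""
    with open(path) as f:
        return json.load(f)

# With a camera attached, pyrealsense2's advanced mode can apply the
# loaded JSON (sketch only, untested here):
#   import pyrealsense2 as rs
#   dev = rs.context().query_devices()[0]
#   adv = rs.rs400_advanced_mode(dev)
#   adv.load_json(json.dumps(load_preset("d435-preset.json")))
```

This keeps the preset in version control or synced between machines, sidestepping some of the "hit or miss" of hand-managed JSON files, though the settings still live on the host rather than in the camera itself.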
Hi
I would like to know what light wavelength band the IR cameras are sensitive to. I know the projector is ~850nm.

My interest is due to wanting to lower the overall amplitude/intensity of light entering the IR cameras. I have a situation where the depth map data for an object in direct sunlight is absent due to overexposure. The gain and exposure on the sensor appear to be at their minimums (gain: 16, exposure: 20), and yet the object's textured surface is overexposed. I figure that a neutral-density filter is a simple optical solution to this, and I'd like to know to what specifications to make the filter(s).

I'm also aware that there could be side effects to adding an optical filter: the camera calibrations could be affected, impacting the quality and accuracy of the depth information. I welcome comment on this as well.
Regards
Geoff