
D435 - IR Cameras light wave length range data #1070

Closed
p0wdrdotcom opened this issue Jan 24, 2018 · 10 comments
@p0wdrdotcom

Required Info
Camera Model D435
Firmware Version ?
Operating System & Version Ubuntu 16.04
Kernel Version (Linux Only) 4.4.0-109
SDK Version 2 (master)

Hi

I would like to know what the light wavelength band the IR cameras are using/sensitive to. I know the projector is ~850nm.

My interest stems from wanting to lower the overall intensity of light entering the IR cameras. I have a situation where the depth map data for an object in direct sunlight is absent due to overexposure. The sensor's exposure and gain appear to be at their minimums (gain: 16, exposure: 20), and yet the object's textured surface is overexposed. I figure a neutral-density filter is a simple optical solution to this; I'd like to know what specifications to use for the filter(s).
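To illustrate the filter-spec math (a hypothetical sketch, not Intel guidance): once you estimate how many times too bright the scene is, the ND filter's optical density and f-stop rating follow directly. The 8x attenuation figure below is only an assumed example.

```python
import math

def nd_density_for_attenuation(factor):
    """Optical density d such that transmission = 10**-d = 1/factor."""
    return math.log10(factor)

def nd_stops_for_attenuation(factor):
    """Equivalent attenuation in f-stops (each stop halves the light)."""
    return math.log2(factor)

# Example: the scene still saturates at minimum exposure/gain and you
# estimate it is ~8x too bright (an assumed figure):
density = nd_density_for_attenuation(8)  # ~0.9 -> an "ND8" (0.9 OD) filter
stops = nd_stops_for_attenuation(8)      # 3 stops
```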

I'm also aware that there could be side effects to adding an optical filter: the camera calibration could be affected, which would impact the quality and accuracy of the depth information. I welcome comment on this as well.

Regards

Geoff

@freemanlo
Contributor

Hi @p0wdrdotcom
You can find the datasheet, with more detail, at the link below. The left/right cameras use visible and IR wavelengths.
https://www.intel.com/content/dam/support/us/en/documents/emerging-technologies/intel-realsense-technology/Intel-RealSense-D40-Series-Datasheet.pdf

@agrunnet
Contributor

The stereo cameras on the D415 and D435 are both sensitive to ALL wavelengths from the visible through the near IR, following the sensitivity curve of the CMOS sensors. There are no IR-blocking filters, so specifically they will see visible light, 850nm, and 940nm. The D415 does have a Bayer-pattern filter to extract color, while the D435 is monochrome; however, the Bayer-pattern filters do not block IR.

You should be able to set the exposure low enough to make it work outdoors, but as you say, you can add an ND filter. Just be careful with any back-reflections from the projector, so ideally use two separate ND filters. Be sure they are of good optical quality and not angled, or the depth will be affected, as you say.
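For the exposure side of that advice, here is a minimal sketch of pinning the stereo sensor at fixed low values. The names mirror pyrealsense2's sensor and rs.option API, but both are passed in as parameters so the call sequence can be checked without a camera attached; the exposure value is just an example.

```python
def set_fixed_low_exposure(depth_sensor, option_enum, exposure_us=1000, gain=16):
    """Disable auto-exposure on a stereo depth sensor and pin low values.

    depth_sensor / option_enum mirror a pyrealsense2 sensor and the
    rs.option enum; they are injected so this can be exercised offline.
    """
    depth_sensor.set_option(option_enum.enable_auto_exposure, 0)
    depth_sensor.set_option(option_enum.exposure, exposure_us)  # microseconds
    depth_sensor.set_option(option_enum.gain, gain)             # 16 is the D435 minimum
```

With a camera attached this would be called roughly as `set_fixed_low_exposure(profile.get_device().first_depth_sensor(), rs.option)` after starting a pipeline.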

@RealSense-Customer-Engineering
Collaborator

[Realsense Customer Engineering Team Comment]
Hi @p0wdrdotcom

Has this question been clarified? Do you still need any help?

@jkenney9a

I'd like to open this question back up, as it does not appear to have been directly answered, and I have the same query. I'm interested in this from a different perspective: I plan on using the camera to observe/track some fish, and there is some data suggesting fish can see 850nm light but not 940nm.

The documentation mentioned above does not say what the sensitivity range is for the RGB camera. I've looked up the CMOS chip that is used (OV2740) and cannot find a sensitivity curve for it either. Does anyone have any additional information?

@agrunnet
Contributor

You can use 940nm illumination.
The sensitivity is reduced but it works.

@CaseyJScalf

I'm curious about this as well.

I'm doing some augmented-reality projection mapping, and lo and behold, the projections are visible in the infrared stream. If I project anything onto the surfaces I'm reading depth from, it throws off the depth reading.

I've never experienced this in the depth camera projector combo before.

How can I tune the camera to disregard visible light in the infrared streams? I read above that the infrared streams combine visible and near-IR light? Why not just limit it to near IR, or only the 850nm the IR projector casts?

Can this be done in software? Can I buy a filter and glue it to the front? What are my options?

Thank you!

@agrunnet
Contributor

@Sensebellum Depth should not be affected negatively by projecting visible patterns onto the scene, so I would like to understand what you're seeing. Visible light actually helps our depth sensors, but it sounds like you are seeing the opposite?
Let me give a specific example of where it helps a lot. If you are indoors, there is very little ambient IR light, so if we block all the visible light we will see a dark room at far distances. Yes, there is an IR projector, but it will not illuminate well beyond 3-5m. Finally, there is often more structure and detail in a visibly lit scene than in a projector-lit one. If we only derive depth from the dots, then our XY resolution depends solely on the number of dots, limiting us to about 5,000 to 20,000 depth points. When the sensors see the real world, you can actually get depth from natural scene texture and start approaching the resolution of the sensor instead.
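To put rough numbers on that trade-off, using the ~20,000-dot upper bound above and the D435's maximum 1280x720 depth-stream resolution:

```python
projector_dots = 20_000     # upper end of the ~5,000-20,000 figure above
depth_pixels = 1280 * 720   # D435 maximum depth-stream resolution
ratio = depth_pixels / projector_dots
# a textured, visibly lit scene can carry roughly 46x more depth
# samples than the dot pattern alone
print(ratio)
```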

You cannot tune the D435 in software to disregard visible light. However, you can place a physical optical filter (RGB band-pass or IR-cut) in front. There is a white paper on that: https://dev.intelrealsense.com/docs/optical-filters-for-intel-realsense-depth-cameras-d400

@CaseyJScalf

@agrunnet Thank you for chiming in. The depth is absolutely negatively affected by the projections (from the real projector, not the laser dot pattern). The camera uses the IR feed (not actually IR-only, as I found out) as the basis for its stereo calculations.

Since it is combining color into the IR feed for some reason, the projected images definitely interfere with it. Please see attached.

In the images you can clearly see how the File Explorer window is visible in the IR feed. It throws off the depth image and introduces a bunch of artifacts. I cannot boost the laser projector power enough to overcome it, so I have to dim the projector to about 25% power.

I apologize for the confusion from two important parts of this setup both being called a "projector".

Why is this? Why would this be handy? Why would Intel call it IR and have it actually be a bunch of different frequencies? So confusing???

All I want is the depth. My application is 1m away. The dots are clearly visible. Except when the projections are visible. Then it throws all of it off.

Is physically putting a filter in front of the camera (over both lenses) really the only option?

Thank you for any tips

[Attached images: IMG_20200921_175251, IMG_20200921_175305]

@agrunnet
Contributor

Thanks for sharing the images, and the left stream in particular. The scene appears to be saturated. I would recommend trying manual exposure, turning down the IR projector even more, or turning it off entirely to see if that helps.
Right now the auto-exposure is not coping well with the centrally lit region, which is sand reflecting the light back (that's not bad, it just needs to be accounted for). You can try to (1) use manual exposure and reduce it and/or reduce the gain, or (2) decrease the auto-exposure set point under the advanced settings. You should see the depth improve dramatically.
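As a sketch of option 2: the names below assume pyrealsense2's advanced-mode bindings (rs400_advanced_mode and rs.STAEControl), and the set-point value is purely illustrative; the advanced-mode object and control class are injected as parameters so the flow can be checked without a camera.

```python
def lower_ae_setpoint(advanced_mode, ae_control_cls, set_point=400):
    """Lower the auto-exposure mean-intensity set point so bright regions
    are driven less hard toward saturation.

    advanced_mode mirrors an rs400_advanced_mode instance and
    ae_control_cls mirrors rs.STAEControl; both are injected for testing.
    """
    ctrl = ae_control_cls()
    ctrl.meanIntensitySetPoint = set_point  # illustrative value
    advanced_mode.set_ae_control(ctrl)
    return ctrl
```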

Background: the key to good depth is that the imagers see good images, so we just need to make sure nothing is overexposed or underexposed.

Why is it called the "IR imagers"? You are absolutely correct in saying that this is confusing. It is a remnant of legacy designs: some of our early-generation depth sensors were indeed IR-only. But as our algorithms improved, we could benefit more from RGB and ambient illumination, so we removed the IR-cut filters. In many cases, though, the designation of the imagers in software did not change to the more correct "stereo RGB+IR". Also, some products like the D415 and D455 actually see color, while others like the D435 are monochrome; currently all are sensitive to IR+RGB. (The LR200 was IR-only.)

I hope this helps.

@CaseyJScalf

@agrunnet Thank you for the quick response. Please interpret my frustration at the problem not you.

Anyways...

Turning off the IR projector makes the whole scene a mess.

Fundamentally it is blown out because RGB light is getting in there, in this case from the File Explorer. I feel like we are just missing each other on what the problem is: changing the exposure or gain does not get the RGB light out of the picture, it just amplifies everything the same.

On a separate note: can I upload settings to the camera directly? Having to read from a JSON file is usually hit or miss. It would be great to actually save settings to the camera; that would be super helpful when moving between machines. I digress.
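For reference, the SDK does let you push a JSON preset to the device programmatically through advanced mode. A minimal sketch follows; the function names mirror pyrealsense2's rs400_advanced_mode API, and the class is injected as a parameter so the flow can be exercised without hardware.

```python
def load_camera_preset(device, advanced_mode_cls, json_path):
    """Enable advanced mode if needed and load a saved JSON preset.

    advanced_mode_cls mirrors pyrealsense2.rs400_advanced_mode; it is
    injected so this can be tested with a stub instead of a camera.
    """
    adv = advanced_mode_cls(device)
    if not adv.is_enabled():
        adv.toggle_advanced_mode(True)
    with open(json_path) as f:
        adv.load_json(f.read())
    return adv
```

With real hardware this would be called roughly as `load_camera_preset(dev, rs.rs400_advanced_mode, "preset.json")` for a device from `rs.context().query_devices()`.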

I also appreciate the notes on the IR distinction.

I feel it is a bad move to bake in the convergence of all these frequencies, then name it something confusing, and then provide no easy way to undo it.

Anyways, I really hope these IR-cut filters work tomorrow, or it's back to the Kinect v1!
