
Multiple (fisheye) camera calibration more fragile with bigger calibration dataset size? #231

Open
skohlbr opened this issue Oct 10, 2018 · 12 comments
Labels
debugging Might be a bug, needs to be looked into

Comments

@skohlbr
Contributor

skohlbr commented Oct 10, 2018

As described in #226, we calibrate multiple fisheye cameras against each other using kalibr, with varying degrees of success. Sometimes calibration works, sometimes it does not, and it is very hard to tell why it fails in one case and works in another. Very often the initial projection estimates are full of NaN values. I have noticed that this seems more likely the larger the number of image samples in the dataset. I would expect imperfections in real camera data to be a possible cause, but I observe exactly the same effect on datasets collected in simulation, which are pretty much perfect (i.e. no time sync offsets, blur, non-rigid mounting, lighting variance, ...). Any hints or ideas on why this happens (and what could be done to improve the success rate) would be useful.

@skohlbr
Contributor Author

skohlbr commented Oct 15, 2018

So I calibrated another set of cameras and discovered the --bag-from-to parameter. When using it to (significantly) restrict the dataset size, calibration basically always finishes. Of course, covering the image space only sparsely with feature observations might reduce calibration quality.

I'd still be very interested in why there is such a strong correlation between dataset size and the likelihood of calibration failure.

@safijari

Have you visualized the AprilTag detection results? The underlying library is intended to be used with rectified images, so using it this way can cause you to lose perfectly detectable tags, which might be causing your issues. As far as I know, kalibr does tag extraction only once, on unrectified images.

It may be beneficial to calibrate the intrinsics separately and then feed rectified images to kalibr to see if the behavior improves.
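For illustration, a minimal sketch of that idea in Python/OpenCV, assuming the intrinsics K and D come from a separate per-camera calibration and that the lens fits OpenCV's fisheye (equidistant) model; the matrices, coefficients and file names below are placeholders, not values from this issue:

```python
# Hypothetical sketch: undistort fisheye frames with separately calibrated
# intrinsics before handing them to the tag detector / kalibr.
import cv2
import numpy as np

K = np.array([[350.0,   0.0, 640.0],
              [  0.0, 350.0, 360.0],
              [  0.0,   0.0,   1.0]])     # placeholder pinhole intrinsics
D = np.array([0.05, -0.01, 0.002, 0.0])   # placeholder equidistant distortion

img = cv2.imread("frame_000000.png")      # hypothetical input frame
h, w = img.shape[:2]

# Build the undistortion maps once, then remap every frame of the sequence.
new_K = cv2.fisheye.estimateNewCameraMatrixForUndistortRectify(
    K, D, (w, h), np.eye(3), balance=0.0)
map1, map2 = cv2.fisheye.initUndistortRectifyMap(
    K, D, np.eye(3), new_K, (w, h), cv2.CV_16SC2)
rectified = cv2.remap(img, map1, map2, interpolation=cv2.INTER_LINEAR)

cv2.imwrite("frame_000000_rect.png", rectified)
```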

@skohlbr
Contributor Author

skohlbr commented Oct 26, 2018

Yes, the AprilTag detections look fine on visual inspection. It's also possible to run calibration on different time slices of the dataset (the smaller the slice, the lower the likelihood of getting NaNs). So I can run the same calibration command line with --bag-from-to 0 30, --bag-from-to 30 60 or --bag-from-to 60 90 and calibration succeeds, but if I run a "too large" time slice (for instance --bag-from-to 0 90), calibration fails with NaNs in the initial projection estimates. This suggests to me that the detected features are fine in principle, but for some reason the optimization gets thrown off by "too much data".
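For reference, a hypothetical helper loop for this slicing workaround (the --bag-from-to usage matches what is described above; the bag name, target file, topics, models and slice length are placeholders to adapt):

```python
# Hypothetical helper: run kalibr_calibrate_cameras on consecutive time
# slices of one bag via --bag-from-to, since shorter slices seem to avoid
# the NaN initial projection estimates.
import subprocess

BAG = "dataset.bag"                # placeholder bag and target names
TARGET = "april_6x6.yaml"
TOPICS = ["/cam0/image_raw", "/cam1/image_raw"]
MODELS = ["omni-radtan", "omni-radtan"]
SLICE_LEN = 30.0                   # seconds per slice (assumption)

for start in (0.0, 30.0, 60.0):
    cmd = ["rosrun", "kalibr", "kalibr_calibrate_cameras",
           "--target", TARGET,
           "--bag", BAG,
           "--bag-from-to", str(start), str(start + SLICE_LEN),
           "--models", *MODELS,
           "--topics", *TOPICS]
    print("Running:", " ".join(cmd))
    subprocess.run(cmd, check=False)   # inspect each slice's report separately
```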

As for your suggestion to split up the optimization, I'd be interested in that, but as far as I'm aware this kind of functionality is not exposed in kalibr at the moment (i.e. it would require some implementation effort).

@ishipachev

@skohlbr
Hi there. To me it seems that with a large number of images the procedure sometimes fails to converge. Various possible reasons:

  1. Optical deviations from the modeled camera model, e.g. when the lens is asymmetric or diverges quite far from the ideal one. Fisheye cameras, especially those with small lenses, tend to have these issues.
  2. The printed pattern has a different scale along each of its sides: because of the printing it is not square but slightly rectangular. In that case, shooting the same target from different poses can lead to divergence with a large number of images.
    When I replaced the target with a proper one, the problem disappeared. Carefully check the width and length, and also the skew (by comparing the diagonals); see the sketch after this list.

Just for some future visitors here.
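A tiny sketch of the diagonal check from point 2, with made-up ruler measurements of the printed board (both sides and both diagonals):

```python
# Sanity check for a printed target: the diagonals of a true rectangle are
# equal, and for a square board they should equal side * sqrt(2).
import math

width_mm  = 500.0    # placeholder measurements
height_mm = 499.0
diag1_mm  = 706.5
diag2_mm  = 707.8

expected_diag = math.hypot(width_mm, height_mm)
print("side mismatch           : %.2f mm" % abs(width_mm - height_mm))
print("diagonal mismatch (skew): %.2f mm" % abs(diag1_mm - diag2_mm))
print("measured vs. expected diagonal: %.2f / %.2f mm" % (diag1_mm, expected_diag))
```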

@goldbattle added the debugging label May 4, 2022
@goldbattle
Collaborator

Hi, can you provide a dataset where this occurs? Thanks

@skohlbr
Contributor Author

skohlbr commented May 5, 2022

Hi @goldbattle I'll check, it's been a while :)

@skohlbr
Contributor Author

skohlbr commented May 5, 2022

@goldbattle I actually found one of the old datasets. It is available via this link: https://energyrobotics.sharepoint.com/:f:/s/ExternalSharing/EoTso8wWoghPkuJ8ONb3KJoBCBC7Z8oPL4DDx4sKcvg-Ew?e=0AfRrQ

Calibration worked for me (for one of the cameras) by running the command line below. The other cameras gave NaN results, so they might be a good test case:
rosrun kalibr kalibr_calibrate_cameras --target april_6x6_50x50cm.yaml --bag sim_insta360_2018-10-02-01-21-20.bag --models omni-radtan omni-radtan --topics /camera360/left/image_raw_throttled /camera360/right/image_raw_throttled

@Nick-0814


Hi @skohlbr,
I see that the camera equipment you are using is also from the insta360 series. Recently I have been trying to use kalibr to calibrate the intrinsic and extrinsic parameters between the two fisheye lenses of the insta360 air. Did you manage to calibrate successfully with kalibr back then?
I keep getting an error during the run (Cameras are not connected through mutual observations, please check the dataset. Maybe adjust the approx. sync. tolerance.)

Best regards,
Nick

[image attachment: problem]

@skohlbr
Contributor Author

skohlbr commented May 6, 2024

Hi @Nick-0814, you're right, calibrating a single Insta fisheye cam with two lenses doesn't work with kalibr (to the best of my knowledge). The trick is to use two cameras and arrange them with a 90 deg offset from each other, so that they have mutual observations. Example (with older Insta cams) from a few years ago:
[image attachment: insta_360_calib]

@Nick-0814


Thanks so much!!! @skohlbr
Bro, I would like to reconfirm the calibration method you used. I also have a Ricoh Theta S camera. If it is used together with the insta360 air, is there anything else to pay attention to besides the fixed offset angle?
Does this method mean that when I use kalibr_bagcreater to build the dataset, I need four image topics, something like /cam0/image_raw, /cam1/image_raw, /cam2/image_raw, /cam3/image_raw, right?

Best regards,
Nick

@skohlbr
Contributor Author

skohlbr commented May 15, 2024

So you have to take care to get all the images, and IIRC you might need some tooling so that their timestamps are all synced up (i.e. the different cameras might have non-matching timestamps since they're not hardware-synced, and you have to fake that, otherwise kalibr will complain). For that reason, you should also make sure to only capture static scenes if possible (perhaps by selectively triggering the 4x image capture).
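A rough sketch of that "fake the sync" step, assuming the cameras really were triggered together and no frames were dropped, so that the i-th image on each topic belongs to the same capture; the bag and topic names are hypothetical, and this is just one way to do it, not something kalibr itself provides:

```python
# Rewrite camera timestamps so that images from the same trigger share one
# stamp. Assumes matching message order and count on every topic, and loads
# everything into memory (fine for a short calibration bag).
import rosbag

TOPICS = ["/cam0/image_raw", "/cam1/image_raw",
          "/cam2/image_raw", "/cam3/image_raw"]   # hypothetical topic names

per_topic = {t: [] for t in TOPICS}
with rosbag.Bag("raw.bag") as inbag:
    for topic, msg, t in inbag.read_messages(topics=TOPICS):
        per_topic[topic].append(msg)

n_groups = min(len(msgs) for msgs in per_topic.values())
with rosbag.Bag("synced.bag", "w") as outbag:
    for i in range(n_groups):
        # Use cam0's stamp as the common stamp for the whole capture group.
        common_stamp = per_topic[TOPICS[0]][i].header.stamp
        for topic in TOPICS:
            msg = per_topic[topic][i]
            msg.header.stamp = common_stamp
            outbag.write(topic, msg, t=common_stamp)
```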

@ishipachev

@skohlbr
Yes, this is correct. You need to trigger the cameras simultaneously to make the whole setup work, and name the images based on their trigger time. Or, if you don't need to calibrate an IMU or rolling shutter, any timestamp works, as long as images that were triggered simultaneously share the same file name.
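A small sketch of that naming scheme for kalibr_bagcreater, assuming its usual dataset layout of cam0/, cam1/, ... folders with nanosecond-timestamp file names (worth verifying against your kalibr version); the source folders and trigger period below are made up:

```python
# Copy simultaneously triggered frames into camN/ folders so that frame i of
# every camera gets the same (fake) timestamp file name.
import os
import shutil

SRC_DIRS = ["left_raw", "right_raw"]     # hypothetical per-camera source folders
DST_ROOT = "dataset"
PERIOD_NS = int(0.5e9)                   # assumed 2 Hz trigger period

for cam_idx, src in enumerate(SRC_DIRS):
    dst = os.path.join(DST_ROOT, "cam%d" % cam_idx)
    os.makedirs(dst, exist_ok=True)
    frames = sorted(os.listdir(src))     # assumes capture order == name order
    for i, fname in enumerate(frames):
        stamp_ns = i * PERIOD_NS         # same fake stamp for frame i of every camera
        shutil.copy(os.path.join(src, fname),
                    os.path.join(dst, "%d.png" % stamp_ns))
```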
