
Matlab's Camera Calibrator App #20

Open
SRHarrison opened this issue Jul 16, 2020 · 5 comments

Comments

@SRHarrison
Hi Brittany,

After some testing, I think that the Camera Calibrator App (included with Matlab's Computer Vision Toolbox) is not only more convenient (no clicking), but in some cases also resolves the lens model better than the Caltech toolbox.

For people wanting to use it, it might be worthwhile to include a translator (similar to your caltech2CIRN.m) from that app's output to the CIRN intrinsics variable (perhaps called camcalibrator2CIRN.m).

Assuming the user exports the camera parameters variable from the Camera Calibrator to the workspace as params, the translation to intrinsics is:

%% Conversion from Camera Calibrator cameraParameters to CIRN intrinsics
intrinsics(1)  = params.ImageSize(2);            % Number of pixel columns
intrinsics(2)  = params.ImageSize(1);            % Number of pixel rows
intrinsics(3)  = params.PrincipalPoint(1);       % U component of principal point
intrinsics(4)  = params.PrincipalPoint(2);       % V component of principal point
intrinsics(5)  = params.FocalLength(1);          % U component of focal length (in pixels)
intrinsics(6)  = params.FocalLength(2);          % V component of focal length (in pixels)
intrinsics(7)  = params.RadialDistortion(1);     % First radial distortion coefficient
intrinsics(8)  = params.RadialDistortion(2);     % Second radial distortion coefficient
intrinsics(9)  = params.RadialDistortion(3);     % Third radial distortion coefficient (only present if 3 coefficients were estimated)
intrinsics(10) = params.TangentialDistortion(1); % First tangential distortion coefficient
intrinsics(11) = params.TangentialDistortion(2); % Second tangential distortion coefficient
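The lines above could be wrapped into the suggested camcalibrator2CIRN.m (the name is just the one proposed here, not an existing file). A minimal sketch, assuming a cameraParameters object from the App; note that the App estimates only two radial distortion coefficients by default, so the sketch guards against a missing third coefficient:

```matlab
function intrinsics = camcalibrator2CIRN(params)
% CAMCALIBRATOR2CIRN  Convert a cameraParameters object exported from
% Matlab's Camera Calibrator App to the 1x11 CIRN intrinsics vector.
intrinsics = zeros(1, 11);
intrinsics(1)  = params.ImageSize(2);            % Number of pixel columns
intrinsics(2)  = params.ImageSize(1);            % Number of pixel rows
intrinsics(3)  = params.PrincipalPoint(1);       % U principal point
intrinsics(4)  = params.PrincipalPoint(2);       % V principal point
intrinsics(5)  = params.FocalLength(1);          % U focal length (pixels)
intrinsics(6)  = params.FocalLength(2);          % V focal length (pixels)
intrinsics(7)  = params.RadialDistortion(1);     % Radial distortion k1
intrinsics(8)  = params.RadialDistortion(2);     % Radial distortion k2
if numel(params.RadialDistortion) >= 3
    intrinsics(9) = params.RadialDistortion(3);  % Radial distortion k3
else
    intrinsics(9) = 0;                           % Third coefficient not estimated
end
intrinsics(10) = params.TangentialDistortion(1); % Tangential distortion p1
intrinsics(11) = params.TangentialDistortion(2); % Tangential distortion p2
end
```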
@burritobrittany
Collaborator

That is very helpful, Shawn! I think that is a good idea, since we have been having issues with the Caltech toolbox on certain versions of MATLAB. I will add it to the list. I am actually hoping to address some issues tomorrow and will start!

@sivaiahborra

Dear SRHarrison,

As I am at the starting stage of using this toolbox, I have been going through the documentation and have finished the first step (movies2frame). I have collected the user inputs such as the GCPs in the video FOV and the extrinsics (X, Y, Z, azimuth, tilt, roll of the fixed camera), but I am stuck on the intrinsics (the 11 camera parameters listed above). How would I get those 11 intrinsic parameters for my camera? Do I need any prerequisites to obtain them? Apologies for the simple questions.

@SRHarrison
Author


Hi @sivaiahborra,
Let me see if I understand correctly... You have collected a video of the surf zone/beach using a UAS hovering 'still'. You were able to extract the video frames and determine the camera position (extrinsic parameters) for each frame in time using Brittany's toolbox. Now you are wondering how to get the intrinsic parameters that A_formatIntrinsics.m assumes you have already gathered?

There are many ways to skin a dog, but typically we introduce people to intrinsic/lens calibration with this presentation on Intrinsic Calibration and Distortion and with this hands-on Lens Calibration Practicum.

Basically, you have to assume that your UAS camera has a fixed aperture and fixed focus (probably not strictly true), and use it to take photos of a graduated checkerboard pattern (which must be on a flat surface, not curved). You effectively 'paint' the entire FOV of the sensor with images of the checkerboard, then try to fit a camera model to them. The Caltech toolbox is free and fairly accessible. I definitely prefer feeding the images to Matlab's Camera Calibrator App, but it is part of a toolbox and probably not worth the extra cost if that's all you need the toolbox for.
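For reference, the App's workflow can also be run programmatically with Computer Vision Toolbox functions. A minimal sketch (the folder name and square size are placeholders, not values from this thread):

```matlab
% Programmatic equivalent of the Camera Calibrator App workflow.
% 'calibration_frames' is a hypothetical folder of checkerboard photos.
images = imageDatastore('calibration_frames');
[imagePoints, boardSize] = detectCheckerboardPoints(images.Files);

squareSize  = 40;  % checkerboard square size in mm (example value)
worldPoints = generateCheckerboardPoints(boardSize, squareSize);

% Estimate intrinsics; request 3 radial coefficients and tangential
% distortion so all 11 CIRN parameters are available afterwards.
params = estimateCameraParameters(imagePoints, worldPoints, ...
    'NumRadialDistortionCoefficients', 3, ...
    'EstimateTangentialDistortion', true);

showReprojectionErrors(params);  % sanity-check the quality of the fit
```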

Structure-from-motion software, e.g. Agisoft Metashape or Pix4D Mapper, determines the intrinsic parameters in a similar way but does not require the checkerboard step; it uses images of the same object from differing views to determine the lens distortion. However, translating those parameters to the format the CIRN toolbox expects is not always straightforward. Brittany or others might have translation suggestions if you plan to go that route.

I suggest taking your UAS, placing it on a table, and just walking around in front of it with the checkerboard displayed. Make sure that you use the exact same camera settings that you used during your flight/video capture. Typically these UAS cameras use a subset of the sensor for video, so you want to make sure you calibrate the lens for that subset. If you later change the resolution used to record video, you'll need to calibrate again for those settings.

@sivaiahborra

sivaiahborra commented Jul 21, 2020 via email

@sivaiahborra

sivaiahborra commented Jul 24, 2020 via email
