Calibration between Unity space and DSLR or HoloLens Camera and DSLR #322
Comments
Hello, the calibration process does a few things:
When using this calibration information, the HoloLens drives the Unity camera location. This is based on the HoloLens's perceived location in the world, which is not equal to the actual physical location of the PV camera used during calibration, so some error is introduced by this assumption. In an ideal world, the physical transform for the DSLR camera would combine the transform from the HoloLens's perceived self to the physical PV camera with the transform from the PV camera to the DSLR camera. This error is likely small enough not to detract from the end filming experience, but it may be worth fixing long term if information is available on the physical location of the PV camera relative to the HoloLens's position tracking.

Is there more information on what sort of misalignment you are seeing? Calibration has been a major pain point for many of the contributors and users of Spectator View Pro. Could you share the CalibrationData.txt you generated so that we can take a look at the transforms on our end?
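The "ideal world" composition described above can be sketched with homogeneous 4x4 transforms. This is a minimal illustration, not the Spectator View implementation; all offsets below are made-up values, and the rotations are kept as identity for readability:

```python
import numpy as np

def make_transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Hypothetical offsets for illustration only: the HoloLens's tracked origin
# sits a few centimetres from the physical PV camera, which in turn has a
# calibrated offset to the DSLR.
T_world_hololens = make_transform(np.eye(3), [1.0, 1.5, 2.0])     # from head tracking
T_hololens_pv = make_transform(np.eye(3), [0.0, -0.03, 0.05])     # PV camera offset (assumed)
T_pv_dslr = make_transform(np.eye(3), [0.10, 0.0, 0.0])           # from calibration

# Ideal DSLR pose: compose both offsets instead of treating the tracked
# origin as if it were the PV camera itself.
T_world_dslr = T_world_hololens @ T_hololens_pv @ T_pv_dslr
print(T_world_dslr[:3, 3])  # -> [1.1  1.47 2.05]
```

Skipping the middle `T_hololens_pv` factor is exactly the approximation described above: the error it introduces is the (small) physical offset between the tracked origin and the PV camera, carried through to the DSLR pose.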
It is possible to obtain camera extrinsic information (the physical location of the PV camera). However, it doesn't appear to be possible to obtain this information when grabbing a PV camera frame via the Mixed Reality Capture REST API. If you built your own logic for handing frames from the HoloLens to the calibration app, you could access the information in a similar manner to here:
The manufacturer-provided focal length and principal point will be more accurate than the values you obtain with OpenCV. The calibration app roughly supports providing your own intrinsic information, which I would suggest doing to try to get a more accurate calibration. I would also suggest some changes based on the richness of the information you have.
Thanks a lot for the detailed answer! I will give it a try.
I have some misalignment between my two coordinate systems. Is the calibration done between Unity space and the DSLR, or between the HoloLens webcam and the DSLR? In the second case, I would need to apply another transformation, right?