
Create a more convenient method of providing base station geometry. #2

Closed
3 tasks done
ashtuchkin opened this issue Nov 13, 2016 · 6 comments


ashtuchkin commented Nov 13, 2016

Base station positions and direction matrices are currently hardcoded in geometry.cpp (the lightsources array).

I'm currently obtaining them by running the OpenVR "hello world" sample on my main machine and inspecting m_rmat4DevicePose in debug mode. This is obviously not sustainable.

Subtasks:

  • Keep the matrices in EEPROM.
  • Create an interface to set them via USB.
  • Create a way to determine the matrices for a given setup - special application?
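The EEPROM subtask could look roughly like this: serialize the two lightsources-style records into a byte image with a checksum, so a blank or stale EEPROM is rejected on load. A minimal sketch in plain C++ (the struct layout, checksum scheme, and function names are all assumptions, not the project's actual code; on a Teensy the image bytes would go through the Arduino EEPROM library's `EEPROM.read`/`EEPROM.write`):

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical geometry record, mirroring the hardcoded lightsources
// entries in geometry.cpp: a position vector plus a 3x3 direction matrix.
struct BaseStationGeometry {
    float origin[3];    // base station position, meters
    float rotation[9];  // row-major 3x3 direction matrix
};

// Simple rolling checksum so stale/blank EEPROM contents are rejected.
static uint32_t checksum(const uint8_t *data, size_t len) {
    uint32_t sum = 0x12345678u;  // arbitrary seed (an assumption)
    for (size_t i = 0; i < len; i++) sum = sum * 31u + data[i];
    return sum;
}

// Serialize two stations plus checksum into an EEPROM-sized image.
std::vector<uint8_t> save_geometry(const BaseStationGeometry stations[2]) {
    const size_t payload = sizeof(BaseStationGeometry) * 2;
    std::vector<uint8_t> image(payload + sizeof(uint32_t));
    std::memcpy(image.data(), stations, payload);
    uint32_t sum = checksum(image.data(), payload);
    std::memcpy(image.data() + payload, &sum, sizeof(sum));
    return image;
}

// Returns true and fills `stations` only if the checksum matches.
bool load_geometry(const std::vector<uint8_t> &image,
                   BaseStationGeometry stations[2]) {
    const size_t payload = sizeof(BaseStationGeometry) * 2;
    if (image.size() < payload + sizeof(uint32_t)) return false;
    uint32_t stored;
    std::memcpy(&stored, image.data() + payload, sizeof(stored));
    if (stored != checksum(image.data(), payload)) return false;
    std::memcpy(stations, image.data(), payload);
    return true;
}
```

The checksum matters because a fresh EEPROM reads as all-0xFF, which would otherwise be silently interpreted as valid geometry.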

paralin commented Dec 1, 2016

The real Vive controllers and headset do this automatically: since they have many sensors and know the exact geometry of all of them, it must be quite easy to derive the positions of the base stations. The room calibration is primarily used to figure out the ground plane and play area, I suppose. The official devices can determine the base stations' relative positions without calibration or pre-set values. I think it would be possible for us to do this as well if we had multiple sensors on the quad with pre-calibrated positions.


kyranf commented Dec 7, 2016

"it must be quite easy to derive the positions of the base stations." Sure, if you use the solvePnP algorithm; otherwise it's a pain to do from mathematical first principles.

Once the first-stage "lock on" to the base stations (lighthouses) has been acquired, the algorithm changes to just locating the device relative to the lighthouses and the "ground" plane reference previously defined during room setup.
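For context on why the relative pose is recoverable at all: each base station's two laser sweeps give two angles per sensor, which pin down a ray from the station through that sensor; with several sensors at known positions on the body, a solvePnP-style solver can recover the station pose from those rays. A sketch of the angles-to-ray step in plain C++ (the axis conventions and function name are assumptions, not this project's actual code):

```cpp
#include <array>
#include <cmath>

// Convert a lighthouse's two sweep angles for one sensor into a unit ray
// direction in the base station's frame. Convention assumed here: angle_x
// is the horizontal-sweep angle (rotation about the vertical axis), angle_y
// the vertical-sweep angle, both zero along the station's forward (+Z) axis.
std::array<double, 3> sweep_angles_to_ray(double angle_x, double angle_y) {
    double x = std::tan(angle_x);  // horizontal offset at unit depth
    double y = std::tan(angle_y);  // vertical offset at unit depth
    double z = 1.0;
    double n = std::sqrt(x * x + y * y + z * z);
    return {x / n, y / n, z / n};  // normalized direction
}
```

Given four or more such rays to sensors with known body coordinates, the station pose follows from a PnP solve (e.g. OpenCV's cv::solvePnP takes exactly this kind of 3D-point/projection correspondence).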


mpinner commented Dec 8, 2016

I'd be interested in easing the process by which we determine the positions and direction matrices. I got the OpenVR project working and am able to inspect m_rmat4DevicePose as well. The data from there (float[16]) didn't match the format of our lightsources[2] (see below).

I was thinking I could build a little Windows app that outputs the correct lighthouse calibration information (direction and position). I agree the EEPROM is a nice way to go; I'll look into something there as well.
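One way to bridge that format mismatch: treat the 16 values as a 4x4 transform and split out the upper-left 3x3 block (the direction matrix) and the translation column (the position). A sketch assuming a row-major matrix with translation in the last column, i.e. OpenVR's HmdMatrix34_t convention padded to 4x4 (the function name is illustrative; if the sample's Matrix4 is column-major, the indices transpose):

```cpp
// Split a 4x4 device pose (as 16 floats) into the position vector and
// 3x3 direction matrix that the lightsources[] entries expect.
// Assumes row-major storage with translation in m[3], m[7], m[11].
void split_pose(const float m[16], float position[3], float rotation[9]) {
    for (int row = 0; row < 3; row++) {
        for (int col = 0; col < 3; col++)
            rotation[row * 3 + col] = m[row * 4 + col];  // upper-left 3x3
        position[row] = m[row * 4 + 3];  // last column = translation
    }
}
```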

thanks for all your work and documentation.

[screenshot: watched-lighthouse-calibration]


ashtuchkin commented Dec 8, 2016 via email


mpinner commented Dec 12, 2016

Super, this worked great. Thanks!

All the HTC devices are in there. It is fascinating getting the positioning matrices for all the devices.

I wonder what it takes to build your own device into the system?

ashtuchkin (Owner, Author) commented:

I updated the project to allow runtime configuration (with data stored in EEPROM) and created a small app to easily get the base station geometry in the needed format (https://github.com/ashtuchkin/vive-diy-position-sensor-geometry-getter). Will update the docs soon.
