MonoGame integration #6
MonoGame is licensed Ms-PL per https://github.com/mono/MonoGame/blob/develop/LICENSE.txt - not sure of compatibility with Apache 2.0. Ideally the integration would be Apache 2.0 so parts could be re-used in other OSVR-related integrations most easily.
Good point about the license. I don't have any idea. If we don't distribute the binaries/source for MonoGame, can we make our integration Apache 2.0 even though it references MonoGame? Only the final developer would download the MonoGame libraries through NuGet or their own build. (I am not a lawyer)

I'll ask the lawyers to check what the best way is.

(I am not a lawyer) Looks like the Apache Software Foundation is fine with depending on Ms-PL stuff, for what it's worth: http://www.apache.org/legal/resolved.html
Status update: Most of the OSVR-Unity code has been ported to OSVR-MonoGame. The MonoGame version of the ClientKit is a MonoGame GameComponent and is meant to be registered as a Game service that any other component can inject. It is mostly a straight port, with some exceptions:
One stumbling block: I wasn't sure what the MonoGame equivalent to Unity's Matrix4x4.TRS function is. I'm not familiar with Unity, and the docs are a little vague, so I'm looking into this. So currently Pose reports are probably not working correctly. Some unproven ideas:
The TRS is just creating a matrix from a Translation vector, a Rotation quaternion (Unity wants left-handed, so the binding does that; no idea about MonoGame), and a Scale (just 1).

RE ring buffer: No callbacks get delivered when the context update call isn't being run (if you want them outside your main thread, then call update outside your main thread) - that is, callbacks are all called in your main thread.

RE signals: The underlying C API actually has this (the "state" interface, vs. the "callback" interface) - it's just not in Managed-OSVR because it wasn't in the C++ wrapper yet and callbacks were enough to get us going. I'd suggest looking into using the underlying C API's state methods instead of maintaining additional state outside in your binding.
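To make the TRS description above concrete, here is a language-neutral numeric sketch (pure Python, not OSVR or MonoGame API) of what a Matrix4x4.TRS-style helper produces conceptually: one matrix combining a scale, a rotation from a quaternion, and a translation. All names here are illustrative.

```python
# Sketch only: builds a 4x4 transform from translation, a unit quaternion
# in (x, y, z, w) order, and a uniform scale, column-vector convention.

def quat_to_matrix(x, y, z, w):
    """3x3 rotation matrix from a unit quaternion (x, y, z, w order)."""
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ]

def trs(translation, rotation_xyzw, scale=1.0):
    """4x4 transform: scale, then rotate, then translate."""
    r = quat_to_matrix(*rotation_xyzw)
    m = [[r[i][j] * scale for j in range(3)] + [translation[i]]
         for i in range(3)]
    m.append([0.0, 0.0, 0.0, 1.0])
    return m

# Identity rotation, translate by (1, 2, 3):
m = trs((1.0, 2.0, 3.0), (0.0, 0.0, 0.0, 1.0))
```

The quaternion-to-matrix expansion is the standard one; with the identity rotation and scale 1 the result is just a translation matrix.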
Sounds great. I am sure that in the next few months we will be adding interface types to OSVR and to the Unity/Unreal plugins: imager (camera), skeleton, locomotion, eye tracking, etc. How easy would it be to make sure these also go into the MonoGame plugin? Are there some conventions that we should follow to make it easy for these devices to propagate to all the game engines? Yuval
@rpavlik The timestamps that come with the raw event callbacks in Managed-OSVR - are they stamped when the context update occurs, or do they come from the hardware drivers at the time the hardware actually makes the report? The state API - does it only store the last state, or does it have more history? I could see needing up to 4 data points, preferably with timestamps, for things like smoothing and input prediction. Also, MonoGame/XNA uses a right-handed coordinate system, so Vec3 conversion is just pass-through. I'm currently using this for the Pose conversion, which I think is right (rotation * translation):

public static Matrix ConvertPose(OSVR.ClientKit.Pose3 pose)
{
    var rotation = Matrix.CreateFromQuaternion(Math.ConvertOrientation(pose.rotation));
    var translation = Matrix.CreateTranslation(Math.ConvertPosition(pose.translation));
    return rotation * translation;
    //return Matrix.TRS(Math.ConvertPosition(pose.translation), Math.ConvertOrientation(pose.rotation), Vector3.Zero);
}

I didn't quite understand this comment though:
Did these need modification due to another issue unrelated to handedness? For now, I'm just doing a straight conversion without modification. @yboger I think these two issues cover everything I can think of now. I may have more ideas once I can get a real sample going.
The timestamps are from the hardware drivers, as best as they provide. At worst they are when the plugin sends the message, at best they are when the device driver begins to receive data (indicating that a sample has been taken), but they do come from the device side. The quaternions shouldn't need any change if you're using a right-handed coordinate system, I believe.

Oh, and state right now only holds the latest. For prediction, etc. we envision that being done as an "analysis plugin" (transparent to the game) which would be able to store longer state, etc.
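Since the state interface only holds the latest report, a binding that wants the "up to 4 data points with timestamps" mentioned earlier would need to keep its own small ring buffer. A minimal sketch of that idea (illustrative names, not OSVR API):

```python
from collections import deque

# Sketch: keep the last N timestamped reports client-side for
# smoothing/prediction, since the state API only stores the latest.

class ReportHistory:
    def __init__(self, capacity=4):
        # deque with maxlen silently drops the oldest entry when full.
        self.samples = deque(maxlen=capacity)

    def add(self, timestamp, value):
        self.samples.append((timestamp, value))

    def latest(self):
        return self.samples[-1] if self.samples else None

history = ReportHistory()
for t in range(6):          # push 6 reports into a 4-slot buffer
    history.add(t, {"pose": t})
# Only the newest 4 samples are retained.
```

A fixed-capacity deque keeps the memory bound constant regardless of report rate, which matters when callbacks fire every frame.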
Note that VRPN quaternions are [X,Y,Z,W]. - Russ

Russell M. Taylor II, Ph.D. russ@reliasolve.com
Yes, we do change the quaternion order; see https://github.com/OSVR/OSVR-Core/blob/master/inc/osvr/Util/QuaternionC.h The primary reason is direct memory compatibility with the Eigen math library. That said, we don't recommend accessing the members directly unless needed (something like an FFI such as P/Invoke, as in this case, is an exception) - there are inline accessors for that, and/or a compatibility layer to your preferred math library (see https://github.com/OSVR/OSVR-Core/blob/master/inc/osvr/Util/QuatlibInteropC.h and https://github.com/OSVR/OSVR-Core/blob/master/inc/osvr/Util/EigenInterop.h )
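The component-order issue above is a classic source of silent bugs. A tiny language-neutral sketch of converting between the two common layouts - [X,Y,Z,W] as VRPN stores it, and [W,X,Y,Z] as some APIs expect. Which order any given OSVR struct exposes should be checked against QuaternionC.h, not assumed from this sketch.

```python
# Sketch: reorder quaternion components between the two common layouts.
# VRPN uses (x, y, z, w); some other libraries expect (w, x, y, z).

def xyzw_to_wxyz(q):
    x, y, z, w = q
    return (w, x, y, z)

def wxyz_to_xyzw(q):
    w, x, y, z = q
    return (x, y, z, w)

q_vrpn = (0.0, 0.0, 0.7071, 0.7071)   # [x, y, z, w]: ~90 deg about Z
q_wxyz = xyzw_to_wxyz(q_vrpn)
```

Round-tripping through both conversions must return the original tuple; that is an easy sanity check to keep in a binding's unit tests.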
Status update:
Sorry for the delay in status updates. Dealing with a wave of colds in the family! :)
Status Update:
Challenges:
Next steps:
This is still a work in progress. Sorry about the slow pace of updates.
Thanks so much for your work on this! I was just at the IEEE VR conference and there was a lot of interest in MonoGame, so I imagine there will be more user-contributors soon. Would you mind updating the issue about splitting Managed-OSVR with info on the typical way to distribute binaries? (Do people usually use a single NuGet provider, or should we be using MyGet, for example? Is there a preferred way to provide binaries for the two framework versions? How about distributing the native libraries? I'll do some googling once I'm back in the office, but if there are particulars or "community customs" you know that might not come up in a search, I would value your insight.) I know I have to do basically a git subtree split and build-system adjustment; those I have plenty of experience in. Hopefully I can knock out at least this little admin/housekeeping stuff quickly to help everyone's workflow.
I'm working on porting the VREye transform calculations to MonoGame, but the code for it is spread out across multiple places. In the MonoGame implementation, I have the code set properties like EyeRoll, Translation, EyeRotationY, and RotatePi (XNA/MonoGame uses radians by default) and calculate the matrix in one go. Can anyone verify that this is the correct transformation calculation?

// TODO: Should we cache this transform and recalculate it
// on an Update method?
public Matrix Transform
{
get
{
var ret = Matrix.CreateRotationY(EyeRotationY)
* Matrix.CreateRotationZ(EyeRoll);
if(RotatePi)
{
ret = ret * Matrix.CreateRotationZ(MathHelper.Pi);
}
ret = ret * Matrix.CreateTranslation(Translation);
return ret;
}
}
Status update: JeroMiya/OSVR-MonoGame@03e080f I've added a skeleton game project, fixed a few bugs, and added a DrawScene utility in VRHead that takes care of some boilerplate. The project is now referencing the standalone Managed-OSVR project instead of the one in OSVR-Unity. Known issues:
Sounds great! Is there any way you could take a look at setting up the NuGet package for Managed-OSVR? I kind of overloaded on Microsoft XML formats getting the MSBuild files set up for multiple framework versions and copying the native files the other day.
Oh, and the latest Managed-OSVR does support auto-detecting and using the right DLL versions, using IntPtr.Size and putting files in bit-specific directories. (It does use a SetDllDirectory P/Invoke call to do this; not sure how to port that best, maybe setting LD_LIBRARY_PATH?) So you can put the native files in a subdirectory of where the assembly is, named by the number of bits (or other options - look in ClientKit.cs for the search process), and it will load the right ones automatically.
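The bit-specific directory idea above can be sketched in a few lines. This is an illustrative Python analogue, not the actual Managed-OSVR search logic: where the managed code uses IntPtr.Size, Python's struct.calcsize("P") gives the pointer size; the directory layout is an assumption for demonstration.

```python
import os
import struct

# Sketch: pick the native-library subdirectory next to the assembly
# based on the bitness of the running process ("32" or "64").

def native_dir(base_dir):
    """Return e.g. <base_dir>/64 for a 64-bit process."""
    bits = struct.calcsize("P") * 8   # pointer size in bits
    return os.path.join(base_dir, str(bits))
```

The nice property of keying on pointer size at runtime is that one package layout serves both architectures without any install-time decision.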
Status update: See JeroMiya/OSVR-MonoGame@7e61a6d I have a question about the combination of the orientation interface and the VREye's local transformation. I'm still not very familiar with Unity, so in the OSVR-Unity implementation of VREye/VRHead, I am having trouble figuring out what order the final transformations are applied in, since some of it is implied in how the scene is laid out. In the OSVR-MonoGame implementation, I'm gathering up all the relevant information from the display interface and the orientation interface and calculating a transformation in one go: https://github.com/JeroMiya/OSVR-MonoGame/blob/master/OSVR-MonoGame/OSVR-MonoGame/VREye.cs#L48-L65

// orientation matrix
var orientationRotation = Matrix.CreateFromQuaternion(this.orientationSignal.Value);
// eye device rotation
var pitch = EyeRotationY;
var roll = EyeRoll;
var yaw = RotatePi ? MathHelper.Pi : 0f;
var eyeRotation = Matrix.CreateFromYawPitchRoll(yaw, pitch, roll);
// translate (Should this be eyeRotation * orientationRotation?)
var ret = orientationRotation * eyeRotation * Matrix.CreateTranslation(Translation);
return ret;

Is this the right composition of orientation and eye rotation? Or should it be eyeRotation * orientationRotation * translation?
In Unity, this is indeed implied in how the scene is laid out, and is implemented in Unity's Transform component. VREyes are children of VRHead, and as such are automatically transformed with VRHead as its orientation gets updated in the callback in OrientationInterface. I assume you have tested both of those compositions and neither works? Try translating first. There is an implementation of a Transform class on this page that may be helpful: http://www.gamedev.net/page/resources/_/technical/math-and-physics/making-a-game-engine-transformations-r3566
Thanks for the suggestion. I'll try that. I haven't been able to test at all yet without a headset. However, my next steps are: write a mock orientation interface (a.k.a. mouse-look) plus a mock display interface (the default one is monoscopic), and I will also need to add something to my sample Game to actually draw (probably along the axes). After that, I'll be working on integrating the distortion shader.
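The ordering question in the exchange above comes down to matrix-multiplication convention. XNA/MonoGame uses the row-vector convention, where in A * B the transform A is applied first. A numeric sketch (pure Python, minimal 4x4 helpers, illustrative only) showing that an eye offset composed as eye * head rides along with the head rotation, while head * eye leaves the offset fixed in world space:

```python
import math

# Row-vector convention throughout (v' = v * M), matching XNA/MonoGame.

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(x, y, z):
    m = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
    m[3][0], m[3][1], m[3][2] = x, y, z   # translation lives in the last row
    return m

def rotation_y(angle):
    c, s = math.cos(angle), math.sin(angle)
    return [[c, 0.0, -s, 0.0],
            [0.0, 1.0, 0.0, 0.0],
            [s, 0.0, c, 0.0],
            [0.0, 0.0, 0.0, 1.0]]

def transform(v, m):
    x, y, z = v
    return tuple(x * m[0][i] + y * m[1][i] + z * m[2][i] + m[3][i]
                 for i in range(3))

eye = translation(0.1, 0.0, 0.0)      # eye offset along +X
head = rotation_y(math.pi / 2)        # head yawed 90 degrees

# eye * head: offset applied first, then rotated with the head.
p = transform((0.0, 0.0, 0.0), mat_mul(eye, head))
```

With eye * head the origin lands at roughly (0, 0, -0.1): the offset has rotated with the head, which is usually the behavior wanted for a per-eye stereo offset.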
Are you using the IPD to set slightly different view frustums for each eye?
Yes, that was one bug. My port was based on the OSVR-Unity VRHead code, which updated the stereo amount in the Update method, and I was never calling Update. It still didn't seem to work after I fixed that, but then I noticed this line: https://github.com/OSVR/OSVR-Unity/blob/master/OSVR-Unity/Assets/OSVRUnity/src/VRHead.cs#L212 This looks incorrect. I believe this clamped value needs to be divided by 100f, since the StereoAmount is meant to be a value between 0.0 and 1.0, not 0 and 100. When I divide by 100 here, I can perceive at least a small amount of stereo. Ah, now I think I have it: my VREye transformation calculation was in the wrong order. It needs to be orientation * eyeRotation (identity in my case) * Translation (this is the stereo translation). OK, these changes are checked in.
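The suspected scaling bug described above reduces to a one-line fix: clamp on the 0-100 scale, then divide by 100 to get the 0.0-1.0 value the rest of the math expects. A minimal sketch with illustrative names (not the actual VRHead code):

```python
# Sketch: map a 0-100 "stereo amount" to the [0.0, 1.0] range.

def clamp(value, lo, hi):
    return max(lo, min(hi, value))

def stereo_amount(raw_percent):
    """Clamp to [0, 100], then normalize to [0.0, 1.0]."""
    return clamp(raw_percent, 0.0, 100.0) / 100.0
```

Without the division, a value like 50 would be used directly where 0.5 was expected, producing a 100x-too-large stereo offset.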
To be honest, I'd rather they didn't have the "stereo amount" setting in there - I am of the opinion that we should be generating correct stereo images based on IPD/IOD, and if those images don't work well we should document why we deviate from that standard.
Is there a relationship between the horizontal overlap percent and the IPD? I assumed IPD could be factored into an adjustment to the maxStereo amount. Also, what is IOD? The best match on Google is "integrated optical density", but I'm not sure how that relates to stereo.
So horizontal overlap percent is a property of the display device. IPD, interpupillary distance, is usually what we call the general "distance between the eye cameras". However, that's not entirely/exactly the right term for the concept, at least if you work on things related to eye trackers (since the actual distance between pupils can change because you have vergence to keep binocular vision) - in that world, IOD (inter-ocular distance) is the distance between the centers of the eyeballs and is a separate quantity from IPD. Sorry for the confusion. Max stereo I think is just made up - I hadn't heard of it until I started working with game devs who did VR games. In academic/research VR, you just use the IPD/PD/IOD (and there's one other term I'm forgetting for the same concept) to set up your cameras. So I think the Unity controls related to stereo are just tweaks for the "intensity of the 3d effect" that are not "pure" 😄
Horizontal overlap shows how much the horizontal viewing directions of the two eyes overlap.

There is no relationship between the IPD and the horizontal overlap.
Hmm, if that's the case, it looks like OSVR-Unity's VREye translation calculation (and OSVR-MonoGame's, which is based on it) is incorrect. It is currently scaling the maxStereo amount (we'll call that some kind of IPD/IOD oversimplification?) by the stereo amount: https://github.com/OSVR/OSVR-Unity/blob/master/OSVR-Unity/Assets/OSVRUnity/src/VRHead.cs#L137-L138 where stereo amount is calculated (perhaps incorrectly scaled) from the overlap percent: https://github.com/OSVR/OSVR-Unity/blob/master/OSVR-Unity/Assets/OSVRUnity/src/VRHead.cs#L212 From your description of overlap percentage, however, it sounds like an overlap percentage less than 100% should cause a rotation of each eye (to the right or left of center, respectively), not more or less stereoscopic translation, right? And instead of calculating stereoAmount from the device description, we should be using a user-configurable IPD value that has a reasonable default?
That is correct - less than 100% overlap causes rotation of each eye
An example of these calculations in the Unity plugin is here:
With regards to IPD - you can take a default value but this should soon
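One plausible way to turn the overlap description above into a per-eye rotation - an assumption for illustration, not the verified OSVR-Unity math - is to rotate each eye outward by half of the non-overlapping portion of the horizontal field of view:

```python
import math

# Hedged sketch: per-eye outward yaw from horizontal overlap percent.
# This formulation is an assumption; check the actual plugin code.

def eye_yaw_radians(horizontal_fov_deg, overlap_percent):
    """Outward yaw for each eye; 100% overlap means no rotation."""
    non_overlap = 1.0 - overlap_percent / 100.0
    return math.radians(horizontal_fov_deg * non_overlap / 2.0)

# 90-degree hfov with full overlap: no per-eye rotation.
# 90-degree hfov with 50% overlap: each eye yaws 22.5 degrees outward.
```

Note this rotation is independent of IPD, consistent with the statement above that overlap and IPD are unrelated quantities.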
== Status update: ==
=== IPD ===
=== Research ===
=== Blocking issues/questions ===
=== Next steps ===
Lots of progress tonight: now almost ready for the distortion shader implementation, as I am rendering each eye to an offscreen RenderTarget2D. Also: major cleanup of the solution and project build configurations, referencing a locally built Managed-OSVR NuGet package, updated MonoGame NuGet references to 3.3, and removed old references.
Status Update:
Status Update:
Looks great, and you should be able to test it with the HDK any day now. Does it use the JSON files for display parameters, or are they hard-coded? Is there a standard MonoGame demo scene (akin to "Tuscany") that would be good to demonstrate?
It gets display parameters from /display, so as long as you have it configured on the server, it should be able to pick it up. It doesn't yet support distortion, however. I have the HDK now, but I need to pick up an HDMI mini male-to-female adapter and a DVI cable tomorrow. I'm still looking for a better scene/model for the sample. I was going to use the palace map from the other OSVR demo, but the model format was incompatible (the version was too old). I will look into converting the model to a compatible format with Blender etc. and see how that goes.
Status update:
Next steps:
@JeroMiya can we close this? Does MonoGame have its own repo yet? |
Yes, it has its own repo. We can close this. |
Would be based on Managed-OSVR just as the OSVR-Unity integration is.
http://www.monogame.net/
cc @JeroMiya