
MonoGame integration #6

Closed
rpavlik opened this issue Feb 18, 2015 · 42 comments

@rpavlik
Member

rpavlik commented Feb 18, 2015

Would be based on Managed-OSVR just as the OSVR-Unity integration is.

http://www.monogame.net/

cc @JeroMiya

@rpavlik
Member Author

rpavlik commented Feb 18, 2015

MonoGame is licensed Ms-PL per https://github.com/mono/MonoGame/blob/develop/LICENSE.txt - not sure of compatibility with Apache 2.0. Ideally the integration would be Apache 2.0 so parts could be re-used in other OSVR-related integrations most easily.

@JeroMiya
Contributor

Good point about the license. I don't have any idea. If we don't distribute the binaries/source for MonoGame, can we make our integration Apache 2.0 even though it references MonoGame? Only the final developer would download the MonoGame libraries through NuGet or their own build. (I am not a lawyer)

@VRguy
Contributor

VRguy commented Feb 20, 2015

I’ll ask the lawyers to check what the best way is.


@rpavlik
Member Author

rpavlik commented Feb 20, 2015

> (I am not a lawyer)

Looks like Apache Software Foundation is fine with depending on MsPL stuff, for what it's worth: http://www.apache.org/legal/resolved.html

@JeroMiya
Contributor

Status update:

Most of the OSVR-Unity code has been ported to OSVR-MonoGame. The MonoGame version of the ClientKit is a MonoGame GameComponent and is meant to be registered as a Game service that any other component can inject.

It is mostly a straight port, with some exceptions:

  • I'm wrapping the interface reports in a struct Report<T> where T : struct. Report<T> has Timestamp and Value properties. The Unity callbacks were missing the timestamp component, which I imagine will be useful for doing more accurate input processing.
  • MonoGame doesn't have a built-in concept of a 'behavior' that you attach to a 'game object'. So for all of the behavior classes I just wrote a plain class. I'm playing around with creating a common interface that could be used as a building block for implementing your own behavior system, or adapt it to an existing one.

One stumbling block: I wasn't sure what the MonoGame equivalent to Unity's Matrix4x4.TRS function is. I'm not familiar with Unity, and the docs are a little vague, so I'm looking into this. So currently Pose reports are probably not working correctly.

Some unproven ideas:

  • I'm going to look into doing a simple ring-buffer implementation for OSVR reports. The idea is that OSVR reports would be buffered between Update steps, where you would call Flush to have them all flushed then. This can simplify development if you're not sure how to handle event callbacks outside of the Update step.
  • I'm going to look into implementing report "signals". Much of the time, you don't want to respond to every interface report - you just want to know the latest value or the last two (or more) values. A 'Signal' for an event is just the last N report values for a particular interface (this would of course work well with the ring buffer above).
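
The buffered-reports idea above could be sketched roughly like this (a minimal, untested illustration; the Report<T> field types and the capacity are placeholders, not actual Managed-OSVR types):

```csharp
using System;

// Hypothetical wrapper struct, per the description above.
public struct Report<T> where T : struct
{
    public DateTime Timestamp;
    public T Value;
}

// A fixed-capacity ring buffer: reports accumulate between Update steps,
// and the last N values stay available as a "signal" history.
public class ReportRingBuffer<T> where T : struct
{
    private readonly Report<T>[] buffer;
    private int next, count;

    public ReportRingBuffer(int capacity) { buffer = new Report<T>[capacity]; }

    public void Add(Report<T> report)
    {
        buffer[next] = report;                  // overwrite oldest when full
        next = (next + 1) % buffer.Length;
        if (count < buffer.Length) count++;
    }

    public int Count { get { return count; } }

    // Most recently added report, i.e. the current "signal" value.
    public Report<T> Latest
    {
        get { return buffer[(next - 1 + buffer.Length) % buffer.Length]; }
    }
}
```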

@rpavlik
Member Author

rpavlik commented Feb 24, 2015

The TRS is just creating a matrix from a Translation vector, a Rotation quaternion (Unity wants left-handed, so the binding does that; no idea about MonoGame), and a Scale (just 1).

RE ring buffer: No callbacks get delivered when the context update call isn't being run (if you want them outside your main thread, then call update outside your main thread) - that is, callbacks are all called in your main thread.

RE signals: The underlying C API actually has this (the "state" interface, vs. the "callback" interface) - it's just not in Managed-OSVR because it wasn't in the C++ wrapper yet and callbacks were enough to get us going. I'd suggest looking into using the underlying C API's state methods instead of maintaining additional state outside in your binding.
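
For what it's worth, a TRS equivalent in MonoGame could look like this (illustrative only; XNA/MonoGame matrices use the row-vector convention, so the composition order is scale, then rotation, then translation):

```csharp
using Microsoft.Xna.Framework;

public static class PoseMath
{
    // Rough analogue of Unity's Matrix4x4.TRS for XNA/MonoGame.
    // With row vectors, world = S * R * T.
    public static Matrix Trs(Vector3 translation, Quaternion rotation, float scale)
    {
        return Matrix.CreateScale(scale)
            * Matrix.CreateFromQuaternion(rotation)
            * Matrix.CreateTranslation(translation);
    }
}
```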

@VRguy
Contributor

VRguy commented Feb 25, 2015

Sounds great.

I am sure that in the next few months we will be adding interface types to OSVR and to the Unity/Unreal plugins: whether imager (camera), skeleton, locomotion, eye tracking, etc. How easy would it be to make sure these also go into the MonoGame plugin? Are there some conventions that we should follow to make it easy for these devices to propagate to all the game engines?

Yuval


@JeroMiya
Contributor

@rpavlik The timestamps that come with the raw event callbacks in Managed-OSVR - are they stamped when the context update occurs, or do they come from the hardware drivers at the time the hardware actually makes the report?

The state API - does it only store the last state, or does it have more history? I could see needing up to 4 data-points, preferably with timestamps, for things like smoothing and input-prediction.

Also, MonoGame/XNA uses a right-handed coordinate system. So, Vec3 conversion is just pass-through. I'm currently using this for the Pose conversion, which I think is right (rotation * translation):

            public static Matrix ConvertPose(OSVR.ClientKit.Pose3 pose)
            {
                var rotation = Matrix.CreateFromQuaternion(Math.ConvertOrientation(pose.rotation));
                var translation = Matrix.CreateTranslation(Math.ConvertPosition(pose.translation));
                return rotation * translation;
                //return Matrix.TRS(Math.ConvertPosition(pose.translation), Math.ConvertOrientation(pose.rotation), Vector3.Zero);
            }

I didn't quite understand this comment though:

                // Wikipedia may say quaternions are not handed, but these needed modification.
                return new Quaternion(-(float)quat.x, -(float)quat.y, (float)quat.z, (float)quat.w);

Did these need modification due to another issue unrelated to handedness? For now, I'm just doing a straight conversion without modification.

@yboger I think these two issues cover everything I can think of now:
#3
#5

I may have more ideas once I can get a real sample going.

@rpavlik
Member Author

rpavlik commented Mar 10, 2015

The timestamps are from the hardware drivers, as best as they provide. At worst they are when the plugin sends the message, at best they are when the device driver begins to receive data (indicating that a sample has been taken), but they do come from the device side.

The quaternions shouldn't need any change if you're using a right-handed coordinate system, I believe.

@rpavlik
Member Author

rpavlik commented Mar 10, 2015

Oh, and state right now only holds the latest. For prediction, etc. we envision that being done as an "analysis plugin" (transparent to the game) which would be able to store longer state, etc.

@russell-taylor
Contributor

Note that VRPN quaternions are [X,Y,Z,W]. Not sure if OSVR swizzles them to [W,X,Y,Z] before handing them on.

Russ


@rpavlik
Member Author

rpavlik commented Mar 10, 2015

Yes, we do change the quaternion order, see https://github.com/OSVR/OSVR-Core/blob/master/inc/osvr/Util/QuaternionC.h

Primary reason is for direct memory compatibility with the Eigen math library.

That said, we don't recommend accessing the members directly unless needed (in something like an FFI like p/invoke, so this case is an exception) - there are inline accessors for that, and/or a compatibility layer to your preferred math library (see https://github.com/OSVR/OSVR-Core/blob/master/inc/osvr/Util/QuatlibInteropC.h and https://github.com/OSVR/OSVR-Core/blob/master/inc/osvr/Util/EigenInterop.h )
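
So a MonoGame-side conversion could be as simple as reordering components by name (a sketch; the OSVR.ClientKit.Quaternion type name and its lowercase field names here mirror the p/invoke usage earlier in this thread, not a documented API):

```csharp
using Microsoft.Xna.Framework;

public static class QuatConvert
{
    // OSVR stores quaternions as (w, x, y, z) in memory for Eigen
    // compatibility; XNA's Quaternion constructor takes (x, y, z, w).
    // No handedness change should be needed for a right-handed engine
    // like MonoGame/XNA - only the component order matters.
    public static Quaternion Convert(OSVR.ClientKit.Quaternion q)
    {
        return new Quaternion((float)q.x, (float)q.y, (float)q.z, (float)q.w);
    }
}
```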

@JeroMiya
Contributor

Status update:

  • I went in a slightly different direction for the interface classes. I'm calling them InterfaceSignals. "Signal" is a reactive term for something that is both an event source for a value that changes over time and its current value.
  • VRHead and VREye not yet ported. I'm thinking of making these GameComponents perhaps.
  • Initial checkin is available here (very early): https://github.com/JeroMiya/OSVR-MonoGame

Sorry for the delay in status updates. Dealing with a wave of colds in the family! :)

@JeroMiya
Contributor

Status Update:

  • Added DeviceDescriptor to the project.
  • Ported code from OSVR-Unity into VRHead.cs for reading in settings from DeviceDescriptor.
  • VREye now calculates Viewports.

Challenges:

  • Had to make a copy of DeviceDescriptor.cs from OSVR-Unity to OSVR-MonoGame. It would be better if DeviceDescriptor was in Managed-OSVR so there wasn't duplication of code, but that would make Json.Net a new dependency of the Managed-OSVR DLL. I'm not sure how Unity handles third party dependencies like that. I just use NuGet for OSVR-MonoGame to bring in Json.Net.
  • MonoGame seems to have its own version of FX runtime and shader syntax that supposedly can be cross-platform for DirectX and OpenGL platforms, but I'm struggling to find good documentation on it. Regardless, shaders are not something I have a significant amount of experience with. I may need to seek help in porting the distortion shader from OSVR-Unity. For now, I'm just leaving it undistorted.
  • MonoGame/XNA doesn't have a high-level concept of a "Camera" or object hierarchy with dependency injection/inspector surface, so some aspects of the code (like the 180 degree rotation) take longer to port.

Next steps:

  • Add code to VRHead to draw a scene into the two viewports, one for each eye (or alternatively draw it once for Mono view mode).
  • I should provide an API in VRHead/VREye to calculate view matrices for each eye (or the single view matrix for monoscopic rendering).
  • I've got a small proof of concept sample I'm working on - it's got the split-screen viewports working, but it's not calculating the view matrices yet.

This is still a work in progress. Sorry about the slow pace of updates.

@rpavlik
Member Author

rpavlik commented Mar 29, 2015

Thanks so much for your work on this! I was just at the IEEE VR conference and there was a lot of interest in MonoGame, so I imagine there will be more user-contributors soon.

Would you mind updating the issue about splitting Managed-OSVR with info on the typical way to distribute binaries? (Do people usually use a single NuGet provider, or should we be using MyGet, for example? Is there a preferred way to provide binaries for the two framework versions? How about distributing the native libraries? I'll do some googling once I'm back in the office, but if there are particulars or "community customs" you know that might not come up in a search, I would value your insight.) I know I have to do basically a git subtree split and build-system adjustment; those I have plenty of experience in. Hopefully I can knock out at least this little admin/housekeeping stuff quickly to help everyone's workflow.

@JeroMiya
Contributor

I'm working on porting the VREye transform calculations to MonoGame but the code for it is spread out into multiple places. In the MonoGame implementation, I have the code set properties like EyeRoll, Translation, EyeRotationY, and RotatePi (XNA/MonoGame uses radians by default) and calculate the matrix in one go. Can anyone verify if this is the correct transformation calculation?

            // TODO: Should we cache this transform and recalculate it
            // on an Update method?
            public Matrix Transform
            {
                get
                {
                    var ret = Matrix.CreateRotationY(EyeRotationY)
                        * Matrix.CreateRotationZ(EyeRoll);

                    if(RotatePi)
                    {
                        ret = ret * Matrix.CreateRotationZ(MathHelper.Pi);
                    }
                    ret = ret * Matrix.CreateTranslation(Translation);
                    return ret;
                }
            }

@JeroMiya
Contributor

JeroMiya commented Apr 3, 2015

Status update:

JeroMiya/OSVR-MonoGame@03e080f

I've added a skeleton game project and fixed a few bugs. Added a DrawScene utility in VRHead that takes care of some boilerplate. The project is now referencing the standalone Managed-OSVR project instead of the one in OSVR-Unity.

Known issues:

  • You currently have to manually copy the appropriate native DLLs from Managed-OSVR into the output folder.
  • Monoscopic rendering isn't supported (which is the default when you don't have a headset).
  • VRHead is applying some settings from the /display interface, but it's still not syncing up the view matrices of each eye with an actual orientation or pose interface.
  • The view matrix calculations in VREye are probably way off (see: https://github.com/JeroMiya/OSVR-MonoGame/blob/master/OSVR-MonoGame/OSVR-MonoGame/VREye.cs#L47-L61)
  • Several build configurations from Managed-OSVR's ClientKit project were automatically added to the solution when I added the ClientKit project to the solution for solution references. Need to revert this and find another way - perhaps just including a snapshot build in the OSVR-MonoGame repo, at least until there is a NuGet package.

@rpavlik
Member Author

rpavlik commented Apr 3, 2015

Sounds great! Is there any way you could take a look at setting up the nuget package for managed osvr? I kind of overloaded on Microsoft XML formats getting the msbuild files set up for multiple framework versions and copying the native files the other day.

@rpavlik
Member Author

rpavlik commented Apr 3, 2015

Oh, and the latest Managed-OSVR does support auto-detecting and loading the right DLL versions, using IntPtr.Size and putting files in bit-specific directories. (It uses a SetDllDirectory p/invoke call to do this; not sure how best to port that - maybe setting LD_LIBRARY_PATH?) So you can put the native files in a subdirectory of where the assembly is, named by the number of bits (or other options - look in ClientKit.cs for the search process), and it will load the right ones automatically.
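
The mechanism described could be sketched like this (a simplified illustration, not the actual ClientKit.cs logic; the directory naming is an assumption - check ClientKit.cs for the real search process):

```csharp
using System;
using System.IO;
using System.Runtime.InteropServices;

internal static class NativeLibraryPath
{
    // Windows-only: prepends a directory to the native DLL search path.
    [DllImport("kernel32", SetLastError = true, CharSet = CharSet.Unicode)]
    private static extern bool SetDllDirectory(string path);

    public static void Configure(string assemblyDir)
    {
        // IntPtr.Size is 4 in a 32-bit process, 8 in a 64-bit one.
        int bits = IntPtr.Size * 8;
        string nativeDir = Path.Combine(assemblyDir, bits.ToString());
        if (Directory.Exists(nativeDir))
            SetDllDirectory(nativeDir); // no direct Unix analogue; LD_LIBRARY_PATH
                                        // must be set before process start
    }
}
```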

@JeroMiya
Contributor

Status update: See JeroMiya/OSVR-MonoGame@7e61a6d

I have a question about the combination of orientation interface and the VREye's local transformation. I'm still not very familiar with Unity, so in the OSVR-Unity implementation of VREye/VRHead, I am having trouble figuring out what order the final transformations are being done since some of it was implied in how the scene is laid out. In the OSVR-MonoGame implementation, I'm gathering up all the relevant information from the display interface and the orientation interface and calculating a transformation in one go:

https://github.com/JeroMiya/OSVR-MonoGame/blob/master/OSVR-MonoGame/OSVR-MonoGame/VREye.cs#L48-L65

                    // orientation matrix
                    var orientationRotation = Matrix.CreateFromQuaternion(this.orientationSignal.Value);

                    // eye device rotation
                    var pitch = EyeRotationY;
                    var roll = EyeRoll;
                    var yaw = RotatePi ? MathHelper.Pi : 0f;
                    var eyeRotation = Matrix.CreateFromYawPitchRoll(yaw, pitch, roll);

                    // translate (Should this be eyeRotation * orientationRotation)
                    var ret = orientationRotation * eyeRotation * Matrix.CreateTranslation(Translation);
                    return ret;

Is this the right composition of orientation and eye rotation? Or should it be eyeRotation * orientationRotation * translation?

@DuFF14
Member

DuFF14 commented Apr 14, 2015

In Unity, this is indeed implied in how the scene is laid out, and is implemented in Unity's Transform component. VREyes are children of VRHead, and as such are automatically transformed with VRHead as its orientation gets updated in the callback in OrientationInterface.

I assume you have tested both of those compositions and neither work? Try translating first:
var ret = Matrix.CreateTranslation(Translation) * orientationRotation * eyeRotation;

There is an implementation of a Transform class on this page that may be helpful: http://www.gamedev.net/page/resources/_/technical/math-and-physics/making-a-game-engine-transformations-r3566

@JeroMiya
Contributor

Thanks for the suggestion. I'll try that. I haven't been able to test at all yet without a headset. However, my next steps are: write a mock orientation interface (a.k.a. mouse-look), plus a mock display interface (the default one is monoscopic), and I will also need to add something to my sample Game to actually draw (probably along the axes).

After that, I'll be working on integrating the distortion shader.

@JeroMiya
Contributor

Making progress, but stereoscopic rendering seems off (using OSVR HDK display json). I'm not perceiving any depth:
image

Note: I had to take it out of fullscreen to take the screenshot. Also, tip for those without an HDK yet: I'm testing with a DodoCase cardboard vr, running remote desktop through SplashTop.

@VRguy
Contributor

VRguy commented Apr 16, 2015

Are you using the IPD to set slightly different view frustums for each eye?


@JeroMiya
Contributor

Yes, that was one bug. My port was based on the OSVR-Unity VRHead code, which updated the stereo amount in the Update method, and I was never calling update. It still didn't seem to work after I fixed that, but then I noticed this line:

https://github.com/OSVR/OSVR-Unity/blob/master/OSVR-Unity/Assets/OSVRUnity/src/VRHead.cs#L212

This looks incorrect. I believe this clamped value needs to be divided by 100f, since StereoAmount is meant to be a value between 0.0 and 1.0, not 0 to 100. When I divide by 100 here, I can perceive at least a small amount of stereo:

image

Ah, now I think I have it. My VREye Transformation calculation was in the wrong order. It needs to be orientation * eyeRotation (Identity in my case) * Translation (this is the stereo translation).

OK, these changes are checked in:
JeroMiya/OSVR-MonoGame@1101ed3

@rpavlik
Member Author

rpavlik commented Apr 17, 2015

To be honest, I'd rather they didn't have the "stereo amount" setting in there - I am of the opinion that we should be generating correct stereo images based on IPD/IOD, and if those images don't work well we should document why we deviate from that standard.

@JeroMiya
Contributor

Is there a relationship between the horizontal overlap percent and the IPD? I assumed IPD could be factored into an adjustment to the maxStereo amount. Also, what is IOD? The best match on Google is "integrated optical density", but I'm not sure how that relates to stereo.

@rpavlik
Member Author

rpavlik commented Apr 17, 2015

So horizontal overlap percent is a property of the display device. IPD, interpupillary distance, is usually what we call the general "distance between the eye cameras". However, that's not entirely/exactly the right term for the concept, at least if you work on things related to eyetrackers (since actual distance between pupils can change because you have vergence to keep binocular vision) - in that world, IOD (inter-ocular distance) is the distance between the centers of the eyeballs and is a separate quantity from IPD. Sorry for the confusion.

Max stereo I think is just made up - I hadn't heard of it until I started working with game devs who did VR games. In academic/research VR, you just use the IPD/PD/IOD (and there's one other term I'm forgetting for the same concept) to set up your cameras. So I think the unity controls related to stereo are just tweaks for the "intensity of the 3d effect" that are not "pure" 😄

@VRguy
Contributor

VRguy commented Apr 17, 2015

Horizontal overlap shows how much the horizontal viewing directions of the two cameras overlap.

Examples:

  1. HMD where the left eye can see from -45 to +45 degrees and the right eye can see from -45 to +45 degrees. There is full overlap (100%) because the overlapping direction of both cameras is the entire field of view.
  2. HMD where the left eye sees from -60 to +35 and the right eye sees from -35 to +60. This gives a 120-degree overall horizontal field, but the overlap region is just -35 to +35, so less than 100%.

There is no relationship between the IPD and the horizontal overlap. IPD sets camera position. Overlap sets the viewing direction of the camera.


@JeroMiya
Contributor

Hmm, if that's the case, it looks like OSVR-Unity's (and OSVR-MonoGame which is based on it) VREye translation calculation is incorrect. It is currently scaling the maxStereo amount (we'll call that some kind of IPD/IOD oversimplification?) by the stereo amount:

https://github.com/OSVR/OSVR-Unity/blob/master/OSVR-Unity/Assets/OSVRUnity/src/VRHead.cs#L137-L138

Where the stereo amount is calculated (perhaps incorrectly scaled) from the overlap percent:

https://github.com/OSVR/OSVR-Unity/blob/master/OSVR-Unity/Assets/OSVRUnity/src/VRHead.cs#L212

From your description of overlap percentage, however, it sounds like an overlap percentage less than 100% should cause a rotation of each eye (to the right or left of center respectively), not more or less stereoscopic translation, right? And instead of calculating stereoAmount from the device description, we should be using a user configurable IPD value that has a reasonable default value?

@VRguy
Contributor

VRguy commented Apr 18, 2015

That is correct - less than 100% overlap causes rotation of each eye relative to the center. IPD causes translation.

An example of these calculations in the Unity plugin is here:
https://github.com/OSVR/OSVR-Unity/blob/master/OSVR-Unity/Assets/OSVRUnity/src/VRHead.cs

With regards to IPD - you can take a default value, but this should soon become a system-wide setting associated with the user profile.
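
Putting the two effects together, a per-eye transform might be assembled like this (illustrative only; parameter names and sign conventions are assumptions - verify the outward-rotation direction against the engine's handedness before relying on it):

```csharp
using Microsoft.Xna.Framework;

public static class StereoMath
{
    // IPD translates each eye sideways from the head position;
    // overlap < 100% rotates each eye outward from center, splitting
    // the non-overlapping portion of the FOV between the two eyes.
    public static Matrix EyeTransform(Matrix headOrientation, float ipdMeters,
                                      float horizontalFovRadians,
                                      float overlapPercent, bool leftEye)
    {
        float side = leftEye ? -1f : 1f; // assuming -X is left in view space

        var eyeOffset = Matrix.CreateTranslation(side * ipdMeters / 2f, 0f, 0f);

        float outward =
            -side * horizontalFovRadians * (1f - overlapPercent / 100f) / 2f;
        var eyeRotation = Matrix.CreateRotationY(outward);

        // Mirrors the composition used in VREye earlier in this thread:
        // orientation, then eye rotation, then translation.
        return headOrientation * eyeRotation * eyeOffset;
    }
}
```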


@JeroMiya
Contributor

Status update:

IPD: Updated VRHead to calculate VREye translation based off of a configurable IPD value (in meters, with a median default value) scaled by a configurable WorldUnitsPerMeter value. This replaces the old StereoAmount * maxStereo calculation.

Research: Brushing up on stereoscopic rendering from several sources: the Oculus SDK documentation, the OSVR-Unreal implementation, NVIDIA presentation slides, etc.

Blocking issues/questions: The Oculus SDK and OSVR-Unreal appear to calculate a separate projection matrix for each eye (each slightly translated in either direction). I took a look at the Unreal implementation, but it appears to use metadata from the display device descriptor that is not present in the JSON. The Oculus SDK examples similarly use metadata from the SDK that I don't have from the device descriptor. Am I missing something?

Next steps: Skip the stereo rendering issues for now and move on to the distortion shader, using a placeholder shader for now.

@JeroMiya
Contributor

Lots of progress tonight:
JeroMiya/OSVR-MonoGame@7b06a85
JeroMiya/OSVR-MonoGame@d400bbf
JeroMiya/OSVR-MonoGame@d0aa201

Now almost ready for the distortion shader implementation, as I am now rendering each eye to an offscreen RenderTarget2D. Also did a major cleanup of the solution and project build configurations: the project now references a locally built Managed-OSVR NuGet package, and I updated the MonoGame NuGet references to 3.3 and removed old references.

@JeroMiya
Contributor

JeroMiya commented May 2, 2015

Status Update:
Added model loading to the sample to load a model from SketchUp. Got carried away trying to snazz up the sample - I will need to create a scene without the 3D Warehouse components before checking in, so this is just a proof of concept:

image

Also improved the camera controls in the demo. You can now fly around the scene.

Added controls (Q and E) to adjust the IPD value:
image

Next steps:

  • Create sample scene model from scratch without 3D warehouse components (for licensing reasons).
  • Toggle between FPS and flying camera modes.
  • Some kind of animation? Maybe rain/snow?
  • Figure out why the MonoGame Pipeline is having trouble processing textures in .fbx model files. It currently says it can't find the texture files, but they do in fact exist where they're being referenced from, so... not sure.

@JeroMiya
Contributor

JeroMiya commented May 4, 2015

Status Update:

  • I'm pretty sure stereo rendering is working now! Finally! Part of it was disabling the projection matrix translation, and the other was that the stereo translation for each eye was in world space and not view space, resulting in each eye being swapped.
  • Created a placeholder model/scene without SketchUp's 3D Warehouse components. I'm not a 3D artist by any means, but it's better than triangles in a line.

@JeroMiya
Contributor

Status Update:

  • Uploaded a video demo! https://www.youtube.com/watch?v=doOOLaIuj48
  • Added left and right hand tracking. Hands are rendered with triangles for now.
  • Basic hand tracking offset calibration (C button).
  • Three head tracking modes - Display (if you have a real HMD), Mouselook, and using /me/hands/right as the "head". Press O to cycle through orientation modes.
  • Positional tracking (using PoseInterface) in addition to orientation tracking.

@VRguy
Contributor

VRguy commented May 15, 2015

Looks great, and you should be able to test it with the HDK any day now.

Does it use the JSON files for display parameters or are they hard-coded?

Is there a standard Monogame demo scene (akin to "Tuscany") that would be good to demonstrate?

@JeroMiya
Contributor

It gets display parameters from /display, so as long as you have it configured on the server, it should be able to pick it up. It doesn't yet support distortion, however. I have the HDK now, but I need to pick up a mini-HDMI male-to-female adapter and a DVI cable tomorrow.

I'm still looking for a better scene/model for the sample. I was going to use the palace map from the other OSVR demo but the model format was incompatible (version was too old). Will look into converting the model to a compatible format with blender/etc... and see how that goes.

@JeroMiya
Contributor

Status update:

  • Refactored OSVR-MonoGame for the new interface wrappers in OSVR.ClientKit.
  • Now using the state APIs instead of the callback APIs in the sample.

Next steps:

  • Add XnaButtonInterface, XnaAnalogInterface.
  • Add extension methods for convenience.
  • Use XnaButtonInterface and XnaAnalogInterface for movement and interaction in the sample.
  • Fix issue with headtracking when combining position + orientation. The head is swiveling around a pivot point. It should be rotating in place (at the position point).

@DuFF14
Member

DuFF14 commented Oct 14, 2015

@JeroMiya can we close this? Does MonoGame have its own repo yet?

@JeroMiya
Contributor

Yes, it has its own repo. We can close this.
