Solvers need to track controller transforms #2785
Proposed solution: MRControllers expose a Transform reference to their own transform, so a script can access it either in SourceDetected or by asking the controller manager for the controller of a specific handedness.
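A rough sketch of what that access pattern might look like. This is not the current MRTK API; `HypotheticalController` and its `ControllerTransform` property are stand-ins for the proposal, and the detection callback is simplified:

```csharp
using UnityEngine;

// Hypothetical stand-in for the proposal: a controller that exposes its own Transform.
public class HypotheticalController
{
    public Transform ControllerTransform; // the reference the proposal would add
}

public class ControllerAttacher : MonoBehaviour
{
    private Transform trackedControllerTransform;

    // Imagine this being called from an OnSourceDetected-style handler, or after
    // asking the controller manager for the controller of a given handedness.
    public void OnControllerDetected(HypotheticalController controller)
    {
        trackedControllerTransform = controller.ControllerTransform;
    }

    private void LateUpdate()
    {
        if (trackedControllerTransform == null) { return; }

        // Once cached, the transform can be read passively every frame.
        transform.SetPositionAndRotation(
            trackedControllerTransform.position,
            trackedControllerTransform.rotation);
    }
}
```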
Additionally, asking the controller manager for its controllers has become more difficult as more "device managers" came online. Previously, the OpenVRDeviceManager or the WindowsMixedRealityDeviceManager would be the only active device manager in the scene. Now there's a whole list of them, but only one of them will have tracked motion controllers.
The solvers just need to handle events for the controller they're assigned to. Look at the pointer implementation for details. You should be able to fulfill the requirements with
Solvers have their own update cycle and track towards a specific transform that gets set once. This is important because a benefit of solvers is that they can interact with each other in a known moment, state, and order. Additionally, once attachment points are added (again, as passive transforms that get set once), the pose synchronizer is incompatible.
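To illustrate the "track toward a transform that is set once" pattern described above, here is a minimal sketch. This is not the actual MRTK Solver base class, just a self-contained illustration of the model:

```csharp
using UnityEngine;

// Minimal illustration of a solver-style component: the target transform is
// assigned once, and the component tracks toward it in its own update cycle.
public class FollowTargetSolver : MonoBehaviour
{
    // Set once when the tracked object is resolved; read every frame thereafter.
    public Transform TrackedTransform;

    [Range(0f, 1f)]
    public float Smoothing = 0.1f;

    private void LateUpdate()
    {
        if (TrackedTransform == null) { return; }

        // Deterministic ordering matters: solvers can build on each other's
        // results because they all read the same passively updated transform
        // at a known point in the frame.
        transform.position = Vector3.Lerp(
            transform.position, TrackedTransform.position, Smoothing);
        transform.rotation = Quaternion.Slerp(
            transform.rotation, TrackedTransform.rotation, Smoothing);
    }
}
```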
Totally understand and agree
Totally disagree
Can you elaborate?
I disagree that it is incompatible.
When a passive transform updates, there are no events to update the SourcePoseSynchronizer. That means we won't be able to update the Transform that all the solver handlers are coordinating around. I believe this is Kurtis's original point. In the same vein, we're going to hit problems with tracking APIs like Leap Motion, which provide transforms instead of events; there won't be any update events there either. Unless you know some secret juju, I think Kurtis is right about it being incompatible. If you see some way to connect these disparate models of operation, please share! :)

As I described in proposal #2608, using transforms is a much more Unity-esque way of doing business. If we want to avoid people reaching into the device layer to get transforms, we should have the device layer provide transforms in a way that we're OK with. It's also true that if we want to continue using the pose events to control position, at some point in the future we'll find ourselves building a class that checks if a passive transform has moved and fires an event to simulate a pose update.

I don't think it's a crazy idea to have the SourcePoseSynchronizer (or something like it) update a transform for WMR motion controllers, but that should be happening in the MRTK Constructs layer, not the User Abstraction Layer where Solvers live.
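The polling bridge speculated about above (a class that watches a passive transform and fires an event to simulate a pose update) could look roughly like this. All names here are illustrative, not existing MRTK types:

```csharp
using System;
using UnityEngine;

// Sketch of a bridge between the two models: poll a passive transform each
// frame and raise an event only when it has actually moved.
public class TransformChangeNotifier : MonoBehaviour
{
    public Transform Watched;

    // Consumers (e.g. something event-driven like a pose synchronizer)
    // could subscribe here.
    public event Action<Pose> PoseChanged;

    private Vector3 lastPosition;
    private Quaternion lastRotation;

    private void Update()
    {
        if (Watched == null) { return; }

        if (Watched.position != lastPosition || Watched.rotation != lastRotation)
        {
            lastPosition = Watched.position;
            lastRotation = Watched.rotation;
            PoseChanged?.Invoke(new Pose(lastPosition, lastRotation));
        }
    }
}
```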
They're not "events" in the traditional sense, as they're not that much different than the "passive" updates you're mentioning. (under the hood it's calling the method directly)
We are doing things the "unity" way, as the entire input system is based off of unity's event system... soooo... I'm not really sure what the difference here is.
I don't think so either, in fact, that's exactly what's already happening. 😅 I encourage you guys to go back and look everything over. I'll work on a PR with some proposed changes to adhere to the current architecture. @SimonDarksideJ do you have anything else to add that might help them understand what we're trying to convey?
The "passive" transform you want to listen to is whatever game object's transform has the pose synchronizer. That is the scene's only knowledge of the controller, and it should stay that way, or we break the architecture we've laid out that satisfies the requirements.
Thank you. I can't speak for Kurtis, but I know for me an answer like this is much more helpful than "Totally disagree" because it leads directly to an action item. "Totally disagree" leaves the ball up in the air with no one as the responsible party and no clear next steps. There's no way to figure out what to do in response. On the other hand, it's easy and stress free to wait for a PR. :)
Do you guys want to schedule a call today where we can discuss this much more effectively? |
Imo this task is moot, because you can already do everything you need to do by attaching the solver handler to whatever prefab/game object that has a
In the We can even do the same for the pointers as well considering they also implement |
Added a review comment in the Task description to nail down the actual requirement, as I still don't see it.

Review (SJ): Solvers are components added to GameObjects in a scene. These Solvers then need a "target" to bind to, which may be the user's head or could be a physical controller.

Question: Are ONLY controller entities (head / controller / hand / finger) bound to? Or could it in theory be bound to ANY other GameObject?
Any game object, both in editor and at runtime. I've used them for a menu that tracks a controller by default, but can be attached to a docking object when the user wants to put it down. It has the same solving behavior whether it's following the controller or the docking object. |
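The retargeting behavior described above (a menu that follows a controller by default but can be parked on a docking object) might be sketched like this. The class and member names are illustrative, not the real SolverHandler API:

```csharp
using UnityEngine;

// Illustrative sketch: the same solving behavior runs regardless of which
// transform is currently the target.
public class DockableMenu : MonoBehaviour
{
    public Transform ControllerAttachPoint; // assigned when a controller is detected
    public Transform DockPoint;             // an ordinary GameObject in the scene

    private Transform target;

    // Called by whatever detects the controller, or by the "put it down" action.
    public void AttachToController() => target = ControllerAttachPoint;
    public void Dock() => target = DockPoint;

    private void LateUpdate()
    {
        if (target == null) { return; }
        transform.SetPositionAndRotation(target.position, target.rotation);
    }
}
```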
That's what I worked with for #2795. Also raised Task #2798 to look at updating the SolverHandler to use the updated MRTK configurable metaphor. The Solvers are solid; however, I feel the scene implementation and config could benefit from the pattern used for other MRTK features (and that would also allow the core solvers to move into the core as pure C# scripts).
Overview
The solvers in vNext are currently head-only. They also need to support tracking motion controllers.
Requirements
Acceptance Criteria
A list of the specific things that must be done for approval / acceptance of the task in checklist form.
This may correspond with the above requirements.