
"Interaction" scene and AR simulation #1052

Open
Creepymulder opened this issue Jan 24, 2023 · 28 comments
Labels
fixed Fixed in newer release

Comments

@Creepymulder

Hi,

I'm trying to find a workflow using AR Foundation and the XR Interaction Toolkit.

If I try the "Interaction" scene with XR Simulation, it doesn't work: clicking with the mouse on a detected surface doesn't do anything. It works fine when built on Android.

Is this a known limitation, or am I doing something wrong? The "SimpleAR" scene works fine with XR Simulation.

Thanks!

@ROBYER1

ROBYER1 commented Jan 26, 2023

Likewise, XR Simulation feels pretty useless for previewing the placement of a product in situ and interacting with the scene through the XR Interaction Toolkit. Any fixes or ideas?

@StefanoCecere

I asked the same question in the Unity forum last week; no reply yet. Dilmer Valecillos confirmed that he couldn't make XRI and AR Simulation work together either.
Let's cross our fingers that some Unity dev will improve this.

@andyb-unity
Collaborator

andyb-unity commented May 22, 2023

Hi folks, apologies for radio silence on this issue.

The XR Interaction Toolkit team has begun working on updates to their AR components to be compatible with XR Simulation. We'll update this thread once they have more to share about that.

The issue is that the XR Interaction Toolkit components currently use the Input System's touchscreen bindings, so you would have to provide touchscreen input to Unity for these components to work in Simulation. In the AR Foundation samples we use Pointer bindings for everything. Pointer resolves to touch input on mobile and mouse input on desktop, allowing us to write a common set of code that executes in Simulation as well as on iOS and Android.

For anyone writing their own input scripts, you could look at our PressInputBase class for inspiration on how to implement simulation-agnostic input.
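For illustration, here is a minimal sketch of that pattern (an assumption about the general approach, not the actual PressInputBase source): an Input System action bound to <Pointer>/press and <Pointer>/position receives mouse input in the Editor/XR Simulation and touch input on device.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch of a simulation-agnostic press handler. <Pointer> resolves to
// Touchscreen on mobile and Mouse on desktop, so the same bindings work
// in XR Simulation and in iOS/Android builds.
public class PointerPressExample : MonoBehaviour
{
    InputAction m_PressAction;
    InputAction m_PositionAction;

    void Awake()
    {
        m_PressAction = new InputAction(type: InputActionType.Button, binding: "<Pointer>/press");
        m_PositionAction = new InputAction(type: InputActionType.Value, binding: "<Pointer>/position");
        m_PressAction.performed += OnPress;
    }

    void OnEnable()
    {
        m_PressAction.Enable();
        m_PositionAction.Enable();
    }

    void OnDisable()
    {
        m_PressAction.Disable();
        m_PositionAction.Disable();
    }

    void OnPress(InputAction.CallbackContext context)
    {
        // Screen position of the press; from here you would typically
        // raycast against detected planes with ARRaycastManager.
        Vector2 screenPosition = m_PositionAction.ReadValue<Vector2>();
        Debug.Log($"Pointer press at {screenPosition}");
    }
}
```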

@apurvvarshney

Has anyone gotten it working in the meantime?

@JasonSpatial

I hope you'll all find this helpful. I've created a GitHub template repo (which means you can quickly create your own repo from it) that uses AR Foundation and XRI together, with a demo scene using XR Simulation to place and manipulate AR objects.

This short video shows how it works: https://www.loom.com/share/35e79df3781a4b9992e516e6c8cb5a7c

The repo is here and I've tried to make the README pretty clear about how to get started: https://github.com/JasonSpatial/ar-template

I hope this helps.

@andyb-unity
Collaborator

Thanks @JasonSpatial!

I saw Jason give a talk at the GameDevGuild conference a couple weeks ago where I learned of his workaround. He uses the Input System's TouchSimulation while working with XRI in the Editor (does not support multi-touch input): https://github.com/JasonSpatial/ar-template/blob/main/Assets/_Project/Scripts/Editor/ToolsEditor.cs

In this case he enables and disables touch simulation via a button in a component's Inspector. This might be a suitable workaround for some folks in the meantime, until XRI supports AR Foundation's XR Simulation.
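As a rough sketch of that idea (not Jason's actual ToolsEditor.cs; this uses a simple ContextMenu toggle instead of a custom Inspector button), the Input System's TouchSimulation can be switched on and off from a small Editor-only component:

```csharp
#if UNITY_EDITOR
using UnityEngine;
using UnityEngine.InputSystem;

// Editor-only helper: while TouchSimulation is enabled, mouse input drives a
// simulated Touchscreen device, so XRI's touchscreen bindings work in Play Mode.
public class EditorTouchSimulationToggle : MonoBehaviour
{
    [ContextMenu("Enable Touch Simulation")]
    void EnableTouchSimulation() => TouchSimulation.Enable();

    [ContextMenu("Disable Touch Simulation")]
    void DisableTouchSimulation() => TouchSimulation.Disable();
}
#endif
```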

@StefanoCecere

I hope you'll all find this helpful. I've created a GitHub template repo (which means you can quickly create your own repo from it) that uses AR Foundation and XRI together, with a demo scene using XR Simulation to place and manipulate AR objects.

This short video shows how it works: https://www.loom.com/share/35e79df3781a4b9992e516e6c8cb5a7c

The repo is here and I've tried to make the README pretty clear about how to get started: https://github.com/JasonSpatial/ar-template

I hope this helps.

Great! Is there any way to get it working with 2022.3?

@JasonSpatial

JasonSpatial commented Jun 6, 2023 via email

@andyb-unity
Collaborator

Hi all, we've learned that XR Interaction Toolkit (XRI) 2.5 will include some mobile AR refactors, one of which is adding support for AR Foundation's XR Simulation. XRI 2.5 is not public yet but I thought followers of this thread would like to know this is coming.

@Creepymulder
Author

Hi all, we've learned that XR Interaction Toolkit (XRI) 2.5 will include some mobile AR refactors, one of which is adding support for AR Foundation's XR Simulation. XRI 2.5 is not public yet but I thought followers of this thread would like to know this is coming.

Thanks for the information! Any ETA for XRI 2.5?

@profkeegan

Thank you big time for this solution, Jason; I'm going to give it a try. I was also curious whether there is a clear timeline for XRI 2.5? I'm trying to help my students tinker with AR Foundation, and it would be great to have a backup option for testing!

@andyb-unity
Collaborator

andyb-unity commented Aug 30, 2023

Hi all, happy to share that XRI 2.5 is out now! It looks like the docs haven't been uploaded yet (this usually takes up to 48 hours after release), but I see 2.5.0 in the public registry, so you should be able to manually request this version in Package Manager.

@StefanoCecere

I see it in the docs (https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@2.5/manual/index.html) but not in the public registry (still 2.4.3) :)

@andyb-unity
Collaborator

The Package Manager in the Unity Editor does not show you the public registry-- it shows you a snapshot of the public registry at the time that the Editor was released. If you add the package by name and specify version 2.5.0, you'll see that the package is in fact in the public registry. https://docs.unity3d.com/Manual/upm-ui-quick.html
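For example, you can achieve the same thing by adding or updating this entry inside the existing "dependencies" block of Packages/manifest.json (equivalent to adding the package by name in the Package Manager UI; keep your other dependencies as they are):

```json
{
  "dependencies": {
    "com.unity.xr.interaction.toolkit": "2.5.0"
  }
}
```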

@Creepymulder
Author

Creepymulder commented Aug 31, 2023

I updated to 2.5.0, but mouse clicks are still not detected in XR Simulation against a detected floor. Am I doing something wrong? I'm using the "AR Placement Interactable" component.

It works fine in the starter scene "ARDemoScene", but that scene doesn't use this component.

@StefanoCecere

The Package Manager in the Unity Editor does not show you the public registry-- it shows you a snapshot of the public registry at the time that the Editor was released. If you add the package by name and specify version 2.5.0, you'll see that the package is in fact in the public registry. https://docs.unity3d.com/Manual/upm-ui-quick.html

Done by modifying the packages manifest file, thank you!

@JasonSpatial

I didn't get the impression that 2.5 would take away the need to simulate touch. In fact, looking at the XR Screen Space Controller, it has input mappings for Touchscreen Gestures, which suggests it's still expecting a simulated touchscreen gesture.

This still works the way it worked before, by enabling Touch Simulation, either via the Input Debugger or via script, as I've done here with a simple editor script for an empty component I've added to AR Session.

[screenshot]

@andyb-unity
Collaborator

I haven't tried this yet myself but it looks like you can set up Editor input by modifying the Touchscreen Gestures Input Action Map as shown on this new page in their docs: https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@2.5/manual/ar-interaction-overview.html

@profkeegan

profkeegan commented Aug 31, 2023

Apologies if this is a Unity forum question, as I am more on the art/design end of things. I've tested your amazing project @JasonSpatial, but I'm curious how I would be able to integrate this script into the AR samples project provided by Unity (https://github.com/Unity-Technologies/arfoundation-samples). I would very much love to leverage the interactable functionality together with image tracking, but I feel like I am running into the same testing roadblocks. I've tried to set up the inputs as mentioned in 2.5 with no luck.

I also neglected to mention that I am working with 2022.3.0f1, in case that presents another complication.

@tropicdragon
Collaborator

@Creepymulder
ARPlacementInteractable is part of the old way of setting up a scene for AR. Here is a doc page for how to set up AR with the new changes in XRI 2.5.

@profkeegan
Just to be sure, include the Input Action Map provided by the XRI 2.5 Starter Assets and reference it from your InputActionManager in the scene. Without it, gestures will not be detected. If you want a working example, please take a look at the AR Starter Assets' ARDemoScene. You can pull these starter assets in by checking out the XRI package samples from the Package Manager window.

Also, only one-finger touch gestures are currently supported in simulation mode.
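To illustrate that requirement (a hedged sketch only; the normal workflow is simply to assign the asset on the InputActionManager in the Inspector, and the property names below assume XRI 2.5's InputActionManager API), you can verify at runtime that the Starter Assets action asset is referenced and enabled:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.XR.Interaction.Toolkit.Inputs;

// Sketch: make sure the XRI Starter Assets input actions are registered with the
// scene's InputActionManager; without them, screen-space gestures are never enabled.
public class EnsureXRIActionsEnabled : MonoBehaviour
{
    // Assign the Input Action Asset from the XRI Starter Assets sample in the Inspector.
    [SerializeField] InputActionAsset m_XRIActions;

    void Start()
    {
        var manager = FindObjectOfType<InputActionManager>();
        if (manager == null || m_XRIActions == null)
        {
            Debug.LogWarning("Missing InputActionManager or action asset reference.");
            return;
        }

        if (!manager.actionAssets.Contains(m_XRIActions))
        {
            manager.actionAssets.Add(m_XRIActions);
            manager.EnableInput(); // re-enable so the newly added asset's actions are active
        }
    }
}
```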

@vrdave-unity

Hey all, let me see if I can answer some of your questions about the new AR integration with XRI 2.5.0.

As @tropicdragon pointed out, the goal is to replace the existing AR Interactables (placement, rotation, scale, selection, translation) with the new system: you simply add the AR Transformer to any standard XR Grab Interactable, and it should work in much the same way the individual AR Interactable components did, but with the rest of the interaction system as well. Since this is a replacement for the old way of doing things, the older AR Interactables do not work with the new XR Screen Space Controller, so a scene using them would need to be set up the original way to work.

I highly recommend using the prefabs and Input Action Map from the AR Starter Assets sample if you plan to use XRI in your AR scene.

We plan to deprecate the older AR Interactable classes in our upcoming XRI 3.0 release, but we wanted to get these new changes out to gather feedback about the AR Transformer and Screen Space Controller, so please let us know what you like/don't like and how we can improve these additions for the next iteration.
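A minimal sketch of that setup, based on the description above (the ARTransformer namespace is assumed to be UnityEngine.XR.Interaction.Toolkit.Transformers; check the XRI 2.5 docs, and prefer the AR Starter Assets prefabs where possible):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;
using UnityEngine.XR.Interaction.Toolkit.Transformers;

// Sketch: configure a placed AR object so it can be grabbed and manipulated
// through the regular XRI interaction system. Assumes the GameObject already
// has a Collider so the screen-space interactor can hit it.
public class ARGrabbableSetup : MonoBehaviour
{
    void Awake()
    {
        // A standard XR Grab Interactable handles selection and manipulation
        // (a Rigidbody is added automatically if one is not present).
        gameObject.AddComponent<XRGrabInteractable>();

        // The AR Transformer constrains that manipulation to the AR context
        // (translate along detected planes, rotate, scale), replacing the old
        // ARPlacementInteractable / ARRotationInteractable / etc. components.
        gameObject.AddComponent<ARTransformer>();
    }
}
```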

@Creepymulder
Author

@Creepymulder ARPlacementInteractable is part of the old way of setting up a scene for AR. Here is a doc page for how to set up AR with the new changes in XRI 2.5.

Also, only one-finger touch gestures are currently supported in simulation mode.

Thanks ! I'll look into it

@JasonSpatial

I didn't get the impression that 2.5 would take away the need to simulate touch. In fact, looking at the XR Screen Space Controller, it has input mappings for Touchscreen Gestures, which suggests it's still expecting a simulated touchscreen gesture.

This still works the way it worked before, by enabling Touch Simulation, either via the Input Debugger or via script, as I've done here with a simple editor script for an empty component I've added to AR Session.

[screenshot]

I stand corrected. I just ran the demo scene in 2.5 again and realized I had simulated touch enabled the first time. Excited to say that the new system works without the need to toggle simulated touch! 🎉

Fantastic work, Unity team.

@profkeegan I'll be updating my sample repo to use the new approach. The important thing to note is that the new XRI being discussed here doesn't change AR Foundation's functionality (which controls your image tracking). It also looks like any Unity version from 2021.3 and newer should work.

@profkeegan

Everything seems to be working quite nicely now! Thanks everyone. I'll keep an eye out for the update @JasonSpatial

@ROBYER1

ROBYER1 commented Sep 25, 2023

On import of the AR Starter Assets samples, all the input action maps are missing on the XR Screen Space Controller. Is this expected? Nothing works in the sample scene because of this.
[screenshot]

@tropicdragon
Collaborator

@ROBYER1
Just to be sure, did you pull the latest version of the (Non-AR) Starter Assets as well? That should fix the issue.

@ROBYER1

ROBYER1 commented Sep 25, 2023

@ROBYER1 Just to be sure, did you pull the latest version of the (Non-AR) Starter Assets as well? That should fix the issue.

I can confirm I was using the latest Starter Assets for both. I deleted the whole Samples folder, then updated to XRI 2.5.1 and pulled down the Starter Assets first and then the AR Starter Assets, and the issue was still there.

I will test in an empty project tomorrow to check, as I was using it in this repo.

Edit: It worked fine in a fresh project when I imported it. In the current project, which is a clone of this repo, the issue remained, even when I exported the XRI Samples folder as a .unitypackage file from the working fresh project and imported it into the cloned project. It doesn't really matter though, as I have at least got it working in another project :) thanks!

@ROBYER1

ROBYER1 commented Sep 26, 2023

Now that I have the samples working, I have a quick question about single-object spawning/selection that I posted here:
#1104

I was trying to use a workaround suggested here, which was for AR Foundation, but these new XRI 2.5.1 components are mostly very different, and I can't find a way to automatically force the selection of a single spawned AR object:
https://forum.unity.com/threads/how-to-place-object-with-ar-select-interactor-in-a-selected-state.1316028/#post-8338452

@andyb-unity andyb-unity added the fixed Fixed in newer release label Sep 28, 2023