Debugging
It is tedious to rebuild and deploy the app to a mobile device every time you change some code and want to test it. The old ARKit library includes a plugin called ARKitRemote that lets you debug the app with an iOS device without having to deploy the app to the device. With ARKitRemote, you can test all AR features, including plane tracking.
ARFoundation, on the other hand, does not ship with a comparable remote debugging tool. However, our app is not AR-heavy: we only use ARFoundation's plane-tracking functionality. This means that if we separate plane finding from the rest of the logic, we can debug the app logic in the editor without executing any ARFoundation code. We also need a new set of camera controls to navigate the 3D space.
Therefore, I added two additional scripts, DebugSwitch and DebugCamera, and made slight modifications to StartPanelManager and InputManager.
By using a state machine, we place the AR plane-finding logic in the FindPlane state and the non-AR logic in the other states, which gives a natural separation. If we skip the FindPlane state, no ARFoundation code is ever invoked.
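The separation can be sketched roughly as follows. This is an illustrative sketch only: the enum values besides FindPlane and the `StateMachine` class shape are assumptions, not the project's actual code.

```csharp
using UnityEngine;

// Hypothetical state names; only FindPlane is confirmed by the text above.
public enum AppState { Start, FindPlane, Main }

public class StateMachine : MonoBehaviour
{
    public AppState Current { get; private set; } = AppState.Start;

    public void SetState(AppState next)
    {
        // All ARFoundation calls live inside the FindPlane state's logic,
        // so a transition that bypasses FindPlane never touches AR code.
        Current = next;
    }
}
```

Because the AR code is confined to one state, "don't run AR in the editor" reduces to "never enter that state", with no scattered platform checks.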
The entry point of the app is StartPanelManager, where the user is prompted to select a language and toggle the tutorials. In the Confirm() method, the script sets the next state after the user confirms their choice. If it detects that the app is running in the editor, it skips the FindPlane state entirely.
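A minimal sketch of how Confirm() might branch; the state names and the `stateMachine.SetState` call are placeholders, since the real script's API may differ:

```csharp
using UnityEngine;

public class StartPanelManager : MonoBehaviour
{
    // Hypothetical reference to the app's state machine.
    public StateMachine stateMachine;

    public void Confirm()
    {
        // Application.isEditor is true when running inside the Unity editor,
        // so we bypass the AR plane-finding state there.
        if (Application.isEditor)
            stateMachine.SetState(AppState.Main);
        else
            stateMachine.SetState(AppState.FindPlane);
    }
}
```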
Input is managed by the Unity Event System (see Interaction System for more information). In short, it supports multiple kinds of input, including screen touches and mouse clicks, so we do not need significant changes to our input system.
The only thing we need to adapt is the "double tap" on the screen. Although it is not used in the main logic, it is still nice to keep in case we change our interaction. We can treat a mouse right click as a double tap, which is a common approach; a couple of if statements in the InputManager do the trick.
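Those if statements might look something like this. The `OnDoubleTap` handler is a hypothetical name standing in for whatever the InputManager actually calls:

```csharp
using UnityEngine;

public class InputManagerSketch : MonoBehaviour
{
    void Update()
    {
        // In the editor, treat a right mouse click as a double tap.
        if (Application.isEditor && Input.GetMouseButtonDown(1))
        {
            OnDoubleTap();
            return;
        }

        // On device, Touch.tapCount reports consecutive taps (iOS),
        // so a value of 2 or more means a double tap.
        if (Input.touchCount > 0)
        {
            Touch touch = Input.GetTouch(0);
            if (touch.phase == TouchPhase.Began && touch.tapCount >= 2)
                OnDoubleTap();
        }
    }

    void OnDoubleTap()
    {
        // Placeholder: forward to whatever interaction needs the gesture.
        Debug.Log("Double tap detected");
    }
}
```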
To navigate the 3D scene, we need to move the camera. I adapted the script from http://wiki.unity3d.com/index.php/SmoothMouseLook into a simple debug camera that runs only in the editor. The original script was designed for full-screen use and does not behave perfectly in the editor's play mode, but it is enough for simple camera rotation.
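A stripped-down version of the idea (not the SmoothMouseLook script itself, and not the project's exact DebugCamera) could look like this: hold the right mouse button and drag to rotate, and use the standard Horizontal/Vertical axes (WASD or arrow keys) to move.

```csharp
using UnityEngine;

// Minimal editor fly-camera sketch; field defaults are illustrative.
public class DebugCameraSketch : MonoBehaviour
{
    public float lookSpeed = 2f;
    public float moveSpeed = 3f;

    float yaw;
    float pitch;

    void Update()
    {
        // Rotate only while the right mouse button is held.
        if (Input.GetMouseButton(1))
        {
            yaw += Input.GetAxis("Mouse X") * lookSpeed;
            pitch -= Input.GetAxis("Mouse Y") * lookSpeed;
            pitch = Mathf.Clamp(pitch, -89f, 89f);
            transform.rotation = Quaternion.Euler(pitch, yaw, 0f);
        }

        // Move in the camera's local space.
        Vector3 move = new Vector3(Input.GetAxis("Horizontal"), 0f,
                                   Input.GetAxis("Vertical"));
        transform.Translate(move * moveSpeed * Time.deltaTime);
    }
}
```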
Finally, we need to switch between the debug camera and the AR camera, along with all ARFoundation components, depending on the platform we are running on. The DebugSwitch script does exactly that.
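Conceptually, the switch only has to toggle two GameObject hierarchies at startup. The field names below are assumptions about how the scene is organized, not the actual DebugSwitch source:

```csharp
using UnityEngine;

public class DebugSwitchSketch : MonoBehaviour
{
    // Hypothetical scene references: the AR rig (AR Session, AR camera,
    // plane manager, etc.) and the editor-only debug camera.
    public GameObject arSessionRoot;
    public GameObject debugCamera;

    void Awake()
    {
        bool inEditor = Application.isEditor;

        // Disable every ARFoundation component in the editor,
        // and enable the debug camera in its place.
        arSessionRoot.SetActive(!inEditor);
        debugCamera.SetActive(inEditor);
    }
}
```

Doing this in Awake(), before any AR component's own initialization runs, keeps ARFoundation from ever starting up in the editor.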