Facial AR Remote (Preview)
Facial AR Remote is a tool that allows you to capture blendshape animations directly from your iPhone X into Unity by using an app on your phone.
This repository is tested against the latest stable version of Unity and requires the user to build their own iOS app to use as a remote. It is presented on an experimental basis - there is no formal support.
Get the latest release from the Releases tab
How To Use/Quick Start Guide
The project is built using Unity 2018+, the TextMesh Pro package, and the ARKit plugin. Note: The ARKit plugin is only required for the iOS build of the remote, so for convenience you may want to build the remote from a separate project. For best results, use the BitBucket tip of the ARKit plugin.
This repository uses Git LFS so make sure you have LFS installed to get all the files. Unfortunately this means that the large files are also not included in the "Download ZIP" option on Github, and the example head model, among other assets, will be missing.
iOS Build Setup
Set up a new project, either from the ARKit plugin project on BitBucket or a new project with the ARKit plugin from the Asset Store.
(Unity 2018.1) Add TextMesh Pro to the project from Window > Package Manager. The package is added automatically in Unity 2018.2 and above.
Add this repo to the project and set the build target to iOS.
Set up the iOS build settings for the remote. In Other Settings > Camera Usage Description, be sure to add "AR Face Tracking" or something to that effect to the field. Note: You may need to set the Target Minimum iOS Version to 11.3 or higher, and you may also need to enable Requires ARKit Support. Note: The project defaults to ARKit 2.0. To use ARKit 1.5, you will need to set the appropriate symbol in Other Settings > Scripting Define Symbols; this is required only if you have not updated your remote app to support ARKit 2.0. Note: You may need to update your version of the ARKit plugin and update to Xcode 10 or greater for ARKit 2.0.
Open Client.scene and, on the Client game object, set the correct Stream Settings on the Client component for your version of ARKit.
When prompted, import TMP Essential Resources for TextMesh Pro
Enable "ARKit Uses Facetracking" on UnityARKitPlugin > Resources > UnityARKitPlugIn > ARKitSettings
Set Client.scene as your build scene and build the Xcode project.
Editor Animation Setup
Install and Connection Testing
Add TextMesh Pro to your main project, or to a new project, from Window > Package Manager.
Add this repo to the project. Note You should not need the ARKit plugin to capture animation.
To test your connection to the remote, start by opening the example scene.
Be sure your device and editor are on the same network. Launch the app on your device and press play in the editor.
Set the Port number on the device to the same Port listed on the Stream Reader component of the Stream Reader game object.
Set the IP of the device to one listed in the console debug log.
Press Connect on the device. If your face is in view, you should now see your expressions driving the character on screen. Note: You need to be on the same network, and you may have to disable any active VPNs and/or firewalls on the ports you are using; this may be necessary on your computer and/or on the network. Note: Our internal setup used a dedicated wireless router attached to the editor computer, or a Lightning-to-Ethernet adapter.
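If the device refuses to connect, it can help to rule out basic networking problems before debugging the app itself. The following standalone Python sketch (not part of this project) checks whether a TCP port on the editor machine is reachable; the host and port you pass in should come from the console debug log and the Stream Reader component, as described above.

```python
import socket


def lan_ip():
    """Best-effort guess at this machine's LAN IP.

    The editor console log is the authoritative source; this is only a
    convenience for cross-checking. No packets are sent: connecting a UDP
    socket just selects a route and local address.
    """
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]
    except OSError:
        # No route available (e.g. offline); fall back to loopback.
        return "127.0.0.1"
    finally:
        s.close()


def port_reachable(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds.

    A False result while the editor is in play mode usually points at a
    firewall or VPN rather than at the remote app.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Run it from another machine on the same network, e.g. `port_reachable("192.168.1.10", 9000)` with the IP from the console log and the port from the Stream Reader component (both values here are placeholders).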
Known Issues
Character Rig Controller does not support Humanoid Avatar for bone animation.
Animation Baking does not support Humanoid Avatar for avatar bone animation.
A stream source can only connect to a single stream reader.
Some network setups cause an issue with DNS lookup for getting the IP address of the server computer.
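For the DNS lookup issue, entering the editor machine's numeric IP (as printed in the console debug log) sidesteps name resolution entirely. A minimal Python sketch of that fallback, with a hypothetical hostname and an example IP:

```python
import socket


def resolve_or_fallback(hostname, fallback_ip):
    """Try DNS first; on failure, use a numeric IP supplied by the user
    (for this project, the IP shown in the editor console log)."""
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        return fallback_ip


# "editor-machine.local" is a hypothetical name; "192.168.1.10" is an
# example value standing in for the IP from the console log.
print(resolve_or_fallback("editor-machine.local", "192.168.1.10"))
```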
Note: History edits were made on 10/29/2018. If you cloned this repository before that date, please rebase before submitting a Pull Request.