Facial motion capture on newer iOS devices
- Install the LIVE Face phone app on your iPhone. This app is neither written nor supported by me.
- Connect the LIVE Face app to your network, over either USB or WiFi.
- Add the VaM Facial Motion Capture plugin to a Person atom.
- Enter the IP address shown in the LIVE Face app and click Connect.
- Have fun
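Before clicking Connect, it can help to verify that the phone is reachable from the machine running VaM. Below is a minimal sketch in Python; `can_reach` is a hypothetical helper (not part of the plugin), and the port number is an assumption, so use whatever the LIVE Face app actually reports:

```python
import socket

def can_reach(ip: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to ip:port succeeds within timeout."""
    try:
        with socket.create_connection((ip, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: the IP address shown in the LIVE Face app; the port is an
# assumption -- check what the app displays.
# can_reach("192.168.1.2", 999)
```

If this returns False, fix the network connection (firewall, WiFi vs. USB) before troubleshooting the plugin itself.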
Requires VaM 1.19 or newer.
Download LFE.FacialMotionCapture.(version).var from Releases. Save the .var file in (VAM_ROOT)\AddonPackages.
If you have VaM running already, click on the Main UI > File (Open/Save) > Rescan Add-on Packages button so that this plugin shows up.
After you connect to the phone for the first time, a configuration file is created that you can edit. A friendlier way to do this is planned; for now it must be edited by hand.
Saves\PluginData\lfe_facialmotioncapture.json
{
  "clientIp" : "192.168.1.2",
  "mappings" : {
    "Brow Down Left" : {
      "morph" : "Put whatever morph name you want here",
      "strength" : "1"
    },
    "Brow Down Right" : {
      "morph" : "or here",
      "strength" : "1"
    },
    "Brow Inner Up" : {
      "morph" : "", // an empty morph name disables this blendshape
      "strength" : "1"
    },
    ...
  }
}
Reload the plugin after editing this file.
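Since the file must be edited by hand, a small script can make bulk changes less error-prone. Here is a minimal sketch that assumes the structure shown above; `disable_blendshape` is a hypothetical helper, not part of the plugin:

```python
import json

def disable_blendshape(config: dict, name: str) -> dict:
    """Blank the morph name for one incoming blendshape (empty = disabled)."""
    config["mappings"][name]["morph"] = ""
    return config

# Structure mirrors the mapping file shown above.
config = {
    "clientIp": "192.168.1.2",
    "mappings": {
        "Brow Inner Up": {"morph": "Some Morph", "strength": "1"},
    },
}

disable_blendshape(config, "Brow Inner Up")

# To apply for real, load the mapping file from Saves\PluginData,
# edit it, and write it back, e.g.:
# with open(path, "w") as f:
#     json.dump(config, f, indent=2)
```

Note that `json.dump` writes strict JSON, so any `//` comments in the file would need to be stripped before parsing it with the standard `json` module.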
The recordings can be imported into the Timeline plugin by Acid Bubbles, available here: https://hub.virtamate.com/resources/timeline.94/
The jaw physics on Person atoms is often too tight, which can cause the motion capture to perform poorly. Try adjusting Jaw Hold Spring, Jaw Hold Damper, and even Tongue Collision.
Many auto-expression behaviors will get in the way of motion capture. Try disabling or tuning those settings.
- an easier way to choose which morph is used for each incoming blendshape from the phone
- find morphs better suited to motion capture. Can anyone convert these to G2F/G2M? https://sharecg.com/v/92621/browse/21/DAZ-Studio/FaceShifter-For-Genesis-8-Female
Icon: Face Recognition by mungang kim from the Noun Project