BlendArMocap

BlendArMocap is a Blender add-on to perform hand, face and pose detection using Google's MediaPipe. The detection requires a movie file input or a webcam connected to the computer. The detected data can easily be transferred to rigify rigs.

Setup Instructions

Blender has to be started with elevated permissions in order to install the required packages opencv and mediapipe via the add-on's preferences. An internet connection is required to install the packages, and it's recommended to disable VPNs during the installation process. Blender may also have to be restarted during the installation. To access the webcam feed, Blender usually has to be started with elevated permissions as well.

Starting Blender with elevated permissions

Windows
Right-click the Blender application and choose: "Run as administrator"

Mac
Start Blender as admin by using the terminal:
Navigate to Blender: cd /Applications/Blender/Contents/MacOS
Run Blender as admin: sudo ./Blender

Linux
Start Blender as admin using the terminal:
Navigate to Blender: cd /usr/bin
Run Blender as admin: sudo ./blender

When running Blender as admin using sudo in the terminal, you will be asked for the admin password. Once the add-on packages are installed and your terminal has permission to access your camera, you can start Blender with just ./Blender.

Capabilities

BlendArMocap uses OpenCV to access the user's webcam or a movie file and Google's MediaPipe to perform hand, face and pose detection in Blender. The detected data can be used to drive a rigify rig.
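
Under the hood the pipeline is plain OpenCV plus MediaPipe. The following is a rough standalone sketch (run outside Blender, not the add-on's actual code) of such a detection loop; the device slot, window name and confidence value are arbitrary assumptions.

```python
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=2, min_detection_confidence=0.5)
cap = cv2.VideoCapture(0)  # webcam device slot 0

while cap.isOpened():
    success, frame = cap.read()
    if not success:
        break
    # MediaPipe expects RGB, OpenCV delivers BGR
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        wrist = results.multi_hand_landmarks[0].landmark[0]
        print(f"wrist (normalized): {wrist.x:.3f} {wrist.y:.3f} {wrist.z:.3f}")
    cv2.imshow("detection feed", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to stop, like the add-on
        break

cap.release()
cv2.destroyAllWindows()
hands.close()
```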

Transferable data to rigify rigs

Hands

  • Hand rotation
  • Finger x-angles
  • Finger y-angles

Face

  • Head rotation
  • Mouth open and close
  • Relative mouth corners
  • Eye open and close
  • Eyebrow movement

Pose

  • Hand position
  • Hand orientation
  • Elbow position
  • Shoulder position
  • Shoulder rotation
  • Hip rotation
  • Knee position
  • Ankle position
  • Foot orientation

Detection

Type
Select the data type you want to use as input:

  • Stream
  • Movie

Webcam Device Slot
If you have multiple webcam devices, you may have to change the integer value until you find the device you want to use.
By default, the Webcam Device Slot should be 0.
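
If you are unsure which slot maps to which camera, a small hypothetical helper like the one below (run in any Python environment with opencv-python installed, not part of the add-on) can probe the first few slots; the range of five is an arbitrary assumption.

```python
import cv2

# Probe the first five device slots and report which ones can be opened.
for slot in range(5):
    cap = cv2.VideoCapture(slot)
    if cap.isOpened():
        print(f"Webcam Device Slot {slot} is available")
    cap.release()
```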

File Path
Select the path to your movie file. Preferably reduce its file size before starting the detection.

Key Step
The Key Step determines the frequency of keyframes made in Blender. Adjust the Key Step so the detection results in Blender match the recording speed.
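
As an illustration of what a key step means (this is not the add-on's code), keying a value every fourth frame via Blender's Python API looks roughly like this; the empty name and the dummy detection data are assumptions.

```python
import bpy

key_step = 4  # one keyframe every 4 frames
detected_locations = [(0.0, 0.0, 0.1 * i) for i in range(10)]  # dummy detection data

empty = bpy.data.objects.new("hand_wrist", None)  # hypothetical driver empty
bpy.context.collection.objects.link(empty)

for i, location in enumerate(detected_locations):
    empty.location = location
    empty.keyframe_insert(data_path="location", frame=1 + i * key_step)
```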

Target
Select the detection target:

  • Hands
  • Face
  • Pose
  • Holistic

Start Detection
When pressing the Start Detection button, a window opens which contains the webcam or movie feed and the detection results. The detection results are recorded in Blender at runtime. You can modify the recording starting point by changing the keyframe start in Blender, as in the sketch below.
You may want to deactivate the rig while detecting if you have transferred animation results previously. To finish the recording, press 'Q' or the "Stop Detection" button.
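
For example, the scene's start frame can be changed from Blender's Python console as well (a trivial sketch; the frame number is arbitrary).

```python
import bpy

# Move the keyframe start so new recordings begin at frame 50
bpy.context.scene.frame_start = 50
```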

Animation Transfer

The detection results can be transferred to a generated rigify rig.
The new rigify face is not supported yet.

Drivers
Select the driver collection you want to transfer.
You can select the parent collection, or just the collection containing the drivers of your choice.
Do not change the collection names or the names of the empty objects.

Rig
Select the generated rigify rig you want to transfer to.
Do not change the bone names of the rigify rig.

Overwrite Drivers
When selected, drivers and constraints will be overwritten with default values.

Leg Transfer (Experimental)
By default, only the upper-body motion of the detection results is transferred.
This feature is only visible while either 'Holistic' or 'Pose' is selected as the detection target.

Start Transfer
Transfers detection results from the selected collection to the rigify rig.
Once the transfer has taken place, new recordings will be applied instantly to the rig.
There is no need to transfer twice.

How to manipulate transfer results

Manual
Translate or rotate the bone you want to offset. Make sure to create keyframes while doing so, as the correction may change the entire animation.
Pose Mode > Select control bone > Object Properties

Constraints
The data is copied from the drivers by constraints. In some cases, it might be useful to change or remove constraints.
Pose Mode > Select control bone > Bone Constraints

Custom Properties
On some bones, custom properties will be added upon the transfer. The custom properties help to manipulate the minimum and maximum mapping values of the driver.
Pose Mode > Select control bone > Object Properties > Custom Properties
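
A hedged sketch of inspecting those custom properties from Blender's Python console; the rig name, bone name and property name below are assumptions for illustration, not the add-on's actual identifiers.

```python
import bpy

rig = bpy.data.objects["rig"]            # generated rigify rig (assumed name)
bone = rig.pose.bones["f_index.01.L"]    # example control bone (assumed name)

# List the custom properties added by the transfer
for key in bone.keys():
    print(key, bone[key])

# bone["mapping_factor"] = 1.2  # hypothetical property name; tweak min/max mapping
```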

Offset time
Use this if you want to change the speed of an animation (a scripted sketch follows these steps).

  1. Select the drivers in the collection you want to retime, or select all.
    Right click collection > Select Objects
  2. Navigate to or open the graph editor and make sure the curves of the objects are selected.
    Timeline > 'A'
  3. Make sure the currently selected frame is at the start of your animation (usually 0).
  4. Scale the timeline to increase or decrease the offset between keyframes.
    Timeline > 'S'
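
The same retiming can be scripted. A rough sketch, assuming the driver empties are currently selected and the animation starts at frame 0; the scale factor is arbitrary.

```python
import bpy

factor = 2.0  # 2.0 = half speed, 0.5 = double speed

for obj in bpy.context.selected_objects:
    action = obj.animation_data.action if obj.animation_data else None
    if action is None:
        continue
    for fcurve in action.fcurves:
        for kp in fcurve.keyframe_points:
            kp.co.x *= factor           # move the keyframe in time
            kp.handle_left.x *= factor
            kp.handle_right.x *= factor
        fcurve.update()
```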

Smooth results

  1. Select the drivers in the collection you want to smooth, or select all.
    Right click collection > Select Objects
  2. Navigate to or open the graph editor and make sure the curves of the objects are selected.
    Graph editor > 'A'
  3. If you used a key step while recording, resample the curves.
    Graph editor > Key > Sample Keyframes
  4. Finally, smooth the animation. You may repeat this step until you reach the desired result, or use the scripted sketch below.
    Graph editor > Key > Smooth Keys
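
As a scripted alternative, a simple moving average over the keyframe values does a comparable job. A sketch assuming the driver empties are currently selected; repeat the run for stronger smoothing.

```python
import bpy

for obj in bpy.context.selected_objects:
    action = obj.animation_data.action if obj.animation_data else None
    if action is None:
        continue
    for fcurve in action.fcurves:
        points = fcurve.keyframe_points
        values = [kp.co.y for kp in points]
        # Average each keyframe with its two neighbours
        for i in range(1, len(points) - 1):
            points[i].co.y = (values[i - 1] + values[i] + values[i + 1]) / 3.0
        fcurve.update()
```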

Manipulation options

Opts   Meaning           Location
m      manual            pose mode
c      constraint        bone constraint
p      custom property   bone custom property

Data Assignment

Rigify Pose Bone        Constraint type   Driver Source       Opts
torso                   copy rotation     hip_center          m, c
chest                   copy rotation     shoulder_center     m, c
hand_ik.R               child of          left_hand_ik        m, c
hand_ik.L               child of          right_hand_ik       m, c
upper_arm_ik_target.L   limit distance    left_forearm_ik     m, c
upper_arm_ik_target.R   limit distance    right_forearm_ik    m, c
foot_ik.R               child of          left_foot_ik        m, c
foot_ik.L               child of          right_foot_ik       m, c
thigh_ik_target.L       limit distance    right_shin_ik       m, c
thigh_ik_target.R       limit distance    left_shin_ik        m, c

Hand Driver Source   Constraint type   Rigify Hand Bone   Opts
wrist                copy rotation     hand_ik            m, c
thumb_cmc            copy rotation     thumb.01           m, c, p
thumb_mcp            copy rotation     thumb.02           m, c, p
thumb_ip             copy rotation     thumb.03           m, c, p
thumb_tip            copy rotation     thumb.01           m, c, p
index_finger_mcp     copy rotation     f_index.01         m, c, p
index_finger_pip     copy rotation     f_index.02         m, c, p
index_finger_dip     copy rotation     f_index.03         m, c, p
index_finger_tip     copy rotation     f_index.01         m, c, p
middle_finger_mcp    copy rotation     f_middle.01        m, c, p
middle_finger_pip    copy rotation     f_middle.02        m, c, p
middle_finger_dip    copy rotation     f_middle.03        m, c, p
middle_finger_tip    copy rotation     f_middle.01        m, c, p
ring_finger_mcp      copy rotation     f_ring.01          m, c, p
ring_finger_pip      copy rotation     f_ring.02          m, c, p
ring_finger_dip      copy rotation     f_ring.03          m, c, p
ring_finger_tip      copy rotation     f_ring.01          m, c, p
pinky_mcp            copy rotation     f_pinky.01         m, c, p
pinky_pip            copy rotation     f_pinky.02         m, c, p
pinky_dip            copy rotation     f_pinky.03         m, c, p
pinky_tip            copy rotation     f_pinky.01         m, c, p

Face Driver Source   Constraint type   Rigify Face Bone   Opts
head                 copy rotation     head               m, c
chin                 copy rotation     jaw_master         m, c
right_eye_t          copy location     lid.T.R.002        m, c, p
right_eye_b          copy location     lid.B.R.002        m, c, p
left_eye_t           copy location     lid.T.L.002        m, c, p
left_eye_b           copy location     lid.B.L.002        m, c, p
mouth_t              copy location     lip.T              m, c, p
mouth_b              copy location     lip.B              m, c, p
mouth_l              copy location     lips.R             m, c, p
mouth_r              copy location     lips.L             m, c, p
eyebrow_in_l         copy location     brow.T.L.001       m, c, p
eyebrow_out_l        copy location     brow.T.L.003       m, c, p
eyebrow_in_r         copy location     brow.T.R.001       m, c, p
eyebrow_out_r        copy location     brow.T.R.003       m, c, p
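
For reference, one row of the pose table corresponds to a bone constraint like the following (an illustrative sketch, not the add-on's code; the rig object name "rig" is an assumption, and the driver empty may carry a different name in your scene).

```python
import bpy

rig = bpy.data.objects["rig"]            # generated rigify rig (assumed name)
driver = bpy.data.objects["hip_center"]  # driver empty from the table above

# torso | copy rotation | hip_center
constraint = rig.pose.bones["torso"].constraints.new("COPY_ROTATION")
constraint.target = driver
```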

License

This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with this program.  If not, see <http://www.gnu.org/licenses/>.

Copyright (C) cgtinker, cgtinker.com, hello@cgtinker.com



For tutorials regarding my tools, check out my YouTube channel. If you want to support the development, you can donate on Gumroad or become a patron on Patreon.


Would be lovely, thanks!
