Fixed face poser not posing the last flex on some models #1146

Merged: 1 commit, Mar 30, 2016


@NO-LOAFING
Contributor

The Face Poser's think function subtracts 1 from the flex count for no reason, making it skip the last flex when applying the values from the convars. Usually this doesn't matter, since on most models the last flex is an internal one, such as a gesture or eye-movement flex, that the tool would ignore anyway. On a few models, though, like the TF2 Engineer (both standard and HWM), the last flex is meaningful, but its slider in the cpanel does nothing because the think function never applies its value to the model.
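The off-by-one can be sketched as follows. This is a minimal illustration in Python with hypothetical names, not the actual tool code (the Face Poser is written in Lua and uses 0-based flex IDs, 0 through flex count minus 1):

```python
# Hypothetical sketch of the bug: flex IDs run from 0 to flex_count - 1,
# so a loop over range(flex_count - 1) silently drops the last flex.

def apply_flex_weights_buggy(flex_count, weights):
    """Buggy loop: subtracting 1 from the count skips the last flex ID."""
    applied = {}
    for flex_id in range(flex_count - 1):  # never reaches flex_count - 1
        applied[flex_id] = weights.get(flex_id, 0.0)
    return applied

def apply_flex_weights_fixed(flex_count, weights):
    """Fixed loop: every flex ID from 0 through flex_count - 1 is applied."""
    applied = {}
    for flex_id in range(flex_count):
        applied[flex_id] = weights.get(flex_id, 0.0)
    return applied

# Example: a model with 3 flexes; flex 2 is the one the buggy loop drops.
weights = {0: 0.5, 1: 1.0, 2: 0.25}
print(sorted(apply_flex_weights_buggy(3, weights)))  # [0, 1]
print(sorted(apply_flex_weights_fixed(3, weights)))  # [0, 1, 2]
```

On most models the dropped ID belongs to an internal flex, so the bug goes unnoticed; on models like the TF2 Engineer it is a real slider-controlled flex.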

@NO-LOAFING NO-LOAFING changed the title from Fixed face poser skipping the last flex on some models to Fixed face poser not posing the last flex on some models on Mar 25, 2016
@robotboy655 robotboy655 merged commit 2527119 into garrynewman:master Mar 30, 2016
@NO-LOAFING NO-LOAFING deleted the NO-LOAFING:patch-2 branch Mar 31, 2016