
Not able to produce the animation on adding new metahuman #6

Closed
OnlinePage opened this issue Apr 7, 2022 · 17 comments

OnlinePage commented Apr 7, 2022

Hello, first of all, thanks for making Polly integration this easy. I am able to run the provided sample successfully. Now I am trying to add another MetaHuman as well, following the steps in the developer's guide: I can import the MetaHuman and add all the speech blueprints with the sequence, but I am stuck on the next steps.

I have some questions:
1. After adding the Speech_AnimBP to the Face component, the facial animation does not play, although the audio plays back successfully.
2. Similarly, on the Body component I can't assign the Bodyidle animation; even dragging and dropping the Bodyidle animation into the Anim Class field doesn't work.

I observed that if the face mesh of the new MetaHuman is changed back to ada_Facemesh with the Speech_AnimBP, it works fine. So is the animation only set up for ada_FaceMesh (Ada)?

I tried to change the face mesh directly by editing the Speech_AnimBP, but was unable to, as it doesn't show any face mesh other than Ada's.

Please advise. @cwalkere

@Legumtechnica

Were you able to find a solution?

Also, my editor crashes as soon as I open BP_Ada. I tried adding the "Speech" component to another MetaHuman, but it crashes again.

Can you help?

@cuijiaxu

I'm running into the same issue. It looks like the visemes and Bodyidle for the new MetaHuman still need to be set up.

cwalkere (Contributor) commented Jun 22, 2022

Edit: Are you using UE4 or UE5? This project was created with UE4 and it looks like UE5 metahumans are not compatible with UE4 metahumans.

@yogeshchandrasekharuni

Running into the same issue. Did you happen to find a fix?

Krxtopher (Contributor) commented Jul 6, 2022

We provide step-by-step instructions for adding speech capability to any MetaHuman character. You'll find those instructions in the "Adding New Metahumans" section of our Developer Guide. If those instructions don't work for you, please let us know where they fail so we can improve them.

Please report back to us on whether this addresses your original issue. Thanks.

@Krxtopher (Contributor)

@OnlinePage @cuijiaxu and @ishu07, today I was able to get a custom MetaHuman working, but I did have to make a few changes to the structure of the Content folder to do so. Also, there's a small (but important) flaw in one of our documentation images showing how to set up your Blueprint logic. I'll see if I can submit a PR that addresses both of these issues over the next few days.

@yogeshchandrasekharuni

@Krxtopher, in case you're unable to submit a PR, could you please share the steps you took to get it working, so that one of us can push a fix? Thanks!

@Legumtechnica

I was able to compile and run everything with a custom MetaHuman and do everything this code is intended for. I haven't had issues; it just takes some fiddling.

Anyway, just one question: how do I add visemes for another language like Hindi?

cwalkere (Contributor) commented Jul 7, 2022

IIRC, the viseme animations were hand-made by an in-house animator. The reference/sample animations can be found in the Content/AmazonPollyMetaHuman/Animation/Visemes folder. You'd have to try out Amazon Polly for Hindi, see what it returns, and create any missing viseme animations yourself. Or maybe you can find some on the internet.

Krxtopher (Contributor) commented Jul 7, 2022

@ishu07 visemes are actually language agnostic. So supporting new languages doesn't require new visemes. A viseme is the shape the mouth, lips, tongue, and jaw make when a human makes a particular vocal sound. It doesn't matter whether that sound is being used to produce a word in English, Hindi, or any other language.
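
If you want to see exactly which visemes Polly returns for a Hindi phrase, here is a rough sketch that requests viseme speech marks using the AWS SDK for C++ directly (the sample wraps Polly in its own client class, so this is illustrative only, and it assumes your AWS credentials are already configured). You can compare the names it prints against the animations in Content/AmazonPollyMetaHuman/Animation/Visemes.

```cpp
#include <aws/core/Aws.h>
#include <aws/polly/PollyClient.h>
#include <aws/polly/model/SynthesizeSpeechRequest.h>
#include <iostream>

int main()
{
    Aws::SDKOptions Options;
    Aws::InitAPI(Options);
    {
        Aws::Polly::PollyClient Client;

        Aws::Polly::Model::SynthesizeSpeechRequest Request;
        Request.SetText("नमस्ते, आप कैसे हैं?");
        Request.SetVoiceId(Aws::Polly::Model::VoiceId::Aditi);
        // Ask for viseme speech marks (JSON lines) instead of audio.
        Request.SetOutputFormat(Aws::Polly::Model::OutputFormat::json);
        Request.AddSpeechMarkTypes(Aws::Polly::Model::SpeechMarkType::viseme);

        auto Outcome = Client.SynthesizeSpeech(Request);
        if (Outcome.IsSuccess())
        {
            // Each JSON line names a viseme and its timestamp.
            std::cout << Outcome.GetResult().GetAudioStream().rdbuf() << std::endl;
        }
        else
        {
            std::cerr << Outcome.GetError().GetMessage() << std::endl;
        }
    }
    Aws::ShutdownAPI(Options);
    return 0;
}
```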

If you need more help, LMK. I'd love to help you get this working with other languages if you run into issues.

@Krxtopher (Contributor)

Regarding how I got custom hosts working, here are (roughly) the steps I went through, from memory:

1. In the Content Browser, move the "Animation" and "Common" folders from "AmazonPollyMetaHuman" into a folder called "MetaHumans".

2. Import your custom MetaHuman into the project using Quixel Bridge.

3. An error will pop up saying that you must update some files manually (using Windows Explorer) by copying them from a temporary location it points you to into your Content folder. Do this.

4. Then I think I had to re-import the custom MetaHuman one more time. This time you won't get the file conflict message.

After you've done the above, you should be able to follow the regular step-by-step instructions in our Developer Guide. However, the image showing the Blueprint logic is missing an important piece. You must be sure to feed the "Return Value" from the "Start Speech" node into the "Sound" input of the "Play Sound 2D" node, as shown in this updated image:

[Image: "BP fix" — updated Blueprint wiring]
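
In plain C++ terms, that wiring is roughly equivalent to the sketch below. This is illustrative only: the StartSpeech call and its signature are guesses based on the Blueprint node names above, so check the sample's SpeechComponent header for the real API; PlaySound2D is the standard UGameplayStatics function.

```cpp
#include "Kismet/GameplayStatics.h"

void AMyMetaHumanActor::Speak(const FString& Text)
{
    // Hypothetical call matching the "Start Speech" Blueprint node; assumed
    // here to return the synthesized audio as a USoundBase*.
    USoundBase* SpeechAudio = SpeechComponent->StartSpeech(Text);

    // The connection the original documentation image omitted: the node's
    // Return Value must be fed into the Sound input of Play Sound 2D.
    UGameplayStatics::PlaySound2D(this, SpeechAudio);
}
```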

@Krxtopher (Contributor)

I've submitted a pull request that should help users who've had problems incorporating new MetaHumans into this sample project. You'll find the PR here: PR #15

@Krxtopher (Contributor)

@ishu07 I have a follow-up on my language/viseme comment above. I was looking through our source code and realized that, while visemes shouldn't block you from having a host speak a different language, there are two blockers that will get in your way: 1) we currently hard-code which voices are available, and we've only included the English-speaking voices; 2) we are not passing an explicit language ID when asking Polly to generate speech, so Polly will always use the default language ID, which is "en-US". Some code changes will therefore be needed to get other languages working.
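
For the second blocker, the change would be roughly along these lines; this sketch targets the AWS SDK for C++ directly rather than the sample's own Polly wrapper, so the real call site will look different, and the helper function name here is purely illustrative.

```cpp
#include <aws/polly/model/SynthesizeSpeechRequest.h>

// Illustrative helper: build a Polly request for Hindi speech.
Aws::Polly::Model::SynthesizeSpeechRequest BuildHindiRequest(const Aws::String& Text)
{
    Aws::Polly::Model::SynthesizeSpeechRequest Request;
    Request.SetText(Text);
    Request.SetOutputFormat(Aws::Polly::Model::OutputFormat::pcm);
    // Use a voice that supports the target language (Aditi supports Hindi)...
    Request.SetVoiceId(Aws::Polly::Model::VoiceId::Aditi);
    // ...and pass the language code explicitly instead of relying on Polly's
    // default, which is what the project currently does.
    Request.SetLanguageCode(Aws::Polly::Model::LanguageCode::hi_IN);
    return Request;
}
```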

If you end up making those code changes yourself, please consider sharing back via a pull request. Otherwise, we'll consider adding this capability in the future. I'll create a new feature request issue so we can track it.

@Legumtechnica

Alright, thanks.

@yogeshchandrasekharuni

Awesome! I can confirm that now I am able to add a custom MetaHuman and that it works as expected and documented. Thank you!

@cuijiaxu

@Krxtopher Got it, thanks.

@Krxtopher (Contributor)

I believe everyone has confirmed that the comments and changes above addressed this issue. Closing.
