Thanks for the compliments, Raven & Stuck... Funny, on another thread I was asking for advice on how to translate CrazyTalk phonemes smoothly into iClone, and I still find myself trying to improve on it. For some reason, no matter how clean I get a CrazyTalk Script File to look, it doesn't look as good when I apply it to my character in iClone (usually the phonemes don't seem to "hit" as hard, or the deformation in the jaw is too strong). You definitely have to play with the Expression Strength slider in iClone and the Expressiveness slider in CrazyTalk. Although I haven't spent much time on the lip-sync in my previous videos, I tried a little harder on the "Injured Bad" video since it was shorter. Even then, the "Ee" and "Oh" phonemes seemed to come out sloppy in iClone, so I had to change them to "Ah's" just so they could be read better in the final render. With that being said, my basic lip-sync workflow has been:
1. Clean up all unnecessary sounds in CrazyTalk by deleting phonemes that aren't actually heard and reducing the number of "NONE" phonemes that show up. Overall, I think the software does a good job of recognizing the imported audio, but I try to keep the timeline as clean and basic as possible. When there's a very long sentence with words that run together, I delete a lot of the "NONE's" since they can be replaced with just the consonant sounds.
2. Second, I go through all the vowel sounds and make sure CrazyTalk analyzed them correctly. If an "Ih" was used where an "Ee" should be, it can make a big difference in how the lip-sync looks. Also, play back your audio sentence by sentence and listen to how your voiceover actor emphasizes certain words... bump up the Expressiveness slider on those words and reduce it on the less emphasized ones.
3. Next, once I think the vowels are correct, I do the same thing with the consonants. Especially with "B," "M," and "P" words... they make the lips curl in hard, and "W" words bring the lips together like a kiss. Emphasizing these phonemes (plus your "Z" sounds) is really noticeable on characters and can make your facial animations read better. I try to exaggerate these vowels and consonants hard to block in the whole facial animation, then scale back where needed.
4. Then, I make sure all the phonemes and the "NONE's" end where they are supposed to. I notice CrazyTalk likes to stretch out the words when the character should have stopped talking a few frames or even a few seconds ago. Some may not agree with this, but getting the jaw to shut when the character isn't talking, or making the mouth start moving a few frames before the audio, looks better to me.
5. Last, I drop the Facial Expressions onto the Face timeline and choose them carefully. Increasing or decreasing the Expressiveness on your character's emotions will make the character look like he/she is thinking more, which gives it a more human element. I believe a lot of expression comes through the eyes... if you can see what the character is thinking in their eyes, then you can probably get away with some offbeat lip-syncs.
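For anyone who likes to think of the phoneme timeline as plain data, steps 1 and 4 above boil down to filtering and clamping a list of timed entries. This is only an illustrative sketch in Python — CrazyTalk 5 has no scripting API that I know of, and the (phoneme, start_frame, end_frame) layout here is invented for the example, not CrazyTalk's actual file format:

```python
# Hypothetical phoneme track as (phoneme, start_frame, end_frame) tuples.
# This layout is invented for illustration; it is not a real CrazyTalk structure.

def clean_track(track, max_none_frames=2):
    """Step 1: drop short 'NONE' gaps between run-on words so the
    neighboring consonants carry through; keep long real silences."""
    return [
        (ph, s, e) for ph, s, e in track
        if not (ph == "NONE" and (e - s) <= max_none_frames)
    ]

def clamp_to_audio(track, audio_end_frame, lead_in=2):
    """Step 4: cut phonemes that stretch past the end of the spoken audio,
    and start the mouth moving a couple of frames before the first sound."""
    clamped = []
    for ph, s, e in track:
        if s >= audio_end_frame:
            continue  # entirely past the audio: the jaw should be shut
        clamped.append((ph, s, min(e, audio_end_frame)))
    if clamped:  # anticipate the first sound slightly
        ph, s, e = clamped[0]
        clamped[0] = (ph, max(0, s - lead_in), e)
    return clamped

track = [
    ("M", 10, 14), ("NONE", 14, 15), ("Ah", 15, 20),
    ("Oh", 20, 45), ("NONE", 45, 60),
]
print(clamp_to_audio(clean_track(track), audio_end_frame=24))
# The 1-frame NONE is gone, the trailing silence is cut, and "Oh" stops
# at frame 24 instead of being stretched out to frame 45.
```

Again, this is just a way to picture what the manual cleanup is doing on the timeline; in practice it's all done by hand in the CrazyTalk editor.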
Then, after all that is said and done, you import it into iClone and start all over again because it still won't look right! LOL... These are definitely not the only guidelines for lip-syncing, because I'm sure there are better ways to approach it. I hope other artists can share their insight too, because I feel I have a lot to learn with the software. I realize facial animation can be time-consuming and tedious, but like anything, you have to pick and choose what you want to spend more time on in your final render. I hope some of this helps, and I hope you don't mind that it was long-winded!
BTW, I'm using CrazyTalk 5, so I'm not sure how much version 6 has improved its scripting when importing to iClone.
-Oliver