Facial Performance Studio - Face Puppet 2.0

Posted By will2power71 9 Years Ago
CaseClosed
Posted 9 Years Ago
Distinguished Member (3.4K reputation)

Group: Forum Members
Last Active: 4 Years Ago
Posts: 344, Visits: 781
We need markerless facial mocap software from Reallusion that can capture from smartphone video. It's entirely feasible, and it's powerful enough to take iClone production to the level where, if you post a video on YouTube, people will be clamoring to see how you did it.

Currently, people who see iClone videos are impressed until they see the facial animation and speech. Hopefully the Reallusion crew is working on this missing ingredient: facial mocap.
will2power71
Posted 9 Years Ago
Distinguished Member (9.6K reputation)

Group: Forum Members
Last Active: 2 Months Ago
Posts: 389, Visits: 2.8K
I know the point you're getting at. Most of my reservations about iClone have to do with being cut off from controlling the models you work with in the way that suits you. It's as if one person likes the puppet controller and assumes that EVERYONE likes the puppet controller. I looked at ways to accomplish what I was after in iClone, and when I happened upon AML scripting (which seems to have very little available documentation), I realized you could use that component to do exactly this, if they allowed it to be utilized with a tool like the one described here. I absolutely abhor the puppet controller, because I have to do take after take to get an expression I want when I know the expression is there. Real pose-to-pose facial animation is the start. Adding curves would make it ROCK!

What I'm hoping for in iClone 7 is that they will acknowledge the shortcomings of the animation tools and give users other ways to accomplish what they want. The key thing iClone is missing, to me, is fine control. The puppet tool is great for adding variation but terrible at fine control, and we need tools for fine control. About the only other thing I would want (and at that point I'd probably stop using all my other tools) is the HumanIK controller in the viewport rather than the puppet controller window. I'm hoping they will move away from those tools, or at least give us the option to choose another style of motion control in our animations.
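To put "pose-to-pose plus curves" in concrete terms, here is a minimal sketch of the kind of control I mean. All the names are hypothetical; this is not iClone's or AML's actual API, just the underlying math of blending between facial pose keys along an easing curve:

from dataclasses import dataclass
from typing import Callable, Dict, List

def smoothstep(u: float) -> float:
    # classic ease-in/ease-out; swapping this per key is the "curves" part
    return u * u * (3.0 - 2.0 * u)

@dataclass
class PoseKey:
    time: float                                  # seconds on the face track
    weights: Dict[str, float]                    # expression/morph name -> 0..1
    ease: Callable[[float], float] = smoothstep  # curve into this key

def evaluate(keys: List[PoseKey], t: float) -> Dict[str, float]:
    """Pose-to-pose: blend between the two keys that bracket time t."""
    keys = sorted(keys, key=lambda k: k.time)
    if t <= keys[0].time:
        return dict(keys[0].weights)
    if t >= keys[-1].time:
        return dict(keys[-1].weights)
    for a, b in zip(keys, keys[1:]):
        if a.time <= t <= b.time:
            u = b.ease((t - a.time) / (b.time - a.time))
            names = set(a.weights) | set(b.weights)
            return {n: (1 - u) * a.weights.get(n, 0.0) + u * b.weights.get(n, 0.0)
                    for n in names}

# e.g. ease into a smile over 1.5 seconds, no retakes needed:
keys = [PoseKey(0.0, {"smile": 0.0}),
        PoseKey(1.5, {"smile": 0.8, "brow_raise": 0.3})]
print(evaluate(keys, 0.75))   # halfway: smoothstep gives ~0.4 smile

Swap the ease function on a key and the same two poses get hit hard or drifted into, which is exactly the fine control the puppet tool can't give you.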
will2power71
Posted 9 Years Ago
Distinguished Member (9.6K reputation)

Group: Forum Members
Last Active: 2 Months Ago
Posts: 389, Visits: 2.8K
I was working with iClone last night and became visibly frustrated with facial performances. Most of this is because of the puppet tool. Moving the mouse around for big movements, like turning the head or looking up and down, is fine, but finer, more subtle movements seem unnecessarily difficult. What occurred to me is that you can't keyframe the facial performance in any precise way. Then it occurred to me that you have all of these expressions, but no way to precisely mix them during a performance. As I was looking over Persona and the Perform menu option, it dawned on me that we should have a specialized Persona Studio for the face: something where you can see the face in perspective or orthographic view. When you select the iAvatar, a list of their facial expressions appears to the left or right, with the timeline for the face below. Before you start the performance, you should be able to scroll through the list of expressions, check the ones you're going to use, and add them to a performance tab, so you only scroll through what you need and not the whole list.

Then you start the vocal track, and instead of clicking and dragging with the mouse, you click on each item through the take. You can set when the character blinks just by clicking the blink button, which adds a keyframe for the action. Here's where it gets interesting: once you've gone through the blink track, you can click on a keyframe and add any other facial change at that time, which adds an entry you can offset, plus a strength slider for the action so you can make it subtle or exaggerated. If you like the combination of dials, you can save it as a quick expression stored with each actor. You can also stretch or squash the time the action takes, so you can have a slow blink or a fast blink, as you prefer. Or, if you want the character to smile from one point all the way through the dialogue, you can set the expression to hold through the performance and just adjust it subtly by adding facial tics. If the character is bone-based, you would have visual locators for the face bones so you could do the same thing there. The idea is not to re-invent anything, but to give you access to what's already there in a more precise fashion.
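As a rough illustration of what those per-expression tracks could look like under the hood (purely a sketch with made-up names, not anything Reallusion ships): each click drops a key carrying the strength-slider value and a stretchable duration, and the track evaluates to a weight at any point in the take:

from dataclasses import dataclass, field
from typing import List

@dataclass
class ExpressionKey:
    time: float       # where the click landed on the timeline (seconds)
    strength: float   # the strength slider: subtle (low) to exaggerated (high)
    duration: float   # stretch/squash: total time the action takes

@dataclass
class ExpressionTrack:
    name: str                                    # e.g. "blink", from the avatar's preset list
    keys: List[ExpressionKey] = field(default_factory=list)

    def add_key(self, time: float, strength: float = 1.0,
                duration: float = 0.2) -> None:
        self.keys.append(ExpressionKey(time, strength, duration))

    def weight_at(self, t: float) -> float:
        # triangle envelope: ramp up to 'strength' and back down over 'duration'
        w = 0.0
        for k in self.keys:
            half = k.duration / 2.0
            if half > 0 and k.time - half <= t <= k.time + half:
                w = max(w, (1.0 - abs(t - k.time) / half) * k.strength)
        return w

blink = ExpressionTrack("blink")
blink.add_key(1.0, duration=0.4)    # slow blink
blink.add_key(3.2, duration=0.12)   # fast blink
print(blink.weight_at(1.0))         # 1.0 at the key itself

A smile held all the way through the dialogue would simply be a key with a very long duration, with the facial tics keyed on top of it.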

The idea is to use what's already available to the character. What makes it even more powerful is that if the character already has custom facial expressions, you don't have to do anything but dial them in during the performance. So if you brought in a custom morph, say vampire fangs, and added it to the character's custom facial expressions, you could now keyframe it and have it dial in slowly or quickly (see the sketch after the mockup below). I think it's a better way to get a character performance, because it's more precise than doing take after take with the puppeteer. I've done a mockup of what I have in mind.

https://forum.reallusion.com/uploads/images/0313bfe4-e77f-407a-9c59-f568.jpg
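Reusing the hypothetical ExpressionTrack from the sketch above, the vampire-fangs example would just be another track, dialed in slowly the first time and flashed quickly later in the take:

fangs = ExpressionTrack("vampire_fangs")         # custom morph already on the character
fangs.add_key(4.0, strength=1.0, duration=6.0)   # slow reveal centered on 4s
fangs.add_key(9.0, strength=1.0, duration=0.5)   # quick flash later in the take
for t in (1.0, 4.0, 9.0):
    print(t, fangs.weight_at(t))                 # 0.0, then 1.0, then 1.0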

Now keep in mind, I envision this as a tool for imported characters, not a replacement for CrazyTalk. The idea is that anything in the facial presets of whatever character you're working with should be available when you launch the performance studio. Whether the character has bones or morphs doesn't matter; whatever is available to that character should be listed when you launch the Facial Performance Studio. This is a way to precisely tweak your character performances using what you already have, combining it in new ways while keeping more control. In addition, I think this would be a great place to hook in facial motion capture from companies like Face Shift, Faceware, Brekel Kinect, or Mycap Studio.

I wouldn't even mind seeing something like this as an add-on, like Mocap Kinect or the Physics Toolbox.


