
New video about my new iClone plug-in: Import facial animation (coming soon)

Posted By mrtobycook 2 Years Ago
Rated 5 stars based on 1 vote.
mtakerkart (Distinguished Member, 16.8K reputation)
Posted 2 Years Ago
mrtobycook (7/6/2023)
Yes exactly! I have a large spreadsheet of all the values, which has taken a long time to compare and double-check. Some are 1:1 matches, and some are more complex, for example where the MetaHuman version splits the mouth into four quadrants instead of two. There's a lot of complexity to it, but I have a plan written out (in that spreadsheet), and I'm just taking the coding step by step.



So if I understand correctly, the ARKit system with LiveFace offers less control (fewer control points) than MetaHuman Animator?
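For anyone wondering what that "four quadrants instead of two" split looks like in practice, here is a minimal Python sketch of the kind of remapping described in the quoted spreadsheet plan. All control and blendshape names below are hypothetical placeholders, not the plugin's actual tables: 1:1 channels copy straight across, while MetaHuman's four mouth-quadrant controls collapse into CC4's two half-mouth shapes by averaging.

def remap_frame(mh_values):
    """Map one frame of MetaHuman control weights to CC4-style blendshape weights.
    mh_values: dict of control name -> weight in 0..1 (names are hypothetical)."""
    cc4 = {}
    # direct 1:1 match (hypothetical names on both sides)
    cc4["Jaw_Open"] = mh_values.get("CTRL_expressions_jawOpen", 0.0)
    # four MH mouth quadrants -> two CC4 halves: average upper + lower per side
    for side in ("L", "R"):
        upper = mh_values.get(f"CTRL_expressions_mouthUpperSmile{side}", 0.0)
        lower = mh_values.get(f"CTRL_expressions_mouthLowerSmile{side}", 0.0)
        cc4[f"Mouth_Smile_{side}"] = (upper + lower) / 2.0
    return cc4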
Yo Dojo (Senior Member, 267 reputation)
Posted 2 Years Ago
I'm assuming MetaHuman Animator facial mocap is superior to iClone's? Is that why you guys are working on this workflow? I just bought iClone, Motion LIVE/LiveFace, and the wrinkles system for a client project. AccuLips is a pretty cool base to do facial mocap on top of, but it still needs more smoothing and complexity, because it can look really robotic. The project I'm working on has a lot of fast talking.

I guess a drawback of MetaHuman would be that you aren't supposed to render them outside of Unreal. But I've had trouble getting iClone animations and wrinkles to transfer to a third-party program (Cinema 4D), so I would say iClone isn't much better than MetaHuman as far as compatibility with other software.

Thanks for any insights; I'm super new to iClone and facial mocap.
mtakerkart (Distinguished Member, 16.8K reputation)
Posted 2 Years Ago
Yo Dojo (7/6/2023)
I'm assuming MetaHuman Animator facial mocap is superior to iClone's? Is that why you guys are working on this workflow? I just bought iClone, Motion LIVE/LiveFace, and the wrinkles system for a client project. AccuLips is a pretty cool base to do facial mocap on top of, but it still needs more smoothing and complexity, because it can look really robotic. The project I'm working on has a lot of fast talking.

I guess a drawback of MetaHuman would be that you aren't supposed to render them outside of Unreal. But I've had trouble getting iClone animations and wrinkles to transfer to a third-party program (Cinema 4D), so I would say iClone isn't much better than MetaHuman as far as compatibility with other software.

Thanks for any insights; I'm super new to iClone and facial mocap.


AccuLips is only for accurate English lip-sync; it doesn't generate emotional expressions. You have to create those yourself (via face puppeteering or facial mocap).

Garry_Seven_of_one (Distinguished Member, 1.7K reputation)
Posted 2 Years Ago
mrtobycook (7/6/2023)
Garry_Seven_of_one (7/6/2023)
Actually, I signed up to the beta shortly after posting my comment and watched the video, and it's an intriguing idea. So am I right in thinking that you are looking at the data from the exported MHA animation clip and then transposing it into extended CC4 blendshapes? That's the only way I can think you would be able to do it, and even then it must be a very difficult task, given all of the MH control points and the number of CC4 blendshapes. Respect and kudos to you if you can pull it off.


Yes exactly! I have a large spreadsheet of all the values, which has taken a long time to compare and double-check. Some are 1:1 matches, and some are more complex, for example where the MetaHuman version splits the mouth into four quadrants instead of two. There's a lot of complexity to it, but I have a plan written out (in that spreadsheet), and I'm just taking the coding step by step.

In terms of the MH pipeline, I have had it working since the kit was originally released, but I find it overly complex.

Yes, I completely agree. I've been using my 'frankenmetahuman' process [it's a video I made with a man and a woman boxing], and that has made it easier: a single body and head COMBINED in iClone, for MetaHumans. It means that characters can touch perfectly and everything matches absolutely in Unreal at the end. Before I developed that technique, it was all just impossible for me.



I have a full-length tutorial for that which is almost done, but I didn't get much uptake on that video haha, so I think I'll wait and see if people start showing interest in knowing more about it.

One thing that was bothering me about that pipeline is that the facial animation on the MH with MHA is now, in my opinion, far superior to anything else, so I'm at a crossroads about whether to carry on with this pipeline.


That's so interesting. I'm definitely not at all familiar with the particular challenges of stylized characters when it comes to MetaHumans! Yes, the post-process blueprints - in fact ALL the blueprints when it comes to MetaHumans - are incredibly complicated. Very difficult if you're trying to do non-humans, I would think, because Epic simply hasn't officially been experimenting with that and problem-solving the issues. Difficult!

Secondly, I find the process of body and head animations for the MH difficult and time-consuming compared with animating a single CC4 skeleton.


Well exactly. That's why I use the "frankenmetahuman" thing, but of course there's no such thing as MetaHuman dummies for non-human characters yet, so maybe that wouldn't work as well for you as it has for me :-)

If you can get this plug-in to produce something near that MHA quality, then for me that is the magic bullet, the missing link that streamlines the pipeline significantly.


I'm not sure there is any perfect solution - it's hard, because even once MetaHuman facial animation is inside iClone, it's still going to be difficult to wrangle. But yes, it could be good! As long as - and I know I keep saying this on this thread - you are committed to rendering in Unreal (otherwise Epic will get very annoyed). :-)

I'm hoping that, at some point soon, there will be a true official Reallusion workflow for (1) a single MetaHuman body and face inside of iClone (even if it's CC4 EXTENDED blendshapes on the face, as long as the body is 1:1 the same as the 18 MetaHuman body formats), and (2) facial animation import. I know this plug-in will do it, but I really hope that Reallusion themselves come to the party and throw some developers at it. :-)





Thanks for that. Just to clarify, when I say stylized, I mean humanoids like the ones below, which I'm working on with Wrap and HS2; these are models from the talented sculptor Matker Malitsky.



Garry, Clearstream, Seven_of_one
https://forum.reallusion.com/uploads/images/f8c0246a-f887-417d-9f3c-b22c.png
My Pinterests
mrtobycook (Distinguished Member, 2.8K reputation)
Posted 2 Years Ago
@Garry Oh ok, interesting. I’d love to know more about your process, do you have a blog or YouTube I can check out? Sorry if I’ve already asked that or have already seen them. I’m a very visual person and sometimes I forget who is who on this darn forum, because there aren’t faces :)



@Yo Dojo Well, MetaHuman Animator isn't really "superior" to iClone - it's just another tool to add. The whole MHA process is very cumbersome. iClone's Motion LIVE and AccuLips system is amazing because it's so fast, and AccuLips gives fantastic results for American-accented English speakers. If it's looking robotic, I'd definitely suggest adjusting the lip settings on the clip - I find the 'fast talking' preset works great, or others, depending on the speaker. Also, iClone 8 characters - with the CC4 EXTENDED facial profile - have amazing expressions, just as good as MetaHuman in my opinion. So there are a lot of advantages in staying with iClone/CC.

And yes, Epic is very strict about MetaHuman Animator output being Unreal-rendered only, which is a massive drawback for many people, because it's much easier to get fast, awesome results directly out of iClone or with the USD export to Omniverse. MHA is just another tool, and it has drawbacks, yep!

Also keep in mind, lots of people have actually made great character animation shorts/films/TV with iClone/CC without Unreal. It's a proven, fantastic system. Anything with iClone + Unreal is really just an experiment at this stage, in terms of professional-level character animation. Just my opinion! :) So far, all finished professional-standard work with MetaHumans has used Maya. I'm a Maya guy, and it is incredibly time-consuming and you need a big team. iClone can be done by individuals, and it's awesome :)



@mtakerkart Yes, ARKit is inferior to MetaHuman Animator because of the number of control points, among other things. Here's a more detailed answer if it helps: it's also about how they work fundamentally. ARKit just looks at the iPhone's camera and depth data in order to drive 52 blendshapes. MHA uses a "4D solve", which means it uses the depth data from the iPhone/stereo HMC to recreate your entire facial mesh (i.e. it's like unlimited blendshapes). At the moment it then maps that down to the current MetaHuman Face ControlRig data (200+ controls). But in the future even the MetaHuman face rig will probably get better/bigger (400+), and the MetaHuman Animator process will still be effective.
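To make the "number of control points" part concrete: both systems ultimately reconstruct the face as a weighted sum of shape deltas, so the channel count determines how fine the expressive basis is. A toy Python/NumPy sketch with made-up numbers (not either vendor's code):

import numpy as np

def reconstruct(neutral, deltas, weights):
    """neutral: (V, 3) rest-pose vertices; deltas: (N, V, 3) per-shape offsets;
    weights: (N,) captured channel values. With N = 52 (ARKit) the result can
    only move within 52 directions; a 200+ channel rig spans a far finer space."""
    return neutral + np.tensordot(weights, deltas, axes=1)

# toy example: 4-vertex mesh, 2 shapes
neutral = np.zeros((4, 3))
deltas = np.random.rand(2, 4, 3) * 0.01
posed = reconstruct(neutral, deltas, np.array([0.5, 1.0]))
print(posed.shape)  # (4, 3)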



charly Rama (Distinguished Member, 8.1K reputation)
Posted 2 Years Ago
Very promising. Go ahead Toby, we trust in you :) and I like your female voice :D

mrtobycook (Distinguished Member, 2.8K reputation)
Posted 2 Years Ago
😂 AMAZING, charly!

I didn’t think that little test recording would ever be public hahah. I’ll have to record some better ones :)




mrtobycook (Distinguished Member, 2.8K reputation)
Posted 2 Years Ago
By the way, for anyone browsing this thread, just to clarify:

The video charly made was using a very early beta of my new plugin. The lips still aren’t correct - I have a list of many things to do.

And this is just ARKit, FYI. Charly made it using the little test recording from the "LiveLink Face" app - set to ARKit mode, NOT MetaHuman Animator - that I include as part of the test animations in the beta.

He has imported it into iClone using the plugin. So this is one of the first facial animation import examples EVER TO BE DONE with iClone. Congratulations charly!! :)

FYI everyone, you can download the free beta plugin now at https://virtualfilmer.com

The plugin, even when released, will be completely free and open source. :)
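For anyone poking at those test recordings themselves: the LiveLink Face app saves each ARKit-mode take as a CSV of per-frame curve weights. Here's a minimal reading sketch in Python; the layout (a Timecode column, a BlendshapeCount column, then one column per ARKit curve) is an assumption about the common export format, and this is not the plugin's actual import code.

import csv

def load_livelink_take(path):
    """Return a list of (timecode, {curve_name: weight}) frames."""
    frames = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            timecode = row.pop("Timecode", None)
            row.pop("BlendshapeCount", None)  # bookkeeping column, not a curve
            weights = {name: float(value) for name, value in row.items()}
            frames.append((timecode, weights))
    return frames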


rosuckmedia (Distinguished Member, 9.4K reputation)
Posted 2 Years Ago
Hi Charly, nice test 👍. I also did a test with Toby's templates. Hi Toby, which iPhone are you using?
I did my own test with my iPhone X, but unfortunately it didn't turn out very well. But it can only get better.
You can do it, Toby 👍. Greetings, Robert


mrtobycook (Distinguished Member, 2.8K reputation)
Posted 2 Years Ago
@Rosuckmedia - don't forget to make the "sound" subtrack visible for your character, then click-drag it 15 frames to the right!

Every LiveLink Face ARKit audio recording from the app is 15 frames out of sync, for some reason. But it's easily fixed :)

And I'm mostly using an iPhone 14 Pro Max. I have a few different phones, but that's my primary one.


