A Comparison Between Apple ARKit, NVIDIA Maxine, and Google MediaPipe Facial Capture Solutions


https://forum.reallusion.com/Topic550409.aspx

By sunnyviewtech - Last Year
Using the default settings of each application, without pre-calibration.
The software used:
Apple ARKit: LIVE Face
NVIDIA Maxine: Reallusion AccuFACE
Google MediaPipe: Dollars EGAO
By Nirwana - Last Year
Interesting comparison. My take:
- None of the three are really good with default settings; they all need quite a lot of tweaking and/or better calibration.
- To me, a blend of LiveFace and AccuFace would probably look best. The mouth on LiveFace sometimes looks exaggerated while AccuFace wrinkles the forehead a bit much.
- The Dollars EGAO version (never heard of that) looks the worst to me; the eyelids look far too sleepy and the facial motion too sedated considering the emotional delivery of the voice actress and her wide-open eyes.
- There are two things I found problematic with the test/comparison itself: (a) a male avatar for a female voice (b) that no head rotation was applied to the avatar. I assume this was done to more clearly compare the facial details of the capture, but to me it looks distracting (also if she tilts her head but still looks at the camera, can the facial capture reproduce that accurately?). So I'd like to also see a version with a female avatar and head rotation included.
By R Ham - Last Year
The iPhone works well enough, but the teensy screen is unfortunate.
I've never been able to get AccuFace to run properly. Much facial distortion, lips tied in a knot.
Dollars wasn't functioning at all last time I looked, I guess they fixed it.

All in all, the iPhone is my choice of the three. It's worth mentioning that the iPhone is not able to raise one eyebrow individually. I've brought it up more than once.
I suspect AccuFace has some host setup requirements that have not been documented. The marketing vids all look swell.

Facial mocap is a great idea but, in the past, it has lacked a professional implementation.
By AutoDidact - Last Year
 no head rotation was applied to the avatar. I assume this was done to more clearly compare the facial details of the capture, but to me it looks distracting (also if she tilts her head but still looks at the camera, can the facial capture reproduce that accurately?).



From my experience using facial capture and watching many UE5 MetaHuman/iClone videos (where various facial capture apps were used), it seems (to me at least) that when you try to capture head movements/rotations you always get an uncanny, very linear "theme park animatronic robot" look to the head movement.


I think this may be a limitation of the video-based capture solutions thus far, and it is the reason I would never pay for one.


We have a free video-based solution for Blender called "Face Landmarker".
The best scenario is when you have both a good facial animation control rig and/or a shape key animation panel to manually tweak the expressions based on the emotional tone of the actor's performance, and the ability to bypass head rotation recording and manually keyframe subtle head rotations on top of the facial capture.


Here are some of my recent tests using an AI-generated facial lip-sync as the video source, with the free Live Link CSV importer for Blender, on a Daz Genesis figure, which comes with the 52 ARKit facial blend shapes already built in.





And another using a video of a real person as the source
 
 


Neither is a "Thanos Endgame level" shot, obviously, and both need more refinement. :blush:
Still, for a FREE solution, I don't think they are too far off from the recent video-based solutions that cost hundreds of dollars, not including the mandatory modern NVIDIA RTX card (like AccuFace).
By Kelleytoons - Last Year
R Ham (4/3/2024)

I've never been able to get AccuFace to run properly. Much facial distortion, lips tied in a knot.
I suspect AccuFace has some host setup requirements that have not been documented. The marketing vids all look swell.


As I've asked more than one person: can you just post a sample video of what you are using that doesn't work for you?

I've had no trouble with AccuFace at all (and prefer it to LiveFace, not by a wide margin but perhaps 10-15% better, and, of course, it is MUCH easier to use with recorded video from other sources, although as I've shown you can do that with LiveFace as well). But I'm pretty sure anyone's issue with AccuFace can be tracked down if they are willing to either share what video they are using OR use a standard video someone else is using. I haven't tweaked the settings at all (part of it might be due to what video card/drivers are being used, but until someone shares their problems it's hard to help).

Since I've now bought four facial animation solutions in the past decade or so, I have to say I'm really impressed we can do it so easily now. Those of you who used some of the primitive ones in the past, including ones where you had to put markers on your face (ugh), have no idea how sophisticated the ones we have now are.
By R Ham - Last Year
Below is an example of what I got with AccuFace.

I finally assumed there were some default iClone settings that were required for AccuFace to work properly, and that I had altered them during a "press every button" session. If I had seen a "Reset all relevant settings to default" button, I would have used it.

I did fiddle with the settings as best I could. I could get it to a point where the mouth looked normal when it was closed, but not when it was moving.

https://forum.reallusion.com/uploads/images/d9a1e041-3490-47cc-a558-c15f.bmp
By R Ham - Last Year
AutoDidact (4/3/2024)
Here are some of my recent tests using an AI-generated facial lip-sync as the video source, with the free Live Link CSV importer for Blender, on a Daz Genesis figure, which comes with the 52 ARKit facial blend shapes already built in.

The first test in the first vid is the best. The second test in the first vid can't quite get its lips closed between words. Why the difference? The second vid suggests that only one brow is being monitored. That seems odd. If animation is imported, there's no need for face tracking at all. Does this plugin allow the movement upward of a single eyebrow?
By Kelleytoons - Last Year
R Ham (4/3/2024)
Below is an example of what I got with AccuFace.

I finally assumed there were some default iClone settings that were required for AccuFace to work properly, and that I had altered them during a "press every button" session. If I had seen a "Reset all relevant settings to default" button, I would have used it.

I did fiddle with the settings as best I could. I could get it to a point where the mouth looked normal when it was closed, but not when it was moving.

https://forum.reallusion.com/uploads/images/d9a1e041-3490-47cc-a558-c15f.bmp


So what video were you using?  Without that we can't accurately help you (if we can standardize on what we are using we can track this down).
By R Ham - Last Year
Kelleytoons (4/3/2024)
So what video were you using?  Without that we can't accurately help you (if we can standardize on what we are using we can track this down).


Ah, what video source. The source was a live webcam. A Logitech Brio.
By AutoDidact - Last Year
The first test in the first vid is the best. The second test in the first vid can't quite get its lips closed between words. Why the difference?


Well, if you look carefully at the source video being tracked, the guy is a bit of a "mouth breather" and leaves his mouth open a lot between words.

The blonde lady video source is an AI-generated talking head I made last year.

If animation is imported, there's no need for face tracking at all. 


I think you misunderstand my process. This is not "live" face tracking recorded in real time. You (or some remote actor) record a video, and the Face Landmarker app basically detects and tracks the face motion and saves a CSV file, which is imported into Blender and applied to a Genesis figure via the free Diffeomorphic addon.
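For anyone curious what that pipeline looks like under the hood, here is a minimal sketch of the extraction half using MediaPipe's Face Landmarker Python API (the same Google tech named in the thread title). The file names and CSV layout here are my own illustrative assumptions, not the exact format the Live Link CSV importer expects.

```python
# Sketch: extract per-frame ARKit-style blendshape scores from a video
# with MediaPipe Face Landmarker and dump them to a CSV.
# Assumption: file names and CSV layout are illustrative only.
import csv
import cv2
import mediapipe as mp
from mediapipe.tasks import python as mp_tasks
from mediapipe.tasks.python import vision

options = vision.FaceLandmarkerOptions(
    base_options=mp_tasks.BaseOptions(model_asset_path="face_landmarker.task"),
    output_face_blendshapes=True,          # the 52 ARKit-style coefficients
    running_mode=vision.RunningMode.VIDEO,
)

cap = cv2.VideoCapture("actor.mp4")        # hypothetical source video
fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
names, rows = None, []

with vision.FaceLandmarker.create_from_options(options) as landmarker:
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        image = mp.Image(image_format=mp.ImageFormat.SRGB, data=rgb)
        # Timestamps must increase monotonically in VIDEO mode.
        result = landmarker.detect_for_video(image, int(frame_idx * 1000 / fps))
        if result.face_blendshapes:
            shapes = result.face_blendshapes[0]   # first detected face
            if names is None:
                names = [s.category_name for s in shapes]
            rows.append([s.score for s in shapes])
        frame_idx += 1
cap.release()

with open("face_capture.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(names)
    writer.writerows(rows)
```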

I believe AccuFace does something similar with pre-recorded video sources, yes?
However, Face Landmarker does not restrict me to Daz or any figure ecosystem, as it works in Blender with any rig that has the 52 ARKit blend shapes, such as the free avatars from the "Ready Player Me" website.
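The Blender-side import is conceptually just as simple. A rough sketch of applying such a CSV to any mesh that has the 52 ARKit shape keys, assuming the CSV columns are named after the shape keys (the real importer's format may differ):

```python
# Blender sketch: read a per-frame blendshape CSV and keyframe matching
# shape keys on the active mesh. Assumes CSV columns are named after
# the ARKit shape keys (e.g. "jawOpen"); a real importer may differ.
import csv
import bpy

obj = bpy.context.object                  # mesh with ARKit shape keys
key_blocks = obj.data.shape_keys.key_blocks

with open("/tmp/face_capture.csv") as f:
    for frame, row in enumerate(csv.DictReader(f), start=1):
        for name, value in row.items():
            kb = key_blocks.get(name)
            if kb is None:                # skip columns with no shape key
                continue
            kb.value = float(value)
            kb.keyframe_insert("value", frame=frame)
```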



That said, Face Landmarker comes with a secondary panel that does have a live face tracking option, but I can't be bothered to set up the connection with my Android phone, as I prefer curated video sources and don't do any live streaming online, etc.
  
Does this plugin allow the movement upward of a single eyebrow?



If the actor in the source video does a single "Spock-like" eyebrow raise, then yes, this app will record it into the CSV data.
However, capturing such subtleties is moot for me, as I consider ALL mocap to be a base-layer starting point, where you should always have the option of layering on top with discrete, non-destructive layers via a viewport control rig for body and face.
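For what it's worth, Blender's NLA system makes that base-layer idea concrete: the baked capture can be parked in a strip while manual tweaks are keyed non-destructively on top. A small sketch, assuming the capture was already keyframed into the shape-key action (track and strip names are mine):

```python
# Sketch: park the baked facial capture in an NLA strip so manual
# tweaks can be keyed non-destructively on top of it.
# Assumes the capture was keyframed into the shape-key action first.
import bpy

shape_keys = bpy.context.object.data.shape_keys
ad = shape_keys.animation_data
track = ad.nla_tracks.new()
track.name = "FaceCapture_Base"           # hypothetical layer name
track.strips.new("capture", start=1, action=ad.action)
ad.action = None   # new keyframes now layer on top of the NLA base
```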

https://forum.reallusion.com/uploads/images/d5b5b7f3-0e8d-4510-be7d-d51a.jpg

By Kelleytoons - Last Year
R Ham (4/3/2024)
Kelleytoons (4/3/2024)
So what video were you using?  Without that we can't accurately help you (if we can standardize on what we are using we can track this down).


Ah, what video source. The source was a live webcam. A Logitech Brio.


Yeah, the problem is everyone needs to be using the same source.  I'd advise you to either create or find a video source you can share so that others can try and reproduce your issues.
By R Ham - Last Year
Kelleytoons (4/3/2024)
Yeah, the problem is everyone needs to be using the same source.  I'd advise you to either create or find a video source you can share so that others can try and reproduce your issues.


I see. Thanks for your offer of help. I've already deleted the AccuFace eval version. I'll watch for AccuFace v2.0 and try it again.

I find it interesting that Nvidia is now involved in both iClone and Daz software. I'm referring to the Daz iRay decal node.

EDIT:
I see that AccuFace has now gone from an overpriced $250 to a totally ludicrous $400! I pass.

By R Ham - Last Year
AutoDidact (4/3/2024)
We have a free video-based solution for Blender called "Face Landmarker".


Yeah, there's activity in the Blender animation plugin world these days.

Mocap is one area of Blender plugin activity. UV Mapping is another. Looking at these is on my list of things to do.
By AutoDidact - Last Year
I find it interesting that Nvidia is now involved in both iClone and Daz software. I'm referring to the Daz iRay decal node.



Actually, the Iray engine has shipped for FREE with Daz Studio since 2015.
However, Daz Studio will render Iray on your CPU if you have no NVIDIA GPU (very slowly, obviously).
When Reallusion offered Iray as a paid plugin, I warned people in this community that it was too slow for animation and suggested they install Daz Studio and test Iray themselves, yet many insisted that Reallusion was somehow going to "optimize" Iray for "realtime" (in the age before RTX path tracing), and we know how that all turned out.

 
Daz appears to be about to make the move to RTX and Omniverse, though, with some AI-assisted character creator thingy.





Mocap is one area of Blender plugin activity. UV Mapping is another. Looking at these is on my list of things to do.



The Blender ecosystem has a massive add-on market because of the low cost of entry to become a developer.
If you are a skilled Python coder, you can grab a copy of Blender for free and try to make something for yourself, and even sell it.
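The barrier really is that low; a sellable add-on can start as a single Python file. A minimal skeleton, with a placeholder operator name and labels:

```python
# Minimal Blender add-on skeleton: one file is enough to get started.
# The operator name and labels below are placeholders.
bl_info = {
    "name": "My First Addon",
    "blender": (3, 0, 0),
    "category": "Object",
}

import bpy

class OBJECT_OT_hello(bpy.types.Operator):
    """Report a greeting in the status bar"""
    bl_idname = "object.hello"
    bl_label = "Say Hello"

    def execute(self, context):
        self.report({'INFO'}, "Hello from a Blender add-on")
        return {'FINISHED'}

def register():
    bpy.utils.register_class(OBJECT_OT_hello)

def unregister():
    bpy.utils.unregister_class(OBJECT_OT_hello)

if __name__ == "__main__":
    register()
```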

By R Ham - Last Year
AutoDidact (4/3/2024)
If you are a skilled Python coder, you can grab a copy of Blender for free and try to make something for yourself, and even sell it.


I can't drive the car and build the car at the same time.

Nobody can.

By AutoDidact - Last Year
I can't drive the car and build the car at the same time.

Nobody can.
Well, I am thankful that there are people who know both how to use (drive) Blender and how to "get under the hood" and add needed features not available in the base program, and who give them away free or sell the add-ons at a very reasonable price on Blender Market.
When it comes to character animation (not VFX simulation), Blender, with about $200 worth of add-ons, is on par with Autodesk Maya.

I know because I have had a Maya Indie license since December.
I remember when Reallusion announced the addition of a Python API in iClone 7; many thought we would see a lot of new plugins to add missing/requested features to iClone, but that turned out not to be so much the case.

It would be cool if it were possible to port this FREE open-source ragdoll physics Python plugin to iClone, though. :D
https://bitbucket.org/PhysicalcSoftware/poserphysics/src/master/