
A Comparison Between Apple ARKit, NVIDIA Maxine, and Google MediaPipe Facial Capture Solutions

Posted By sunnyviewtech Last Year
AutoDidact
Posted Last Year
Distinguished Member
Group: Forum Members
Last Active: 2 Months Ago
Posts: 2.1K, Visits: 13.6K
I can't drive the car and build the car at the same time.

Nobody can.
Well, I am thankful that there are people who know both how to use (drive) Blender and how to "get under the hood" to add features that are missing from the base program, then give the add-ons away free or sell them at a very reasonable price on Blender Market.
When it comes to character animation (not VFX simulation), Blender with about $200 worth of add-ons is on par with Autodesk Maya.

I know because I have had a Maya Indie license since December.
I remember when Reallusion announced the addition of a Python API in iClone 7; many thought we would see a lot of new plugins adding missing/requested features to iClone, but that turned out not to be the case.

It would be cool if it were possible to port this FREE open-source ragdoll physics Python plugin to iClone though. :D
https://bitbucket.org/PhysicalcSoftware/poserphysics/src/master/

RAG DOLL COLLISION ANIMATIONS FOR ICLONE 8 & 7
---------------------------------------------------------------------------------------------------------------------
Ghost Origins
My latest feature-length film, created with iClone.
https://forum.reallusion.com/uploads/images/adf9b210-df59-4cb6-aa1b-9de5.jpg
My Sci- Fi Graphic Novel on Amazon: https://a.co/d/9k3cwoY


R Ham
Posted Last Year
Distinguished Member
Group: Forum Members
Last Active: 2 Months Ago
Posts: 628, Visits: 4.1K
AutoDidact (4/3/2024)
If you are a skilled Python coder, you can grab a copy of Blender for free and try to make something for yourself, and even sell it.


I can't drive the car and build the car at the same time.

Nobody can.



"Less clicks good, more clicks bad."
AutoDidact
Posted Last Year
Distinguished Member
Group: Forum Members
Last Active: 2 Months Ago
Posts: 2.1K, Visits: 13.6K
I find it interesting that Nvidia is now involved in both iClone and Daz software. I'm referring to the Daz iRay decal node.



Actually, the Iray engine has shipped for FREE with Daz Studio since 2015.
However, Daz Studio will render Iray on your CPU if you have no NVIDIA GPU (very slowly, obviously).
When Reallusion offered Iray as a paid plugin, I warned people in this community that it was too slow for animation and suggested they install Daz Studio and test Iray themselves, yet many insisted that Reallusion was somehow going to "optimize" Iray for "realtime" (in the age before RTX path tracing), and we know how that all turned out.

 
Daz appears to be about to make the move to RTX and Omniverse though, with some AI-assisted character creator thingy.





Mocap is one area of Blender plugin activity. UV Mapping is another. Looking at these is on my list of things to do.



The Blender ecosystem has a massive add-on market because of the low cost of entry to become a developer.
If you are a skilled Python coder, you can grab a copy of Blender for free and try to make something for yourself, and even sell it.
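To give a sense of how low that barrier is, here is a minimal, hypothetical add-on skeleton (the names and version numbers are placeholders, not any particular marketplace add-on): a single .py file with a bl_info dict, one operator class, and register/unregister functions is already installable from Edit > Preferences > Add-ons.

```python
# Minimal, hypothetical Blender add-on skeleton -- purely illustrative.
bl_info = {
    "name": "My First Addon",          # placeholder name
    "author": "Your Name",
    "version": (0, 1, 0),
    "blender": (3, 6, 0),              # minimum Blender version this targets
    "category": "Object",
}

import bpy


class OBJECT_OT_my_first_addon(bpy.types.Operator):
    """Report the name of every selected object."""
    bl_idname = "object.my_first_addon"
    bl_label = "My First Addon"
    bl_options = {"REGISTER", "UNDO"}

    def execute(self, context):
        for obj in context.selected_objects:
            self.report({"INFO"}, obj.name)
        return {"FINISHED"}


def register():
    bpy.utils.register_class(OBJECT_OT_my_first_addon)


def unregister():
    bpy.utils.unregister_class(OBJECT_OT_my_first_addon)
```

Once enabled, the operator appears in Blender's operator search (F3); panels, properties, and Blender Market packaging all build on the same pattern.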






R Ham
Posted Last Year
Distinguished Member
Group: Forum Members
Last Active: 2 Months Ago
Posts: 628, Visits: 4.1K
AutoDidact (4/3/2024)
We have a free video-based solution for Blender called "Face Landmarker".


Yeah, there's activity in the Blender animation plugin world these days.

Mocap is one area of Blender plugin activity. UV Mapping is another. Looking at these is on my list of things to do.


"Less clicks good, more clicks bad."
R Ham
Posted Last Year
Distinguished Member
Group: Forum Members
Last Active: 2 Months Ago
Posts: 628, Visits: 4.1K
Kelleytoons (4/3/2024)
Yeah, the problem is everyone needs to be using the same source.  I'd advise you to either create or find a video source you can share so that others can try and reproduce your issues.


I see. Thanks for your offer of help. I've already deleted the AccuFace eval version. I'll watch for AccuFace v2.0 and try it again.

I find it interesting that Nvidia is now involved in both iClone and Daz software. I'm referring to the Daz iRay decal node.

EDIT:
I see that AccuFace has now gone from an overpriced $250 to a totally ludicrous $400! I pass.



"Less clicks good, more clicks bad."
Kelleytoons
Posted Last Year
Distinguished Member
Group: Forum Members
Last Active: Last Year
Posts: 9.2K, Visits: 22.1K
R Ham (4/3/2024)
Kelleytoons (4/3/2024)
So what video were you using?  Without that we can't accurately help you (if we can standardize on what we are using we can track this down).


Ah, what video source. The source was a live webcam. A Logitech Brio.


Yeah, the problem is everyone needs to be using the same source.  I'd advise you to either create or find a video source you can share so that others can try and reproduce your issues.



Alienware Aurora R16, Win 11, i9-14900KF, 3.20GHz CPU, 64GB RAM, RTX 4090 (24GB), Samsung 870 Pro 8TB, Gen3 NVMe M.2 SSD, 4TBx2, 39" Alienware Widescreen Monitor
Mike "ex-genius" Kelley
AutoDidact
Posted Last Year
Distinguished Member
Group: Forum Members
Last Active: 2 Months Ago
Posts: 2.1K, Visits: 13.6K
The first test in the first vid is the best. The second test in the first vid can't quite get its lips closed between words. Why the difference?


Well, if you look carefully at the source video being tracked, the guy is a bit of a "mouth breather" and leaves his mouth open a lot between words.

The blond lady video source is an AI-generated talking head I made last year.

If animation is imported, there's no need for face tracking at all. 


I think you misunderstand my process.
This is not "live" face tracking recorded in real time.
You (or some remote actor) record a video, and the Face Landmarker app detects and tracks the face motion and saves a CSV file that is imported into Blender and applied to a Genesis figure via the free Diffeomorphic addon.
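For anyone curious what that detect-and-track step boils down to, here is a rough sketch (my own illustration, not the actual Face Landmarker add-on code) that uses Google's MediaPipe Tasks API to run the Face Landmarker model over a pre-recorded clip and write the ARKit-style blendshape scores to a CSV, one row per frame. The file names are placeholders, it assumes the face_landmarker.task model file has been downloaded and that mediapipe and opencv-python are installed, and the column layout is simplified compared to what the Live Link CSV importer actually expects.

```python
# Sketch: pre-recorded video -> per-frame ARKit-style blendshape scores -> CSV.
# Assumes mediapipe + opencv-python are installed and face_landmarker.task is
# downloaded from Google's MediaPipe model page. File names are placeholders.
import csv
import cv2
import mediapipe as mp
from mediapipe.tasks.python import BaseOptions, vision

VIDEO_IN = "actor_take01.mp4"                # hypothetical source clip
CSV_OUT = "actor_take01_blendshapes.csv"
MODEL = "face_landmarker.task"

options = vision.FaceLandmarkerOptions(
    base_options=BaseOptions(model_asset_path=MODEL),
    running_mode=vision.RunningMode.VIDEO,   # track a video file, not a live stream
    output_face_blendshapes=True,            # this is what yields the ARKit-style scores
    num_faces=1,
)

cap = cv2.VideoCapture(VIDEO_IN)
fps = cap.get(cv2.CAP_PROP_FPS) or 30.0

with vision.FaceLandmarker.create_from_options(options) as landmarker, \
        open(CSV_OUT, "w", newline="") as f:
    writer = None
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        mp_image = mp.Image(image_format=mp.ImageFormat.SRGB, data=rgb)
        result = landmarker.detect_for_video(mp_image, int(frame_idx / fps * 1000))
        if result.face_blendshapes:
            shapes = result.face_blendshapes[0]          # first (and only) tracked face
            if writer is None:
                # One column per blendshape name (jawOpen, browOuterUpLeft, ...).
                writer = csv.DictWriter(f, ["frame"] + [s.category_name for s in shapes])
                writer.writeheader()
            row = {"frame": frame_idx}
            row.update({s.category_name: round(s.score, 5) for s in shapes})
            writer.writerow(row)
        frame_idx += 1
cap.release()
```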

I believe AccuFace does something similar with pre-recorded video sources, yes?
However, it does not restrict me to Daz or any figure ecosystem, as it works in Blender with any rig that has the 52 ARKit blend shapes, such as the free avatars from the "Ready Player Me" website.
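On the Blender side the Diffeomorphic/Live Link importer does the real work; purely to illustrate the idea, a hand-rolled version of the import step could look roughly like this (it is not that add-on's code, and it assumes the active mesh already has shape keys named after the ARKit blendshapes, e.g. jawOpen, and a CSV laid out like the sketch above).

```python
# Illustrative only: keyframe same-named shape keys on the active mesh from a
# per-frame blendshape CSV. Run in Blender's Scripting workspace with a mesh
# carrying the 52 ARKit shape keys selected.
import csv
import bpy

CSV_IN = "//actor_take01_blendshapes.csv"    # '//' means relative to the .blend file

obj = bpy.context.active_object
key_blocks = obj.data.shape_keys.key_blocks  # assumes the mesh already has shape keys

with open(bpy.path.abspath(CSV_IN), newline="") as f:
    for row in csv.DictReader(f):
        frame = int(row.pop("frame"))
        for name, value in row.items():
            kb = key_blocks.get(name)
            if kb is None:                   # skip scores with no matching shape key (e.g. _neutral)
                continue
            kb.value = float(value)
            kb.keyframe_insert(data_path="value", frame=frame)
```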



That said, it comes with a secondary panel that does have a live face-tracking option, but I can't be bothered to set up the connection with my Android phone, as I prefer curated video sources and don't do any live streaming online, etc.
  
Does this plugin allow the movement upward of a single eyebrow?



If the actor in the source video does a single "Spock-like" eyebrow raise, then yes, this app will record it into the CSV data.
However, capturing such subtleties is moot for me, as I consider ALL mocap to be a base-layer starting point, where you should always have the option of layering on top with discrete, non-destructive layers via a viewport control rig for body and face.

https://forum.reallusion.com/uploads/images/d5b5b7f3-0e8d-4510-be7d-d51a.jpg






R Ham
Posted Last Year
Distinguished Member
Group: Forum Members
Last Active: 2 Months Ago
Posts: 628, Visits: 4.1K
Kelleytoons (4/3/2024)
So what video were you using?  Without that we can't accurately help you (if we can standardize on what we are using we can track this down).


Ah, what video source. The source was a live webcam. A Logitech Brio.


"Less clicks good, more clicks bad."
Kelleytoons
Posted Last Year
Distinguished Member
Group: Forum Members
Last Active: Last Year
Posts: 9.2K, Visits: 22.1K
R Ham (4/3/2024)
Below is an example of what I got with AccuFace.

I finally assumed there were some default iClone settings that were required for AccuFace to work properly, and that I had altered them during a "press every button" session. If I had seen a "Reset all relevant settings to default" button, I would have used it.

I did fiddle with the settings as best I could. I could get it to a point where the mouth looked normal when it was closed, but not when it was moving.

https://forum.reallusion.com/uploads/images/d9a1e041-3490-47cc-a558-c15f.bmp


So what video were you using?  Without that we can't accurately help you (if we can standardize on what we are using we can track this down).



R Ham
Posted Last Year
Distinguished Member
Group: Forum Members
Last Active: 2 Months Ago
Posts: 628, Visits: 4.1K
AutoDidact (4/3/2024)
Here are some of my recent tests, using an AI-generated facial lip-sync as the video source, with the free Live Link CSV importer for Blender, on a Daz Genesis figure, which comes with the 52 ARKit facial blend shapes already built in.

The first test in the first vid is the best. The second test in the first vid can't quite get its lips closed between words. Why the difference? The second vid suggests that only one brow is being monitored. That seems odd. If animation is imported, there's no need for face tracking at all. Does this plugin allow the movement upward of a single eyebrow?


"Less clicks good, more clicks bad."
