
Facial motion capture from a single web camera - is it possible to implement with Python API?

Posted By midix 4 Years Ago


Author
Message
midix
Posted 4 Years Ago
Distinguished Member


Group: Forum Members
Last Active: Last Year
Posts: 59, Visits: 362
I recently saw a video about a quick-and-dirty facial mocap setup for Blender using OpenCV and Python:
https://www.youtube.com/watch?v=O7nNO3FLkLU

There are also other neural network projects that can extract facial feature positions in real time:
https://github.com/ageitgey/face_recognition
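
For context, the capture side is only a few lines. Here is a rough sketch (using OpenCV for the webcam and face_recognition for the landmarks; nothing iClone-specific yet):

import cv2
import face_recognition

# Rough sketch: read webcam frames and extract facial landmark points.
cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # face_recognition expects RGB
    for face in face_recognition.face_landmarks(rgb):
        # 'face' maps feature names to lists of (x, y) points, e.g.
        # 'top_lip', 'bottom_lip', 'left_eyebrow', 'nose_tip', ...
        print(face['top_lip'][0], face['bottom_lip'][0])
    cv2.imshow('webcam', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()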

I'm wondering if it would be possible to implement a similar plugin for iClone characters. Do we have Python access to the character morph data? When OpenCV reports that a specific point on the actor's face has moved, how do we map that to the iClone character's face? Can we leverage the existing iClone facial morphs, or would it require creating virtual mesh regions for facial features and recalculating all the mesh transformations in Python code?
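
Conceptually, the in-between step would just reduce the landmark positions to normalized weights and feed them to whatever morph sliders the API exposes. A purely hypothetical sketch (the morph names below are made up, and whether iClone's Python API accepts anything like them is exactly my question); it reuses the 'face' dict from the snippet above:

def mouth_open_weight(landmarks, neutral_gap=5.0, max_gap=40.0):
    # Vertical distance between inner upper and lower lip, clamped to 0..1.
    top_y = landmarks['top_lip'][9][1]
    bottom_y = landmarks['bottom_lip'][9][1]
    gap = abs(bottom_y - top_y)
    return min(max((gap - neutral_gap) / (max_gap - neutral_gap), 0.0), 1.0)

# Placeholder morph names; how to push these values to the character
# through the iClone Python API is the open question.
morph_weights = {
    'Mouth_Open': mouth_open_weight(face),
    # ... same idea for brows, blinks, jaw, etc.
}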

I'm a programmer, but I have little experience with Python and facial animation. I've been reading articles and watching videos to gain some knowledge for my Unreal Engine hobby project. As it's a hobby, I don't feel ready to spend that much on an iPhone and the official Live plugins; I have already spent enough on iClone and CC3 themselves. So a cheap solution with a single webcam would be awesome (but I have doubts whether Reallusion would support it, considering that they have to promote their official mocap solutions).

ADDED LATER:
I looked at the current API limitations and I see that mesh-level components are on the Inoperable Assets list. So there is no way to go that route (yet). But maybe we have access to the facial puppeteering controls; I will take a look...
Edited
4 Years Ago by midix
animagic
Posted 4 Years Ago
Distinguished Member


Group: Forum Members
Last Active: 3 hours ago
Posts: 15.7K, Visits: 30.5K
midix (5/7/2020)
 I'm a programmer, but I have little experience with Python and facial animation. I've been reading articles and watching videos to gain some knowledge for my Unreal Engine hobby project. As it's a hobby, I don't feel ready to spend that much on an iPhone and the official Live plugins; I have already spent enough on iClone and CC3 themselves. So a cheap solution with a single webcam would be awesome (but I have doubts whether Reallusion would support it, considering that they have to promote their official mocap solutions).

RL is not in the habit of deliberately blocking options, especially since all current facial Mocap solutions are third-party. Just draw up a plan and make a request if additional API access is needed. RL has always maintained that the API is a work in progress, and the idea is of course to extend iClone's functionality. 





