I got a rough version with at least head movement working from a PC webcam, but there is a big problem: CPU usage. I'm doing dlib-based facial feature extraction on 640x360 frames at 30 fps, and it stutters badly when running inside iClone.
Even though I do the heavy lifting in a background thread, that doesn't help much: my i7 sits at around 40% usage and the avatar's facial movements are jerky and laggy.
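On the tracking side, most of dlib's cost usually sits in the HOG face detector rather than in the 68-point landmark predictor, so one thing I still want to try is running the detector only every N frames on a downscaled image and reusing the cached face box for the predictor in between. A rough sketch (assuming the standard shape_predictor_68_face_landmarks.dat model; not yet tested in this pipeline):

import cv2
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

DETECT_EVERY = 10  # run the expensive HOG detector only every N frames
last_rect = None
frame_idx = 0

def get_landmarks(frame_bgr):
    global last_rect, frame_idx
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    if last_rect is None or frame_idx % DETECT_EVERY == 0:
        # detect on a half-size image to cut the detector cost roughly 4x
        small = cv2.resize(gray, None, fx=0.5, fy=0.5)
        faces = detector(small, 0)
        if faces:
            f = faces[0]
            # scale the detection box back up to full resolution
            last_rect = dlib.rectangle(f.left() * 2, f.top() * 2,
                                       f.right() * 2, f.bottom() * 2)
    frame_idx += 1
    if last_rect is None:
        return None
    # the landmark predictor is cheap, so it can run every frame
    return predictor(gray, last_rect)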
On the iClone side, maybe there is a more lightweight way to animate the avatar's face (actually just the head for now) in real time than my current solution of
face_component.AddClip(RLPy.RTime(0), "Clip", RLPy.RTime(1))
result = face_component.AddHeadKey(RLPy.RTime(0), [v, h, t])
but I couldn't find anything better in the API documentation.
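For context, the full update path is roughly this (assuming the avatar is the currently selected object; v, h and t are the head angles coming from the tracker, and I keep overwriting the key at t=0 to puppet the head live):

import RLPy

avatar = RLPy.RScene.GetSelectedObjects()[0]  # assumes the avatar is selected
face_component = avatar.GetFaceComponent()
face_component.AddClip(RLPy.RTime(0), "Clip", RLPy.RTime(1))

def apply_head_pose(v, h, t):
    # overwrite the single key at t=0 on every tracker update
    face_component.AddHeadKey(RLPy.RTime(0), [v, h, t])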
I might get better results if I separated the image processing into a standalone Qt app (I actually tested the feature extraction that way before integrating it into iClone), but there is still the issue of noticeable tremors: dlib does not guarantee feature point stability between frames, even after applying heavy Gaussian blur to the source frames. Detecting and filtering out the false feature point movements would require an additional algorithm, which would slow things down even more.
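Then again, a cheap temporal filter might do the job without the extra cost: something like the One Euro filter is just a few multiplications per value per frame and was designed exactly for jittery tracking signals. A sketch of what I have in mind (the freq, min_cutoff and beta parameters are guesses and would need tuning):

import math

class OneEuroFilter:
    """Adaptive low-pass filter: smooths hard when the signal is slow,
    follows quickly when it moves fast (Casiez et al., 2012)."""
    def __init__(self, freq=30.0, min_cutoff=1.0, beta=0.05, d_cutoff=1.0):
        self.freq = freq            # sampling rate in Hz (30 fps here)
        self.min_cutoff = min_cutoff
        self.beta = beta            # speed coefficient: higher = less lag
        self.d_cutoff = d_cutoff
        self.x_prev = None
        self.dx_prev = 0.0

    def _alpha(self, cutoff):
        tau = 1.0 / (2.0 * math.pi * cutoff)
        return 1.0 / (1.0 + tau * self.freq)

    def __call__(self, x):
        if self.x_prev is None:
            self.x_prev = x
            return x
        # low-pass filtered derivative of the signal
        dx = (x - self.x_prev) * self.freq
        a_d = self._alpha(self.d_cutoff)
        dx_hat = a_d * dx + (1.0 - a_d) * self.dx_prev
        # cutoff grows with speed: a steady head gets heavy smoothing
        cutoff = self.min_cutoff + self.beta * abs(dx_hat)
        a = self._alpha(cutoff)
        x_hat = a * x + (1.0 - a) * self.x_prev
        self.x_prev, self.dx_prev = x_hat, dx_hat
        return x_hat

# one filter per head angle; feed the raw tracker values through each frame
v_f, h_f, t_f = OneEuroFilter(), OneEuroFilter(), OneEuroFilter()
smoothed = [flt(raw) for flt, raw in zip((v_f, h_f, t_f), (0.1, -0.2, 0.0))]

The nice property is that it smooths aggressively while the head is still (which should kill the tremors) but tracks fast movements with little added latency.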
I'm not sure whether I want to invest more time experimenting with this or just buy an iPhone :D