iPhone face mocap: noise & jitter


https://forum.reallusion.com/Topic441949.aspx

By Mikay² - 6 Years Ago
I'm thinking about motion capture with an iPhone. On the one hand, the demos look very promising. However, what completely destroys the mocap is the noise and jitter that I see in all the videos I have found so far. Is it possible to remove that noise and jitter in iClone?
Are there any videos that demonstrate this?
By kungphu - 6 Years Ago
Good question I'd definitely like the answer to. In some of the videos using it, perhaps the person wasn't capturing while in minimal mode? When I've captured while in High mode I've gotten a substantial amount of jitter. Plus I think it also relies on your internet connection. Kelleytoons has managed to get it working via USB. I'm not sure exactly the method, but it seems like that would be a faster way to transfer data. I use it frequently, and sometimes it's pretty good and sometimes I do wish I could smooth the capture.
By Kelleytoons - 6 Years Ago
Nowadays I capture using Ethernet but, honestly, I don't see the jitter you're referring to -- then again, everyone has a different eye and expects different things.  Can you point us towards a video where you see this jitter?  If I don't see it there, I can't help you further.
By Mikay² - 6 Years Ago
I think it is quite obvious here:
https://youtu.be/89dW2LFR07I?t=18

As I am just a beginner, I can only guess what causes the problem. To me, it seems that the problem is mostly caused by noise, but sometimes it appears that the sampling rate is too low, which creates jumps. If the sampling rate of the iPhone is limited, the only solution is to move more slowly.
However, the noise could be improved by iClone. Does it have some form of low-pass filter to smooth the movements and remove the jitter?
By Kelleytoons - 6 Years Ago
Um, no -- about 98% of that video is just showing the live capture.  Has very little to do with the animation results (there is about a one second animation at the end that looks fine to me).

Show me an ANIMATION made with this that has the jitter (you can't compare the actual live capture -- that's like assuming iClone does real time rendering, which it clearly does not.  It takes at least three or four times as long -- sometimes MUCH longer -- to render something, at which point a lot of smoothing occurs).
By Mikay² - 6 Years Ago
Kelleytoons (4/17/2020)
Um, no -- about 98% of that video is just showing the live capture.  Has very little to do with the animation results (there is about a one second animation at the end that looks fine to me).

Show me an ANIMATION made with this that has the jitter (you can't compare the actual live capture -- that's like assuming iClone does real time rendering, which it clearly does not.  It takes at least three or four times as long -- sometimes MUCH longer -- to render something, at which point a lot of smoothing occurs).


Thanks for pointing that out. My assumption was that this is the result, as I have not yet found any marketing video that shows the final output. 

So my point is that I cannot show you what I am asking for. All the demos I found were either jittery or movements were so overexaggerated that it was hard to make a judgment about the quality of the recording.

So far, I have mostly used Character Creator. Now I'm getting to the stage where I want to do animation for Unreal.
By kungphu - 6 Years Ago
I've gotten jitter before, but my internet does suck, so I'm not sure if that's what caused it. The last time I recorded facial capture I was in High mode (which was stupid) and I did get jitter. I've stopped doing that, with good results, but the last trailer I did used a weird shark character. I set the facial capture at about 110 strength overall and there was definite jitter. I was on a deadline so I didn't really have time to redo it and see if I could reduce the effect. I do toons, so I crank up the strength. During the quarantine my WiFi has been awful; I'm guessing that has had an effect on my captures.

Mike do you have a tute for the setup? I’ll do a search on the forum.

@Mikay you’ll quickly find KT is an invaluable source of help on the forum. Lots of helpful members that don’t mind sharing their learnings here which is a welcome change from most forums.
By Kelleytoons - 6 Years Ago
I was asking to see examples because, honestly, everyone sees things differently.  You definitely will see live capture not as smooth as animation.

I'll post some examples of my own video -- if you see jitter in these I cannot help you, as I do not (and like what I've done).  I'm not saying these are the best you can do, but they are pretty representative of what can be achieved with Live Face (however -- be aware of one important thing.  Two of these weren't done "live" -- which is to say I lipsynced to the actual voice(s).  So the sync may be off on those a bit at times.  But the sync isn't the issue here, the smoothness of what can get captured is, and these demonstrate how something actually animates):





By Mikay² - 6 Years Ago
Haha, "Never Too Late" is damn funny ^^

@Mikay you’ll quickly find KT is an invaluable source of help on the forum.

I know, I subscribed to his Youtube channel before I even bought iclone.

For me, this is related to the uncanny valley. People like the increase in realism until it gets fairly realistic but not realistic enough to nail it. Using cartoon characters is a way to avoid this. However, if you strive for realism, it gets more difficult. I'll give you an example of very smooth mocap:
https://www.youtube.com/watch?v=pySDflhRMs4

Apart from the animated wrinkle maps, there is obviously a difference in the capture quality. Yes, it is true, with 20% of the effort you get 80% of the outcome, but I am willing to go for more, and I just want to know what my options are.
I found this feature for lip sync and tested it: https://www.youtube.com/watch?v=cb3jvt4H2Rs
This vastly improves the lip sync. Is there a similar feature for smoothing motion capture data from the iphone?
By Kelleytoons - 6 Years Ago
Yeah, uncanny valley indeed.

For your example that you posted I would suggest that has nothing to do with the mocap per se, but the model itself.  It is FAR more detailed than an iClone/CC3 model.  I think THAT'S where the smoothness you are seeing comes in.  We are told that a new CC (and, I assume, iClone) model will be coming with improved facial features and that this will also give us better visemes and thus I suspect that will get us closer to that if that's what you want (not me but, as you say, some folks strive for "realism").  But, again, nothing to do with Live Face.

I suppose the way to prove this would be to take some Live Face animation and export it to, what, Unreal (was that the software used?) and somehow apply it to that higher-detailed figure, but I doubt that's going to work (right now, for example, we can't use Daz HD avatars -- well, we CAN, but they are scaled down in realism to the CC3 one). My gut tells me the same thing would happen in reverse there, that any added detail in the avatar wouldn't translate the motion well, if at all. But I know nothing about Unreal, so perhaps someone else will chime in here.

So, again -- I don't think it's a limitation of Live Face -- that is easily just as smooth as the animation you show.   But if you want to achieve that level of detail in an avatar, iClone/CC3 ain't the thing for you (at least right now.  If you are patient then by this fall perhaps we'll know a lot more).
By kungphu - 6 Years Ago
@KT I hadn't seen my Lament before; I recall the others but that one was new to me. That was awesome! Sounded like you singing as well. Well done! How did you do the head coming off? Did you use two characters and use opacity to remove the head? I've got that model but haven't used it for anything yet. Joequick's stuff is awesome. As a matter of fact, I used his shark model for the iClone vid, with some ZBrush tweaks, which is where I got the jitter. To be clear, I used to get a lot of jitter before, but that's because I was stupid and recording in High mode. After going to minimal or even quick mode it kinda went away. I got some super strange jitter right at 3:02 on this vid. I was up against a time schedule so I didn't bother fixing it. I think it's a combination of the really weird geometry of the model and my WiFi, which has been awful during the quarantine. It comes and goes; I even have trouble streaming YouTube vids at times. I think that was part of the problem. The Deckard character's facial mocap came out fine and plenty smooth. I did a recent cat-and-mouse vid with similar jitter problems, but it was a 48-hour film and I think I had the scenes in High mode. Plus it was during quarantine as well, so my WiFi was probably way less than optimal. Hence why I was interested in connecting via a hardline somehow.


By Mikay² - 6 Years Ago
From what I read, it is typically necessary to clean up motion capture data, and I doubt that facial animation is different. I found this example where the iPhone X was used for face mocap via FaceAR:
https://youtu.be/AIHoDo7Y4_g?t=938


You see the same noise and jitter. Now compare this to this demo, which was also done with the iPhone X.
https://www.youtube.com/watch?v=lXZhgkNFGfM&t=303s

No jitter and no noise. This is because IKinema's Live Action was used here, which applies real-time correction of noise and jitter. Unfortunately, IKinema is just another company that was bought by Apple. But who knows, maybe all this will come in handy for all of us in the future.
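IKinema's actual correction algorithm isn't public, but real-time jitter removal of this kind is commonly done with an adaptive low-pass filter such as the One Euro filter: smooth hard when the signal barely moves (killing jitter), smooth lightly when it moves fast (avoiding lag). Here is a minimal Python sketch of that idea (the parameter values are illustrative, not anything IKinema uses):

```python
import math

class OneEuroFilter:
    """Adaptive low-pass: cutoff rises with signal speed, so slow jitter
    is smoothed heavily while fast, deliberate motion passes through."""

    def __init__(self, freq, min_cutoff=1.0, beta=0.05, d_cutoff=1.0):
        self.freq = freq              # sample rate in Hz
        self.min_cutoff = min_cutoff  # baseline cutoff (Hz): lower = smoother
        self.beta = beta              # how fast cutoff grows with speed
        self.d_cutoff = d_cutoff     # cutoff for the derivative estimate
        self.x_prev = None
        self.dx_prev = 0.0

    def _alpha(self, cutoff):
        # smoothing factor of a first-order low-pass at this cutoff
        tau = 1.0 / (2.0 * math.pi * cutoff)
        te = 1.0 / self.freq
        return 1.0 / (1.0 + tau / te)

    def __call__(self, x):
        if self.x_prev is None:       # first sample: nothing to smooth yet
            self.x_prev = x
            return x
        # estimate and smooth the derivative (signal speed)
        dx = (x - self.x_prev) * self.freq
        a_d = self._alpha(self.d_cutoff)
        dx_hat = a_d * dx + (1 - a_d) * self.dx_prev
        # adapt the cutoff to the speed, then filter the sample
        cutoff = self.min_cutoff + self.beta * abs(dx_hat)
        a = self._alpha(cutoff)
        x_hat = a * x + (1 - a) * self.x_prev
        self.x_prev, self.dx_prev = x_hat, dx_hat
        return x_hat
```

You would run one filter per tracked channel at the capture rate; `min_cutoff` trades residual jitter against lag, and `beta` controls how quickly fast motion is let through.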

There are iClone animation products for the whole body, and they look very good and the motions are fluid. So I am pretty sure that it is - or will be - possible to create this level of fidelity in the face as well. I agree there are many things that need to be improved, such as the "bending jaw bone", but I decided to go for Reallusion products because I see the potential. The new CC base will be a game-changer and the groundwork for Reallusion's future success. If the CC base works well, it can become an industry standard in the gaming industry - and not just there. Why would anyone who is starting out in this field go for something else when you have it all set up - including wrinkle maps etc.?
If they succeed, they will attract a lot of new customers and a lot of funding.
By kungphu - 6 Years Ago
I've got the IKinema software. I used it for mocap cleanup. I switched to Perception Neuron and I don't have to use much of anything for clean-up; their own software does a good job. All of the faces in my vid were done via the iPhone. For the most part they are smooth. It was just the shark at the end that looks pretty terrible. The price for the iPhone mocap is great. I'm convinced it's my terrible WiFi that is giving me my sporadic fits. Oh, and also the character I was using had really weird geo, to where if I raised my eyebrows his head would sort of swell up. Plus I do toons and really crank up the sensitivity on the mocap capture. I think in the future I should keep it at normal and then just raise the overall expression slider in the animation tab. That'll probably give better results.
By Kelleytoons - 6 Years Ago
Kungphu,

Yeah, me singing (sort of) and, yes, two models for the head, transparent parts on the appropriate ones, and then switched back and forth.  I've done this before (like in my zombie video where the zombie playing the guitar loses his arm -- there are actually THREE different zombies there: one intact, one without the arm, and one that is JUST the arm, all synced and uncovered properly).  Watch it and see if you can tell the switches (I'll bet you can't).

I do think Wi-Fi capture can be iffy, which is why I use Ethernet almost exclusively nowadays.  But I also think we'll need a much higher-detailed model to get the kind of smoothness the OP wants, and I'm not sure that's what will happen (it might -- hard to say).  He says "but body mocap is smooth", but he's missing my point -- the facial mocap is as smooth as the detail in the face allows.  Body detail doesn't have to be high-res to look smooth, but once you start moving around eyes/mouths/cheeks you need LOTS of detail or it WILL look less smooth.  So that's the answer -- you don't need to clean up the facial mocap (nor will it help) unless you have that detail.

So we'll see what the future brings.  For me it's close enough -- I actually capture and render at 24fps because I want a film like look (which isn't as smooth as, say 60fps).  For that reason alone I couldn't care less about the smoothness getting better.  But higher details in the face (and even the body) would make a difference even in just what we could model (right now Daz HD models are remapped rather poorly).  Let's talk by the end of the year about this.


By Mikay² - 6 Years Ago
Kelleytoons (4/19/2020)
Kungphu,
I actually capture and render at 24fps because I want a film like look (which isn't as smooth as, say 60fps). 


Actually, the opposite might be the case. You will probably make it smoother, especially if you export at 24fps and not 30fps. You are right that you will make it less fluid, but you will remove jitter.
When you use 24fps and not 30fps, iClone must interpolate between two frames (at least I hope it does), which acts as a low-pass filter, averaging out higher frequencies. Similarly, you get a better, less noisy image if you reduce its resolution by averaging the pixels. If you export at 30fps, this is not necessarily the case, as iClone can simply take every second frame.
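Whether iClone actually interpolates on export is an assumption on my part, but the averaging effect itself is easy to demonstrate: with linear resampling, any output frame whose time falls between two source frames becomes a weighted average of them, while simple frame-dropping copies the captured jitter through unchanged. A quick Python sketch:

```python
def resample_linear(samples, src_fps, dst_fps):
    """Resample a per-frame animation curve by linear interpolation.

    Output frame i is read at source time i * src_fps / dst_fps, so an
    output frame landing between two source frames blends them -- a
    mild low-pass on frame-to-frame noise.
    """
    n_out = int(len(samples) * dst_fps / src_fps)
    out = []
    for i in range(n_out):
        t = i * src_fps / dst_fps          # position in source frames
        lo = int(t)
        hi = min(lo + 1, len(samples) - 1)
        frac = t - lo
        out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
    return out
```

On a pure alternating-jitter signal, the interpolated frames land between the extremes, so the frame-to-frame jumps shrink; it attenuates the jitter rather than removing it.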

I found a paper on smoothing markerless motion capture by Rosenhahn et al. (2007) that shows what I mean. You can see the jitter in the red signal.
After they applied their algorithm, the signal is way smoother (see the black signal), and this is certainly what IKinema Live Action is doing in one way or another.
This has nothing to do with the quality of the mesh. Even a moving box can have jittery movement.

https://forum.reallusion.com/uploads/images/0bb76fdd-c8ae-44e6-9021-bd59.png



By Kelleytoons - 6 Years Ago
Sorry, but you're wrong, as will be proven when/if the face mesh is increased.

But -- thanks for playing.
By Mikay² - 6 Years Ago
Kelleytoons (4/19/2020)
Sorry, but you're wrong. 

After doing some more research I found some pretty jittery stuff on the web that was recorded with the iPhone X: https://youtu.be/Wxw6MwkfXUI?t=28

But then I found something with pretty good results: https://www.youtube.com/watch?v=XTHvekIJzcs
Turns out this was done with "Face Cap" from Bannaflak.

And guess what they suggest as best practice in their documentation: "Post process your recording by filtering out noise/jitter/glitches."
And guess what they have in their app - a smoothing button that does exactly what I said is needed:

https://forum.reallusion.com/uploads/images/cbb14664-6c87-496d-89d5-50a0.png


So, to get back to my question: is it possible to get at the data in iClone (e.g., through Python) to smooth it?
It would be cool to have the curves available like in the Face Cap app.
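In the meantime, if the per-frame blendshape values can be gotten out of iClone in any form (whether its Python API exposes the facial tracks is exactly my open question, so assume an export to CSV or similar), the Face Cap-style smoothing is just a per-channel low-pass. A sketch in plain Python (the channel name "Jaw_Open" is made up for illustration):

```python
def smooth_curve(values, window=5):
    """Centered moving average over one animation track.

    The window is clamped at the clip boundaries so the first and
    last keys are averaged over fewer samples instead of being
    dragged toward zero.
    """
    half = window // 2
    out = []
    for i in range(len(values)):
        lo = max(0, i - half)
        hi = min(len(values), i + half + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

def smooth_channels(channels, window=5):
    """Apply the filter to every blendshape track independently."""
    return {name: smooth_curve(vals, window) for name, vals in channels.items()}
```

A bigger window means stronger smoothing but also more damping of fast, intentional movement, which is presumably why Face Cap exposes it as a slider.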


By StyleMarshal - 6 Years Ago
Here is a test video I did some time ago, nothing cleaned.
The morphs are looking nice; only the "Facial Bone" jitters a little bit when the mouth is closed, but you can clean that up by deleting the "Muscle Keys" at the parts where the mouth is jittering.



You also have a Smooth Head function in the settings:

https://forum.reallusion.com/uploads/images/19d597f8-a1d8-4420-8d65-c95a.jpg

What's missing is a Curve Editor for facials; that's what we are waiting for.

You can also clean up or fine-tune the facials in Unreal Engine, or any other 3D app with a curve editor:




For me the Live Face recording works great, happy with it!
By Mikay² - 6 Years Ago
This does indeed look very promising! I hadn't noticed the smoothing button yet, thanks for pointing that out. The curve editor in Unreal is a great solution!

Thanks so much for putting the effort into giving me this answer! 
By Dmyricks - 6 Years Ago
I'm having a similar problem. I am getting jitter when I render. Is there any way to smooth this? I selected "Smooth Head" in Live Face, I changed the FPS, and I minimized the face controls, but I am still getting jitter. It looks more robotic than natural.
It's a Daz Gen 8 character.



Any advice?
Please help.
By StyleMarshal - 6 Years Ago
Are you capturing over WiFi?
By kungphu - 6 Years Ago
Never even dawned on me to use the Unreal curve editor (insert facepalm). Ugh... nice tip! Now I guess I'd better read up on the curve editor.
By Dmyricks - 6 Years Ago
Yes. I use Live Face on my iPhone over WiFi, connected to iClone via Link, and my laptop is also connected over WiFi.
I was not sure how to connect via USB. Will this help with the jitter?
By Eric C (RL) - 6 Years Ago
Hello @Dmyricks,

Here's the manual demonstrating how to connect to Live Face. https://bit.ly/3cHHLsN
Using a USB connection is highly recommended and it should solve your problem. :-)

By Dmyricks - 6 Years Ago
Connected the USB direct and problem fixed! Thank you so much!
By StyleMarshal - 6 Years Ago
https://forum.reallusion.com/uploads/images/faca5409-5093-4c5d-9ba3-f61f.gif