Motion capture with a video only


https://forum.reallusion.com/Topic555619.aspx

By mtakerkart - Last Year



I only tested the transfer into iClone. It works, but you need to import into UDK first, then into iClone.

CC4 recognizes the structure, but it's a mess.

https://forum.reallusion.com/uploads/images/3727e9c9-1288-4429-aa94-0d51.png


https://forum.reallusion.com/uploads/images/2190a1a1-f1ae-482c-b6fb-74c9.png

But it's OK in UDK:

https://forum.reallusion.com/uploads/images/ab6f9717-13e0-4a9a-ae56-7441.png

After characterizing in CC4, you can add the motion, which can then be applied to your custom character.

https://forum.reallusion.com/uploads/images/04a25a4a-1a3a-4390-9d03-9b8a.gif


https://forum.reallusion.com/uploads/images/787711a0-36a2-4090-ab75-5a33.gif

By 4u2ges - Last Year
Interesting. Did you make a .3dxProfile for importing FBXs exported directly from MoCapade into IC/CC?
By mtakerkart - Last Year
4u2ges (9/1/2024)
Interesting. Did you make a .3dxProfile for importing FBXs exported directly from MoCapade into IC/CC?



I don't have the option to export a 3ds file, only Unreal.

https://forum.reallusion.com/uploads/images/476f2319-275e-4666-84af-f58d.png

As I feared, there is a problem of foot sliding, and I cannot correct it with Motion Correction.
I deliberately kept my right foot as a pivot because, even with my Neuron mocap, it is a recurring problem.




By 4u2ges - Last Year
I was actually referring to a custom 3dx profile for direct import of MoCapade's Unreal FBX into iClone and CC.

But I made one anyway. Just drop it into C:\Program Files\Reallusion\iClone 8\Program\Assets\Share\CharaterizeProfiles
and C:\Program Files\Reallusion\Character Creator 4\Program\Assets\Share\CharaterizeProfiles.
Then it will be picked up and applied (after an IC/CC restart):

https://forum.reallusion.com/uploads/images/a4141aca-ed36-4038-b18d-f9b6.jpg
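
If you want to script that drop-in step, here is a minimal plain-Python sketch. The source filename is hypothetical, the destination paths are the ones quoted above (verify the exact folder names on your install), and copying into Program Files usually needs an elevated/administrator prompt.

import shutil
from pathlib import Path

# Hypothetical source file; adjust to wherever you saved the custom profile.
profile = Path(r"C:\Users\me\Downloads\MoCapade_Unreal.3dxProfile")

# Destination folders exactly as quoted above; verify the spelling on your install.
destinations = [
    Path(r"C:\Program Files\Reallusion\iClone 8\Program\Assets\Share\CharaterizeProfiles"),
    Path(r"C:\Program Files\Reallusion\Character Creator 4\Program\Assets\Share\CharaterizeProfiles"),
]

for dest in destinations:
    if dest.is_dir():
        shutil.copy2(profile, dest / profile.name)  # needs admin rights for Program Files
        print(f"Copied to {dest}")
    else:
        print(f"Folder not found, skipping: {dest}")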

Foot sliding is notorious for any mocap (not to mention mocap from a video).
Unless they come up with reliable AI foot planting, it's just a toy, or can only be used while hiding the feet with the camera.
Otherwise, a lot of manual work with reach targets is required (not with Motion Correction, which is another toy).
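
Not a fix, but if you want to put a number on the sliding, here is a rough plain-Python/numpy sketch. It assumes you can export per-frame world positions of a foot joint to CSV (x, y, z with y = up, in meters); the filename and the 8 cm contact threshold are made up.

import numpy as np

foot = np.loadtxt("left_foot_world_positions.csv", delimiter=",")  # hypothetical export, shape (frames, 3)

ground_contact = foot[:, 1] < 0.08                 # heuristic: foot lower than 8 cm counts as planted
horizontal_step = np.linalg.norm(np.diff(foot[:, [0, 2]], axis=0), axis=1)
slide = horizontal_step[ground_contact[1:]]        # per-frame horizontal drift while "planted"

print("frames in contact:", int(ground_contact.sum()))
print("total slide while planted (m):", float(slide.sum()))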
By mtakerkart - Last Year
Wow! Cool, it works. Where did you find it?

For the sliding foot I didn't find a decent tutorial. Only this one, which assumes that Motion Correction detects all the foot contacts.
But in my case it detects only 2 positions of my right foot.



By 4u2ges - Last Year
I made that profile. Almost any FBX file has a T-pose as its rest pose.
So I imported the FBX from MoCapade into Blender (exported from MoCapade with the Unreal option, without blend shapes), switched to the Rest pose, and exported without baking the animation.
Then I imported it into CC and, in Characterization, tweaked the bones to make the Unreal T-pose look more like an RL T-pose (the hardest part, specifically for the fingers).
Then I mapped the bones and saved the 3dx profile as a custom one.
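
For anyone who prefers to script that Blender step, here is a minimal bpy sketch of the same manual actions (import the MoCapade Unreal FBX, switch the armature to Rest pose, export without baking animation). The file paths are placeholders, and it assumes the FBX contains a single armature.

import bpy

# Hypothetical input path: the FBX exported from MoCapade with the Unreal option.
bpy.ops.import_scene.fbx(filepath=r"D:\mocap\mocapade_unreal.fbx")

# Put the imported armature(s) into Rest position (the "Rest pose" switch done by hand above).
for obj in bpy.data.objects:
    if obj.type == 'ARMATURE':
        obj.data.pose_position = 'REST'

# Re-export without baking the animation; hypothetical output path.
bpy.ops.export_scene.fbx(
    filepath=r"D:\mocap\mocapade_tpose.fbx",
    bake_anim=False,  # export the skeleton only, no baked animation
)

The re-exported FBX is then what goes into CC for the characterization step described above.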

Regarding foot slide, I have a more or less stable routine working with reach targets.
It is pretty similar to Motion Correction, only manual, so I have more control over it.
I use 2 identical characters with the same movement: I move the feet of one character mostly with L/R Reach objects (actively using transition curves)
while keeping the second character as a reference and trying to match the planted foot without sliding.

I do not have a video of the process itself, but I have a video of the result.
And yes, although it's time-consuming, the achieved result can be perfect, with zero sliding.


By AutoDidact - Last Year
I installed the desktop Meshcapade app on my iMac several months ago.
The FBX files retarget great with my Blender ARP rigs.

By mtakerkart - Last Year
4u2ges (9/2/2024)
I made that profile. Almost any FBX file has a T-pose as its rest pose.
So I imported the FBX from MoCapade into Blender (exported from MoCapade with the Unreal option, without blend shapes), switched to the Rest pose, and exported without baking the animation.
Then I imported it into CC and, in Characterization, tweaked the bones to make the Unreal T-pose look more like an RL T-pose (the hardest part, specifically for the fingers).
Then I mapped the bones and saved the 3dx profile as a custom one.

Regarding foot slide, I have a more or less stable routine working with reach targets.
It is pretty similar to Motion Correction, only manual, so I have more control over it.
I use 2 identical characters with the same movement: I move the feet of one character mostly with L/R Reach objects (actively using transition curves)
while keeping the second character as a reference and trying to match the planted foot without sliding.

I do not have a video of the process itself, but I have a video of the result.
And yes, although it's time-consuming, the achieved result can be perfect, with zero sliding.




Thank you again, now it's very easy to import into iClone.
Meshcapade is in beta, so I imagine they will refine the sliding issue.
There is still some root jittering, but the result is quite encouraging considering the simplicity of the mocap.


By sunnyviewtech - Last Year


With our monocular camera-based motion capture software, Dollars MONO,
you can also capture motion in real-time from a camera or video file and transfer the motion capture data to iClone.
We also support facial and finger capture.
By Kelleytoons - Last Year
I'd advise folks to try the freeware Rokoko Vision.

It works pretty solidly and, unlike a lot of other solutions, is COMPLETELY free (no limitations on anything). It will work with a video or a webcam.

(They DO have a paid version, but it works with two cameras for better capture. The single-camera version is pretty damn good, though. You should at least check it out - it comes into iClone just fine with no fiddling at all if you use the Mixamo export.)
By mtakerkart - Last Year
Kelleytoons (9/2/2024)
I'd advise folks to try the freeware Rokoko Vision.

It works pretty solidly and, unlike a lot of other solutions, is COMPLETELY free (no limitations on anything). It will work with a video or a webcam.

(They DO have a paid version, but it works with two cameras for better capture. The single-camera version is pretty damn good, though. You should at least check it out - it comes into iClone just fine with no fiddling at all if you use the Mixamo export.)


Thank you Kelleytoons. I did a trial and the one-cam version is a complete mess. The problem for me is that the 2-cam feature forces you to have a PC online on set, which is not my case.

Rokoko one cam



I sent them an email asking them to make 2-cam recording work offline.

I redid a test with Meshcapade, but this time with a more pronounced gait. A little bit better, but I REALLY need more explanation
of Motion Correction. The user manual doesn't help... Yes, the feet are not sliding any more, but now they are stuck in place like magnets.


By 4u2ges - Last Year
Have you looked at this?
https://courses.reallusion.com/home/iclone/character-motions?product=iClone&version=8&keyword=motion+correction&v=iclone-8-tutorial-motion-correction-for-footstep-and-handprint
By mtakerkart - Last Year


Thanks 4u2ges. It's clearer now. In their example, one click makes the result nearly perfect. In my case I have to rework all the footsteps and the result is so-so.
As I said, this AI tech is only ready for very pronounced steps. Maybe in a few months ;)
By 4u2ges - Last Year
I am surprised no one has come out with that yet. The issue of sliding/sinking feet is part of any mocap out there.
In the meantime, I would mostly rely on manual correction with reach targets. Yes, it takes time though.

This case was as bad as it can get and could not be fixed with Motion Correction.
I will explain later in a bit more detail how it was done, with some tips.


By 4u2ges - Last Year
Just a few details on fixing foot sliding/sinking for motions which cannot be effectively fixed with Motion Correction.
I hope someone finds it interesting and/or introduces their own alternative methods or tips to improve the workflow.
A mid-to-advanced experience level with animating on the timeline is required.

It's not actually "fixing", but rather recreating the motion for 4 bones: L/R Foot and L/R ToeBase.

The idea is to have 2 characters with the same motion at the same location in the project.
You then animate the 4 mentioned bones of one character without sliding, while keeping the second character as a reference for planting the feet.
During the process you will want to hide/unhide the reference character to keep a clear view of the main character whose motion you are trying to fix.

Animating L/R Foot

You do not actually animate the bones. You do it by animating Reach Targets attached to the L/R Foot nodes.
So you have to create 2 reach targets, place them on the timeline, and start animating so the feet of the target character follow the feet of the reference character.
You select an L/R Foot node and click Create Dummy in Edit Reach Target:
https://forum.reallusion.com/uploads/images/b67d26ae-e8be-4b7b-8397-255b.jpg

The biggest issue during animation is target rotation. The vast majority of human foot motions pivot on the toes,
but foot bones, and consequently reach targets, pivot on the heels.
So a slight turn with rotation requires repositioning the reach target by moving it to arbitrarily match the initial toe position,
and a positioning mismatch of even a tiny fraction of an inch results in sliding.

I came up with an unorthodox foot rotation by setting the pivot point of the Reach Targets to the location of the toes.
It feels a bit unusual at first that, while rotating the reach target around X, the Calf and Thigh bones get pushed up, but it's just a matter of getting used to it.
The benefits are obvious. Once the foot is rotated with the reach target, the toe stays firmly planted at the intended location with zero sliding,
and further tweaking and guessing of the foot position is NOT required. Making turns and twists becomes a snap (see the small math sketch after the videos below).
Here is how to set up the alternative pivoting of the feet Reach Targets:



Alternative method for setting a Pivot to toes



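To see why the toe pivot removes the sliding, here is a tiny sketch of the underlying math in plain Python/numpy (not iClone code, and the foot coordinates are made up). Rotating a set of points about a pivot leaves the pivot exactly where it was, so a pivot placed at the toe means the toe cannot drift no matter how far the foot rotates.

import numpy as np

def rotate_about_pivot(points, pivot, angle_deg):
    """Rotate 2D points (side view: forward/up) around 'pivot' by angle_deg."""
    a = np.radians(angle_deg)
    rot = np.array([[np.cos(a), -np.sin(a)],
                    [np.sin(a),  np.cos(a)]])
    return (points - pivot) @ rot.T + pivot

heel = np.array([0.00, 0.05])   # made-up heel position (meters)
toe  = np.array([0.25, 0.00])   # made-up toe position
foot = np.vstack([heel, toe])

# Heel pivot (default): the toe moves, i.e. it slides along the ground.
print(rotate_about_pivot(foot, pivot=heel, angle_deg=20)[1])

# Toe pivot (alternative): the toe maps exactly onto itself - zero sliding.
print(rotate_about_pivot(foot, pivot=toe, angle_deg=20)[1])   # -> [0.25, 0.0]

The second print returns exactly the original toe position, which is the zero-sliding guarantee you get for free from the toe pivot.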
So animating Reach Targets for the feet (especially with the alternative pivots) is nothing really special.
You make a pose-to-pose animation, moving the foot from one pose on the ground to another,
and fill in the blanks in between (such as raising the foot and rotating it up/down for steps) while trying to mimic the foot movement of the reference character.

Animating L/R ToeBase

This has to be done with the character bones. As a matter of fact, you only need to animate ToeBase\Rotation X.
But first, you need to remove ALL ToeBase bone animation from the given clip.
It is jittery most of the time (especially having Y and Z rotation where there should be none on those axes).

Sample Selected parts
https://forum.reallusion.com/uploads/images/c3bd27b0-663e-4ea0-8ac6-d30a.jpg

Then open the Curve Editor, select ALL keys for the ToeBase bones, and delete them.
https://forum.reallusion.com/uploads/images/9a8e58d4-7357-4255-8d0f-282e.jpg

After that, you may Flatten the clip back.
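
For the curious, the same clean-up can be looked at numerically outside iClone. A small plain-Python/numpy sketch is below, assuming you exported the toe bone's Euler rotation curve to a CSV as frames x [X, Y, Z] in degrees (the filenames are made up). It just inspects the stray Y/Z jitter and wipes the channels, which is the numeric equivalent of deleting all the ToeBase keys before re-keying Rotation X by hand.

import numpy as np

# Hypothetical export of CC_Base_L_ToeBase Euler rotations, shape (frames, 3) = [X, Y, Z] in degrees.
toe_rot = np.loadtxt("cc_base_l_toebase_rotation.csv", delimiter=",")

# Y and Z should be (near) zero for a clean toe hinge; anything else is jitter.
print("max |Y| jitter (deg):", np.abs(toe_rot[:, 1]).max())
print("max |Z| jitter (deg):", np.abs(toe_rot[:, 2]).max())

# Equivalent of deleting every ToeBase key: wipe the curve, then re-key Rotation X by hand.
toe_rot[:, :] = 0.0
np.savetxt("cc_base_l_toebase_rotation_clean.csv", toe_rot, delimiter=",")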

You animate the ToeBase bones with X rotation where appropriate while making the pose-to-pose Reach Target animation for the feet.
You may also animate the toes afterwards, once you complete the Reach Target animation for the feet; in that case, slight tweaking of the Reach Targets will most likely be required in the process.
You have to use Edit Motion Layer with Edit Body Parts mode.
Try to mimic the reference character, or, if the toe bending is completely messed up (like in the case of the boxer animation in my previous post), try to make the toes perform like in real life.

Keep the Motion Layer clean. Do not animate anything other than the ToeBase bones.
It's very easy to lose track of what's happening on the motion layer if you start fixing other parts; you can do that later.
The only possible exception is the Hip bone (Pelvis). For rough mocaps you may get inadequate hip movements that prevent planting the feet properly,
so you might have to tweak the Hip bone as well.

I found it's good practice to clean up unnecessary keys from the timeline.
Every once in a while I go to the Curve Editor, select and delete all keys, leaving only:
CC_Base_R_ToeBase \Rotation X
CC_Base_L_ToeBase \Rotation X

This is how my timeline looked at the end from the Curve Editor's perspective (only 2 curves):

https://forum.reallusion.com/uploads/images/5cb18ec3-8241-4d31-9e08-be0b.jpg

At the end, when all is done and you are happy with the result, go ahead and bake with the Flatten All Motion with Constraint option. Delete the reach dummies once the motion is baked.

It might seem overly complicated, but it's actually not. Once you get a few poses done, you will come up with a steady routine and your own shortcuts for animating the Reach dummies and Toe bones on the timeline.
It's also good practice for keeping your manual animation skills sharp.
By mtakerkart - Last Year
Thank you 4u2ges, very instructive. But too time-consuming for me ;)
JSFILMZ made a new video with the mocopi kit, which can record in BVH format. It seems quite good. Maybe it could be better than Motion Correction...


By AutoDidact - Last Year
JSFILMZ made a new video with the mocopi kit, which can record in BVH format. It seems quite good. Maybe it could be better than Motion Correction...


JSFILMZ recommends exporting from UE5 to FBX for iClone with his mocopi setup.
BVH is typically a horrible format unless it is bespoke to your specific character that was retargeted in iClone or MotionBuilder.
I suspect that a raw generic BVH from mocopi will have the same foot sliding (or worse), and will still require the type of tedious manual fixing that 4u2ges had to perform.
By 4u2ges - Last Year
Yes, I am afraid there will be a need for some sort of manual surgical procedures on mocap motions until AI figures it all out.

Just one more note about the routine I posted.
In some cases you might not even need to animate the toe bones at all (only the dummy objects for the feet).
The alternative toe-pivoting method I posted there allows you to use the Foot Contact feature to drive the toe bones automatically.
Foot Contact has its own quirks and bugs, but it is still workable.

The best part is that, as I learned recently, the deformation from the Foot Contact feature gets baked along with all the other constraints,
meaning there is no dependency on this iClone feature when you export the motion.
By mtakerkart - Last Year
AutoDidact (9/5/2024)
JSFILMZ made a new video with the mocopi kit, which can record in BVH format. It seems quite good. Maybe it could be better than Motion Correction...


JSFILMZ recommends exporting from UE5 to FBX for iClone with his mocopi setup.
BVH is typically a horrible format unless it is bespoke to your specific character that was retargeted in iClone or MotionBuilder.
I suspect that a raw generic BVH from mocopi will have the same foot sliding (or worse), and will still require the type of tedious manual fixing that 4u2ges had to perform.


I've never had a horrible problem with a BVH file in iClone.
The import process into UDK shown by JSFILMZ requires NASA-engineer skills, plus a PC and Wi-Fi to stream the data, which adds a source of recording error...
mocopi had an update one year ago. What interests me about mocopi is that it uses Bluetooth to record the BVH file on a cell phone,
so you can record anywhere with no distance limit.

By mtakerkart - Last Year
4u2ges (9/5/2024)
Yes, I am afraid there will be a need for some sort of manual surgical procedures on mocap motions until AI figures it all out.


I think it's just a root-positioning issue that their AI is unable to handle.
If we look at the rest of the body, it's very faithful, and with just 1 camera!!!
Maybe it should use the iPhone's depth (z-buffer) sensor data...


By AutoDidact - Last Year
The import process into UDK shown by JSFILMZ requires NASA-engineer skills, plus a PC and Wi-Fi to stream the data.



LOL!! :D
I really do not get why people insist that Unreal Engine is a good choice for animated filmmaking with characters.
What the hell good is “realtime” when you have to do 15 complicated steps before your character can even start moving?
By mtakerkart - Last Year
I will give the dual-camera Rokoko Vision a try next week. There's a 14-day trial. Let's see what it gives...

https://www.rokoko.com/products/vision
By WesleySales3d - Last Year
For single-camera mocap solutions, I recommend:


Another program I also RECOMMEND is Cascadeur; it also cleans up mocap and generates medium-quality animations from videos.
https://cascadeur.com/
https://www.youtube.com/watch?v=euMb627cF9I

I don't like Rokoko's camera-based solutions; I think they're too weak in terms of precision, and it takes a lot more work to adjust everything manually. Not to mention that the 2-camera version needs calibration.
By jeffkirkland - Last Year
sunnyviewtech (9/2/2024)

With our monocular camera-based motion capture software, Dollars MONO,
you can also capture motion in real-time from a camera or video file and transfer the motion capture data to iClone.
We also support facial and finger capture.


I like Dollars MONO enough to spend money on it. Its live link to iClone makes it very convenient for some quick motion capture, but I tested them side by side and Meshcapade's mocap gave a much more accurate result in terms of figuring out occlusion, although everything I've tried in Meshcapade tends to be off the ground and leaning forward.
By mtakerkart - Last Year
I did a single test with Rokoko Vision, and it is the most precise I have found for the placement of the feet.
There are still some strange movements. I bought the webcams they recommend, but they require a distance of 11 feet!!
So if you have to act in an 8x8-foot square, you need a room of roughly 20x20 feet (8 feet of acting area plus roughly 11 feet of camera distance on each axis)!!!
I will do another test on Friday in a larger room.
It is imperative to have a PC (it doesn't need to be powerful), an internet connection, and to print the documents to calibrate the cameras.


By sunnyviewtech - Last Year

I like Dollars MONO enough to spend money on it. Its live link to iClone makes it very convenient for some quick motion capture, but I tested them side by side and Meshcapade's mocap gave a much more accurate result in terms of figuring out occlusion, although everything I've tried in Meshcapade tends to be off the ground and leaning forward.


Thank you, sir. We are continuously improving the capture quality.

By mtakerkart - Last Year
OK guys, back to the topic. Something cool might be happening...

I tried Rokoko Vision with the 2 cameras advised by Rokoko. Clearly not usable. And as I said, it needs a very large set to be seen entirely by the cameras.



I tested Move.AI this morning (yes, those are Christmas pants :P) and I'm very, very happy with this tech. Only one iPhone, and no root drift or strange behavior.
The feet need a very small tweak, but I'm sure that 2 iPhones will solve this defect. This is RAW data.


By mtakerkart - Last Year
Hehehe! Me again :P

I got the mocopi mocap system and was pleasantly surprised by the quality of the mocap.
The video below is a raw BVH file and very easy to import. I get better results than with the Neuron suit V1 and Rokoko.



By Kalex - Last Year
Hey! Never heard of mocopi. Is it a mocap solution that allows capturing motion from a video?

You might also be interested in this. I have no idea when they plan to release it though.


By mtakerkart - Last Year
Kalex (10/21/2024)
Hey! Never heard of mocopi. Is it a mocap solution that allows capturing motion from a video?

You might also be interested in this. I have no idea when they plan to release it though.





mocopi is this:

https://electronics.sony.com/more/mocopi/all-mocopi/p/qmss1-uscx?srsltid=AfmBOoolG-u89KJgKKvJhQAuiAvcvwTqiZEmgj7cz0a2-5mvbc4L20p0

Yes, I had seen the Microsoft project, but it is a prototype and is not accessible at the moment ;)
By toystorylab - Last Year

Interesting! If I got it right, no camera setup is needed at all?
Just strap those thingies on and record wherever you are, no matter how much space you have?
I assume sitting on a chair works fine too?
By mtakerkart - Last Year
toystorylab (10/21/2024)


Interesting! If I got it right, no camera setup is needed at all?
Just strap those thingies on and record wherever you are, no matter how much space you have?
I assume sitting on a chair works fine too?


Indeed, there is no need to film. The iPhone is used to record the mocap via Bluetooth,
so if you carry it on you there is no limit to the movement.
In addition, I find the hardware very solid and really easy to set up.
There is also a function to lock the hips for the sitting position, but I did not use it in my test and the result is very acceptable.