sencorpx
Posted 4 Years Ago
Group: Forum Members
Last Active: Last Year
Posts: 28, Visits: 74
I have a couple of questions if anyone has an answer. I recently picked up the Xbox One Kinect plugin. After a few days of research, testing, and asking questions, I have it up and running. Applying motion data directly to a character that has already been loaded with both a voice script and facial animation is fantastic. While doing the research, I found that iPi also supports Xbox 360 as well as Xbox One sensors, and that you can set up several motion capture sensors to get higher-accuracy tracking. It can even capture fingers and facial expressions. That said, getting captures from that software loaded into iClone takes a bit of work and crossing platforms such as Blender.
What is the path moving forward for the iClone plugin? Will there be options to add more sensors, or to capture more movement, like fingers for example? The downfall of the plugin is that it only captures from one view, so the outcome is very glitch-filled, with small jerks and twitches in the arms, wrists, etc. Is there a way to smooth those glitches?
Thanks in advance for the read.
Kelleytoons
Posted 4 Years Ago
Group: Forum Members
Last Active: Last Year
Posts: 9.2K, Visits: 22.1K
While it's *possible* RL will answer you, I'm guessing not. In any case, there won't be any more development of the Kinect plugin; I know I've read this on the forum at least a few times (but Peter can chime in to confirm). If you are totally sold on that route, you will either have to accept the limitations it has or try to use other outside solutions (like iPi Soft). However (and I don't want you to shoot the messenger; this is just an observation born of experience, and like most such observations the truth it contains will almost surely be dismissed until you find out the hard way), eventually you will move to a physical mocap solution. It just ALWAYS happens. Optical solutions have limitations that will never be overcome, and the more you expect out of your mocap (and you WILL expect more), the more you will discover you need to move on. My best advice is NOT to invest too much money in any other solution until then (so stay with what you have now, and when it's time to move up you will know it).
Alienware Aurora R16, Win 11, i9-14900KF 3.20GHz CPU, 64GB RAM, RTX 4090 (24GB), Samsung 870 Pro 8TB, Gen3 NVMe M.2 SSD 4TBx2, 39" Alienware Widescreen Monitor
Mike "ex-genius" Kelley
sencorpx
Posted 4 Years Ago
Group: Forum Members
Last Active: Last Year
Posts: 28, Visits: 74
Oh, for sure. A pro motion capture system is always the best way to go. Much of my animation is done in small settings, so for now it's not too bad using the one sensor (as you pointed out, limitations). I actually made the mistake with the sensors and purchased the 360s over the One. It was not a total loss, it turns out: I have been able to turn the three 360s into a great 3D scanner for 3D-printing props for the real world and for importing into 3D animations.
That being said, my second question: any advice on smoothing some of the movements? Scrubbing them? I have downloaded the Curve Editor but have yet to find a good tutorial that explains just how to use it with slightly shaky mocap from the plugin.
Cheers.
Kelleytoons
Posted 4 Years Ago
Group: Forum Members
Last Active: Last Year
Posts: 9.2K, Visits: 22.1K
Here's one of the good ones: https://manual.reallusion.com/Curve_Editor_Plug_in/ENU/Default.htm#Curve_Editor_1/Curve_Editor/Application_Mocap_Data_Cleaning.htm%3FTocPath%3DApplication%2520Examples%7C_____4
There's also a way of just compressing and expanding the timeline, but at the moment I've had a *bit* too much alcohol to explain further. If you remind me tomorrow, I'll lay it out for you.
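For anyone reading along: the jitter cleanup that tutorial covers amounts to low-pass filtering the keyframe curves. Here's a minimal standalone sketch of the idea in Python, using a centered moving average on made-up wrist-rotation samples. This is just an illustration of the concept, not the iClone Curve Editor API.

```python
# Moving-average smoothing for a jittery mocap channel
# (e.g., a wrist rotation angle in degrees, sampled per frame).
# Hypothetical data; not tied to any iClone API.

def smooth(curve, window=5):
    """Return a smoothed copy of `curve` using a centered moving average.

    `window` should be odd. Larger windows remove more jitter but also
    soften intentional motion, so keep it small for mocap cleanup.
    """
    half = window // 2
    smoothed = []
    for i in range(len(curve)):
        # Clamp the averaging window at the ends of the clip.
        lo = max(0, i - half)
        hi = min(len(curve), i + half + 1)
        smoothed.append(sum(curve[lo:hi]) / (hi - lo))
    return smoothed

# A steady 90-degree wrist pose with single-frame twitches:
noisy = [90, 90, 97, 90, 90, 83, 90, 90]
print(smooth(noisy, window=3))
```

The single-frame spikes (97 and 83) get pulled back toward 90, at the cost of slightly blurring neighboring frames; that trade-off is why the window size matters.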
Alienware Aurora R16, Win 11, i9-14900KF 3.20GHz CPU, 64GB RAM, RTX 4090 (24GB), Samsung 870 Pro 8TB, Gen3 NVMe M.2 SSD 4TBx2, 39" Alienware Widescreen Monitor
Mike "ex-genius" Kelley