infoxiasma
Posted 4 Years Ago

Hi all, I wonder if anyone here has used a setup similar to the one I'm intending to buy this week? We want to create avatars that will be exported for use in Unity. For that side, we're thinking Character Creator 3 Pipeline, or CC3 for iClone + 3DXchange Pipeline, and we'll add in Headshot + SkinGen.

We also want to motion capture. We will use Leap Motion (actually the IR 170) + ARKit on an iPad Pro + Kinect 2. A bit "domestic" rather than "pro", but this will do the job for 3-6 months, and then we're likely to go Qualisys (we will have access to a studio and hardware). On the trial products I've got Leap Motion and ARKit working together at the same time, and I've got Kinect working, but the trial won't let me try all three together. Note that ARKit and Leap Motion run through the Motion LIVE plug-in whereas the Kinect runs through the Kinect Mocap Plug-in. Does anyone foresee a problem with this setup?

We will maybe hand-correct animations in Blender, unless Reallusion's suite will do a better job? I say Blender because our animator is very familiar/skilled there, but she's happy to learn new things.

And what about exporting the animations and characters? Is this everything I'll need to get my animated avatars into Unity and Blender?

Thanks all

planetstardragon
Posted 4 Years Ago

I'd start small and slowly add things to what I can integrate into my current system; there's a huge ecosystem of existing mocap that's either free or very inexpensive to acquire and modify. From my own experience setting up systems, both in music and now in my new CG life, there's always new tech coming out, and slowly ingesting new stuff into my existing workflow is a more efficient upgrade than stopping work just to learn a new system that may change in a month. I don't even want to think about the money I've spent on gear and software I've hardly used because by the time I learned it, something newer and better had just come out. It never ends!

If you plan to go hard with iClone products, then you want CC3 for character creation, iClone for all things animation, and 3DXchange for non-standard importing and exporting. By non-standard I mean features not found in CC3: 3DXchange allows you to do things manually that can't be achieved "automatically" in CC3 and allows more custom detail production in the items, i.e. importing SketchUp models and breaking them down into small parts, or importing some obscure model made in some obscure software with a triple rig that CC3 won't recognize and putting spring effects on its nose or whatever.

From what you are saying about the mocap setup, I'd focus on building a library of stock motions while you wait for access to the larger, better mocap system. My library is currently at around 10k mocaps; there's not much left for me to mocap lol, nor do I plan on learning monkey kung fu and buying a 2k mocap set to do something I can easily keyframe or piece together from my library. I'd rather organize my library and then buy what I need to fill in the blanks afterwards. Building a business these days is hard enough without wasting money on something that will also waste time in learning, only to be replaced faster than it took you to learn it lol.

Good luck in your adventure - sounds like a fun one! My 2 cents.
☯🐉 "To define Tao is to defile it" - Lao Tzu

Peter (RL)
Posted 4 Years Ago

Hi... Character Creator 3 Pipeline will allow you to create characters and export them with animations to almost any other software. We have export presets for software like Unity, Unreal, Maya, Max, etc., or you can choose custom export options if you are using other software not covered by a preset.

If you want to create animations for your character and do motion capture, then you will also need iClone 7, the Motion LIVE 3D plug-in, and the Gear Profiles for the hardware you plan to use. Apart from the Kinect plug-in, all mocap gear profiles work together in Motion LIVE. With Kinect you will have to record your body animation separately, as this older device is not supported by the Motion LIVE system.

3DXchange 7 Pipeline is an optional extra; whether you need it will depend on whether you also want to export props and other non-standard characters. It can be a helpful tool to have, but if you only want to export characters and animations then you can do all exporting directly from Character Creator 3 Pipeline.
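For the Blender side mentioned in the original question, the Pipeline export arrives as a standard FBX, so a minimal Blender Python sketch for bringing a character and its animation in for hand-correction might look like the following (the file path is just a placeholder, and the import settings may need tweaking for your rig):

```python
import bpy

# Placeholder path: point this at the FBX written by Character Creator 3 Pipeline.
FBX_PATH = "/path/to/exports/avatar_with_animation.fbx"

# Import the FBX; automatic bone orientation usually gives Blender a better
# guess at bone axes for game-style skeletons, and use_anim brings in the takes.
bpy.ops.import_scene.fbx(
    filepath=FBX_PATH,
    automatic_bone_orientation=True,
    use_anim=True,
)

# Print what arrived so the animator can confirm the armature and actions.
for obj in bpy.context.selected_objects:
    print(obj.name, obj.type)
for action in bpy.data.actions:
    print("action:", action.name, "frame range:", tuple(action.frame_range))
```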
Peter Forum Administrator www.reallusion.com

infoxiasma
Posted 4 Years Ago

Thank you @planetstardragon.
I think you speak a lot of sense and that route really would be my first choice, but many of the motions I need to capture are simply not in libraries where I can purchase them: Cued Speech and various sign languages (not just American Sign Language), for instance. I would love to be wrong about the availability of off-the-peg mocap for this! If I am wrong, please show me; I will be extremely grateful.

infoxiasma
Posted 4 Years Ago

Hi Peter,
Thank you for your guidance. When you say "with Kinect you will have to record your body animation separately as this older device is not supported by the Motion LIVE system", how will this work in practical terms? My expectation was that I would load up iClone 7, run the Motion LIVE plug-in for ARKit and Leap Motion, start it recording, and then start the Kinect Mocap plug-in and begin Device Mocap recording in iClone. Is this not possible? The trial of Motion LIVE won't record, so I can't see how it operates. I do see that in the trial of Device Mocap (Kinect), when I start recording, the rest of iClone is "locked" (the other controls, like Motion LIVE, are disabled). Is there no reasonable way to run all three mocaps together (and see the result on the avatar)? I need to watch the mocap as it happens, because the Leap Motion loses sight quite often and then we'll have to retake.
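In case it helps to show what I'm imagining: if the two recordings really do have to be made separately, I'd expect to stitch them together downstream, e.g. in Blender, roughly like this sketch. It assumes both takes were exported as FBX onto the same CC3 skeleton and imported as two actions; the action names and bone-name filters are guesses rather than real CC3 names, and ARKit facial data would live on shape keys rather than bones, so it isn't covered here.

```python
import bpy

# Guessed action names from two separate imports of the same skeleton.
body_take  = bpy.data.actions["Kinect_Body_Take"]   # hypothetical name
hands_take = bpy.data.actions["Leap_Hands_Take"]    # hypothetical name

# Placeholder bone-name filters; check the real bone names in the export.
HAND_PREFIXES = ("CC_Base_L_Hand", "CC_Base_R_Hand",
                 "CC_Base_L_Finger", "CC_Base_R_Finger")

def bone_name(fcurve):
    """Return the pose-bone name an f-curve animates, or None."""
    path = fcurve.data_path
    return path.split('"')[1] if path.startswith('pose.bones["') else None

# Drop any hand channels already present in the body take...
stale = [fc for fc in body_take.fcurves
         if bone_name(fc) and bone_name(fc).startswith(HAND_PREFIXES)]
for fc in stale:
    body_take.fcurves.remove(fc)

# ...then copy the hand channels across from the Leap Motion take.
for fc in hands_take.fcurves:
    bone = bone_name(fc)
    if bone and bone.startswith(HAND_PREFIXES):
        new_fc = body_take.fcurves.new(fc.data_path, index=fc.array_index,
                                       action_group=bone)
        for kp in fc.keyframe_points:
            new_fc.keyframe_points.insert(kp.co.x, kp.co.y)
```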

planetstardragon
Posted 4 Years Ago

For a specialized collection like that, you may have some luck contacting universities like the Max Planck Institute for Psycholinguistics https://archive.mpi.nl/mpi/islandora/search/motion%20capture?type=dismax
or CMU's Carnegie Mellon University Motion Capture Department http://mocap.cs.cmu.edu/
Both would have an educational interest in such a collection. Some may have an existing collection that they would license to you for a fee, and some may give you a chance to pitch your project, maybe even with funding for mutual benefit - at least that's how I would think about it, with the uniqueness of your project as a selling point.

If you want to rough it, iClone itself has some useful hand puppet tools / plug-ins that can help make the process faster and easier. The best feature, for me at least, is that iClone gives you tools to draft up quick animation scenes that would normally be tedious and time-consuming in other software, so you draft something in iClone fast and iron it out in Blender. The RL tools have a good grip on taking the "tedious" and "time-consuming" out of animation. They give you a chance to make things really complex too, but being able to draft ideas fast is priceless.

The pose manager is really cool too:
https://marketplace.reallusion.com/pose-manager
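And just to illustrate the "iron it out in Blender" part of that workflow: a rough sketch like the one below (the armature name is a placeholder, and the window size is something to experiment with) can knock the jitter off the hand channels after you import a draft.

```python
import bpy

def smooth_fcurve(fcurve, window=2):
    """Moving-average smoothing of keyframe values, to reduce mocap jitter."""
    keys = fcurve.keyframe_points
    original = [kp.co.y for kp in keys]
    for i, kp in enumerate(keys):
        lo = max(0, i - window)
        hi = min(len(original), i + window + 1)
        kp.co.y = sum(original[lo:hi]) / (hi - lo)
    fcurve.update()

# Placeholder object name: the armature that came in with the mocap import.
arm = bpy.data.objects["CC_Avatar"]
action = arm.animation_data.action

# Only touch the hand and finger channels; leave the body take alone.
for fc in action.fcurves:
    if "Hand" in fc.data_path or "Finger" in fc.data_path:
        smooth_fcurve(fc)
```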
☯🐉 "To define Tao is to defile it" - Lao Tzu

infoxiasma
Posted 4 Years Ago

That does look rather cool, but it means we'll have to animate everything, even if the animation process is significantly improved. And I wonder how it would perform on movements like touching thumb tip to little finger tip or crossing fingers?
Thank you for those links, much appreciated.

planetstardragon
Posted 4 Years Ago

My pleasure, I enjoy answering questions and helping because I learn more and get more ideas for myself in the process hehe! I suspect the Pose Manager may be a helpful tool for you if you can make a quick library of preset moves. So, for example, you grab an idle motion, you tell iClone to keep the rest of the body motion but erase just the arms, then you focus on the arm movements and paste your customized hand gestures from the Pose Manager plug-in. You could even get creative, like removing the arm motions from a dancing motion and having the character talk while dancing. I had a friend who loved music even though he was deaf; he felt the pulses of the music and loved reading the lyrics.

The idea would be to make a library of presets you can cut, paste, or merge with other existing motions. Between small saved clips and the Pose Manager plug-in, you should be able to develop a system that lets you create new, unique, and complex animations quickly by piecing existing segments together.
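If you end up doing the same trick on the Blender side rather than with the Pose Manager, the idea sketches out something like this (the armature name and bone-name filters are placeholders, and it assumes the finger bones use the default quaternion rotation mode):

```python
import bpy

# Placeholder armature name; swap in whatever the CC3 import is called.
arm = bpy.data.objects["CC_Avatar"]

# Rough filter for the hand/finger bones; match it to the real rig names.
HAND_BONES = [b.name for b in arm.pose.bones
              if "Hand" in b.name or "Finger" in b.name]

def capture_hand_pose():
    """Snapshot the current hand/finger rotations as a reusable preset."""
    return {name: arm.pose.bones[name].rotation_quaternion.copy()
            for name in HAND_BONES}

def apply_hand_pose(preset, frame_start, frame_end):
    """Key a stored handshape over a frame range of the current action."""
    for name, quat in preset.items():
        bone = arm.pose.bones[name]
        bone.rotation_quaternion = quat
        for frame in (frame_start, frame_end):
            bone.keyframe_insert("rotation_quaternion", frame=frame)

# Usage: pose the hands once (say, a fingerspelling handshape), snapshot it,
# then stamp it onto a stretch of an idle or walking take.
handshape = capture_hand_pose()
apply_hand_pose(handshape, frame_start=10, frame_end=40)
```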
☯🐉 "To define Tao is to defile it" - Lao Tzu

infoxiasma
Posted 4 Years Ago

Thank you again :)
I am now thinking that going with the Perception Neuron V2 (body and gloves), but only using the body capture (instead of Kinect), should do what we want? It's an extra $2,500 I didn't want to spend, but it means we still get good face and finger capture, and we can experiment with the gloves to see if we can ultimately get something as good or better in there.
This really is to last us 6 months or so until we get (hopefully) unfettered access to a Qualisys rig.

planetstardragon
Posted 4 Years Ago

I don't have that setup so I wouldn't be able to comment, but I do know that capturing small details like fingers can be a challenge for most systems. I'd seek out other users of that system and ask them how much clean-up they have to do after capture - maybe even contact the mocap developer themselves and see if they have any specialized solutions for your needs.
☯🐉 "To define Tao is to defile it" - Lao Tzu