Third (last?) test of Orion Software/VIVE Hardware


https://forum.reallusion.com/Topic372387.aspx

By Kelleytoons - 6 Years Ago
Yesterday I spent about five minutes capturing (and then less than 20 minutes total in iClone) creating this possibly last test (because I can't see not keeping the hardware and buying the software).  I was just fooling around trying to see how long it might take me to generate some crowd-type scenes (answer -- a ridiculously small amount of time.  I was actually pretty foolish in that I kept getting up off my sofa and moving down, being a real idiot in not thinking I could just cut the clip into pieces.  Sigh -- I'm old).

I also experimented with "picking up" an object -- I will need a better focus point to do this, as I was a bit off on my second effort around the non-existent table (to be fair, when I made my second pass one of my cats decided it was a good time to visit me.  I can actually tell when she rubs against my foot, as it slips a tiny bit then.  I think I'll need to shut them out when I do critical captures).  And I'm still not doing the hands right -- when I was pantomiming picking the object up I forgot about holding the controller in my off hand correctly, so it looks a bit disjointed.  I have to get used to holding them straighter.  Also, I'm still having issues with the Leap controller.  Sigh -- there really needs to be a countdown before it starts the capture.

Finally, I just substituted Zane in there for the second test.  In retrospect this doesn't work well, as his proportions don't match up with the android.  So the hands slip around a lot (remember -- I didn't edit ANY of the motions.  The hands appear to "slip" but in reality it's just the vase that's moving around in them.  That's the only animation that I actually did on the timeline.  But for Zane I would have needed to redo that animation, because I don't even think his hand pivot point is the same, so when he "picks up" the vase it doesn't look nearly as convincing).

Again, very, very little time spent doing any of this.  To clean up the animation I really should take at least another 10-15 minutes but since I'm not going to use any of this for anything I don't care.




By justaviking - 6 Years Ago
Wow - that was a great test.
Thank you for sharing it.

I find myself thinking old thoughts, and that is about the feasibility of collaborating with other animators to work as a team.  To what extent could we have one person do motion capture for the body, another do the facial animation capture to get the expressions, another focus on the hands, a lead animator to put it all together, and a director to oversee it all?  Nobody would need to buy all the tools.  That remains a dream of mine, to attempt a project like that.  It would also include a set designer, character creation, prop modeling and texturing, lighting, music, sound effects, voices, special effects (like PopcornFX, etc.), compositing, writing, and final editing.  I've made meager attempts in the past.  Maybe it's time to try again, largely as a "test of what's possible" to start with.

But back to KT's test, I thought it was very good.  The picking up of the vase was well done.
By Kelleytoons - 6 Years Ago
Thanks, Dennis.

You probably already know, but anytime you need some mocap done I'd be glad to help.  I'm not so sure I want to commit to a large project per se, but since (now) it's so much fun to do mocap of any of the three kinds (body, face and hands) I would see it as more "play" than work.

Along those same lines (sort of), what occurs to me is the idea of sharing mocap, because folks move differently.  I can see myself in the movements here clearly, and while I'm an actor (sort of) and can alter my walk and movements (kind of), I doubt whether I can do a female convincingly.  So this weekend I'm going to get my wife to do some movements (watch it, guys!  No rude remarks, please :>) and see how they will differ and be used.  Luckily she's an actress (much more than I'm an actor -- she was the lead in the first musical I directed, which is how we met) and she does like doing stuff like this (so she won't take a lot of convincing, although putting on the apparatus is going to be a bit odd for her).  In any case, I could see people "trading" movements just to get some variety (and luckily these motion files are tiny, so sharing is no problem at all).
By sonic7 - 6 Years Ago

He he .... loved your comments here Mike (about the wife and all) .... very funny.
Glad to hear she's an actress and keen to do this with you! .... :P

Edit:
....  only just noticed this is another video test - hey ---- this is great stuff Mike !!! 
Wow - such potential ! ....
By justaviking - 6 Years Ago
Thanks for the reply, Mike.

Yes, with motion capture files, I think a "team" approach is becoming more feasible since we don't need to struggle with everyone sharing a single iClone project file.

So... you are not going to strap on the harness and do your best "female" strutting around the room?  You could also video it and post it on YouTube.  It could be very funny.  But I also think I would be very concerned about not being able to unsee it.  So yeah, maybe not.
By Kelleytoons - 6 Years Ago
I might mince a bit for a comic effect.  But to be a convincing female?  Not a chance (because maleness and femaleness are biologically determined).

The nice thing about getting my wife involved is she won't freak out when she sees me doing mocap (after she's experienced it -- so far I've only done it when she isn't home).  I might post one last test shot just to show how much difference there really is (there's a reason Andy S. gets the big bucks -- performing well for mocap is a definite art).
By argus1000 - 6 Years Ago
The motion capture of the body itself is good.  Not the hands.  Not the picking up of the object.  If you do another test, I would appreciate it if it were more finished, if you can.


By Kelleytoons - 6 Years Ago
As I said, I wasn't holding the controllers properly, but I'm also not going to do much more testing now, since I'm sure I'll keep it.

Picking up an object is really more a question of animating the object than of animating the hand.  You can see this clearly when I switch from the android to the human.  While in neither case is it perfect, it doesn't look at all reasonable with the human even though it's *exactly* the same motions.  The reason it doesn't is that the pivot points on the hands aren't right.  To get it correct you would need to adjust that object (particularly the transfer).  I also think that it's possible to get a better "transfer" by doing the old "duplicate the object" trick (where you have one object that sits there, and as you transfer control you just hide/unhide the dupes accordingly).  But while I may experiment with that myself it's not really the point of these tests (which were to convince me whether I wanted to keep the hardware and buy the software).
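For anyone who hasn't used that trick: it really is just two copies of the prop and a pair of visibility keys that flip at the pickup frame.  A rough sketch of the idea (the object names and frame numbers are made up; in iClone you'd set these Show/Hide keys on the timeline by hand):

```python
# The "duplicate the prop" hand-off sketched as data: one vase parented to the
# table, one parented to the hand, and visibility keys that flip at the pickup
# frame.  Names and frame numbers are hypothetical, for illustration only.

PICKUP_FRAME = 240  # hypothetical frame where the hand closes on the vase

def handoff_keys(pickup_frame):
    """Return (object, frame, visible) visibility keys for the swap."""
    return [
        ("vase_on_table", 0, True),
        ("vase_on_table", pickup_frame, False),
        ("vase_in_hand", 0, False),
        ("vase_in_hand", pickup_frame, True),
    ]

for obj, frame, visible in handoff_keys(PICKUP_FRAME):
    print(f"frame {frame:4d}: {obj} -> {'show' if visible else 'hide'}")
```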

The weakest thing about all of these tests (all three of them) is the fingers -- I'm still having a tough time getting the Leap controller to work well with LIVE.  The biggest issue is it takes time to set up, so no matter what you do there is a bit of roughness when you first start out.  I suppose that might be possible to edit out but I haven't looked at that at all.  Ideally you'd record some motion with a lead-in time (actually, ideally RL would have written the software correctly to have a countdown so you can set it up properly).  Now, the fingers would have added a lot to the object pickup (assuming I could have gotten them right).  I almost think it would have been better just to use static positions -- that's another problem with the LIVE Leap -- you get to thinking you should use it when most of the time you really don't need finger "movement" per se.

I *might* do one more test that has all of it -- the body, the fingers (sigh) and the Faceware.  That's the main reason I didn't use a human in the first two tests or most of the third: facial expression is also needed for it to be convincing.  But while I have the time to do it I have no particular inspiration (perhaps I'll think of something tomorrow -- I just came home from tennis and am pretty tired).
By raxel_67 - 6 Years Ago
Hey KT, how much did you spend on the hardware?  'Cause looking at your tests I get a very bad case of: Gimme gimme gimme I want I want I neeeeeeed thiiiisss!
By sonic7 - 6 Years Ago
I 'suffer' from the same ailment as Raxel ... lol
But in my case the ailment will simply have to go away, because this set-up is not possible for me (financially speaking).
There's *so much* that could be achieved with this .... and so much 'faster'....
Why, you could get a serious amount of acting done in a single 'session' - potentially an 'entire' short 'episode' of a series.
All you need now Mike is a 'large man cave' for a studio! :)

By Kelleytoons - 6 Years Ago
The hardware cost isn't bad -- even with straps I think the entire cost was around $900 or so (I got it from Amazon).  I *think* you can even do it slightly cheaper if you don't opt for the head tracker (you can use the wired headset instead, but I didn't want to be wired nor deal with that thing on my head.  The head tracker I wear is so light I almost don't notice it).

For most it will be the software that is the deal breaker -- at current exchange rates (this is a British company) it's a touch over $500 per year.  IKinema has a history of letting prices come down, however, and of eventually offering perpetual licenses, so both of those things are possible.  But for now, at least, we are looking at a break-even point versus the PN at around two years, after which this system gets more pricey.  But it wasn't total cost that drove me to this -- I still don't like the look of the PN "suit" and the magnetic interference thing is off-putting to me.  There is literally no calibration needed for this, and while Kungfu says he is now capturing in five minutes with his I suspect that clumsy me would take far longer (and he has also spent more money for some kind of magnetic holder case, I think -- so payback might be another year on top of that).  Plus folks say there is cleanup that must be done with the PN capture, and so far I don't think I'm going to do any cleanup with this capture (I am perfecting my capture process to that point).
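For anyone doing the same math at home, a back-of-the-envelope sketch (the ~$900 hardware and ~$500/year Orion figures are the rough numbers from this thread; the PN total is just a placeholder back-solved from my "about two years" guess, so substitute a real quote):

```python
# Back-of-the-envelope cost comparison.  The ~$900 hardware and ~$500/year
# Orion subscription are the rough numbers from this thread; PN_TOTAL is a
# placeholder back-solved from the "about two years" estimate above --
# substitute a real quote before trusting the crossover point.

VIVE_HARDWARE = 900      # one-time (headset, trackers, straps)
ORION_PER_YEAR = 500     # yearly subscription at current exchange rates
PN_TOTAL = 1900          # assumed one-time cost of the alternative setup

for year in range(1, 6):
    vive_orion = VIVE_HARDWARE + ORION_PER_YEAR * year
    print(f"After year {year}: VIVE/Orion ${vive_orion}  vs  PN ${PN_TOTAL}")
```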

I also look at this the way I originally looked at iPisoft -- I bought into that system thinking that a year or two down the road something better would come along.  It did, but unfortunately it did so *right* at the time I spent a lot more money on the iPisoft (bought a ton of brand new Kinects and converters, and paid a three-year license fee for the software).  However, I am way too smart to fall for the sunk cost fallacy (being old has a lot to do with that as well).  In another year or two something far better may well come along and I won't feel too bad about it (I suspect I'll be able to eBay the VR system and still get back most of my money for the hardware -- I hope I can do that with the Kinect stuff, although my big cost is that three-year license I haven't yet activated).

And Sonic -- my cave is just fine (it's a spare bedroom, with only computer furniture in it).  But the capture area is such that the dining room works better -- we just need to remove the table (a year ago my wife was after me to do this, so it won't be hard to convince her now).  I have my old computer set up there permanently, and yesterday went ahead and mounted the lighthouses on the walls so now I don't have tripods in the way.
By sonic7 - 6 Years Ago
Sounds great Mike ....
Excellent idea about trying to 'perfect your captures' so that *ideally* no cleanup would be needed.
I reckon that's definitely the way to go. If you're able to 'live monitor' - (or have someone do that), it would mean you could just 'go again' after a bad take. Even if you ended up 'averaging' 2 or 3 takes per required shot, it could still be quicker than manually 'tweaking' things later. - You could get a lot done in a single session.
This is really exciting stuff! - And yeah - sounds like you need the lounge room *Full Time* Mike. I'd give that table the flick ... - do you have a wood fire?  :w00t: 
But anyway, sounds like you've got a good setup ......  :)
By Kelleytoons - 6 Years Ago
That "two or three takes" is right -- I'm kind of doing that now, without even checking the takes.  I just keep recording (one of the bad things about the iPisoft system was that even though it produced really perfect captures -- the cleanup software it uses is just perfect -- because it used two networked computers it was hard to capture more than a minute or two without a failure.  Very, very frustrating).  I just love leaving the capture on and getting three or four (or more) minutes of capture, doing whatever I want.  The way it's set up is that after you make the capture you "record" what you've captured to a BVH or FBX file (I've been using BVH exclusively).  During this record process you can snip and edit whatever you want (you press record while the playback plays, and then stop when you are doing with that particular recording of what you've already captured, if that makes sense).  It's really nice to be able to tailor what you did in that manner.

And, truthfully, I haven't even taken much advantage of it.  These three tests are mostly just me starting the capture process and then stopping it.  I've only edited the beginnings and tails of it, and haven't done any editing at all in the middle.  So that will be my next step: refining the workflow so I can do it as quickly as I need to.
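If anyone wants to do that head/tail trim outside of the capture software, a BVH is just plain text, so it's easy to script.  Here's a minimal sketch, assuming a standard single-skeleton BVH with a MOTION block (the filenames and frame counts are just examples):

```python
# Minimal BVH head/tail trim.  Assumes the usual layout: a HIERARCHY section,
# then MOTION, "Frames: N", "Frame Time: t", and one line of channel data per
# frame.  Filenames and trim counts below are just examples.

def trim_bvh(src, dst, drop_head=60, drop_tail=60):
    with open(src) as f:
        lines = f.readlines()

    # Locate the motion data.
    frames_idx = next(i for i, l in enumerate(lines) if l.strip().startswith("Frames:"))
    first_frame = frames_idx + 2  # skip the "Frames:" and "Frame Time:" lines
    frames = [l for l in lines[first_frame:] if l.strip()]

    kept = frames[drop_head:len(frames) - drop_tail]
    lines[frames_idx] = f"Frames: {len(kept)}\n"

    with open(dst, "w") as f:
        f.writelines(lines[:first_frame] + kept)

# e.g. drop the rough first and last two seconds of a 30 fps capture:
trim_bvh("capture_raw.bvh", "capture_trimmed.bvh", drop_head=60, drop_tail=60)
```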
By Jesus Martina - 6 Years Ago
Hi, would you say it's better or easier to use than, say, iPisoft or Brekel using the Kinect and PlayStation controllers?
By animagic - 6 Years Ago
Sounds more and more attractive, Mike.  I bought the PN plugin for iClone when it was on sale, so the loss is not too bad when I abandon it.  I do appreciate your jumping right in and doing the research.

I have a question about export/import. Does the solver software have some generic 3D model that it assumes or can it be adjusted? I'm asking because I always thought the motion imports needed to be tailored to the target. (Excuse my inadequate terminology...:unsure: )
By Kelleytoons - 6 Years Ago
Jesus Martina (6/28/2018)
Hi, would you say it's better or easier to use than, say, iPisoft or Brekel using the Kinect and PlayStation controllers?


There's no real comparison using Kinect with either iPisoft or Brekel.  Brekel's Kinect stuff is okay as long as you don't turn around or otherwise occlude limbs (so you can't put your arms behind your body and you must always be facing forwards).  iPisoft with two Kinects can handle occlusion properly, but it also requires two PCs running on a network and it's also finicky -- not awful, but more trouble than I liked.  Plus I kind of hated the calibration process.  I've never used the PS Eye cameras on either, so I can't speak to that, but I'd guess it would be similar (camera tracking is always a bit problematic due to turning issues -- again, if you are *careful* and either face forward or turn slowly you can get some excellent captures.  With VIVE you don't have to worry at all).

To put it another way, I spent close to 1K on brand new Kinects and iPisoft software late last year (just before Christmas) and I'm spending more now on this solution.  It's much, much better and easier.

By Kelleytoons - 6 Years Ago
animagic (6/28/2018)
Does the solver software have some generic 3D model that it assumes or can it be adjusted? I'm asking because I always thought the motion imports needed to be tailored to the target. (Excuse my inadequate terminology...:unsure: )


It's real time capture, so whatever is being "solved" is done as you move.  You can adjust the model for your own dimensions, but there wouldn't be any special adjustments for a target character.  Now, they do provide a number of characters for you to map to, so perhaps that answers your question.  They are all pretty standard, though -- and there is some kind of pipeline where you can have them take any custom character you want and put it in (but at a cost).

I don't know if that's a particular issue or not.  Obviously extra-long limbs or other body parts would map somewhat differently in any animation, but I assume you could easily adjust that inside of iClone with the edit motion thingee.  Do you have a specific case in mind I could test?  (Or, conversely, I could provide you with a BVH file and you could play around with it yourself.  But they are pretty standard BVH files, and I've used such files in the past with even fairly odd and specific characters, like monsters and such, and they work just fine).
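If you do want to sanity-check a file before retargeting, the skeleton lives in the plain-text HIERARCHY block, so you can list the joint names and offsets without any special tools.  A rough sketch, assuming a standard BVH (the filename is just an example):

```python
# List the joints (and their offsets) in a BVH's HIERARCHY block, e.g. to
# compare bone names and proportions against a target rig before retargeting.
# Assumes a standard BVH; the filename is just an example.

def list_joints(path):
    joints = []
    name = None
    with open(path) as f:
        for line in f:
            tokens = line.split()
            if not tokens:
                continue
            if tokens[0] in ("ROOT", "JOINT"):
                name = tokens[1]
            elif tokens[0] == "OFFSET" and name is not None:
                joints.append((name, tuple(float(v) for v in tokens[1:4])))
                name = None          # skip End Site offsets
            elif tokens[0] == "MOTION":
                break
    return joints

for joint, offset in list_joints("capture_raw.bvh"):
    print(f"{joint:20s} offset {offset}")
```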
By Jesus Martina - 6 Years Ago
Kelleytoons (6/28/2018)
To put it another way, I spent close to 1K on brand new Kinects and iPisoft software late last year (just before Christmas) and I'm spending more now on this solution.  It's much, much better and easier.


Whoa!  1K WHAT?  That's expensive...

Why not just use 1 Kinect?  That's only $39 for the 3-month iPi license.  And you can get a used Kinect 2 and cord for about $80??
By Kelleytoons - 6 Years Ago
The 1 Kinect capture is crap.
By Jesus Martina - 6 Years Ago
Kelleytoons (6/29/2018)
The 1 Kinect capture is crap.


Without risk of being banned or getting in trouble... may I ask why?  You sound so strongly opinionated...
By Kelleytoons - 6 Years Ago
Have you tried it?  There's a demo, so you can see for yourself (the problem with any image-based capture is that it can't handle any obstruction of the limbs, so if they occlude the body, or each other, you lose tracking.  It's a real mess unless you are *very* careful about staying completely forward-facing and never crossing your legs or arms in any manner.  Multiple cameras/imagers can "solve" this by seeing the other sides and understanding what is happening).

I've bought three different 1-Kinect software programs and all three are practically worthless.  The two-camera iPisoft works well, within its own limitations (it can also get "confused", and the entire process of running networked on two PCs and calibrating is a PITA.  And, ultimately, it's not any cheaper than the VIVE/Orion solution, which is a lot more straightforward).
By animagic - 6 Years Ago
Kelleytoons (6/28/2018)
animagic (6/28/2018)
Does the solver software have some generic 3D model that it assumes or can it be adjusted? I'm asking because I always thought the motion imports needed to be tailored to the target. (Excuse my inadequate terminology...:unsure: )


It's real time capture, so whatever is being "solved" is done as you move.  You can adjust the model for your own dimensions, but there wouldn't be any special adjustments for a target character.  Now, they do provide a number of characters for you to map to, so perhaps that answers your question.  They are all pretty standard, though -- and there is some kind of pipeline where you can have them take any custom character you want and put it in (but at a cost).

I don't know if that's a particular issue or not.  Obviously extra-long limbs or other body parts would map somewhat differently in any animation, but I assume you could easily adjust that inside of iClone with the edit motion thingee.  Do you have a specific case in mind I could test?  (Or, conversely, I could provide you with a BVH file and you could play around with it yourself.  But they are pretty standard BVH files, and I've used such files in the past with even fairly odd and specific characters, like monsters and such, and they work just fine).

Thanks, Mike. No I don't have a particular use case at the moment. My characters are usually pretty standard-sized. I will not be able to get to this until the fall, but I am interested in further developments.
By argus1000 - 6 Years Ago
I wouldn't touch the one-Kinect systems, because of the occlusion problems.  I presently have an iPisoft two-Kinect camera system, and it has never let me down.  True, it doesn't capture the head or hand movements, but I have Faceware and the Leap controller for that.  I wonder if purchasing the HTC Vive system would be so much better.
By Kelleytoons - 6 Years Ago
The three problems I have with the iPisoft two Kinects are:

1) Networking the two PCs.  Perhaps it's just because one of them that I have to use is an older machine, but from time to time I lose connection and thus the capture fails.  It's also a lot more equipment that takes up a lot more room than the VIVE hardware.

2) I HATE the calibration.  It takes too long and is annoying in the extreme.  With VIVE I'm ready to go in about 10 seconds (most of which is my PC booting up off the SSD).  After that I'm capturing and the capture is finished with no need to refine or smooth it out like with iPisoft (which, to be fair, is fast but not as fast as not doing it at all).

3) Too small a space to use it in.  Even in the area I can use my VIVE system in now, the capture for the Kinects is about 50% smaller (due to you not being able to get that close to them).  So I can't do nearly as much as I can do with the VIVE system.

Now, I have three more years on an iPisoft Pro license I haven't even activated yet, as well as four or five brand new Kinects in a box, so I've still got some money tied up in the system.  But even with that, the VIVE is so much more of a pleasure to use.  Still, the iPisoft two-Kinect setup is a very good system, no question about it (I have half a mind to just permanently mount the Kinect cameras on the walls in my PC room so I don't ever need to recalibrate, and use it in there just for seated captures.  That would also help with my network issues, since the PCs in that room are my main one and my much newer laptop -- I think I'm talking myself into having two completely different mocap rooms.  Nice).
By argus1000 - 6 Years Ago
I don't have the problem of networking 2 PCs.  I have the Kinect 1 system, so I need only 1 PC.  I don't have to pay annual fees either.  And I'm very satisfied with it.  The body movements are very smooth.  It's not a real-time system, but it makes up for it in reliability and precision.

Calibration is not a problem.  I have my 2 Kinects installed permanently and ready to go at all times.

True, the capturing space is very limited. I wish I had more space.


By Kelleytoons - 6 Years Ago
Not sure what you mean about only needing 1 PC when you have the two-Kinect system (because you said you wouldn't touch the one-Kinect systems).  You can't run two Kinects off 1 PC -- it's a Microsoft issue.
By argus1000 - 6 Years Ago
Oh yes, you can.  I have 2 Kinect cameras wired to one PC.  Before Kinect V2 for Windows appeared, which needs 2 PCs for two cameras, there was the original Kinect (the Xbox 360 version), which needs only one PC even if you have 2 of those cameras.  Kinect V2 for Windows is more advanced.  Google it.