
Motion Live cannot add my Perception Neuron 32 to the gear

Posted By marco_dotzler 5 Years Ago
marco_dotzler
Posted 5 Years Ago
Senior Member
Hello,

I bought the Perception Neuron 32 and Motion Live.
After setting up the suit and the Perception Neuron software, I would like to connect it to iClone with Motion Live.

I did it as shown in the tutorials, but when I click the "+" to add a Gear, there is nothing to choose.
Is this a bug, or is it my mistake?

I hope you can help me.

Br
Marco

Kelleytoons
This post has been flagged as an answer
Posted 5 Years Ago
Distinguished Member
Did you get the PN plugin for Motion Live?

There are three parts to using the PN -- you must, of course, have the hardware. You must have the Motion Live software. And you must have the specific Motion Live plugin for the PN. This is separate from Motion Live (it costs extra), and while it's sometimes bundled in a package, you must be sure you have it -- and have installed it.

If you have it (and have installed it), can you tell us how you are running things? You will still need to run the Axis Neuron software first (and I can provide a link to the procedure if you need it).
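
(Side note: if you want to sanity-check that Axis Neuron is actually broadcasting data before you even open iClone, a throwaway script like the sketch below will tell you. This is not official Reallusion/Noitom code -- the host and port are assumptions, so check them against the Broadcasting settings in Axis Neuron.)

# Minimal sketch: confirm the Axis Neuron BVH broadcast is reachable.
# Assumptions: Axis Neuron is running on this machine with BVH broadcasting
# enabled, and 7001 matches the TCP port shown in its Broadcasting settings.
import socket

HOST = "127.0.0.1"   # Axis Neuron usually runs on the same PC
PORT = 7001          # assumed default -- verify in Axis Neuron's settings

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    received = 0
    for _ in range(10):               # grab a few chunks, then stop
        chunk = sock.recv(4096)
        if not chunk:
            break
        received += len(chunk)
print(f"Read {received} bytes of BVH stream data from Axis Neuron")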



Alienware Aurora R12, Win 10, i9-11900KF 3.5GHz CPU, 128GB RAM, RTX 3090 (24GB), Samsung 960 Pro 4TB M.2 SSD, TB+ disk space
Mike "ex-genius" Kelley
marco_dotzler
Posted 5 Years Ago
Senior Member
Thank you for your quick help...
I found the problem: I do not have that particular plugin...
Now I am a little bit pissed off... because before I bought Motion Live and the Perception Neuron, I asked support if I needed anything else... and they said no!

But thank you very much.
Have a nice day
Kelleytoons
Posted 5 Years Ago
Distinguished Member
You can still use the PN without the plugin, but you can't use Motion Live.  You can capture using the Axis Neuron software and, if you have 3DXChange, you can bring the files into iClone that way.  I did that for a while myself.  There are *some* advantages in doing it that way, but it's also nice to directly capture inside of iClone (which is how I do it now).  Either way you'll be running the Neuron software (you either run it to capture or you run it in the background and feed it into the plugin which then feeds it into Live).
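
(If you do go that route, BVH files are plain text, so it's easy to sanity-check your exported takes before pulling them into 3DXChange. Something quick and dirty like the Python sketch below works -- the folder path is just a placeholder for wherever you export from Axis.)

# Rough sketch: list each exported BVH take with its frame count and length.
# The folder path is only an example -- point it at your own Axis export folder.
from pathlib import Path

EXPORT_DIR = Path(r"C:\mocap\axis_exports")   # placeholder path

for bvh in sorted(EXPORT_DIR.glob("*.bvh")):
    frames, frame_time = None, None
    with bvh.open() as f:
        for line in f:
            if line.startswith("Frames:"):
                frames = int(line.split()[1])
            elif line.startswith("Frame Time:"):
                frame_time = float(line.split()[2])
                break
    if frames and frame_time:
        secs = frames * frame_time
        print(f"{bvh.name}: {frames} frames, {secs:.1f} s at {1 / frame_time:.0f} fps")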

Sorry if you weren't told about the plugin, but RL is pretty good about refunds within two weeks, so if you don't want to buy the plugin you can return Motion Live (again, if you have 3DX you can still use the files you produce).  Or perhaps you can return it and get a bundle price (they often will sell it bundled with the plugin for a discount -- check the store for any sales).



Alienware Aurora R12, Win 10, i9-11900KF 3.5GHz CPU, 128GB RAM, RTX 3090 (24GB), Samsung 960 Pro 4TB M.2 SSD, TB+ disk space
Mike "ex-genius" Kelley
Hookflash
Posted 5 Years Ago
Veteran Member
Kelleytoons (12/17/2019)
...
You can capture using the Axis Neuron software and, if you have 3DXChange, you can bring the files into iClone that way.  I did that for a while myself.  There are *some* advantages in doing it that way,
...


Hey Mike, do you mind elaborating a bit on the advantages mentioned above? I was just about to pull the trigger on the Perception Neuron profile (or "plugin-plugin", as I like to call it ;)), but I'll probably hold off if there are compelling reasons to import via 3DXChange (which I already own) instead.

Kelleytoons
Posted 5 Years Ago
Distinguished Member
As I said, there are pros and cons to both ways. For the longest time (over a year) I only used 3DX because the plugin was (is?) so damned expensive (I already owned Motion Live for use with Faceware and then Live Face and Leap); it wasn't until they had a sale of something like half off that I bit the bullet and went for the PN part.

So -- here are some things to consider. First of all, you have to run Axis in the background, and while at first glance that doesn't seem SO bad, it's certainly not as transparent a process as it might be for such a $$$ plugin (by contrast, the other pieces like even Faceware are about half the cost, and given the price of Faceware itself you'd think that it would not be so). More important, though, you can't just load up your scene and characters and expect to feed them without issues. Even on my relatively high-end machine (it's my secondary machine, with a 1080 8GB NVidia card) I could not load even a small scene with highly detailed characters and have them controlled properly. I ended up having to use the dummy avatars to get a decent response (otherwise it lagged behind WAY too much). Now, with a bare-bones scene you might have better luck, but I kind of doubt it. You'll note that virtually none of the demos show anything other than a character on a bare background or a very simplistic scene.

The other thing I find frustrating about the ML PN plugin is that motions aren't always intuitive or easy to assign properly. Let me see if I can explain this simply: you have two options -- either use the PN workspace for your avatar (so wherever he's centered is where your iClone avatar will "jump" to start off), or use the iClone space, which offsets or repositions the avatar. Well and good, except that if your computer sits in one place (as mine does, in the corner) and you want your character to start somewhere else (because you don't have the space around the computer you want/need), you will either need someone else to press the spacebar to start the capture, or perhaps a wireless keyboard so you can start where you want to be (even wireless doesn't work well in my house, so I am often frustrated and have to wait for my wife to help).
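
(One workaround for the spacebar problem, if you have Python on the machine: a little countdown script that presses it for you, so you can walk to your mark first. Just a sketch -- it assumes the pyautogui package is installed and that the capture window still has keyboard focus when the timer fires.)

# Press the spacebar after a countdown so you can get into position first.
# Assumes: pip install pyautogui, and the capture window keeps keyboard focus.
import time
import pyautogui

DELAY_SECONDS = 10   # however long you need to walk to your start mark

for remaining in range(DELAY_SECONDS, 0, -1):
    print(f"Starting capture in {remaining}...")
    time.sleep(1)

pyautogui.press("space")
print("Spacebar sent -- capture should be rolling.")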

Not only that, but the idea of watching yourself while you capture seems tempting at first but is rife with problems -- your hands won't move properly if you are facing the computer unless you use the "mirror" option in the plugin, but then your motions are reversed, so you had better not plan on walking anywhere without taking this into account. Plus, if you move more than 5 feet or so you'll be so far from your monitor you can't see it anyway. I find most of the time it's just better not to worry about what it looks like on screen and simply record it and adjust it later -- IOW, one of the prime "advantages" (being able to see yourself react to the scene in real time) really isn't there.

Finally, there's the Axis smoothing algorithm, which you can't apply to your motion when capturing inside of iClone. If you've been using the PN for a while you know about this option, but it isn't available when capturing directly in iClone. As a rule I don't find this a big loss, as I find the in-iClone capture to be pretty good, but you no longer have those options.

All that aside, there's something to be said for just capturing inside of iClone and not going through the process of translating via 3DX. I came *this* close to returning the plugin, then finally thought, "what the hell," and ended up keeping it, and now I don't capture any other way. (But I have also given up trying to watch the capture, and just plot it out on the ground the way I was doing before I had it.) The determining factor for me is this: I capture to pre-recorded dialog and/or music. To do this with Axis, I play the sound using VLC and then record multiple takes, but then I have to realign everything in iClone and that's a PITA. With the plugin I put the audio on the iClone track itself, capture two or three takes, and I'm playing back and reacting in real time with no need to resync afterwards. And that is priceless -- it saves me a ton of time trying to do t-poses or something else I can use to sync up the tracks. However, if you aren't going to do that, or have no need of that approach, I'd think very long about spending the money on the plugin.
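
(For anyone sticking with the Axis/VLC approach, the realignment itself is just arithmetic once you have a common event -- a clap or a t-pose -- in both the audio and the take. The numbers in the sketch below are made up; plug in your own.)

# Made-up example: how far to slide an Axis-captured clip to line up with audio.
AUDIO_EVENT_TIME = 3.40    # seconds into the audio where the clap is heard
MOTION_EVENT_TIME = 1.85   # seconds into the take where the clap pose happens
PROJECT_FPS = 60           # iClone project frame rate

offset_seconds = AUDIO_EVENT_TIME - MOTION_EVENT_TIME
offset_frames = round(offset_seconds * PROJECT_FPS)
print(f"Shift the motion clip by {offset_frames} frames ({offset_seconds:+.2f} s)")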

Hope this helps.



Alienware Aurora R12, Win 10, i9-11900KF 3.5GHz CPU, 128GB RAM, RTX 3090 (24GB), Samsung 960 Pro 4TB M.2 SSD, TB+ disk space
Mike "ex-genius" Kelley
Edited 5 Years Ago by Kelleytoons
Hookflash
Posted 5 Years Ago
Veteran Member
Thanks for that, Mike! I'm definitely going to have to put the credit card away for now...
Scyra
Posted 5 Years Ago
Distinguished Member
I find Motion Live to be indispensable for making animations with Perception Neuron. The little Axis Neuron robot does not give you a realistic indication of what your movements will actually look like. You could record a bunch of stuff and find out that it is way off. Motion Live gives you an exact representation. This was how I found out about the Genesis 8 shoulder drooping issue six months ago.

You've read how mocap actors train themselves to walk with their arms held away from the body? That is what Motion Live will teach you to do as you modify your movements based on realtime visual feedback. This will cut down on cleanup work.

Motion Live can be resource intensive, and it is a second realtime data stream on top of Axis Neuron. If you try to do something intense, like a fast spinning kick, it can lag. Even for cases like that where I record in Axis Neuron, the Motion Live rehearsal is always worth it.



CC3 & Daz Tricks | CC3 to Unity workflow

Hookflash
Posted 5 Years Ago
Veteran Member
Scyra (12/18/2019)
I find Motion Live to be indispensable for making animations with Perception Neuron. The little Axis Neuron robot does not give you a realistic indication of what your movements will actually look like. You could record a bunch of stuff and find out that it is way off. Motion Live gives you an exact representation. This was how I found out about the Genesis 8 shoulder drooping issue six months ago.
...


Dammit, now I'm reaching for my credit card again! ;)
Re: Shoulder drooping, did they ever get that resolved? Or did you have to come up with your own solution?

Scyra
Posted 5 Years Ago
Distinguished Member
Mike posted as I was making my reply, so I just read his. I either use a gaming laptop or a long HDMI cord hooked up to a 50" TV when I use Motion Live with PN. The Clicker device for PN can be fairly useful if you don't have an assistant.



CC3 & Daz Tricks | CC3 to Unity workflow



