Perception Neuron V2?


https://forum.reallusion.com/Topic366514.aspx

By yoyomaster - 6 Years Ago
Hello everyone!

I was wondering what the benefits of the Perception Neuron V2 are compared to the original. I can't find any real info on that on their site?

Thanks for any help!
By mtakerkart - 6 Years Ago
I don't know if you saw their webinar about the V2



It seems that the wires are more robust and there is no longer a distinction between right and left sides; the sensors are only marked upper or lower.
The software is the same.
By yoyomaster - 6 Years Ago
mtakerkart (4/25/2018)
I don't know if you saw their webinar about the V2



It seems that the wires are more robust and there is no longer a distinction between right and left sides; the sensors are only marked upper or lower.
The software is the same.

Got the video, thanks. I was hoping for better magnetic shielding; I'll look at the video and comment if need be. Thanks again, man!
I looked at Xsens, but it's too rich for my blood at the moment, so it's either Neuron or a video camera setup!
By mtakerkart - 6 Years Ago
If I had to buy another suit, I would buy the Neuron again because, with the improved sturdiness, it's the only suit at an affordable price with finger mocap.
It depends on what you're attempting, but I've had good results.


By yoyomaster - 6 Years Ago
The good thing about Xsens is that you don't need to set it up all the time, it's self-contained, and there are no problems with magnetic fields, but it's expensive, and there's no finger capture as far as I know!
By yoyomaster - 6 Years Ago
Good video, lots of info. It looks quite usable, and the shutter is a nice add-on!

Anyone using the Facerig package?
By Jfrog - 6 Years Ago
If I were buying again, I would probably choose the HTC Vive with the Orion software option. I bought two Neuron mocap suits 9 months ago and I have not been able to use both at the same time yet. Before Christmas I tested both systems individually; then the next day I had two actors and one of the kit's arms wasn't working properly, so we lost a few hours trying to swap the neurons. But the cable was the problem, then it was the head strap... There is always a problem with one of the suits, so I use one suit just for spare parts. The support is OK, they wrote me back, and I was supposed to send some parts back before Christmas, but I haven't found the time to nail down the problems. The thing is, it takes a huge amount of time to troubleshoot and I don't have that time. I have two magnetic cases, but I still have to use my finger neurons as spare parts for the ones that stop working.

Version 1 of the suit is very fragile and can't fit a strong, tall man. I don't want to complain, because this was the cheapest option, but I would rather pay $500 per year for Orion and do the fingers with the Leap Motion afterwards. The Neuron suit works much better when you don't use the fingers.

This is my humble opinion of course.
By yoyomaster - 6 Years Ago
@Jfrog thanks for the review, this is pretty much in line with everything I've read about it. As for the HTC Vive, this is the first time I've heard about Orion; it looks pretty good. $1200 for the HTC Vive, the Orion software, and 3 trackers isn't too bad either, I will look more into it!

What would you suggest for facial mocap, iClone Faceware, or another solution?
By Jfrog - 6 Years Ago
The Faceware plugin is the cheapest solution and it is pretty decent. I haven't had the chance to try it with iClone 7.2.1, but the problem I have now is the lack of reliability when monitoring the actor's lips over a longer iClone timeline. What I mean is, even though I have a very fast computer and a great video card, it seems that it can't handle a realtime preview of the actor's lips. When doing a quick preview or test you won't notice it, but once you start working with it, you will see what I mean.

I tried working with a 50,000-frame session at first, then I had to switch to 25,000-frame sessions. When I record the first few lines everything is smooth, but after 5 or 6 lines the preview screen (lips) becomes choppy. If I play back afterwards, the recorded lips are OK, but you have to double-check what you recorded before recording the next line every time. This is really time-consuming and inefficient.
I come from the world of sound (I have recorded a lot of voices over the last 30 years) and I would like to be able to record the facials in realtime, simultaneously with the sound (on a second computer), without this half-decent workflow of double-checking everything afterwards. It doesn't look very good when the actor is waiting between every line; it also breaks his mood and rhythm. In front of the client, everything should be snappy, with no time to wait.

Sometimes, for a 5-minute script, I have to clean up and edit 2 hours of recorded audio on my second computer because of all the double-checking in iClone. Perhaps I am asking too much, but for me the only viable solution would be to hit record in iClone, record the facials in realtime for as long as I need, and have a decent lip-sync preview. It seems like this can be done with the Faceware standalone solution, but at a cost...

I will probably invest in the standalone version of Faceware eventually. I have never tried it, but if you can record in realtime with great preview monitoring, it would be a much better solution.



By mtakerkart - 6 Years Ago
What would you suggest for facial mocap, iClone Faceware, or another solution?


I didn't find an easy way to import facial mocap from other apps because iClone does not allow it. The only way I found is to animate your character in other software, then import it
as a prop (not as a character) to keep the facial animation.

I tried working with a 50,000-frame session at first, then I had to switch to 25,000-frame sessions


Very long recording sessions for mocap!! I suggest recording the body only, with Axis open, because you will have to do the foot contact/clean/smooth passes inside Axis anyway.
For the facial, I also really suggest recording the video directly in the camera (that's why Faceware uses a GoPro camera); you can then split it up afterwards when working in Faceware. That's how Faceware does it.
Audio needs fewer resources to record than motion capture. May I also suggest using WiFi for the body mocap. Video recorded in the camera + WiFi body mocap = freedom for the character's performance.
By Jfrog - 6 Years Ago
Hi Mtakerkart,

Thanks for your input it's always appreciated!

When doing traditional animation, the voice is always recorded before the animation is done. Once the voices are recorded, they use the storyboard to set the pace of the story, adding a few seconds for action here and there and pacing the conversation. Even though we can now have realtime facial recording, many directors still want to direct the voices and get the right intention, inflection, and projection. So I always record the voice (on a second computer running Pro Tools) and only the facial mocap. When recording voice, clients like to direct the actor, so we always record multiple takes to please the director or client. That's why I only capture the facial at the same time, no body mocap.

When I record the body motion, I record directly into the Axis software and monitor through iClone for instant feedback. We do the body motion capture while listening to the facial audio track as a guide, to adapt the body motions to the script.

Once a mocap line is recorded, I don't really understand why it should eat up computer resources, especially if you don't play the line; it is just sitting in the timeline. I think it should be normal to be able to have a 15-minute iClone session open and record 50 or 60 lines of Faceware motion without choking the computer. Once a line is recorded, it could be automatically collected and stored if iClone's timeline cannot handle it. Just an idea...

I might be wrong, but I think the realtime recording workflow can be done within the standalone Faceware app.

By mtakerkart - 6 Years Ago
I think the difficulty for the hardware is that you are streaming data into Axis, with its display computation and hard disk writing, which in turn streams data to iClone, with its own display computation and hard disk read/write.
That's a lot to manage in realtime, so the "buffer" can keep up for short sessions but breaks down for long sessions.
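
Just to illustrate the idea (this is only a toy sketch in Python; the names and numbers are made up and have nothing to do with the real Axis or iClone internals): a small producer/consumer simulation where the display/disk side is slightly slower than the incoming frame stream. A short take stays inside the buffer, while a long take eventually overflows it and frames get skipped.

# Purely illustrative: a bounded buffer between a frame producer and a
# slower consumer (display + disk I/O). Not real Axis or iClone code.
from collections import deque

def simulate(session_frames, buffer_size=1000,
             produce_per_tick=60,    # frames arriving each tick (e.g. 60 fps)
             consume_per_tick=58):   # frames the display/disk side can absorb
    buffer = deque()
    dropped = 0
    produced = 0
    while produced < session_frames or buffer:
        # Producer: push this tick's frames, dropping once the buffer is full.
        for _ in range(min(produce_per_tick, session_frames - produced)):
            produced += 1
            if len(buffer) < buffer_size:
                buffer.append(produced)
            else:
                dropped += 1        # these are the "choppy preview" frames
        # Consumer: display + write to disk, slightly slower than the producer.
        for _ in range(min(consume_per_tick, len(buffer))):
            buffer.popleft()
    return dropped

# A short take never fills the buffer; a long take eventually overflows it.
print("frames dropped, 5,000-frame take :", simulate(5_000))
print("frames dropped, 50,000-frame take:", simulate(50_000))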
I understand the pipeline of recording the voice before the animation, but today I have the opportunity to bypass that tedious process with mocap.
Performers are always better when they can act with their body in situ.
By animagic - 6 Years Ago
There is a limitation in Faceware that was mentioned during the tutorials, where the calibration will be off after a few minutes. So long sessions would not be very realistic, it seems.

I may be simple, but for me a scene never lasts more than one or two minutes (this is for narrative animation), so I don't see the current limits as limitations.
By Jfrog - 6 Years Ago
That's a lot to manage in realtime, so the "buffer" can keep up for short sessions but breaks down for long sessions.

That makes sense. But it would be great if the computer could get its resources back between each recording.

There is a limitation in Faceware that was mentioned during the tutorials, where the calibration will be off after a few minutes. So long sessions would not be very realistic, it seems.


You are right, animagic. I recalibrate at least every 2 or 3 minutes; that's why I would love to be able to skip the playback quality check between each line. But that does not mean you can't have a 15- or 20-minute session open to record all your lines. Even when working on a 5-minute script, we often record several takes per line. This is just a normal workflow.





By yoyomaster - 6 Years Ago
animagic (4/27/2018)
There is a limitation in Faceware that was mentioned during the tutorials, where the calibration will be off after a few minutes. So long sessions would not be very realistic, it seems.

I may be simple, but for me a scene never lasts more than one or two minutes (this is for narrative animation), so I don't see the current limits as limitations.

That limitation does not exist in the full Faceware; is it a plugin thing?