shadybearbklyn
Posted 4 Years Ago
Group: Forum Members
I’m working on a project that has a lot of fast singing. I’m noticing that auto lip sync with a well-recorded solo vocal track gives me more realistic mouth movements than facial mocap via Live Face and an iPhone. By tweaking the viseme track manually, I seem to have more control over the finished product. I’ve had no luck connecting my iPhone via USB, so I’m not sure if the lag is due to the Wi-Fi connection. Any thoughts on this?
Edited 4 Years Ago by shadybearbklyn
Eric C (RL)
Posted 4 Years Ago
Group: Administrators
Hello @shadybearbklyn, if you are having trouble connecting Live Face, here's the guide for you: https://bit.ly/3cHHLsN
We also encourage our users to connect via Ethernet, as it gives better results without delay: https://mocap.reallusion.com/iclone-motion-live-mocap/iphone-live-face.html
Good luck!
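One way to check whether the connection (rather than the capture itself) is what's adding the lag is a quick latency test from the PC to the phone. Below is a minimal sketch in Python, assuming the iPhone's address is the one shown in the LIVE Face app (10.0.0.23 is just a placeholder) and that ICMP ping is allowed on your network:

import platform
import subprocess

PHONE_IP = "10.0.0.23"  # placeholder: use the IP the LIVE Face app displays

# Windows ping takes -n for the packet count; macOS/Linux use -c
count_flag = "-n" if platform.system() == "Windows" else "-c"

result = subprocess.run(
    ["ping", count_flag, "5", PHONE_IP],
    capture_output=True,
    text=True,
)
print(result.stdout)  # compare the average round-trip time on Wi-Fi vs. a wired link

A steady low average on a wired connection versus a jumpy one on Wi-Fi would point to the network, rather than Live Face or the viseme data, as the source of the delay.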
toystorylab
Posted 4 Years Ago
Group: Forum Members
So, we would get better results with Ethernet compared to USB? Because in the guide they talk about USB, not Ethernet... And would this be the right product? https://www.ebay.de/itm/Lightning-zu-RJ-45-Ethernet-LAN-Netzwerk-Kabel-Adapter-fur-iPad-iPhone-XS-XR-8-7/392476783169?hash=item5b61707241:g:944AAOSweahdoZli