
CC, Unity and SALSA - realtime lipsyncing

Posted By recourseai 4 Years Ago
recourseai
Posted 4 Years Ago
Senior Member
Hello, 

Apologies if this is the incorrect place to post this, but I'm currently looking into using CC, Unity, and SALSA for animated, lip-synced avatars, and crucially, I need the lip-sync in real time. I've managed to export a CC avatar as an FBX and import it into Unity, and I've then tried numerous ways of hooking the avatar up to SALSA, the simplest of which has been the 'OneClick' setup. However, the lip-sync quality doesn't seem to be great, and I've had to manually tweak each viseme config to get good results. I'm just wondering:
- Are there easier ways of doing what I'm after?
- Am I doing something wrong in terms of exporting the avatar?
- Can I use CC's animations for real-time lip-sync, or are they purely for baking animations before runtime?

Thank you
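
A quick sanity check at this stage, before tuning any viseme configs, is to list the blendshapes that actually survived the FBX export, since the OneClick mapping can only drive shapes that exist on the imported mesh. Below is a minimal sketch using only standard Unity APIs; the component name is mine and the exact viseme shape names depend on your CC export settings.

```csharp
using UnityEngine;

// Attach to the imported CC character and check the Console:
// it lists every blendshape on every SkinnedMeshRenderer, which is
// what a viseme configuration ultimately has to map onto.
public class BlendShapeAudit : MonoBehaviour
{
    void Start()
    {
        foreach (var smr in GetComponentsInChildren<SkinnedMeshRenderer>())
        {
            var mesh = smr.sharedMesh;
            for (int i = 0; i < mesh.blendShapeCount; i++)
            {
                Debug.Log($"{smr.name}: [{i}] {mesh.GetBlendShapeName(i)}");
            }
        }
    }
}
```

If the facial/viseme shapes are missing or renamed here, no amount of per-viseme tweaking will help, so it's worth confirming the export options before blaming the lip-sync settings.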
Dorothy Jean
Posted 4 Years Ago
Distinguished Member
I know that if you have iClone and the 3DXchange Pipeline, you can create all the animations and lip-syncing right inside iClone and then export an Alembic file over into Unity, and it will have all the animations, including soft cloth.




animagic
Posted 4 Years Ago
Distinguished Member
By real-time do you mean to show the lip-sync on the character while speaking? 

It seems that SALSA uses an audio file as input, so it's not really real-time.

True real-time is possible with a facial mocap system (Faceware or Live Face).



recourseai
Posted 4 Years Ago
Senior Member
Dorothy Jean:
I know that if you have iClone and the 3DXchange Pipeline, you can create all the animations and lip-syncing right inside iClone and then export an Alembic file over into Unity, and it will have all the animations, including soft cloth.

Sorry, I'm not sure what this means. This is still baking certain mouth animations into the model and then exporting them, right?


animagic:
By real-time do you mean to show the lip-sync on the character while speaking? It seems that SALSA uses an audio file as input, so it's not really real-time. True real-time is possible with a facial mocap system (Faceware or Live Face).


To be clear, I don't mean real-time as in the character needs to speak from live mic input. I mean it needs to be able to do it at runtime, having never seen the audio input before. Basically my scenario is: in Unity I will have audio files/text files with given phrases, and I want to be able to feed these to SALSA, which in turn will animate the model appropriately. The audio/text will be generated dynamically, so I can't know what will be said ahead of time. What I'm trying to figure out is whether this is possible with iClone's animation, because as far as I can see it's more designed for prerecording and baking speech animations into the model rather than adapting to new text.

Thank you both
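
For the Unity side of "audio that is generated dynamically", the usual pattern is to fetch or synthesize an AudioClip at runtime and play it through the AudioSource that the lip-sync component analyzes. The sketch below uses only standard Unity networking/audio APIs; the URL parameter is hypothetical, and the assumption that SALSA works from whatever the assigned AudioSource plays is something to confirm against its documentation.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Downloads a WAV produced at runtime (e.g. by a TTS service) and plays it
// through the character's AudioSource, which the lip-sync component is
// assumed to be analyzing.
public class RuntimeSpeechPlayer : MonoBehaviour
{
    [SerializeField] AudioSource voiceSource;  // the AudioSource the lip-sync setup points at

    public void Speak(string wavUrl)
    {
        StartCoroutine(LoadAndPlay(wavUrl));
    }

    IEnumerator LoadAndPlay(string wavUrl)
    {
        using (var req = UnityWebRequestMultimedia.GetAudioClip(wavUrl, AudioType.WAV))
        {
            yield return req.SendWebRequest();
            if (!string.IsNullOrEmpty(req.error))
            {
                Debug.LogError(req.error);
                yield break;
            }
            voiceSource.clip = DownloadHandlerAudioClip.GetContent(req);
            voiceSource.Play();
        }
    }
}
```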
Dan Miller
Posted 4 Years Ago
Distinguished Member
olly_767935

SALSA LipSync Suite looks pretty cool. I see it does support CC3 characters and even Playmaker for us non-programmer types. If I understand the concept, it is viseme-driven and is simply fed audio or text at runtime. It sounds like just a few file-name changes on a CC3 character to use it in real time.
Did you reference this page and this video series?
animagic
Posted 4 Years Ago
Distinguished Member
Any lip-syncing can be done in iClone, and then you will be able to lip-sync unseen audio. An update has been announced to improve the lip-syncing process.

Someone else will have to answer whether that can actually be exported via FBX; I don't use that pipeline.



Dan Miller
Posted 4 Years Ago
Distinguished Member
animagic (4/8/2020)
Any lip-syncing can be done in iClone, and then you will be able to lip-sync unseen audio. An update has been announced to improve the lip-syncing process.

Someone else will have to answer if that can actually be exported via FBX. I don't use that pipeline.


The more videos I watch on this asset, the more I like it. I don't think iClone works in this pipeline. From my understanding so far:

1. Export the CC3 character for Unity (FBX).
2. Make sure the auto-setup script is used. It currently does not support anything newer than Unity 2019.1.
3. Import the CC3 character by dragging it into the CC_Assets folder.

This can be used right on the Timeline in Unity.
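
As a side note on the Timeline point above: a baked performance authored as a Timeline can be triggered from script with a PlayableDirector. This is standard Unity API and only a sketch (the field and method names are mine); it is separate from the runtime lip-sync question, where the clip is not known ahead of time.

```csharp
using UnityEngine;
using UnityEngine.Playables;

// Plays a pre-authored Timeline (e.g. a baked CC3 facial/body performance).
public class PerformanceTrigger : MonoBehaviour
{
    [SerializeField] PlayableDirector director;  // bound to the Timeline asset in the Inspector

    public void PlayPerformance()
    {
        director.time = 0;   // restart from the beginning
        director.Play();
    }
}
```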

recourseai
Posted 4 Years Ago
Senior Member
Thank you for your replies.

I am already using the 'OneClick' scripts from SALSA, and they provide reasonable results. However, it's obviously not as good as it would be if I were doing it directly through iClone. So I was wondering if there's anything I can do to improve it before export, e.g. if I lip-sync it in iClone, would that make any difference to how well SALSA performs? My instinct is that it won't, and that at best lip-syncing in iClone would result in exported animation clips that SALSA won't work with.
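
If the worry is that a baked facial clip and a runtime lip-sync driver would fight over the same blendshapes, one generic Unity-side workaround (not specific to iClone or SALSA, and only a sketch) is to clear the viseme shapes in LateUpdate, after the Animator has written the baked values, so the runtime driver can own them. Whether the lip-sync component applies its own weights before or after this depends on script execution order, so treat it as an assumption to verify.

```csharp
using UnityEngine;

// Clears a chosen set of viseme blendshapes after the Animator has evaluated,
// so values baked into an animation clip don't compete with a runtime
// lip-sync driver. The indices are hypothetical; look them up on the export.
public class VisemeOverride : MonoBehaviour
{
    [SerializeField] SkinnedMeshRenderer face;
    [SerializeField] int[] visemeShapeIndices;

    void LateUpdate()  // runs after the Animator update each frame
    {
        foreach (int i in visemeShapeIndices)
        {
            face.SetBlendShapeWeight(i, 0f);
        }
    }
}
```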
Dan Miller
Posted 4 Years Ago
Distinguished Member
olly_767935 (4/8/2020)
Thank you for your replies.

I am already using the 'OneClick' scripts from SALSA, and they provide reasonable results. However, it's obviously not as good as it would be if I were doing it directly through iClone. So I was wondering if there's anything I can do to improve it before export, e.g. if I lip-sync it in iClone, would that make any difference to how well SALSA performs? My instinct is that it won't, and that at best lip-syncing in iClone would result in exported animation clips that SALSA won't work with.


Ahh, I see what you mean. I really don't know. Maybe one of the more knowledgeable guys will see this and help you out. Too bad the results are not very good; it looked like a great solution for working within Unity.

outerringz
Posted 4 Years Ago
New Member
Hi,

This is Michael from Crazy Minnow Studio (SALSA LipSync Suite). Our contact at Reallusion dropped us a line to let us know there was some SALSA discussion going on here. I'll just clarify a few things, and if anyone has additional questions you can inquire on our Unity forum. SALSA is a real-time system, meaning it can analyze audio in real time to perform lip-sync. It doesn't matter whether that audio is a prerecorded clip, a live microphone stream, real-time text-to-speech, or VoIP data in a multiplayer game chat. There are many parameters available to adjust and tune your lip-sync results; we've got thorough documentation on the topic at the link below.

https://crazyminnowstudio.com/docs/salsa-lip-sync/modules/salsa/using/

Our Unity forum:
https://forum.unity.com/threads/salsa-lipsync-suite-lip-sync-emote-head-eye-and-eyelid-control-system.242135/page-36

Michael
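
To illustrate the "live microphone stream" case mentioned above on the Unity side: routing the default mic into an AudioSource uses only the built-in Microphone API. The sketch below is mine, and the actual hookup to the lip-sync component is left to the SALSA documentation.

```csharp
using UnityEngine;

// Streams the default microphone into an AudioSource so a real-time
// lip-sync analyzer can work from live speech.
public class MicFeed : MonoBehaviour
{
    [SerializeField] AudioSource voiceSource;

    void Start()
    {
        if (Microphone.devices.Length == 0)
        {
            Debug.LogWarning("No microphone found.");
            return;
        }
        // 10-second looping buffer at 44.1 kHz from the default device.
        voiceSource.clip = Microphone.Start(null, true, 10, 44100);
        voiceSource.loop = true;
        // Block until the mic actually starts delivering samples.
        while (Microphone.GetPosition(null) <= 0) { }
        voiceSource.Play();
    }
}
```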



