Hi
I too am migrating to a new pipeline where iClone will be the primary character motion-building tool and Blender 2.81 EEVEE will be the final rendering environment.
I will be following this thread with interest.
Sadly I have no solution to your specific dilemma, as I am using Daz figures with retargeted iClone motion and exporting them from Daz Studio to Blender as pre-animated FBX files.
(Sample test renders)
While there are some lingering issues with the varying quality of the lip-sync, I am certain they can be addressed on the Daz Studio side of the pipeline.
However, even if you (hopefully) get your Blender-native armatures manually mapped in 3DXchange to accept iMotion data, the next challenge will be getting that iClone-animated figure (or its motion data) back into Blender for rendering in some repeatable fashion.
My iClone-animated Daz characters export to Blender fine.
However, I am desperately seeking a way to apply new motion data to them in Blender without having to completely re-export the same character from Daz Studio as FBX and redo all of my BSDF shaders each time.
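One workaround I am experimenting with (a sketch only, not a finished solution): import the new FBX with the updated animation, then move its action onto the character that is already shaded in the scene, so the original mesh and its BSDF shaders are never touched. The helper name `transfer_action` is my own, and it assumes both armatures share identical bone names since they come from the same Daz rig; inside Blender the arguments would be `bpy.data.objects[...]`, but the logic only relies on Blender's `animation_data` interface:

```python
def transfer_action(src, dst):
    """Copy the animation action from a freshly imported armature (src)
    onto the existing, already-shaded armature (dst).

    Assumes matching bone names. Works on any object exposing Blender's
    animation_data / animation_data_create API (bpy.types.Object)."""
    if src.animation_data is None or src.animation_data.action is None:
        raise ValueError("source armature carries no action")
    if dst.animation_data is None:
        dst.animation_data_create()  # in Blender, creates the AnimData block
    dst.animation_data.action = src.animation_data.action
    return dst.animation_data.action
```

Inside Blender you would call something like `transfer_action(bpy.data.objects["Armature.001"], bpy.data.objects["Armature"])` and then delete the imported duplicate, keeping only its action.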
Blender has a (highly undesirable) manual retargeting method for body motion, but we really need a way to retarget externally created bone- and blendshape-based animation in Blender, for both body and facial animation.
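For what it's worth, the first step of any such retargeter, pairing source bone names with target bone names, can at least be scripted in plain Python. A minimal sketch, assuming (my assumption; check your own export) the iClone/Character Creator side uses a `CC_Base_` prefix while the Blender rig uses plain names:

```python
def build_bone_map(source_bones, target_bones, prefix="CC_Base_"):
    """Pair source bone names with target bone names by stripping a known
    prefix and comparing case-insensitively. Unmatched bones are omitted
    so the caller can map them by hand."""
    targets = {name.lower(): name for name in target_bones}
    mapping = {}
    for src in source_bones:
        stripped = src[len(prefix):] if src.startswith(prefix) else src
        match = targets.get(stripped.lower())
        if match is not None:
            mapping[src] = match
    return mapping
```

Anything this misses (twist bones, extra face bones) would still need manual mapping, which is exactly the tedium a proper retargeting tool should remove.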
Also, before anyone suggests point-cache methods like Alembic or OBJ/MDD...
My early tests with Alembic files from iClone have resulted in Blender crashing immediately once the files exceed a certain size and frame count.
And while OBJ/MDD from Daz Studio technically works fine in Blender 2.81, Blender sadly loads all of the MDD data directly into the .blend file instead of streaming it externally as LightWave 3D does. This results in an unacceptably huge slowdown in Blender's performance when a massive MDD animation data file is applied to a character mesh.
I hope we can get the retargeting issue resolved in Blender. Please report any breakthroughs you have with your pipeline.