More info is needed. Exactly what problems are you having? Reallusion definitely has its own ideas about incorporating ARKit into its characters.
For starters, they randomly decided to internally rename the standard ARKit expressions, which makes retargeting an exported character in a 3rd party program a big pain in the ass. The CC4 Extended facial profile also has lots of additional uniquely-named expressions beyond the standard 52 ARKit expressions.
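If you'd rather fix the names on the exported character than inside CC4, a quick script in your DCC can do the bulk of the renaming. Here's a minimal Blender (bpy) sketch; the CC-side names in the mapping are just examples of what an export might contain, so check what your FBX actually calls each shape and fill in the real pairs:

```python
# Blender (bpy) sketch: batch-rename CC4-exported shape keys to standard ARKit
# names so a retargeter can find them. The left-hand names are placeholders --
# inspect your own export and adjust the mapping.
import bpy

RENAME_MAP = {
    "Eye_Blink_L": "eyeBlinkLeft",      # example entries only
    "Eye_Blink_R": "eyeBlinkRight",
    "Mouth_Smile_L": "mouthSmileLeft",
    "Mouth_Smile_R": "mouthSmileRight",
    "Brow_Raise_Inner_L": "browInnerUp",
}

obj = bpy.context.active_object  # select the CC body mesh first
if obj and obj.data.shape_keys:
    for key in obj.data.shape_keys.key_blocks:
        if key.name in RENAME_MAP:
            key.name = RENAME_MAP[key.name]
```

Run it once per mesh that carries blendshapes (body, teeth, tongue, etc.) and the ARKit-named targets should line up for most retargeters.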
I wish Reallusion would provide a facial profile template that ONLY has the standard ARKit expressions, correctly named, but you can build your own 'Custom' set in the Facial Profile Editor > Edit Expressions in about half an hour and save it as a custom profile. Start by loading the CC3+ Traditional facial profile, which loads the ARKit expressions into a 'Custom' set within the Facial Profile Editor (you'll still have to rename them individually). As an extra step, deactivate all the Traditional expressions unless you want them later.
The *biggest* problem is that for the 'Eyes Look Up/Down/In/Out' and 'Jaw Open/Mouth Open' expressions, CC4 uses a combination of mesh deformation (i.e. a blendshape) AND bone/joint transformation.
For example, in CC4 & iClone, the 'Eye Look Up' expression slider works as expected: the eyeball rotates upward and the eyelid/eye muscle mesh deforms accordingly. However, when you export the character to use in another 3D program, the eyeball motion is NOT included in the blendshape action, because CC4 uses joint rotation for the eyeball. The eyelid facial geometry will 'look up' but the eyeball geometry remains fixed.
So obviously, when you use the iPhone for face capture, it doesn't know anything about bones or joints; it only cares about facial geometry. You usually can't retarget eyeball transforms to an eye joint because that data isn't part of the standard blendshape output. (It depends on which ARKit app you use, but most of them output eye tracking as the eyeLook blendshape weights rather than as rotational data.)
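One workaround (not a Reallusion solution, just something you can do in your DCC) is to read those eyeLook* weights off the capture and drive the eye joints yourself. A rough Blender (bpy) sketch: the CC_Base_* object and bone names are assumptions based on typical CC rigs, the ARKit-named shape keys are assumed to exist (e.g. after the rename step above), and the ±30° range and axis choices are guesses you'd tune per rig:

```python
# Blender (bpy) sketch: convert ARKit eyeLook* weights (0..1) into eye-bone
# rotation. Bone/object names, rotation range, and axes are assumptions --
# check your rig and flip signs as needed.
import bpy
from math import radians

MAX_PITCH = 30.0  # assumed max up/down eye rotation, degrees
MAX_YAW = 30.0    # assumed max in/out eye rotation, degrees

def drive_eye(armature, bone_name, look_up, look_down, look_in, look_out, side):
    """Map four ARKit eyeLook weights onto a single eye bone."""
    pitch = (look_up - look_down) * MAX_PITCH   # up = positive
    yaw = (look_in - look_out) * MAX_YAW        # toward the nose = positive
    if side == 'R':
        yaw = -yaw                              # mirror for the right eye
    pbone = armature.pose.bones[bone_name]
    pbone.rotation_mode = 'XYZ'
    pbone.rotation_euler = (radians(pitch), 0.0, radians(yaw))

arm = bpy.data.objects["CC_Base_Armature"]   # assumed armature object name
mesh = bpy.data.objects["CC_Base_Body"]      # assumed mesh with ARKit shape keys
keys = mesh.data.shape_keys.key_blocks

drive_eye(arm, "CC_Base_L_Eye",
          keys["eyeLookUpLeft"].value, keys["eyeLookDownLeft"].value,
          keys["eyeLookInLeft"].value, keys["eyeLookOutLeft"].value, 'L')
drive_eye(arm, "CC_Base_R_Eye",
          keys["eyeLookUpRight"].value, keys["eyeLookDownRight"].value,
          keys["eyeLookInRight"].value, keys["eyeLookOutRight"].value, 'R')
```

You'd run something like this per frame (or wire the same math up as bone drivers) so the eyeballs follow the captured eyeLook weights while the eyelid blendshapes do their thing.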
Reallusion hasn't offered any solution for this, which is profoundly weird, because eye tracking is the most important aspect of face capture besides mouth & lip synch.
Fortunately there is a solution for the jaw/mouth expression, which also uses a combo of mesh and bone. When you export the character as .fbx, open the options panel (click the gear icon at the top of the Export FBX window) and you'll see a checkbox "Mouth Open As Morph". That creates a new blendshape called "Mouth_Merged" or something like that, which you can set as the target for ARKit's "jawOpen" shape.
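Wiring that up is then just a matter of copying the captured jawOpen weight onto the merged morph. A tiny Blender (bpy) sketch, assuming the same mesh name as above; "Mouth_Merged" is a guess at the exported shape-key name, so check your FBX:

```python
# Blender (bpy) sketch: drive the "Mouth Open As Morph" blendshape from ARKit's
# jawOpen weight. Object and shape-key names are assumptions -- verify them
# against your own export.
import bpy

mesh = bpy.data.objects["CC_Base_Body"]      # assumed mesh object name
keys = mesh.data.shape_keys.key_blocks

def apply_jaw_open(weight):
    """Copy the incoming ARKit jawOpen weight (0..1) onto the merged morph."""
    keys["Mouth_Merged"].value = weight

apply_jaw_open(0.6)  # e.g. one captured frame's jawOpen value
```

If your retargeter matches purely by name, simply renaming that merged shape key to "jawOpen" can work too, provided the export doesn't already contain a shape with that name.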
Without knowing more about your setup, I'm not sure how much of this addresses your specific problems. I've been trying to figure out a good pipeline for ages, so if anyone has tips or workarounds, I'd appreciate them.