just my 2 cents
Many old iCloners adopted iClone for its extremely short learning curve. Period.
This is the only way for a one-man band to make a film that tells a story.
Unreal is like all other professional software. Its learning curve is pharaonic.
It is unsuitable for carrying out a decent project alone in a reasonable timeframe. It is also unsuitable for making films.
It is a video game editor to which modules have been added to generate cinematics inside video games.
So far, the only decent film productions I've seen have been made by video game technicians who were asked to produce films with a linear timeline.
From what I've seen of UDK 5, the biggest benefit is that optimizing the created objects will no longer be necessary.
The originals can be used directly, and the time savings in production will be tremendous. It certainly won't be easier to use, though. The new particle generator, Niagara, is proof of that: certainly more powerful, but so much more complex to use.
I've been using my free time for several months to learn UDK, and I still haven't managed to get a decent version of what I want.
The tutorials are exhausting, because you never find content suited to pure film production. It's always mixed in with video game features, and you have to sort out what you actually need.
The worst part is that I keep running into problems I can't solve. My creations rely heavily on particles, and using them in the Sequencer is hell. I can't find a single tutorial on exposing particle settings, such as color or velocity, for keyframing in the Sequencer.
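For anyone stuck on the same thing, here is a minimal C++ sketch of one possible workaround, not an official Epic or Reallusion workflow: mark actor properties with the Interp specifier so Sequencer can keyframe them, then push those values into user-exposed Niagara parameters every tick. The class name and the parameter names "User.Color" and "User.Velocity" are invented for the example and would have to match whatever user parameters your Niagara system actually defines.

// ParticleDriverActor.h -- hypothetical file/class name, for illustration only.
// Requires "Niagara" in the module's PublicDependencyModuleNames (Build.cs).
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "NiagaraComponent.h"
#include "ParticleDriverActor.generated.h"

UCLASS()
class AParticleDriverActor : public AActor
{
    GENERATED_BODY()

public:
    AParticleDriverActor()
    {
        // The Niagara system asset itself is assigned on this component in the editor.
        Niagara = CreateDefaultSubobject<UNiagaraComponent>(TEXT("Niagara"));
        RootComponent = Niagara;
        PrimaryActorTick.bCanEverTick = true;
    }

    // "Interp" makes these properties keyframeable on the actor's Sequencer track.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Interp, Category = "Particles")
    FLinearColor ParticleColor = FLinearColor::White;

    UPROPERTY(EditAnywhere, BlueprintReadWrite, Interp, Category = "Particles")
    FVector ParticleVelocity = FVector(0.f, 0.f, 100.f);

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        if (Niagara)
        {
            // Push the (possibly Sequencer-animated) values into the Niagara system.
            // These names are examples; they must match the system's user parameters.
            Niagara->SetVariableLinearColor(FName("User.Color"), ParticleColor);
            Niagara->SetVariableVec3(FName("User.Velocity"), ParticleVelocity);
        }
    }

private:
    UPROPERTY(VisibleAnywhere)
    UNiagaraComponent* Niagara;
};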
Just this video, which explains how to get around some UDK stupidity, is proof of that:
The day I see a course on building that scene in the Sequencer, not in game mode, with facial-mocap dialogue, then maybe I'll think about it more seriously...
As for facial mocap, like Charly Rama, I have never managed to get the result I'm looking for. I bought Faceware and eventually gave up on it, and Live Face has a serious data-streaming problem, even over a USB cable. Streaming the data is the worst approach: my iPhone captures my face very well, but iClone misses a lot of the data.
It's the same when I stream from the Axis Neuron software: I get much better results importing the FBX files from Axis into iClone. Live Face should have the same option: the iPhone would save the data, and then we could export the file into iClone.
The new CC3+ is breathtaking.
iClone's renderer is my favorite because it's unique and broadcast quality. If Reallusion put as much effort into the particle generator, making it as easy to use as character creation, I'm convinced we would see some very entertaining productions.
Finally, thank you very much, Bassline, for trying to motivate us to use UDK.