Hi Mark (and all),
ok, as per your request, I will go into ALL the details... it's going to be a long post.
Hope not too boring and trivial...
Here we go: "just" 10 steps...
Step 1: composing and playing the music part
It took more than a while to create the guitar arrangement & learn to play it...
The most difficult part was to synchronize the percussion effects done with the right hand with the notes played by the left hand.
One part is quite tricky because the left-hand pinkie has to pluck the open E string while the right hand hits the guitar body.
Step 2: Shooting the video
Better not to count how many attempts I usually make before something good comes out...
... but I got some relief when I read that even the pros cannot make it on the first shot
For the green screen, I bought something online, nothing super-pro, cheap enough, but still OK.
I sat about 1.5 m from the green screen and I used 5 lights:
- 2 behind me, pointing 45 degrees towards the green screen in order to make the color of the screen uniform
- 1 behind me, pointing at my back, in order to get some separation and depth from the green screen
- 2 in front of me, one pointing at my face, the other a bit lower, to kill some shadows
Note: in reality the lights should be set according to the final 3D scene... but I recorded the video 1.5 years ago without even knowing whether or how I could make it in iClone...
The camera was set in vertical position (so that I look like I am lying horizontally when played back on a monitor) in order to maximize the area that I cover.
The audio was recorded on a Zoom H4 digital recorder, one channel fed by a Rode microphone, and the other by the guitar pickup.
Step 3: Pre-processing the audio and syncing with the video
I mixed and blended the 2 audio channels with Sony Soundforge, adding some equalization and reverb.
Then I synced the result with the video in the NLE, which is Sony Vegas Movie Studio 13 Platinum Suite.
The sync was pretty much done manually, by listening to the camera and Zoom tracks together and shifting the tracks until they sounded the same.
Note: the internal time bases of my camera and the Zoom drift slightly relative to each other, so usually I need to have the audio a bit ahead of the video at the beginning, so that it is slightly late at the end and perfectly synced in the middle.
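To make that trick concrete, here is a tiny Python sketch with made-up numbers (the 250 ppm clock mismatch and the 5-minute take are assumptions for illustration, not measurements of my gear):

```python
# "Split the drift" trick (hypothetical numbers): if the Zoom's clock runs
# slightly faster than the camera's, the audio drifts ahead over the clip.
# Leading the audio by half the total drift makes it early at the start,
# late at the end, and exact in the middle.

clip_len_s = 300.0      # 5-minute take (assumed)
drift_ppm = 250         # assumed clock mismatch: 250 parts per million

total_drift_s = clip_len_s * drift_ppm / 1e6   # 0.075 s over the whole take
offset_s = total_drift_s / 2                   # lead the audio by half of it

def audio_error(t):
    """Remaining sync error (s) at time t after applying the half-drift offset."""
    return t * drift_ppm / 1e6 - offset_s

print(audio_error(0.0))              # -0.0375 (audio slightly ahead)
print(audio_error(clip_len_s / 2))   # 0.0 (perfect in the middle)
print(audio_error(clip_len_s))       # +0.0375 (audio slightly late)
```

With real footage you would estimate the total drift by aligning a transient near the start and measuring the offset of one near the end.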
Another problem is that my camera is quite old and shoots in interlaced mode.
Therefore, I needed to do one video pre-processing step to get it de-interlaced.
I used HandBrake with the following settings.
Picture:
Width = 1920
Height = 1080
Anamorphic = None
Modulus = 2
Cropping = Automatic
Filters:
Detelecine = Off
Deinterlace = Deinterlace
Method = Slower
Denoise = Off
Deblock = Off
Video:
Codec = H.264 (x264)
Framerate = 30fps
X264 Preset = Placebo
Fast Decode = unchecked
H.264 Profile = Main
H.264 Level = 4.0
Quality = Constant Quality
Another problem was that the original video was in PAL, while iClone has an NTSC-like internal engine: in order to avoid trouble with the sync, I used HandBrake also to resample it to 30 fps.
HandBrake works quite well for this purpose!!
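For intuition, here is the frame arithmetic behind that resampling, sketched in Python (the 5-minute duration is an assumption, and HandBrake's actual frame selection is more sophisticated than this nearest-frame mapping):

```python
# Rough arithmetic behind 25 fps (PAL) -> 30 fps resampling: the duration
# stays the same while the frame count grows by 30/25 = 1.2x, so some
# source frames must be shown twice.

src_fps, dst_fps = 25.0, 30.0
duration_s = 300.0                       # assumed 5-minute clip

src_frames = int(duration_s * src_fps)   # 7500 frames in the PAL source
dst_frames = int(duration_s * dst_fps)   # 9000 frames after resampling

def source_frame_for(n):
    """Source frame nearest in time to output frame n."""
    return round(n * src_fps / dst_fps)

print(src_frames, dst_frames)    # 7500 9000
print(source_frame_for(6))       # 5: six output frames cover five source frames
```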
Once the audio and video were synced, I was finally ready for the chroma keying...
Step 4: Chroma keying
This was the very first source of problems, because I shot all the videos sitting on a horrible white IKEA stool, which then had to be cut out. Not easy, because my legs moved over the white legs of the stool...
And I have already recorded more than 60 other videos with this stool...
For the future, I bought a beautiful new green chair, which I will be able to cut out very easily.
To apply the chroma keying, I had different filters/solutions available:
- Sony Chroma key (standard effect of Vegas Movie studio)
- Boris (in bundle with Vegas Movie Studio 13)
- NewBlueFx Chroma Key pro
- HitFilm Express Chroma Key (last year HitFilm Express was offered for free!!)
I experimented a lot, really a lot... trying all the possible solutions.
For this video, I ended up using the Boris suite, which is quite comprehensive and helped me to cut out the stool.
In fact, to delete this f..ing stool, I had to apply multiple different chroma key filters over parts of the footage.
And here the Boris suite came in really handy, because it lets you choose where to apply the chroma key, so that some parts of the footage are keyed while others are not.
By using some light grey keying on dedicated areas I could cut out the stool.
The Boris suite also has some nice filters to rebuild the matte: very useful when cutting out the stool caused some damage in the matte.
I also used some static masks, wherever I knew I would not move into that area while playing: this helps because the green color only has to be uniform in a smaller region.
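The idea of region-limited keying can be sketched in a few lines of Python (this is just an illustration of the concept on a toy 2x2 image, not what Boris actually does internally):

```python
# Toy sketch of region-limited chroma keying: a pixel is keyed out only
# where (a) it falls inside a user-drawn region and (b) its color is close
# enough to the key color. Areas outside the region are never punched out,
# which is how body parts over the stool legs can be protected.

def keyed(pixel, key=(0, 255, 0), tol=120):
    """True if pixel is within `tol` (Manhattan distance) of the key color."""
    return sum(abs(a - b) for a, b in zip(pixel, key)) <= tol

def apply_regional_key(image, region):
    """Return an opacity matte: 0 = transparent, 255 = opaque."""
    h, w = len(image), len(image[0])
    return [[0 if region(x, y) and keyed(image[y][x]) else 255
             for x in range(w)] for y in range(h)]

# 2x2 test image: green screen on the left column, subject on the right
img = [[(10, 245, 20), (180, 120, 90)],
       [(15, 250, 25), (170, 110, 80)]]

matte = apply_regional_key(img, region=lambda x, y: x == 0)  # key left column only
print(matte)   # [[0, 255], [0, 255]]
```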
Some more chroma-keying tricks:
1. use the Chroma Blur (Sony VMS has it as a standard filter), in order to avoid the typical zig-zag pixel pattern on the edges (e.g. the edges of the body of the guitar).
2. apply a bit of Sharpness to the video in order to get better definition (I used the NewBlueFX Sharpen filter from the Video Essentials pack)
3. Retouch the color of the subject a bit in order to give a more "alive" appearance (I used the NewBlueFX Color Fixer Plus filter from the Video Essentials pack)
Once the chroma key was OK, I rendered the video twice:
- one was the subject (me) plus the transparent background. This is going to be applied as Diffuse in iClone
- another was a black-white matte, which is to be applied as Opacity in iClone
To create the B&W matte I just used the Show Matte options of Boris filters.
If the B&W matte had been inverted with respect to what iClone needs, the solution would have been as simple as applying the Sony Invert filter available within VMS 13.
Both the Diffuse and Opacity videos were rendered as AVI with these settings:
HD 1080-50i, using Sony YUV codec. OpenDML compatible.
Video: 29.970 fps, 1920x1080 Progressive, RGB
Pixel Aspect Ratio: 1.000
Note 1: I did not render any audio because I do not need it in iClone (I can get an idea of the music from the movement of the fingers)!
Note 2: I rendered at the NTSC frame rate because IC6's internal engine runs at 60 fps. This guarantees no sync issues (not sure if it is still OK when working in PAL, but there is no reason for me to do that... if I want PAL, I can do it in the final rendering in VMS, where I assemble the iClone camera videos)
Note 3: I rendered and imported into iClone the whole music video.
The key is to re-generate from iClone a 3D CGA video which has the same length and frames-per-second as the source video given to iClone!
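A quick sanity check of this length/fps matching, in Python (assuming a 5-minute song; the numbers are illustrative):

```python
# iClone's internal timeline runs at 60 fps, while my source and rendered
# videos are 30 fps, so every source frame spans exactly 2 timeline frames
# and the durations stay identical.

source_fps, timeline_fps = 30, 60
source_frames = 9000                     # assumed 5-minute clip at 30 fps

timeline_frames = source_frames * timeline_fps // source_fps
print(timeline_frames)                   # 18000 timeline frames

# durations agree, so the audio laid back in the NLE stays in sync
assert source_frames / source_fps == timeline_frames / timeline_fps  # 300 s
```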
Note 4: the generated files are massive: 69 GB each (nice number...)!
I need to figure out how to back up all of these files once I have done many other videos... luckily hard-disk prices are dropping faster than my animation speed & capability are growing, so I have hope
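Out of curiosity, the 69 GB figure checks out with some back-of-the-envelope arithmetic (assuming 4 uncompressed bytes per pixel, which may not be exactly what the Sony YUV codec stores):

```python
# Back-of-the-envelope check of the ~69 GB per rendered file, assuming
# uncompressed RGBA frames (4 bytes/pixel; the real codec layout may differ).

width, height = 1920, 1080
bytes_per_pixel = 4            # RGB + alpha, assumed
fps, duration_s = 30, 300      # assumed 5-minute song

frames = fps * duration_s                              # 9000 frames
size_bytes = width * height * bytes_per_pixel * frames
size_gib = size_bytes / 1024**3

print(round(size_gib, 1))      # ~69.5 GiB: right in the ballpark
```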
Step 5: Finally into iClone!!
Here I did exactly the opposite of what is suggested by many people around here: I built a huge set!!
Why? Because for this type of video I do not have to care about different scenes: mostly it is "just" a job of different camera angles around one single scene.
Also, I wanted to have this fade-out effect at the end of the video and a panorama at the beginning and I wanted it to be consistent with the rest of the close up scenes.
So, I thought out the whole set first and played with the cameras later.
To build the set, I did not pursue any specific method: just following my imagination and trusting my eyes
To import the footage, I followed the suggestion that I got time ago here.
I think it was you, Mark, who explained to me how to import the videos preserving the proportions
In summary, for the benefit of others: I CTRL-dragged the Diffuse videos from Windows Explorer into iClone and then applied the Opacity channel.
This operation perfectly preserves the proportions.
Then I just rotated the plane where the video is applied by 90 degrees (remember that I shot the video with the camera in vertical position)
I added a bit of self-illumination on me, in order to get some more brightness and definition.
For the project organization, I basically used just 2 projects: the first one is Heaven and the second one is Hell, where I manually replaced all the Heaven trees with a winter tree, at exactly the same position (quite a job...).
Note: both projects contain the full video of the song, so they have the same length (18000 frames), but the Heaven project has animations from frame 1 till frame 12520 (counted at 60fps), while the Hell project has animations and characters starting from frame 11210 (counted at 60fps) till frame 18000.
390 frames is the overlapping space of the projects when applied in the NLE.
I used N different cameras, placed at frame numbers that are multiples of 30 (guess why?)
Then I set the Camera Switch track, so that I could play the whole project and get a feeling for how the camera would move.
I placed a Flag at the start of each camera, naming them Camera 1, 2, etc.
This helped me move around quickly!
Finally I decided on a fixed transition length of 200 frames for all cameras, from camera N to the next camera N+1.
Very important: I wrote down in Notepad the start and end frame number of each camera!
I did so many rendering attempts... and then needed to go back, modify something, render again... having the camera start/end frames written down helps...
It is also useful for the final assembling and audio-video sync in the NLE... more on this later on.
Step 6: The animation
Really a pain in the xxx
Surely I need to improve and master more techniques to become faster and more effective.
The most difficult part was the connection of the animation clips for Adam & Eva.
I can't really say I learned how to do it in a smooth/rational way... I used a lot of do & redo & redo & redo attempts.
Next time I need a more "scientific" approach
Anyway, one technique that I learned, which is useful for fixing feet and character sliding, is to work out the character animations using the Bottom View of the Preview Camera.
Once the character is shown from the bottom, it is much easier to align the feet or the character to the right position, and the result is quite good when you revert to a normal view.
I wonder why the Bottom View camera has no shortcut ("B" would be the ideal one): that way the whole editing would be much faster!
For the butterflies and dragonfly I used several paths.
For the spider falling down, I created a dummy static plane acting like a slide, and another one as the ground, then attached the spider to a dummy ball and let it fall down, rotating along the planes.
The rotating axe hitting the zombie was just simple physics applied with an initial speed and rotation.
Eva's speech follows an invented language: this time the phoneme syncing was quite easy
I basically used no lights for the Heaven and the iClone Horror scene light preset for the Hell (which is green, but later on I will explain how this turned out to be blue).
Finally, I used some Reverse animation clips: the lion standing up, the zombies coming out from the ground, etc.
I attached all the props to one dummy, in order to be able to make them visible or invisible with one click: I wish I could do the same for the trees, instead of selecting all of them (which still works fine but is not as fast as one single click).
Step 7: The rendering
I rendered as AVI Uncompressed, at 30fps, maximum resolution settings.
For this video, I did not use Indigo Plug-In.
Maybe in the future I will experiment with rendering everything again with Indigo and check the difference, but right now I can't block the PC for so long...
Each camera used in the 2 projects generated one file, so, for example: camera 1 generated a file starting at frame 1 and ending 200 frames after the start of camera 2 (remember I left a transition time of 200 frames between cameras).
I used 12 cameras, so I created 12 files... again a lot of GB that I now do not know where to back up
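For anyone curious, the bookkeeping I kept in Notepad boils down to something like this Python sketch (the switch frames here are made up, not my real ones):

```python
# Per-camera render files: camera N's file starts at its switch frame and
# ends 200 frames after camera N+1 starts, giving the NLE a 200-frame
# window to cross-fade between adjacent camera renders.

TRANSITION = 200
camera_starts = {1: 1, 2: 1470, 3: 3300, 4: 5010}   # hypothetical switch frames

def render_span(n):
    """(first_frame, last_frame) of the file rendered for camera n."""
    start = camera_starts[n]
    nxt = camera_starts.get(n + 1)
    end = nxt + TRANSITION if nxt else None          # last camera runs to the end
    return start, end

for n in camera_starts:
    print(n, render_span(n))
# 1 (1, 1670), 2 (1470, 3500), 3 (3300, 5210), 4 (5010, None)
```

Placing each file at its `start` frame in the NLE reproduces the Camera Switch timing exactly, which is why the audio stays in sync.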
Step 8: Assembling the rendered videos in the NLE
By knowing the exact start of each camera (that's why I wrote them down in Notepad) I could easily sync them.
The NLE is still Sony VMS 13.
The original audio (remember that the iClone-rendered videos had no sound) was placed at frame 1, and the camera 1 video also at frame 1.
Then I placed the other cameras at the exact frame number they were supposed to be, as written in the Notepad notes.
I used some VMS Markers to help me visualize where the camera start frames are.
Result: the video and audio were perfectly in sync!!
Thumbs up for Reallusion: I was not sure this would work out
Step 9: The post production
To create a more believable integration between me and the 3D scene, I used some NewBlueFX and HitFilm effects, adjusting the whole video's color so that it turned into something more uniform.
I like the HitFilm Temperature filter (bundled with VMS 13): I used it to turn the Hell into this cold, blue color, while still preserving the original spectral contrast given by the iClone Horror preset.
Once the main video was done, I added the leading and trailing parts: just a simple project in iClone where I used some of the Motion Montage pack assets and animations.
Note: by using the Ripple feature I could easily move all the tracks and audio in sync, creating the necessary room for the leading part.
For the titles I used NewBlueFX Titler Pro 3, which I bought some time ago after an offer from NewBlueFX.
Step 10: Publishing
I used YouTube and then embedded the video here (just copy-paste the YouTube embed code, before, after, or within the post text).
When I have time I want to experiment with the 3D Stereoscopic version, but I need to wait for Reallusion to fix the issue with the number of frames when the 3D Stereoscopic option is turned on.
That's it.
Hope it was interesting and useful.
If you survived this long post and something is not clear, or you want to ask some questions, just feel free to ask
Cheers
Roberto
My PC:
OS: Windows 10 Pro English 64-bit / CPU: Intel i7-9700 3.6GHz / MB: ASUS ROG Strix Z390 RAM: 32GB DDR4 2.6GHz / HD: 2TB+3TB / SSD: 2x512GB Samsung 860 EVO + 1x2TB Samsung
VB: Palit GTX2080 TI GamingPro 11GB / AB: embedded in the MB and VB (audio from the MOTU M4 I/F) / DirectX: 12