The cheapest way to speed up Iray renders.


https://forum.reallusion.com/Topic396751.aspx

By jarretttowe - 6 Years Ago
As new people come to the forum, I think this bears repeating. There is a way to speed up your renders considerably, and do it cheaply.
1. Render at half size as lossless PNG files.
2. Buy Gigapixel AI for $99.
3. Double the size of the renders with the Gigapixel batch function.
I have used this program extensively, and for animations it is great. Of course the result is not identical to a native full-resolution render, but it comes in at about a quarter of the render time.
I have also found that simply doubling my model textures works extremely well.
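(Since Gigapixel AI is a GUI batch tool there is no API to call, but for anyone who scripts their pipeline, the batch step looks roughly like the sketch below. This is only a minimal Python/Pillow stand-in with made-up folder names; the plain Lanczos resize is a placeholder for the AI upscaler, not the real thing.)

```python
from pathlib import Path

from PIL import Image  # pip install pillow

SRC = Path("renders_half")   # hypothetical folder of half-size lossless PNGs
DST = Path("renders_full")   # hypothetical output folder
DST.mkdir(exist_ok=True)

for png in sorted(SRC.glob("*.png")):
    img = Image.open(png)
    # Double the width and height. Gigapixel AI would do this with its AI model;
    # the plain Lanczos resize here is purely a placeholder.
    up = img.resize((img.width * 2, img.height * 2), Image.LANCZOS)
    up.save(DST / png.name)
```

The speed-up itself is simple pixel math: half the width and half the height means a quarter of the pixels, which is roughly where the quarter-of-the-render-time figure comes from.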
By 3d guy1 - 6 Years Ago
Hi,

I'm confused. How does this do video?
By justaviking - 6 Years Ago
A lot of people render to a sequence of PNG files, and then use a video editing application (NLE = Non-linear editor) to make a video from the PNG files.  That gives you a lot of control over many aspects, including more "video conversion/compression" options.

So you'd render a "half-size" set of PNG files, use that magical tool to up-size them to full resolution, and then put the full-sized images into your NLE.
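(As a concrete example of that last assembly step, for anyone who would rather script it than use an NLE: assuming ffmpeg is installed and the upscaled frames are numbered frame_0001.png, frame_0002.png, and so on, a small Python wrapper like this would do it. The folder name, frame rate, and quality settings are just placeholder values.)

```python
import subprocess

# Turn the upscaled PNG sequence into an H.264 MP4 with ffmpeg.
subprocess.run([
    "ffmpeg",
    "-framerate", "30",                   # project frame rate
    "-i", "renders_full/frame_%04d.png",  # numbered PNG sequence
    "-c:v", "libx264",
    "-pix_fmt", "yuv420p",                # widest player compatibility
    "-crf", "18",                         # visually near-lossless
    "output.mp4",
], check=True)
```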
By animagic - 6 Years Ago
I'm reading that processing an image may take quite some time, so it will be interesting to compare rendering at a higher resolution with rendering at a lower resolution and using Gigapixel AI.

The approach is interesting though. I have tried other applications that claim the same thing, and I have never been impressed.
By jarretttowe - 6 Years Ago
Free demo. Easy to test!
By animagic - 6 Years Ago
jarretttowe (12/7/2018)
Free demo. Easy to test!

I did download the demo, and I am impressed by the quality. It is also quite fast, so it would beat rendering at a large size. Thanks for the tip!
By illusionLAB - 6 Years Ago
If the ultimate goal is to do 'low sample' Iray renders and then upscale with this plugin to get HD video, it's worth pointing out that any sort of "smart" process, like AI denoising, approaches every frame as a new frame - that is, it's unaware of what it did to achieve the 'denoise' or 'upscale' for the previous frame.  When you process a frame sequence with a series of smart filters, you usually also create an entirely new fault which only becomes apparent when the frames are played as a video.  Smart filters like "NeatVideo" (best denoiser!) have a comprehensive "temporal" section that makes the plugin aware of the previous and next frames, to minimize the variance you get from processing single frames.  The first 'video' render I did with Blender's Cycles and its denoiser looked great, but had a 'fixed pattern' noise that made the render look like you're watching it through a screen door.  Fortunately, they do have a setting to 'randomize' the noise pattern (a side effect of the denoiser... go figure!), making it look more like film grain or video noise... which is less annoying than a fixed pattern and also something that NeatVideo can deal with quite well.
By animagic - 6 Years Ago
I have NeatVideo also, so I will see how that works in comparison. Thanks for pointing that out.
By animagic - 6 Years Ago
I found that PNG images in Gigapixel are saved as 48-bit PNG, which makes them huge. I did not find a way to save them as 24-bit (8 bits per channel). I can convert afterwards, but it is an extra step.
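(If it helps, that conversion can at least be batched. Below is a minimal Python/Pillow sketch with a made-up folder name: Pillow loads the 16-bit channels and, saved in "RGB" mode, writes standard 8-bit-per-channel PNGs. ImageMagick's -depth 8 option would do the same job.)

```python
from pathlib import Path

from PIL import Image  # pip install pillow

SRC = Path("gigapixel_output")  # hypothetical folder of 48-bit PNGs
DST = Path("gigapixel_24bit")   # converted 24-bit copies go here
DST.mkdir(exist_ok=True)

for png in SRC.glob("*.png"):
    img = Image.open(png).convert("RGB")  # force 8 bits per channel
    img.save(DST / png.name)
```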
By justaviking - 6 Years Ago
illusionLAB (12/8/2018)
...approaches every frame as new frame - that is, it's unaware of what it did to achieve the 'denoise' or 'upscale' for the previous frame.  When you process a frame sequence with a series of smart filters you also usually create an entirely new fault which only becomes apparent when played as a video.


I have shared the same observation here a few times over the past few months.

Several people have assumed they'd be able to render with fewer Iray "iterations" when they make a video.  I strongly disagree, for the reason described in the quote above.  Noise and grain that might not be distracting in a still image will catch your eye when it changes (flickers) from frame-to-frame.  That became clear to me way back when the Indigo plug-in became available.

So if you are doing 60-second renders in CC3, you might want to prepare for 90-second renders (per frame) in iClone.