Viewport size determines render quality?


https://forum.reallusion.com/Topic358593.aspx

By Firepro - 7 Years Ago
When doing a still image render with a large Viewport on screen the render takes just a few seconds and comes out poorly.  If I make the Viewport very small the render takes a long time and the results are much better.  Is there an explanation/reason for this? 

Thanks!
(Win10-64bit, i7,32GB,GTX1080)

By Rampa - 7 Years Ago
There is no explanation that I know of, but make the most of it. Higher quality is nice. It'll probably give you stronger DOF as well. :)
By Firepro - 7 Years Ago
It is not really higher quality that one gets; it is the quality one would expect when choosing those render settings.  If you don't make the viewport really small, the hi-res final render settings you set are ignored.  Hard to believe RL has not addressed this.
By 4u2ges - 7 Years Ago
Firepro (3/7/2018)
It is not really higher quality that one gets; it is the quality one would expect when choosing those render settings.  If you don't make the viewport really small, the hi-res final render settings you set are ignored.  Hard to believe RL has not addressed this.


What hi-res final render settings are you referring to? There is only supersampling, which works a lot better with a large viewport render.
DOF and blurring might have better quality with a small viewport, but again, you can tweak them (now that we have a lot of settings to control them) to work better with a large viewport. I have not played with it much; it just needs experimenting.
Not to mention that a small viewport greatly slows down the render process. I only use a small-viewport render on a limited number of frames where I have problems with opacity artifacts (mostly with hair).
Other than that, I hide the taskbar, hit Ctrl+7, open the render window, then detach it so the viewport covers the entire screen, and start the render.

Here is an 8 seconds test video from my current project.
First 4 seconds rendered with a fairly small viewport. Speed is 0.2 fps. Look how "zigzaggy" and flashy the glowing edges are.
Next 4 seconds: the same frames rendered with the largest possible viewport. Speed is 1 fps. Not perfect, but the glowing edges are a whole lot better as the camera moves around.

One more tip. With a long and GI-heavy scene, turn the Auxiliary light on during the render. It does not influence the final render, but it does speed up the initial "scene for suppressing shadow" stage (I still don't know what the hell it does :) )



By GOETZIWOOD STUDIOS - 7 Years Ago
That's an interesting issue you have exposed here.
This definitely needs to be investigated thoroughly, and we need more precise info than just "small" or "large" viewports.
What size is your "hirez final render"?

Is this a progressive issue? Meaning, does the quality change progressively with the viewport size, or is there a viewport-size threshold at which the quality changes abruptly?

What if you render your final images at a size higher than your desktop resolution (for instance, if your desktop is 2560x1440, render at UHD)? In that case, does the quality still change depending on your viewport size?
By 4u2ges - 7 Years Ago
IMO, the answer/explanation for this "phenomenon" should have been given by RL a long time ago, since it was first discovered in iClone 6, so that we would not have to waste time experimenting and guessing. But RL keeps quiet about it.

The fairly small window I was referring to is ~500x300 px. I never fully tested anything smaller than that, the reason being that render time grows geometrically.
Dragging the timeline up with the above project and making the viewport ~150x85 sets the render time to 1 frame per 3 minutes.
I have no patience to render even 120 frames to see the quality outcome (which I am sure would be terrible). I just had to kill the project after 3 frames :)
As for a large viewport, I only rendered at 1920x1080, which matches my current screen resolution. I never tested anything beyond that, as I had no need... up until now...

Thank you for suggesting the higher screen resolution! I just set it to 4K, got stronger glasses (because I could barely see anything), and rendered 2K with the same 120 frames.
Outcome: 1.5 fps (which I do not care that much about). But the quality is now nearly perfect. There is almost no flickering or pixelation on the glowing edges.
It's as if anti-aliasing were set to its max all of a sudden. So I have uploaded a new test video, adding a 2K render on a full 4K viewport.



I still have to render the rest of the video with the 4K viewport to get a full picture of the outcome, as I am afraid I might get unexpected results in some other aspects.

By GOETZIWOOD STUDIOS - 7 Years Ago
Arf, your last sentence makes me think I've been tricked again by English wording.

Don't tell me you are using the word "viewport" to describe your final render resolution/image?
By Rogue Anime - 7 Years Ago
Ahh - I see the conundrum here!  The problem is in the wording. It's BACKWARD!
It's RENDER SIZE DETERMINES "VIEWPORT SIZE". Set your render size, and you will see your "viewport" (or "screen") change to match the proportions you have chosen under Render Size. Does this answer your question?  ~V~
By 4u2ges - 7 Years Ago
No, no... I am still going to render my video at 1920x1080 px, but stretched to the full 3840x2160 computer-screen resolution in the viewport. That is where I get the best quality in my tests.
By GOETZIWOOD STUDIOS - 7 Years Ago
To me, the viewport is the embedded window in which you work on your scene. I thought the issue was that the quality of the rendering depends on the original size of that viewport ^^ I hope not!
Then there is the rendering window showing the final image. Never use the word "viewport" to name the final-image rendering window! It is so confusing :)
But perhaps I still don't get it...
By Kelleytoons - 7 Years Ago
Guy,

No, that's EXACTLY what it means -- apparently there is some kind of bug where the viewport determines things in rendering.  I know this for a fact, because when I have the viewport set to Quick view (which should not affect the render at all) it will crash my system on some renders, whereas with the Custom or High view it does not.  The view quality should NEVER cause a difference in renders, but it apparently does, as does the actual size of the screen.
By Rogue Anime - 7 Years Ago
grabiller (3/10/2018)
To me, the viewport is the embedded window in which you work on your scene. I thought the issue was that the quality of the rendering depends on the original size of that viewport ^^ I hope not!
Then there is the rendering window showing the final image. Never use the word "viewport" to name the final-image rendering window! It is so confusing :)
But perhaps I still don't get it...

@grabiller - You 'get it' - it's just that the word 'viewport' threw everyone off! ~V~
By GOETZIWOOD STUDIOS - 7 Years Ago
Kelleytoons (3/10/2018)
Guy,

No, that's EXACTLY what it means -- apparently there is some kind of bug where the viewport determines things in rendering.  I know this for a fact, because when I have the viewport set to Quick view (which should not affect the render at all) it will crash my system on some renders, whereas with the Custom or High view it does not.  The view quality should NEVER cause a difference in renders, but it apparently does, as does the actual size of the screen.

Whoa, I see. I never noticed that because my viewport is always the same size; I never change it relative to my desktop view.
So yes, this is an awful bug.

What comes to my mind immediately is that perhaps, depending on the size difference between the "viewport" and the final rendering resolution, iClone decides - for memory reasons - whether or not to reuse the current GPU framebuffer for the final rendering.
Full HD is often lower than today's desktop resolutions, so it is possible iClone does not "reset" the framebuffer properly for the final rendering.

On my side, I always render higher than my desktop resolution, so perhaps that's why I've never noticed this issue.
By justaviking - 7 Years Ago
4u2ges (3/10/2018)
IMO, the answer/explanation for this "phenomenon" should have been given by RL a long time ago, since it was first discovered in iClone 6, so that we would not have to waste time experimenting and guessing. But RL keeps quiet about it.


So very true!

I could complain (AGAIN) about the lack of communication with the development staff (even indirectly) when we have some very legitimate, thoroughly-vetted questions like this.  We should not have to figure this out by observation, trial-and-error, and other indirect means.  They have access to the code and the development tools; we do not.

And more than anything else, this makes no sense at all, and should have been fixed a long time ago.



Making your iClone application window small - which results in a smaller viewport - affects rendering speed and various quality aspects such as Depth of Field.  Some of these things have been fixed, or at least improved, but it's still ridiculous that we users should be impacted by something as odd as that.


By 4u2ges - 7 Years Ago
@justaviking
Sadly, I don't think this will ever get fixed, or properly explained. I just found something in the official manual that relates to the issue (it was apparently added for iClone 6.5):
http://manual.reallusion.com/iClone_7/ENU/Pro/Default.htm#iClone_7/Pro_7.0/12_Export/WYSIWYG.htm%3FTocPath%3DRendering%7C_____2

@Kelleytoons
Yes, I quit rendering from Quick mode. It most likely crashes because, before the "final render" begins, IC converts the scene to "High" mode and, at the end of the render, back to "Quick" mode, so that it can use the viewport as a source for building frames.
For a "preview render" it just grabs the current screen with whatever mode you currently have set.



UPDATE: I got much better and faster results rendering 2K in 4K screen-resolution mode if I turn super-sampling OFF. When it's ON, for some reason I get an overwhelming (uncontrollable) amount of glare (although the scale is set to minimum). I suppose there is no need for super-sampling at all in this case, as anti-aliasing takes place internally, with a much better outcome, when the frame is down-scaled from 4K to 2K.

By GOETZIWOOD STUDIOS - 7 Years Ago
4u2ges (3/10/2018)
../.. UPDATE: I got much better and faster results rendering 2K in 4K screen-resolution mode if I turn super-sampling OFF. When it's ON, for some reason I get an overwhelming (uncontrollable) amount of glare (although the scale is set to minimum). I suppose there is no need for super-sampling at all in this case, as anti-aliasing takes place internally, with a much better outcome, when the frame is down-scaled from 4K to 2K. ../..

I'm really confused by your wording ;). You mean you render your image sequences (I presume) at 4K resolution (which is independent of your screen resolution) and then downsize those images to 2K for your final video?

If that's so, then allow me to disagree about super-sampling: you should always use the highest available super-sampling values to get the maximum possible anti-aliasing quality (3x3 is already quite poor). Down-sizing helps, but it is far from being as efficient as multi-sampling. The result should be "better" in any case.

There is no antialiasing taking place "internally". If you don't activate Super-Sampling at render time, there is no antialiasing occurring whatsoever (except for the poor antialiasing effect of downsizing).

If for some reason this introduces "glare", then the problem is elsewhere (HDR) and should be fixed. I would rather have better antialiasing than glare, because I can always recreate glows and glares in post-production, while good antialiasing I can't.
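Since the thread keeps circling around what super-sampling actually does, here is a minimal, self-contained sketch (plain Python/NumPy, not anything from iClone or its API; the flat-disc scene and function names are my own illustration): take an N x N grid of sub-samples inside each pixel and average them, so pixels straddling an edge get fractional coverage instead of a hard 0/1 step.

```python
import numpy as np

def render_circle(width, height, samples_per_axis=1):
    """Render a white disc on black, taking samples_per_axis^2
    sub-samples per pixel and averaging them (super-sampling)."""
    s = samples_per_axis
    img = np.zeros((height, width))
    cx, cy, r = width / 2, height / 2, width / 3
    for y in range(height):
        for x in range(width):
            hits = 0
            for j in range(s):
                for i in range(s):
                    # sub-sample position inside pixel (x, y)
                    sx = x + (i + 0.5) / s
                    sy = y + (j + 0.5) / s
                    if (sx - cx) ** 2 + (sy - cy) ** 2 <= r ** 2:
                        hits += 1
            # pixel value = fraction of sub-samples covered by the disc
            img[y, x] = hits / (s * s)
    return img

aliased = render_circle(32, 32, samples_per_axis=1)  # only pure 0/1 pixels: jaggies
smooth = render_circle(32, 32, samples_per_axis=3)   # "3x3 SS": gray edge pixels
```

The 1-sample image contains only pure black and white, while the 3x3 version has intermediate grays along the disc's edge - which is all "SS 3x3" buys you, at nine times the shading cost per pixel.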

By justaviking - 7 Years Ago
@4u2ges - So it's a feature, not a bug?  Boo!

I can understand some minor differences, especially because of the "real time" aspect of iClone.  But the final render should always be superior, never inferior, to what you see in the real time viewport.

I don't remember all the details, but as people shrank their viewport, rendering times would gradually improve (as an example), until you passed a certain size, and then the render times would start to increase again.  That could happen even if you were on a 1920x1080 screen (meaning the viewport portion of your iClone window would always be smaller than that) and you were rendering out to HD 1920x1080.

In what way is unexpected and unpredictable behavior a benefit to the user?

Somewhat documented or not, I perceive most of this behavior as a BUG.
By 4u2ges - 7 Years Ago
@grabiller
lol, no, that is not what I did. I do not shrink images after the render.
I set my computer screen resolution to 3840x2160. Then in iClone I set the viewport (you can call it a 3D viewer or viewport, it does not really matter) to full screen with Ctrl+7 and set the render export size to 1920x1080. Then I started the render. That is it.
Now, iClone super-sampling is irrelevant in this render setup; it simply has no effect on the final render, whether I turn it ON or OFF.
But at the same time, anti-aliasing is still taking place, and it works a whole lot better than iClone super-sampling. Why? I cannot be sure; I can only speculate.
Possibly I am getting a "spatial anti-aliasing" effect (Indigo super-sampling works that way; I'm not sure about iClone).
Just look at the last 4 seconds of my test video. Super-sampling is OFF there, but it looks a lot better than the 2 previous parts, where I had super-sampling ON.

HDR glare does not work as expected in this setup either, and it has nothing to do with the initial settings in the HDR panel. But I do not really care about that either, as long as I get a perfectly anti-aliased image.


@justaviking
I perceive it as a bug too (regardless of whether RL mentioned it in the manual or not). You are right: the final render should be of the same or better quality than the viewport, regardless of its size. Sigh...
By Dr. Nemesis - 7 Years Ago
This problem is obviously affecting a lot of us, some of us significantly.
Does anyone know if a bug has been filed under iClone 7 in the Feedback Tracker?
I'd love to add my vote to it.
By GOETZIWOOD STUDIOS - 7 Years Ago
4u2ges (3/10/2018)
@grabiller
lol, no, that is not what I did. I do not shrink images after the render.
I set my computer screen resolution to 3840x2160. Then in iClone I set the viewport (you can call it a 3D viewer or viewport, it does not really matter) to full screen with Ctrl+7 and set the render export size to 1920x1080. Then I started the render. That is it. ../..

Lol, ok, so I was right to suspect my understanding of your wording ^^

4u2ges (3/10/2018)
../.. Now, iClone super-sampling is irrelevant in this render setup; it simply has no effect on the final render, whether I turn it ON or OFF. ../..

It can't be. Perhaps you just don't notice it due to the low contrast of your scene and the high resolution.

4u2ges (3/10/2018)
../.. But at the same time, anti-aliasing is still taking place, and it works a whole lot better than iClone super-sampling. Why? I cannot be sure; I can only speculate.
Possibly I am getting a "spatial anti-aliasing" effect (Indigo super-sampling works that way; I'm not sure about iClone). ../..

Again, it can't be; there is NO iClone "internal" antialiasing taking place outside the super-sampling feature.

4u2ges (3/10/2018)
../.. Just look at the last 4 seconds of my test video. Super-sampling is OFF there, but it looks a lot better than the 2 previous parts, where I had super-sampling ON. ../..

I can't see any difference in antialiasing between those parts; I think you call the last part "better" because you are focused on the glare.

Ok, now let's try to understand. If there is indeed antialiasing happening during your renders despite the fact that you did not enable super-sampling, then the only reason I can think of is that anti-aliasing is "forced" in your graphics-card driver settings. If not, then there is no antialiasing and you are blind lol.

Well, to be sure, let's run this experiment; I'll do it on my side too:

1) Open the default Project Template -> 1. Character -> 2. Turntable (the horse)
2) Make sure the render settings are Image / UHD / Final Render / NO SS.
3) Press F10, then save the image.
4) Do the same, but this time enable SS 3x3.
5) Open the images in any image viewer - e.g. Photoshop - and zoom both images to 800%.

You then should clearly see the difference.

Here Without Super Sampling (no anti-aliasing):

https://forum.reallusion.com/uploads/images/196ff6ae-feee-4f5b-811c-194a.png

And Here With Super Sampling 3x3 (anti-aliasing is clearly present):

https://forum.reallusion.com/uploads/images/01c9f766-d09a-474b-821b-8526.png
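Instead of eyeballing the two F10 renders at 800%, a quick numeric diff settles whether SS changed anything at all. A minimal sketch (the helper name is my own, and the two arrays below are synthetic stand-ins for the real renders, which you would load as same-sized grayscale arrays with any image library):

```python
import numpy as np

def changed_fraction(img_a, img_b, tol=1.0):
    """Fraction of pixels whose values differ by more than `tol`
    (same units as the images, e.g. 0-255). 0.0 means the second
    render is pixel-identical to the first within tolerance."""
    diff = np.abs(np.asarray(img_a, dtype=float) - np.asarray(img_b, dtype=float))
    return np.count_nonzero(diff > tol) / diff.size

# Synthetic stand-ins for the NO-SS and SS-3x3 renders:
no_ss = np.zeros((270, 480))
ss3x3 = no_ss.copy()
ss3x3[100:110, 200:210] = 128.0  # pretend SS softened a small edge region

print(changed_fraction(no_ss, no_ss))  # identical frames -> 0.0
print(changed_fraction(no_ss, ss3x3))  # non-zero: SS had a measurable effect
```

A result of exactly 0.0 on the real images would back the claim that super-sampling is ignored in a given setup; any non-zero fraction means it did something, even if it is hard to see at 100% zoom.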
By 4u2ges - 7 Years Ago
Guy, you were partially right. I was mostly paying attention to the glowing part of the test video, which came out best. But nonetheless...
You're still talking about ordinary rendering and super-sampling, and posting examples of UHD renders. Sure, that works as expected, because super-sampling is taking place... nothing really new there.
That does not apply when you render with a full-blown render viewport.
And that is what I am trying to show here. I did not render, or talk about, UHD. Instead, I render HD on a UHD viewport (I hope you do not get confused again :) ).
And yes, in this case anti-aliasing takes place without super-sampling.

Here are my examples blown up to 800%. All renders are 1920x1080 (HD), but taken with different viewport sizes.

Normal viewport, NO SS
https://forum.reallusion.com/uploads/images/ffded553-add7-49ef-84b2-361d.jpg

Normal viewport, 3x3 SS
https://forum.reallusion.com/uploads/images/d01f7f0e-e261-45ba-937f-b100.jpg

UHD viewport, NO SS
https://forum.reallusion.com/uploads/images/48b8963c-53f6-4e5a-8a86-ce54.jpg

As you can see, there is very little difference between the render with SS in a normal viewport and the one with NO SS in a UHD viewport. And that is what I am talking about here.

By Rampa - 7 Years Ago
There is another twist to all this that was discussed last year: if you run your card with virtual resolutions (DSR in Nvidia speak) and render out at something smaller (like 1080p), you'll get really good anti-aliasing with no speed penalty. It's like doing SS, but faster.

Wildstar found that he could render in preview mode with it looking like SS 3x3.

The thing being discussed here may be a result of screen-space effects. That's why things like DOF get much more pronounced - actually looking like something shot at f/1.4.
By 4u2ges - 7 Years Ago
Rampa (3/10/2018)
There is another twist to all this that was discussed last year: if you run your card with virtual resolutions (DSR in Nvidia speak) and render out at something smaller (like 1080p), you'll get really good anti-aliasing with no speed penalty. It's like doing SS, but faster.

Wildstar found that he could render in preview mode with it looking like SS 3x3.

The thing being discussed here may be a result of screen-space effects. That's why things like DOF get much more pronounced - actually looking like something shot at f/1.4.



BOOM! Good point, and I remember that thread now. Just checked my card settings, and sure enough... I had set DSR to 4.00x native resolution at the time... and forgot about it. :Whistling:
By Rampa - 7 Years Ago
The reason it works is that this is exactly what DSR was designed to do: make realtime graphics (think games) look better.
https://www.youtube.com/watch?v=isALzFmYjV8
By GOETZIWOOD STUDIOS - 7 Years Ago
Ah, I'm not crazy after all ;) The antialiasing was then "forced" - so to speak - through DSR by the card driver settings.

That said, DSR is still a "poor man's" antialiasing method; rendering at a higher resolution and then resampling down will never be as good as proper anti-aliasing through multi-sampling, even if the difference is sometimes subtle, as in low-contrast situations.
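The resampling idea behind DSR can be sketched in a few lines (again plain NumPy, purely an illustration of the principle, not Nvidia's actual implementation or filter): render with one sample per pixel at K times the target resolution, then average each K x K block back down. The hard-edged high-resolution frame turns into a target-resolution frame with fractional edge coverage, which is why it behaves like super-sampling.

```python
import numpy as np

def render_edge(width, height):
    """1-sample-per-pixel render of a hard diagonal edge (aliased)."""
    y, x = np.mgrid[0:height, 0:width]
    return (((x + 0.5) + 0.5 * (y + 0.5)) < 0.6 * width).astype(float)

TARGET, K = 64, 4                            # export size and DSR-style factor
native = render_edge(TARGET, TARGET)         # direct render: pure 0/1 staircase
hires = render_edge(TARGET * K, TARGET * K)  # same scene at 4x resolution

# Box-filter downscale: average each K x K block (the "resample down" step)
downscaled = hires.reshape(TARGET, K, TARGET, K).mean(axis=(1, 3))
```

`native` contains only 0s and 1s, while `downscaled` has intermediate values along the edge: effectively an ordered-grid super-sample bought by rendering a bigger frame, which matches what the thread observed with a UHD viewport and an HD export.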
By 4u2ges - 7 Years Ago
I'll tell you what I'm most excited about. Not the artificial anti-aliasing, not the speed of the rendering,
but how the glow and GI emitter lights come out. GI flickering is a real PITA; I had been fighting it unsuccessfully for a while.
I just opened my old project and did a test render of the part that came out lousy at the time.

Notice how the wall lamps flicker in the first part, and how the flicker stops in the second part (the same render with a UHD viewport).



By GOETZIWOOD STUDIOS - 7 Years Ago
@4u2ges
In the Project -> Visual Settings panel there is a "Suppress Light Flicker" option.
Is it ON or OFF in both of your cases?
If it is OFF, have you tried it ON?
By 4u2ges - 7 Years Ago
lol Guy, you think I would not have gone through all that?
I tested every possible setting in GI, Preferences, and the material settings. Nothing worked. That is by design, I think.
By GOETZIWOOD STUDIOS - 7 Years Ago
4u2ges (3/10/2018)
lol Guy, you think I would not have gone through all that?
I tested every possible setting in GI, Preferences, and the material settings. Nothing worked. That is by design, I think.

I dunno; when someone exposes an issue, it is not always obvious to the reader what the OP's current configuration and settings are. Sometimes an option can easily be missed or overlooked.

Anyway, I'm glad you found an interesting solution to this issue on your side. It shows there is definitely something wrong in the iClone rendering pipeline, as the viewport should not interfere in any way with the final rendering quality.

I've never had this issue because, while my viewport is roughly always the same size (as in the CTRL-2 layout), I always render at a higher resolution than my desktop resolution and post-process at that resolution, even if the final resolution of the video will only be Full HD or HD and is downsized at the end - an old habit. DSR is OFF on my side, though.
By 4u2ges - 7 Years Ago
Thanks to you Guy! You gave me an idea in your first post :)
By Firepro - 7 Years Ago
Thanks, all, for your posts.  I kept wondering what I was doing wrong and why RL support was not being very helpful.  Now I see we are all in the same boat.  Coming to iClone from DAZ Studio, I expected that increasing the resolution of my final render would improve any jaggies in my image (and of course render time would increase as well), but that wasn't happening until I discovered making the viewport small.

Given that my 4K renders with a large viewport give me very quick render times and poor, lo-rez results, while with a small viewport the render times go way up and I get very good hi-rez results, it appears to me that the render results are essentially an interactive/live version (quick, lo-rez) versus a photoreal one (slow, hi-rez).  I have to assume that reducing the viewport size forces the rendering engine to switch from interactive mode to photoreal mode.  I can't understand why this issue has not been fixed.
By 4u2ges - 7 Years Ago
If you do a lot of stills and are looking for photorealism, just wait a little. Iray is coming to iClone. https://forum.reallusion.com/357102/iClone-7-2018-Roadmap
By Firepro - 7 Years Ago
Perhaps adding Iray rendering support to the upcoming iClone is more important than fixing this bug.  Hopefully Iray will be as successfully implemented in iClone as it was in DS, but with better live capabilities.  And please, RL, update the UI of 3DX Pipeline... at the very least we need 4K monitor support!
By wildstar - 7 Years Ago
Another member of the community and I already exposed this bug at the release of iClone 7, and I was greatly criticized for talking about it. I was accused of wanting to undermine Reallusion. The fact is that the supersampling options are totally useless, because depending on the size of the viewport you can get better results than with supersampling. Search the forum for the topic "DSR hack" and you will understand what I'm talking about, but I'll give the tip here: activate the DSR function of your Nvidia card to multiply your current resolution. The higher the resolution you reach with DSR, the smoother your images will be in iClone. An example: if your monitor only supports 1080p, set DSR to "emulate" 4K and render in preview mode. The result will be much better than using 3x supersampling, for a fraction of the render time. And this is just one of the many reasons that made me stop thinking of iClone as a rendering tool and use it only for what it is currently best at: animating characters.
By wildstar - 7 Years Ago
Firepro (3/11/2018)
Perhaps adding Iray rendering support to the upcoming iClone is more important than fixing this bug.  Hopefully Iray will be as successfully implemented in iClone as it was in DS, but with better live capabilities.  And please, RL, update the UI of 3DX Pipeline... at the very least we need 4K monitor support!


 
It's a big mistake to focus on new features when the main one is not working well. The iClone-using community needs "realtime animation" in a world where all medium and small studios are migrating to realtime solutions like Unity and Unreal. iClone has an engine that in theory should deliver the same quality as Unity or Unreal, or even higher, because it uses VXGI technology, but in reality this does not work. The current light system is confusing, the shadow system is poor, and the PBR material system is simple and limited. Supersampling is a mess with the viewport. And if I really make a list, I will again be accused of wanting to undermine Reallusion. So I stopped reporting bugs and instead try to understand why iClone rendering cannot be used in a professional way while Unity can, since in theory both use the same technology (PBR, IBL, shadow maps) - and I am not even considering Unity's ability to bake lighting, just comparing Unity's GI system with iClone's VXGI. Finally, as I said in another post, I would like Reallusion to find the way. I love iClone, and it is already part of my workflow.
By TonyDPrime - 7 Years Ago
wildstar (3/17/2018)

 
It's a big mistake to focus on new features when the main one is not working well. The iClone-using community needs "realtime animation" in a world where all medium and small studios are migrating to realtime solutions like Unity and Unreal. iClone has an engine that in theory should deliver the same quality as Unity or Unreal, or even higher, because it uses VXGI technology, but in reality this does not work. The current light system is confusing, the shadow system is poor, and the PBR material system is simple and limited. Supersampling is a mess with the viewport. And if I really make a list, I will again be accused of wanting to undermine Reallusion. So I stopped reporting bugs and instead try to understand why iClone rendering cannot be used in a professional way while Unity can, since in theory both use the same technology (PBR, IBL, shadow maps) - and I am not even considering Unity's ability to bake lighting, just comparing Unity's GI system with iClone's VXGI. Finally, as I said in another post, I would like Reallusion to find the way. I love iClone, and it is already part of my workflow.


I think there are those of us who see iClone's value as part of a workflow, and others who see it as the total tool, start to finish.  Both are equally constructive approaches to a project, depending on a user's own preferred workflow.
I myself like the criss-cross of Daz, Octane, 3DS Max, Unreal, and iClone, so I don't get too hung up on some things not working here and there, because I can find alternate solutions in outside applications.  So if iClone gets new bells and whistles, I rather enjoy it.  But I can see that when I have wanted to restrict my workflow to iClone alone, I find myself feeling more stuck when something isn't working right.

If you do work with other applications, you can't help but compare iClone's abilities to those applications in one form or another.  But I don't think those comparisons undermine iClone; quite the contrary, they enhance it.  I know my iClone workflow has been enhanced by understanding these applications, and likewise, my understanding of iClone enhances my workflow in those other applications.  They all contribute to making the total workflow of a project better.  But that is my point of view, and to each his/her own.

Hey, you know what's weird: Iray is made by Nvidia and involves PBR, yet it is one of the slowest rendering solutions.  How is that for irony!

By GOETZIWOOD STUDIOS - 7 Years Ago
TonyDPrime (3/19/2018)
../.. Hey, you know what's weird: Iray is made by Nvidia and involves PBR, yet it is one of the slowest rendering solutions. How is that for irony!

Compared to what? I'm not sure I follow.

By mtakerkart - 7 Years Ago
Very strange that iClone will integrate Iray, because today Microsoft announced that DirectX 12 integrates realtime raytracing... UE4 announced, too, that they will integrate this technology.


By GOETZIWOOD STUDIOS - 7 Years Ago
Don't get too excited; RTX is for the Volta architecture. TITAN V, $3000, anyone? ;)
Iray works now and is not incompatible with RTX; it is just another renderer.
And since Reallusion already implemented Nvidia GameWorks VXGI for realtime Global Illumination, there is no reason they couldn't implement the upcoming Nvidia raytracing GameWorks module, which provides raytracing for Ambient Occlusion, Reflections, and Shadows (that's what RTX is, anyway).
Still, Volta is needed.
By TonyDPrime - 7 Years Ago
grabiller (3/19/2018)
TonyDPrime (3/19/2018)
../.. Hey, you know what's weird: Iray is made by Nvidia and involves PBR, yet it is one of the slowest rendering solutions. How is that for irony!

Compared to what? I'm not sure I follow.





Compared to the upcoming RTX.
By wildstar - 7 Years Ago
grabiller (3/19/2018)
TonyDPrime (3/19/2018)
../.. Hey, you know what's weird: Iray is made by Nvidia and involves PBR, yet it is one of the slowest rendering solutions. How is that for irony!

Compared to what? I'm not sure I follow.



Iray is slower than Cycles, slower than AMD ProRender, slower than Octane Render, slower than... ahh, before you ask: yes, I have studied and tested all these renderers. And one more thing: all the renderers I mentioned are FREE.

 

By wildstar - 7 Years Ago
grabiller (3/19/2018)
Don't get too excited; RTX is for the Volta architecture. TITAN V, $3000, anyone? ;)
Iray works now and is not incompatible with RTX; it is just another renderer.
And since Reallusion already implemented Nvidia GameWorks VXGI for realtime Global Illumination, there is no reason they couldn't implement the upcoming Nvidia raytracing GameWorks module, which provides raytracing for Ambient Occlusion, Reflections, and Shadows (that's what RTX is, anyway).
Still, Volta is needed.


I recommend reading the documentation by the Remedy team (the guys who made the video another user posted here) about RTX. Current video boards cannot run RTX at full fps; on Pascal boards it gets something like 10 fps using all the RTX features. The Unity and Unreal developers are adapting the technology to use it mixed with current techniques like screen-space AO and screen-space reflections, AND it could be used as a renderer with all its features. I would love a renderer producing that quality in HD at 10 frames per second for my animations.
By GOETZIWOOD STUDIOS - 7 Years Ago
wildstar (3/21/2018)
grabiller (3/19/2018)
TonyDPrime (3/19/2018)
../..Hey, you know what's weird- Iray is made by NVidia, and involves PBR.  Yet it is one of the slowest rendering solutions.  How is that for irony! 

Compared to what ? Not sure to follow.

Iray is slower than Cycles, slower than AMD ProRender, slower than Octane Render, slower than... ahh, before you ask: yes, I have studied and tested all these renderers. And one more thing: all the renderers I mentioned are FREE.

I'm sorry, but I won't take your word for it. In my life I've seen so many renderer comparisons made with the wrong context, settings, and final QA checks that in this matter I only believe what I see.

ps: Octane is not free.
By GOETZIWOOD STUDIOS - 7 Years Ago
wildstar (3/21/2018)
grabiller (3/19/2018)
Don't get too excited, RTX is for Volta architecture. TITAN V $3000 anyone ? ;)
iRay is working now and is not incompatible with RTX, it is just another renderer.
Then as Reallusion already implemented NVidia Gameworks VXGI for realtime Global Illumination, there is no reason they couldn't implement the upcoming NVidia Raytracing Gameworks which provide raytracing for Ambient Occlusion, Reflection and Shadow (that's what makes RTX anyway).
Still, Volta needed.

I recommend reading the documentation by the Remedy team (the ones who made the video another member posted here) about RTX. Current video boards cannot reproduce RTX at full FPS; on Pascal boards it gets something like 10 fps with all RTX features enabled. The Unity and Unreal developers are adapting the technology to mix with current techniques like screen-space AO and screen-space reflections. And if it can be used as a renderer with all its features, I would love a renderer producing this quality in HD at 10 frames per second for my animations.

Filmmakers will be satisfied with anything faster than 0.25 FPS.
By Kelleytoons - 7 Years Ago
I think a *reasonable* render rate for a product like iClone (where a render farm isn't really a practicality, although you could emulate one with a lot of work) would be 30 frames per minute for film work.  And I speak as one who used to work in the industry, where we were satisfied with getting a frame an hour.  Times have changed, though, and back then we could just throw more resources at a project (so a render farm could reduce that to a frame a minute).  And iClone isn't really positioned for such projects anyway; regardless of the occasional story, it's mostly a one-man-shop tool.

But 30 fpm?  That would be great.  But iRay isn't going to be that tool, and it disappoints me that it's being offered, because I see it exactly as I see Indigo, which was offered as a sop to users and was useless for animation.  iClone is not about stills -- there are far better products for that -- and there's no real reason to spend valuable and limited resources offering something that few will use when there are bugs and other features we need worked on.  But I can see this as a "hey, you wanted mirrors -- now you have them, so we won't worry about them further" sort of solution.  Which is none at all.
By GOETZIWOOD STUDIOS - 7 Years Ago
Yes, 30 frames per minute is 2 seconds per frame, or 0.5 FPS.
For me too, this is the threshold of a reasonable "interactive" rendering speed suitable for iClone-style / machinima filmmaking.
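That conversion can be sketched in a few lines (a minimal illustration using the numbers quoted in this thread):

```python
# Convert a render rate given in frames per minute to
# seconds per frame and frames per second (FPS).
def rate_from_frames_per_minute(fpm):
    seconds_per_frame = 60.0 / fpm
    fps = fpm / 60.0
    return seconds_per_frame, fps

# Kelleytoons' target of 30 frames per minute:
spf, fps = rate_from_frames_per_minute(30)
print(spf, fps)  # 2.0 seconds per frame, 0.5 FPS
```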
By wildstar - 7 Years Ago
grabiller (3/21/2018)
wildstar (3/21/2018)
grabiller (3/19/2018)
TonyDPrime (3/19/2018)
../..Hey, you know what's weird: Iray is made by NVidia and involves PBR, yet it is one of the slowest rendering solutions. How is that for irony!

Compared to what ? Not sure to follow.

Iray is slower than Cycles, slower than AMD ProRender, slower than Octane Render, slower than... Ah, before you ask: yes, I have studied and tested all these renderers. And one more thing: all the renderers I mentioned are FREE.

I'm sorry, but I won't take your word for it. In my life I've seen so many renderer comparisons made with the wrong context, settings, and final QA checks that in this matter I only believe what I see.

ps: Octane is not free.

Octane 4 will be free next month    
By wildstar - 7 Years Ago
Kelleytoons (3/21/2018)
I think a *reasonable* render rate for a product like iClone (where a render farm isn't really a practicality, although you could emulate one with a lot of work) would be 30 frames per minute for film work.  And I speak as one who used to work in the industry, where we were satisfied with getting a frame an hour.  Times have changed, though, and back then we could just throw more resources at a project (so a render farm could reduce that to a frame a minute).  And iClone isn't really positioned for such projects anyway; regardless of the occasional story, it's mostly a one-man-shop tool.

But 30 fpm?  That would be great.  But iRay isn't going to be that tool, and it disappoints me that it's being offered, because I see it exactly as I see Indigo, which was offered as a sop to users and was useless for animation.  iClone is not about stills -- there are far better products for that -- and there's no real reason to spend valuable and limited resources offering something that few will use when there are bugs and other features we need worked on.  But I can see this as a "hey, you wanted mirrors -- now you have them, so we won't worry about them further" sort of solution.  Which is none at all.


The new AI denoiser and AI lights will give you noise-free images at 60 samples. On my system (Titan X / 1070), Octane 3.07 took 4 seconds to render a 2K CinemaScope image with lots of noise at 60 samples. The 4.0 version is about 3x faster than 3.07.
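As a back-of-the-envelope check on those figures (my own arithmetic, simply assuming the claimed speedup applies uniformly):

```python
# Estimate per-frame render time after a claimed speedup factor.
def time_after_speedup(baseline_seconds, speedup):
    return baseline_seconds / speedup

# Octane 3.07: ~4 s per 2K frame at 60 samples; V4 claimed ~3x faster:
print(round(time_after_speedup(4.0, 3.0), 2))  # about 1.33 s per frame
```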

By wildstar - 7 Years Ago
Only to complete and strengthen my point of view: iClone 7 with VXGI is wonderful, and the integration of VXGI with HBAO+ is perfect. The only things missing for iClone to be truly unimpeachable compared to other real-time engines are these:
First, fix this render output, which is shameful; the user has to discover that he needs to maximize the viewport to get good antialiasing quality. It is time to implement TAA in the render output.
Second, simplify the light system: all lights should behave like emissive GI and generate shadows, and shadows and lights should behave like in real life (larger light sources = softer shadows, smaller light sources = harder shadows).
Third, implement screen-space reflections and an Alembic importer. And boom! I guarantee that no one will want to use other rendering solutions.
By GOETZIWOOD STUDIOS - 7 Years Ago
wildstar (3/21/2018)
grabiller (3/21/2018)
wildstar (3/21/2018)
grabiller (3/19/2018)
TonyDPrime (3/19/2018)
../..Hey, you know what's weird: Iray is made by NVidia and involves PBR, yet it is one of the slowest rendering solutions. How is that for irony!

Compared to what ? Not sure to follow.

Iray is slower than Cycles, slower than AMD ProRender, slower than Octane Render, slower than... Ah, before you ask: yes, I have studied and tested all these renderers. And one more thing: all the renderers I mentioned are FREE.

I'm sorry, but I won't take your word for it. In my life I've seen so many renderer comparisons made with the wrong context, settings, and final QA checks that in this matter I only believe what I see.

ps: Octane is not free.

Octane 4 will be free next month    

Thanks for pointing that video to us, indeed, very interesting and promising.

What's unclear in what we see is where Brigade fits, where Octane fits, and what the difference in workflow and interface is; this is still very fuzzy.

ps: Octane 4 won't be free; there will be a free version limited to 2 GPUs, which is different. That's still great news though.
By GOETZIWOOD STUDIOS - 7 Years Ago
wildstar (3/21/2018)
Only to complete and strengthen my point of view: iClone 7 with VXGI is wonderful, and the integration of VXGI with HBAO+ is perfect. The only things missing for iClone to be truly unimpeachable compared to other real-time engines are these:
First, fix this render output, which is shameful; the user has to discover that he needs to maximize the viewport to get good antialiasing quality. It is time to implement TAA in the render output.
Second, simplify the light system: all lights should behave like emissive GI and generate shadows, and shadows and lights should behave like in real life (larger light sources = softer shadows, smaller light sources = harder shadows).
Third, implement screen-space reflections and an Alembic importer. And boom! I guarantee that no one will want to use other rendering solutions.

What is surely very exciting is that we are at the edge of seeing matter collide with anti-matter to form the ultimate Graal of CGI: real-time, or at least interactive, path tracing, whatever the software or hardware. This will happen soon, no more than a few years from now, and we will see it happen with our own eyes. I don't know if you realize, but for someone (and I'm not alone here) who saw the release of the first personal computers in the world in the '70s, that is something.

During that colliding period there will obviously be an imbroglio of available software/hardware solutions, and it will be a bit confusing (like with the multitude of personal computers invented back in the day), but at some point only a few solutions will stabilize and emerge as the standard choice(s).

It will be easier for developers to make a choice then. Meanwhile, let's hope Reallusion makes the right ones.
By Kelleytoons - 7 Years Ago
I'm lucky enough to have lived this long (I got the second Trash 80 sold in my state, back in the early '70s, and I was already a married adult with my own home by that time).  I doubt whether I'll live long enough to see this "holy grail" (note spelling), but even just a nice renderer that could do mirrors and such, and do it all in a second or two per frame, would be great.  And even if I have to spend more $$$ on a higher-end video card than my NVidia Titan XP.
By justaviking - 7 Years Ago
My performance expectations and wishes...

- 15+ fps for "real time" preview while editing is usable, but below that it gets hard to assess the quality of the "motion" animations
- 2 to 0.5 fps for final render at 1920x1080 resolution with 2x or even 3x anti-aliasing and all "pretty" features turned on

I can live with slower "final" rendering speeds, but when you start hitting 5 seconds per frame (0.2 fps), it goes well beyond what I expect for iClone (NOT ray tracing, etc.).

Of course, faster is always better.   ;)


P.S.
And the size of my editing/preview window should NOT be a factor in the final render!!!!
It should not affect the speed.  And it especially should not affect the quality!!!!!!!!!!!
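To put render rates like these in perspective, here is a rough sketch (my own illustration, not from the thread) of how a render rate translates into total render time for a clip:

```python
# Total render time for a clip: clip_seconds of footage played back
# at playback_fps, rendered at a throughput of render_fps.
def render_time_seconds(clip_seconds, playback_fps, render_fps):
    total_frames = clip_seconds * playback_fps
    return total_frames / render_fps

# A 60-second clip at 30 fps playback, rendered at 0.5 FPS:
print(render_time_seconds(60, 30, 0.5))  # 3600.0 seconds, i.e. one hour
```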
By GOETZIWOOD STUDIOS - 7 Years Ago
@Kelleytoons
Ah! My first one too, TRS-80 Model I (with memory extension ;)):

https://forum.reallusion.com/uploads/images/141d1dcd-5924-45c5-bafd-bdc7.jpg

Computer RAM: 16 KB (the computer is in the keyboard, btw) + K7 tape recorder (not shown) to save/load software.
Extension (that big box under the screen): 32 KB, so a total of 48 KB in all! It added the possibility to use a 5 1/4" disk drive to read floppy disks of 75 KB capacity! (no more need for the tape recorder, yeah!)
Screen: green, 47 lines of 127 characters ^^
And the first version of Microsoft Flight Simulator was running on "that".
(actually it was subLOGIC who created it and then licensed it to Microsoft in 1982)

https://forum.reallusion.com/uploads/images/320eba65-f9d5-4267-8bd1-d324.gif

Sorry for the OT but it is so fun to remember that while looking at what we have now..

(regarding "Graal" and the spelling, I believe it is OK in English; it is an older form from what I've seen, so "Holy Grail" and "Holy Graal" are equivalent, but thanks for pointing it out, that could have been my mistake)
By Kelleytoons - 7 Years Ago
Oh, Guy, you started off with the FAR better model.  I had the one with 4K (later expanded, but I first programmed on that machine, learning to write REAL condensed BASIC with multiple lines of code on each line, because you did not want to use a CR if you could avoid it).

And, of course, that tape recorder -- folks today have no idea what it was like waiting 30 minutes for a 4K program to load, only to find a glitch that made you start all over again. Sigh.

But I have many fond memories of that machine -- it brought me a career in programming that led to being an IT manager for a statewide operation, and, of course, into my career into 3D and 2D animation.  So many, many thanks to Radio Shack, may they R.I.P.
By GOETZIWOOD STUDIOS - 7 Years Ago
@Kelleytoons
Hehe, BASIC, right, my first programming experience too. When you switched on such a machine, you just got a prompt.. No GUI, obviously, but no applications either; the machine was "empty". You had to load any application through the tape recorder, and yes.. what a pain it was lol :D
And saving your programs on the tape, praying they were "recorded" correctly, crossing fingers the first time you loaded them back (monitoring the actual audio from the tape) ^^
By GOETZIWOOD STUDIOS - 7 Years Ago
Hum, I believe people have this image of @Kelleytoons and me in mind now lol:

https://forum.reallusion.com/uploads/images/30b5baea-5022-45e8-b922-c778.jpg
By Kelleytoons - 7 Years Ago
Hey, I resemble that remark (as well as my avatar here).
By TonyDPrime - 7 Years Ago
wildstar (3/21/2018)
The Unity and Unreal developers are adapting the technology to mix with current techniques like screen-space AO and screen-space reflections. And if it can be used as a renderer with all its features, I would love a renderer producing this quality in HD at 10 frames per second for my animations.


So interesting from a marketing perspective.
Did you see how NVidia RTX was demoed on Monday, and Octane suddenly released its V4 test build... on the same day! (A day after they released a 3.08 test build....)
And even though Octane 3.1 hadn't been released yet, Otoy said they were releasing a V4 test build on Monday because it was 'ready'... and then they went on to promote, in excited campaign fashion, the future pricing options for V4 for when it releases officially.
BUT...then they said the release date is probably happening....ready?....
---in about 9 months! 

On top of this, they then say the real V4 Brigade component, which was not in the Monday V4 test build, is 40% done :blink:, but they are releasing it now as a test build because they want everyone to test out the denoiser.
But what happens? In the forums, many users are having issues with the denoiser!....:ermm:
"Ready"...Really?

I think this: Otoy heard in advance through channels that this RTX thing was happening, so they prepped a campaign to market the future competing V4 product right when the news would break.
This was a timing thing designed to maximize attention on their upcoming product.  Pretty good marketing.

But at the same time, this showed me Otoy did what they did because they felt they had to move.  So they must see RTX as a competitive element they will have to contend with.
People have been asking about Brigade on their forums forever, and nothing -- very little info and response. But let a competitor enter the news, and BOOM: "40% done..."!

Now, RTX will require Volta and DirectX 12.  So a Win 7 user with a Titan Pascal GPU, upon hearing about RTX, would maybe consider upgrading to a Volta and Win 10 with DirectX 12.
But with the Octane V4 news, they would likely be less inclined to buy a Volta in prep for RTX, and would consider Octane instead, since you can keep your old GPU and your same setup.

Which brings me to Unreal Engine and Unity.  Those users are the ones who are going to decide whether to use RTX, an Octane plugin, or the regular default light building.
So this will showcase the actual competition.
As Unity and Unreal will in the future have both RTX and Octane, it will be interesting to see the two go head to head.

Okay... So!...
Finally, this whole thing circles back to the viewport size determining render quality.  We always imagine companies ideally respond to user requests, etc.
But all of this teaches us one thing: $ first.
You can see it now so clearly, in real time.  No dressing-up of announcements or news can disguise it anymore... You can see it so easily.
Thus, I can say with 1,000% certainty that this viewport render size thing will not be fixed unless an update to iClone makes fiscal sense.  (Go ahead, prove me wrong!...:D  I double-, triple-dare you to fix it and prove me wrong)

Here is my final thought...taking it all into consideration....
They are moving to capture the demand of the non-iClone game-developer crowd with CC3.
Thus, if there were enough potential non-iClone customers asking questions about the viewport size thing, it would be fixed... IMMEDIATELY!





By mtakerkart - 7 Years Ago
Just saw the beginning of the GDC 2018 live stream from UDK. Tremendous realism.
All real-time. But they didn't say how much for the hardware ;)



When the complete live stream is available, I suggest watching the facial mocap at the beginning....
By mtakerkart - 7 Years Ago
Here it is:



Behind the scene:



Our great Gollum performer:



The Andy Data to E.T (my best) :


 
By mtakerkart - 7 Years Ago
And for those who like mirrors. Ok it's my last...


By TonyDPrime - 7 Years Ago
mtakerkart (3/21/2018)
And for those who like mirrors. Ok it's my last...




OMG...I can't take it...
I LOVE - I LOVE - I LOVE this stuff so much. 
Love rendering & 3D tools.
Thank you for posting the updates!