
Linear Workflow Example

Posted By GOETZIWOOD STUDIOS 9 Years Ago
GOETZIWOOD STUDIOS
Posted 9 Years Ago
Hi,

I've been ranting for quite some time about having a linear workflow in iClone, but I've never given a concrete example of what its benefit would be.

So here is a simple example 'simulated' in iClone by manually linearizing the textures (thanks to iClone's OpenEXR support this is now feasible without any loss or artifacts) and the colors (here the background), then applying a ~2.2 gamma correction to the rendered image. This is not a 'true' linear workflow: normally, a proper sRGB -> Linear conversion (for the textures) and a Linear -> sRGB conversion (for displaying the final image) should be applied. But the result is close enough.
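For reference, here is roughly what the 'proper' conversion looks like next to the plain 2.2 gamma approximation used in this test. This is only a small Python sketch; the function names are mine and have nothing to do with iClone.

```python
# Per-channel sRGB transfer functions (values in 0..1), versus the plain
# gamma approximation used for this example. Helper names are illustrative.

def srgb_to_linear(c: float) -> float:
    """Exact sRGB decoding: what should ideally be applied to the textures."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c: float) -> float:
    """Exact sRGB encoding: what should ideally be applied to the final render."""
    return 12.92 * c if c <= 0.0031308 else 1.055 * (c ** (1.0 / 2.4)) - 0.055

def approx_to_linear(c: float) -> float:
    """The shortcut used here: a plain 2.2 power (a '0.45 gamma correction')."""
    return c ** 2.2
```

The two curves are close but not identical (the largest deviation is in the dark values), which is where the slight saturation shift mentioned further down comes from.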

Both images have exactly the same light set: one key light, two rim lights and one fill light.

First, here is what you get by default in iClone:



And here is the "linear workflow" version:



The difference is quite obvious.

Note that there is a slight color saturation difference, but this is because a simple gamma correction has been applied instead of a proper sRGB -> Linear -> sRGB conversion.

Aside from that, you will notice that the shading is much more natural with the linear workflow: the different lights add their intensities correctly. If you look at the default image you will see that the highlight areas are "clipped" (the color reaches an intensity above 255), which is not the case with the linear workflow. This is the (almost literal) 1+1=3 issue you get when not using a linear workflow.
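To put rough numbers on that "1+1=3" behaviour, here is a tiny illustration, assuming a simple 2.2 gamma display encoding (the values are arbitrary, just for the arithmetic):

```python
# Two equal mid-gray light contributions, expressed on a 0..1 display-encoded scale.
mid = 0.5

# Default (non-linear) workflow: the encoded values are summed directly.
naive = 2 * mid                      # = 1.0 -> already sitting at the clipping point

# Linear workflow: decode to physical intensity, sum, then re-encode for display.
physical_sum = 2 * (mid ** 2.2)      # ~0.44 in linear light
correct = physical_sum ** (1 / 2.2)  # ~0.69 once re-encoded: bright, but far from clipped

print(naive, correct)
```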

Also, the overall look with the default workflow is darker and has more contrast. This is the major issue for artists: they put in a full-intensity key light, feel the image is still too dark, and keep adding lights until they get the overall "intensity" they want. But because of the biased way light intensities add up, they get clipped, ugly colors all over the place, and the result is usually still too contrasted due to the unnatural addition of lighting.

This is really a simple example, but I hope it gives you a hint as to why having a linear workflow is so important for proper shading.

Cheers,
Guy.

--
guy rabiller | GOETZIWOOD STUDIOS
"N.O.E." (Nations Of Earth) Sci-Fi TV Show, Showrunner.

Rampa
Posted 9 Years Ago
That does look a lot better! :)

Too much contrast is something I am always trying to overcome in iClone, with varying degrees of success.

Could you explain your method for getting these results, step by step, so we can try it out as well? What is your OpenEXR conversion process?

Dynamic range and gamma sure are powerful things!
GOETZIWOOD STUDIOS
Posted 9 Years Ago
rampa (1/25/2015)
That does look a lot better! :)

Too much contrast is something I am always trying to overcome in iClone, with varying degrees of success.

Could you explain your method for getting these results, step by step, so we can try it out as well? What is your OpenEXR conversion process?

Dynamic range and gamma sure are powerful things!


For each iClone material color texture (at least the diffuse one):
1) Save the texture as *.exr.
2) Open the texture in any software with OpenEXR support, apply a ~0.45 gamma correction, then save it back.
3) Reload the texture in iClone.

For each rendered image:
1) Apply a ~2.2 gamma correction.

Normally, every color value should be converted as well (background color, diffuse/specular colors if you don't use textures, light colors, etc.).

But in practice it is quite cumbersome to do this manually every time for everything (convert all SpeedTree textures? Substances? etc.).
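For what it's worth, here is a minimal scripted sketch of those two steps. It assumes OpenCV built with OpenEXR support; the file names and functions are only illustrative and are not part of iClone itself.

```python
import os
os.environ["OPENCV_IO_ENABLE_OPENEXR"] = "1"  # some OpenCV builds gate EXR I/O behind this flag

import cv2
import numpy as np

def linearize_texture(src_exr: str, dst_exr: str) -> None:
    """Step 2 above: the '0.45 gamma correction' (in the usual image-editor
    convention this means raising the encoded values to the power 2.2)."""
    img = cv2.imread(src_exr, cv2.IMREAD_UNCHANGED).astype(np.float32)
    linear = np.power(np.clip(img, 0.0, None), 2.2)
    cv2.imwrite(dst_exr, linear)

def display_encode_render(src_exr: str, dst_png: str) -> None:
    """The '2.2 gamma correction' on the rendered image, i.e. raising the
    values to the power 1/2.2 before saving an 8-bit image for display."""
    img = cv2.imread(src_exr, cv2.IMREAD_UNCHANGED).astype(np.float32)
    encoded = np.power(np.clip(img, 0.0, 1.0), 1.0 / 2.2)
    cv2.imwrite(dst_png, (encoded * 255.0 + 0.5).astype(np.uint8))

if __name__ == "__main__":
    linearize_texture("diffuse.exr", "diffuse_linear.exr")      # hypothetical file names
    display_encode_render("render.exr", "render_display.png")
```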

--
guy rabiller | GOETZIWOOD STUDIOS
"N.O.E." (Nations Of Earth) Sci-Fi TV Show, Showrunner.

planetstardragon
Posted 9 Years Ago
One trick I've used for pseudo-gamma control is the NPR effect: I turn the paint weight to 0 and use the colorizer fader to brighten up the scene. Not as effective as a gamma option, but not too shabby either. I haven't tried exporting that to Indigo to see the results, though. I find that Indigo is unpredictable from iClone, so I do any tweaks in Indigo with minimal lights from iClone. I don't try to make my iClone render better... I just try to get the best render from Indigo.


Rampa
Posted 9 Years Ago
Cool! Thanks for that info, grabiller. :)
GOETZIWOOD STUDIOS
Posted 9 Years Ago
planetstardragon (1/25/2015)
One trick I've used for pseudo-gamma control is the NPR effect: I turn the paint weight to 0 and use the colorizer fader to brighten up the scene. Not as effective as a gamma option, but not too shabby either. I haven't tried exporting that to Indigo to see the results, though. I find that Indigo is unpredictable from iClone, so I do any tweaks in Indigo with minimal lights from iClone. I don't try to make my iClone render better... I just try to get the best render from Indigo.


One very important thing to understand is that a linear workflow is not an "artistic control" over the rendered image's contrast; it is simply *the* way to get *the* mathematically correct behavior from the rendering pipeline.

What I mean is, let's say someone prefers the look of the first image I posted (the default workflow). Why not? After all, it is a question of taste. I could get the first image's "look" from the second image (linear workflow) by color grading it if I wanted to. But I can't do the opposite, because information is wrong or even lost at render time (clipping) in the first image.
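A tiny numeric illustration of that asymmetry (arbitrary values, just to show the information loss once clipping happens):

```python
# Two distinct highlight intensities that the linear render keeps apart.
a, b = 0.8, 0.9

# Default workflow: a hot key light pushes both over 1.0 and they clip...
clipped_a = min(a * 1.5, 1.0)   # -> 1.0
clipped_b = min(b * 1.5, 1.0)   # -> 1.0 (now indistinguishable from clipped_a)

# ...so no grading applied afterwards can separate them again, whereas grading
# the unclipped values 0.8 and 0.9 down to any target look is trivial.
print(clipped_a, clipped_b)
```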

Regarding IndigoRT, it uses a linear workflow, so you can assume the rendering is correct. If you don't like the look of what you get, it is an artistic matter, not a technical one. For instance, iClone asset textures have been made with iClone's rendering result in mind, which means those textures may have to be "adapted" for use with a "technically correct" renderer.

--
guy rabiller | GOETZIWOOD STUDIOS
"N.O.E." (Nations Of Earth) Sci-Fi TV Show, Showrunner.

planetstardragon
Posted 9 Years Ago
I'll admit to being all artist in 3D, because I don't have the formal training for workflows... but I can appreciate what you are saying from music programming. As a parallel: I couldn't work with software that doesn't allow me to arrange effects in the order I want. While that's a technical option, it does in fact affect my artistic choices in the end.

I normally don't have major qualms with the 3D software I use because, since I don't have a pro reference, I simply look at each one for whatever it is good at... and not good for.
(That's not to say I don't think some software is bad, and say so... but I simply accept the limitations and work around them, or get software that does what I want.)


As a result, everything for me in the 3D world is a workaround, hence I have so many 3D toys to work with. I'm quite sure I often use things in a way that would make the developers cringe! lol

Them: "You don't have to get that software, we do that "

Me: "Yeah, but you take 15 years, ain't nobody got time for that!!"

Just ask swoop: he started giving me lessons in UVs, and I painted everything black in rebellion! Looked cool, actually!


Rampa
Posted 9 Years Ago
Our ultimate goal is to blow the highlights and lose the shadow detail only when we want to.

Here's my trick, and it's really just a tweak to the materials in iClone. It's not for night scenes though. The basic idea is to simulate bounced photons a little bit, and allow for the great dynamic range of our eyes.

Material settings:

Set the diffuse to pure white.
Set the ambient to pure black.
Set the specular to pure white (or other if needed).
Set the self-illumination to 50% (that's the magic).

Light settings:

I usually run my scene ambient light at neutral gray. The main light can get very bright with these settings.

animagic
Posted 9 Years Ago
That's helpful information, Guy.

I'm interested in making things look as good as possible in iClone, as I don't see Indigo as a very feasible alternative for animation at the moment.


https://forum.reallusion.com/uploads/images/436b0ffd-1242-44d6-a876-d631.jpg

GOETZIWOOD STUDIOS
Posted 9 Years Ago
animagic (1/25/2015)
That's helpful information, Guy.

I'm interested in making things look as good as possible in iClone, as I don't see Indigo as a very feasible alternative for animation at the moment.


I agree. While IndigoRT is certainly interesting for very short productions or even still images, it is a no-go for more ambitious productions, especially those with difficult lighting conditions (interiors with a lot of indirect lighting).

There is a lot of room for OpenGL or DirectX renderers to produce high-quality, physically based (PBR) images in real time, or at least interactively (a few seconds of rendering time).

We are not working with a game engine but with a "machinima" engine, where developers can use technologies that are not usable in games. For instance, I really like the rendering part of SFM (Valve's Source Filmmaker), which does actual multi-sampling (up to 512/1024 samples) for anti-aliasing, ambient occlusion, motion blur and depth of field. They did not fall into the trap of only using game-engine technologies, but implemented specifically for SFM some technologies from offline rendering (while keeping them usable interactively).

A typical example of what is okay for games and game engines but not for "machinima" engines is SSAO (Screen-Space Ambient Occlusion). It is 'good enough' for game engines because it is fast, but it is useless for "machinima" engines, as SSAO is technically wrong and actually "pollutes" the image. Point-based AO could be used instead for "interactive" rendering, where you don't need to render the final images at 60 fps.



--
guy rabiller | GOETZIWOOD STUDIOS
"N.O.E." (Nations Of Earth) Sci-Fi TV Show, Showrunner.



