NetFlix Requirements

Posted By wildstar 6 Years Ago
illusionLAB
Posted 6 Years Ago
I'm very impressed with Eevee, and if I'm going to take the time to learn another software package, the Blender route makes much more sense than trying to use game engines.  I figure if I start getting comfortable in Blender now... once 2.8 is released I'll have better than a working knowledge and should find it easy to move my iClone projects over (provided RL fixes their FBX export/import issues before then!)
animagic
Posted 6 Years Ago
Tony, banding occurs in other instances as well, and also with non-iClone imaging. From what I understand, it's simply a limitation of 8-bit-per-channel output. I wouldn't call it a bug, as I don't think there is a way to prevent it with 8-bit output. Going to 16-bit would resolve it, as would adding some noise component like I did in iClone 5.


https://forum.reallusion.com/uploads/images/436b0ffd-1242-44d6-a876-d631.jpg

illusionLAB
Posted 6 Years Ago
Banding is not an iClone bug... but a limitation of images created in 8 bits per channel.  Our computer monitors and TVs can generally only display 8 bits (yes, there are exceptions), so it's the commonly accepted 'display' format.

Any sort of gradient... sky, glow, shadows, etc. is prone to banding because the change in luma may be very gradual.  So, imagine a gradient from 50% grey to 60% grey across an HD image.  With 8-bit offering only 256 "steps" of luma, that 10% change would be represented by only about 26 steps across 1920 pixels - hence the distinct bands.  The same scenario with 16-bit images means you'd have 65536 x 0.1 ≈ 6554 steps of luma across those 1920 pixels - more than 3 times the one-step-per-pixel minimum needed to avoid visible banding.
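(A quick numpy sketch of that arithmetic - values are illustrative, just quantizing the same 10% ramp at both depths and counting the surviving steps:)

```python
# Quantize a 50%-to-60% grey ramp across 1920 pixels at 8-bit and 16-bit
# and count how many distinct levels survive - fewer levels = wider bands.
import numpy as np

width = 1920
ramp = np.linspace(0.5, 0.6, width)      # 10% luma change across an HD frame

levels_8 = np.unique(np.round(ramp * 255)).size     # 8-bit: 256 levels total
levels_16 = np.unique(np.round(ramp * 65535)).size  # 16-bit: 65536 levels total

print(f"8-bit:  {levels_8} steps -> bands roughly {width // levels_8} pixels wide")
print(f"16-bit: {levels_16} steps -> one per pixel (~6554 available), no visible bands")
```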

To get 'true' 16-bit-depth renders, your texture bitmaps would all need to be 16-bit as well.  This puts a tremendous strain on any software trying to do 'real time'.  In the "high end" animation/VFX world we primarily use "floating point 16 bit" files (like EXR), as they can offer comparable quality to 32-bit integer in a smaller file size - of course, "real time" is not even a consideration. ;)
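(To make that trade-off concrete, a rough numpy illustration - not EXR-specific, just the half-float number format itself versus 8-bit integer on a dark gradient:)

```python
# Half-float carries 1024 steps per stop at every exposure level, so dark
# gradients keep far more levels than 8-bit integer's fixed 256 steps.
import numpy as np

h = np.finfo(np.float16)
print(f"half-float: {int(1 / h.eps)} steps per stop, max value {h.max}")

ramp = np.linspace(0.01, 0.02, 1920)     # a deep-shadow gradient
print("8-bit integer levels:", np.unique(np.round(ramp * 255)).size)      # 3
print("half-float levels:   ", np.unique(ramp.astype(np.float16)).size)   # ~1000
```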

sonic7
Posted 6 Years Ago
My laptop PC seems to show this 'banding' issue quite noticeably (more than a desktop?) - I don't know for sure.
Maybe it's also a graphics-card-related setting? .... But I'm acutely aware of it and don't like it at all.... I don't understand the deeper technical side of things, but my eye knows when something looks right or not. (Well, most of the time anyway.) :)

** Like when you have a transition from a shade of gray through to a 'slightly' darker shade of gray, but over a large(ish) distance ..... and it looks like a relief contour map!! LOL :)

--------------------------------------------------------------------------------
Please be patient with me ..... I don't always 'get it' the first time 'round - not even the 2nd time! :( - yikes! ...
MSI GT72VR Laptop, i7 7700HQ 4-Core 3.8 GHz, 16GB RAM; Nvidia 1070, 8GB VRAM; iClone 7.93; 3DXchange Pipeline 7.81; CC-3 Pipeline 3.44; Live Face; Headshot; Brekel Pro-Body; PopcornFX; Iray; Kinect V2; DaVinci Resolve 17; Mixcraft 8.1

justaviking
Posted 6 Years Ago
@Sonic - Laptops are notorious for having bad displays (cheap screens).  You can buy a cheap $200 tablet and it will have a better screen than a $1,200 laptop.  Slowly, ever so slowly, that is changing.

The "banding" issue, as explained, can show up in even the best software when it's fed 8-bit input images.  Here is a post in the Substance Designer forum, complete with a picture.  If you look closely, it looks like a "contour map."
https://forum.allegorithmic.com/index.php/topic,24285.0.html



iClone 7... Character Creator... Substance Designer/Painter... Blender... Audacity...
Desktop (homebuilt) - Windows 10, Ryzen 9 3900x CPU, GTX 1080 GPU (8GB), 32GB RAM, Asus X570 Pro motherboard, 2TB SSD, terabytes of disk space, dual monitors.
Laptop - Windows 10, MSI GS63VR STEALTH-252, 16GB RAM, GTX 1060 (6GB), 256GB SSD and 1TB HDD

TonyDPrime
Posted 6 Years Ago
Yeah, but UE4's bloom and Octane's bloom don't produce banding.  I use them, so I experience it first-hand.
However, okay, for Octane those are 16-bit renders.  For UE4, I don't actually know what it is.  Production default.

@Animagic, I don't dispute that banding exists elsewhere, but mind you, your fix was in fact for iClone, so I'm just pointing out that other rendering tools don't have the issue at 16-bit.
But yeah, watch the Eevee video YoYoMaster posted with the creature.  At the end, when it's a dark scene with a light, you do see the banding in Eevee too.

@Illusionlab - okay, let's go with the idea that you need 16-bit to eliminate banding, or reduce it, but I thought you had said in this thread that iClone probably outputs 16-bit?
So is it actually 8-bit?  In iClone's case, PNG vs. JPG does nothing as far as banding goes.  If they gave us EXR and the banding went away, that would reveal a lot.

Because if the banding could be fixed by implementing a 16-bit render output, then I'm a 'bit' disappointed that iClone hasn't done this yet.
So that's why I call it a bug.  Maybe not a programming-glitch coding bug, but definitely a performance/output bug.
It is not the intended output AND there is NO way around it through iClone itself.
Anyway, at minimum, we all wish it would go away!
illusionLAB
Posted 6 Years Ago
"let's go with the idea that you need 16 bit to eliminate banding"
It's not an idea, it's a fact.  There are, however, ways to minimize 'banding'... like adding noise or dithering - usually a process applied when creating the rendered file.
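(A minimal numpy sketch of the dithering idea - add sub-quantum noise before rounding to 8-bit so the band edges break up into grain:)

```python
# Compare a plain 8-bit quantize against a dithered one by measuring the
# widest run of identical pixel values (i.e. the widest visible band).
import numpy as np

rng = np.random.default_rng(0)
ramp = np.linspace(0.5, 0.6, 1920)             # the banding-prone gradient again

plain = np.round(ramp * 255)                   # hard 8-bit bands
noise = rng.uniform(-0.5, 0.5, ramp.size)      # +/- half a quantum of noise
dithered = np.round(ramp * 255 + noise)        # bands diffused into grain

def widest_band(x):
    # Length of the longest stretch of identical values.
    edges = np.flatnonzero(np.diff(x) != 0)
    runs = np.diff(np.concatenate(([0], edges, [x.size - 1])))
    return int(runs.max())

print("widest band, plain:   ", widest_band(plain))     # ~74 px
print("widest band, dithered:", widest_band(dithered))  # a few px
```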

The effects that produce the most banding in iClone are essentially 'real time' post effects, so I suspect they are calculated at 8-bit to maintain 'real time' performance (they could actually be provided in hardware by the graphics card... in which case there's nothing RL could do to change it).  If they were calculated at 16-bit during the final render, it would pretty much solve the issue (at the expense of render time).  The upcoming addition of Iray rendering will improve things: as it's 32-bit, it should substitute its own glows, lighting, etc. for iClone's (converted, like Indigo did).  I amended my guess at iClone's architecture to 32-bit, as it does handle HDR - but just because the environment is 32-bit does not mean all the calculations are made at 32-bit... this is probably true of all the game engines, as otherwise they couldn't possibly offer real time.  It's common practice to dynamically assign bit depths 'where they're needed' - you can get away with 8-bit for most things... like texture maps.
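(A toy model of that point - the same smooth glow falloff held in an 8-bit buffer versus a 16-bit one; the numbers are illustrative, not iClone's actual pipeline:)

```python
# A bloom-style falloff that only reaches 20% luma gets a few dozen levels
# in an 8-bit buffer (visible rings); a 16-bit buffer keeps it smooth.
import numpy as np

x = np.linspace(-3, 3, 1920)
glow = 0.2 * np.exp(-x * x)          # smooth falloff around a light source

print("8-bit glow levels: ", np.unique(np.round(glow * 255)).size)    # ~52
print("16-bit glow levels:", np.unique(np.round(glow * 65535)).size)  # hundreds
```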

Also, it's absolutely possible to see 'banding' on 16- or 32-bit images... like I said before, 'most' computer displays are 8-bit (the higher the quality, the better the 'interpolation').  My setup is optimized for colour grading, so I've splurged on a 10-bit (30-bit RGB) monitor... driven by a 10-bit Quadro card.  All 'gaming' cards, like 1080s, output 8-bit.  My theory is that iClone actually uses the images generated by the graphics card for renders... which would ultimately mean the PBR renderer in iClone will never produce a 16-bit image.  Again, this is why Iray (or another third-party 'non real time' renderer) will be the only option for 16- or 32-bit-depth images.
yoyomaster
Posted 6 Years Ago
Banding, from my experience, appears mostly in overblown shots, like those with bloom.  Depending on the codec used for streaming, it may also become more apparent, so I understand why Netflix asks for 16 bits.  Only having 8 bits is definitely a limitation on iClone's part if you are looking to output for any serious production, not to mention that many compositing tools prefer 16 bits over 8 bits!
yoyomaster
Posted 6 Years Ago
Another Eevee render test.
Model by Jakub Chechelski; you can find it on Gumroad at https://gumroad.com/l/JiRtZ. I gave 5 bucks for it, as it is a great model and a great asset for learning character texturing in Substance Painter.
The Node Wrangler add-on, with the Ctrl+Shift+T combo key, makes it a snap to set up Substance Painter textures in Cycles or Eevee; it took me a whole 5 minutes to set this up, including the SSS setup.
This was rendered in Blender Eevee in 19 minutes (600 frames) with 25 4K maps, on a GTX 960 with only 2GB of VRAM, which is impressive!
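(For anyone curious, this is roughly what Node Wrangler's Ctrl+Shift+T does, sketched by hand with Blender's Python API - the file paths and map names are hypothetical placeholders, and it assumes a 2.8-style material with a Principled BSDF node:)

```python
import bpy

mat = bpy.data.materials.new("SP_Material")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
bsdf = nodes["Principled BSDF"]

def tex(path, non_color=False):
    # Load an image texture node; data maps must be tagged Non-Color.
    node = nodes.new("ShaderNodeTexImage")
    node.image = bpy.data.images.load(path)
    if non_color:
        node.image.colorspace_settings.name = "Non-Color"
    return node

links.new(tex("//tex/body_BaseColor.png").outputs["Color"], bsdf.inputs["Base Color"])
links.new(tex("//tex/body_Roughness.png", True).outputs["Color"], bsdf.inputs["Roughness"])

nmap = nodes.new("ShaderNodeNormalMap")
links.new(tex("//tex/body_Normal.png", True).outputs["Color"], nmap.inputs["Color"])
links.new(nmap.outputs["Normal"], bsdf.inputs["Normal"])
```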


TonyDPrime
Posted 6 Years Ago
yoyomaster (6/13/2018)
Another Eevee render test.
Model by Jakub Chechelski; you can find it on Gumroad at https://gumroad.com/l/JiRtZ. I gave 5 bucks for it, as it is a great model and a great asset for learning character texturing in Substance Painter.
The Node Wrangler add-on, with the Ctrl+Shift+T combo key, makes it a snap to set up Substance Painter textures in Cycles or Eevee; it took me a whole 5 minutes to set this up, including the SSS setup.
This was rendered in Blender Eevee in 19 minutes (600 frames) with 25 4K maps, on a GTX 960 with only 2GB of VRAM, which is impressive!




How many maps does this thing have? ...5 minutes, but how many textures did you have to set up?
Just curious what kind of workflow you'll really be dealing with here when it comes to a full scene, or an avatar, coming from iClone.  Looks cool!


