Author | Message

JIX
Group: Forum Members | Last Active: Last Year | Posts: 1.1K, Visits: 1.3K
Sure ... just throwing in some visions of the future. No worries, I don't expect them to become reality in one fell swoop. ;)

justaviking
Posted 10 Years Ago
Group: Forum Members | Last Active: Last Year | Posts: 8.2K, Visits: 26.5K
It's true that a good rendering engine will cull "out-of-sight" objects from the rendering process. But on the other hand, it's still in the scene, so it needs to be taken into account to a certain extent. We have to accept some responsibility for the load we put on our systems.

It's also a responsibility for the people who create and sell props. A well-constructed prop is both beautiful and efficient, in both texture and polygon count. (Shhh... don't let anyone think sw0000p and I might be in agreement on something. Shhh...)
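To make the culling point concrete, here is a minimal sketch in plain Python of the kind of test a renderer runs to skip objects behind the camera. This is an illustration only, not iClone's actual engine code; the function name and the single-plane test are my own simplification.

```python
def sphere_in_front(cam_pos, cam_dir, center, radius):
    """Crude visibility test: is a bounding sphere at least partly in
    front of the camera?  A real engine checks all six frustum planes
    (and cam_dir must be unit length), but the principle is the same."""
    to_center = [c - p for c, p in zip(center, cam_pos)]
    # Signed distance of the sphere's center along the view direction.
    dist = sum(t * d for t, d in zip(to_center, cam_dir))
    return dist > -radius

# A tree 50 units behind the camera is skipped entirely...
print(sphere_in_front((0, 0, 0), (0, 0, 1), (0, 0, -50), 2.0))  # False
# ...while one in front of the camera still gets drawn.
print(sphere_in_front((0, 0, 0), (0, 0, 1), (0, 0, 30), 2.0))   # True
```

Note that culling only saves draw time; the culled prop still occupies memory, which is exactly why scene discipline matters even with a good engine.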
iClone 7... Character Creator... Substance Designer/Painter... Blender... Audacity... Desktop (homebuilt) - Windows 10, Ryzen 9 3900x CPU, GTX 1080 GPU (8GB), 32GB RAM, Asus X570 Pro motherboard, 2TB SSD, terabytes of disk space, dual monitors. Laptop - Windows 10, MSI GS63VR STEALTH-252, 16GB RAM, GTX 1060 (6GB), 256GB SSD and 1TB HDD

JIX
Group: Forum Members | Last Active: Last Year | Posts: 1.1K, Visits: 1.3K
justaviking wrote: "There are many good reasons to be thoughtful when creating your scene. I'm not good at it, but we should always think like a 'real' movie producer, and only build what the camera will see. If you plan your shots, you don't need a forest of trees behind the camera. You don't need fully rendered buildings if you only see the front surface. You don't need high-resolution 4K textures on unimportant props that will never be more than a hundred pixels high on the screen."

So true, but because it's not real at all, the software should lighten the user's workload wherever and whenever possible. At least that's preferable in my opinion. Sorry RL, again some workload for you guys. ;)
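The "hundred pixels on screen" rule of thumb can be turned into a quick back-of-the-envelope calculation. This is just arithmetic, not a Reallusion tool; the function name is my own:

```python
def needed_texture_edge(max_onscreen_pixels):
    """Smallest power-of-two texture edge that still provides roughly
    one texel per screen pixel for a prop of the given on-screen size."""
    edge = 1
    while edge < max_onscreen_pixels:
        edge *= 2
    return edge

# A background prop that never exceeds 100 px tall needs only a
# 128-px texture; a 4K (4096 px) map on it is pure waste.
print(needed_texture_edge(100))   # 128
# A full-height prop on a 1080p render justifies about a 2K texture.
print(needed_texture_edge(1080))  # 2048
```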

justaviking
Posted 10 Years Ago
Group: Forum Members | Last Active: Last Year | Posts: 8.2K, Visits: 26.5K
genao87 (10/20/2015): "God damn JustAskViking, you have a full blown 4GB video card and iClone uses it all. So it is a safe bet to get the 980 Ti or Titan that have 6GB."

To be fair (sort of), I intentionally created a stress test for iClone 6. I exceeded expectations in that regard. I believe a lot of people have come to the realization that there is plenty of room for optimization in iClone. In fact, I was able to demonstrate cases where having a reflective surface in a scene would cause my GPU (not CPU) to run at 100% even when iClone was idle (not playing). If I minimized iClone, the GPU would idle again. I think I'll test that again, now that I think of it.

There are many good reasons to be thoughtful when creating your scene. I'm not good at it, but we should always think like a "real" movie producer, and only build what the camera will see. If you plan your shots, you don't need a forest of trees behind the camera. You don't need fully rendered buildings if you only see the front surface. You don't need high-resolution 4K textures on unimportant props that will never be more than a hundred pixels high on the screen.

But with all that said, I was salivating over rumors (circa Dec. '14) of an 8GB "980" card. It turned out to be just that, only rumors. But I am extremely impressed with the specs and reviews of the 6GB 980 Ti. I'd love to re-render my test case on that card. I'm willing to accept a card if anyone wants to donate one to my experiment. ;)

For anyone who is curious but hasn't seen my lighting stress test before, here it is once again for your convenience.
CORRECTION:
My Radeon 6850 seems to have 1GB of memory. Strange that DXDiag was saying 4GB. It had been a while since I looked at the actual specs of my current card (rather than the ones I dream about), so I repeated what DXDiag said without thinking anything was amiss. GPU-Z reports 1GB. Hmmm, strange indeed. My test project, though, did require a bit over 4GB of video memory, so there was a lot of swapping going on. That's why I'm so excited about the 980 Ti. Maybe I'll get one someday.
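For anyone who wants to sanity-check their own scenes the way GPU-Z does, texture memory is easy to estimate by hand. A rough sketch, assuming uncompressed RGBA (4 bytes per texel) and a full mip chain; real engines often use compressed formats that shrink this considerably:

```python
def texture_mib(edge, bytes_per_texel=4, mipmaps=True):
    """Approximate GPU memory for one square texture, in MiB.
    A full mip chain adds roughly one third on top of the base level."""
    base = edge * edge * bytes_per_texel
    total = base * 4 / 3 if mipmaps else base
    return total / (1024 * 1024)

# One uncompressed 4K texture is about 85 MiB with mips...
print(round(texture_mib(4096)))  # 85
# ...so roughly 48 of them alone would fill a 4GB (4096 MiB) card.
print(int(4096 // round(texture_mib(4096))))  # 48
```

Numbers like these make it easy to see how a heavy scene sails past 4GB and starts swapping.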

genao87
Posted 10 Years Ago
Group: Forum Members | Last Active: Last Year | Posts: 175, Visits: 1.0K
God damn JustAskViking, you have a full-blown 4GB video card and iClone uses it all. So it is a safe bet to get the 980 Ti or Titan, which have 6GB.

Rampa
Posted 10 Years Ago
Group: Forum Members | Last Active: Last Week | Posts: 8.2K, Visits: 62.6K
I've found that lots of lights kill performance pretty quickly. Lots of trees and avatars don't have as big an impact.

justaviking
Posted 10 Years Ago
Group: Forum Members | Last Active: Last Year | Posts: 8.2K, Visits: 26.5K
I typically get around 5-10 fps on my final render. Typical output is... MP4, 1920x1080, 4x(?) anti-aliasing.
My worst-ever performance was 2 to 3 fps during preview, and 23-seconds per frame (0.04 fps) while rendering. That was for my big iClone 6 "unlimited lights" test, with 10 avatars, about 25 lights, and a full forest of trees. According to GPU-Z, it resulted in a 4.5GB memory load on my 4.0GB graphics card. I attribute the majority of the poor performance to having exceeded my card's available memory.
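Those per-frame numbers translate into project time very quickly. A quick conversion, plain arithmetic and nothing iClone-specific (the function and the 60-second clip are my own example):

```python
def render_hours(seconds_per_frame, clip_seconds, output_fps=30):
    """Wall-clock hours to render a clip at a given output frame rate."""
    frames = clip_seconds * output_fps
    return frames * seconds_per_frame / 3600

# 23 seconds per frame is about 1/23 = 0.04 fps...
print(round(1 / 23, 2))      # 0.04
# ...so a mere 60-second clip at 30 fps would take 11.5 hours.
print(render_hours(23, 60))  # 11.5
```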

MistaG
Posted 10 Years Ago
Group: Forum Members | Last Active: 5 Years Ago | Posts: 43, Visits: 277
What is it with duplicate postings...:w00t:

MistaG
Posted 10 Years Ago
Group: Forum Members | Last Active: 5 Years Ago | Posts: 43, Visits: 277
Thanks for the update, Kelleytoons. I was looking into buying a 970. I would probably still get it and see if there would be any changes in performance.

Kelleytoons
Posted 10 Years Ago
Group: Forum Members | Last Active: Last Year | Posts: 9.2K, Visits: 22.1K
Just to give you some more data: I tried rendering the Abandoned House scene at 1080p with High Quality shadows turned on and supersampling set at 2x2, and I got around 90 frames a minute in final render mode on my NVIDIA 970 card (I have an i7 dual-core machine but it ain't particularly fast, as it's at least a year old and wasn't top of the line when I bought it). I got around 370 frames per minute in preview mode. So, oddly, my 970 card is two or three times slower than Rampa's in Preview, but about 50% faster in final render mode. (Which is weird, but hardware ain't my thing - I'm a software engineer :>).
Alienware Aurora R16, Win 11, i9-14900KF 3.20GHz CPU, 64GB RAM, RTX 4090 (24GB), Samsung 870 Pro 8TB, Gen3 NVMe M.2 SSD, 4TBx2, 39" Alienware Widescreen Monitor Mike "ex-genius" Kelley