|
By justaviking - 11 Years Ago
|
|
Greetings, I saw Rampa started his corner, and I thought, "Why not?" So welcome to Justaviking's Corner. For any new people, you may call me "Viking," or if you really want to save letters, "JVK" or even just "JV." Me? I never conserve on letters. I type a lot.
I have made a couple of little videos already, which Reallusion was kind enough to post. One was a tutorial on "Indigo materials" and applying them to scenes in Indigo. I have learned more since making that, and will add some updates in a "Part 2" I've committed to completing. They also posted a "Make it Sparkle" video, which is intended to be a "marketing video" showing differences between iClone and Indigo renders. Lastly, I've also committed to making a "miscellaneous" tutorial about some Indigo stuff I've learned. Hopefully I'll get them finished in the next week or so.
SOMETHING SPECIAL: I've been wanting to do a couple of things, and finally finished:
- Explore iClone 6's "unlimited lights" capabilities
- Stress test iClone 6
The stress test turned into a major stress on my graphics card. I found out that a 1GB graphics card really struggles to render a 3GB project. Duh! I rendered as slowly as 30 seconds per frame, so my little project of 6,000 video frames took about 45 HOURS to render in iClone. (And I thought Indigo would be slow!) Really, it's not iClone's fault. Other projects render just fine. It's a good demonstration of what happens when your project is THREE TIMES LARGER than what your GPU can hold. On a positive note, it rendered flawlessly.
As for the lights, I'll go ahead and give you some production statistics:
- 1 huge spot light
- 4 small point lights
- 10 small spot lights
- 9 colored spot lights
- 10 avatars
- A zillion trees (I didn't count them)
Without further ado, here's my little video: iClone Lights (and Mason)
|
|
By wires - 11 Years Ago
|
|
Way to go Viking. :cool: I just knew that you lot weren't just a horde of plunderers, but that you could enlighten us with your knowledge and skills. :)
|
|
By RobertoColombo - 11 Years Ago
|
Thumbs and toes up for you too, Justaviking! Thanks a lot
PS: the video is simply great! The idea is superb
|
|
By Rampa - 11 Years Ago
|
|
Awesome JVK! I see you there kitty-corner from me. ;)
|
|
By justaviking - 11 Years Ago
|
|
I'm glad you enjoyed the video. It was fun making it, other than the agonizingly slow render time. It will be fun to re-render it someday in the future after I have a video card upgrade. I must repeat this... Other projects rendered fine in iClone 6. This one massively overloaded my graphics card, and it's not iClone's fault.
Here are the other two videos I made. This first one is an Indigo animation. It's meant to be more "marketing and sales" than technical. I benefited from symmetry, so I only had to rotate the diamond "one facet" or 1/16th of a rotation. Then I copied the images in my editor to make it look like it was rotating longer. (There I go again, giving away all my secrets. Sort of the inverse of Viking pillaging.)
This next video is actually my first one, and is more of a "tutorial" type of video. I learned some other ways of doing things shortly after I made it, and back when I made it there were some features that weren't working yet (but are now). So what you see here is not the only way to do this, and is not necessarily the best, but it is one of several ways of doing things. The focus was on Indigo materials, how they differ from iClone materials, and how to apply non-iClone materials to your model when you render it in Indigo. I do still intend to finish this multi-part tutorial, hopefully in the next few days.
|
|
By VirtualMedia - 11 Years Ago
|
|
That's impressive for both you and ic6. Nice Work!
|
|
By Sifr - 11 Years Ago
|
I actually learned something new, here. Can't wait for the next installments.
Btw, as general practice, make sure you have the renderer STOPPED, then go to the Render Settings and turn GPU support on or off to select your desired render resources. A lot of features are grayed out during render processes and cannot be adjusted on the fly.:w00t::w00t::w00t:
|
|
By justaviking - 10 Years Ago
|
|
Fwooo! (That's me blowing the dust off this thread.) Life is interesting. I have been "almost done" with Part 2 of my "Materials" tutorial for two, maybe three weeks now. It suffered from bloat, so I trimmed it back down to focus on the primary topic and put the "bloat" into its own tutorial, which is 3/4 done. I've also been working on another one, the one that will probably be the most interesting, and it also is about 3/4 done. Just when I thought I'd finally wrap up one or two of these things, we discovered out-of-state friends were going to be passing through, and would stop in for a visit. Well, pause iClone and cue up the frantic house cleaning! Then we discovered their itinerary changed, and maybe they'll be here next week. Sigh. Oh well, the house looks a lot better now (hopefully it survives a week), and I finally got some quality time to finish up one of my tutorials today. It's uploading right now. It shouldn't be long before I'm posting my newest video.
|
|
By justaviking - 10 Years Ago
|
|
Okay, here is Part 2 of 2 of my tutorial about Indigo Materials for iClone users. In Part 1, I replaced the semi-diamond material with a real diamond material, but the diamond turned out blue rather than sparkly white. Why? That's the topic of Part 2. This is actually less a tutorial than a lecture, or rather, I'd like to think I'm taking you along with me on my journey of discovery. Hopefully you'll be able to find something useful and interesting in here. Now that I am starting to collect a couple of videos, I should set up a proper playlist on YouTube.
|
|
By theschemer - 10 Years Ago
|
Hey Viking, why is the diamond so blue? Shouldn't it be clear? :D
Great videos, I am playing catch up... Thanks, TS
|
|
By justaviking - 10 Years Ago
|
|
@TS - Great quote. That means you didn't fall asleep (at least not during that part), so that's encouraging to me. @everyone - I made a playlist. I feel so official. https://www.youtube.com/playlist?list=PL_2ITnz48gdfwZLcOqMOY7WF4BFDZpCr8 I'm going to add an older tutorial on "Smooth Camera Motion" that I've had on Vimeo. I'm uploading it to YouTube and I'll include it in my Tutorial playlist. It was made in iClone 5 but applies equally well to iClone 6. It's still uploading, so I'll add it to the playlist in the morning.
|
|
By KenCoon - 10 Years Ago
|
Really enjoyed the video. Only part I am worried about is the long rendering time in Indigo. My new computer is only an i3 so it will be much slower. I do have 8GB RAM and am working on a GTX750TI card. The computer should be here on Tuesday and then I will have to put an operating system in and download everything. Maybe Wednesday or Thursday before I am up and crawling. I am just waiting to learn more and practice. Ken:)
|
|
By pumeco - 10 Years Ago
|
Just started watching these. I have no interest in Indigo (at least not for now), but I still enjoyed the videos; you have a very pleasant way of speaking, and you make a good tutor!
|
|
By kenmatthews - 10 Years Ago
|
Hello Viking,
Loved your lighting video, very imaginative...
Best regards. Ken.
|
|
By justaviking - 10 Years Ago
|
|
Thanks for the kind words, guys. It's sort of funny, isn't it, that I'm spending time on these Indigo tutorials and I didn't even buy a licensed version of Indigo? (Notice my Indigo shots are always watermarked.) I was initially thrilled with the idea, but then did the math on the rendering times and realized it wasn't for me. It's fun to make the stills, but it wasn't something I wanted to put money into. However, since I played with it and there's a lot of interest in it, I decided to share what I learned. I also have a couple of other tutorials I've wanted to do for a long time. One is about UV maps, which applies equally well to both iClone 5 and iClone 6, and the other is a "technique" I wanted to reveal (top secret until I make the tutorial) that applies to all versions of iClone. My remaining Indigo ones are getting close to being done. Hopefully it's only a matter of days, not weeks, until they're done.
|
|
By mark - 10 Years Ago
|
|
AWESOME info VIK!!!! Thanks and of course...KEEP 'EM COMING!!!:P:P:P:P
|
|
By justaviking - 10 Years Ago
|
|
I have a complaint about YouTube. I used to upload all my videos to Vimeo. It was great, but on one project I encountered size limitations on the free version, so I finally created a YouTube channel. YouTube does not allow you to REPLACE A VIDEO like Vimeo does. The logic, I guess, is that you could have a popular video with a lot of views and replace it with a dumb one, and people would wonder why they got directed to your video, and why it was popular. I'd be happy, though, to have the option of replacing the video (while keeping the URL the same) even if it meant wiping out all the statistics. I don't monetize my videos anyway. Why does this matter? I uploaded my "Smooth Camera Motion" tutorial, and picked my full-HD version, even though I knew I had uploaded a different one to Vimeo. The one I uploaded last night has "pixelation" errors, especially near the end, introduced by my video editor (a test version). So now my video on YouTube looks really bad when the camera pans. In Vimeo I'd simply "replace" the video. It's not changing any substance. It's like fixing a spelling error in a post. But in YouTube the new upload is its own thing, and I'll probably have to delete the other one (or provide a link to the fixed video). I wish I could simply update the video file, just like I can the description and title.
|
|
By justaviking - 10 Years Ago
|
|
Okay, I think I'm done bumping my own thread for a while. I re-rendered my old "Smooth Camera Motion" tutorial in full 1920x1080 resolution without the pixelation I was suffering from before. (It was a bug in my editing software, not iClone, and has been fixed.) The playlist URL I gave before is the same: https://www.youtube.com/playlist?list=PL_2ITnz48gdfwZLcOqMOY7WF4BFDZpCr8
For your convenience, here's the clean HD version of the camera tutorial: .
|
|
By animagic - 10 Years Ago
|
|
Very nice tutorial about the camera motion, Viking! This will come in handy for my next project! :cool:
|
|
By theschemer - 10 Years Ago
|

We're safe! The Viking is happy again!
Great tutorial. I like how you say what key(s) you press when you do anything. That helps more than you know. :) Thanks, TS
|
|
By brand468 - 10 Years Ago
|
|
Great animations Viking :)
|
|
By pumeco - 10 Years Ago
|
justaviking (1/11/2015)
Okay, I think I'm done bumping my own thread for a while. I re-rendered my old "Smooth Camera Motion" tutorial in full 1920x1080 resolution without the pixelation I was suffering from before. (It was a bug in my editing software, not iClone, and has been fixed.) The playlist URL I gave before is the same:
https://www.youtube.com/playlist?list=PL_2ITnz48gdfwZLcOqMOY7WF4BFDZpCr8
For your convenience, here's the clean HD version of the camera tutorial: .
Will have to give that a go. I've run a camera along a path whilst having it look at something else, but I've never run it along a path while targeting something on another path to control it in such a way, neat idea!
|
|
By justaviking - 10 Years Ago
|
|
I'm going to whine for a moment. Feel free to ignore it. Why, oh why does it seem like everything has to turn into another oddity and mystery to investigate? I was finally, seriously working on a long-overdue project. Then something caught my eye. Indigo was doing something really strange, and it's a function I've used successfully before. Do I try to ignore it, or try to figure out why it's behaving strangely now when I know I've gotten it to work fine before? I have a hard time turning my back on this. I fear other people will trip over it. And I'd like to understand it. But here we go again, another distraction to keep me from ever finishing my real project. Grrr and sigh.
If anyone is really curious... In Indigo, when you do a Save As on a scene (using the IGS format), you should get the IGS file and *two* folders: one for the meshes and one for the texture maps (JPG files). Everything should be in the same parent folder. Tonight I wasn't getting the "images" folder (with the texture maps). In short, the images folder was getting written to the default directory that iClone uses for the Indigo renders. This was from doing a Save As in Indigo. The IGS file went to the correct location. The "mesh" folder went to the correct location. But not the "images" folder. Looking inside the IGS file, it actually pointed to the two different locations. So the IGS file would work... until I wrote a new Indigo scene from iClone, thus overwriting the "images" folder. At that point, my carefully saved IGS file is broken. So why did it work before? And why is it misbehaving tonight? If I can figure this out, even a little, you can be sure I'll be writing up an ISSUE on this. But is this purely an Indigo problem? (Seems likely, since I'm doing a Save As inside Indigo, so iClone should have nothing to do with it at that point.) Gahh, why can't I simply ignore stuff like this and do my project?
Double-grrr.... I can't even get it to work correctly at all now. It worked correctly once, earlier this evening. Why that one time? I don't want to reboot my machine, but even killing all my Indigo windows hasn't helped. For some reason it's determined to write the "images" folder into iClone's default Indigo render temp folder. Something's really stuck. I really don't want to reboot my computer (I have a bunch of stuff set up just right, and they don't survive reboots). But man, this is really annoying me. It could be a serious problem for some people. I know what is happening, but not why!!!
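For anyone who wants to check where their own Save As actually put things, here's a tiny sketch. The folder names ("meshes" and "images") and the example path are assumptions based on what I'm seeing on my machine, so adjust them to whatever your save produces:

```python
# Minimal sanity check for the Save As layout described above: the .igs file
# plus a mesh folder and an images folder in the same parent directory.
# "meshes"/"images" and the example path are assumptions -- adjust to taste.
from pathlib import Path

def check_igs_layout(igs_path):
    parent = Path(igs_path).parent
    for name in ("meshes", "images"):
        folder = parent / name
        status = "OK" if folder.is_dir() else "MISSING (written elsewhere?)"
        print(f"{folder}: {status}")

check_igs_layout(r"D:\IndigoScenes\test_scene.igs")  # hypothetical path
```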
|
|
By animagic - 10 Years Ago
|
|
OK, I've just tested this and I notice the same. Images are saved under C:\Users\<user>\AppData\Roaming\Indigo Renderer\TX_default. It works indeed differently from the Indigo trial version I used before. Perhaps it has to do with the special Indigo RT iClone version. Anyway, this is not good at all, because it will put huge amounts of textures on your C: drive.
|
|
By animagic - 10 Years Ago
|
|
OK, well I think it isn't as bad as I thought. I did another Save As test and checked the IGS file, and in that case it shared the Images folder, so it did not make a copy of it. It may be that the images in C:\Users\<user>\AppData\Roaming\Indigo Renderer\TX_default are from a scene that was exported directly from iClone without first saving it as a scene file. I went to the specific folder and it only had textures for this one scene. I have used Indigo RT after the official IC6 release and it did create the two directories in the right place initially. And when I did a Save As, the Images folder turned out to be shared. I have to go to work now to pay for this expensive hobby, but I will check tonight. So we may be OK and you can go back to your project...:P
|
|
By justaviking - 10 Years Ago
|
|
Thanks, Animagic. I will have to look into it more too. Well, I don't *have* to, but I will anyway. ;) Your folder path seems a bit different from mine. Of course we are on different systems, but mine was something like ...iClone/Temp/Indigo/RenderExport... something like that. It was buried in the iClone "temp" area. My observation was it always overwrote the "temp" files. This is when you do a "Render" (not the "Export") from iClone. So the good news is your C: drive does not fill up with garbage. It just keeps overwriting it. I'm starting to think the Save As never wrote the "images" folder to the correct location, and the successes I saw were when I did the "Export" option. Oh, FYI, I'm running the Indigo RT that I downloaded directly from the Indigo web site, not the one that comes through Reallusion. I wonder if some sort of environment variable is set somewhere, or some goofy Registry setting. Indigo or iClone? Who's at fault? Doing a "Save As" in Indigo shouldn't involve iClone, but it's writing into an iClone folder. Very curious. Anyway... I'll play with it a bit more again tonight, and then write up an issue.
|
|
By wires - 10 Years Ago
|
|
@Viking, Have you checked the settings in the Indigo Options? 
|
|
By Shaky - 10 Years Ago
|
|
justaviking, I liked your video!
|
|
By justaviking - 10 Years Ago
|
|
@Shaky, Thank you. I'm glad you liked it. @Gerry (wires), Yes I did. But when I use "Save As" I'm not using either the autosave or cache folders. (I left both options unchecked.) It should save the entire scene where I tell it to. The mesh folder works correctly; why not the images folder? I suppose I could try setting custom folders. I wonder if the mesh folder will go where I "Save As" to, and the images folder will then end up in my custom auto-save folder. That would be interesting. Wrong, but interesting. I wonder if there's anything about this on the Indigo forums.
|
|
By justaviking - 10 Years Ago
|
|
Greetings,
EXECUTIVE SUMMARY:
- I'm planning to do an iClone performance benchmark that involves a GPU upgrade to an Nvidia GTX 970.
I thought I would be sharing some fascinating render performance information with you today. But it hit a snag.
THE PLAN:
- My son is upgrading to a GTX 970 graphics card.
- Take a project that crushed my system
- Run it on his system before he upgrades
- Run it on his system with the new GPU
THE HARDWARE: My son and I have nearly identical systems.
- CPU
- Both have i7 4770 CPUs.
- Mine is the "unlocked" 4770K, so they're not identical, but very close.
- Motherboard
- Both have ASUS motherboards.
- I have more fan headers and a couple more SATA ports, but in terms of performance they're extremely similar.
- SSD
- Both have 250GB SSDs.
- Not identical, but both are on 6Gbps SATA ports, so they're very similar
- Memory
- I have 16GB, my son only has 8GB.
- Both are "1600" speed.
- Graphics cards
- I have the AMD Radeon HD 6850 (Rating = 2,235) <-- No CUDA
- He has the Nvidia GeForce GTX 460 (Rating = 2,252) <-- Has CUDA cores !
- Both cards are 1GB memory cards
- HIS NEW CARD WILL BE A 4GB GTX 970 (Rating = 8,625) <-- CUDA, lots of it
- Overclocking
- None.
- Other than the "automatic" overclock provided by Intel and ASUS, we don't overclock.
The GPU ratings are from the chart PlanetStarDragon likes to use.
THE TEST: On my iClone 6 lighting "stress test" video (the "UFO" video), I rendered at the super-slow rate of 25 seconds per frame. Ouch! It took way over 30 HOURS of non-stop rendering to make my video.
THE WILDCARD: If anything made a difference in performance, it should be the CUDA cores on his Nvidia card. I really wanted to find a good example case where his CUDA cores would run circles around my otherwise-similar non-CUDA AMD card.
THE SNAG AND RESULTS: What went wrong? My son's 8GB of system memory was a surprising limitation. After running out of RAM on the first attempt, we ended up rebooting to help clear things out. Even after a reboot I was annoyed at how much RAM his system consumes. I don't make great efforts to optimize and fine-tune my system, but he has so much crap that runs in the background I couldn't believe it. So we killed a bunch of things (he was surprised too), launched iClone 6, and opened the project. His realtime preview ran around 6 FPS. Mine seemed to run anywhere from 4-6 FPS when I was working on the project, so this provided initial confirmation that our systems really are as similar as I expected them to be. Then we rendered. Wow!!! He was rendering at 3 frames per second!!! That is 75 times faster than my system!!! But then we hit the snag. We played the resulting WMV file, and it was 100% black. No video at all. So we rendered to AVI. It was black again. And that's where my son ran out of patience and wanted to get to work on his computer rebuild. (He has to get a new case so the GTX 970 will fit, so he has to transplant all the guts to the new case.) I don't know why it wasn't working. During the render, nothing moved on the screen. The system memory was hovering around 6.3GB on his 8GB computer. I may never know why it wasn't rendering for him.
NEXT STEP:
He doesn't have a lot of hands-on experience with the hardware, and he didn't finish the transplant. I'm pretty sure I'll be helping him finish it tonight. Once the dust settles, we'll try the project on his new GTX 970. At least we'll have a good baseline for the "preview" rendering speed. And we'll check his rendering speed too. He also has some Blender baselines prepared, since they interest him more than iClone does. Hopefully I'll have updates (much shorter) in a couple of days.
|
|
By mark - 10 Years Ago
|
Got me glued to the screen...:D can't wait to see what happens in the next thrill-packed episode!!!:P:P:P:P
Thanks for sharing Vik!!!!
|
|
By The Mythical Dragon - 10 Years Ago
|
Awesome thread. You are really putting out some useful information. Thank you very much.
|
|
By justaviking - 10 Years Ago
|
|
Mark and Doug, Imagine how thrilling and useful it will be when I have results to share! :P
|
|
By KenCoon - 10 Years Ago
|
I am glad to find that I am not the only one who gets side tracked during the day and never gets back on track until much later and then we are behind more. ( There is a line that fits in here but this is a family forum.) Have a nice day and keep up the good work. You are much appreciated. Ken:)
|
|
By mark - 10 Years Ago
|
|
justaviking (1/27/2015) Mark and Doug,
Imagine how thrilling and useful it will be when I have results to share! :P
Results! Results? We don't need no results... it's the thrill of the chase that counts!!!! :P:P:P:P
|
|
By justaviking - 10 Years Ago
|
|
KenCoon (1/27/2015) I am glad to find that I am not the only one who gets side tracked during the day and never gets back on track until much later and then we are behind more. ( There is a line that fits in here but this is a family forum.) Have a nice day and keep up the good work. You are much appreciated. Ken:)
As my wife learned as a child from her dad, "The hurrier I go, the behinder I get."
|
|
By justaviking - 10 Years Ago
|
|
It's a strange mix of pride and shame now that my son's computer out-performs mine.
PRELIMINARY RESULTS - BLENDER: First, a recap of his old and new cards:
- OLD... GTX 560, 1GB (Rating = 2,252)
- NEW... GTX 970, 4GB (Rating = 8,625... theoretically 3.8x faster)
SatelliteLevelForTwinStickShooter1
- 560: 09:45.22 | Mem: 73.37M, Peak: 75.62M
- 970: 04:31.80 | Mem:69.03M, Peak: 71.28M
- Improvement: 2.17x faster
DesktopGalaxy2 [progressive render - renders the whole image sample level by sample level until you tell it to stop OR it reaches the given stopping point]
- 560: 23:25.96 | Mem: 72.67M, Peak: 72.67M
- 970: 13:13.76 | Mem:72.88M, Peak: 72.88M
- Improvement: 1.77x faster
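If anyone wants to redo the "Improvement" arithmetic with their own timings, it's just the ratio of the old render time to the new one. Here's a throwaway sketch using the numbers above (times are mm:ss.xx):

```python
# Speedup = old render time / new render time, with times given as "mm:ss.xx".
def to_seconds(t):
    minutes, seconds = t.split(":")
    return int(minutes) * 60 + float(seconds)

benchmarks = {
    "SatelliteLevelForTwinStickShooter1": ("09:45.22", "04:31.80"),
    "DesktopGalaxy2": ("23:25.96", "13:13.76"),
}

for scene, (old_560, new_970) in benchmarks.items():
    speedup = to_seconds(old_560) / to_seconds(new_970)
    print(f"{scene}: {speedup:.2f}x faster")  # roughly 2.2x and 1.8x
```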
ANALYSIS: First and foremost, you have to be careful about reading too much into ONE benchmark. The "ratings" listed a few posts back were the "G3D Mark" score. It is typical for cards to excel in one synthetic benchmark, struggle in a particular game, and reign supreme in another game's benchmark. Based on the G3D Mark ratings, we could expect the new card to be 3.8x faster, but we're seeing roughly a 1.8x to 2.2x improvement.
NEXT STEP: I can't wait for some iClone rendering results. Not only will it be a faster card overall, but my old card doesn't have CUDA. (I really wanted to see if his old card with CUDA would noticeably outperform my non-CUDA GPU. But I don't trust the "all black" rendering results.) Stay tuned for more results. :)
|
|
By justaviking - 10 Years Ago
|
|
Finalllly.... Some iClone performance numbers.
The cards:
- Old = GTX 560 - 1GB
- New = GTX 970 - 4GB
The test case:
- My "UFO" lighting and stress-test project
Interactive preview fps:
- 560 - About 3
- 970 - About 16... 5x faster
Rendering:
- 560 - About 25 seconds per frame (not frames per second)
- 970 - About 3 seconds per frame (still not frames per second)... 8x faster
The project was bumping into [EDIT: "exceeding"] the 4GB RAM limitation on the 970. No wonder my little 1GB card struggled. I tried turning off supersampling (a render option), and also turned on "Level of Detail." That helped speed things up, but only for a while. The GPU memory usage was around 3.85GB for a little while, but after several seconds it climbed up and hit [EDIT: "exceeded"] the 4GB ceiling again. It would be interesting to delete a lot of trees and see if that would keep my memory usage down. I'm pretty sure I have a lot of trees that are not in the camera's field of view (at least during my test segment). The 16fps felt reasonable for editing purposes. Not ideal, but I could "scrub" well and I could see what was happening.
|
|
By animagic - 10 Years Ago
|
|
Interesting that you almost ran into the 4GB limit. I had the same with one of my projects. This is somewhat troubling... The card I bought was initially for a new build (postponed because of time and costs), but it is now really helping my older system. I was surprised, though, when GPU-Z showed a memory usage of about 3.8GB. I was rendering at 1920x1080. What was your resolution? Anyway, if the new build ever gets built, I may be looking for an 8GB version of the card, which is expected later this year.
|
|
By justaviking - 10 Years Ago
|
|
I should clarify that I EXCEEDED 4GB of RAM usage, not merely approached it. I was rendering at 1920x1080 resolution. I usually use "supersampling = On" and my output format was MP4 video. I was excited when one of the changes looked like it would keep me below the 4GB level, but it didn't stay that way for very long. On a related note, there has been some "discovery" that the memory on the 970 is divided between two controllers. Roughly 3.5GB is on a "fast" controller and the remainder is on a "slower" controller. One has to keep in mind that "slow" memory on a graphics card is still a lot faster than reaching back to the system RAM on the motherboard. Here are two articles about it: It's difficult to find games where hitting the "slow" memory is actually a problem, because you usually slow down for other reasons first, but this might be a non-gaming situation where it becomes relevant. Of course, since I actually peaked over 4GB, I hit system RAM anyway, and that would have a much greater impact. The 970 is still a very nice card, and this new information does not invalidate any of the published benchmarks. It is very interesting to hear about an 8GB version coming out. Any idea when??? Does "later this year" mean June, or November? Based on this sample project, I'd probably be much better off with an 8GB 970 than a 4GB 980. There's no doubt that the 4GB 970 is a huge improvement over my current card, but I can already see where a "more than 4GB" card would be nice.
|
|
By animagic - 10 Years Ago
|
justaviking (2/18/2015) It is very interesting to hear about an 8GB version coming out. Any idea when??? Does "later this year" mean June, or November? Based on this sample project, I'd probably be much better off with an 8GB 970 than a 4GB 980. There's no doubt that the 4GB 970 is a huge improvement over my current card, but I can already see where a "more than 4GB" card would be nice.
The release date looks like a moving target from what I'm reading...:P An indication that a release may be imminent (it was March back in January) is that Samsung has cranked up production of 8Gbit chips. (:Whistling: Starts reminiscing about the 1kByte RAM on his KIM-1 microprocessor kit...:doze: Oops!) So we will see. But in the next two months would be nice. Unfortunately, 99.99% of the discussions I've read are about whether it's useful for gamers (opinions divided), with claims that the memory bus is not wide enough for that amount, etc. They assume that it may be useful for video editing, but nobody seems to do 3D animation. And I would concur that an 8GB GTX 970 is the better choice. My GTX 980 has it easy; even with heavy scenes usage hovers around 50%. Do you have some GPU-Z data from your experiment? I'm curious to know how much the 970 gets taxed.
|
|
By prabhatM - 10 Years Ago
|
Could we have some more info about the kind of Project you were doing that you overshot the limit of 4GB on a 970/980 ?
A still image would be of some help.
|
|
By justaviking - 10 Years Ago
|
|
prabhatM (2/18/2015) Could we have some more info about the kind of Project you were doing that you overshot the limit of 4GB on a 970/980 ?
A still image would be of some help.
Sorry about that. I guess it's time to do that again. I purposely built this scene to stress test iClone and demonstrate "unlimited" lights. It takes place in the forest in the "Level of Detail" sample project. The amazing thing is I rendered for over 30 hours without any crash or error on my poor little video card. My "performance measurements" on the 970 card were taken from around the 2:25 mark on the video below. My son was running ASUS GPU Tweak. I didn't grab any screenshots of it, but was surprised to see it go over 4GB. I'll have to annoy him into re-opening the project and getting some more statistics. (He also has to shut down some other junk on his computer so he doesn't exceed his 8GB of system RAM. One is some "chat" thing, and he was stunned to see how much memory was freed up when we closed it. I think it uses close to 5GB while doing nothing much of anything. Kids these days.) A quick note on 8GB cards... Putting two 4GB cards in SLI doesn't equate to 8GB, since each card basically has a copy of the same information, more or less. So two 4GB cards is still 4GB. .
|
|
By prabhatM - 10 Years Ago
|
justaviking (2/18/2015)
prabhatM (2/18/2015) Could we have some more info about the kind of Project you were doing that you overshot the limit of 4GB on a 970/980 ?
A still image would be of some help.
Sorry about that. I guess it's time to do that again. I purposely built this scene to stress test iClone and demonstrate "unlimited" lights. It takes place in the forest in the "Level of Detail" sample project. The amazing thing is I rendered for over 30 hours without any crash or error on my poor little video card. My son was running ASUS GPU Tweak. I didn't grab any screenshots of it, but was surprised to see it go over 4GB.
Thanks. I remember this clip. Just that I forgot/missed the context.
|
|
By animagic - 10 Years Ago
|
|
Very nice clip, Dennis! I somehow missed it...:blush:
|
|
By mark - 10 Years Ago
|
|
Same here!! VERY COOL!!:D:D:D
|
|
By justaviking - 10 Years Ago
|
|
Thanks, guys. I'm still impressed that I rendered for 45 hours without a crash. I re-rendered one part because I noticed an error. My real-time preview was so sluggish, I didn't notice one of the glowing orbs went through the UFO, so I re-rendered a portion. Someone with really sharp eyes can catch a "jump" in the waving tree branches where I spliced the film together. I've since noticed another minor error. It's funny how you can literally watch your video 20 times and still overlook something. It's not worth fixing, though. If I could overwrite my uploaded video and keep the same URL (like you can in Vimeo) I might consider it, but in YouTube you can't do that, so this will remain as-is. Hopefully I'll make another project someday.
|
|
By Bellatrix - 10 Years Ago
|
That video is a good way to test out spotlight number capacity. Good job there...fun to watch too...
iClone 6 has sufficient lighting style options for outdoor shots now, to me anyway. Finally, UE3 outdoor quality is possible with some cheats...
Spotlight has a weak output range, however; at certain falloff-intensity combos it will start to clip and act like the dreaded Point lights, and become agnostic to walls. Which means constant Heidi mouth checks to prevent light leaks. It's 2015 and still no per-spotlight shadow tuning, a Unity Free feature. Very wow.
Light is cheap and shadow remains expensive in iClone 6.
I tried to max out the other shadow option: directional light count, with a large displaced plane and Heidi only, to maximise hack-ability. The viewport starts to lag after a dozen or so... Trying to push the limits of 2011 basic game render hacks in a 2015 "high end" game... I know... silly me. :D
|
|
By justaviking - 10 Years Ago
|
Wow, I've neglected my own little corner for much longer than I realized.
I am now drooling over a new video card: you get 97% of the Titan for 65% of the price. What really captures my attention is the 6GB of VRAM. My "lighting stress test" video exceeded 4GB of VRAM, and rendered really slowly. There were early rumors of a 6GB GTX 970, but they appear to have been unfounded. I wonder how that project would do on this card?
This is still an expensive card, launched at $650 (US). But I'd once again have bragging rights over my son's computer, which has the GTX 970.
http://anandtech.com/show/9306/the-nvidia-geforce-gtx-980-ti-review
|
|
By mark - 10 Years Ago
|
Drooling a bit here too! Sounds like the solution for me as well...just hope it will cure my odd "spotlight-shadow-flicker-problem" :blink:
|
|
By Tarampa Studios - 10 Years Ago
|
|
justaviking (2/18/2015) ... I'm still impressed that I rendered for 45 hours without a crash... I've since noticed another minor error. It's funny how you can literally watch your video 20 times, and still overlook something...
Don't be too hard on yourself. In Django, Tarantino and his whole crew (seemed/chose) to overlook the mix-up during editing of the scene where the band of rednecks attacked the dentist's wagon... e.g. first you see them crossing a creek to gather at the base of the hill and peer over at the wagon, then you see them discussing the bags to hide their faces, and how one of their wives should have made the eye-holes bigger... then you see them galloping from miles away, and *then* attacking the wagon... and it's so easy to pick up that my five-year-old niece laughed... but on the other hand, the main purpose of that movie is entertainment, so shifting an epic-gallop scene right before the brief battle scene certainly did have better impact for the explosion, even if it was filmed/written out of order at first... and it seems that none of the judges in any of the awards since then have disagreed... haha
Do you mind if I ask - for 45 hours of rendering, how many minutes/frames was that, and was it 1920x1080?
|
|
By justaviking - 10 Years Ago
|
@Bleetz, Sometimes I get hard on myself, but other times I'm simply surprised at what can be noticed later. There may be a thousand "problems" - large and small - but then after it gets posted to YouTube, up pops #1001. Sometimes it's trivial; other times you just gotta do a big face-palm. Fortunately this was a trivial one.
About my video and the rendering... It is 12,200 "iClone frames," meaning 6,100 video frames at 30 fps. I rendered at 1920x1080.
My rendering speed, if you can call it that, was a slow 25 seconds per frame. Not frames per second. Barely over 2 frames each minute! Simple math of 6,000 frames at 2 frames per minute gives you 3,000 minutes, which is 50 hours. So, yeah, my 45-hour rendering time is correct. :( My real-time preview limped along at 2 to 3 fps, which made editing difficult, which is why I missed a couple of things (that's my excuse and I'm sticking to it).
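If you want to plug your own numbers into that math before committing to a long render, it boils down to one line. A throwaway sketch (the frame count and per-frame time are the ones from above; real renders vary per frame, so treat it as a rough estimate):

```python
# Estimated wall-clock render time: frames * seconds-per-frame / 3600.
def render_hours(video_frames, seconds_per_frame):
    return video_frames * seconds_per_frame / 3600.0

# 6,100 video frames at ~25 s/frame comes out around 42 hours,
# in the same ballpark as the ~45 hours the render actually took.
print(f"{render_hours(6100, 25):.1f} hours")
```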
I tried the same project on my son's computer after he upgraded his GPU to an Nvidia GTX 970 with 4GB of vram, and got very similar results. The overwhelming problem, I think, is exceeding the card's memory. And that is why I've been extremely interested in a 6GB card.
I really want an Nvidia card (mostly because of PhysX and CUDA), but there's no way I can even consider a $1,000 card. I was hoping for a $450 to $500 6GB version of the GTX 970, but that doesn't seem to be happening. I will allow myself to be tempted by the new 980 Ti, but won't buy it for a while. Maybe I'll turn it into an early Christmas present several months from now.
|
|
By animagic - 10 Years Ago
|
|
I'm very pleased with my GTX 980, but could use a bit more memory. I still have a new build in the pipeline, and the GTX 980 Ti sounds like an interesting option.
|
|
By VirtualMedia - 10 Years Ago
|
Any chance you could put together a good indoor lighting tutorial, or share some tips on something that would resemble studio lighting for single characters?
Keep rendering!
|
|
By jgrant - 10 Years Ago
|
|
Just found this page and am happily learning more thanks to your hard work. Many thanks for sharing.
|
|
By alemar - 10 Years Ago
|
justaviking, thank you, a lot of info here.
|
|
By justaviking - 10 Years Ago
|
@VirtualMedia... Sadly, I'm not very good at lighting. It was something I really wanted to work on during my last large project, but it fell victim to time pressures (again). Since this is an area I've wanted to improve for quite a while, maybe when I really do learn more about lighting I'll be able to share what I learn with my friends here.
@JGrant and Alemar, Thanks for the kind words.
I have a couple of long (LONG!!) overdue tutorials I still want to make. I've done so little iClone lately, but I'm finally starting to get back into it again. Just recently I *finally* touched Substance Designer after having owned it for many months. I have to get some of these things out of the way before the next Pinhead project. (I still have a voice-over I promised for a friend here.)
Thanks again for the encouraging words.
|
|
By Rampa - 10 Years Ago
|
@JVK,
I find the backlight to be tricky to get really good in iClone. That shine through the hair isn't possible, but I have a couple ideas. I bet you do too! ;)
I hear you on the "long overdue tutorial part". Whenever RL starts teasing the next big thing, I sort of shut down and wait.
|
|
By justaviking - 10 Years Ago
|
Welcome back to my little corner of the forum.
I have *finally* been spending some time with Substance Designer and making my own substances. I followed the Reallusion tutorial, and it only took me two tries to get it right.
Now I have stepped out on my own.
To challenge myself, and to "force" myself to learn more about SD, I decided to *not* use any outside images or bitmaps. It would be so tempting to try to create a "picture" and then use a tool like B2M3 to create the basic textures. But I challenged myself to see if I could make something entirely procedural.
I found a particular "substance" to make, and I chose it for three reasons:
- It interested me.
- Because it had a specific pattern, I thought that would be a good challenge since it's very different than making grungy, dirty, blotchy substances like rusty metal.
- Also because of the geometric pattern, normally my first reaction would be to draw it, maybe as an SVG file, and I wanted to see if I could do it without that.
The material (substance) I chose was a woven mesh fabric that's on the back of some office chairs at work.
After a lot of experimenting I have made some good progress. I did get a couple of tips from the SD user forum, but no hand-holding.
My substance graph has several dead-ends (extra nodes) that I've been using to test intermediate steps, and some placeholders are scattered throughout. Once I'm "done" and am satisfied with the results, I'll review it and see what can be done more efficiently. I know that a lot of substances require numerous small steps, and I'm also certain I'm doing some things the hard way. But hey, I'm still a beginner.
Making a texture for the frame of the chair was very easy. It's the woven mesh, especially creating a specific pattern, that's really adding to the challenge.
It's not done yet, but it's far enough along to be recognizable, so I figured I'd go ahead and share my work in progress.
Part of the chair...

Close-up of the mesh...

My substance graph (in progress)...

In iClone 6...

Indigo render...

|
|
By Rampa - 10 Years Ago
|
|
Now that is some pretty awesome progress, JVK! Very cool challenge. :)
|
|
By Lawsuit Pending Records - 10 Years Ago
|
|
Good going. I find SD a bit of a challenge. This project you are working on looks great.
|
|
By justaviking - 10 Years Ago
|
Here's a little update on my slow-going "weave" project.
Much of the slowness is due to lack of time, but also partly because I specifically chose it to challenge me to do something "geometric" in Substance Designer without resorting to any geometric type tools (such as SVG files). I recently started adding some texture to the vertical cloth straps.
Reference photo...

In iClone 6...

|
|
By justaviking - 10 Years Ago
|
I think I'm satisfied with the result, and learned what I wanted to learn.
There are some imperfections, but I don't think they're worth fussing with and you need to be pretty close to see them. Too bad I didn't smooth the curve of the chair's frame in 3DXchange (real time smoothing didn't help much).
I might stage a picture or two at the office and place my chair in the environment, just for grins.
I was able to get a pretty good imitation of the material for the frame of the chair with very little effort.
It's amazing how different the material can look in Substance Designer, iClone, and Indigo. iClone tends toward lightness, and Indigo definitely tends to turn out dark.
It's been a fun project.
In iClone...

A close-up, in Indigo...

|
|
By Bellatrix - 10 Years Ago
|
Very nice weave effect! Looks exactly like the Aeron I'm sitting on... I have a mental block with node-based anything. Instant glaze-over, lol. Good to see Substance being put to good use in the iClone creative process...
Custom material auditioning can be a lot more fun... Imagine... simultaneously comparing the realtime native viewport render with a mini viewport showing the Indigo render. ZERO multi-click multi-window jumping. As in, you know... DUAL VIEWPORT in REALTIME!!! You know... the kind of custom material realtime auditioning fun Blender, Daz, Carrara, etc. users have been enjoying. Because they get 2 viewports at the same time?! For many years now?! Still, here we are...
If I get a normal non-hack realtime mirror next week, I'll stop whining about the state of ICindigo for a while, I promise ;)
|
|
By justaviking - 10 Years Ago
|
This is really random (which is part of the reason I have this thread), but I noticed I had an old iClone/Indigo video that was "unlisted" on YouTube.
I don't see any reason not to open it up.
It's probably boring, but here it is...
|
|
By justaviking - 10 Years Ago
|
Greetings.
Just a little after-Christmas update for people interested in SSDs.
I upgraded my SSD:
- WAS... 240 GB - OCZ Vertex 3
- NOW... 500 GB - Samsung 850 EVO
My main reason to upgrade was the increase in capacity. (And I hope to put the old one in a laptop, after I delete some files to get below the 240GB capacity.)
I also benefited from a nice bump in performance. What a nice surprise. Actually, I knew from some benchmarks that the new one should be significantly faster than the old one, and lo and behold, I have confirmed it on my own computer.
According to my CrystalDiskMark benchmarks, the percent IMPROVEMENTS are:
- Sequential: Read +2.3%, Write +63.8%
- 512K: Read +2.8%, Write +55.6%
- 4K: Read +25.0%, Write +8.5%
- 4K (QD32): Read +29.7%, Write +52.4%
I think my desktop icons appear a little faster when I logon, and maybe some applications launch a little bit quicker. But since I already had a respectable SSD, we're typically talking about changes from 1.5 to 1.0 second, or 7 to 5 seconds. Without a stopwatch and recorded benchmarks, it's not a difference you really notice.
But since my primary goal was to double the size (mission accomplished), the improved performance is merely a little treat, like sprinkles on your frosted cupcake.
|
|
By pumeco - 10 Years Ago
|
One of the first things I'm doing when I manage to come across that extremely rare stuff called cash is to grab a copy of the Substance Indie Bundle. I was playing with it just before Christmas and really like it; great product. When I glanced at the iClone 6 version you posted next to the real one, I honestly thought it was a photo at first sight. The specular especially looks very convincing, not just the pattern.
SSD, I've been meaning to upgrade to SSD for 3-4 years now, and still haven't got around to it!
|
|
By justaviking - 10 Years Ago
|
|
pumeco (12/29/2015)
One of the first things I'm doing when I manage to come across that extremely rare stuff called cash is to grab a copy of the Substance Indie Bundle.
Yeah, Substance Designer and Substance Painter are great applications. Tons of fun.
I wasn't afraid, but I'll admit to being a bit intimidated thinking about creating those "graphs" in Substance Designer. I'd seen ones like that before in Blender tutorials, but I'd never actually used them. (I've avoided learning Blender for fear of addiction.) After blindly following a couple tutorials, a couple of times each, step by painful step, it started to make sense.
I'm glad you liked my chair mesh. It was a lot of fun. I had visions of making some improvements to it, and cleaning it up, and posting it on Substance Share over the Christmas holiday, but I haven't even launched SD yet.
SSD, I've been meaning to upgrade to SSD for 3-4 years now, and still haven't got around to it!
I highly recommend it. The deltas I posted above are from a "good" SSD to a "better" (and bigger) SSD. The differences compared to a spinning HDD are stunning, and you can actually feel the improvement. Just like going to a dual-monitor setup, once you try a solid state drive (for your boot disk), you'll never go back.
|
|
By animagic - 10 Years Ago
|
|
Viking, I assume you have the iClone temp folder on your SSD (it being the system drive)? That should make a definite difference.
|
|
By pumeco - 10 Years Ago
|
@Dennis Nodes, nodes, nodes! I love nodes but hate math, so I have a love/hate relationship with those things - lol
@Job Yup, I remember someone once saying that it's best to have your system installed on the SSD so that the OS feels a lot more responsive. I've never used a machine with SSD so it sounds as if a surprise is to be expected when I finally do.
My mechanical drive isn't too bad to be honest, but I know it's nothing like the results I'd get from SSD.
|
|
By justaviking - 10 Years Ago
|
Yes, my temp folders and "assets" are also on my C: drive (SSD). Usually I install software to the default file locations. I try to have most of my "working" stuff on the SSD, and now with my increased capacity I can be a bit more lax and clean up after myself less often. All my renderings and other big stuff go to my hard drives. I also used Windows to redirect "My Documents" to a large hard drive.
I got a "free" copy of the Assassin's Creed Syndicate game when I bought the Samsung SSD, and it didn't fit on my old SSD, so I had to install it on to a traditional hard drive. I might look into moving it. It took up something like 32GB of space. Between "writing" (programming) the game and creating all those assets, it's no wonder big-name games cost millions of dollars to produce. I'm not a gamer, but that series has always intrigued me. So far I installed it and got through the opening animation sequence and up to the point where I had control of the character. I tried the WSAD keys, he moved, and then I got back to other things.
|
|
By justaviking - 9 Years Ago
|
Having some fun tonight. It involves the usual aggravation that seems to come with a lot of software projects, but I'm calling it "fun" anyway. I got Pixar's Renderman running. They really want you to have Maya installed (which I do not), but there are ways around it. Using a sample file, here is my first-ever image using Renderman.

|
|
By mark - 9 Years Ago
|
Lookin' real cuddly! Does it have a name?:P:P:P
|
|
By justaviking - 9 Years Ago
|
|
mark (5/31/2016) Lookin' real cuddly! Does it have a name?:P:P:P
Stubby ? ? ?
|
|
By justaviking - 9 Years Ago
|
|
< Double Post - Deleted >
|
|
By jlittle - 9 Years Ago
|
Will you look at that, there is a RenderMan plugin for Blender! Intro To RenderMan for Blender
Jeff
|
|
By pmaina - 9 Years Ago
|
|
Very cool & creative. Mason has major swag, and he can moonwalk & do the robot as well. Oops! The first post is over 2 years old.. Ma bad.
|
|
By VirtualMedia - 9 Years Ago
|
Is this guy rigged? Would love to see him wreak havoc on some barbie dolls :P
|
|
By justaviking - 9 Years Ago
|
@VirtualMedia,
Is the dinosaur rigged? I suspect it is, but I don't have a definitive way to back that up. The best answer is, "I don't know."
@Sw00000p,
I have no idea what shader is used. Sorry. At this point, about all I can say is I got Renderman installed and it successfully rendered the sample dinosaur. I'd probably have to dissect the .rib file, which is something I've never done before. Yesterday was the first time I've even had a .rib file on my computer. I've done zero tutorials. I haven't learned it at all. I merely got it to work, and especially without Maya, that took a bit of effort.
I'm running Renderman in a "standalone" mode. No Maya, no Katana, not even Blender at this time. I ran Renderman (prman.exe) from a CMD window. That's right - it is 2016, I'm running Renderman, and I basically used DOS to do it.
After installing the "Pro Server" application, I had to go back and install the "Studio" application that seems to be part of the Maya integration, even though I don't have Maya on my computer. The "Studio" installation gives you the "Image Tool" (a.k.a. "IT") that will display your render. So then I manually launched "IT" first, and then ran prman, directing the output to the "IT" application and feeding it the dino.rib file. It worked, and I was thrilled. And for now that's all I know.
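If anyone else wants to try the same no-Maya route, here's roughly what my "DOS" session amounted to, sketched in Python. The install path and version number are assumptions (check where your Pro Server landed), and your RIB file's Display line decides where the pixels actually go:

```python
# A sketch of the standalone workflow described above: launch the Image
# Tool ("IT") first, then render the RIB with prman. The install path and
# version below are hypothetical -- point them at your own installation.
import subprocess

RMAN_BIN = r"C:\Program Files\Pixar\RenderManProServer-20.0\bin"  # assumed path

# 1) Start the Image Tool so it is ready to receive the render.
subprocess.Popen([RMAN_BIN + r"\it.exe"])

# 2) Render the sample scene. For the output to land in "IT", the RIB
#    should contain a line along the lines of:  Display "dino" "it" "rgba"
subprocess.run([RMAN_BIN + r"\prman.exe", "dino.rib"], check=True)
```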
I don't claim to know anything (yet) about Renderman. I know what it is, and its connection to Pixar. I wondered if I could get it to work on my computer, and I did. Yay. Running it via command prompts isn't a practical way to learn it. I'm going to consider exploring the Blender plug-in, but in all honesty I might take my little success and consider this experiment done. I am not going to buy Maya (Pixar only officially supports two integrations - Maya and Katana, and Katana is Linux-only), and there isn't really any learning documentation I could find for running it via command line. They even warn you that the command-line approach is only for "technically minded" people.
BOTTOM LINE: I have no compelling reason to use Renderman. I just thought it would be fun to try it. And I did, sort of. I have a lot of other, more productive uses for my time.
@pmaina... If you're talking about the "UFO" video, thanks, I'm glad you enjoyed it.
|
|
By justaviking - 9 Years Ago
|
|
sw00000p (5/31/2016) Run it on Maya and your eyes will pop out of your head, with amazement! :P ...I know... you have NO interest in Maya.
I'm certain my eyes would pop. I have no interest in Maya because I have no budget for Maya. :crying: Those kids of mine, they want food every day.
|
|
By justaviking - 9 Years Ago
|
|
I ran across a little tidbit, and this seemed like as good a place to post it as anywhere else...
We often talk about "real reflections" in iClone. Specifically the lack thereof. I found it interesting that even Pixar's Renderman makes good use of "shortcuts" for things like reflections. https://renderman.pixar.com/view/faking-reflections
In this page too, in "The Elements" section (right below the snow picture), they talk about doing reflections for the groundhogs. https://renderman.pixar.com/view/inside-pixars-shorts
Of course we're talking apples-to-oranges here (not a direct comparison), but I found it interesting anyway. Even Pixar does not rely on "real" reflections all the time. But we still need *something* that works better than what we have (or don't have) in iClone 6, so I'm not letting Reallusion off the hook yet.
|
|
By justaviking - 9 Years Ago
|
A little news from Viking Land...
I got a new graphics card yesterday.
I've never owned a flagship GPU before. I've been eyeing the Nvidia GTX 980Ti for quite a while, but just when I was ready to buy it, the GTX 1080 came on the scene.
After battling "Out of Stock" issues for a couple of weeks...
I am now the happy owner of a GTX 1080.
I got the "SC" model from EVGA. http://www.evga.com/Products/Product.aspx?pn=08G-P4-6183-KR The card is 10.5 inches long, and that really filled my case. There are a couple of cables between the end of the GPU and my hard drive cage, but it's tight.
So... is it faster? YES!!!
I came from a well-aged Radeon HD 6850 graphics card... with only 1GB of VRAM.
QUICK TEST ON A CURRENT PROJECT:
NOTE: The project loads my GPU with 1.3GB of memory, and my old card only had 1.0GB on it, so the simple addition of more VRAM helps hugely.
Realtime preview: Old = 4 to 6 fps; New = 57 to 60 fps
Render time: Old = 65 to 75 minutes; New = 7 minutes
In both cases, on this project, I'm seeing a 10x improvement.
Editing on the timeline is so much nicer now. Before, I had to choose between choppy realtime or By Frame, where it's hard to get a feel for the speed of a hand motion.
If you're upgrading from a 4GB card, don't expect a 10x improvement. The 980TI, still a very fine card, is finally dropping in price. I saw one for $430 after rebate a few days ago.
I got my card from Newegg. http://www.newegg.com/Product/Product.aspx?Item=N82E16814487244&Tpk=N82E16814487244 I used Newegg.com's "Auto Notify" feature to send me an email when the card I wanted was in stock. I would get the emails about 20 minutes after it was already out of stock again. It typically stayed in stock for 3 to 6 minutes each time. Hopefully that will calm down soon. If you want to look for a GTX 1080, check out the site "Now In Stock" where it can help you see when to try and place your order. https://www.nowinstock.net/computers/videocards/nvidia/gtx1080/
Have a nice weekend, everyone!!!!
P.S.
Two other things to know... since the card is so new and the drivers aren't mature yet... a) Nvidia's Iray renderer does not run on the GTX 1080 yet. This is expected to be fixed in time for the SIGGRAPH show (July 24-28). b) Blender's "Cycles" renderer doesn't properly support it yet. There's a patch (which you may need to compile yourself), but they expect to officially support it in the near future when there's some sort of CUDA update from Nvidia. On that note, apparently CUDA compile times are cut in half (twice as fast) in the new update. Wow.
|
|
By mark - 9 Years Ago
|
Oh man! That's great! Keep us posted on how it continues to perform. If I had only waited a few more months! As it is, I'm stuck with the "old" GTX 980Ti!!! :P:P:P:P
|
|
By Rampa - 9 Years Ago
|
Yay! :)
A truly worthy upgrade! Your 'Cloning will be so much more pleasant now.
Can you do another test? That would be PhysX. So load 5-10 Natalies, and see how fast the physics runs in both realtime and by-frame. Then open your Nvidia control panel, and switch PhysX to run on your CPU instead. Remember to make sure "bake" is turned off in iClone's project tab for accurate results.
I have a lowly GTX 770, and have about zero difference between PhysX on the GPU or CPU. I would think a high-powered card like the 1080 would make a difference.
Thanks
|
|
By animagic - 9 Years Ago
|
|
mark (7/2/2016) Oh man! That's great! Keep us posted on how it continues to perform. If I had only waited a few more months! As it is, I'm stuck with the "old" GTX 980Ti!!! :P:P:P:P
Don't feel too bad, Mark! I've used EVGA's step-up program to upgrade from a GTX980Ti to a 1080, and there is not that much difference. Most benchmarks you see are for games and only in part applicable to iClone.
|
|
By brand468 - 9 Years Ago
|
|
rampa (7/2/2016) Yay! :)
A truly worthy upgrade! Your 'Cloning will be so much more pleasant now.
Can you do another test? That would be PhysX. So load 5-10 Natalies, and see how fast the physics runs in both realtime and by-frame. Then open your Nvidia control panel, and switch PhysX to run on your CPU instead. Remember to make sure "bake" is turned off in iClone's project tab for accurate results.
I have a lowly GTX 770, and have about zero difference between PhysX on the GPU or CPU. I would think a high-powered card like the 1080 would make a difference.
Thanks
Hello Rampa, I also have a GTX 1080 + GTX 980 Ti and have tested with 10 Natalies. When I have the GTX 1080 set for PhysX, I get 60 FPS in frame-by-frame, but when I run realtime I get less than 1 FPS. When I switch PhysX in the Nvidia panel I notice no difference between the GTX 1080 and GTX 980 Ti, and the same when switching to the CPU. Maybe I need to reboot when changing PhysX??
|
|
By justaviking - 9 Years Ago
|
Hi Rampa and Brand...
I sat down at my computer to do the test, and I see Brand beat me to it. Thanks for testing and sharing.
Is anyone starting to think there's an issue with the settings in iClone? A 980Ti or a GTX 1080 just has to run PhysX much, much faster than a CPU. Shouldn't it?
I'm also curious, and maybe this is another learning opportunity for me, but I would expect By Frame to be slower than Realtime. That's been my experience. Was that reported backwards? Or what am I missing?
ABOUT THE DIFFERENT CARDS...
I concur with Animagic; the 980Ti is still a wonderful card.
A 980Ti actually has significantly more CUDA cores than the new GTX 1080 (2,816 vs. 2,560). Sure, the 1080 runs faster, but there will be times the extra cores come out ahead. There is also a bump from 6GB to 8GB of VRAM, but if you don't exceed 6GB then that doesn't matter. On the plus side, Mark has been able to enjoy and benefit from his 980Ti for months, while I limped along with a sub-par card until yesterday. Maybe Mark will go for a GTX 1080 Ti, which has not been announced, but is assumed to arrive around the end of the year or 1Q17, based on past history.
|
|
By Rampa - 9 Years Ago
|
The by frame does indeed work a ton better in this case. It seems it lets the rendering run at full speed while the animation engine runs slower. It'll fool you in much the same way that the 60 FPS rendering and 30 FPS output do. It's always best to think of it like a game. It just runs the rendering as fast as it can for any given scenario.
I'm speculating that iClone's implementation of PhysX is a software-only one? This is going to play a huge factor in my GPU upgrade. I'm leaning towards the AMD RX 480, as it's a relatively inexpensive 8GB VRAM solution. It seems that the GPU is really only for rendering in iClone.
Looking at internet feedback about it, most games also seem to use CPU only for some reason.
|
|
By justaviking - 9 Years Ago
|
|
rampa (7/2/2016) The by frame does indeed work a ton better in this case. It seems it lets the rendering run at full speed while the animation engine runs slower.
I can accept that... mostly. But 60fps vs. 1fps? I don't know; something about that sounds really odd to me, but I don't have science to support my feelings at this time.
I'm speculating that iClone's implementation of PhysX is a software-only one? This is going to play a huge factor in my GPU upgrade. I'm leaning towards the AMD RX 480, as it's a relatively inexpensive 8GB VRAM solution. It seems that the GPU is really only for rendering in iClone.
It does appear that way, based on the data. Reallusion; what say you? It sure would be nice to get an official statement on that.
It would also be really, really great to have Nvidia GPU PhysX support (assuming it's not already there).
Looking at internet feedback about it, most games also seem to use CPU only for some reason.
Maybe so they have one code thread, and the game plays equally on all GPUs. Maybe. That might also explain what we're seeing in the test results reported here on this forum.
Of course it's great that iClone runs on both AMD and Nvidia cards (whether via the CPU or not). Even Nvidia's Iray will fall back to a CPU mode if you don't have a supported card (which includes my GTX 1080 for a few more weeks). But it sure would be wonderful to leverage our GPUs to the greatest extent possible, even if it does favor one brand over the other. I favor Nvidia, but if I got a lot better iClone performance with AMD, you can bet one of their cards would have been high on my shopping list.
|
|
By Rampa - 9 Years Ago
|
|
My thinking is that my card is old, and so would be easily trounced by the newest mid-range cards. I have yet to have my view of VRAM being the most significant factor for iClone shattered. ;)
|
|
By RobertoColombo - 9 Years Ago
|
Hi Dennis,
great purchase!!! Thumbs up! I bought the GTX980 only 1.5 years ago and it already feels old! Damned speedy technology improvements... :P
Roberto
|
|
By justaviking - 9 Years Ago
|
|
rampa (7/3/2016) I have yet to have my view of VRAM being the most significant factor for iClone shattered.
I totally agree. Many, many years ago I learned about the drastic impact "swapping" has on performance. Exceeding available memory will destroy(!) performance, whether you're talking about CPU or GPU.
Based solely on "performance," I anticipated a 5x improvement with my new card, but saw 10x on my current project.
I also opened up my "unlimited lights" project (see FIRST post at the start of this thread). That project demanded at least 2.5GB (maybe 3.5GB) of VRAM on my poor 1GB card.
I remember rendering that for many, many hours. Something like 32 (or maybe 42) hours. Yeah, 30+ hours of rendering... at the sluggish rate of 25 seconds per frame. Yes, approximately 0.04 fps.
I opened up the project and did a quick test. My realtime ran at 45+ fps. Rendering was better than 1 fps, so I estimate a 30x improvement on rendering for that project.
Of course, if I could have increased the VRAM from 1GB to 4GB on my old card, I would have seen a nice improvement too, but I'll keep my new GTX 1080. :)
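(A quick sanity check on that math, using the 6,000 frames from the first post and the 25 seconds per frame above; a throwaway Python sketch, nothing iClone-specific:)

frames = 6000              # from the "unlimited lights" project
sec_per_frame = 25         # render speed on the old 1GB card
print(frames * sec_per_frame / 3600.0)   # ~41.7 hours of rendering
print(1.0 / sec_per_frame)               # ~0.04 frames per second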
|
|
By brand468 - 9 Years Ago
|
Hi, I did a test render to disk (MP4, 1920x1080) and discovered that I get exactly the same time, 500 frames in 2 min 15 sec (about 3.7 fps), in both By Frame and Realtime. So it seems that the working window in iClone skips frames and reports an inaccurate FPS. The strange thing is that I still get exactly the same time when I select a different PhysX processor (GTX 1080, GTX 980 Ti, or CPU)?
By the way, I bought the GTX 1080 to make faster renderings with Octane, but was surprised that they don't support the card yet. I understand that it will be supported when CUDA 8 is released. Maybe that will improve the card's performance even in iClone?
|
|
By brand468 - 9 Years Ago
|
Justaviking, do you have the same FPS when switching between GTX 1080 and CPU PhysX settings on your computer?
|
|
By justaviking - 9 Years Ago
|
@Brand,
a) CUDA 8... I think that's what Blender is waiting for, too. Nvidia has a "release candidate" available, and the final release is planned for August (some people say early August).
b) CPU-v-GPU PhysX... I didn't run the test after Rampa reported his results. I'll try to do some tests later today, but I might have a busy day so I can't promise anything.
|
|
By brand468 - 9 Years Ago
|
Viking, yes, I have also seen that release candidate, but it seems that Octane needs some modifications and they are waiting for the full version in August (hopefully before). No hurry to test for my part, just wanted to check if you have the same problem :-)
|
|
By justaviking - 9 Years Ago
|
@Brand,
a) Suddenly, I don't trust iClone's "fps" reporting. At least not in this test.
Maybe it doesn't mean what I thought it meant. I always believed it was a measure of how quickly frames are being rendered and displayed on my monitor. That interpretation has always been consistent with the behavior I saw. If my Realtime preview looked smooth, my fps was 30 or more. When it was choppy and jerky, my fps was down to 10 or even lower. (I paid less attention to my fps numbers when doing By Frame on my old video card, so I'm less sure about those numbers.)
Now on this test, my fps numbers do not match what I'm seeing on my monitor.
I started with the "Skirt and Hair" sample project. Everything was silky smooth. Then I added 9 more copies of Heidi (no animation on them). My "Realtime" preview became very jerky, sometimes with a full second between screen updates. I am definitely NOT *seeing* 60 frames per second.
Then I switched to By Frame, and it ran smoothly, but slowly. Of course it renders every frame (that's what By Frame does), but not in real time. My fps should have been displayed as approximately 5 fps, not the 60 that iClone reported.
b) How do you switch between CPU and GPU PhysX? I've had an AMD card for quite a while now, and the last time I had an Nvidia card I didn't play with PhysX. Do I go to an Nvidia control panel? I don't see any settings in iClone.
UPDATE --> Found the setting on the Nvidia Control Panel, as I suspected.... testing commencing...
c) For what it's worth, here are a couple of "utilization" numbers for the 10-Heidi (one dancing) scene...
By Frame: CPU load = 25%; GPU load = 35-50%
Realtime: CPU load = 20%, but the results are a bit more spikey, so it's hard to say; GPU load = 17% or less, which makes sense since the preview would "pause or freeze," so there are fewer frame updates occurring on the screen
|
|
By Rampa - 9 Years Ago
|
It is in the control panel.

|
|
By brand468 - 9 Years Ago
|
Viking, yes, you find it on http://www.nvidia.com/; download the software for your card. Then you go into the Nvidia control panel and configure Surround, PhysX.
See picture (Swedish version)
|
|
By brand468 - 9 Years Ago
|
|
justaviking (7/3/2016) @Brand, My "Realtime" preview became very jerky, sometimes with a full second between screen updates. I am definitely NOT *seeing* 60 frames per second.
Yes, same problem with the "Realtime" preview, very jerky, but when I render to disk it works fine.
|
|
By animagic - 9 Years Ago
|
PhysX behavior in iClone is indeed puzzling. It's not in line with what RL has led us to believe. I wouldn't buy an AMD card anyway, so it doesn't really matter to me. But a clarification from RL would be helpful.
As to real-time FPS: in my experience the number is accurate in Realtime Mode in the sense that iClone skips frames if it can't keep up, but it maintains the timing, which is what you want.
EDIT: By-frame mode does not give a playback FPS value as it does not skip frames, so every one of them is rendered (all 60 of them for every second of movie). So FPS is to be interpreted as rendered frames per second of movie. I have found that to be true for both real-time and by-frame.
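(To make that distinction concrete, here is a minimal sketch in Python; this is only an illustration of the behavior described above, not iClone's actual code, and the 60 fps engine rate is the one number taken from this thread:)

import time

FRAME_TIME = 1.0 / 60.0   # iClone's engine runs at 60 fps internally

def play_realtime(total_frames, render):
    # Realtime mode: stay locked to the wall clock. If rendering falls
    # behind, skip ahead to whatever frame the clock says we should be
    # on; timing is preserved, but playback looks jerky.
    start = time.time()
    frame = 0
    while frame < total_frames:
        render(frame)
        frame = max(frame + 1, int((time.time() - start) / FRAME_TIME))
        wait = start + frame * FRAME_TIME - time.time()
        if wait > 0:
            time.sleep(wait)   # rendering is fast: idle until the frame is due

def play_by_frame(total_frames, render):
    # By Frame mode: render every single frame no matter how long each
    # one takes; smooth output, but playback runs slower than real time.
    for frame in range(total_frames):
        render(frame)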
|
|
By justaviking - 9 Years Ago
|
Thanks, guys. I found the setting on the Nvidia Control Panel, as I expected. I should have looked before typing.
CPU-v-GPU PhysX testing commencing.
ABOUT RENDERING TO DISK... "Rendering" is basically a by-frame activity. That's why it will take as long as necessary to ensure each and every frame is fully rendered, and you don't skip any frames. So this "fps" conversation, I believe, is all about the "preview" window when you're editing.
|
|
By Rampa - 9 Years Ago
|
brand468 (7/3/2016)
justaviking (7/3/2016) @Brand, My "Realtime" preview became very jerky, sometimes with a full second between screen updates. I am definitely NOT *seeing* 60 frames per second.
Yes, same problem with the "Realtime" preview, very jerky, but when I render to disk it works fine.
When you render, it is always "by frame". I think if you set the render to "Preview", you'll get a rendering at the exact same rate as the "by frame" in the window. "Final Render" will probably be slower.
|
|
By mark - 9 Years Ago
|
I would be happy if "Quick Mode" would play a bit more smoothly in preview mode, but to me it looks like playback is just as bad as in "High" mode... no difference.
|
|
By justaviking - 9 Years Ago
|
RESULTS FOR CPU-v-GPU PHYSX...
I see NO DIFFERENCE between CPU and GPU PhysX settings. (I restarted iClone each time I changed it in the Nvidia control panel, just to be sure.)
MY CONCLUSION: With the courage to be wrong, based on my observations, iClone seems to be hard-coded to use only the CPU for PhysX, and completely fails to leverage the power available in an Nvidia GPU.
ADDITIONAL OBSERVATIONS, COMMENTS, AND RANTING:
I watched the GPU Load using "GPU-Z."
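(If you'd rather log numbers than watch the GPU-Z window, Nvidia's nvidia-smi tool, which installs with the regular driver, can poll the same counters from the command line; a minimal sketch, wrapped in Python only for consistency with the other snippets in this thread:)

import subprocess

# Print GPU utilization and memory use once per second (Ctrl+C to stop).
subprocess.run([
    "nvidia-smi",
    "--query-gpu=utilization.gpu,memory.used",
    "--format=csv",
    "-l", "1",    # loop: repeat the query every second
])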
I have reported (complained about... ranted about) this before. Here I go again.
When iClone is "Stopped" but is the active window, my GPU Load was at least 40% with iClone just sitting idle, doing nothing. I think I even saw it at 70% one time. So iClone uses up 40% or more of a GTX 1080 to do nothing. That is so wrong.
Just clicking on another window, making something else the active window, causes the GPU Load to drop to 1%, as it should be.
I have yet to be convinced that is reasonable and normal behavior. People have tried to justify it by saying iClone uses "game engine technology," but even games don't chew up my graphics card when they are paused. What could it possibly be doing? Just refreshing the same static image over and over again and making the graphics card hot for no reason.
Adding insult to injury, my GPU Load *decreases* when I press "Play." Go ahead, justify that. Pressing Stop causes a larger load on my GPU than pressing Play???
I thought I'd reported this in Feedback Tracker, but maybe I was adding a comment to someone else's report where it seemed relevant. I've certainly reported it here in various forum postings. Maybe I'll formally enter it in Feedback Tracker later today.
|
|
By justaviking - 9 Years Ago
|
Quick note about "By Frame" fps...
The iClone internal engine is based on 60 fps. But I would like to see the "render rate," for lack of a better term, displayed as fps when I press Play and I'm in By Frame mode.
|
|
By Rampa - 9 Years Ago
|
|
I don't think there is a way to pause iClone. I think the timeline is simply a function within iClone, not what determines whether it's running or not. But yeah, you're seeing the same results I've seen.
|
|
By mark - 9 Years Ago
|
Report it! Report it!!!!
|
|
By planetstardragon - 9 Years Ago
|
Modo seems to be ahead of the game on this, and now has the best of both worlds (without being hyper-expensive or bloated).
|
|
By animagic - 9 Years Ago
|
|
mark (7/3/2016) Report it! Report it!!!!
That seems to be the best and most sensible course of action. In order to support AMD cards, RL implemented CPU-based PhysX, with the understanding, however, that there would be GPU-based PhysX for those with NVidia cards.
It looks, then, like NVidia cards are somehow not recognized, so that is a bug that should be reported. I leave it to the discoverers to do so.
|
|
By animagic - 9 Years Ago
|
Rather than trying to compare FPS values, there is actually a much easier way to determine if an application uses CPU or GPU PhysX.
In the NVidia Control Panel, go to 3D Settings and check "Show PhysX Visual Indicator". That will tell you.
In iClone, if the PhysX Engine is turned on, you will see PHYSX > CPU (so definitely not PHYSX > GPU). If the Bullet Engine is turned on, nothing is displayed, obviously.

There is an MSI benchmark suite, Kombustor, that includes benchmarks to compare GPU and CPU PhysX. For my CPU (a 5830K) the difference is not that dramatic (about half the FPS), but with an older CPU it would make a significant difference.
|
|
By justaviking - 9 Years Ago
|
animagic (7/3/2016)
mark (7/3/2016) Report it! Report it!!!!
That seems to be the best and most sensible course of action. In order to support AMD cards, RL implemented CPU-based PhysX, with the understanding, however, that there would be GPU-based PhysX for those with NVidia cards. It looks, then, like NVidia cards are somehow not recognized, so that is a bug that should be reported. I leave it to the discoverers to do so.
I'll do it.
If anyone finds a "promise" of GPU-accelerated PhysX, please let me know. It may be an hour or two before I get to it, but I will report it in Feedback Tracker, and will come back here with a link to it.
|
|
By justaviking - 9 Years Ago
|
In addition to playing with iClone, I also enjoy Substance Painter.
I helped put the spotlight on a "caustics" problem in their Iray implementation, and they added a new information display to SP (to report the size of the model) in response to a question I asked.
Here is an Iray rendering that includes both refraction and caustics. The texture has some strange lumpiness in some places because this was a "learning" project, but it still looks pretty cool. (I'm attaching a 1920x1200 resolution picture so you can better appreciate the caustics and refraction.)
|
|
By justaviking - 9 Years Ago
|
Did it.
I entered a Feedback Tracker regarding the lack of GPU-accelerated PhysX on Nvidia cards: http://www.reallusion.com/FeedBackTracker/Issue/Support-GPU-accelerated-PhysX-on-Nvidia-cards
@Animagic,
I just noticed your post about how the Nvidia Control Panel can tell you if PhysX is running on the GPU. Cool. My conclusion was based not only on the fps, but also very much on the CPU/GPU utilization as reported by GPU-Z and Windows Task Manager. Good tip though. Thanks. :)
|
|
By justaviking - 9 Years Ago
|
I have ignored my own little corner here for a long time. Time for an update.
A few weeks ago I finally dipped my toe into the world of Blender. I've always known I would enjoy it, which is why, strange as it may sound, I purposely avoided it. It wasn't out of any fear; I simply knew it would distract me from other activities, such as iClone. When my favorite Blender hero (other than my son) finally produced a "beginner's" tutorial (and, bonus, it was on the new version), I decided to put my hands on it for more than 15 minutes.
Here's my result from the tutorial. I took liberties with the coffee cup, as is readily apparent. I was simply practicing a technique, got inspired, and carried it through to completion. The spout is actually hollow. I also added my own cloth material and put wrinkles in the cloth.

After finishing the tutorial, I've been playing around with a couple other models, entirely on my own. I made the lamp shade translucent, and the surface of the table is actually a vinyl material I created from scratch in Allegorithmic's Substance Designer (it has a lot of detail that doesn't show up in this image). You don't have to look very closely to notice the table's not done (see the latches for the legs), but I'm simply having fun learning and exploring.

My big dilemma now is, "How deep do I want to get into Blender's texture node editor?" Since I have limited time to play with my toys, I am concerned that if I start learning more about Blender's node editor, I'll be perpetually confused when I switch between Blender and Substance Designer. So I'm thinking maybe I should do the modeling in Blender and then do all the texturing in Substance Designer and Substance Painter. Between iClone, Substance Painter, and Blender, I already have enough trouble zooming/rotating/panning the model (I keep using the wrong mouse buttons and key/mouse combinations), so I'd rather not "unlearn" what I know about Substance Designer. As I already said, I don't get to spend a lot of time with the software, and that makes it easier to get mixed up when switching between applications.
Anyhoo... I'm having a lot of fun, and wanted to share my humble start with you.
|
|
By 4u2ges - 9 Years Ago
|
|
justaviking (11/16/2016) Between iClone, Substance Painter, and Blender, I already have enough trouble zooming/rotating/panning the model, I keep using the wrong mouse and key/mouse clicks...
Lol, so true, I keep messing up my projects. Thank God for Ctrl+Z, which is the same almost everywhere you go. Nice corner! :)
|
|
By animagic - 9 Years Ago
|
|
Looks promising, Viking! But don't forget the most important part, making movies!
|
|
By justaviking - 9 Years Ago
|
Thanks for the comments and encouragement. I'm having fun, as I knew I would. I even managed to remember some of the keyboard shortcuts after being away from it for a few days. ;)
One obstacle I struggled with while making the table is that my years of working with CAD (Computer Aided Design) software sort of got in the way. To make the table-top, I would have started with a square of specific dimensions, and then simply added a 4-inch round on the corners. I'd be done in 30 seconds. It turned out to be a deceptive challenge, and even my son scratched his head a bit as to the "best" way to approach it. It wasn't just me, as it was easy to find a lot of discussions in Blender forums about making a plane with rounded corners. Making the tube that supports the table-top would also have been easier in a CAD package.
Part of any learning process is to understand the "thought process" of the new software. It's a different way of approaching a problem. As you know, it is also easy to make something that looks okay when it's untextured, but has some horrible underlying topology that becomes apparent when you apply a texture and render it.
I'm playing with another model (in addition to the lamp on the table), and it's probably a bit advanced for my present beginner's skill set, but it's fun to poke at a little bit now and then. I'll post a picture when I've got something worth sharing.
|
|
By urbanlamb - 9 Years Ago
|
|
justaviking (11/16/2016) I have ignored my own little corner here for a long time. Time for an update.
A few weeks ago I finally dipped my toe into the world of Blender. I've always known I would enjoy it, which is why, strange as it may sound, I purposely avoided it. It wasn't out of any fear; I simply knew it would distract me from other activities, such as iClone. When my favorite Blender hero (other than my son) finally produced a "beginner's" tutorial (and, bonus, it was on the new version), I decided to put my hands on it for more than 15 minutes.
You don't have to look very closely to notice the table's not done (see the latches for the legs), but I'm simply having fun learning and exploring. Anyhoo... I'm having a lot of fun, and wanted to share my humble start with you.
Well glad someone used that tutorial set :)
As for the dilemma... as a self-confessed software junkie, I tend to ignore the node stuff inside Blender and use my other toys for that. I use Blender for the bulk of my hard-surface modelling and my other toys for the more organic stuff (characters, for instance; I've been working slowly on some things). As for the video-making stuff, well, since (as I put it) "it's my dime," I do what pleases me at any time LOL
So the side effect of that is I have right now: 1) a video being done (working on the sets; it's not a long video), 2) one character set I am working on (nothing from CC, my own weirdness actually), 3) a pile of hard-surface assets for various ideas, either half done or in the back of my brain...
I realized long ago that I should just enjoy myself. The videos are coming slower (like way slower) but they do get done. Besides, my passion is more about 3D art than video; if it happens to turn into a video then yay, but otherwise I don't worry about it :)
Reallusion still makes lots of money from me though, so we are all happy I guess.
|
|
By pumeco - 9 Years Ago
|
Good to see you having fun with Blender, Dennis, looks like it's working for you, nice Cycles render!
Funny thing, Blender, you'll find that after a while, she'll start speaking to you.
You'll go to bed and all of a sudden you'll hear voices, which in your case will go something like this : "Dennis ... come to me Dennis ... I know you want some Blendertime Dennis! ..."
|
|
By justaviking - 9 Years Ago
|
|
pumeco (11/16/2016) You'll go to bed and all of a sudden you'll hear voices, which in your case will go something like this : "Dennis ... come to me Dennis ... I know you want some Blendertime Dennis! ..."
Funny! :)
sw00000p (11/16/2016) That's precisely why I will NOT use CAD-generated meshes. PITA Garbage!
No argument there. Nor would I want to design (for manufacturing) machine parts using Blender. Different tools for different purposes.
The first two approaches that seemed obvious were the "Subdivision Surface" modifier, which made a "curve" rather than an "arc" for the corners (so the table-top corners didn't follow the tubes on the underlying frame), and the "Bevel" feature, which was an easy way to make a very nice arc but sort of messed up the topology.
In the end, I used a cylinder, flattened it to the thickness of the table top, separated out quarter-sections, and "Bridged" nice quads between the quarter-sections. (No duplicate vertices or edges, of course.) It seemed like a lot of mouse clicks, especially compared to "add round," but the end result turned out very well and it was nice for unwrapping.
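(For the curious, here is roughly what the Bevel approach looks like if you script it instead of clicking; a sketch only, assuming a Blender 2.7x-era Python API, since the bevel operator's keywords have changed between versions:)

import bpy

# Rounded-corner table top via a vertex-only bevel.
bpy.ops.mesh.primitive_plane_add()
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
# Bevel just the corner vertices; more segments = a smoother arc
bpy.ops.mesh.bevel(offset=0.25, segments=8, vertex_only=True)
bpy.ops.object.mode_set(mode='OBJECT')

As noted above, this gives a nice arc but can leave messy corner topology, which is why the cylinder-and-bridge method unwrapped better.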
|
|
By pumeco - 9 Years Ago
|
|
justaviking (11/16/2016) Funny! :)
Yes, but make sure your wife doesn't find out about it!
BTW (on the subject of Blender and CAD), I don't know if you're aware of it, but there's a powerful architectural CAD suite hiding inside the latest version of Blender; you just have to manually enable it in the preferences before you can see it. Just go into Preferences and search for "Archimesh," enable it, and save your preferences. You should have a new tab on the left of the screen called Archimesh. No point me posting any videos about it 'cause there's hundreds of the things on YouTube. It's very popular from the looks of it, which doesn't surprise me at all, 'cause it's very easy to use and build stuff with - really fast!
|
|
By JIX - 9 Years Ago
|
|
NVidia designed PhysX to "JAM" using GPU ...and boy what a huge difference! :Wow:
I didn't know that they designed it just for me! :w00t:
But joking aside, I was planning to use a second GPU only for PhysX, but that doesn't make sense for now.
|
|
By JIX - 9 Years Ago
|
sw00000p (11/17/2016)
JJJAM (11/17/2016)
NVidia designed PhysX to "JAM" using GPU ...and boy what a huge difference! :Wow:
I didn´t know that they designed it just for me! :w00t:
But joking aside, I was planning to use a second GPU only for PhysX, but that doesn't make sense for now. Hey.... The last time I checked, Blender CAN'T use Nvidia PhysX. ....nevertheless, Yep! a HUGE difference!:Wow::Wow:
Not in Blender. I was talking about iClone 6. Hopefully iClone 7 will support PhysX > GPU.
|
|
By justaviking - 9 Years Ago
|
|
sw00000p (11/17/2016) ... and what good is PBR "If you can't render it REAL TIME!
If you are trying to play a game, then nothing. But if you are rendering a movie, then the "good" is a better-looking movie.
P.S. I submitted an Issue in Feedback Tracker about GPU-accelerated PhysX on Nvidia cards a few months ago. I haven't seen any announcement of them catering to my wishes yet, so we wait.
|
|
By pumeco - 9 Years Ago
|
|
sw00000p (11/17/2016) I have no interest in VR and what good is PBR "If you can't render it REAL TIME! IC7 Not looking so good to me, now.:crying:
I could be wrong, but I reckon PBR will sell more copies of iClone than any other feature they ever added. And the "Realtime" aspect of it is relative, really, 'cause even if it's not completely realtime, it's still realtime compared to waiting for raytracers and unbiased renderers. At least when PBR arrives, we will be building our movies inside a realistic environment. If you add a sphere to your PBR scene, it's going to look realistic, and when you edit its material, it's still going to change instantly as you edit it, right there in the environment, right there in the viewport instead of some irritating external window.
So even if technically it might not be realtime on some systems, it's still gonna feel like it is, it's gonna be awesome, or as a sw00000p would say ... BADAZZZ!!! :cool:
|
|
By pumeco - 9 Years Ago
|
|
urbanlamb (11/16/2016) I realized long ago that I should just enjoy myself. The videos are coming slower (like way slower) but they do get done..
I watched your new video "The Big Boom" a few weeks back, and enjoyed it just as much as the previous one about the chemistry - had that quirky charm to it again! But why you don't post the damn things on the Showcase Forum is beyond me. Many here won't even know about them if you don't post a heads-up there. Anyway, he might be a Hillbilly but he doesn't miss a trick :D
(Hope Dennis doesn't mind me embedding it in his corner thread) :
sw00000p (11/17/2016) PBR is new technology... Reallusion would have to re-write their ENTIRE Default Render Engine! :w00t: TODAY, RL will NOT invest on that "Super Expensive" algorithm!
Watch 'N Learn :cool:
Ah, but who said they want someone else's expensive algorithm? They have many new developers on-board from the looks of it, so they're probably doing it themselves, their own engine - the "iClone Engine" :cool:
If that is their plan, iClone is already a fantastic name for a PBR engine. And never forget: just because the big-name game engines out there have fancy PBR and iClone currently doesn't, there is nothing to stop Big Daddy hiring experts in the field just like those companies have. And always remember, sw00000p, Reallusion were around and developing software a long time ago, while some of those new guys were still sperm, swimming around in their daddies' testicles!
Have more faith in BigDaddy. I think they have no intention of using other developers' algorithms; I reckon they hired experts in the field and are developing their own :D BigDaddy might have plans to take back his territory, and BigDaddy just might succeed!
|
|
By justaviking - 9 Years Ago
|
@swooooop,
Most importantly... thanks for sharing. :)
The things you are explaining are not new concepts to me. (But maybe to others, so that's cool.) I have even been familiar with Blender's capabilities (and modeling in general) for quite some time; it's just that I've never personally been the one at the controls. Sort of like watching someone ride a bike: eventually you need to hop on and do it yourself if you really want to learn how. But you can still know the concepts and principles. I also understand unwrapping.
They "broke" our computers at work (thank you, security) and I can no longer post on the forum (and i would only do that only during lunch or on breaks, of course). I plan to share a couple of pictures this evening for you to enjoy. I have a great video to share, too, but I'm not going to try and do that on my phone.
Didn't want you to think I was ignoring you. Bye for now.
|
|
By justaviking - 9 Years Ago
|
As is usually the case, "life happened" and I didn't get any Blender or iClone playtime. But I did manage to accomplish some landscaping (the last half was done in the dark - I'm curious to see it in the morning).
Before I go to bed, here is the video I wanted to share. If nothing else, please watch from about 1:20 to 2:20.
In a CAD environment, you create GEOMETRY and FEATURES. A hole is actually a hole, not just a mesh that looks like a hole. Notice how quickly and easily that tool can add a fillet (a rounded inside-corner) and the bevel. And how you can easily drive it with dimensions. I really like where they drive that slot down through the part.
sw00000p: You are used to "CAD" modeling FOR YOU. "Push the button - GIMME What I Want." a) Of course!!! That is the purpose of having quality tools. b) But it doesn't model "for me" any more than Blender models for me, or iClone animates for me. c) As I said, I just have to make that adjustment to the new tool, and how best to use it to achieve the desired result. d) CAD tools sure do excel at making geometric shapes, though I wouldn't want to try and sculpt a face in one.
I know sometimes sw00000p and I speak across each other. Often, I think, due to our sometimes unspoken perspectives. When he says, "You can do it," I think he usually means in the context of the universe of software tools available to you, often regardless of price. When I say, "You cannot do that," I am usually referring to the iClone tools from Reallusion. So, can you do bones and mesh deformations and all that stuff? Yes, per sw00000p. No, per Justaviking. And I think we are often both right, in the context from which we speak.
P.S. I did not have time to create a bunch of screenshots showing the three ways I made the tabletop. Maybe tomorrow, or at least sometime this weekend.
|
|
By JIX - 9 Years Ago
|
|
d) CAD tools sure do excel at making geometric shapes, though I wouldn't want to try and sculpt a face in one.
Not only at making geometric shapes. CAD can also do freeform surfaces, but still it's a completely different approach.
I have a lot of CAD experience and thought it would be helpful, but actually it's kinda hard to free oneself from the CAD mindset.
|
|
By justaviking - 9 Years Ago
|
JJJAM (11/18/2016)
d) CAD tools sure do excel at making geometric shapes, though I wouldn't want to try and sculpt a face in one. Not only at making geometric shapes. CAD can also do freeform surfaces, but still it's a completely different approach.
I have a lot of CAD experience and thought it would be helpful, but actually it's kinda hard to free oneself from the CAD mindset. I agree on both points. I thought about CAD's ability to do "organic shapes" but decided I had already typed enough in that post. I like to use Microsoft Word and Excel as an analogy. I can type words in Excel, and I could write a letter to someone in Excel, print it, and you'd never know I used Excel. So why have Word? There is a point where you might want to use a tool that is better for the task at hand. You can do "freeform" curvy shapes in a CAD tool, but I don't think you'd want to use it for creating the monster character for your game. On the other hand, CAD would be perfect for that folding table I made.
I can totally relate to what you are saying about the "CAD mindset." The ability to think and visualize in 3D is a universally transferable skill, but as my little table project demonstrated (and as I already knew from past experience), different tools are built on a different approach. It's like moving from Photoshop to Substance Painter. Both are great programs, and you need to understand "layers" to use both programs effectively, but they have very different ways of approaching the task of painting a texture. It makes learning new tools an adventure.
|
|
By JIX - 9 Years Ago
|
|
I didn't know that CAD is female, but what I meant to say is: nice to meet people who feel at home in both worlds, or at least are heading toward understanding them both. ;)
|
|
By animagic - 9 Years Ago
|
|
I haven't used it in a while, but I believe DAZ Hexagon is a modeling package that blends the CAD and 3D modeling world to an extent.
|
|
By JIX - 9 Years Ago
|
|
True, that's why I feel quite comfy in Hexagon.
Independently of the software, the approach is still different. I don't care about meshes in the CAD world.
|
|
By justaviking - 9 Years Ago
|
While we wait for iClone 7 with PBR and all that fun stuff, here is another Blender distraction. Yes, it was 95% "tutorial" (I'm learning, so no shame in that), but I also went beyond it in a few places to be sure I understood what was happening and to learn a couple of things on my own. Fun stuff.

|
|
By animagic - 9 Years Ago
|
|
Dennis, that's a very nice bear! Well done!
|
|
By justaviking - 9 Years Ago
|
Thanks, Job.
Thanks, sw00000p.
Everything started as a "UV Sphere." It was a lot of good practice using "Proportional" displacement to stretch the arms and legs into their shape. Even his "feet" (light brown) are just spheres scaled in one axis, so they are shaped like M&Ms that fit into indentations in the arms and legs.
The main lesson for me was working with the hair. That involved populating it with parent hairs and then having child hairs fill it in. Using the hair editor, it was fun to comb it in different directions and to add some extra hair in a couple of thin spots.
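(For anyone curious what that parent/child setup looks like from Blender's Python side, it's something along these lines; the counts and lengths here are my own guesses, not the tutorial's values:)

import bpy

# Add a hair particle system to the active object, with interpolated
# child hairs filling in between the parent (guide) hairs you comb.
obj = bpy.context.object
obj.modifiers.new(name="Fur", type='PARTICLE_SYSTEM')
settings = obj.particle_systems[-1].settings
settings.type = 'HAIR'
settings.count = 1000                  # parent hairs
settings.hair_length = 0.05
settings.child_type = 'INTERPOLATED'   # children fill in between parents
settings.rendered_child_count = 50     # children per parent at render time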
The mesh isn't really good in a technical sense. Nothing is welded together. The head just sits on the body, and the arms and legs are pinched on one end and positioned against (slightly into) the body. So there certainly is no optimization in that regard. But for making a nice looking teddy bear that renders in about 65 seconds (at full HD) it's fine. And the lesson, to me at least, was to learn about the hair and also to use other basic techniques before I forget them. There was no UV unwrapping involved, and inefficient meshes are not a problem in this case.
For anyone that's interested, the guy was very explicit in almost every instruction. He rotated and scaled by entering numbers, so it's easy to recreate the same bear he did and get everything positioned correctly. He doesn't really "teach" the hair/particle system, but he gives you the values he used so you can match his end result.
He has another one that morphs a rolling ball into a wine glass. I half-watched it while doing other things, and it looked pretty good too.
Having done this tutorial, now I want to go grab a bunch of my kids' stuffed animals and start making them. See? I always knew Blender would suck me in and now I won't get anything else done.
|
|
By justaviking - 9 Years Ago
|
Today has been a happy day.
I have FINALLY been able to enjoy rendering with Iray on my GTX 1080. :) :D :)
After a long delay, Nvidia finally released Iray with support for the Pascal (10xx) series of GPUs. It was dependent on CUDA 8 being completed, among other delays.
Then Allegorithmic had to incorporate it in their products, test it, and release it. Substance Designer 5.6 became available a few days ago. Substance Painter will get its update at a later date (nothing specific yet).
With much anticipation, I rendered a simple model and...... got a pure black image. Long story short, I worked with their support team and there was some gunk in one of my Substance Designer registry entries. Cleaning that out saved the day and Iray works now.
RESULTS? My GTX 1080 renders the Iray scene 15x faster than my i7-3770K CPU. You read that correctly. Not 15 percent faster; 15 times faster!!! :w00t:
Here is a sample project I rendered. I take no credit for the quality. It is a sample project from Allegorithmic.
At 1920x1080 resolution, it rendered 471 iterations in one minute.


|
|
By fmccann - 9 Years Ago
|
|
Nice render;):)
|
|
By justaviking - 9 Years Ago
|
|
fmccann (12/20/2016) Nice render;):)
I'm glad you included the winky face. It helps me reiterate that it's not my artwork. Both the modeling AND the painting are a sample provided by Allegorithmic.
I just had to share my long-awaited success in running Iray on my new (half a year ago) graphics card. My old card was an AMD card, so I've always had to do my Iray renderings on the CPU. I knew it would be a lot faster on the GTX 1080. The CPU/GPU delta for Indigo was not very impressive in my opinion, but for Iray I am totally satisfied with the 15x improvement in rendering speed.
I'm sure in a week I'll start wishing for more speed. Isn't that typical? But for now, I'm happy and I'm going to savor the moment.
|
|
By justaviking - 8 Years Ago
|
Greetings people.
I'm doing a little test in FeedBack Tracker.
Will someone kindly add a COMMENT to this Issue? http://www.reallusion.com/FeedBackTracker/Issue/Doing-a-FeedBack-Tracker-test
Thanks.
|
|
By justaviking - 8 Years Ago
|
Thank you, UrbanLamb.
What I confirmed: a) It does indeed ask you to confirm your desire to delete the Issue. b) It will delete the Issue even if there are comments added to it.
I "accidentally" deleted a long and detailed Issue, and I couldn't remember if it gave me a "Confirm/Cancel" option or not. It does. The deletion was 100% my fault, not FeedBack Tracker. We get so used to saying, "Yes" that we don't always pause and give thoughtful consideration to that prompt.
The test is done. Thanks again.
|
|
By justaviking - 8 Years Ago
|
Greetings,
Some of you may have noticed I've been unusually quiet around here lately. There are three reasons: 1) Life has simply distracted me with other things that have absorbed my time. Nothing bad, just the ebb and flow of life's activities. 2) I always enjoyed posting during 5- and 10-minute breaks while at the office. But due to changing internet security profiles, I cannot post there anymore (and I detest trying to do it on my phone). I can't even view via the "Latest" tab; I have to manually search each category, which makes it a bit of a chore to keep up on things. For one or two days (a few weeks ago) I could post again, but then it went away. 3) I've spent some time with Substance Painter.
So, I'm just saying "Hi" to let you know (warn you?) that I think I'm getting back into iClone land, where I play with some of my favorite software and correspond with people here that I call friends.
Wait, did I say, "Substance Painter?" Yup. The good folks at Allegorithmic did a contest to paint a very simplistic model they provided. It was a lot of fun, and like any "project" if causes a person to learn new things. I tend to poke at individual bits of functionality in Painter, much like you might tap individual keys on a piano. Figuring out how the keys and the pedals work is one thing, but playing a song is quite another.
I'll insert a picture here, and provide a couple of links.

Link to the "Meet MAT" contest forum thread: https://forum.allegorithmic.com/index.php/board,53.0.html
Link to my contest thread, which shows a bit about a couple earlier concepts I played with: https://forum.allegorithmic.com/index.php/topic,17046.0.html
I even opened a Sketchfab account so you can view my model in its full 3D glory: https://skfb.ly/6qFwS
Link to a "collection of entries" that someone started (with only 14 replies so far): https://forum.allegorithmic.com/index.php/topic,17333.0.html
There were a lot of stunning entries. One major rule was that we were not allowed to change the actual geometry (mesh) at all. Using simple geometry (and using their "sample object" for a head, rather than a human face) really fueled a lot of creativity. Astronauts, deep-sea divers, traffic lights, clocks, showerheads, bubble gum machines, and on and on.
P.S. It won't be long until I need to start an "iClone 7 corner" thread.
|
|
By Rampa - 8 Years Ago
|
Hi JVK,
For viewing "Latest" on your phone browser, click on the three-liney thing. ;)

|
|
By justaviking - 8 Years Ago
|
Thanks, Rampa. I knew that, and I've done it from time to time, especially if I had a topic or two I was particularly interested in, but it's just not fun to read on the phone's small screen, and then I'm often left with the options of: a) painfully replying on the phone, b) not replying, or c) trying to remember to reply when I get home.
On a related note, though, when you open a post on the phone, is there a way to see where the post came from? (Breadcrumbs?)

|
|
By Rampa - 8 Years Ago
|
It seems there is! :)
I just tried it. Rotate your phone sideways to landscape. You may need to zoom out as far as possible. What displays is based on zoom-scale, so just like on your computer when you zoom in or out on the RL forum, different things wink in and out.
|
|
By justaviking - 8 Years Ago
|
Indeed. There it is. Why did I think it wasn't there? I know I've put my phone in Landscape position many times for reading posts.
It's possible something changed, but most likely I actually did know that, but forgot about it while I was busy complaining about trying to use my phone to access the forum. Smart phones are great things, but they are no replacement for a reasonable computer monitor and a real keyboard.
|