So I have some cool things planned for the Archives, and in order to facilitate that, I have employed the services of the Bank of Borb (<3) to provide some funds (repayable) to acquire a certain type of processing unit (okay, it's of the graphics variety) to speed up certain forms of... okay, I'm just gonna say it. My instant noodles will take a while to cool down, so here it goes.
I now have an RTX 3060 12GB in addition to the RTX 3050 I already posted about. The 3060 is actually primarily for OptiX rendering in Blender. So that gives away my new hobby, and well, I love it. I have essentially learned to model and animate in Blender simply by using it constantly and pressing buttons to see what they do (this isn't far from the truth), plus a few online tutorials for basic stuff to build on. Anyway, anyone who knows the Archives will know that my models (such as space ships) are built out of digital LEGO bricks in software called Stud.io. Well, I am now taking it a step further than just producing the ray-traced still images I used from that.
I am now importing my models into Blender and modifying them as necessary in order to produce fully animated CGI movies with them. And... I am really excited about what I am going to do with that, to bring the Archives to life in a way I have never been able to do before.
So, what about your PC, Sash?
Well, my first few scenes are poorly optimised, and because the plugin I am using for Blender to read LDraw (LEGO) files converts each individual brick into a distinct mesh (internals included), my scenes have a LOT of hidden vertices. I had been working with them like that until semi-recently, when I started optimising them by hand. Ugh, I am sort of going off on a tangent here. What I mean to say is that my Blender scenes need a LOT of VRAM, so the 12GB on the RTX 3060 is very much appreciated.
That said, going forward, I am going to try and keep the scene memory below 8GB, using tiling and cleaner geometry, so I can use the 3050 and the 3060 together; OptiX will literally let you use any RTX GPU you can throw at it, which is cool. Otherwise, I am sure you already read somewhere (actually, I don't even know if I typed about it) that I sold my 5950X and am just using my trusty old 3900XT. The reason for that is, well, cost. The 5950X was one of those parts I just don't hold on to, because it didn't feel like a "good investment". So, I am planning to get a cheap 5900X as Zen 4 launches, which will be a much better value upgrade over this 3900XT, which I also got fairly cheap a couple of years ago.
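For anyone curious what "keeping scene memory below 8GB" looks like in practice, here's a tiny back-of-the-envelope sketch I could run before committing to a render. The per-vertex byte count and texture headroom are purely illustrative assumptions on my part, not Cycles' actual memory layout:

```python
# Rough sketch: will a scene's geometry fit a GPU's VRAM budget?
# bytes_per_vertex and texture_headroom_gb are illustrative guesses,
# NOT how Cycles actually accounts memory.

def geometry_vram_gb(vertex_count: int, bytes_per_vertex: int = 64) -> float:
    """Estimate GPU memory for mesh data alone (positions, normals, UVs, BVH, etc.)."""
    return vertex_count * bytes_per_vertex / (1024 ** 3)

def fits_budget(vertex_count: int,
                budget_gb: float = 8.0,
                texture_headroom_gb: float = 2.0) -> bool:
    """Leave headroom for textures and render buffers on top of geometry."""
    return geometry_vram_gb(vertex_count) + texture_headroom_gb <= budget_gb

# Example: 50 million vertices of un-optimised LEGO internals
print(round(geometry_vram_gb(50_000_000), 2))  # ~2.98 GB for geometry alone
print(fits_budget(50_000_000))                 # True: still fits under 8GB
```

It's crude, but it's enough to tell me whether a scene will spill past the 3050's 8GB before I waste an hour finding out the hard way.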
Anyway, having two GPUs is cool; right now I am actually doing a scene render on the 3060 while playing games (as usual) on the 3050. The 3900XT is powerful enough to do both with no lag, and my system has 64GB of memory, so Blender is satisfied (and my biggest scene used ~30GB just to open the model, lol).
Okay, this was more of a babble than an actual update post. But, hey. It is what it is. Oh, that's a category, too. So what now? Meh. I'll eat these noodles, drink some Diet Caffeine Free Coke and get back to working on the Archives. I have a whole bunch of new content planned, including some new stories and plot branches, that you may have noticed lately.
Oh, I almost forgot. I may actually use the 3060 for games like Dying Light 2, Metro Exodus Enhanced and Cyberpunk 2077 (if I actually play the latter, it's on the list); essentially RT games where the 3060's extra 8 SMs and higher bandwidth will make a difference. Otherwise I'm basically still using the 3050 for gaming, because it's genuinely enough for my dailies like Warframe / DRG. What I meant by this last bit is, I might do RTX 3060 gameplay (and OC) videos, if I actually ever get back to posting gameplay videos... I should... But... Blender..
Oh, they both overclock really well. Like, 2100 MHz rock solid. :3
Meow, for real this time.