Sasha W.

Sash Update: GeForce, drivers, CUDA, NVENC... and less than 8GB of video memory.

Well, where do I start? Okay, let's start with the first part of the title: GeForce. The long and the short of it is: I am back with NVIDIA GeForce again. By this point, you've probably guessed why (considering the second part of the title), and yes, you'd be correct. Now, the RX 5500 XT 8GB was a great card; I was perfectly happy with its performance, efficiency and the price I paid (it wasn't stellar value, but £185 for the 8GB model was reasonable). I even typed an Autism Piece on this little GPU, so you know I was fond of him.


But the thing is, while the hardware is great, the drivers/software are not (and this is a bit of a historical issue with AMD). Now, if you follow me and my content (that one person :O ), you'll realise this is a bit of a huge turning point for me. I'm generally really pro-AMD, unless of course they do something dumb, or do something dumb and then correct it, but you get the message. I generally have a lot of positive things to say about AMD, and especially Ryzen.


Radeon, however, I have a bit of a love/hate relationship with, and over the last year or two that relationship has been pushed to breaking point many times, with me going GeForce for a while and then switching back. But now I think I have made a firm decision.


I'm going to use Nvidia GeForce video cards now.

Now, some people (hardcore AMD fans especially) will accuse me of being fickle (TBF: I am fickle) and of now shilling for GeForce, but you can read my content, see what I type and decide for yourself.


It isn't just about the driver instability, though.

I talked a lot about the CUDA/NVENC reason in this post, so you can read that if you're interested; I don't want to parrot what I typed there and bloat this one out. And I have a habit of bloating posts out. :D Ultimately, the software I use (image & video editing and rendering) uses CUDA acceleration. For whatever reason, it doesn't use OpenCL on AMD, but that's not my problem; I just use what I have available, and that is CUDA. Yes, it's a shame, because AMD's hardware is more than capable of this sort of work, but go blame the devs, not me. NVENC needs no explanation: the file size/quality gains vs AMD's encoder, and the speed-up vs encoding on my 1920X, are really helpful.
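If you're curious what handing an encode to NVENC actually looks like, here's a minimal sketch in Python, assuming an ffmpeg build with NVENC support is on your PATH. The file names are made up, and the quality settings are just a starting point, not my exact ones:

import subprocess

# Hypothetical file names - swap in your own.
SRC = "capture.mkv"
DST = "capture_hevc.mp4"

# Hand the video encode to the GPU's NVENC block (hevc_nvenc = H.265),
# leaving the CPU free for WCG / editing work in the meantime.
subprocess.run([
    "ffmpeg",
    "-i", SRC,
    "-c:v", "hevc_nvenc",  # NVENC H.265 encoder
    "-preset", "slow",     # quality-leaning NVENC preset
    "-cq", "23",           # constant-quality target; lower = bigger file, better quality
    "-c:a", "copy",        # leave the audio stream untouched
    DST,
], check=True)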


Sure, if I have some time I'll encode with the CPU and H.265 on Handbrake's 'Slow' preset for the maximum quality/size ratio, but my 1920X pulls double duty as a workstation and WCG server; time spent encoding is time not spent doing WCG - and that is my passion/hobby.

Offloading that work to the GPU is really nice. The same argument applies to stud.io's Eyesight renderer, which can use CUDA - allowing me to render the image on the GPU and free up CPU cycles for Paint dot net / GIMP editing in the meantime. But enough about that.


Less than 8GB of video memory? That is Heresy and you should be Executed for Crimes against Textures.

Okay, this is a bit of a sensitive subject for me, because I spent a lot of time talking about how I'd never get less than 8GB, how 8GB is the minimum and all that. In fact, the very reason I disliked the 5600 XT was its 6GB - so you can see where this is going.


I'm not doing a full U-turn here and saying 6GB, or even 4GB, "is enough!" for maxed-out 1080p gaming, especially going into the new console generation, but I am saying it can work for a lot of people right now, and even in the future with some minor tweaks to a few settings.


In a recent Tweet, I admitted that I was a bit "too aggressive" on the subject of VRAM, after playing my favourite titles on the GTX 1650 Super with its 4GB and realising that (ironically) the femguy spewing "8GB is the minimum" was playing games that work just fine on 4GB. That came across to me as a tiny bit of hypocrisy, and I really dislike hypocrites.


Sorry about that. Now let me explain my new position on this subject. Do I think 4GB is enough going forward? The simple answer is no, it is not - even 6GB might be problematic - but it's not quite as clear-cut as I made it seem. For a huge array of current (even newer) titles, 4GB at 1080p is enough for a good experience. Borderlands 3, which regularly used over 4GB, and occasionally even over 6GB, on my 5500 XT, actually runs perfectly fine on the GTX 1650 Super with 4GB. And those are the exact same settings as on the 5500 XT, so don't shout at me for lowering the render scale to 75% on the 1650 Super; it was 75% on the 5500 XT too, since neither card really has the grunt to push the FPS to where I feel comfortable in this fast-paced shooter/looter at full res. But that's beside the point.


Would I recommend the GTX 1650 Super for "future-proof 1080p gaming"? No. But it makes a lot of sense right now in its price class for current and older games, and it should do OK with newer games too, as Nvidia's VRAM management seems to be better than AMD's. If not, dropping texture resolution a tier would probably fix that. Big deal? I think not.


Sash, you're a Retard. You did a U-Turn on VRAM. I hate you, I'm not going to read your shitty blog or Website anymore!

Did you just read what I typed? I just said I didn't do a full U-turn; I "adjusted" my opinion to be more reflective of real-world use cases. To be fair, I haven't played Fallout 76 on the GTX 1650 Super yet, but... Fallout 76. We'll see.



I'm losing focus and I need a poo, so I'm going to have to wrap this one up. I just wanted to add that I've just ordered a GTX 1660 Super (that's 6GB) for £199 on Amazon with free shipping. That's only £14 more than the £185 I paid for my 5500 XT, for around 15% more performance, but 2GB less memory. Considering everything I've stated above, and especially for the games I play, I think that's a better deal.



