How Much VRAM Do Gamers Need? 8GB, 12GB, 16GB or MORE?

I've yet to play a game that needs more than 12 GB of VRAM at the native resolution of my monitor (3440x1440).

Elite happens to be one of the few that likes more than 8 GB. Of course, there are games that will need 12 GB+; I just haven't played them.
 
IMO it depends on whether you upgrade every other generation or plan to keep your GPU for 5 years / skip 3 generations. It also depends on whether you have a PCIe 4 CPU/motherboard and a 5+ GB/s NVMe drive, which allow very fast asset swaps even if you do run into VRAM limitations. I'm stuck on PCIe 3 and a 2 GB/s NVMe drive (AMD didn't release the B550 chipset until the end of the Ryzen 3000 series' life) and upgrade every 5-6 years, so as much VRAM for me as I can have 🙃
 


It really depends on resolution and settings, and of course on whether it's VR or not.
Also on the usage scenario.
And ultimately, on the budget allocated for the gaming rig.

My previous gaming laptop had a 1080p screen and a laptop RTX 3080 with 16 GB of VRAM. That's an overkill amount of VRAM for that resolution, no matter the settings.
But that amount of memory started to matter when I dual-logged two Odyssey clients, both obviously with all settings maxed out (Ultra+).

My current laptop has a 1440p screen and a laptop RTX 4090, again with 16 GB of VRAM. Still overkill for this resolution.
But then again, without 16 GB of VRAM I would have trouble dual-logging two instances of Odyssey on Ultra. I would have to lower the settings (Ultra needs 8 GB of VRAM per instance).
And I did test this: with three instances of Odyssey on Ultra+, the game struggles on my system with 16 GB of VRAM, but if I drop the settings to default High, it's playable.

IMO, 8 GB of VRAM is still enough for up to 1440p at high settings in all current games, but it may pose some issues in several years - nothing that couldn't be solved by lowering certain settings, though.
IIRC The Last of Us struggled a bit on 8 GB of VRAM at ultra settings, but it was OK after lowering some settings. It wasn't unplayable, but it had quite a lot of FPS variation and low frames for the 1% lows.
 
My last two PCs both have 32 GB of system RAM. Video card RAM is 8 GB on one and 16 GB on the other.

My next PC is still years off for me, but by then I expect 64 GB of system RAM will be the norm, and 24-32 GB of video RAM.
 
"How much VRAM do gamers need?"
"How long is a piece of string?"

Same questions, different targets. Both are impossible to answer, but make for great clickbait.
I don't think it was clickbait—Hardware Unboxed is pretty good. Rather, it was a good insight into how much VRAM modern games eat up at max and high settings, and that selling 8 GB GPUs for 400+ € is a travesty.
 
Do I really need the 24 GB of VRAM on my RX 7900 XTX? ...No, not really. I never used anywhere near the 16 GB I had on my old RX 6950 XT, no matter how high I wound the graphics options up in anything I played... even The Last of Us when it was first released on Steam in a fairly broken state 🤷‍♂️
 
The other day I checked the AMD overlay while playing E: D with my new RX 7800 XT (Ultra settings, 1.5x supersampling on my 1080p monitor) and it showed just over 12 GB of VRAM allocated. Granted, VRAM allocated isn't VRAM used, but still... On a QHD (1440p) monitor with 1.25x or 1.5x SS to mitigate the bad AA and moiré patterns, 12 GB might be pushing the limits. But of course, E: D with its bad AA is an edge case, and better optimized AAA games that implement good TAA should be fine with 12 GB at QHD.
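To illustrate why SS eats VRAM so quickly, here's a quick Python sketch of rendered pixel counts. I'm assuming the SS factor scales each axis linearly (which is how I understand E: D's slider works); treat the numbers as illustrative.

```python
# Rendered pixel counts at different supersampling (SS) factors.
# Assumption: the SS factor multiplies each axis of the base resolution.

def render_pixels(width: int, height: int, ss: float) -> int:
    """Pixels actually rendered per frame at a given SS factor."""
    return int(width * ss) * int(height * ss)

fhd_native = render_pixels(1920, 1080, 1.0)  # 2,073,600 px
fhd_ss15 = render_pixels(1920, 1080, 1.5)    # 2880x1620 = 4,665,600 px
qhd_ss15 = render_pixels(2560, 1440, 1.5)    # 3840x2160 = 8,294,400 px (4K)

print(qhd_ss15 / fhd_native)  # 4.0 - QHD at 1.5x SS is 4x the native-1080p load
```

So QHD with 1.5x SS is effectively rendering at 4K, which is why 12 GB starts to look tight there.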

Generally, big leaps in system requirements tend to come with new console generations, so I expect that 12 GB will start becoming the minimum viable amount in around 3 years (extrapolating from the PlayStation launch cadence since the PS2), just like 8 GB started becoming minimum viable in 2020/2021 after the release of the PS5.
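For the curious, that extrapolation is just averaging the gaps between PlayStation launch years. A quick Python sketch (launch years from memory, so treat them as approximate):

```python
# Rough next-generation estimate from PlayStation launch years.
launches = {"PS2": 2000, "PS3": 2006, "PS4": 2013, "PS5": 2020}

years = sorted(launches.values())
gaps = [b - a for a, b in zip(years, years[1:])]  # [6, 7, 7]
avg_gap = sum(gaps) / len(gaps)                   # ~6.7 years
next_gen_year = years[-1] + round(avg_gap)        # 2020 + 7 = 2027
print(next_gen_year)  # 2027
```

Which lands roughly 3 years out, matching the estimate above.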
 
I regularly see 15 GiB+ in Odyssey at my usual settings, but the game also runs without any real VRAM-related issues on my 10 GiB cards (at somewhat lower settings).

When it comes to buying GPUs today, there are two main segments where VRAM might be a concern. All those 250+ USD cards that only have 8 GiB of VRAM are rip-offs: do not pay $250 or more for a card with only 8 GiB of VRAM; there are better choices. The other segment where VRAM might be an issue is the RTX 4070 Super and Ti, which are fast enough to occasionally make the 12 GiB they come with a problem. Most other cards one might buy new are reasonably safe.
 
I don't think it was clickbait—Hardware Unboxed is pretty good.
Indeed they are; they are so good that they recently suggested that if one disagreed with their maunderings one should unsubscribe, so I did!

Some of the bigger 'influencers' consider themselves to be important, rather than just some guys making money from clicks...

On topic: I have 24 GB of VRAM too; I don't think any game I play uses all of it, either.
 
Indeed they are, they are so good that recently they suggested that if one disagreed with their maunderings they unsubscribe, so I did!
To be fair, they probably are completely fed up with complainers about their methodology.

For example, every time they test CPU performance in games, there are a bazillion comments along the lines of "Who uses a 4090 at 1080p, LOL, your methods stink!" without realizing that you're supposed to cross-reference the 1080p RTX 4090 CPU benchmarks with the GPU performance benchmarks (done with the best available CPU) to determine whether your planned system will have a CPU or GPU bottleneck at your monitor's resolution. E.g., if the midrange CPU you plan to buy does 130 FPS with a 4090 at 1080p, and the GPU you have your sights on can manage 120 FPS with a 7800X3D at 1440p, the system is well balanced and won't benefit from a higher-end CPU. But people don't seem to understand this...
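That cross-referencing rule is simple enough to put in a few lines of Python. The FPS figures below are just the hypothetical numbers from my example, not real benchmark results:

```python
# Predict a system's frame rate from two benchmark ceilings:
# the CPU ceiling (CPU tested with the fastest GPU at 1080p) and
# the GPU ceiling (GPU tested with the fastest CPU at your resolution).

def predicted_fps(cpu_ceiling: float, gpu_ceiling: float) -> float:
    """A system runs at the lower of its two ceilings."""
    return min(cpu_ceiling, gpu_ceiling)

cpu_ceiling = 130  # hypothetical: midrange CPU with an RTX 4090 at 1080p
gpu_ceiling = 120  # hypothetical: target GPU with a 7800X3D at 1440p

fps = predicted_fps(cpu_ceiling, gpu_ceiling)
bottleneck = "GPU" if gpu_ceiling < cpu_ceiling else "CPU"
print(f"~{fps} FPS, {bottleneck}-bound")  # ~120 FPS, GPU-bound
```

Since the GPU ceiling is the lower of the two, a faster CPU buys you nothing here.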
 
To be fair, they probably are completely fed up with complainers about their methodology.

For example, every time they test CPU performance in games, there are a bazillion comments along the lines of "Who uses a 4090 at 1080p, LOL, your methods stink!" without realizing that you're supposed to cross-reference the 1080p RTX 4090 CPU benchmarks with the GPU performance benchmarks (done with the best available CPU) to determine whether your planned system will have a CPU or GPU bottleneck at your monitor's resolution. E.g., if the midrange CPU you plan to buy does 130 FPS with a 4090 at 1080p, and the GPU you have your sights on can manage 120 FPS with a 7800X3D at 1440p, the system is well balanced and won't benefit from a higher-end CPU. But people don't seem to understand this...
Those are common complaints on any of the 'tech' channels, because people don't understand exactly what is being benchmarked.
With that channel I just took their "go away if you disagree" at face value, maybe one day they will see daylight and smell fresh air, but I doubt it...
 
With that channel I just took their "go away if you disagree" at face value, maybe one day they will see daylight and smell fresh air, but I doubt it...
Just out of curiosity, what do you find so disagreeable with HWU? Not to say I am a devoted fan of them, I feel rather neutral towards them--and maybe I miss something here, because I don't watch their every video or follow them that closely. Every tech journalist has their biases, preferences and "blind spots", but I still find them worth reading/watching/listening to even if I have my disagreements. Exceptions being when they start completely and blatantly trashing (or fanboying) something for no good reason or go down some conspiracy theory grifter route (blergh).
 
Just out of curiosity, what do you find so disagreeable with HWU?
I used to find their content interesting and engaging, until, as is common with larger channels, they...
start completely and blatantly trashing (or fanboying) something for no good reason
Opinion is not fact, and should be presented as such.

In a video where they were busy trashing reactions to a comment they had made previously, they did suggest that if viewers didn't like what they said, they should unsubscribe from the channel.

Losing one subscriber isn't going to hurt them...
 
In a video where they were busy trashing reactions to a comment they had made previously, they did suggest that if viewers didn't like what they said, they should unsubscribe from the channel.

Losing one subscriber isn't going to hurt them...
Ah, I see. I haven't seen that video—I try to avoid all the online interpersonal drama.

And I guess it's all fair from all sides—I have unsubbed from many channels for various reasons even if I find some (or even most) of their content generally useful and informative.
 