12th gen / 13th gen vs Ryzen 7000 series, etc.

So here we are. New AMD GPU & CPU.
New Intel CPU (won't mention their GPU, it's pants). And the mighty 4000 series GeForce GPUs.
I've seen the tube vids. I'm confused as hell.
It's all smoke 'n' mirrors!
Can someone kindly say which is best for gaming?
And are we going to have to wait for ATX 3.0 PSUs?
 
I'm gonna take a punt, assuming new PSU compliance etc.
Zen 4 7950X3D (ain't out yet)
And a 4090 Ti (or a 4090 FE)
32 GB DDR5
What board?
 
Nothing is really available yet, so wait and see what the tech reviews say.

One thing's for sure: if you're concerned about power usage, just don't bother.
 
It's too early.

It seems an ATX 3.0 PSU will be pretty much needed with the new Nvidia cards. There are also concerns about the adapter cables that feed four 150 W 8-pin cables into a 12-pin plug to meet the huge power demand; these have reportedly been overheating and melting in some testing.
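The arithmetic behind that adapter is simple enough to sketch (the 600 W total is the connector's rated ceiling, not what every card will actually draw):

```python
# Rough power budget of the 4-way adapter: four 8-pin PCIe inputs rated
# 150 W each, all feeding a single 12-pin plug.
inputs = [150, 150, 150, 150]   # W per 8-pin cable
print(f"{sum(inputs)} W through one connector")   # 600 W
```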

It will take a while for all the PCIe gen 5 hardware to get tested and evaluated before we really get a good handle on the best approach to putting it all together: PSU, RAM, mobo, video card, and the next-gen M.2 drives.

It does seem that the 7950X will dominate early on the CPU side - but the competition isn't going to just lie down. And will AMD's video cards mount a stout challenge to the Nvidia cards? I still need to see some sign that they can improve on the driver side of the equation, even if the hardware is competitive.

I expect things should clarify a bit after the holiday season. Not to mention we'll likely be in a recession by then, and prices might be surprisingly affordable compared to what we've been through the past few years.
 
The 5800X3D is still the best currently available gaming chip. Nvidia's 4000 series is, in the words of Steve from Gamers Nexus, currently a "clown car" of a GPU. In today's climate, just throwing more watts at performance (and that is exactly what Nvidia have done) is downright silly. Most of the shrouds for these cards are 4 slots deep, just to dissipate the heat - the FE cards have a 450 W TDP. EVGA even stopped making Nvidia cards! It's stupid. Just stupid.

AMD are no angels here either, with performance-per-watt not significantly better in their new CPUs; they've just thrown more power at their cores for higher clocks.

I guess you could use a new rig to heat your house, dry your clothes and cook dinner - you won't need the central heating on; your rig will be kicking out 600 W minimum.
 
Indeed - this is an issue.

Even if I decide to build a gen 5 rig, I'd likely be looking at grabbing a clearance 3090 for the build, if one is available.
 
To me, after the next gen CPUs and GPUs are released is a great time to buy... last gen. The Black Friday/holiday sales this year should be good, as long as you don't need or want the latest and greatest. Last gen deals should be well worth it.
 
Yup. I'll be looking for a 5800X3D after Christmas for myself - the vanilla 5800X I have right now is just fine, but I want to put together a PC for my son who's arriving in the UK around March. I already have the GPU, so I only need an AM4 motherboard and RAM, both of which should be pretty cheap then too.
 
If anyone was running a high-end GPU and CPU before, power consumption is not likely to be a deal breaker with the new stuff. Price might be, but not power. The real concern is if you have an older or lower-quality PSU, or a model whose low OCP limits prevent it from handling reasonable power excursions... but that would likely have been an issue with an RTX 3080/3090 or RX 6800/69xx part as well.

Even the RTX 4090 shouldn't need a new PSU if one's current unit is half-way decent. Transients are more tightly controlled than in the past generation, and the issue with the 12VHPWR cables relates to early implementations of the 12-pin connector (one should stress test these to ensure they don't get hotter than they should under peak load, new PSU or not), not to the 8-pin side of adapters or to PSUs themselves. I was fully intent on running an RTX 4090 on a 650 W SFX-L PSU (and would have had power to spare) until I decided to work in better margins to keep the PSU from overheating, since I may need to block off some of its intake airflow with the ducting I'm planning to use to keep the CPU heatsink from ingesting heated GPU exhaust.
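If you want to sanity-check that under load yourself, the driver will report what the card is drawing; below is a quick sketch using the nvidia-ml-py Python bindings (an extra package you'd need to install) - pair it with an IR thermometer on the connector while a stress test runs in another window:

```python
# Log the GPU's reported board power once a second while a stress test runs.
# Requires the nvidia-ml-py package (imported as pynvml).
import time

import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
try:
    for _ in range(60):
        watts = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000  # API reports mW
        print(f"{watts:6.1f} W")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```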

At stock limits, performance per watt is going to increase significantly with Ada and RDNA3. Stock limits are invariably way past the point of peak efficiency, so a modest undervolt and/or underclock will reduce power far faster than it reduces performance. You can set whatever limit you like and expect 50-100% more performance at that same power limit than one would get from a prior generation part.
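On the Nvidia side you don't even need a tuning tool to experiment with this; the driver exposes the board power limit directly. A minimal sketch, again assuming the nvidia-ml-py bindings (the 300 W cap is just an illustration, and lowering the limit needs admin/root rights and a card that permits it):

```python
# Read the current/default board power limits, then cap the card at 300 W.
# NVML reports and accepts values in milliwatts.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

current = pynvml.nvmlDeviceGetPowerManagementLimit(gpu)
default = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(gpu)
print(f"limit {current / 1000:.0f} W (default {default / 1000:.0f} W)")

pynvml.nvmlDeviceSetPowerManagementLimit(gpu, 300_000)  # 300 W cap
pynvml.nvmlShutdown()
```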

CPU side is only slightly different. For example, the Ryzen 7000 series are allowed to pull well in excess of 200 W to ensure they are all-round competitive with Intel's 12th gen and AMD's own 5800X3D, but the new manufacturing process and modest architectural changes still result in a fundamentally more efficient chip. This is clearly evident when capping power; at 65 W (about 30% of normal limits) the 7000 series CPUs reach 70-85% of full performance, and the performance advantage they have over 65 W Zen 3 parts (relative to uncapped ones) increases significantly. However, I still wouldn't recommend the Ryzen 7000 series parts for a pure gaming build until the v-cache models show up (and would see how those fare against Raptor Lake); they don't offer enough here relative to either the 5800X3D or Alder Lake.
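Back-of-the-envelope with the figures above (the ~230 W stock package power is my assumption for a 7950X's PPT, and 80% is just the midpoint of that 70-85% range):

```python
# Perf-per-watt gain from capping a Ryzen 7000 part at 65 W, using the rough
# numbers above. The ~230 W stock package power is an assumption (7950X PPT).
stock_w, eco_w = 230, 65
perf_at_eco = 0.80                      # midpoint of the 70-85% range quoted

gain = (perf_at_eco / eco_w) / (1.0 / stock_w)
print(f"~{gain:.1f}x the performance per watt of stock")   # ~2.8x
```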

Between AMD and NVIDIA on the GPU side, I'm leaning toward NVIDIA, mostly because I'll be able to get one sooner. At the top end I expect the 4090 to be slightly faster in rasterization and still significantly faster in RT (though the exact degree of this is still highly speculative) than the top RX 7000 series launch part. NVIDIA also has a few new features that are tempting. On the other hand, the lack of DisplayPort 2.0 is a glaring omission for NVIDIA. Yes, DSC is 'visually lossless', but NVIDIA GPUs have numerous caveats when using it. I'm not sure if these will still apply to Lovelace, but if you're using DSC, you can only attach two monitors to current NVIDIA cards, and using DSC knocks out all forms of DSR. Since I expect to soon have displays (240 Hz+ 4k) that cannot do native resolution and refresh rate via DP 1.4a or HDMI 2.1 without DSC, the lack of these limitations on AMD's upcoming GPUs is making RDNA3 very tempting.
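The link-budget math behind that is easy to check (counting active pixels only - real links also need blanking overhead, so the shortfall is actually a bit worse; 10-bit colour is my assumption for HDR):

```python
# Uncompressed bandwidth for 4k 240 Hz 10-bit vs. what the links can carry.
w, h, hz, bpc = 3840, 2160, 240, 10
raw_gbps = w * h * hz * bpc * 3 / 1e9       # 3 colour channels

dp14_payload = 25.92                        # DP 1.4a HBR3 after 8b/10b coding
hdmi21_payload = 42.67                      # HDMI 2.1 FRL after 16b/18b coding
dp20_payload = 77.37                        # DP 2.0 UHBR20

print(f"signal: ~{raw_gbps:.1f} Gbit/s")    # ~59.7 Gbit/s
print(f"DP 1.4a: {dp14_payload}, HDMI 2.1: {hdmi21_payload}, DP 2.0: {dp20_payload}")
# Neither current link fits it uncompressed, hence DSC - or DP 2.0.
```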

With regard to AMD's drivers, I'm not concerned about their quality in general. If anything, I've found AMD easier to work with in this regard for the past few years (the third party tuning tools are actually better and the parts less locked down). However, Elite: Dangerous 4.x is an outlier application that has major problems with recent AMD GPUs and drivers (even before the Orange Sidewinder problems, the line AA issue was, and is, annoying), and I am far from certain of an AMD-side fix to these.

Nvidia's 4000 series is, in the words of Steve from Gamers Nexus, currently a "clown car" of a GPU.

This was more in reference to the absurd marketing and naming conventions of NVIDIA and their AIBs than to the technical aspects of the hardware.

Even the 3090 is not worth it for the power usage. But some 'gamers' just need 300 fps.

as long as you don't need or want the latest and greatest

An RTX 3090 can't consistently deliver 60+ fps (in purely GPU limited scenarios) at some surface settlements in Odyssey with the settings I currently use...settings that still don't adequately address the aliasing issues the game has. I'm not even sure an RTX 4090 will be fast enough to do what I want to do with Odyssey. That said, I'm not buying a GPU for Odyssey specifically.

I have several other games, most of which are not new, that do not run to my satisfaction (and I'm targeting ~70 fps P1% frame rates) on any extant GPU.
 
My 6800 cost me €650 and it's good enough,
and the power it needs is very low.
Really, we don't need those xx90 graphics cards at all.

Same thing with my 5700X:
it's good enough, power usage is low, and it runs cool.
Absolutely no need for these xxxx3D CPUs.
 
Yup. I'll be looking for a 5800X3D after Christmas for myself - the vanilla 5800X I have right now is just fine, but I want to put together a PC for my son who's arriving in the UK around March. I already have the GPU, so I only need an AM4 motherboard and RAM, both of which should be pretty cheap then too.
Be aware though that in every other task that is not gaming, your 5800X3D will be slower than your current 5800X.
 
How do y'all find out the news about this stuff? When I try to read the Reddit page for either company I get thoroughly confused.

I did get the 5800X3D even though I can't figure out from the specs why it's better. RIP my wallet.

One thing I noticed: I have a 3090, which has something like 12 GB of RAM, but fpsVR shows it's not using all of it. So what is the point of having all that RAM then?
 
Nvidia are real clowns with their naming conventions.
See the RTX 3080 vs the RTX 3080 (Laptop)... there is definitely a need to get back to the old XXXXm notation for mobile parts, or at least to be fair enough not to market a laptop card that is slower than a desktop RTX 3070 as an 'RTX 3080 (Laptop)'.

And they push this farce even further, into the desktop domain, with the RTX 4080 12GB and RTX 4080 16GB (both are named 4080, but the differences are not only in memory size but also in the number of cores, memory bandwidth, and TDP).
 
How can you tell which one would play ED better? Is it mostly by cores / GHz / bandwidth?
 
Be aware though that in every other task that is not gaming, your 5800X3D will be slower than your current 5800X.

Well, archival, virtualization, and a few other things do pretty well on the 5800X3D too. Anyway, if one is focused enough on gaming performance to consider the 5800X3D and not enough on general-use performance to just get a 5900X, they aren't going to miss the 5800X's extra ~300 MHz.

I did get the 5800X3D even though I can't figure out from the specs why it's better. RIP my wallet.

The 5800X3D has three times the L3 cache of the standard model, but is clocked slower.

It's a much better gaming CPU (provided you aren't simply GPU limited) and situationally better at a few other tasks, but most certainly not universally better. Indeed, it's worse in those apps that don't care about memory subsystem performance.
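For an intuition of why that cache matters, here's a toy pointer-chase sketch: once a working set outgrows the cache, every dependent access goes to slower memory. Python's interpreter overhead blunts the effect, but the trend should still show; the sizes are illustrative, not tuned to any particular CPU:

```python
# Toy pointer-chase: once the working set outgrows the cache, the time per
# dependent access jumps. Sattolo's algorithm makes one big cycle so the
# chase really walks the whole array.
import random
import time
from array import array


def single_cycle(n):
    a = list(range(n))
    for i in range(n - 1, 0, -1):
        j = random.randrange(i)              # j < i guarantees a single cycle
        a[i], a[j] = a[j], a[i]
    return a


for size_kib in (256, 4 * 1024, 32 * 1024, 128 * 1024):
    n = size_kib * 1024 // 8                 # 8-byte slots in a flat array
    chase = array("q", single_cycle(n))
    idx, steps = 0, 2_000_000
    start = time.perf_counter()
    for _ in range(steps):
        idx = chase[idx]
    elapsed = time.perf_counter() - start
    print(f"{size_kib:>7} KiB working set: {elapsed / steps * 1e9:5.1f} ns/access")
```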

One thing I noticed: I have a 3090, which has something like 12 GB of RAM, but fpsVR shows it's not using all of it. So what is the point of having all that RAM then?

24GiB. Some small portion of people will use it, but mostly it's marketing.
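If you'd rather read the real number than trust an overlay, the driver reports it directly; a minimal sketch with the nvidia-ml-py bindings:

```python
# Query total and currently-used VRAM straight from the driver (bytes).
import pynvml

pynvml.nvmlInit()
mem = pynvml.nvmlDeviceGetMemoryInfo(pynvml.nvmlDeviceGetHandleByIndex(0))
print(f"total {mem.total / 2**30:.0f} GiB, used {mem.used / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```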
 
How can you tell which one would play ED better? Is it mostly by cores / GHz / bandwidth?

The more powerful one will play ED better, and the 16GB version seems to offer about 20% more performance...
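One crude way to quantify that from the announced spec sheets (raw FP32 throughput ignores the memory bus difference between the two, so treat it as a ceiling rather than a game benchmark):

```python
# Naive FP32 throughput estimate for the two '4080' SKUs from announced
# specs: CUDA cores x boost clock x 2 FLOPs per core per cycle (FMA).
cards = {
    "RTX 4080 16GB": (9728, 2.51),          # cores, boost GHz
    "RTX 4080 12GB": (7680, 2.61),
}
tflops = {name: c * ghz * 2 / 1000 for name, (c, ghz) in cards.items()}
for name, tf in tflops.items():
    print(f"{name}: ~{tf:.1f} TFLOPS")      # ~48.8 vs ~40.1
ratio = tflops["RTX 4080 16GB"] / tflops["RTX 4080 12GB"] - 1
print(f"16GB part: ~{ratio:.0%} more raw shader throughput")
```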


 