To me, the most telling thing was AMD's coyness about showing ray-tracing images or stats in their demos. I'm waiting for third-party evaluation of their ray-tracing performance because this makes me unsure it will be as good as NVIDIA's.
RDNA2's ray tracing performance almost certainly won't be as good as Ampere's. I expect NVIDIA's entire product stack to hold a commanding lead here, dollar for dollar.
> So ray tracing isn't just on/off, there are settings in between? It could give Turing cards a new lease of life...

It can be used to whatever extent developers like, and most games featuring ray tracing have quality levels for it.
> People want to game with these cards. They do not want extra heaters in the house, or to go on a mod spree.

People want to do a lot of things with these cards.
> The race is still being run, and you are declaring the winner over and over.

Is this a problem?
> You comment only on what suits you. How convenient of you not to react to Leo's video...

If you want me to comment on something, you'll have to provide a link. I can't be expected to be aware of every video every editor of every tech site produces.
> Picking apart only statements that you can attack.

Why would I pick apart something I agree with?
> You are way too biased.

This comes off as more than a little hypocritical.
> Certainly in reacting to other people's posts and needing "to correct others so you are the only one giving the correct answer".

I've made no claims of being the only one with correct answers, nor that I'm infallible. I think I usually do a pretty good job assessing the information on hand, but that information is often in flux, and even when it's not, my interpretations are not always correct... It's still more informed, and ultimately more accurate, than quite a bit of the content pushed by people making a buck off YouTube.
> You can speculate and make what-if statements, but others cannot?

I will freely speculate about what I suspect the implications of some facet of a product are, while also challenging assertions that one can divine everything there is to know about the underlying hardware from the arbitrary specifications imposed upon the products built from it.
I've decided to wait until the dust settles. Again, only Flight Sim VR could trigger a real need for a new GPU, but that game needs another year of refinement to exit its de facto beta state. Besides, loads of other expenses are piling up these days.

Some more recent rumors on possible counters to the RX 6000 lineup from NVIDIA:
Kopite7kimi: NVIDIA GeForce RTX 3080 Ti rumored to feature 10496 CUDA cores and 20GB GDDR6X memory - VideoCardz.com
Looks like the previously canceled 3080 20GiB could be replaced by a 3080 Ti with 20GiB of memory and more SMs. This would make sense as a 3080 with double the memory would increase cost, but not make the part look more favorable vs. the 6800XT or 6900XT in most benchmarks. An SKU with increased SMs would likely allow it to convincingly beat the 6800XT and compete directly with the 6900XT...at the cost of cannibalizing some of the 3090's market. Will be interesting to see how that pans out.
It also appears that all the previously rumored 3070 Ti SKUs have been canceled. This is unfortunate, as that part looked like it would have been better balanced than the 3080 and faster than the 6800, yet likely appreciably cheaper than the 6800XT.
Anyway, we're two weeks away from 6800 and 6800XT availability. I don't think I'm going to wait for non-reference models or even comprehensive reviews...though I'll certainly read them as they become available. I've decided I'll try to get a 6800XT out of the gate for the following reasons:
- The 3080 FE is the only 3080 I am likely to be able to get my hands on that will fit in my intended system, and from first-hand experience I do not want the FE cooler (memory/VRM temperatures are too high).
- A GA102 3070 Ti is either not being released or has been pushed back to a far later date. My system will be ready for a new card very soon.
- Most of the games I expect to play aren't likely to be very ray tracing heavy, and some are more fill rate than shader dependent, so the 6800XT will probably match or beat the 3080 FE anyway.
- I expect Navi21 supply to be tight... not as bad as Ampere, but nowhere near enough to meet demand.
- Neither the 6800 nor the 6900XT are contenders on the same level of value as the 6800XT. The 6800 is overly cut down for its price, while there is even less of a performance difference between the 6800XT and the 6900XT than there is between the 3080 and 3090. The 6900XT may well have a higher peak power limit, or some advantage in binning, but I'm sure not going to pay 350 dollars more for it.
- AMD has a long-running trend of solid reference PCBs (the HD 5850 was the last time I felt a reference card was a major negative), and the reference 6800XT cooler is unlikely to be deficient.
- Non-reference Navi21 parts are still a ways off.
Obviously, we'll have to wait for reviews to see where everything lands, but with what is known at this point I'd bet on:
- The 6800XT is going to trade blows with the 3080, while the 6900XT will prove to be between the 3080 and 3090, overall.
- All of these parts are going to be power limited in most GPU-limited scenarios. Thus, out-of-the-box power consumption will typically be pegged at TDP, and power efficiency will be slightly in favor of AMD, again out of the box.
- Ray tracing will be a particular weak point for RDNA2, relative to Ampere, but not by enough to make or break performance in most titles using ray tracing. It will skew the averages of test lineups that include them in favor of NVIDIA, however.
- The differences between the architectures and SKUs will result in a new counter-optimization war between AMD and NVIDIA with AMD focused titles wasting VRAM to make the 3080 and 3070 look bad, while NVIDIA backed titles will have wasteful levels of RT and excessive shader effects to make RDNA2 look bad. People will quickly realize that using the next lower texture setting or ray tracing quality barely changes IQ, but fixes all performance issues on opposing hardware.
Disclaimer: Yes, I am speculating about unreleased hardware in the thread for speculative rumors about unreleased hardware.
> VRAM cooling: hasn't NVIDIA fixed it with extra padding already?

Not all 3080 FE samples sold got the extra pad, and the 5-10°C that pad saves still leaves very little thermal margin if you are hitting the power limit of the board and don't want the fan to sound like a tornado.
> And here's me replaying Black Mesa right now. So I could run that on virtually anything.

Doesn't sound like you have any reason to be in a rush for an upgrade, but from the currently expected specs, the RTX 3050 Ti looks like it would be roughly comparable to an RTX 2070/2080.
Joking aside, I'm keen to see what the RTX 3050 Ti is like. I think I might upgrade at that point, but as I don't feel the need to game above 75Hz at 1080p (and my monitor can't do that anyway), I don't know what value I would extract from a high-end GPU. Even the difference between medium and ultra settings is pretty marginal for me these days.
> (that huge LLC on the Navi21 will probably let my 6800XT pay for itself this winter)

One of the reasons I've been looking at RDNA2 is the mining potential. Most profitable GPU hashing algorithms that haven't been taken over by dedicated mining ASICs are memory-hard, specifically to keep them profitable on consumer GPU hardware. The large cache on Navi21 would naturally benefit hashing performance in such algorithms.
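To illustrate what "memory-hard" means here, below is a toy sketch in Python. It is not any real mining algorithm (Ethash etc. are far more involved); the dataset size, constants, and function names are all invented for demonstration. The point is that each read address depends on the previous mix value, so hashing throughput is bound by memory latency rather than raw ALU speed, which is why a large last-level cache like Navi21's rumored 128MiB LLC could help.

```python
import hashlib

DATASET_SIZE = 1 << 16   # toy table of 32-bit words; real DAGs are GiB-scale
ROUNDS = 64              # pseudo-random dataset reads performed per hash

def make_dataset(seed: bytes, n: int = DATASET_SIZE) -> list[int]:
    """Deterministically expand a small seed into a large lookup table."""
    out, h = [], seed
    while len(out) < n:
        h = hashlib.sha256(h).digest()                   # chain the hash
        out.extend(int.from_bytes(h[i:i + 4], "little")  # 8 words per digest
                   for i in range(0, 32, 4))
    return out[:n]

def memory_hard_hash(dataset: list[int], nonce: int) -> int:
    """Mix a nonce against the dataset using data-dependent reads."""
    mix = nonce & 0xFFFFFFFF
    for _ in range(ROUNDS):
        # The next read address depends on the previous mix value, so the
        # reads serialize on memory latency and cannot be precomputed.
        idx = mix % len(dataset)
        mix = (mix * 33 + dataset[idx]) & 0xFFFFFFFF
    return mix

dataset = make_dataset(b"example-seed")
print(hex(memory_hard_hash(dataset, nonce=12345)))
```

A GPU runs thousands of these loops in parallel across different nonces, but each loop still stalls on its dataset reads; the more of the dataset a cache can hold, the fewer of those stalls go all the way to VRAM.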
> Most GPU miners have probably been anticipating this from the moment that 128MiB Infinity Cache was rumored. Unfortunately, this also reduces the prospects for a more ready supply of these parts, especially early on.

That’s grim news; after being frustrated when trying to get a 3080, it sounds like my shift target will suffer the same fate.
> That’s grim news; after being frustrated when trying to get a 3080, it sounds like my shift target will suffer the same fate. I’m really hoping to upgrade my 1080 prior to Odyssey launching, but it’s starting to look less and less likely.

However, your 1080 should be more than enough to play Odyssey.
> I've been looking at watercooling options for my brother's 3080 FE. Probably leaning toward a full-cover block setup despite the cost, for less complexity and ease of restoring it to the default configuration (he wants to keep the warranty, so epoxying heatsinks all over the board is right out).

Here's another one (or two): https://www.alphacool.com/search?sSearch=rtx+3080&p=1