Latest Nvidia Ampere Rumours

Too bad, there were interesting offerings before this news, and I was planning on updating my 6 GB GTX 1060.
It's first gen, and even at 1080p it's starting to show its age.
I won't use AMD; my experience with Linux has always been better when using Nvidia.

You can go for one of the RTX 20XX Super cards; even the RTX 2060 Super comes with 8 GB of VRAM and is a huge performance increase over the 1060, and for just a bit more you can get an RTX 2070 Super. They both support ray tracing, and you likely won't need to upgrade your PSU. They currently cost between $400 and $500, but they should drop a bit in the near future.

For 1080p, unless you want to use VR, these cards will run everything available now, and in the near future, at top settings.
 
Another bad business move, then. Offer it just to look good and then revoke it? Or was this just an unconfirmed rumor?

NVIDIA never officially announced anything, and I'm not even certain if the rumor was ever cited as a first-party NVIDIA thing, or just an AIB thing. I don't think NVIDIA could have seriously considered a 3080 FE with 20GiB of memory, at least not with current GDDR6X dies... it never would have passed validation without a reduction to memory clocks that would have made it measurably slower than the less expensive 3080 10GiB in almost everything. Also, neither the reference nor the FE 3080 PCBs look like they were intended for memory on the back of the boards.

I suppose NVIDIA could have been banking on higher-density GDDR6X ICs from Micron, and yields of those parts are the issue, but again, I haven't seen anything official from NVIDIA that says they were planning on any first-party release of a 20GiB 3080.

If they did have plans for larger memory variants, they are probably saving those until they can get a more substantial refresh out. There is so little headroom on current parts that they may need to wait for TSMC GA102s, or until they can stockpile enough of the top bins of the current Samsung parts, to get a performance increase worthy of a 3080 'Super'.
 
An interesting decision, if true. They must have a lot of partially functioning GA102 parts and/or perceive a major threat to the GA104 3070 from AMD. If it's only missing TMUs and CUDA cores vs. the 3080 it could be a real lower-high end value contender.

On the AMD side, some more Navi 21 leaks, specifically of the 6800 XT which seems to be the second fastest SKU:

My primary takeaway from these figures is that the AMD part is much less power limited (Time Spy, and most especially Fire Strike Extreme, will be severely power constrained on most high-end parts), but it isn't likely to handle ray tracing as well as Ampere.

The 6900 XT could actually be an all-round 3080 competitor, which is very exciting.
 

Robert Maynard

Volunteer Moderator
An interesting decision, if true. They must have a lot of partially functioning GA102 parts and/or perceive a major threat to the GA104 3070 from AMD. If it's only missing TMUs and CUDA cores vs. the 3080 it could be a real lower-high end value contender.
Interesting indeed.
On the AMD side, some more Navi 21 leaks, specifically of the 6800 XT which seems to be the second fastest SKU:

My primary takeaway from these figures is that the AMD part is much less power limited (Time Spy, and most especially Fire Strike Extreme, will be severely power constrained on most high-end parts), but it isn't likely to handle ray tracing as well as Ampere.

The 6900 XT could actually be an all-round 3080 competitor, which is very exciting.
Which, given that we still appear to be in the early adoption phase of RT in games, makes RDNA2 seem to be an attractive option, depending on pricing.

... but, as RT console games will need to be optimised for the AMD version of RT, we should see performance similar to consoles (for those discrete GPUs with similar clocks / CUs to the console APUs), I would hope.
 
Puget Systems built a quad GIGABYTE Turbo RTX 3090 setup on a base of a ten-core Xeon W-2255 with 128 GB of DDR4-3200 ECC Registered memory.


[Image: qud-rtx3090-puget.jpg]


 
New MSI AB beta fixes the issues with setting custom fan curves:

Beta 2 snapped the fan targets to lower and much less granular PWM settings than expected, which limited its usefulness. Beta 3 seems to work as expected.
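For anyone curious what "granularity" means here, a quick Python sketch of how a custom fan curve gets applied: linear interpolation between the user's (temperature, PWM %) points, with the result snapped to the controller's PWM step size. All of the numbers and the step behaviour below are hypothetical, not MSI Afterburner's actual internals.

```python
# Hypothetical fan curve: (GPU temperature in C, fan PWM duty in %)
FAN_CURVE = [(30, 20), (50, 35), (70, 60), (85, 100)]

def pwm_for_temp(temp_c, curve=FAN_CURVE, step=1):
    """Interpolate fan PWM % for a temperature, snapped to `step` % increments."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, p0), (t1, p1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            frac = (temp_c - t0) / (t1 - t0)
            raw = p0 + frac * (p1 - p0)
            # Coarse snapping (a large `step`) is what makes a curve
            # feel "less granular than expected".
            return round(raw / step) * step

print(pwm_for_temp(60, step=1))   # fine-grained snapping
print(pwm_for_temp(60, step=10))  # coarse snapping loses the in-between targets
```

With a coarse step the same 60 C point lands on a noticeably different duty cycle than the curve actually requests, which is the kind of behaviour the beta apparently fixed.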
 
3070 reviews are up everywhere. Performance is pretty much exactly as expected, with it trading blows with the 2080 Ti. It's also considerably quieter than the 2080 Ti FE and, due to a less crowded PCB, doesn't appear to have the memory cooling issues of the 3080 FE. Cooler is also less annoying to take apart.

Power efficiency is not especially good. It matches or exceeds most previous architectures in performance per watt, but falls behind the GA102 parts; likely due to the use of older GDDR6 (lower bandwidth per watt), and a smaller GPU (meaning proportionally more support hardware vs. processing hardware) of the same architecture on the same process.
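The efficiency comparison is just performance divided by board power; a back-of-the-envelope sketch in Python, where the relative performance scores and wattages are made-up placeholders (not measured figures) chosen only to illustrate the arithmetic behind the claim:

```python
def perf_per_watt(relative_perf, board_power_w):
    """Simple efficiency metric: relative performance per watt of board power."""
    return relative_perf / board_power_w

# Hypothetical numbers for illustration only.
ga104_3070 = perf_per_watt(100, 240)  # smaller die, GDDR6
ga102_3080 = perf_per_watt(135, 320)  # bigger die, GDDR6X

print(f"3070: {ga104_3070:.3f} perf/W, 3080: {ga102_3080:.3f} perf/W")
```

The point is that a higher absolute board power can still come out ahead on this metric if the performance scales faster than the power does.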

In a vacuum it would be easy to recommend, but I think it will be in an odd position vs. Navi 21 and its GA102 siblings. GA102 just has so many more functional units and so much more bandwidth that the gulf between the 3070 and the 3080 (and probably even the upcoming 3070 Ti) is enormous, and the GA102s also have more upside if one is willing to pay for a model with a higher power limit and put it under water. The Radeon RX 6800 and 6800 XT will likely beat it (except in ray tracing), possibly by significant margins, and at least one of them should undercut its price (we'll find out more tomorrow).
 
For me all this is vaporware with Mickey Mouse prices. At this point it's ridiculous that all the youtubers are testing all sorts of versions, with more coming up (3080 Ti, etc.) in the news, yet it's impossible to buy any of them without paying twice the price and being committed like hell.
 

Robert Maynard

Volunteer Moderator
Watching the RDNA2 stream - the 6800 XT looks like quite the GPU. :)

6800 XT: 72 CU; 300 W; $649; November 18th.

6800: 60 CU; 250 W; $579; November 18th.

6900 XT: 80 CU; 300 W; $999; December 8th.

All have 128 MB of Infinity Cache.

The RDNA GPU performance quoted in the 5000 series CPU launch was confirmed to be a 6800XT.
 
AMD really hit the mark. That 6800 XT looks pretty attractive. Actually, all three did, but I think the 6800 XT is where it's at. I was all ready to jump back over to Nvidia with the 3080, but I'm not too sure now... plus you can't get one. Drivers and availability will be the big question marks. I got my 5700 XT at launch, and I think it took six months to get good, reliable drivers.
 
AMD's newer Ryzen CPUs and their newer GPUs are looking like a more and more attractive combo.

It will still take a while, but the next time I need a major system upgrade I might be tempted to ditch Intel and Nvidia again, and build an AMD CPU and GPU system.
 
AMD's newer Ryzen CPUs and their newer GPUs are looking like a more and more attractive combo.

It will still take a while, but the next time I need a major system upgrade I might be tempted to ditch Intel and Nvidia again, and build an AMD CPU and GPU system.
The performance/price ratio is undoubtedly unequalled.

 
AMD's newer Ryzen CPUs and their newer GPUs are looking like a more and more attractive combo.

It will still take a while, but the next time I need a major system upgrade I might be tempted to ditch Intel and Nvidia again, and build an AMD CPU and GPU system.
I have always had Intel CPUs but have flipped between AMD and NVIDIA (as well as Matrox and Voodoo) when it comes to GPUs, but it does indeed look like my next build will be AMD only.

Just need the reviews to be out.
 
6800XT is fairly impressive, if slightly more expensive than I'd have hoped. Almost certainly the same PCB and cooler as the 6900XT and may have similar headroom if it hasn't been binned for efficiency as hard (higher leakage ICs also have a tendency to clock better, if you can cool them).

That "+Rage Mode +Smart Access Memory" footnote on the 6900XT performance figures is noteworthy and not in a positive way. They turned on their auto overclocking feature to get it to trade blows with the 3090. Since a 3080 can trade blows with a stock 3090 if you tune it properly, this is somewhat misleading. The 6900XT and 3090 are just wastes of money, but if money were no object, I am nearly positive I'd rather have the 3090, or even a 3080 and a water loop just for it, than the 6900XT.

The plain 6800 is slightly faster than the 3070 for a higher price at the same board power. Since it's the same Navi21 as the others, there is a fair chance it will OC well, if that power limit is loosened a bit (assuming some of those Navi21s were assigned to the 6800 because of defective functional units rather than simply being bad bins), but with it only being $70 less than the 6800XT it seems poorly positioned.

Personally, I'm going to try to get a 6800XT. Perhaps if the 6800 is the only thing readily available, and I can get it with a favorable return policy, I'll grab one to see if it can be clocked to levels similar to the higher parts. Failing that, I'll probably wait for the rumored GA102 3070 Ti, or see if I can snag a more mod-friendly 3080 than the FE. After some first-hand experience with GA102, I really don't want a 3080 in my SFF system, but I know it would be a monster with unconstrained power and cooling, so spending $300 on top of one to get it where I want it could be an option (and I would much rather have a 450 W watercooled 3080 than anything else I could get for $1000), if all else fails.
 
If this wasn't a pandemic I'd bin my own 6800 at Best Buy (I went through six X800 Pros back in the day before I got a sample I liked).

Wow, those prices!

The new normal I guess. Any reviews out yet?

Official reviews probably won't be out until launch, or a few days before. Need to see where the power limiters are and what can be done with soft powerplay tables before I commit to one.

Prices are always dictated by the competition. I think the 6800 and 6800XT would probably look more competitive with $50 knocked off, but if they prove to be all-round faster than their direct price competitors from NVIDIA, they will sell. If they don't, AMD could just do a last-minute price cut, like they did last time.
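To make the "$50 knocked off" idea concrete, a quick dollars-per-performance sketch in Python. The relative performance numbers here are placeholders, not benchmark results; only the arithmetic is the point.

```python
def dollars_per_point(price_usd, relative_perf):
    """Price divided by a relative performance score: lower is better value."""
    return price_usd / relative_perf

# Hypothetical figures for illustration only.
before = dollars_per_point(649, 100)  # card at launch price
after  = dollars_per_point(599, 100)  # same card with $50 knocked off
rival  = dollars_per_point(699, 103)  # a slightly faster, pricier competitor

print(f"{before:.2f} -> {after:.2f} $/perf point (rival: {rival:.2f})")
```

A $50 cut moves the value metric meaningfully even when raw performance doesn't change, which is why last-minute price adjustments work.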
 
I think what it boils down to is that, for most use cases, raw power is at the point of diminishing returns now, so a few percent here and there won't mean much. All tests are showing ridiculous FPS.

So two things matter IMHO:
  • Who can provide decent availability
  • Who can show better exclusive games with real benefits for their fancy new techs
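The diminishing returns are easy to see if you look at frame times instead of FPS; a small Python sketch:

```python
def frame_time_ms(fps):
    """Milliseconds spent rendering one frame at a given FPS."""
    return 1000.0 / fps

# The same doubling of FPS saves less latency the higher you start.
for lo, hi in [(60, 120), (120, 240), (240, 360)]:
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo:3d} -> {hi:3d} FPS saves {saved:.2f} ms per frame")
```

Going from 60 to 120 FPS saves over 8 ms per frame, but each jump beyond that saves progressively less, which is why a few percent between cards barely registers at these frame rates.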
 
Hmmm... if you are after ray tracing, this article suggests that a 30x0 is a better choice than a 6x00, although it doesn't say which 6x00 card was tested.

edit: Which ties in with the Port Royal synthetic benchmark leaks.

 