Latest Nvidia Ampere Rumours

After seeing the size of the 3090, I don't think they're gonna beat its raw power. That heatsink is enough to warm a small apartment.
I do think they're going to be very competitive though, and more efficient.

The heatsink looks like it's geared for silence and aesthetics first. No doubt the 3090 will consume a lot of power, but I'm doubtful it will dramatically exceed prior top-end parts in this regard.

I would also be surprised if the top RDNA2 part was more efficient, in terms of performance per watt.

A few more days to go to see if this is true. Looks like the 3080 and 2080 Ti are pretty close to equal based on these specs. If it's in the $800 range then I'll be interested. Remains to be seen what effect AMD will have on NVIDIA pricing... if any.

https://videocardz.com/newz/nvidia-geforce-rtx-3090-and-geforce-rtx-3080-specifications-leaked

https://www.techpowerup.com/gpu-specs/geforce-rtx-2080-ti.c3305

I'd expect architectural improvements to give the 3080 an appreciable edge over the 2080 Ti, especially if the 3090 (which should only be 15-20% faster than the 3080, given the number of functional units and relative clocks) is anywhere near the kind of improvement over the 2080 Ti that has previously been rumored.

Basically, if these rumored specifications are largely accurate, and the 3080 is still only as fast as the 2080 Ti (meaning either a negligible increase in performance per clock cycle or little increase in clock frequency), then the 3090 is only going to be ~20% faster than the 2080 Ti, which would be quite disappointing.
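For what it's worth, the ~20% figure falls straight out of shader count times clock. A minimal C++ sketch of that arithmetic, using the numbers from the leak linked above (which could easily be wrong, and which deliberately ignores IPC changes, memory bandwidth and power limits):

```cpp
// Rough scaling estimate: performance assumed proportional to shader count x clock.
// Figures are taken from the leak linked above; treat them as placeholders.
#include <cstdio>

int main() {
    const double cores_3090 = 5248, clock_3090 = 1.695; // GHz boost (leaked)
    const double cores_3080 = 4352, clock_3080 = 1.710; // GHz boost (leaked)

    const double ratio = (cores_3090 * clock_3090) / (cores_3080 * clock_3080);
    std::printf("3090 vs 3080 (units x clock): %+.1f%%\n", (ratio - 1.0) * 100.0);
    return 0;
}
// Prints roughly +19-20%, which is where the "only ~20% faster than a 2080 Ti"
// worry comes from if the 3080 merely matches the 2080 Ti.
```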
 
After seeing the size of the 3090, I don't think they're gonna beat its raw power. That heatsink is enough to warm a small apartment.
I do think they're going to be very competitive though, and more efficient.
Definitely a great investment for the coming winter and also a double blow.

 
Definitely a great investment for the coming winter and also a double blow.

Hehe. Definitely, especially here above the arctic circle. I've been running a small GPU mining rig for a few years now. It helps with heating year round ;)

The heatsink looks like it's geared for silence and aesthetics first. No doubt the 3090 will consume a lot of power, but I'm doubtful it will dramatically exceed prior top-end parts in this regard.

I would also be surprised if the top RDNA2 part was more efficient, in terms of performance per watt.


edit: Never mind, I hadn't seen the new leaked specs linked above.
The rumour is a 350-400 watt TDP.

 
If this article is to be believed, the Zotac 3090 uses 3 x 8-pin connectors. I wonder if other AIB vendors will do the same.

edit:

The power draw of the 3090 is apparently rated at 350 W, which is 40% more than the 2080 Ti's rated 250 W. I wonder how that will translate into an increase in performance?
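A quick illustration of what that extra 40% could mean for efficiency; the only number taken from the rumours is the 350 W rating, the performance uplifts are purely hypothetical placeholders:

```cpp
// Perf-per-watt change for hypothetical uplifts at the rated board powers.
#include <cstdio>
#include <initializer_list>

int main() {
    const double tdp_2080ti = 250.0, tdp_3090 = 350.0;   // rated board power, watts
    const double power_ratio = tdp_3090 / tdp_2080ti;    // = 1.40, i.e. +40%

    // Hypothetical performance uplifts, just to show where break-even sits.
    for (double uplift : {1.20, 1.40, 1.60, 1.80}) {
        std::printf("+%2.0f%% perf at +40%% power -> perf/W %+5.1f%%\n",
                    (uplift - 1.0) * 100.0, (uplift / power_ratio - 1.0) * 100.0);
    }
    return 0;
}
```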


 
I personally won't be rushing to buy a new Ampere card... That doesn't mean I won't end up with one, but I really am going to have to see what Big Navi looks like.
 
The power draw of the 3090 is apparently rated at 350 W, which is 40% more than the 2080 Ti's rated 250 W. I wonder how that will translate into an increase in performance?

Increasing the power limits will significantly increase performance in and of itself in many titles and benchmarks.

Contrary to popular belief, TDP and estimated board power figures are only loosely correlated with the actual power consumption possible. In reality it's a cap to make sure a part, at least at stock, stays within power delivery standards and is able to be cooled at an acceptable noise level. It also reduces failure/return rates and allows cheaper components to be used than if power draw was unconstrained.

Many high-end GPUs have been able to reach ~400 W actual power consumption/heat dissipation for the last decade or so (the first GPU I cracked 400 W with, measured with a clamp ammeter at the PCI-E power cables, was a water-cooled GTX 480 that I had 10 years ago), though more recent GPUs usually need various current limiters to be loosened or disabled (via software, firmware, or hardware) to reach such levels.

Anyway, the real test, for me, will be what the parts can do uncapped, or at least at the maximum software-allowable board power. My better 1080 Ti allows for +50% on its power limit without any firmware or hardware mods, and will match a totally stock 2080 Ti in a few tests... which is highly misleading, because a 2080 Ti with the same power limit is a solid 30%+ faster in those same tests.
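For context, the clamp-ammeter estimate is just P = V x I per cable, plus whatever comes through the slot. A quick sketch with made-up readings (the per-cable currents and slot share below are illustrative, not measurements):

```cpp
// Estimating board power from clamp-ammeter readings on the 12 V PCI-E cables.
// Cable currents and the assumed slot contribution are invented for illustration.
#include <cstdio>

int main() {
    const double rail_voltage = 12.0;          // volts on the PCI-E power cables
    const double cable_amps[] = {14.5, 15.2};  // hypothetical reading per 8-pin cable
    const double slot_watts   = 60.0;          // assume most of the slot's 75 W budget

    double total = slot_watts;
    for (double amps : cable_amps)
        total += amps * rail_voltage;          // P = V * I per cable

    std::printf("Estimated board power: ~%.0f W\n", total);
    return 0;
}
// With these placeholder readings the estimate lands around ~415 W.
```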

And also an increase in the electricity bill.


If you let the hardware stretch its legs, sure, but newer architectures are almost invariably more power efficient than older ones, and with the same cap will perform much better.

At the same clock speeds and voltages, more effective cooling also reduces power consumption by lowering internal resistance as well as reducing parasitic capacitance and inductance. It also allows less frequent refreshes of DRAM (a significant contributor to DRAM power consumption) for parts that have temperature-based or retention/content-aware refreshes... though the latter is new and I haven't seen it in any consumer applications yet.
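The usual way to picture this is switching power (which scales with C x V^2 x f) plus leakage that grows with die temperature, which is why cooler silicon draws less at the same clocks and volts. A toy model, with every coefficient invented for illustration (the real temperature dependence is process-specific):

```cpp
// Toy model: switching power ~ C*V^2*f, leakage grows with temperature.
// All coefficients here are made up purely to illustrate the trend.
#include <cstdio>
#include <cmath>
#include <initializer_list>

// Very rough leakage model: ~30 W at 40 C, roughly 2.7x for every +30 C.
double leakage_watts(double temp_c) {
    return 30.0 * std::exp((temp_c - 40.0) / 30.0);
}

int main() {
    const double c_eff = 1.3e-7;   // effective switched capacitance (F), invented
    const double volts = 1.0, ghz = 1.9;
    const double dynamic = c_eff * volts * volts * (ghz * 1e9);  // P = C*V^2*f

    for (double temp : {45.0, 65.0, 85.0})
        std::printf("%2.0f C: dynamic %.0f W + leakage %.0f W = %.0f W\n",
                    temp, dynamic, leakage_watts(temp), dynamic + leakage_watts(temp));
    return 0;
}
// Same clocks and voltage, but the hotter the die, the more total power it burns.
```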
 
MICROSOFT OFFICIALLY RELEASES DIRECTX 12_2

The new version of DirectX 12 adds support for several new features that the upcoming RTX 3000-series and RDNA2 graphics cards will bring during the second half of 2020.

As a reminder, this version brings a redesign of DirectX Raytracing, the addition of mesh shaders, variable rate shading, and sampler feedback.
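If you want to check whether a given card and driver actually expose those features, a minimal D3D12 query looks something like this. It's a sketch, Windows-only, and assumes a Windows SDK recent enough to define the OPTIONS6/OPTIONS7 structs; the tier thresholds are the ones DirectX 12 Ultimate asks for, as far as I know:

```cpp
// Minimal check for the DirectX 12 Ultimate / feature level 12_2 features
// mentioned above: DXR 1.1, variable rate shading, mesh shaders, sampler feedback.
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    ID3D12Device* device = nullptr;
    // Create a device on the default adapter at a baseline feature level first.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 __uuidof(ID3D12Device), (void**)&device))) {
        std::puts("No D3D12 device available.");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 o5 = {};
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 o6 = {};
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 o7 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &o5, sizeof(o5));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &o6, sizeof(o6));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &o7, sizeof(o7));

    std::printf("Raytracing 1.1 (DXR):  %s\n", o5.RaytracingTier          >= D3D12_RAYTRACING_TIER_1_1          ? "yes" : "no");
    std::printf("Variable rate shading: %s\n", o6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2 ? "yes" : "no");
    std::printf("Mesh shaders:          %s\n", o7.MeshShaderTier          >= D3D12_MESH_SHADER_TIER_1           ? "yes" : "no");
    std::printf("Sampler feedback:      %s\n", o7.SamplerFeedbackTier     >= D3D12_SAMPLER_FEEDBACK_TIER_0_9    ? "yes" : "no");

    device->Release();
    return 0;
}
```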

 

Since all the non-reference cards revealed so far seem to use two (or, more rarely, three) 8-pin connectors and more conventional cooler designs, this reinforces my initial feeling that the NVIDIA reference cooler has the design it has mostly for aesthetic reasons.

Anyway, looking at the likely relative performance and price differentials between the two GA102 models being released first (3090 and 3080), there is no way I can justify getting a 3090. I'm hoping that I can flash a 3080 with 3090 firmware to unlock higher board power limits (unlocking functional units is an extremely unlikely scenario), which should make overclocking more fruitful. Prior optimism surrounding the possibility of the 3080 having the same exact physical memory configuration, just with two channels disabled in firmware, looks unfounded given the 24GiB on the 3090 and 10GiB on the 3080.

I think with Odyssey this can happen often. :)


Elite: Dangerous in general is pretty demanding, power wise.

However, I was referring to board power limits. Without modding, the card will never pull more current than its firmware allows it to, capping performance to prevent this from happening.
 
.... Prior optimism surrounding the possibility of the 3080 having the same exact physical memory configuration, just with two channels disabled in firmware, looks unfounded given the 24GiB on the 3090 and 10GiB on the 3080....

If the rumours are to be believed there may be variants of the 3080 with 20GB VRAM released later by some AIB partners.
 
If the rumours are to be believed there may be variants of the 3080 with 20GB VRAM released later by some AIB partners.

I'm sure there will be.

The rumor I'm referring to was the one that stated that the boards for the 3090 and 3080 were identical, aside from the GPU SKU itself, which would imply the same memory ICs and perhaps a possibility of unlocking disabled memory channels. This is extremely unlikely to be the case, given the massive difference in GDDR6X capacity between the reference (and initial non-reference) 3080 and 3090s. Custom boards with 20GiB of memory for the 3080 would also exclude this possibility.
 
Morbad, it has been a long time since the glory days of being able to unlock a GPU to a high-end model (I remember doing it with my GeForce 6600 and my ATI 9800, and it really was a free upgrade).
I will eat my hat if NV made this possible even IF the cards are essentially the same.
It's a shame, as a 100% improvement in RTX sounds nice, but I think I am out and will wait for a refresh or Big Navi.
Forgetting the cost (which is a hard thing to forget), I just don't like the idea of such a power-hungry GPU... It's OK in winter, after all I need to heat my room anyway, but in summer it really is horrible sitting next to a fan heater, and even the best-case scenario just means wasting a load of energy.
The idea of needing an 850 W PSU if I want a top-of-the-range gaming rig does not sit well with me. I wish NV had gone with TSMC and not Samsung, and I think you could have probably chopped 50-100 W off those requirements.
I also think 20 or 24 GB is overkill and adds to the price, but at the same time cutting my video card RAM in an "upgrade" does not sit well with me either.

For gaming, an RTX 3080 with 12 GB and a 3090 with 16 GB would be a better balance, I think.
 
Morbad, it has been a long time since the glory days of being able to unlock a GPU to a high-end model (I remember doing it with my GeForce 6600 and my ATI 9800, and it really was a free upgrade).
I will eat my hat if NV made this possible even IF the cards are essentially the same.

Most recent GPU I've fully unlocked to a higher model was a Radeon R9 290, which took a 290X BIOS and became a full-fledged 290X. There have been a few cases like this with more recent GPUs, as board partners still occasionally flash higher-end boards to lower-end cards to make up for supply deficiencies in the lower-end models. It's certainly uncommon though.

Anyway, I fully expect the disabled portions of the GPU to be fused off and impossible to reenable with the RTX 3000 series, and I expect different physical memory ICs on the 3080 and 3090 boards as well as two unpopulated memory channels on the 3080.

If the memory tables are compatible, flashing 3090 firmware onto a 3080 could still be the easiest way to increase the maximum power limit of a 3080, without having to resort to hardware mods.

Forgetting the cost (which is a hard thing to forget), I just don't like the idea of such a power-hungry GPU... It's OK in winter, after all I need to heat my room anyway, but in summer it really is horrible sitting next to a fan heater, and even the best-case scenario just means wasting a load of energy.
The idea of needing an 850 W PSU if I want a top-of-the-range gaming rig does not sit well with me.

The board power requirements are not all that different from prior generations, and should be just as tunable. Idle power consumption is also negligible and performance per watt has certainly increased. You don't need an 850 W PSU for a high-end system with a 3090, and even if you did, it's not going to draw anywhere near peak loads the overwhelming majority of the time.
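To put rough numbers on that, here is the kind of budget I mean; everything except the rumoured 350 W board power is a ballpark guess:

```cpp
// Rough worst-case system power budget; component figures are ballpark guesses.
#include <cstdio>

int main() {
    const double gpu      = 350.0;  // rumoured 3090 board power
    const double cpu      = 150.0;  // high-end desktop CPU under gaming load, roughly
    const double rest     = 75.0;   // board, RAM, storage, fans, pumps
    const double headroom = 1.25;   // margin for transients and the PSU efficiency sweet spot

    const double load = gpu + cpu + rest;
    std::printf("Sustained load ~%.0f W, sensible PSU ~%.0f W\n", load, load * headroom);
    return 0;
}
// Roughly 575 W sustained, so a quality ~700-750 W unit has plenty of margin;
// 850 W is headroom, not a requirement.
```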

I wish NV had gone with TSMC and not Samsung, and I think you could have probably chopped 50-100 W off those requirements.

We don't know how different these processes actually are in efficiency as no GPUs made on Samsung's 7nm process have yet been reviewed.

I also think 20 or 24 GB is overkill and adds to the price, but at the same time cutting my video card RAM in an "upgrade" does not sit well with me either.

For gaming, an RTX 3080 with 12 GB and a 3090 with 16 GB would be a better balance, I think.

Not practical configurations. The GPU has a 384-bit memory interface which mandates 12/24GiB in fully enabled configurations. ROP clusters are almost certainly still tied to the memory controllers as well, making it impractical to cut them down without disabling memory channels.
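The constraint falls out of the channel arithmetic. A quick sketch of the same point, assuming GDDR6X hung off 32-bit channels with 1 GiB per channel normally, or 2 GiB per channel with two devices in clamshell (or a future higher-density device):

```cpp
// Why 10/20 GiB and 12/24 GiB are the natural capacities for these buses:
// one 32-bit channel carries 1 GiB (single 8Gb device) or 2 GiB (clamshell).
#include <cstdio>

int main() {
    const int bus_widths[] = {384, 352, 320};   // 3090, a hypothetical cut-down, 3080
    for (int bits : bus_widths) {
        const int channels = bits / 32;
        std::printf("%d-bit bus -> %2d channels -> %2d GiB or %2d GiB\n",
                    bits, channels, channels * 1, channels * 2);
    }
    return 0;
}
// 384-bit: 12 or 24 GiB; 352-bit: 11 or 22 GiB; 320-bit: 10 or 20 GiB.
// Something like 12 GiB on a 320-bit 3080 or 16 GiB on a 384-bit 3090 would
// need an unbalanced or different-width memory configuration.
```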

The 3090 has 24GiB to make sure it has a bigger number on it than the competitor's part. People will buy 24GiB cards at the high-end even though it's a meaningless figure and by the time any game benefits appreciably from more than twelve, the 3090 will be quite long in the tooth.

The 3080 has 10GiB because it's only got ten of twelve memory channels enabled, and putting 20GiB on the Founders Edition would be an extra cost on the BoM and would eat into AIBs' options for making distinctive non-reference products.

NVIDIA is also leaving a gap between the 3080 and 3090 to account for the possibility of near-future competition from AMD or even Intel. The 3090 or a Titan should be enough to secure a win at the top end, but the high-end RDNA2 part could conceivably best the 3080. If that is the case, I'm sure we will soon see a 3080 Ti or 3085 with a 352-bit memory interface and 11GiB of memory to fill the gap.

Regardless, 10GiB is fine. I have two 1080 Tis (in different computers) with 11GiB and have zero hangups about upgrading to a 10GiB 3080 for my primary gaming system. Both the new Microsoft and Sony consoles have 16GiB of shared memory, and the odds of more than 10GiB being available for graphics at any given time, or of many games being able to significantly leverage it in the next few years, are rather small.
 
3080/3090 reference/FE PCB:


Looks like an 18+1 phase setup, though since they all look identical, I suppose it could be a 16+3 or something, which might make more sense if 24GiB of GDDR6X is ~60 W.

Odd shape and pretty compact in length, presumably to allow airflow through the cooler from the other side of the card. I could probably fit a water-cooled one in my mini-DTX system without issue. The non-reference cards all seem to have much more conventional PCBs.
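Rough numbers on why that many phases isn't crazy for a ~350 W board. The rail split, core voltage and 16+3 layout are guesses on my part; only the ~350 W total comes from the leaks:

```cpp
// Rough per-phase current estimate for the rumoured FE board.
// Only the ~350 W board power is from the leaks; everything else is a guess.
#include <cstdio>

int main() {
    const double board_watts  = 350.0;
    const double mem_misc     = 60.0 + 20.0;  // guessed GDDR6X + fan/misc share
    const double core_watts   = board_watts - mem_misc;
    const double core_voltage = 1.0;          // typical-ish GPU core voltage under load
    const int    core_phases  = 16;           // assuming a 16+3 layout rather than 18+1

    const double amps_total = core_watts / core_voltage;
    std::printf("~%.0f A on the core rail, ~%.0f A per phase across %d phases\n",
                amps_total, amps_total / core_phases, core_phases);
    return 0;
}
// Around 270 A total, ~17 A per phase: comfortable for typical power stages,
// which leaves real headroom for raised power limits.
```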
 