Hardware & Technical - Latest Nvidia Ampere Rumours

So, it appears my PSU was causing issues, so I swapped to a 750W Gold unit.
The most curious thing is that with the very same mining settings, my GPU became more efficient, jumping from 430 to 440 kH/J.
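For reference, kH/J is just hashrate divided by board power (kH/s per watt), so a small drop in power draw at the same hashrate moves the number. A minimal sketch with hypothetical hashrate/power figures, not the actual readings from the card:

```python
# Hypothetical illustration only: efficiency in kH/J equals hashrate (kH/s)
# divided by power draw (W). The numbers below are made up for the example.
def efficiency_kh_per_joule(hashrate_mh_s: float, power_w: float) -> float:
    """Return mining efficiency in kH/J (equivalently kH/s per watt)."""
    return hashrate_mh_s * 1000 / power_w

print(efficiency_kh_per_joule(95.0, 221.0))  # ~430 kH/J
print(efficiency_kh_per_joule(95.0, 216.0))  # ~440 kH/J: same hashrate, ~5W less draw
```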

Plus, I could let the watts loose, as even at full load (370W or thereabouts) and at 80% fan speed the card stays in the high 50s C. It's also very stable; it ran flawlessly for many hours in MS Flight Sim.
Time Spy result.
 
Just checked Amazon: 3070 $2,999, 3080 $3,999.
Insanity.
Yeah, I'm out. By the time prices return to 'normal' and production is on track we're probably at least 12-18 months in, and by then we're getting close to the new series, I guess. It's annoying, as I was hoping to replace my 2070 Super and give that one to a mate who is building a rig, but I guess this is about as much of a first world problem as it gets.
 
As long as mining remains profitable it's hard to see the problem changing. Miners are scooping them up as fast as they can get them.
 
I am in a similar situation: I was hoping to get a 6800 XT to replace my 5700 XT, which was then going to my nephew. No chance of that happening any time soon, it would seem.
 
I've upgraded from a 2070 (not Super) to a 3070, and as things are now, I would hold my horses. I expected a much higher performance boost than what I actually got (except for CUDA), and I paid about $700 for the 3070 before the world went crazy. If this continues, I'll sell my 3070 in a year and thereby become a billionaire 💰:cool:(y)
 

The jump from the 1080ti to the 3080 was insane though; it made a huge difference, especially in VR.

Hopefully Eth 2.0 will change the game and reduce the return on investment. I mean, even at $4k, if a 3080 mines a net $3k a year it beats basically any asset class and makes it a bargain.
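A rough payback sketch using those round numbers (the purchase price and net yearly mining income are the figures from the post above; resale value, difficulty changes and coin price swings are deliberately ignored):

```python
# Rough payback sketch using the round numbers from the post above.
# "Net" income is assumed to already account for electricity; resale value and
# changes in difficulty or coin price are not modelled.
card_price = 4000            # USD, inflated street price
net_income_per_year = 3000   # USD per year, net of power costs

payback_years = card_price / net_income_per_year
simple_annual_return = net_income_per_year / card_price

print(f"Payback period: {payback_years:.1f} years")          # ~1.3 years
print(f"Simple annual return: {simple_annual_return:.0%}")   # 75%
```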
 
I have a Reverb G2 on the 3070. After a lot of tweaking it runs "ok", with a few frame drops here and there. I run most settings on high, and no resampling (1:1). I would like the 3090, but I guess that'll have to wait a few years.

When I bought the 3070 at regular price, it was on the edge of my budget. Now it's become so expensive that I wouldn't buy it. At the same time, I can't sell it because I need a GPU. Strange times.
 

To be honest I haven't tried Elite. I also have the G2, and MS Flight Sim went from barely enjoyable to a fantastic experience. Re-played RDR2 on ultra at 1440p too - on the same settings the 3080 is flat out double the 1080ti, and even more in my case as I came from a low-end 1080ti to a high-end 3080.

I never wanted to buy a $1.5k GPU on general principle, but by now it has actually mined back almost half of its value, so I am more than happy with the decision.

On a similar note to yours, I was contemplating how bad it would be if it were to fail. Either I'd have to wait months at best to get another one, or get the money back and buy a 3060 for the same amount?
 
I know that feeling, but still I sold my "old" 2070 to some desperate teenage gamer for the original list price. The 3070 runs fine though, and since I bought my first Hercules graphics card for my homebuilt PC XT clone (almost 40 years ago :eek:), I have not experienced a single GPU dying, but then again, I don't overclock anymore. Not that it seems to matter much these days, with throttling being the "worst" thing that could happen unless you go extreme. As Linus and ElectroBOOM showed recently, you have to zap a memory module with a large Tesla coil before bad things happen. Computers these days are pretty "not fragile" (he said, and looked around for some wood to knock on) :alien:
 
It's not just miners. Production has been terribad.

On Steam, 3070 share is almost up to 1080ti levels; actually, all the new Nvidia cards are growing pretty fast, so production isn't bad at all, it's just that mining plus PC gaming are at all-time highs. AMD cards, however, are nowhere on the list; possibly AMD has to fulfill console volumes first with its reserved production capacity - and new console prices seem to have come down a bit lately in the gray market.


Yeah, and I sold my 1080ti below market price to a friend. :)
For a kid or a teenager the current situation must be very disappointing; good luck convincing parents to buy a GPU at these prices...

I was always a console gamer, with the exception of the C64 era plus a few years on a family computer in the 90s that had a Radeon GPU (but unlike now, back then I wasn't interested in PCs at all). I switched from PS3 to PC...
So I hope they are not fragile, as mining does load the memory modules quite significantly. What's odd is that the output has been better since the PSU change but has also become unstable, so I am now dialling back the mining memory OC.

In terms of gaming OC, it depends; in 3DMark testing I did see the minimum frame rates rise a lot. On a monitor I am less concerned about dips, but in VR it is very important.
What's odd is that in MS Flight Sim it seems my 16GB of 3200MHz memory is bottlenecking both the CPU and the GPU. It runs stretched out with a huge virtual memory file, while the rest of the system runs at only 70-80%. But I guess spending an additional $200 on new memory for an otherwise 4-year-old platform with a 4-core CPU really doesn't make sense. 🤷‍♂️
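If you want to confirm it's really RAM capacity forcing that huge pagefile (rather than the CPU or GPU holding things back), a quick check with the cross-platform psutil library shows physical versus pagefile usage while the sim is running. A generic sketch, not something from the post:

```python
# Minimal sketch: compare physical RAM and pagefile usage while the sim runs.
# Requires `pip install psutil`.
import psutil

ram = psutil.virtual_memory()
swap = psutil.swap_memory()   # on Windows this reflects the pagefile

print(f"RAM used:      {ram.used / 2**30:.1f} / {ram.total / 2**30:.1f} GiB ({ram.percent}%)")
print(f"Pagefile used: {swap.used / 2**30:.1f} / {swap.total / 2**30:.1f} GiB ({swap.percent}%)")

# If RAM sits near 100% while the pagefile keeps growing, memory capacity is
# the bottleneck; if both have headroom, look at the CPU or GPU instead.
```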

But I do get your point - plus, if it weren't for VR, the 1080ti would have been adequate for another generation still, though I have to admit: ray tracing looks very, very pretty indeed.
 
With the release of the 1.2 patch I've been messing around a lot with CP2077 on my brother's RTX 3080 and my 6800 XT. I thought I'd miss ray tracing, but I really don't... on the 3080 it feels like a waste not using it, because it is quite usable on that card, but I'm not sure the visual trade-off is worth it even on this part. DLSS is what makes it workable, but the people claiming not to see the difference between native resolution and DLSS are either blind or insane. Even DLSS Quality is a blatantly obvious reduction in clarity vs. native. It's the best upscaler there is currently, but it's still an upscaler.
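To put numbers on "it's still an upscaler": DLSS renders internally below native and reconstructs back up. The per-axis scale factors below are the commonly cited DLSS 2.x presets, assumed here rather than taken from the post:

```python
# Commonly cited DLSS 2.x per-axis render-scale presets (assumed values).
# The internal resolution is what the GPU actually rasterises before upscaling.
DLSS_PRESETS = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

native_w, native_h = 2560, 1440  # native 1440p, as discussed in the thread

for preset, scale in DLSS_PRESETS.items():
    w, h = round(native_w * scale), round(native_h * scale)
    print(f"{preset:>17}: renders at {w}x{h}, reconstructed to {native_w}x{native_h}")
```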

On the 6800 XT ray tracing is totally out of the question; its RT performance is too low to make it usable in CP2077 past 1080p or so, and the CAS upscaler+sharpener is no match for DLSS in fine details. On the plus side, there is no temptation to use RT on the AMD card, so I can play native 1440p on ultra at 70-110 fps without any hesitation. The 3080 can do this too, of course... the sample I'm using might be slightly slower than the 6800 XT, even after replacing the thermal pads under the backplate (which didn't help memory temps much), but they are close enough to be perceptibly the same.

Anyway, the original prices in the RTX 30xx EVGA queue have expired, but given how much street prices have gone up I'll probably still grab one if I'm ever offered one. Might still try for a 3080 Ti as well, though I have reservations about purchasing a mining-capped part.

But I guess spending an additional $200 on new memory for an otherwise 4-year-old platform with a 4-core CPU really doesn't make sense. 🤷‍♂️

This 2x16 kit of Timetec was $120 when I bought it (though the price had gone up to $145 last I checked).


It's a cheap 3600 CL18 bin of Hynix CJR, but with a little tweaking it is faster than using XMP on most kits costing three times the price. A good kit of Samsung B-die, Micron E-die, or Hynix DJR will beat it when similar care is given to tuning, but not by enough to be worth it in a price/performance sense, even if one is looking at total system cost.
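One way to see why a tuned budget kit can hang with pricier XMP kits: the absolute (first-word) latency depends on both transfer rate and CAS, not on the CL number alone. A quick worked comparison; the kits other than the 3600 CL18 one are hypothetical examples:

```python
# First-word CAS latency in nanoseconds: CL cycles divided by the actual clock,
# where the actual clock is half the DDR transfer rate in MT/s.
def cas_latency_ns(transfer_rate_mt_s: int, cl: int) -> float:
    return cl / (transfer_rate_mt_s / 2) * 1000  # nanoseconds

kits = [
    ("3600 CL18 (the CJR kit at XMP)", 3600, 18),
    ("3600 CL16 after manual tuning (hypothetical)", 3600, 16),
    ("3200 CL16 XMP (hypothetical pricier kit)", 3200, 16),
]

for label, rate, cl in kits:
    print(f"{label}: {cas_latency_ns(rate, cl):.1f} ns")
# 10.0 ns, 8.9 ns and 10.0 ns respectively; subtimings matter too, but the
# headline latency already shows why tuning can beat a fancier XMP profile.
```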
 

At my local online retailer, reasonably fast 32GB kits range from $200 to $400, so I decided against spending that much on a system I'm planning to replace anyway; plus, it would really only be for Flight Sim.
So I ordered a $50 256GB NVMe M.2 drive (around 3000MB/s read/write) that for the time being will only serve as a pagefile (if I ever need to reinstall Windows, I'll probably move either the system or MS Flight Sim to it as well). :)

As for RTX in Cyberpunk, night scenes do look wonderful with all the lights - I don't know how it looks without RT, but with RT it is awesome.
What do you think about those RT ReShade plugins? I am considering trying one on an otherwise undemanding game like Warships.

3080 Ti - unless mining profitability proves unsustainable, I guess I am better off staying on the 3080 (I reckon a mining-capable 3080 can easily be worth as much as a 3080 Ti if mining is gimped on the latter). I've achieved a 19300 graphics score on the first try with a stable OC (I am using it for Flight Sim, and it goes for hours) anyway; the GPU easily stays at 2100MHz and could possibly go higher. Interestingly, my VRAM refuses to go to 10300MHz and beyond, so I guess I have a good(ish) GPU core with average-at-best VRAM on my card.
 
Mostly gave up on EVGA's queue. They have zero incentive to supply the cheaper SKUs I was queued for while they can sell higher margin parts that aren't bound to old prices.

However, I tried a Newegg Shuffle for the first time the other day and to my surprise got an Aorus 3080 Master in a combo with a decent enough X570 board. Consensus seems to be that the thermal pads on this card generally aren't very good, but the card is vastly less troublesome to disassemble than the FE and has a much better cooler, so it should be workable.

Probably going to keep my 6800 XT as my main gaming card, as I've pretty much exhausted the RT titles I was interested in. Will have the 3080 as a backup and miner until I get the itch for something new that makes decent use of RT.

As for RTX in Cyberpunk, night scenes do look wonderful with all the lights - I don't know how it looks without RT, but with RT it is awesome.
What do you think about those RT ReShade plugins? I am considering trying one on an otherwise undemanding game like Warships.

RT in Cyberpunk is most noticeable around a lot of glass and other highly polished surfaces. However, outside of certain bars/clubs and the city center, these areas are pretty rare and the performance hit is pretty heavy, even on RTX cards...and unusably so on the RX 6000 series.

Haven't tried any ray tracing shader mods myself, but I've heard very mixed reports of the ReShade one in actual use. Regardless, it's a global illumination shader and won't be a substitute for more full-fledged effects (detailed object reflections and shadows)... though tuned correctly it should definitely be able to replace things like older bloom/HDR/specular lighting and ambient occlusion. Probably worth a shot if you have games that have performance to spare.

3080 Ti - unless mining profitability proves unsustainable, I guess I am better off staying on the 3080 (I reckon a mining-capable 3080 can easily be worth as much as a 3080 Ti if mining is gimped on the latter). I've achieved a 19300 graphics score on the first try with a stable OC (I am using it for Flight Sim, and it goes for hours) anyway; the GPU easily stays at 2100MHz and could possibly go higher. Interestingly, my VRAM refuses to go to 10300MHz and beyond, so I guess I have a good(ish) GPU core with average-at-best VRAM on my card.

They are definitely going to limit Ethash on the 3080 Ti, probably cutting performance in half vs. what the hardware is actually capable of. The limiter is likely going to get progressively more annoying to avoid as they improve how it's implemented, especially if you want to use the cards for anything other than mining and can't stay tied to modified drivers based on outdated branches. The new cards being limited was a significant factor in me picking up a 3080 rather than just waiting on a Ti at this point. I'd like a Ti--I think it will be the high-end gaming sweet spot--but it's not going to be that much faster than a 3080 at gaming.
 
They are definitely going to limit Ethash on the 3080 Ti, probably cutting performance in half vs. what the hardware is actually capable of.
I don't mine, but this bugs the crap out of me. Why would they even do this? It's pretty scummy.
 
I don't mine, but this bugs the crap out of me. Why would they even do this? It's pretty scummy.

They have a mining product line that is cheaper to build and service. They can't sell those at full price if the gaming parts can mine just as well or better.

By limiting mining on gaming parts, they can also claim to be pro gaming in their marketing, which may actually win them some degree of brand loyalty brownie points among people prone to that affliction. If they can actually keep gaming cards out of miners' hands by making them less appealing (much easier said than done), they can also ensure that they keep a stranglehold on actual share and brand presence in the PC marketplace, which in turn makes it easier to court developers to optimize for their products and prolongs their competitive advantage.

TLDR - they saw an opportunity to profit via another layer of mostly artificial segmentation.
 
Testing this Gigabyte Aorus RTX 3080 Master. GDDR6X cooling is still pretty mediocre, but it's much better than the FE. Currently have it running at 20Gbps (about +800 on the slider with the CUDA P2 state left enabled, or +300 with it disabled) and it gets to 98C at 90% fan speed, which is borderline. Does 94-95MH/s like this.
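Since Ethash is essentially memory-bandwidth bound, the 20Gbps figure translates fairly directly into hashrate headroom. A quick bandwidth calculation, assuming the 3080's 320-bit memory bus:

```python
# GDDR6X bandwidth for a 320-bit bus (RTX 3080), at the data rates discussed.
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int = 320) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gb_s(19.0))  # 760.0 GB/s at the stock 19 Gbps
print(bandwidth_gb_s(20.0))  # 800.0 GB/s at 20 Gbps, roughly 5% more headroom
```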

Would test some games, but I'm short a CPU at the moment. Probably going to exchange the 5800X I was expecting to use with it, so the card is running in my oldest setup that is in major need of an overhaul and fresh OS install.
 
Memory clocks well on this sample. Switched over to Gminer, made a more aggressive undervolt curve, bumped up the memory clock another couple of multipliers (NVIDIA cards tend to do best at whole-number memory multipliers; they have a 27MHz reference clock and an internal DRAM clock that is 1/4 of what MSI AB reports, so setting figures that are multiples of 108 frequently works well), and maxed out the fan speed to keep the GDDR6X junction temps under control.
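Following the 27MHz x 4 reasoning above, here is a tiny hypothetical helper that snaps an Afterburner memory offset to the nearest 108MHz step (just a sketch, not a real tool):

```python
# Snap an MSI Afterburner memory offset to the nearest multiple of 108 MHz.
# Reasoning from the post: a 27 MHz reference clock, and the real DRAM clock is
# 1/4 of what Afterburner reports, so whole multipliers land on 27 * 4 = 108 MHz
# steps as seen on the Afterburner slider.
STEP_MHZ = 27 * 4  # 108

def snap_offset(requested_offset_mhz: int) -> int:
    return round(requested_offset_mhz / STEP_MHZ) * STEP_MHZ

for requested in (800, 1000, 1200):
    print(f"+{requested} MHz -> +{snap_offset(requested)} MHz")
# +800 -> +756, +1000 -> +972, +1200 -> +1188
```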

Have a little more room to tune the curve down, but not much. Setting a power limit doesn't help with an aggressive custom curve either: either the limit isn't reached, or the clock throttling causes extra overhead and harms efficiency.

Currently getting 97.5MH/s at 234W for the card and about 300W at the wall for the entire system.


The 6800 XT, in and of itself, almost matches that efficiency fully tweaked (62MH/s at about 154W). However, there is more overhead due to having less total hashing power in one place.
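Putting the card-only and at-the-wall numbers side by side shows the overhead point. The hashrates and card powers are the figures from the posts above; the ~66W rest-of-system draw comes from the 3080 rig's wall measurement and is assumed to apply to a hypothetical 6800 XT rig as well:

```python
# Card-only vs. at-the-wall efficiency, using the figures quoted in the thread.
# The 6800 XT wall figure assumes the same ~66 W of rest-of-system draw that the
# 3080 rig showed (300 W wall minus 234 W card); that overhead is an assumption.
def mh_per_joule(hashrate_mh_s: float, watts: float) -> float:
    return hashrate_mh_s / watts

overhead_w = 300 - 234  # rest-of-system draw measured on the 3080 rig

for card, mh, card_w in [("RTX 3080", 97.5, 234), ("RX 6800 XT", 62.0, 154)]:
    wall_w = card_w + overhead_w
    print(f"{card}: {mh_per_joule(mh, card_w):.3f} MH/J card-only, "
          f"{mh_per_joule(mh, wall_w):.3f} MH/J at the wall")
# The cards are nearly tied on their own, but the fixed overhead costs the
# lower-hashrate card proportionally more once whole-system power is counted.
```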

After I get some CPUs that work to my satisfaction I'll do a gaming comparison between the 3080 and the 6800 XT. Will be interesting to see how they compare in Odyssey.
 

Neat! I think I could reach about 94-95MH/s before the memory becomes unstable. Instead I'm mining at 85MH/s at 200W, which keeps the memory junction temperature around 92C - I'm fully OK with that, as it gives me peace of mind.
The monster gaming profile is rock solid, however. :)

Make sure to post a Time Spy graphics score!
 