Latest Nvidia Ampere Rumours

I don't think it will take six months for the supply issue to be straightened out.

A Super refresh at some future date is highly likely, but I'm not going to wait for it and I don't expect it to be any more significant than last generation's.

We'll see, but I think it will be the most sought-after (high-end) card and also the one with the tightest supply. By the way, what makes you tilt towards the Ti so decisively? The VRAM?
 

I'm expecting the 3080 Ti to be closer in price to the 3080 than to the 3090, and closer in performance to the 3090--the current rumor is the same number of enabled functional units as the 3090, with only a reduced memory bus width due to two fewer memory channels being used. The extra VRAM is nice, but it isn't likely the primary advantage for most titles, including the ones I expect to play on it. Supposedly it's slated for launch in late February, which is about the earliest I'd expect if it's supposed to be using the latest batches of dies NVIDIA ordered from Samsung in late December.
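To put that rumored bus cut in rough numbers, here's a quick bandwidth sketch. The 320-bit width (two fewer 32-bit channels than the 3090's 384-bit bus) and 19 Gbps GDDR6X speed are assumptions based on the rumor, not confirmed specs.

```python
# Back-of-the-envelope GDDR6X bandwidth comparison: 3090 vs the rumored 3080 Ti.
# Assumptions: 32-bit memory channels, 19.5 Gbps GDDR6X on the 3090 and
# 19 Gbps (3080-style) on the rumored Ti; none of the Ti figures are confirmed.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) * data rate (Gbps) / 8."""
    return bus_width_bits * data_rate_gbps / 8

rtx_3090   = bandwidth_gb_s(12 * 32, 19.5)  # 384-bit bus -> ~936 GB/s
rumored_ti = bandwidth_gb_s(10 * 32, 19.0)  # two fewer channels -> ~760 GB/s

print(f"3090:            {rtx_3090:.0f} GB/s")
print(f"Rumored 3080 Ti: {rumored_ti:.0f} GB/s ({rumored_ti / rtx_3090:.0%} of the 3090)")
```

So if the rumor holds, the Ti would keep the 3090's shader resources but give up roughly a fifth of its memory bandwidth.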

However, if I see a good price on an acceptable 3080 (or 6800XT) in the meantime, I'm not going to pass it up.
 

Yeah, same here. I'm only watching the FE models, as the pricing of AIB cards is absolutely ridiculous, ranging from €150 over MSRP to nearly double the price. Given the FE supply constraint it will be very, very hard to score a 3080 Ti, although I'd happily pay the extra €200 for the performance (assuming the FE will cost €1000).
At the moment it is MS Flight Sim in VR that is the main reason I need that extra power, though I admit the 1080 Ti provides better-than-expected performance in it.
 
I don't yet own MS Flight Sim 2020, but every credible techie review I've seen has indicated that it is mainly CPU-bound on a single thread, not GPU-bound. Therefore, if that is your use case, I'd want to research it a bit deeper. A CPU upgrade might be the better route for you (and have better availability), even if it means a new motherboard.
 
If the current bull trend on crypto prices holds (and it's likely to), demand could rise faster than supply even if Samsung works miracles.
 

In VR this game is even more GPU-bound than on a monitor. My 1080 Ti is basically pegged at 100% while my 7700K hovers around 40% usage. Flight Sim doesn't use more than 8 threads, so at this moment I would probably still be GPU-bound even with a 3090... WMR render scale is at 80% while in MSFS it is set to another 60%, with a mix of low-mid settings (only clouds are on high), and FPS is in the mid 40s. :)
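Those two scales stack, which is why the 1080 Ti can hold mid-40s FPS at all. A rough sketch of the effective resolution, assuming both scales apply per axis and using a hypothetical 2160x2160-per-eye panel (neither detail is stated in the post):

```python
# Rough effective render resolution when two render scales stack.
# Assumptions: the WMR scale (80%) and the in-game MSFS scale (60%) both apply
# per axis, and the headset is 2160x2160 per eye -- illustrative guesses only.

native_w, native_h = 2160, 2160        # hypothetical per-eye panel resolution
wmr_scale, msfs_scale = 0.80, 0.60     # the two scales from the post

eff_w = native_w * wmr_scale * msfs_scale
eff_h = native_h * wmr_scale * msfs_scale
pixel_fraction = (eff_w * eff_h) / (native_w * native_h)

print(f"Effective per-eye resolution: {eff_w:.0f} x {eff_h:.0f}")
print(f"Pixels actually rendered: {pixel_fraction:.0%} of native")  # ~23%
```

Under those assumptions, less than a quarter of the native pixels are actually being rendered, so a faster card has a lot of headroom to claw back.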

If the current bull trend on crypto prices holds (and it's likely to), demand could rise faster than supply even if Samsung works miracles.

At this rate I might need to look into getting a 2080ti. :)
 

Hard to find anything Pascal or newer at this point. 2080 Tis are going for $1000+ again, and I have people offering to buy my worse 1080 Ti for more than I paid for it nearly three years ago.

I've been in EVGA's queue for each of their XC3 3080s for a while and was planning on buying whichever one was offered to me first. But seeing how demand is likely to be, I'll buy them all, if I get the opportunity, as I don't know the next time I'll see any high-end part in stock anywhere.

I was hoping Intel would be able to satisfy some demand soon, but their highest-end discrete Xe part, at least for the first run, looks like it will only be mid-range, and Intel is having TSMC make it due to their own wafer supply problems, so foundry pressure will remain high.
 

Is it only TSMC, or is wafer production itself also bottlenecked?
As discussed before, I think this is the single most unbelievable market imbalance at the moment.
 

Intel's foundry business has been suffering capacity constraints since they started missing deadlines on their nodes several years back. Not that long ago it would have been outlandish to think of Intel needing to tap third-party foundries like TSMC for a major product line, but that's the situation they are in... they don't have the capacity to meet demand themselves.

Intel's Xe parts are rumored to be made on TSMC N6. Maybe Intel booked early enough on that process to avoid competing for supply with AMD and NVIDIA, but it's too early to tell.
 

By the way, if the crypto prices are this high, perhaps it could be worth it to 'overpay' for a card and then let it mine when not gaming? :)
 
Dunno, is it even profitable to run a single-card (GPU) miner these days, given the difficulty?
A couple of years ago the common wisdom was to forego GPU-based miners altogether and run a local ASIC machine within a pool; has this changed?
 

It seems it has: a 3080 can net about $10 a day at the moment, so even at part-time usage it can pay for the upgrade fairly quickly. Bitcoin just touched $40k today...

I hate the whole crypto thing, but it seems high valuations are here to stay as long as the stock market insanity continues... I just spotted a 3080 at a local retailer and I'm contemplating picking it up. :)
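For what it's worth, a rough break-even sketch, taking the ~$10/day figure at face value and plugging in placeholder values for the markup, power draw, and electricity price (none of those come from the thread):

```python
# Break-even sketch for 'overpaying' on a card and mining when not gaming.
# The $10/day gross figure is from the post; the markup, power draw,
# electricity price, and 12 h/day duty cycle are placeholder assumptions.

gross_per_day_usd   = 10.0   # reported 3080 mining revenue at current prices (24 h)
hours_mining        = 12     # part-time mining while the card isn't gaming
card_power_kw       = 0.23   # ~230 W at a tuned mining power limit (assumed)
electricity_usd_kwh = 0.25   # placeholder electricity price
premium_paid_usd    = 400.0  # placeholder markup paid above MSRP

daily_gross = gross_per_day_usd * hours_mining / 24
daily_power_cost = card_power_kw * hours_mining * electricity_usd_kwh
daily_net = daily_gross - daily_power_cost

print(f"Net per day: ${daily_net:.2f}")
print(f"Days to cover the markup: {premium_paid_usd / daily_net:.0f}")
```

With those placeholders it works out to roughly three months of part-time mining to cover the markup, assuming prices and difficulty stay where they are, which is the big 'if'.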
 
A couple of years ago the common wisdom was to forego GPU-based miners altogether and run a local ASIC machine within a pool; has this changed?

That's still the case if one is directly mining BTC, which hasn't really been viable on GPUs since 2012.

However, GPU mining alt coins never stopped being situationally profitable, and ETH is the main one being GPU mined now.


Even my 1080 Ti's are making $4 a day, each, at current prices.

Anyway, I hate the whole money thing, especially as exemplified by modern fractional reserve banking. When I originally got into crypto in 2010-2011, it was from the crypto-anarchist perspective, mixed with a bit of technical curiosity. By that measure, BTC has been a colossal failure, because instead of providing an alternative financial system, it went mainstream and was absorbed by the status quo.

Of course, I'm also a pragmatist, and I'm not going to turn down opportunities for profit. Despite selling most of my BTC back when it was worth around $2 each (I once spent fifty BTC on an early Amazon Kindle for my wife's birthday), and having a fair portion stolen from the original, very sketchy, exchanges, I've still been able to pay for all of my electricity (not just the mining use, but all of it) and all of my PC hardware for the last decade with mining profits.

I broke down my last dedicated mining system a few years back, but I've still been mining ETH on my 1080 Ti's when they weren't being used for something else. After subtracting the last three years of electricity costs, and the cost of the GPUs themselves, my wallet at the pool I'm in still shows ~10K USD of profit... from the part-time mining of two 1080 Ti's, purchased for gaming, three years ago. That's why my new GPU budget has gone up... I moved (but didn't sell) two ETH to buy a 3080 back when that was the equivalent of ~600 USD; it's ~2,500 USD now.

As to when this insanity stops, I wouldn't count on that any time soon. Current prices are largely driven by institutional investment--banks buying BTC and ETH for the long haul--not the pump and dump speculation of days past. I'm sure there will be some corrections in the short term, but a long term bull run seems likely.
 

Yeah, this is why I'm now considering that Palit 3080 Game Rock at a whopping €1150. Plus I can basically sell the 1080 Ti for the same price I bought it for in 2018.
 
I don't know if I could swallow that 300 Euro plastic ice tax, simply on principle.

It is actually the cheapest 3080 I've seen since launch over here. In Germany, where VAT is 11% lower, the cheapest card costs €800, and that is the ASUS TUF non-OC that basically doesn't exist; most cards go for €900-1000+, and even then they are simply out of stock everywhere.

This Game Rock is fugly, but 1) HU tested it to be decent, with good VRAM cooling, and 2) I have a non-glass case so I won't have to look at it. My only concern is residual value. :)

Edit: Obviously it is gone. I managed to reserve an MSI Suprim X until Monday for about €120 more, which is still a bargain compared to 3070s that go for €1k. I will run the maths over the weekend, but assuming today's rates, as a rough estimate, it will take about 3-4 months (+1 including taxes to be paid in '22) of 12-hours-a-day operation to pay off the difference versus the non-existent FE.

I said I wouldn't pay this much above MSRP on general principle, but if it pays off the difference itself then there is no reason to turn it down.
 

Any assumption of profit carries with it a degree of risk, especially if that profit needs to be realized on a schedule. Greater crypto valuations mean a spike in difficulty, and the prime avenue for mining profit with a 3080 is Ethereum, which is in the middle of a transition to proof-of-stake. The main reason I'm able to see the returns I am is because I can afford to hold crypto as an investment and only sell it when prices are elevated.
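To make the difficulty point concrete: per-card revenue is roughly your share of the network hashrate times the daily coin issuance and the coin price, so a price spike that pulls more hashrate onto the network dilutes everyone's cut. A sketch with made-up numbers (none of these are current Ethereum figures):

```python
# Why a price jump doesn't translate 1:1 into per-GPU mining revenue: your cut
# is your hashrate divided by the whole network's. All numbers below are
# illustrative placeholders, not real network statistics.

def daily_revenue_usd(my_mh: float, network_mh: float,
                      coins_per_day: float, price_usd: float) -> float:
    """Expected gross revenue per day for one miner's share of the network."""
    return (my_mh / network_mh) * coins_per_day * price_usd

my_mh      = 95.0          # a tuned 3080-class hashrate in MH/s (assumed)
network_mh = 350_000_000   # placeholder network hashrate, in MH/s
issuance   = 13_000        # placeholder coins issued to miners per day
price      = 1_200.0       # placeholder coin price in USD

before = daily_revenue_usd(my_mh, network_mh, issuance, price)
# Price doubles, but the higher price also pulls 50% more hashrate online:
after = daily_revenue_usd(my_mh, network_mh * 1.5, issuance, price * 2)

print(f"Before: ${before:.2f}/day, after: ${after:.2f}/day "
      f"({after / before:.2f}x, not 2x)")
```

And that is before the proof-of-stake transition, which would eventually remove GPU-mined issuance entirely.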

You could do folding@home to research treatments for Covid.

Who's to say I don't? None of these things are mutually exclusive.

Of course, I could also contribute more to COVID-19 research by donating money rather than contributing to a distributed computing project, if that were my priority for charitable contributions.

Folding@Home and projects like it are brilliant, as the computational power of the network is virtually free from the perspective of the organizers, but the projects themselves are grotesquely inefficient. People are all too willing to write off the costs as irrelevant because it's going to a good cause. In many cases, it's tantamount to sending rich white people, with no pertinent skills, to poor nations to build a few shoddy buildings that will have to be torn down and rebuilt anyway, rather than simply having them send some small fraction of their money. The latter may not feel as charitable to them as their precious blood, sweat, and tears, but will undoubtedly be a vastly more meaningful contribution than any personal labor they could ever perform.

Likewise, the total amount of computational power I'd be capable of devoting to Folding over the last decade would be a fraction of the value of what I've actually contributed by being modestly financially successful. Without mining profits, I wouldn't be able to afford the hardware to game or Fold anyway. Hell, I might even have to get a job to support my minimum preferred degree of comfort, which would radically increase my carbon footprint and potentially make me a danger to others on the roads, or by having to go out during a global pandemic.

If there is one thing you can assume about what I'm doing it's that I've thoroughly min-maxed the expense/return ratios of my lifestyle in accordance with my overarching goals of existence. And I'd bet almost anything that my personal cost vs. contribution ratio is far more favorable to society than most, partially because I'll use my hobby to profit where I can.
 