At least I made one good investment this year; my stock portfolio is getting hammered.
Yeah, I'm out. By the time prices return to 'normal' and production is on track we're probably at least 12-18 months in, and by then we're getting close to the new series, I guess. It's annoying, as I was hoping to replace my 2070S and give that one to a mate who is building a rig, but I guess this is about as much a first world problem as it gets.

Just checked Amazon: 3070 $2,999, 3080 $3,999.
Insanity.
I am in a similar situation: I was hoping to get a 6800 XT to replace my 5700 XT, which was then going to my nephew. No chance of that happening any time soon, it would seem.
I've upgraded from a 2070 (not Super) to a 3070, and as things are now, I would hold my horses. I expected a much higher performance boost than what I actually got (except for CUDA), and I paid like $700 for the 3070 before the world went crazy. If this continues, I'll sell my 3070 in a year and thereby become a billionaire.
I have a Reverb G2 on the 3070. After a lot of tweaking it runs "ok", with a few frame drops here and there. I run most settings on high, and no resampling (1:1). I would like the 3090, but I guess that'll have to wait a few years.

The jump from a 1080 Ti to the 3080 was insane though; it made a huge difference, especially in VR.
Hopefully Eth 2.0 will change the game and reduce the return on investment. I mean, even at $4k, if a 3080 mines a net $3k a year, it beats basically any asset class and makes it a bargain.
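For what it's worth, a quick back-of-the-envelope on those figures (all numbers are the post's assumptions, not actual mining data; the function names are just mine):

```python
# Rough payback sketch for the numbers above: a $4,000 card netting $3,000/year.
def payback_months(card_price, net_yearly_income):
    """Months until mining income covers the purchase price."""
    return 12 * card_price / net_yearly_income

def yearly_yield(card_price, net_yearly_income):
    """Net annual return as a fraction of the purchase price."""
    return net_yearly_income / card_price

print(f"payback: {payback_months(4000, 3000):.1f} months")  # 16.0 months
print(f"yield:   {yearly_yield(4000, 3000):.0%} per year")  # 75% per year
```

A 75% yearly yield is exactly why the cards keep disappearing into farms, assuming difficulty and prices hold (which they won't forever).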
When I bought the 3070 at regular price, it was on the edge of my budget. Now it's become so expensive that I wouldn't buy it. At the same time, I can't sell it because I need a GPU. Strange times.
As long as mining remains profitable it's hard to see the problem changing. Miners are scooping them up as fast as they can get them.
I know that feeling, but I still sold my "old" 2070 to some desperate teenage gamer for the original list price. The 3070 runs fine though, and since I bought my first Hercules graphics card for my homebuilt PC XT clone (almost 40 years ago), I have not experienced a single GPU dying. But then again, I don't overclock anymore. Not that it seems to matter much these days, with throttling being the "worst" thing that could happen unless you go extreme. As Linus and ElectroBOOM showed recently, you have to zap a memory module with a large Tesla coil before bad things happen. Computers these days are pretty "not fragile" (he said, and looked around for some wood to knock on).

On a similar note to yours, I was contemplating how bad it would be if it failed. Either I'd have to wait months at best to get another one, or get the money back and buy a 3060 for the same amount?
It's not just miners. Production has been terribad.
But I guess spending an additional $200 on new memory for an otherwise 4-year-old platform with a 4-core CPU really doesn't make sense.
With the release of the 1.2 patch I've been messing around a lot with CP2077 on my brother's RTX 3080 and my 6800 XT. I thought I'd miss ray tracing, but I really don't. On the 3080 it feels like a waste not to use it, because it is quite usable there, but I'm not sure the visual trade-off is worth it even on this part. DLSS is what makes it workable, but the people claiming not to see the difference between native resolution and DLSS are either blind or insane. Even DLSS Quality is a blatantly obvious reduction in clarity vs. native. It's the best upscaler there is currently, but it's still an upscaler.
On the 6800 XT ray tracing is totally out of the question; its RT performance is too low to make it usable in CP2077 past 1080p or so, and the CAS upscaler+sharpener is no match for DLSS in fine details. On the plus side, there is no temptation to use RT on the AMD card, so I can play native 1440p on ultra at 70-110 fps without any hesitation. The 3080 can do this too, of course. The sample I'm using might be slightly slower than the 6800 XT, even after replacing the thermal pads under the backplate (which didn't help memory temps much), but they are close enough to be perceptibly the same.
Anyway, the original prices in the RTX 30xx EVGA queue have expired, but given how much street prices have gone up I'll probably still grab one if I'm ever offered one. Might still try for a 3080 Ti as well, though I have reservations about purchasing a mining-capped part.
This 2x16 kit of Timetec was $120 when I bought it (though the price had gone up to $145 last I checked):
It's a cheap 3600 CL18 bin of Hynix CJR, but with a little tweaking it is faster than using XMP on most kits at three times the price. A good kit of Samsung B-die, Micron E-die, or Hynix DJR will beat it when similar care is given to tuning, but not by enough to be worth it in a price/performance sense, even if one is looking at total system cost.
As for RTX in Cyberpunk, night scenes do look wonderful with all the lights. I don't know how it looks without RT, but with RT it is awesome.
What do you think about those RT ReShade plugins? I am considering trying one on an otherwise undemanding game like Warships.
3080 Ti: unless mining profitability is unsustainable, I guess I am better off staying on the 3080 (I reckon a mining 3080 can easily be worth as much as a 3080 Ti if mining is gimped on the latter). I've achieved a 19,300 graphics score on the first try with a stable OC anyway (I am using it for Flight Simulator, and it runs for hours); the GPU easily stays at 2100 MHz and could possibly go higher. Interestingly, my VRAM refuses to go to 10,300 MHz and beyond, so I guess I have a good(ish) GPU core with average-at-best VRAM on my card.
They are definitely going to limit Ethash on the 3080 Ti, probably cutting performance in half vs. what the hardware is actually capable of.

I don't mine, but this bugs the crap out of me. Why would they even do this? It's pretty scummy.
Memory clocks well on this sample. Switched over to Gminer, made a more aggressive undervolt curve, bumped the memory clock up another couple of multipliers (NVIDIA cards tend to do best at whole-number memory multipliers; they have a 27 MHz reference clock and an internal DRAM clock one quarter of what MSI AB reports, so offsets that are multiples of 108 frequently work well), and maxed out the fan speed to keep the GDDR6X junction temps under control.
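The multiple-of-108 rule above is easy to apply with a tiny helper (this is just a sketch of the post's reasoning: 27 MHz reference clock times the 4x divider between the internal DRAM clock and what MSI AB reports; the function name is made up):

```python
REFERENCE_CLOCK_MHZ = 27   # NVIDIA reference clock, per the post
DRAM_DIVIDER = 4           # MSI AB reports 4x the internal DRAM clock
STEP = REFERENCE_CLOCK_MHZ * DRAM_DIVIDER  # 108 MHz per whole multiplier

def snap_memory_offset(desired_offset_mhz):
    """Round a desired memory offset to the nearest whole multiple of 108 MHz."""
    return STEP * round(desired_offset_mhz / STEP)

print(snap_memory_offset(1000))  # 972
print(snap_memory_offset(850))   # 864
```

So if you were aiming for roughly +1000 MHz in Afterburner, +972 would be the nearest whole-multiplier setting, assuming the 27 MHz figure holds for your card.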
I have a little more room to tune the curve down, but not much. Setting a power limit doesn't help with an aggressive custom curve either: either the limit isn't reached, or the clock throttling causes extra overhead and harms efficiency.
Currently getting 97.5 MH/s at 234 W for the card and about 300 W at the wall for the entire system.
The 6800 XT, fully tweaked, almost matches that efficiency in and of itself (62 MH/s at about 154 W). However, there is more overhead due to having less total hashing power in one place.
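Putting the two cards' figures from these posts side by side (these are the numbers reported above, not independent benchmarks):

```python
# Hashrate-per-watt from the figures quoted in the posts above.
cards = {
    "RTX 3080": (97.5, 234),  # (MH/s, card power in watts)
    "6800 XT":  (62.0, 154),
}

for name, (mhs, watts) in cards.items():
    print(f"{name}: {mhs / watts:.3f} MH/s per watt")
# RTX 3080: 0.417 MH/s per watt
# 6800 XT:  0.403 MH/s per watt
```

Per watt at the card they really are within a few percent of each other; the 3080's advantage is density, since one card in a slot does the work of one and a half 6800 XTs.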
After I get some CPUs that work to my satisfaction, I'll do a gaming comparison between the 3080 and the 6800 XT. It will be interesting to see how they compare in Odyssey.