Hardware & Technical 2080 Teased

The question is whether there is headroom in overclocking. Jensen claims that there is.
Anyway, these results confirm that I made a good decision staying away from this new gen.

You mean that we convinced you to stay away? Don't mention it ;-)

Also, Pascal always had room for OC, though there was a noticeable "glass ceiling" across all boards which seemed to top out at around 2.1GHz, give or take.
 
You mean that we convinced you to stay away? Don't mention it ;-)

Also, Pascal always had room for OC, though there was a noticeable "glass ceiling" across all boards which seemed to top out at around 2.1GHz, give or take.

Well, yes, thanks, it was a good call. :)
I just hope the miner was sincere about the card not running at high temperatures. As mentioned, I did try it out and it works very nicely: the fans spin up without noise and there was literally not a single speck of dust on it. He also had a 100% positive rating on the used market.

Edit: if this is true, it actually raises the question of how and where the supposed GTX 2060 and 2050 will be positioned in the lineup. They must be faster than their predecessors, but not fast enough to threaten the 2070, which will probably be slower than a 1070 Ti? Somehow I have the feeling these will be recycled Pascal cards with GDDR6 memory.
 
The question is whether there is headroom in overclocking. Jensen claims that there is.

There is always at least some headroom.

Even if NVIDIA's GPU Boost were perfect at extracting all practical, stable performance from a given power limit and temperature, power limits can go up (allowing more voltage along with them) and temps can go down.
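
Roughly speaking, the sustained clock is capped by whichever limit bites first, so raising the power limit or dropping temps frees up headroom. A toy sketch of that idea (made-up numbers, not NVIDIA's actual GPU Boost algorithm):

```
# Toy model only: all numbers are made up for illustration.
def sustained_clock_mhz(power_limit_w, temp_c):
    """Hypothetical sustained clock under power and thermal caps."""
    base = 1600                                           # pretend base clock
    power_cap = base + max(0, power_limit_w - 180) * 2    # ~2 MHz per extra watt
    thermal_cap = base + max(0, 84 - temp_c) * 5          # ~5 MHz per degree of cooling
    hard_cap = 2100                                       # the ~2.1 GHz "glass ceiling" mentioned above
    return min(power_cap, thermal_cap, hard_cap)

print(sustained_clock_mhz(180, 84))  # stock limits -> stuck at the base clock
print(sustained_clock_mhz(250, 65))  # more power + cooler card -> more headroom
```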

I just hope the miner was sincere about the card not running at high temperatures. As mentioned, I did try it out and it works very nicely: the fans spin up without noise and there was literally not a single speck of dust on it. He also had a 100% positive rating on the used market.

Most recent cards have very conservative limits on what you can do to them without hardware modding. There is no reason to think a Pascal part used for mining would be in any worse shape, except perhaps when it comes to fans, than one used for much of anything else. Peak practical mining performance, with no regard for efficiency, means the card was run with 100% fan speed 24/7 and was lightly overvolted, but saw essentially no thermal cycling and fairly cool temps. Peak mining efficiency means the card was very cool, probably undervolted significantly, and also had minimal thermal cycling.

Thermal cycling and the resulting mechanical stresses on solder joints are what's going to kill the majority of GPUs. A card used for gaming a few hours a day is likely to see many cycles with a ~40C temp delta and may see a daily peak delta of 60C or more. A card that is mining either sits at a fixed temperature target (to ensure predictable hashrates) or varies slightly with ambient temperature. I have cards that were mined on for twenty or thirty thousand hours which saw fewer lifetime thermal cycles than I can get in a weekend of playing Elite: Dangerous.
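
Back-of-the-envelope version of that, using a Coffin-Manson style weighting for solder fatigue (damage growing roughly with the square of the temperature swing; the exponent and cycle counts below are illustrative guesses, not measurements):

```
# Rough sketch: why cycle count and temperature swing matter more than total hours.
K = 2.0  # assumed fatigue exponent for solder joints

def relative_damage(cycles, delta_t_c):
    """Relative accumulated fatigue damage: cycles weighted by deltaT**K."""
    return cycles * (delta_t_c ** K)

# Gaming card: say two sessions a day for three years with a ~50C swing each.
gaming = relative_damage(cycles=2 * 365 * 3, delta_t_c=50)

# Mining card: a handful of restarts per year with a full ~50C swing,
# plus small daily ambient-driven wiggles of a few degrees.
mining = relative_damage(10 * 3, 50) + relative_damage(365 * 3, 5)

print(f"gaming vs mining damage: roughly {gaming / mining:.0f}x")
```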
 
Plausible leaked RTX 2080 3DMark Time Spy score: https://wccftech.com/nvidia-geforce...ghz-and-beats-a-gtx-1080-ti-without-ai-cores/

For the record, that's slightly faster than a stock 1080 Ti and slightly slower than a stable, on air, OCed one.

Hmm, interesting. Off to bench my system and see how close I get to a 2080's score; never tested Time Spy before, normally more of a Fire Strike guy :)

*edit*
Pretty much identical, with the 1080 Ti core clock being a mere 25MHz lower than the 2080's.

2080:
Graphics score: 10030
- Test 1: 64.53 fps
- Test 2: 58.17 fps

- CPU score (not really relevant): 4652
- CPU test 1: 15.63 fps

1080 Ti:
Graphics score: 10099
- Test 1: 64.59 fps
- Test 2: 58.89 fps

- CPU score (not really relevant): 8991
- CPU test 1: 30.21 fps

www.3dmark.com/3dm/28284892?
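
Quick sanity check on how close those two graphics scores really are:

```
# Using the scores above: the 1080 Ti's lead is well within run-to-run noise.
score_2080 = 10030
score_1080ti = 10099
print(f"1080 Ti leads by {(score_1080ti - score_2080) / score_2080 * 100:.2f}%")  # ~0.69%
```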
 
I see the RTX 2080 at $799 and we can have the GTX 1080 for 500 euros

I do not see a difference of 60-70% here

That's because you're comparing US price to EU price and forgetting VAT and... umm... "ocean tax" (because I cannot for the life of me explain why the Ti costs a whopping 1000PLN more than US price + VAT... ;-)

PS: updated previous post with local offerings to better reflect the price difference.
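
For the curious, the "US price + VAT" comparison works out roughly like this (23% is Poland's standard VAT rate; the exchange rate and the US list price are placeholders for illustration only):

```
# Placeholder numbers for illustration - only the 23% VAT rate is the real figure.
def us_price_plus_vat_in_pln(usd_price, vat=0.23, pln_per_usd=3.8):
    """US list price converted to PLN with VAT added (exchange rate assumed)."""
    return usd_price * pln_per_usd * (1 + vat)

assumed_ti_usd = 999  # assumed US list price for a partner Ti card
print(f"expected local price: ~{us_price_plus_vat_in_pln(assumed_ti_usd):.0f} PLN")
# Anything far above that number is the "ocean tax" being joked about.
```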
 
That's because you're comparing US price to EU price and forgetting VAT and... umm... "ocean tax" (because I cannot for the life of me explain why the Ti costs a whopping 1000PLN more than US price + VAT... ;-)

PS: updated previous post with local offerings to better reflect the price difference.

Yes, roughly 60-70% overall then

:)
 
If you take Nvidia's price for the Founders Edition of the GTX 1080 of $549, then the current list price of the RTX 2080 will be 45% more. But if you take the fact that I just picked up an EVGA GTX 1080 FTW for $429...

The ray tracing looks awesome in Battlefield V but the hit in performance is a terrible price to pay, and how much of it do you see when playing the game anyway? I'm usually too busy looking for enemies and returning fire to admire the view. And all that extra power it takes to render the awesomeness will likely result in my death if I round a corner into an opposing player who is running 144Hz and settings turned down. My 1080 runs BFV at 144Hz with everything on ultra, buttery smooth as well, so why would I trade that for something that can barely run 60 fps?

I was excited about deep learning until I found out it is something that needs to be enabled... It would be awesome to have that in ED with VR but will the Cobra engine even support it? Not likely for now and I would say it will be a while before it does.

So the two games I play or will play the most either don't support it or I'm not going to gain anything over what I have. I don't believe the visuals will be much better on the Rift, so I'm not going to pony up $370 for a 37% gain.
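
For clarity, the arithmetic behind those percentages and the $370:

```
# Numbers taken from this post.
rtx_2080_preorder = 799   # current RTX 2080 price
gtx_1080_fe_msrp = 549    # GTX 1080 Founders Edition price
gtx_1080_deal = 429       # the EVGA GTX 1080 FTW actually paid

premium = (rtx_2080_preorder - gtx_1080_fe_msrp) / gtx_1080_fe_msrp * 100
print(f"2080 vs 1080 FE list: +{premium:.1f}%")                              # ~45%, as above
print(f"2080 vs the $429 deal: ${rtx_2080_preorder - gtx_1080_deal} extra")  # the $370
```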
 
The ray tracing looks awesome in Battlefield V but the hit in performance is a terrible price to pay, and how much of it do you see when playing the game anyway? I'm usually too busy looking for enemies and returning fire to admire the view. And all that extra power it takes to render the awesomeness will likely result in my death if I round a corner into an opposing player who is running 144Hz and settings turned down. My 1080 runs BFV at 144Hz with everything on ultra, buttery smooth as well, so why would I trade that for something that can barely run 60 fps?

The importance of this release is that they're trying to shift the whole paradigm of rendering graphics in games - it's a good thing. It makes for exciting possibilities once a) developers get to know the tech and find ways to utilise it to its maximum potential and b) better hardware arrives. It also puts more requirements on you (a developer) while giving you much more freedom in other areas. A lot of things in the current model are faked or abstracted for the sake of performance, and they disappear completely with ray tracing. Which also means, for example, that you would need to be very careful with reflections giving away your shenanigans - like how in a Hollywood western town all the buildings are facades ;-)

I am excited for this generation of RTX cards as a programmer, because I see loads of potential, especially when they step up to the 7nm process. I am not at all excited by RTX cards as a consumer, because they do not provide any significant advantage for my current gaming setup (or at least not one significant enough to warrant the hefty price tag). One thing is for sure - we will experience higher fidelity in games. Whether it's worth the birthing pains and an investment bordering on extortion is up to the reader to decide :D
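
To make the reflections point concrete, here's a tiny toy tracer (pure illustration, not from any real engine): the mirror bounce is just another ray traced into the same scene, so it hits real geometry instead of a pre-baked image - exactly why facade-style fakery stops working:

```
import math

# Toy scene: two spheres above a perfectly reflective floor at y = 0.
SPHERES = [  # (center, radius, colour)
    ((0.0, 1.0, 5.0), 1.0, (1.0, 0.2, 0.2)),   # red
    ((2.5, 0.5, 4.0), 0.5, (0.2, 0.2, 1.0)),   # blue
]

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def add(a, b): return tuple(x + y for x, y in zip(a, b))
def mul(a, s): return tuple(x * s for x in a)

def hit_sphere(origin, direction, center, radius):
    """Smallest positive ray parameter t where the ray hits the sphere, else None."""
    oc = sub(origin, center)
    a = dot(direction, direction)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 1e-4 else None

def trace(origin, direction, depth=0):
    if depth > 3:
        return (0.0, 0.0, 0.0)
    if direction[1] < 0:                       # ray heading down -> hits the floor
        t = -origin[1] / direction[1]
        point = add(origin, mul(direction, t))
        bounce = (direction[0], -direction[1], direction[2])  # mirror around the floor normal
        # The reflection is just another ray traced into the same scene -
        # no cube map, no screen-space trick, no baked image.
        return mul(trace(point, bounce, depth + 1), 0.8)
    best = None
    for center, radius, colour in SPHERES:
        t = hit_sphere(origin, direction, center, radius)
        if t is not None and (best is None or t < best[0]):
            best = (t, colour)
    return best[1] if best else (0.5, 0.7, 1.0)  # sky

# This ray points downwards, so it only ever sees the red sphere via the
# floor's reflection - geometry the "camera" never looks at directly.
print(trace((0.0, 1.0, 0.0), (0.0, -0.3, 0.95)))
```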
 
I get what they are trying to do, and I'm reasonably excited that it is happening... But I play FPS shooters and have been considering moving up to 1440p now that I've got the GTX 1080, and the thought of playing 1080p at 30-60 fps is a non-starter for me. Others may think differently, but once you play at 144Hz it's hard to go back...
 
If you take Nvidia's price for the Founders Edition of the GTX 1080 of $549, then the current list price of the RTX 2080 will be 45% more. But if you take the fact that I just picked up an EVGA GTX 1080 FTW for $429...

The ray tracing looks awesome in Battlefield V but the hit in performance is a terrible price to pay, and how much of it do you see when playing the game anyway? I'm usually too busy looking for enemies and returning fire to admire the view. And all that extra power it takes to render the awesomeness will likely result in my death if I round a corner into an opposing player who is running 144Hz and settings turned down. My 1080 runs BFV at 144Hz with everything on ultra, buttery smooth as well, so why would I trade that for something that can barely run 60 fps?

I was excited about deep learning until I found out it is something that needs to be enabled... It would be awesome to have that in ED with VR but will the Cobra engine even support it? Not likely for now and I would say it will be a while before it does.

So the two games I play or will play the most either don't support it or I'm not going to gain anything over what I have. I don't believe the visuals will be much better on the Rift, so I'm not going to pony up $370 for a 37% gain.

It is true that prices on the GTX 1080 are interesting at the moment.
 
I see the RTX 2080 at $799 and we can have the GTX 1080 for 500 euros

I do not see a difference of 60-70% here

If you take Nvidia's price for the Founders Edition of the GTX 1080 of $549, then the current list price of the RTX 2080 will be 45% more. But if you take the fact that I just picked up an EVGA GTX 1080 FTW for $429...

I can reliably find new GTX 1080s for 430-460 USD in the States, open-box or refurbished ones for 400, and used ones in good condition for 300-350. The lowest-priced RTX 2080 pre-order I've seen is 750 USD.

once you play at 144Hz it's hard to go back...

I ran most games at 120-170Hz on high-end CRTs from the late 1990s to about 2008, when my last Trinitron became too worn to continue using... about 60k hours on the tube, brightness was about 30% of new, and I had the HUDs of a half-dozen games burnt into the phosphors.

Mostly using 60Hz LCDs now. VA panels are finally fast enough to shoot for something higher, though I might wait for OLEDs at this point as I'm fairly content with my current display.
 