2080 Teased

If you take Nvidia's $549 Founders Edition price for the GTX 1080, the current list price of the RTX 2080 is about 45% more. And that's before you factor in that I just picked up an EVGA GTX 1080 FTW for $429...

The ray tracing looks awesome in Battlefield V, but the hit in performance is a terrible price to pay, and how much of it do you actually see when playing the game anyway? I'm usually too busy looking for enemies and returning fire to admire the view. And all that extra power it takes to render the awesomeness will likely result in my death if I round a corner into an opposing player who is running at 144 Hz with his settings turned down. My 1080 runs BFV at 144 fps with everything on ultra, buttery smooth as well, so why would I trade that for something that can barely hold 60 fps?

I was excited about the deep learning stuff (DLSS) until I found out it is something that needs to be enabled per game... It would be awesome to have that in ED with VR, but will the Cobra engine even support it? Not likely for now, and I would say it will be a while before it does.

So the two games I play, or will play, the most either don't support it or won't gain me anything over what I have. I don't believe the visuals will be much better on the Rift either, so I'm not going to pony up $370 for a 37% gain.
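For what it's worth, here's the arithmetic behind those price figures as a quick Python sketch. The $799 RTX 2080 Founders Edition list price is my assumption, since it isn't quoted above:

[code]
# Quick sketch of the price math above. The $799 RTX 2080 Founders
# Edition list price is an assumption (it isn't quoted in this thread).
gtx_1080_fe = 549    # Nvidia's GTX 1080 Founders Edition list price
rtx_2080_fe = 799    # assumed RTX 2080 Founders Edition list price
evga_1080_ftw = 429  # what I actually paid for an EVGA GTX 1080 FTW

# Premium over the old Founders Edition list price (~45%)
print(f"vs. $549 FE: +{(rtx_2080_fe / gtx_1080_fe - 1) * 100:.1f}%")

# Premium over the street price I actually paid ($370, ~86%)
print(f"vs. $429 street: +${rtx_2080_fe - evga_1080_ftw} "
      f"(+{(rtx_2080_fe / evga_1080_ftw - 1) * 100:.1f}%)")
[/code]

So against a street-priced 1080, the 2080's premium is closer to 86% than 45%.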

I hope this launch isn't setting a disappointing precedent that every performance gain will cost more and more. It will be interesting to see which direction GPU pricing trends over the next 12 months or so.
As much as I love the experience of PC gaming, if the next generation goes on like this, I will seriously consider switching back to new-gen consoles... I just can't justify spending north of $2k (excluding peripherals) on a gaming rig.
 

GamersNexus speculates that this is done on purpose, to give the current generation of Pascal cards room to breathe until stocks are cleared:

[video=youtube;ma1gh-21diQ]https://www.youtube.com/watch?v=ma1gh-21diQ[/video]
 

Probably.
But nVidia could have dropped the 10-series pricing as well, or simply rebadged the 10-series and slotted them (perhaps with upgraded GDDR6 RAM) below the RTX cards.
What happened to the incremental performance gain?
 
But nVidia could have dropped the 10-series pricing as well

They could have, but chances are they will sell out without any formal price drops before 7nm parts show up, and the Turing lineup is low-volume enough that demand, even at inflated prices, is likely to exceed supply.

or simply rebadged the 10-series and slotted them (perhaps with upgraded GDDR6 RAM)

It's a lot of work to rebadge stuff that's already in distribution channels, and giving Pascal GDDR6 probably isn't practical. Yes, GDDR6 is quite similar to GDDR5X, but such a change would almost certainly either necessitate a new memory controller or limit the GDDR6 to near-GDDR5X speeds. Since GDDR6 is more expensive, there is no incentive to do a costly respin of silicon, pair it with cost-increasing memory for minimal performance gains, and fill a lineup that is already saturated, in a generation of parts unlikely to last more than 9-12 months.
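To put rough numbers on the "near-GDDR5X speeds" point, a back-of-the-envelope bandwidth sketch in Python. The 11 and 14 Gb/s per-pin rates and 352-bit bus widths are the commonly quoted specs; the 12 Gb/s figure for a hypothetical controller-limited GDDR6 Pascal is purely an assumption:

[code]
# Peak memory bandwidth in GB/s = per-pin data rate (Gb/s) * bus width / 8.
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gb_s(11, 352))  # GTX 1080 Ti, 11 Gb/s GDDR5X -> 484 GB/s
print(bandwidth_gb_s(14, 352))  # RTX 2080 Ti, 14 Gb/s GDDR6  -> 616 GB/s

# A hypothetical GDDR6 Pascal held near GDDR5X speeds by the old
# memory controller (assumed 12 Gb/s): minimal gain for extra cost.
print(bandwidth_gb_s(12, 352))  # -> 528 GB/s
[/code]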

Probably wiser and more investor-friendly to just milk what they can from whom they can until AMD and/or Intel show up with competitive parts.
 

Well, since they have already downgraded memory at the lower end, I figured they could have done it the other way around... :)

In any case, even if the product strategy adds up, and even if the pricing makes sense (as a sort of early-adopter tax on top of the existing cards), the communication around the launch failed, as it left people confused.
It left me buying a second-hand card, and also considering the unthinkable: going back to consoles with the next gen.
 

I think that launch wasn't aimed at gamers at all. Why the massive supercomputer focus and all that blabbing about film-making, while not showing an ounce of gaming performance from the new card? Rumours are it is ~30% faster than the 1080 Ti in real-life gaming. We will see soon enough anyway. It is also more power hungry and much, much more expensive. Let's face it, this launch is for tech whales and smaller content-creation studios who can't afford a real Quadro ;-)

Also... you bought a "2nd hand card" which is currently still the fastest on the market. How you went from that to going back to consoles, which are nowhere near the level of power you're sitting on now, is beyond me ;-)
 
Some benchmarks starting to show up: https://videocardz.com/77983/nvidia-geforce-rtx-2080-ti-and-rtx-2080-official-performance-unveiled

Looks like a ~15% advantage for the RTX 2080 FE vs. the 1080 Ti at 4K in the titles tested, and a ~40% advantage for the RTX 2080 Ti.

1080 Ti performance looks a bit low in some of the tests, but nothing too glaring. Results for the RTX cards are better than the more pessimistic estimates, especially for the 2080 Ti, but it doesn't look very good for relative performance per dollar. Still, if you need the fastest there is...
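As a back-of-the-envelope read on performance per dollar from those leaked deltas, a small Python sketch. The 1.15x and 1.40x multipliers are the ~15% and ~40% figures above; the launch list prices are my assumption:

[code]
# Relative 4K performance normalised to a 1080 Ti = 1.00 baseline,
# using the rough deltas from the leak; prices are launch list prices
# (assumed figures).
cards = {
    "GTX 1080 Ti":    (1.00, 699),
    "RTX 2080 FE":    (1.15, 799),
    "RTX 2080 Ti FE": (1.40, 1199),
}

for name, (perf, price) in cards.items():
    print(f"{name:>14}: {perf:.2f}x perf, ${price}, "
          f"{perf / price * 1000:.2f} perf per $1000")
[/code]

By that crude measure the 2080 FE roughly matches the 1080 Ti per dollar, while the 2080 Ti comes out nearly 20% worse.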
 
Still, if you need the fastest there is...

Yep, especially for VR it's never enough, so even 20% better performance is something.

The price is insane, though. Not worth it for me at the moment.

Also, getting a 2080 Ti for me would mean passing my 1080 Ti to my son, and I'd rather have him concentrating on his studies this year than give him even more reason to procrastinate; his 980 Ti should be enough for a while.
 
I will wait for independent benchmarks; those are from the "official RTX review guide" lol :D

Yes, and they are using NVIDIA's 'recommended' titles, which are probably the games they had time to optimize most heavily for in the review drivers. I also have a feeling the 1080 Ti used is power-limit capped in some of these tests.

Still, it should be largely representative.
 
I'm still undecided whether I'm underwhelmed by the less-than-stellar generational performance jump, or impressed that a £1,300 2080 Ti beats a £2,700 Titan V.

We will know the full story tomorrow at 2pm UK time when the review embargo lifts...

I am torn between being slightly surprised that they charge $1,300 for a GPU with no more of a performance lead than it has...

Or that people are actually buying them.
 
impressed that a £1,300 2080 Ti beats a £2,700 Titan V

The Titan V doesn't have proper drivers to fully exploit its potential, if I'm not mistaken.
 

Slightly OT, but I didn't want to bump the other thread:
I did undervolt the MSI Armor 1080 Ti, and it is stable at 1.95 GHz and 1.0 V; temperature does not rise above 65°C at about 70-73% GPU usage (I left a bit of headroom in the settings for more demanding scenarios such as planetary settlements). For a second-hand, high-depreciation 1080 Ti (given the Armor cooler's reputation), I think it is actually proving to be a solid decision.
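If anyone wants to sanity-check an undervolt the same way, here's a minimal monitoring loop using NVIDIA's NVML Python bindings. Just a sketch; it assumes the card is GPU index 0, and Afterburner's own graphs will do the same job:

[code]
# Minimal GPU monitoring sketch via NVIDIA's NVML Python bindings
# (pip install pynvml). Assumes the 1080 Ti is device index 0.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)    # MHz
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)  # deg C
        util = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu                      # %
        watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000                        # mW -> W
        print(f"{clock} MHz  {temp} C  {util}% load  {watts:.0f} W")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
[/code]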
 

As a rule of thumb, if Pascal boosts around 2 GHz then you are utilising it properly ;-) And at 1 V - nice. You can try to tweak the voltage curve in MSI Afterburner if you wish (Ctrl+F) and flatten it after the ~2 GHz / 1 V point. It all depends on the card, of course. Values over 2 GHz are usually only good for e-peen measures, and they produce lots of heat, increasing the chance of throttling.
 
I'm still undecided whether I'm underwhelmed by the less-than-stellar generational performance jump...

This generational jump isn't about rasterization performance, but about shifting the paradigm of computer-generated graphics back to its ray-traced roots... And it is awesome, really. People need to understand that this generation will change gaming fidelity forever within just a few years. Does that mean you should go out and buy one as a regular consumer/gamer? Absolutely not. The 1080 Ti is a much better value/performance proposition, unless you need the last ounce of power from the GPU, a CPU to feed it, and a game able to stress them both. No, Elite isn't one ;-)
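For anyone wondering what "ray-traced roots" means in practice: instead of rasterising triangles, you fire a ray per pixel and test it against the scene. A toy ray-sphere intersection in Python, purely illustrative (the RT cores accelerate the equivalent ray-triangle/BVH tests in hardware):

[code]
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere hit, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2 for t >= 0,
    a quadratic in t. This per-ray test is the primitive ray tracing
    is built from.
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                       # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2 * a)  # nearest intersection
    return t if t >= 0 else None

# One primary ray from a camera at the origin, looking down -z,
# at a unit sphere 5 units away: hits the front face at t = 4.0.
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))
[/code]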
 
Reviews are up...
[video=youtube_share;dLjQR0UFUd0]https://youtu.be/dLjQR0UFUd0[/video]

This image says it all for me... the 2080 Ti is a monster, an expensive monster, but still...
[image: r4VYxC1.jpg]
 
Performance more or less where it was expected to be.

The 2080 is meh, as its street price is a solid $100 higher than the 1080 Ti's and it's barely faster.

The 2080 Ti takes the raw performance crown, as we knew it would, but 30-40% more performance than a 1080 Ti for 80% more money is only going to appeal to those who absolutely cannot get by with any other single GPU, or to those who never cared what it was going to cost.
 
This image says it all for me... the 2080 Ti is a monster, an expensive monster, but still...

To me this is bang on what was predicted; if anything, the 2080 doesn't even look the 10-15% faster than the 1080 Ti that was pretty much suggested. And the new cards have been OC'd as well.
 