2080 Teased

Not to mention it puts all the pressure on the devs. DX11 is still king thanks to the driver teams at the respective vendors being largely responsible for the low-level coding, leaving game devs to just make games.

Sorry for not responding earlier, I missed your post. While I am a software dev myself (not gamedev, though), I kinda miss the time when pressure on devs was a given because of constrained hardware (think Commodore/Amiga/early PC days). Nowadays I get the feeling that optimisation is something that more often than not falls outside the scope, and games just up the requirements because they can. A low-level API could at least raise the quality bar a bit... Of course, with more power comes more responsibility and all that... but I fondly remember dabbling in assembler, writing intros for the Amiga demoscene :p Interestingly enough, we had some simple real ray-traced demos at that time :) Very basic, but for the time they were amazing :)

Impressive tech demo, even if half of the content was a rehash of previous stuff. Still, it seems like my discount 1080 Ti will prove to be a wise purchase.
Given that virtually all performance figures being hyped are for RTX games, using new terminology, I'm pretty confident that these parts are going to perform very similarly to Pascal, per clock, per functional unit, in non-RTX content.

The whole presentation didn't mention a single thing about non-RTX performance. IMHO, if they had made a technological leap, it would be plastered with graphs all over the place. Plus it's still the same process as Pascal. And the TDP is huuuge.

I do hold out some hope that the 2070 will in fact be as powerful as a Titan Xp/1080 Ti in non-RTX stuff, since his wording was exactly the same as at the 1070 announcement, where he casually threw in "oh, and it's faster than a Titan X (Maxwell)".

I highly doubt it; the difference in CUDA cores between the 1080 Ti and the 2080 Ti is only about 21%. Unless that thing overclocks like crazy (according to Jensen it could), I doubt there will be a price-justifying difference between the two, let alone between the 1080 Ti and the 2070.
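If anyone wants to sanity-check that, here's a quick back-of-the-envelope comparison in C. The core counts are the commonly quoted launch specs and are used here purely as assumptions - this says nothing about per-core or per-clock improvements.

[CODE]
#include <stdio.h>

/* Rough core-count comparison only; counts are the commonly quoted
 * launch specs, treated here as assumptions. */
int main(void)
{
    const int gtx1080ti = 3584;   /* GTX 1080 Ti */
    const int rtx2080ti = 4352;   /* RTX 2080 Ti */
    const int rtx2080   = 2944;   /* RTX 2080    */
    const int rtx2070   = 2304;   /* RTX 2070    */

    printf("2080 Ti vs 1080 Ti: %+.1f%% cores\n", 100.0 * rtx2080ti / gtx1080ti - 100.0);
    printf("2080    vs 1080 Ti: %+.1f%% cores\n", 100.0 * rtx2080   / gtx1080ti - 100.0);
    printf("2070    vs 1080 Ti: %+.1f%% cores\n", 100.0 * rtx2070   / gtx1080ti - 100.0);
    return 0;   /* prints roughly +21.4%, -17.9%, -35.7% */
}
[/CODE]

So the 2070 and 2080 would need big per-core gains just to match a 1080 Ti in non-RTX workloads.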

Kinda feel like my 1080 Ti became obsolete overnight :D Obviously it's not true, but that's the power of marketing. I want a new toy now...

RESIST THE DARK SIDE! ;-)

Can I haz ur stuff?

;) Not really - running a 1080 right now for VR, and that pretty much maxes out my setup. Upgrading the GPU would mean upgrading the PSU, and then my old CPU would be the bottleneck; changing that would require a new motherboard...

Depending on the CPU, the bottleneck may or may not be a problem. This is from an i5-3570K @ 4.4 GHz turbo OC owner, now sporting a 1080 Ti. Sure - I get frame drops sometimes... from 120 to a staggering 104 fps in Shadow of the Tomb Raider at maxed 1080p settings. I can also run that game at 4K60 (mostly - there are scenes where performance dips heavily despite both CPU and GPU being "bored" according to RTSS, which is weird). Didn't try Elite yet, but I might in the not-so-distant future, as I am changing my HOTAS system.

But I can already see 1080 Ti prices dropping on the used market, so it will be an uphill battle to keep Pascal pricing untouched.
I'm itching to grab a 1080 Ti in the next two weeks. Even if the per-core performance of Turing blows Pascal out of the water, the sheer CUDA core count of the 1080 Ti versus the 2080, plus the extra memory, tells me that any performance gain must remain minimal for almost twice the price of a used 1080 Ti. Hell, I can get a decent monitor with the price difference.

We will see what vendors do. We're seeing some extreme price gouging at the moment - the Founders Editions cost a whopping $1,200... I never saw my 1070 selling at Nvidia's recommended MSRP. For sure there will be movement in the used card market, and I'm glad I jumped on that :) As for having new bleeding-edge tech... well, as you yourself noticed, dev adoption will take a while. So grab that 1080 Ti and enjoy. I think it will remain the king of price/performance for now.

Overall, I was overjoyed watching that presentation. The return of ray tracing to real-time computer graphics is a welcome one, and it will take games to yet another level of fidelity. I am glad Nvidia is trying to change the paradigm. Let's hope AMD catches up someday too, so we don't have these ridiculous prices in the future.
 
We will see what vendors do. We're seeing some extreme price gouging at the moment - the Founders Editions cost a whopping $1,200... I never saw my 1070 selling at Nvidia's recommended MSRP. For sure there will be movement in the used card market, and I'm glad I jumped on that :) As for having new bleeding-edge tech... well, as you yourself noticed, dev adoption will take a while. So grab that 1080 Ti and enjoy. I think it will remain the king of price/performance for now.

Overall, I was overjoyed watching that presentation. The return of ray tracing to real-time computer graphics is a welcome one, and it will take games to yet another level of fidelity. I am glad Nvidia is trying to change the paradigm. Let's hope AMD catches up someday too, so we don't have these ridiculous prices in the future.

No doubt the whole thing, especially the AI side, is the work of geniuses and will probably yield extraordinary dividends in gaming tech, but as usual there are value issues, as with any other tech in its early stages.
Yeah, the more I think about it, the more convinced I am about grabbing a 1080 Ti.

Edit: That's it, I pulled the trigger on a second-hand MSI Armor 1080 Ti with a two-year warranty, completing this week's Rift acquisition. :) The difficult part will be enduring the next two weeks until I can actually set up my rig again.
 
Interestingly enough, we had some simple real ray-traced demos at that time :) Very basic, but for the time they were amazing :)

Back in the day I remember the Juggler and everyone going "Ooohh!" at that - but there was another Amiga demo that was utterly jawdropping and I cannot for the life of me remember what it was.

Vector dancing girl is the closest I can describe it.
 
But I can already see 1080 Ti prices dropping on the used market, so it will be an uphill battle to keep Pascal pricing untouched.
I'm itching to grab a 1080 Ti in the next two weeks. Even if the per-core performance of Turing blows Pascal out of the water, the sheer CUDA core count of the 1080 Ti versus the 2080, plus the extra memory, tells me that any performance gain must remain minimal for almost twice the price of a used 1080 Ti. Hell, I can get a decent monitor with the price difference.

Anyway, buying an RTX card only really makes sense if the CPU isn't a bottleneck for the GPU.

For example, with my 4th-generation Intel CPU (4770K), I wonder if it would be wise to buy an RTX 2070 or even an RTX 2080.

If it's going to be held back by a bottleneck, it's not worth the money.
 
The whole presentation didn't mention a single thing about non-RTX performance. IMHO, if they had made a technological leap, it would be plastered with graphs all over the place. Plus it's still the same process as Pascal. And the TDP is huuuge.

It's not precisely the same process; it's a half-node (12nm) shrink of TSMC's 16nm: http://www.tsmc.com/english/dedicatedFoundry/technology/16nm.htm

Likely all the extra die area from that shrink, and then some, went to the Tensor and RT cores, which is the only technological leap.

Probably a wise move by NVIDIA, really. They needed an opening to introduce RTX without making a part that would be a performance regression elsewhere, or have its teeth kicked in by the competition. Since there is no competition this generation, they took the earliest shrink they could, built the biggest chip they could, and went all in on RTX, both technologically and marketing-wise.

Now they've upped the bar, so when TSMC's 7nm hits and RTX doesn't need such a large portion of available transistors, they're hoping they'll be ahead of whatever AMD and Intel can bring to the table.

Anyway, buying an RTX card only really makes sense if the CPU isn't a bottleneck for the GPU.

For example, with my 4th-generation Intel CPU (4770K), I wonder if it would be wise to buy an RTX 2070 or even an RTX 2080.

If it's going to be held back by a bottleneck, it's not worth the money.

I'm running a 4.25GHz Haswell based part (i7-5820K, so I do have a few extra cores and 7MiB more L3 cache, but in practice not much faster than a similarly clocked 4770K in any but the most well threaded of titles) in my main system, and I'm generally completely GPU limited with a modestly OCed 1080 Ti at higher resolution/IQ.

Elite: Dangerous, for example, needs to be running around 160-180 fps (and not in a system map or Galnet) before the main render thread starts becoming CPU limited and increasing my CPU overclock starts to improve things.

I would probably be CPU limited in normal space at 4k with an RTX 2080 Ti, but not in rings, in starports, or on the surface of planets.
 
I'm running a 4.25GHz Haswell based part (i7-5820K, so I do have a few extra cores and 7MiB more L3 cache, but in practice not much faster than a similarly clocked 4770K in any but the most well threaded of titles) in my main system, and I'm generally completely GPU limited with a modestly OCed 1080 Ti at higher resolution/IQ.

Elite: Dangerous, for example, needs to be running around 160-180 fps (and not in a system map or Galnet) before the main render thread starts becoming CPU limited and increasing my CPU overclock starts to improve things.

I would probably be CPU limited in normal space at 4k with an RTX 2080 Ti, but not in rings, in starports, or on the surface of planets.

It's difficult to keep a configuration really balanced.

You'd have to change components several times per decade.

In any case, it's good news that the 4770K still holds the road with a GTX 1080 Ti, if I understand correctly that you are GPU limited in typical use with your 5820K.

:)
 
Back in the day I remember the Juggler and everyone going "Ooohh!" at that - but there was another Amiga demo that was utterly jawdropping and I cannot for the life of me remember what it was.

Vector dancing girl is the closest I can describe it.

I knew what you were talking about but I admit I had to google it ;-)
[video=youtube;wCc5ZHqwdXY]https://www.youtube.com/watch?v=wCc5ZHqwdXY[/video]

Also to see what was the result of hardware pressure on the "devs" back then, see this:
[video=youtube;9XCFqZEOXb8]https://www.youtube.com/watch?v=9XCFqZEOXb8[/video]
This ran pretty well on an Amiga 1200, which had a Motorola 68EC020 running at around 14 MHz, no 3D hardware acceleration, and the need to convert a PC-like "chunky" display (pixels written as bytes, each containing an index into the colour palette) into "planar" form - the Amiga display was composed of bitplanes, because it was designed for scrolling graphics, and all the graphics chips were specialised for scrollers...
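In case anyone is curious what that chunky-to-planar step actually involves, here's a minimal C sketch of the idea - illustrative only, with a made-up function name; the real demos did this with heavily optimised 68k assembler, not a naive loop like this:

[CODE]
#include <stdint.h>

/* Illustrative chunky-to-planar conversion: each chunky byte holds a palette
 * index; each bitplane holds one bit of that index for every pixel, packed
 * 8 pixels per byte, MSB = leftmost pixel. Assumes width is a multiple of 8. */
void chunky_to_planar(const uint8_t *chunky,  /* width*height palette indices            */
                      uint8_t *planes[],      /* depth pointers, width/8*height bytes each */
                      int width, int height, int depth)
{
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            uint8_t pixel = chunky[y * width + x];
            int     byte  = y * (width / 8) + x / 8;
            uint8_t mask  = 0x80 >> (x & 7);
            for (int p = 0; p < depth; p++) {
                if (pixel & (1 << p))
                    planes[p][byte] |= mask;    /* set this pixel's bit in plane p   */
                else
                    planes[p][byte] &= ~mask;   /* clear it                          */
            }
        }
    }
}
[/CODE]

The hard part on a real A1200 was doing that fast enough every frame to feed the display, which is why fast c2p routines became an art form of their own.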

And it was in 1998. Twenty years ago. Another marvel created by young, idealistic minds. Same year, same hardware :D
[video=youtube;6Meqv1eTVeM]https://www.youtube.com/watch?v=6Meqv1eTVeM[/video].

I hope you enjoy, and appreciate where we are today :) The new RTX cards are cool, visionary even. But still a generation to skip for anyone with a 1070 or up, in my opinion :)

EDIT: for anyone too young to know, these were real-time programs running on a now antiquated computer with feeble hardware (by today's standards).
 
EDIT: for anyone too young to know, these were real-time programs running on a now antiquated computer with feeble hardware (by today's standards).

Ahh - back in the 680x0 days, when coding meant bashing metal, a DMA transfer involved no OS interaction because it was invoked at the actual hardware level, and peeps were still in real control of their data!

Right - that's it. I'm going to dump a TCP/IP stack on Workbench. Who wants to instance through it?
 
Ahh - back in the 680x0 days, when coding meant bashing metal, a DMA transfer involved no OS interaction because it was invoked at the actual hardware level, and peeps were still in real control of their data!

Right - that's it. I'm going to dump a TCP/IP stack on Workbench. Who wants to instance through it?

Yeah, the best part was the "disable OS" routine - then you were on your own. Complete freedom. And complete eff-up possibilities, but you knew it was your fault :D
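For the youngsters: "taking over the system" largely meant writing straight to the custom chip registers yourself. A rough C sketch of the DMA side of it - register offsets and bit values are from memory of the Hardware Reference Manual, and the function names are made up, so treat it as illustrative rather than authoritative:

[CODE]
#include <stdint.h>

/* Amiga custom chips live at 0xDFF000; DMA channels are switched on and off
 * by writing bit masks to DMACON. Illustrative sketch only. */
#define CUSTOM_BASE  0xDFF000UL
#define DMACONR  (*(volatile uint16_t *)(CUSTOM_BASE + 0x002)) /* read:  DMA status  */
#define DMACON   (*(volatile uint16_t *)(CUSTOM_BASE + 0x096)) /* write: DMA control */

#define DMAF_SETCLR  0x8000  /* bit 15: 1 = set the written bits, 0 = clear them */
#define DMAF_MASTER  0x0200  /* master DMA enable      */
#define DMAF_RASTER  0x0100  /* bitplane (display) DMA */
#define DMAF_COPPER  0x0080  /* copper DMA             */

static uint16_t saved_dma;

void take_over(void)                      /* the "disable OS" moment             */
{
    saved_dma = DMACONR & 0x07FF;         /* remember which channels were active */
    DMACON = 0x7FFF;                      /* SETCLR = 0: switch everything off   */
    DMACON = DMAF_SETCLR | DMAF_MASTER | DMAF_RASTER | DMAF_COPPER;
}

void give_back(void)
{
    DMACON = 0x7FFF;                      /* off again...                        */
    DMACON = DMAF_SETCLR | saved_dma;     /* ...then restore what the OS had     */
}
[/CODE]

Proper startup code also had to park the OS politely first (Forbid(), saving and restoring the view, and so on) - which is exactly where the "complete eff-up possibilities" came in.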
 
EVGA have already announced and shown their RTX cards: https://www.evga.com/articles/01249/evga-geforce-rtx-20-series/
Nothing on prices yet. However, going by their FTW3 and ICX series of the past gen, they were quite a bit pricier than reference. I really like their sensor array, and the fine-grained control over everything you can control, just like on the FTW3 1080 Ti.

Edit: Actually, there are prices. They START at Founders Edition range :)

Scan only has the 2080Ti XC Ultra listed for pre-order:

[screenshot: Scan pre-order listing for the RTX 2080 Ti XC Ultra]

The FTW3 is bound to be more than that.

Unless that price drops considerably, I will be keeping my 980Ti a little longer.
 
Scan only has the 2080Ti XC Ultra listed for pre-order:


The FTW3 is bound to be more than that.

Unless that price drops considerably, I will be keeping my 980Ti a little longer.

However, if you manage to sell your 980 Ti for a good price, you could get an RTX card for less.

Although that seems like a difficult challenge.
 
The question is whether there is headroom for overclocking. Jensen claims that there is.
Anyway, these results confirm that I made a good decision in staying away from this new gen.
 