Hardware & Technical 2080 Teased

However, I read that NVIDIA's interest is in selling a lot of 10xx cards when Turing arrives, because stock levels are still high

True enough, but as time marches on, production of the new cards will start to build inventory and motivate them to clear out the old stock and move to the new to keep cash flowing. The hard question is whether that will be Q4 this year or Q1 next.

Whenever that occurs, it should push 10XX-series prices down.
 
Also, buying a blower? Why? The non-reference designs are usually better at cooling and noise levels.

The RTX 2080 reference cooler is a dual axial-fan, internally exhausting design, like those popularly used on non-reference models of prior generations.

Yeah, there is a glut of Pascal parts, so that could be incentive to keep Turing priced high for a while.
 
If Turing's performance is much better than Pascal's, then perhaps people will wait a few months to buy Turing.

Unless the price drop on the 10xx series is really significant.
 
I'm on the fence about this as well. If there isn't a significant price incentive, I'll wait for the next-series card, but it now looks like it will be two years or more before the capabilities of the new cards actually get incorporated into software that can utilize ray tracing and DX12. Of course, there might also be other benefits that have an impact now, but I think it's still too early to fully evaluate.

Anyone considering buying the new cards should also evaluate their CPU/motherboard to see if it fully supports the new features. I'm running a 980 Ti on a Core i7-950 and I'm somewhere around 20% CPU-bound.

Looking at Q4, I'm probably now ready to build again, using a 9th-gen Intel CPU for myself and a Threadripper 2990WX for my son's desktop. Neither of us is an active gamer apart from my ED addiction, so GPU demands aren't too significant, but I tend to build for long-term value. My GPU power is focused on driving more pixels: he uses a Dell 38" curved monitor with a 24" alongside, and I will be looking at a similar configuration. Both the Dell 38" and the 34" (which I use) have been really nice for daily use as well as ED. The other option I'm looking at is a 43" 4K, but I'm not sure whether that would be a good fit yet.

Anyone into VR should obviously be looking for as much raw power as they can afford, but it's been pretty surprising how well I've been able to get multiple monitors and large curved screens to perform on decidedly less-than-state-of-the-art gear.

YMMV - spend wisely, my friends. Well-timed hardware purchases can last a long time if you buy the right gear at the right price. I've been looking to build new desktops for almost three years now, and I'm very happy that I kept my powder dry. For me, the time is at hand.
 
Why do so many tech news sites use largely unrelated placeholder images for their articles?

Anyway, if that 2.5GHz clock potential is accurate, a tweaked RTX 2080 could actually wind up being appreciably faster than a 1080 Ti (GP102 will generally only reach 2000-2050MHz on air, with ~2100MHz being workable on good water with firmware that removes the power limit).
 
And also with GDDR6, it seems.
 
GDDR6 just lets them get similar bandwidth out of a 256-bit memory interface vs. the 1080 Ti's 352-bit one. Unless the GDDR6 overclocks extremely well, the RTX 2080 isn't going to have an appreciable bandwidth advantage.
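
The bandwidth point is easy to sanity-check from the published specs. A rough calculation, assuming 11 Gbps GDDR5X on the 1080 Ti and the 14 Gbps GDDR6 figure circulating in the rumors:

```python
# Peak memory bandwidth (GB/s) = (bus width in bits / 8) * per-pin data rate in Gbps
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

gtx_1080_ti = bandwidth_gbs(352, 11)  # GDDR5X: 484.0 GB/s
rtx_2080    = bandwidth_gbs(256, 14)  # GDDR6 (assumed 14 Gbps): 448.0 GB/s
print(gtx_1080_ti, rtx_2080)  # 484.0 448.0
```

So on those numbers the 2080 would actually end up slightly behind the 1080 Ti in raw bandwidth, despite the newer memory.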

Unless the bus width increases in a later version
 
The big-die GDDR6 parts will have a 384-bit bus (Quadros and Titan), with a cut-down version (probably 352-bit) likely for the RTX 2080 Ti (or whatever it's going to be called).

256-bit is the max for any GT104 part; we may see some cut-down parts, but you can't just add memory controllers without making a new die flavor.
 
I'm just curious how fast this whole ray-tracing tech will become mainstream. Since it's the first generation of cards focused on it, it may take a good 2-3 years for them to become actually relevant. 15% over a 1080 Ti in current-gen games would be kind of nice, but is it worth it? Questionable. I'd consider upgrading if I ran native 4K, perhaps.
 
A 15% gain doesn't justify a purchase, in my opinion.

Unless you manage to sell the old card.
 
I am of the same opinion. Though, for VR, every raw power percentage helps, because it can make or break an experience depending on reprojection levels (locking the FPS to 45 just because you're hitting 80fps instead of 90...).
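
The reprojection point comes down to frame-time budgets: a 90Hz HMD gives you roughly 11.1ms per frame, and when the GPU misses that, runtimes of the era (Oculus ASW, for instance) dropped rendering to half rate and synthesized the in-between frames. A minimal sketch of that half-rate fallback, with the exact policy simplified for illustration:

```python
def effective_fps(render_fps, hmd_refresh=90):
    """If the GPU can't sustain the HMD refresh rate, the runtime
    falls back to rendering at half rate and reprojecting the rest."""
    return hmd_refresh if render_fps >= hmd_refresh else hmd_refresh / 2

print(round(1000 / 90, 1))   # 11.1 ms per-frame budget at 90Hz
print(effective_fps(95))     # 90   - sustaining refresh, full rate
print(effective_fps(80))     # 45.0 - locked to half rate despite being close
```

Which is exactly why a raw-power bump matters more in VR than the same percentage would on a flat screen: it can be the difference between 90fps and a hard 45fps lock.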

Also, it depends on the price difference. For example, the 1070 vs. the 1080 was a ~30% performance difference for a 60% higher price at launch. Not exactly healthy. And then the 1080 Ti came and mopped the floor with both of those cards, which under normal conditions would have sent their prices tumbling, yet it didn't happen because of the crypto craze.
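
Taking those rounded launch-era figures at face value, the 1070-vs-1080 comparison can be framed as performance per dollar:

```python
# Relative performance-per-dollar of the 1080 vs. the 1070 at launch,
# using the rounded figures from the post (+30% performance, +60% price).
perf_ratio  = 1.30
price_ratio = 1.60
value_ratio = perf_ratio / price_ratio
print(f"{value_ratio:.2f}")  # 0.81 -> ~19% less performance per dollar
```

In other words, the 1080 delivered noticeably worse value than the 1070, which is the "not exactly healthy" part.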
 
At 15%, I'd get that instead of a 1080 Ti. I do respect the more educated assumptions based on what we have seen so far, but I still think it would be a bad move from nVidia to launch a 2080 that does not deliver a measurable advantage over a 1080 Ti, especially as this cycle went on a bit longer than anticipated.

BTW, I just reserved a nice used Rift that I'll buy next week (I couldn't justify buying it new given how old the tech is) - it would be nice to have the VR honeymoon set on ultra instead of compromising with the 1060. :)
That will come in September anyway; with our home upgrade under way, I have no hope of actually trying VR before then.
 
Congrats on the Rift purchase! Make sure you get the Touch controllers and a third camera for room-scale. Even if "it's only for Elite" - you will most probably want it anyway ;-)
 
Now I have some tasty pre-release info:
https://videocardz.com/77369/nvidia-geforce-rtx-2080-ti-features-4352-cuda-cores
What we can infer from that is that the difference between the 1080 Ti and the 2080 Ti is only 768 CUDA cores. That's less than the difference between the 1080 and the 1080 Ti, which is 1024 CUDA cores. So, unless they have made some other breakthrough, it suggests that the jump from the current Ti to the next-gen Ti will be smaller than upgrading from a 1080 to a 1080 Ti.
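
The core-count deltas check out against the known Pascal specs (2560 cores on the 1080, 3584 on the 1080 Ti) and the rumoured 4352 from the link above:

```python
cores = {
    "GTX 1080":    2560,
    "GTX 1080 Ti": 3584,
    "RTX 2080 Ti": 4352,  # rumoured, per the videocardz leak
}

pascal_gap = cores["GTX 1080 Ti"] - cores["GTX 1080"]     # 1024
ti_to_ti   = cores["RTX 2080 Ti"] - cores["GTX 1080 Ti"]  # 768
print(pascal_gap, ti_to_ti)  # 1024 768
```

So the generation-to-generation Ti jump is about 25% smaller, in core count alone, than the 1080-to-1080 Ti step within Pascal.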

Oh, and optimal_909, there is one more important factor you need to consider, and that is video RAM. The 2080 has only 8GB compared to the 1080 Ti's 11GB, and there are already games (DCS World) that can make use of that much RAM. Strongly advised to wait for first impressions.
 
Since it looks like the 2080 will be slightly less than a full GT104 part, and even the full-fat parts will be a bit behind the 1080 Ti in some metrics, I'm even more sceptical than before of the 2080 outperforming the 1080 Ti by any significant margin in titles that don't leverage DX12 DXR or even more niche proprietary NVIDIA features. I'm also still mining profitably on my current 1080 Ti when I'm not gaming on it, and it will likely take some time for mining software to adapt to the GDDR6 memory standard.

Maybe Turing will OC well, but that's a complete unknown at this point, so I decided to take advantage of some of the sales occurring now and bought another 1080 Ti (ended up paying $635 for a new non-reference MSI DUKE OC), which will be going in my HTPC.
 