New RTX GPUs

Depends on how chaotic the fallout gets. :p

Whatever happens, I wonder if I might not end up sticking with my trusty 1080 Ti for a while longer...


I am pretty sure those use frame analysis - essentially the same as when compressing video.
I was afraid it might be something like this. I still might take a crack at it. Pull down NV's SDK and see what I can do with it. Probably go nowhere but I might learn something.

BTW, I loved my 1080 Ti. My 3090, on paper, is twice as fast. In EDO, realistically, it just made everything smoother (not 'wow' good, though), but it wouldn't have been worth £2k (I paid £700; I think it fell off the back of a lorry). If the 5090 really is twice as fast as my 3090 (and I'm watching the leaks closely), you could be looking at a solid 4x increase in raw render compute over the 1080 Ti. Just food for thought.
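To spell that compounding out, a quick back-of-the-envelope (the ratios below are just the assumed 'on paper' / leaked figures, not benchmarks):

```python
# Back-of-the-envelope: how the assumed generational uplifts compound.
# Both ratios are assumptions taken from the discussion, not measurements.
uplift_1080ti_to_3090 = 2.0   # "on paper, twice as fast"
uplift_3090_to_5090 = 2.0     # "if the leaks hold"

total = uplift_1080ti_to_3090 * uplift_3090_to_5090
print(f"5090 vs 1080 Ti, raw compute (assumed): {total:.0f}x")  # -> 4x
```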
 
I started to wonder about this, but might I humbly submit the counterpoint that motion vectors must already be exposed in some shape or form, for the various existing VR reprojection techniques to be applied.
I always thought that basic reprojection was just "same picture, rotated by the amount the headset rotated".
 
Does anyone know what brand of monitor pairs best with Nvidia 4090s? I know some monitors support Nvidia better while others support Radeon better.
 
I always thought that basic reprojection was just "same picture, rotated by the amount the headset rotated".
Oh, that would be even worse; then I would probably have no hope, because no useful info is exposed, unless... could DLSS 4 be 'that' clever? Does it have to have 3D motion vectors, or is it smart enough for 2D vectors to be sufficient? But then, like Jojon said, it's kinda just doing frame analysis. I think I have some homework to do.
 
So does this also mean DLSS is challenging for a scene where a large number of complex objects are moving completely differently to one another and have nothing much else in common in terms of rendering? Because to me, that sounds like an on-foot battle in a Settlement, so is that just basically never going to benefit from DLSS even if FDev tried, and exposed the vectors?

The first version of DLSS had trouble with lots of very small objects, but not FPS-scale small objects; think more like a Total War battle zoomed out, where it would have artifacted.

Since version 2 they've improved that considerably, and the new scaler model is pretty accurate. Even with 3 you had to get down to pretty fiddly small details before you could tell the difference between native 4K and even Performance mode (as long as the model has enough input detail, it can scale up quite a lot).
 
Probably go nowhere but I might learn something.
I suppose any intellectual exercise is a worthwhile endeavour... for those with the faculties to -- I wouldn't know. :7

As I recall, by Oculus' ASW 2.0 they had started to ask the game for the Z-buffer, which is neither here nor there, but I have no idea how it has evolved since, so who knows...

Just food for thought.
Food plate duly placed on a prominent shelf in the pantry, under a fancy cloche. :7

I always thought that basic reprojection was just "same picture, rotated by the amount the headset rotated".
The basic reprojection (Asynchronous Time Warp in Oculus parlance) is just that, yes -- the image is panned to match head rotation.
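As a toy illustration of just how simple that basic pan is (a deliberately crude sketch: the function name and figures are made up for the example, and a real compositor re-samples against the actual lens/projection model rather than shifting pixels):

```python
import numpy as np

def rotational_reproject(frame, yaw_delta_deg, h_fov_deg=90.0):
    """Toy rotation-only reprojection: pan the last rendered frame sideways
    by however much the head has yawed since it was rendered. Real runtimes
    re-project against the lens model; this just approximates a pixel shift."""
    h, w, _ = frame.shape
    shift_px = int(round(yaw_delta_deg / h_fov_deg * w))
    return np.roll(frame, -shift_px, axis=1)  # wrap-around stands in for the missing edge data

# e.g. the head turned 2 degrees to the right between render and display
frame = np.zeros((1600, 1440, 3), dtype=np.uint8)
warped = rotational_reproject(frame, yaw_delta_deg=2.0)
```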

Then came motion smoothing (...or Asynchronous Space Warp as Oculus says), which extrapolates a future frame from the last, according to motion vectors derived by comparing the last two. This allows for translation, on top of just rotation, and also dead-reckons objects moving and animating in the scene, but the guesses made can be... imperfect...
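If it helps to picture it, a minimal sketch of that extrapolation step (names and array shapes are invented for the example; deriving the motion vectors themselves, i.e. the optical-flow part, is not shown, and real motion smoothing handles disocclusion far more gracefully):

```python
import numpy as np

def extrapolate_frame(prev_frame, motion_vectors):
    """Toy motion-smoothing step: push each pixel of the last rendered frame
    along its 2D motion vector to guess the next frame. motion_vectors is an
    (H, W, 2) array of per-pixel (dy, dx) offsets in pixels, e.g. estimated
    by comparing the last two frames."""
    h, w, _ = prev_frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Backward sampling: the pixel landing at (y, x) is assumed to have come
    # from (y - dy, x - dx) in the previous frame.
    src_y = np.clip(np.round(ys - motion_vectors[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xs - motion_vectors[..., 1]).astype(int), 0, w - 1)
    return prev_frame[src_y, src_x]

# e.g. the whole scene drifting 3 px to the right
prev = np.random.randint(0, 255, (1600, 1440, 3), dtype=np.uint8)
mv = np.zeros((1600, 1440, 2))
mv[..., 1] = 3.0
guess = extrapolate_frame(prev, mv)
```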

Whilst having zero insight, I'd be inclined to guess even the latest DLSS to just be using 2D vectors, but I suppose 3D ones would lerp better onto the projection when things move in and out of the scene, which becomes relevant once we begin to speak of more than one interpolated frame.
 
Hi All :)

Is it just me or....:unsure:
I've just been browsing through this topic, and also through some of our main UK hardware sites this evening, and (just) noticed that there seems to be a 'shortage' of Nvidia 4090 & 4080 GPUs.
They're either out of stock or 'pre-order' as far as I can see (Scan & Overclockers). I've got to admit, I don't generally look at these two particular cards' prices on a regular basis; I mainly look at the mid to upper range cards to see how prices fluctuate etc... just for interest.
So... what's going on here? Have these two particular cards always been in short supply, or is it something to do with the 'new' RTX 5000 series being released soon?
I've got a suspicious mind that there's something being orchestrated somewhere :sneaky:...but it's probably as I said, just me.

Jack :)
 
I've got a suspicious mind that there's something being orchestrated somewhere :sneaky:...but it's probably as I said, just me.

It's always orchestrated when we're talking about manufactured goods that get updated regularly. There's always a push to get people onto the new shiny, because they need to stop manufacturing the old dull; you don't want two production lines running to produce two products that do essentially the same thing, where one is newer, more expensive and slightly faster. You want everyone on the new and shiny so you can save money by shutting the old dull production line down, or repurpose it into making more of the new and shiny that has a higher profit margin!
 
Does anyone know what brand of monitor pairs best with Nvidia 4090s? I know some monitors support Nvidia better while others support Radeon better.

Software G-Sync is pretty compatible, even with parts that haven't been tested/verified. I don't think I have any VRR displays that don't work well with my 4090 and only one of them is actually certified as G-Sync compatible.

So... what's going on here? Have these two particular cards always been in short supply, or is it something to do with the 'new' RTX 5000 series being released soon?
I've got a suspicious mind that there's something being orchestrated somewhere ...but it's probably as I said, just me.

As I mentioned earlier, client GPUs are very low priority.

Every part comes as a certain die flavor that is then salvaged or crippled as needed to create a product with the capabilities required. As AI has taken off, client and enterprise needs have diverged to the extent that it makes sense to build specialized parts that cannot be turned into GPUs at all. Since this market produces several times the revenue of the gaming market, it takes priority. So, when NVIDIA negotiates their wafer allocation with a semiconductor fab and decides what parts to build there, they don't order many top-end consumer GPUs; it's much more profitable for them to fill those AI orders.

Using the current/upcoming generation as an example: GB100 and GB202 are very different chips, but of similar size and manufacturing complexity/cost. GB100, which has sacrificed nearly all graphics capabilities for pure compute performance, goes into enterprise DGX accelerators that sell for mid five figures per GPU and in huge quantities. GB202, which costs pretty much the same to make, goes into the RTX 5090 and the (as yet unannounced) RTX 6000 Blackwell edition. These parts sell for the low four figures (RTX 5090) to the mid-to-high four figures (RTX 6000).

NVIDIA can sell all the enterprise stuff they can have made at a ~1000-2000% markup. Even the most expensive client parts only sell at 100-300% markup.
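To put made-up numbers on it (every figure below is an illustrative assumption chosen to sit inside those ballparks, not a real cost, price, or yield):

```python
# Illustrative wafer economics. All numbers are assumptions for the sake of
# the comparison; none are real NVIDIA costs or yields.
dies_per_wafer = 60            # similar die size -> assume similar die count

enterprise_price = 35_000      # "mid five figures" per accelerator
enterprise_cost = 2_500        # assumed -> roughly a 1300% markup
client_price = 2_000           # "low four figures" for a 5090-class card
client_cost = 700              # assumed -> roughly a 185% markup

enterprise_profit = dies_per_wafer * (enterprise_price - enterprise_cost)
client_profit = dies_per_wafer * (client_price - client_cost)
print(f"Profit per wafer, enterprise vs client: {enterprise_profit / client_profit:.0f}x")  # ~25x
```

With ratios like that, it's no mystery which orders get the wafers.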

It was the same last generation, though to a lesser degree. Since these orders need to get to fabs months or years in advance, they have to try to estimate the relative demand of each market segment. As the AI boom has been going on for years and shows no signs of abating, they've settled into allocations that are very enterprise heavy, because that's where the money is. This means a shortage of high-end client GPUs at any price, because any quantity they've seen fit to have made will sell out at almost any price.

The RTX 4090 was never a high-volume part. It got the binning rejects from the RTX 6000 Ada edition, which was itself relatively low-volume and mostly existed as a combination of marketing exercise and a way to salvage silicon that couldn't be used elsewhere. The low stock of last-gen parts also further increases demand for the new generation, as varonica mentions.

This is what annoys me about Big Business. As if their profit margins aren't big enough already.

A wise human once said "money is the root of all evil". The older I get and as the years go by, it seems this adage grows ever more accurate.

Money, by its very nature, rewards scarcity and inequity. Mediums of exchange and stores of value have no utility if no one wants for anything.

As for big business, policy changes in much of the world, especially since the 1970s, have increasingly conflated it with government. The most efficient way to extract profit is with coercive monopolies that are tacitly supported by nation states. Government of corporations, by corporations, for corporations...all wrapped in a sweet candy shell of democracy.

The biggest thing separating reality from the fantasy dystopias in games like Elite and Cyberpunk is that reality is just a little less on the nose and our villains have better publicists. Oh, and that even railroady games tend to give us more opportunity and agency than life does.
 
Overclockers still have some 5080s if anyone wants one:

[screengrab: Overclockers RTX 5080 listings]
 
400 EUR for a CLC...MSI might get away with that on the proportionally more expensive 5090 where people will take whatever they can get, but 5080 stock is in a much better state and one would need to be nuts to spend that kind of premium on mediocre watercooling that the lower TDP part doesn't really need.
 
I had some fun playing chase-the-GPU this afternoon. 5090s were out of stock before I could even refresh the page on Scan UK, Nvidia went straight from 'available soon' to 'out of stock' on FE cards, and Amazon UK didn't even bother to list them at 2pm; they eventually showed up around half an hour later as out of stock.

I found those watercooled MSI ones available at Currys for £14xx but decided not to buy one, and a £1200 Palit aircooled 5080 (also at Currys) that I added to my cart, but it went out of stock seconds later, when I went to the cart to hit the buy button.

Also saw a "Sapphire Nitro+ AMD Radeon RX 7900 XTX Vapor-X" for about £1000. I kinda want an Nvidia card though. I'm going to wait to find out more about the new AMD cards but I'm aiming for a stock 5080 of some sort if I can get one.
 
I was considering a 5080, but the reviews have put me off; I was hoping for more of a performance increase for the extra power consumption. The 5090 looks impressive, but I am not spending that much money on a GPU.

I will see what AMD's 9070 has to offer; if that is a no-go too, it looks like I will be waiting for the next generation of cards.
 
I went back to the Scan UK website & it's working for me now; everything is either out of stock or available to pre-order, at rather more reasonable prices than the Overclockers screengrab above.

So I have pre-ordered the Palit 5080 gaming pro for £1056 and a Corsair HX1200i PSU (~£200) to power it.

The card is estimated to be in stock at the end of Feb; who knows when my order will actually be fulfilled, but I haven't been charged for it yet, so I figure I have nothing to lose by pre-ordering, and I lock in the max price I'll pay.

I currently still have a 1080ti so all the higher end GPUs are going to be a significant upgrade for me & I'm taking advantage of a possibly brief opportunity to buy a modern card at retail price.
 
I went back to the Scan UK website & it's working for me now; everything is either out of stock or available to pre-order, at rather more reasonable prices than the Overclockers screengrab above.

So I have pre-ordered the Palit 5080 gaming pro for £1056 and a Corsair HX1200i PSU (~£200) to power it.

The card is estimated to be in stock at the end of Feb; who knows when my order will actually be fulfilled, but I haven't been charged for it yet, so I figure I have nothing to lose by pre-ordering, and I lock in the max price I'll pay.

I currently still have a 1080ti so all the higher end GPUs are going to be a significant upgrade for me & I'm taking advantage of a possibly brief opportunity to buy a modern card at retail price.
I was thinking of a new rig myself. However, I don't think the cost versus benefit is justifiable - at least for me:
  1. Ryzen 7 5800X -> Ryzen 7 9800X3D
  2. RTX 3080 -> RTX 5080 (a 5090 is just way too expensive no matter which way I cut it).
  3. ASUS X570-E Gaming -> ASUS TUF Gaming X870-Plus
I would also consider
  • Going from 32 to 64GB RAM.
  • PSU from 850W to 1000W (probably a must)
Drive-wise, I'd stay with a 1TB SSD for the O/S but upgrade from a 2TB HDD to a 2TB SSD.

I'd stay with a discrete sound card * (currently an ASUS STRIX Raid Pro, which I would keep or maybe swap out for a Soundblaster one) and Win11 Pro.

I'm not sure about chassis though.

Essentially, it would be this build with tweaks:

**

* Sound is of major importance/appeal to me although I do acknowledge on-board sound is pretty good these days. I currently don't use speakers - and the audio from both my monitors sounds like it's being transmitted to empty baked bean tins via string. I therefore use headphones and would stay with that.

** Ye gods, that's a lot of cash! :D I wonder if Scan would knock 1K off the price if I didn't want Copilot :D
 
So I have pre-ordered the Palit 5080 gaming pro for £1056 and a Corsair HX1200i PSU (~£200) to power it.
  1. Ryzen 7 5800X -> Ryzen 7 9800X3D
  2. RTX 3080 -> RTX 5080 (a 5090 is just way too expensive no matter which way I cut it).
  3. ASUS X570-E Gaming -> ASUS TUF Gaming X870-Plus
I would also consider
  • Going from 32 to 64GB RAM.
  • PSU from 850W to 1000W (probably a must)
Holy hell, you guys are really going some on the PSUs there. Is that really necessary?
A 9800X3D can't pull more than about 160 W, I believe. And the 5080 peaks at something like 400 W? So an 850 W supply ought to have a good amount of headroom, I'd have thought... (But maybe @Riverside is using one of those gas-guzzling Intel CPUs? ;))
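For what it's worth, a rough sanity check on those figures (all the wattages below are ballpark assumptions, and transient spikes are ignored):

```python
# Rough PSU headroom check with ballpark, assumed wattages.
cpu_peak = 160        # ~9800X3D package power under heavy load
gpu_peak = 400        # ~RTX 5080 board power
rest_of_system = 100  # assumed: motherboard, RAM, drives, fans, USB

total = cpu_peak + gpu_peak + rest_of_system
for psu_watts in (850, 1000, 1200):
    print(f"{psu_watts} W PSU -> ~{total / psu_watts:.0%} load at estimated peak")
# 850 W -> ~78%, which still leaves some headroom, transients aside
```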
 
I have an i7 12700K with 32GB RAM that I think is still current enough for my needs, but I tend to alternate between upgrading the CPU/RAM/mobo and the GPU/PSU. Stuff like storage, monitors & other peripherals can be upgraded independently, usually when something breaks. My keyboard is probably 20 years old but I bought a new mouse last week ;)

My current mobo supports PCIe 5 & Windows 11, so it can take a modern GPU even if my CPU might not be beefy enough to take full advantage of that GPU.

Last time I bought a whole new computer it was because my old one used AGP so I couldn't use my existing graphics card.

I have a massive case (phanteks enthoo pro), I can fit whatever components I need in there. It sits under the desk & keeps my feet warm ;)

The decision to upgrade this time is a personal choice; I've been wanting a better GPU since Odyssey released, really, but bitcoin mining and other pricing and availability factors have gotten in the way. I'm hoping this 5080 will keep up for the next few years at least.
 