What other games are we all playing?

If you're upgrading your graphics card, I'd seriously suggest looking at one of the AMD cards as an alternative to the RTX 4070. I had a rather good RTX 3070ti FTW3 Ultra until last week and, like you, was intending to look at the 4070 or 4070ti as a possible future upgrade.
Yeah, I'm still keeping an eye on them too, but...
Elite: Dangerous Odyssey VR weirdness aside, AMD's GPUs are better deals at most price ranges right now, unless you're heavy into hardware RT.
Well yeah, the ED weirdness on AMD cards is exactly why I'll take some convincing to go AMD. I intermittently take very long breaks from ED, but when I do play it I want it to look right.
I don't like Nvidia's business tactics, I don't like their pricing, I don't like their attitude to open-source. However, I had ATI cards a long time ago and had my fair share of pain with them (so they were very firmly off the list for a while) and in my old age I think there's maybe something to be said for taking the path of least resistance :unsure:
A better reason is that I would like to dip my toe into ray-traced gaming now and then, and the AMD cards (I think) aren't currently competitive in cost/fps for that mode. I also have a thing for not putting space heaters in my bedroom and the watts/fps ratio is pretty darn good for 40-series cards, much as the pricing stinks.
 
If you're upgrading your graphics card, I'd seriously suggest looking at one of the AMD cards as an alternative to the RTX 4070. I had a rather good RTX 3070ti FTW3 Ultra until last week and, like you, was intending to look at the 4070 or 4070ti as a possible future upgrade.

However, as chance would have it... a mate of mine had recently bought a Radeon RX 6950 XT but decided he didn't need the grunt of the card, as he only plays World of Tanks. He suggested a swap for my 3070ti plus a small bit of cash, so I decided to jump in, even though I've stuck rigidly to Nvidia stuff for years.

The Radeon card is an absolute beast: 16GB of VRAM, and it matches up to the mighty RTX 3090ti (and the 4070ti) in benchmark comparisons. Price-wise (at current prices) it's a good bit cheaper than a 4070 (certainly cheaper than a 3090ti). Well worth a look... even the Radeon 6800 XT range as a possible cheaper upgrade alternative. I'd normally never have considered an AMD card... for no reason other than that I'd had no experience of them, but I'm certainly glad I made the swap 🤷‍♂️



The amount of FPS that DLSS 3 brings to the table could have fixed all of Odyssey's performance issues.
Does anyone know why Frontier didn't add DLSS 2, DLSS 3 and FSR 2? Because, you know, they did it for Jurassic World Evolution 2. Ray tracing, DLSS 2 and 3 - the whole package. :unsure:
 
Imagine all the new players, picking up ED Oddy for the first time without a clue, and trying to run it from native motherboard graphics...
 
The amount of FPS that DLSS 3 brings to the table could have fixed all of Odyssey's performance issues. Does anyone know why Frontier didn't add DLSS 2, DLSS 3 and FSR 2?
But DLSS 3 is the key selling point, and also a solution to ray tracing's heavy performance hit.
I can only speculate, but I don't think AMD's price dive is a coincidence. Maybe they realised that FSR 3 can't be done on their current cards, the same way Nvidia couldn't do it for 30 and 20 series cards, so they've admitted defeat and are trying to sell as much as possible until they release 80XX series cards with AI cores similar to the 40 series.

But something is definitely off with AMD's cards. They are trying too hard to undercut Nvidia. I would say it's very suspicious behaviour, and something is definitely brewing on the horizon that will unveil this mystery.
Nvidia, on the other hand, has not only held the top place for 4K gameplay with the 4090 for over six months, they still have the 4090ti coming later this year, and possibly 50XX cards next year. Nvidia is clearly moving forward, and if FSR 3 doesn't work, then AMD is done for.
I think you're jumping way too heavily onto the Nvidia hype train...but this isn't the place to get into the ancient and deeply uninteresting AMD v Nvidia brawl...nor the other one about DLSS v FSR ;)

I don't have a particular iron in the fire here; up until I got hold of this rx 6950 xt, I'd had absolutely no experience of AMD cards at all. I've stuck to Nvidia, mainly through force of habit, for the last 10 years: from my old 970 titan, 1080ti, 2070 super, 3070ti...

All I said was that the AMD cards were a cheaper and very capable alternative to the wildly overpriced higher-end Nvidia options on the table... the technical benchmark scores are all out there to show the numbers. It solely depends on how much money you want to spend to play your games 🤷‍♂️
 
But all these benchmarks are testing FSR 2 vs DLSS 2, not against DLSS 3. That's why people still think AMD cards are good.
If I were a developer of an upcoming game and saw what DLSS 3 can do, I would go through hell to add it.

We chase after good performance in games, and yet when NVIDIA sneaks in an easy solution, everyone is like - no no, let's test FSR 2 vs DLSS 2. As if DLSS 3 doesn't exist.
 
FSR 2.1 is pretty good too; it does exactly the same basic upscaling rendering tricks as DLSS (2 or 3)... I know this because I'm now using FSR 2.1 with the RX-6950xt, the same as I used DLSS 2 on my RTX-3070ti... but that's just a feature enabled by software, which is invariably updated and improved periodically by the vendors during the life of its implementation, regardless of hardware.

It may be news to you, but all of the trusted third-party GPU benchmark providers test with DLSS or FSR disabled, to reflect the raw power of the cards, for exactly that reason: it's software, not hardware. The same goes for ray tracing... just saying.
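To put rough numbers on those upscaling tricks: both FSR 2 and DLSS 2 render internally at a reduced resolution and reconstruct the output from it, and their quality presets use roughly the same per-axis scale factors. A quick sketch in Python (the scale factors are the commonly documented presets; the 4K target is just an example):

```python
# Rough sketch of temporal-upscaler internal render resolutions.
# FSR 2 and DLSS 2 both use roughly these per-axis scale factors.
OUTPUT = (3840, 2160)  # example 4K output target

PRESETS = {
    "Quality":           1 / 1.5,  # ~67% per axis
    "Balanced":          1 / 1.7,  # ~59% per axis
    "Performance":       1 / 2.0,  # 50% per axis
    "Ultra Performance": 1 / 3.0,  # ~33% per axis
}

for name, scale in PRESETS.items():
    w, h = int(OUTPUT[0] * scale), int(OUTPUT[1] * scale)
    saved = 1 - (w * h) / (OUTPUT[0] * OUTPUT[1])
    print(f"{name:>17}: renders {w}x{h}, ~{saved:.0%} fewer pixels shaded")
```

Quality mode at a 4K target, for instance, shades a 2560x1440 image and reconstructs the rest - that's where the FPS comes from, on either vendor's cards.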

My 'crappy' AMD-based PC build (R9-5900x and RX-6950xt) is firmly in the current top-ten performance bracket, right alongside the RTX-3090ti... but I didn't get it there by throwing buckets of money at Nvidia or Intel through misguided brand loyalty. I'll always be at least a generation behind the current top of the tree in PC hardware because, like most of us, I simply can't afford to keep up with the Joneses or follow the latest trending fan fiction cluttering up t'interweb. The reality is that Nvidia are increasingly pricing folks like me out of their high-end market, exactly the same as Intel did with their CPUs. I voted with my wallet.

So, without being disrespectful... since that's not my intent: until you can say you've tried both ends of this silly argument and made a realistic side-by-side comparison, don't come to me and tell me you've heard how crap AMD cards supposedly are compared to the latest Nvidia hype-train offerings. I've made exactly that real-time comparison on my own PC by trying both, side by side, and I certainly have no regrets about the choices I made.

I'm a gamer with more eclectic and varied tastes than some, but I still expect decent pound-for-pound performance out of the hardware I can realistically afford to play with, so I'm not going to be blinded by brand loyalty just for the sake of it... the same reason I have an Xbox Series X in the living room and a PS5 in the Mole Cave amongst the clutter of PC bits. I enjoy a fair spread of games across the genres, from DCS to Ghost of Tsushima... It's kinda what I do since I retired and stopped gathering up motorbikes in my garage. 🤷‍♂️

BTLEphy.jpg
 
Imagine all the new players, picking up ED Oddy for the first time without a clue, and trying to run it from native motherboard graphics...
Thing is, it should run on native motherboard graphics. I'm able to run Horizons 3.8 on Intel graphics and an i3 processor just fine (Jenny's computer). Granted, this is at the lowest graphics settings and reduced resolution, but it runs smoothly and still looks better than all its predecessors from the late '90s and early 2000s. IIRC, graphics settings have little effect on performance in Odyssey, which is not how it ought to be. Is this still the case?
 
Here is an example of why you should stop buying AMD (until they release FSR 3) and old-generation Nvidia cards:

AMD vs NVIDIA.jpg

Here are the same cards in a native resolution test:

AMD vs NVIDIA 2.jpg


That's how those benchmarks trick you. They show you the 6950 XT is 24% faster at native - and you're happy with a good purchase.

But when you turn on RT, FSR 2 and DLSS 3, it shows the complete opposite result: the 4070 is 28% faster in the RT (FSR 2 vs DLSS 2) match-up, and more than 100% faster with frame generation.
Now you can of course ignore RT, it's not a necessity, but AMD cards will still lose the FSR 2 vs DLSS 3 performance battle, because Frame Gen adds so much FPS. It's mind-blowing that NVIDIA even released it, as it totally destroys the credibility of older-generation cards.

You have to understand, Frame Gen is very new tech, so people watching misleading benchmarks are still confused about what they could gain from it. But the gains are massive.
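To make those percentages concrete, here's the arithmetic with made-up FPS numbers (purely hypothetical figures for illustration, not taken from any real benchmark run):

```python
# Hypothetical FPS figures, purely to illustrate how the percentages work.
native = {"RX 6950 XT": 62, "RTX 4070": 50}        # raster only, no upscaling
rt_upscaled = {"RX 6950 XT": 39, "RTX 4070": 50}   # RT on, FSR 2 vs DLSS 2
rt_framegen = {"RX 6950 XT": 39, "RTX 4070": 82}   # RT on, DLSS 3 frame gen

def lead(scores):
    # Sort fastest first and report the winner's margin over the loser.
    (a, fa), (b, fb) = sorted(scores.items(), key=lambda kv: -kv[1])
    return f"{a} is {fa / fb - 1:+.0%} ahead of {b}"

print("Native:        ", lead(native))       # ~+24% for the 6950 XT
print("RT + upscaling:", lead(rt_upscaled))  # ~+28% for the 4070
print("RT + frame gen:", lead(rt_framegen))  # >+100% for the 4070
```

Same two cards, three completely different headlines, depending on which features the reviewer turns on.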
 
Thing is, it should run on native motherboard graphics. I'm able to run Horizons 3.8 on Intel graphics and an i3 processor just fine (Jenny's computer). Granted, this is at the lowest graphics settings and reduced resolution, but it runs smoothly and still looks better than all its predecessors from the late '90s and early 2000s. IIRC, graphics settings have little effect on performance in Odyssey, which is not how it ought to be. Is this still the case?

This isn't the case now, and AFAIK it never was. My measuring stick during the post-release optimizations was how far I had to turn down graphics to get acceptable (YMMV) performance on foot in settlements on my old PC from 2013. I'm the lazy sort, so I use FDev's presets. The performance target was whatever settings Horizons managed. In VR, that was VR High. Monitor is 1440p. VR headset is an HTC Vive.

Odyssey during the Alpha was Low at 720p. No VR.
At release it was Low at 1080p. Still no VR. Medium in space.
The last time I tested it, U12 IIRC, I could play in VR, with VR Medium in settlements and VR High in space. Monitor at 1440p.

Keep in mind, my old computer was almost identical to David Braben's home PC, so it probably benefited from targeted fixes. In my opinion, many of the complaints about Odyssey's poor performance were down to various CPU-intensive tricks to brute-force a fix for FDev's bad anti-aliasing, as well as the use of hidden graphics improvements.

IIRC, during one thread discussing why players were seeing such wildly different outcomes, someone revealed that they were running a Pimax 8K VR headset and a 4K monitor, and were using both Nvidia's upscaling and FDev's supersampling at 2.25. That's just shy of 16 times the pixels of a 4K monitor, or if you prefer, like trying to run a 4K monitor at 1700 hertz. And they wondered why they were having problems. :rolleyes:
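For what it's worth, here's a back-of-envelope version of that maths (the panel resolutions and the assumption that the two supersamplers simply stack multiplicatively are mine; the exact multiplier depends on details we don't know, but the order of magnitude is the point):

```python
# Back-of-envelope pixel budget for that setup. Assumptions: Pimax 8K is
# roughly two 3840x2160 panels; each supersampler multiplies the pixel
# count by 2.25; the mirrored 4K monitor is rendered on top of it all.
MP_4K = 3840 * 2160 / 1e6             # ~8.3 megapixels

headset = 2 * 3840 * 2160 / 1e6       # ~16.6 MP across both eyes
supersampled = headset * 2.25 * 2.25  # two stacked 2.25x supersamplers
total = supersampled + MP_4K          # plus the mirrored 4K monitor

print(f"~{total:.0f} MP per frame, ~{total / MP_4K:.0f}x a 4K monitor")
# At a 90 Hz headset refresh, that's the pixel throughput of a 4K monitor at:
print(f"~{total / MP_4K * 90:.0f} Hz equivalent")
```

That comes out around 11x with my assumptions; count the VR compositor's usual render-target overscan as well and you land near the ~16x quoted above. Absurd either way.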

The general impression I've gotten over the past year is that Horizons had so much headroom, graphics- and CPU-wise, that some people were effectively running the game at 16K on super-ultra-max settings. Odyssey seems to require those resources for other aspects of the game, like on-foot AI behavior. As a result, trying to run Odyssey at those settings was doomed to failure.

Still… it did take FDev about a year of optimization fixes before my old PC could use the same settings as Horizons did in space, so they're not blameless either. EDO was rushed out the door way too early, and with inadequate player testing. And I still want to know what they were drinking to think EDO could possibly run on last-gen consoles, so I can avoid ingesting it in the future.
 
Meanwhile back on topic .... ;)

Finally getting to grips with Phoenix Point, a game I don't need the dongle for. In the classic style it starts slow, but before you know what's going on, you don't have enough resources, time, upgrades or soldiers. Trying to balance the factions is a nightmare; I probably need to start favouring one over the other two.

Kept me out of bed until about 2:30am this morning and that doesn't happen very often for any cause :)
 
But buying a video card is a game!


😂
I note the requirements for that piece of propaganda appear to be far below what's being discussed. Obviously simulating cutting-edge 'Windows' machines doesn't require a cutting-edge 'Windows' machine :)

Nor a small sub-station to power the graphics card! ;)
System Requirements

Minimum:
OS: Windows 7 or higher
Processor: Intel Core i5-2500K or AMD Athlon X4 740 (or equivalent)
Memory: 4 GB RAM
Graphics: GeForce GTX 660 (2048 MB) or Radeon R9 285 (2048 MB) - Integrated GPUs may work but are not supported.
DirectX: Version 9.0c
Storage: 30 GB available space
Sound Card: DirectX compatible
 
Obviously simulating cutting-edge 'Windows' machines doesn't require a cutting-edge 'Windows' machine :)
The irony / sad reality is that those minimum requirements are technically wrong, because Steam itself is dropping support for Windows 7 & 8. I don't know exactly what that means, however. Will Steam stop running on these older versions of Windows, or will it just stop getting updates? If I can't run a game made for Windows 7 on a Windows 7 PC because Steam requires a Windows 10 PC, then I will actually start regretting choosing Steam as my platform rather than GOG...

Thankfully I am on Windows 10, but that's where I'm drawing the line. I have zero interest in Windows 11 (or 12 or 20). I'd rather switch to Linux, even if it means losing access to Microsoft's growing acquisition of game titles. It's crap like this that caused me to switch from PC to console years ago, but sadly consoles are just as bad as PCs these days, worse if you count the multiplayer fee (something my PS3 did not have). Maybe I'll just buy a deck of cards and play real Solitaire, lol.
 
Ok, you've convinced me...Nvidia good, AMD bad 😱

I've shown you mine... now you show me the 4070 you're running CP2077 on... or do you want me to fire up another 100 user-made videos... or even some of my own... that show something completely different?

For older games like CP2077 (which I completed some time ago with my 3070ti and haven't played since)... and some of those other games I very rarely play that are still using the outdated FSR 2.0 (it's since been replaced by FSR 2.1)... I chose not to use FSR, exactly the same as with ray tracing. It's software, it's optional, and it's not always best optimised by the individual game developer incorporating it, or belatedly adding support for it, into their games.

It has nothing whatsoever to do with what brand of graphics card I'm using... it's software (didn't I mention this already?). CP2077 ran like crap on my 3070ti with DLSS and RT on because CDPR screwed the pooch optimising the entire game... long before they belatedly added DLSS or FSR support. I didn't use ray tracing or DLSS then either.

I'm done here...you can just do you :rolleyes:
 
Ok, you've convinced me...Nvidia good, AMD bad 😱
Don't be done just yet! I have a serious question. It used to be that Nvidia had the superior driver support for Linux, but that was a LONG time ago when I last looked at this (my current Linux machine runs Intel graphics). Is this still the case, or has that since changed?

FWIW, my gaming laptop has Nvidia, but I don't have any special loyalty to this chip vs AMD except that it's the chip I have. I do, however, foresee the day that I'm forced to abandon Windows on this machine and switch over to Linux. Well, let me rephrase - Microsoft will abandon Windows on my machine (Windows 10 that is), shortly followed by Steam as I was lamenting earlier.
 
'nix-with-Steam user here.
Red team crowed for years about how they were going to really, really, truly get their open-source drivers running. I'm still waiting.
So I still use Nvidia; although their drivers aren't open source, I don't care - they're a cinch to install and they work. I'm not some lunatic who sees it as "polluting" my OS.
 
Another selling point of X games - native Linux support! Speaking of, does X4 have this now? I know X3 does, as that's how I run it on my Linux laptop, but I've never tried X4 because that laptop doesn't have the horsepower for it.
 
Sorry, I won't bother you with this nonsense any longer ;)
I myself upgraded from a 10 series card to a 40 series a few weeks ago, and I'm still just blown away by the Frame Generation technology. It's such an awesome and easy solution to performance issues in games.

Back on topic: really, really enjoying Everspace 2. Such beautiful space and an engaging story.
Space.jpg
 
The amount of FPS that DLSS 3 brings to the table could have fixed all of Odyssey's performance issues.

Sure, DLSS 3 frame generation could significantly mitigate Odyssey's performance issues (even CPU-limited ones, by virtue of only needing every other frame to be real), but it's an ultra-niche feature (it only works on RTX 4000 series parts) with significant trade-offs: input latency that is perceptible, even if not problematic, at anything short of triple-digit frame rates, and occasionally obtrusive visual artifacts.
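For anyone who hasn't dug into interpolation-based frame generation, here's a toy model of that trade-off (the clean doubling and the one-frame hold-back are simplifying assumptions, not measurements):

```python
# Toy model of interpolation-based frame generation (DLSS 3 style).
# Real frames are rendered at render_fps; a generated frame is inserted
# between each pair of real frames, so the displayed rate roughly doubles.
# But each real frame is held back until its successor exists, adding
# roughly one render-interval of input latency.
def frame_gen(render_fps):
    display_fps = 2 * render_fps          # one generated frame per real one
    added_latency_ms = 1000 / render_fps  # waiting on the next real frame
    return display_fps, added_latency_ms

for fps in (30, 60, 120):
    disp, lat = frame_gen(fps)
    print(f"{fps:>3} real fps -> {disp:>3.0f} displayed fps, "
          f"~{lat:.0f} ms extra latency")
```

Note how the extra latency is largest exactly when the real frame rate is lowest, i.e. in the situations where people most want the extra frames.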

Expecting Frontier to be able to integrate it into Odyssey without issue is also exceedingly far-fetched. CDPR couldn't manage a smooth DLSS 3 frame generation integration in Witcher 3 or Cyberpunk 2077 (both games stutter for seconds at a time during scene transitions that feature large frame rate swings), and they had far more incentive, plus far larger budgets. Frontier isn't going to shoehorn DLSS 3 into a game that doesn't even have provisions for TAA, for the tiny fraction of players who could use it; the cost/benefit ratio isn't there.

But all these benchmarks are testing FSR 2 vs DLSS 2, not against DLSS 3. That's why people still think AMD cards are good.
If I were a developer of an upcoming game and saw what DLSS 3 can do, I would go through hell to add it.

We chase after good performance in games, and yet when NVIDIA sneaks in an easy solution, everyone is like - no no, let's test FSR 2 vs DLSS 2. As if DLSS 3 doesn't exist.

For the overwhelming majority of games, DLSS 3 doesn't exist, or may as well not exist. It also adds complications to comparisons, because there are considerations that don't apply to pure upscalers.

The general impression I've gotten over the past year is that Horizons had so much headroom, graphics- and CPU-wise, that some people were effectively running the game at 16K on super-ultra-max settings. Odyssey seems to require those resources for other aspects of the game, like on-foot AI behavior. As a result, trying to run Odyssey at those settings was doomed to failure.

Odyssey still has a disturbingly low FPS floor in CPU/memory performance limited scenarios, irrespective of settings used.
 