
That's interesting. People on Reddit pointed out that Silent Hill 2 on Steam has the same RTX 2080 vs 6800 XT for its Recommended settings, but with additional info:
Playing on recommended requirements should enable you to play on Medium quality settings at 60 FPS or High quality settings at 30 FPS, in Full HD (or 4K using DLSS or a similar technology).

 
Wolfpack was done by a subsim.com veteran? Had a look at it again recently and saw the compressor is now in the right place. Did some shenanigans when it released, keeping the decks awash while the deck gun could still fire. An iffy thing, but I managed not to drown the dude and we went on a speed-boat attack against destroyers. It's a bit unsettling when you can't see what's happening, but apparently the AI targets center mass, which we kept submerged.
Onkel Neal owns and runs subsim.com...has done for the last 25 years or so.
 
That's interesting. People on Reddit pointed out that Silent Hill 2 on Steam has the same RTX 2080 vs 6800 XT for its Recommended settings, but with additional info:
Playing on recommended requirements should enable you to play on Medium quality settings at 60 FPS or High quality settings at 30 FPS, in Full HD (or 4K using DLSS or a similar technology).


Interesting coincidence with the recommended specs, but it's a totally different developer and publisher using a different engine in a different game. No particular reason to think Starfield's recommended requirements imply the same performance targets.
 
Wolfpack was done by a subsim.com veteran? Had a look at it again recently and saw the compressor is now in the right place. Did some shenanigans when it released, keeping the decks awash while the deck gun could still fire. An iffy thing, but I managed not to drown the dude and we went on a speed-boat attack against destroyers. It's a bit unsettling when you can't see what's happening, but apparently the AI targets center mass, which we kept submerged.

You'd hope it would be; they need to give the bin bags back aft something to do apart from starting/stopping engines in the next update!
 
I'm doubtful the game will even start on a 4960X. Performance-wise it's still a capable chip, but AVX2 wasn't introduced until Haswell, and I strongly expect Starfield to require that instruction set.
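If you want to check whether your own chip has it, here's a rough sketch using GCC/Clang's __builtin_cpu_supports builtin (the AVX2 requirement for Starfield is my guess, not anything confirmed):

```c
/* Rough AVX2 check, GCC/Clang on x86: gcc avx2_check.c -o avx2_check
 * Haswell and newer report AVX2; Ivy Bridge-E parts like the 4960X won't.
 * Whether Starfield actually refuses to launch without AVX2 is speculation on my part. */
#include <stdio.h>

int main(void)
{
    __builtin_cpu_init();                 /* populate the CPU feature flags */
    if (__builtin_cpu_supports("avx2"))
        printf("AVX2 present\n");
    else
        printf("no AVX2 - a game that requires it would likely fail to start\n");
    return 0;
}
```

Or just check the spec sheet: anything Haswell (4th-gen Core) or newer has AVX2.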

Regardless, if the system supports an RX 6750 XT, it will support a 6800 XT, unless you've been cutting it way too close on the power supply.
Yes, but my CPU will bottleneck the GPU, so I won't be getting the full performance, at least when I check with the bottleneck calculator.

I went RTX3070 … and now they’re gonna optimise for AMD only?
You will do just fine. The reason I'm going AMD is the cost per FPS; RTX cards, even after the price cut, are just too high for what I want to pay ATM.
rely on fake frames to have games run fine
I don't want to use DLSS or FSR as they add latency; a smooth but slow response is not the way to go. Granted, this is an SP game, but if it feels like there is a rubber band attached to your mouse, it's not good.

That's interesting. People on Reddit pointed out that Silent Hill 2 on Steam has the same RTX 2080 vs 6800 XT for its Recommended settings, but with additional info:
Playing on recommended requirements should enable you to play on Medium quality settings at 60 FPS or High quality settings at 30 FPS, in Full HD (or 4K using DLSS or a similar technology).

The strange thing here is that they lock the frames to 30 FPS. I don't care about that; as long as the experience is good it's just fine, it's an SP game.
 
I don't want to use DLSS or FSR as they add latency

They don't, and they shouldn't (they should actually decrease latency).
Only frame generation from DLSS 3.0 increases latency (but if you don't enable frame generation, DLSS 3.0 should also decrease latency).
 
Yes, but my CPU will bottleneck the GPU, so I won't be getting the full performance, at least when I check with the bottleneck calculator.

If the CPU is capable of delivering acceptable performance at all, then you can just jack up visual quality to make use of almost any GPU you might use.

But in the case of Starfield, you'll likely need a platform upgrade anyway, so the point is moot.

I don't want to use DLSS or FSR as they add latency; a smooth but slow response is not the way to go. Granted, this is an SP game, but if it feels like there is a rubber band attached to your mouse, it's not good.

Frame generation adds considerable latency, especially at lower frame rates, but the standard upscaling versions of DLSS or FSR typically have negligible latency at the exact same frame rate, and because they increase frame rate, final latency will usually be lower when using them.
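Some napkin math for that last point; the frame rates below are invented for illustration, not benchmarks from any particular game:

```c
/* Why plain upscaling (no frame generation) tends to lower latency: every frame is
 * cheaper to render, so frame time drops, and with it the render part of input lag.
 * The fps figures are hypothetical, purely for illustration. */
#include <stdio.h>

int main(void)
{
    double native_fps   = 45.0;   /* hypothetical: native 1440p */
    double upscaled_fps = 70.0;   /* hypothetical: ~960p internal, upscaled to 1440p */

    printf("native:   %.1f ms per frame\n", 1000.0 / native_fps);
    printf("upscaled: %.1f ms per frame\n", 1000.0 / upscaled_fps);
    printf("per-frame latency saved: ~%.1f ms\n",
           1000.0 / native_fps - 1000.0 / upscaled_fps);
    return 0;
}
/* native:   22.2 ms per frame
 * upscaled: 14.3 ms per frame
 * per-frame latency saved: ~7.9 ms */
```

The exact saving obviously depends on how far the internal resolution drops and how GPU-bound the game is.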
 
But in the case of Starfield, you'll likely need a platform upgrade anyway, so the point is moot.
True, I need to at least get the RX 6750 XT GPU, as I can get that one on the cheap right now, and it will do just fine until I do the full upgrade later next year.
Cost-wise there is a big jump from the RX 6750 XT to the RX 6800 XT; I got a local deal at $299 and the 6800 isn't even close on price, even now.

I need new everything, as my whole build is from 2014-2016, so I need to replace the MOBO, CPU and RAM; let's see the prices in 2024 and whether the market is more stable.
My 1080 card actually did pretty well for many years.
 
They don't, and they shouldn't (they should actually decrease latency).
Neither decreases latency; they're upscalers. Both companies are really sensitive about increasing latency, so rather than try to argue they don't increase latency, they argue they decrease it by making apples-to-oranges comparisons: rendering with the upscaler at its lower internal resolution vs rendering natively at the resolution the upscaler is targeting.
 
Neither decreases latency; they're upscalers. Both companies are really sensitive about increasing latency, so rather than try to argue they don't increase latency, they argue they decrease it by making apples-to-oranges comparisons: rendering with the upscaler at its lower internal resolution vs rendering natively at the resolution the upscaler is targeting.
As I don't have FSR or DLSS (I don't think it was on the 1080, and if it was I didn't use it), I can't say anything from personal testing, only from what I can read. From that, people say that if you have a high FPS, like 60+, the latency is not that bad; however, if you try to fix 20 FPS with DLSS to get something like 60 FPS, the latency is like rubber-banding or ghosting.
 
As I don't have FSR or DLSS (I don't think it was on the 1080, and if it was I didn't use it), I can't say anything from personal testing, only from what I can read. From that, people say that if you have a high FPS, like 60+, the latency is not that bad; however, if you try to fix 20 FPS with DLSS to get something like 60 FPS, the latency is like rubber-banding or ghosting.
You may find this comparison interesting; it shows how much the amount of VRAM can seriously affect performance in the same game...


Edit: Keep clear of 8 GB cards.
 
As I don't have FSR or DLSS (I don't think it was on the 1080, and if it was I didn't use it), I can't say anything from personal testing, only from what I can read. From that, people say that if you have a high FPS, like 60+, the latency is not that bad; however, if you try to fix 20 FPS with DLSS to get something like 60 FPS, the latency is like rubber-banding or ghosting.

That's in reference to frame generation, which needs at least two frames to already have been drawn to interpolate the one between them, resulting in a rather high latency floor. 60 fps final output (30 fps internally) typically means at least 33 ms more latency than could be done without frame generation. This is why pretty much every DLSS 3 title that supports frame generation also requires NVIDIA Reflex; the total end-to-end latency has to be reduced as much as possible to give time for frame gen to work without making the lag perceptible through floaty input. Personally, I use frame generation in Cyberpunk, but only when I can ensure final minimum frame rates remain at 60+, and even then it's borderline perceptible. The optimal frame rate for frame gen in anything with significant action is probably closer to 90-120, but at that point most people would just run without it at ~60 fps.

Normal upscaling, even temporal stuff including DLSS, FSR, or XeSS, without frame generation, is extremely fast; typically sub-millisecond per frame. They only need to reference past frames, so they don't have to wait for anything.
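To put a rough number on that latency floor, here's a back-of-the-envelope sketch; it's a simplified model of the interpolation delay, not how DLSS 3 actually schedules frames internally:

```c
/* Simplified model: an interpolated frame can't be shown until the *next* real frame
 * exists, so input-to-photon lag grows by roughly one internal frame time. */
#include <stdio.h>

int main(void)
{
    double internal_fps[] = { 30.0, 45.0, 60.0 };   /* what the GPU actually renders */
    for (int i = 0; i < 3; i++) {
        double frame_time_ms = 1000.0 / internal_fps[i];
        printf("internal %.0f fps -> output ~%.0f fps, added latency floor ~%.0f ms\n",
               internal_fps[i], internal_fps[i] * 2.0, frame_time_ms);
    }
    return 0;
}
/* internal 30 fps -> output ~60 fps, added latency floor ~33 ms
 * internal 45 fps -> output ~90 fps, added latency floor ~22 ms
 * internal 60 fps -> output ~120 fps, added latency floor ~17 ms */
```

That lines up with the ~33 ms figure above; the model ignores Reflex and frame pacing overhead, so real numbers will vary.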
 
That's in reference to frame generation, which needs at least two frames to already have been drawn to interpolate the one between them, resulting in a rather high latency floor. 60 fps final output (30 fps internally) typically means at least 33 ms more latency than could be done without frame generation. This is why pretty much every DLSS 3 title that supports frame generation also requires NVIDIA Reflex; the total end-to-end latency has to be reduced as much as possible to give time for frame gen to work without making the lag perceptible through floaty input. Personally, I use frame generation in Cyberpunk, but only when I can ensure final minimum frame rates remain at 60+, and even then it's borderline perceptible. The optimal frame rate for frame gen in anything with significant action is probably closer to 90-120, but at that point most people would just run without it at ~60 fps.

Normal upscaling, even temporal stuff including DLSS, FSR, or XeSS, without frame generation, is extremely fast; typically sub-millisecond per frame. They only need to reference past frames, so they don't have to wait for anything.
OK, understood. I haven't played CP2077 for a long time, as I was slogging through the game with a horrible frame rate at launch due to my ageing PC, but for the last 2-3 years GPU prices have been insane, so I didn't want to spend that amount of money on a new card or to upgrade my PC. Now at least I can upgrade the GPU to something decent :D
 
That's in reference to frame generation, which needs at least two frames to already have been drawn to interpolate the one between them, resulting in a rather high latency floor. 60 fps final output (30 fps internally) typically means at least 33 ms more latency than could be done without frame generation. This is why pretty much every DLSS 3 title that supports frame generation also requires NVIDIA Reflex; the total end-to-end latency has to be reduced as much as possible to give time for frame gen to work without making the lag perceptible through floaty input. Personally, I use frame generation in Cyberpunk, but only when I can ensure final minimum frame rates remain at 60+, and even then it's borderline perceptible. The optimal frame rate for frame gen in anything with significant action is probably closer to 90-120, but at that point most people would just run without it at ~60 fps.

Normal upscaling, even temporal stuff including DLSS, FSR, or XeSS, without frame generation, is extremely fast; typically sub-millisecond per frame. They only need to reference past frames, so they don't have to wait for anything.
What was DLSS good for in the first place?
 
Neither decreases latency; they're upscalers. Both companies are really sensitive about increasing latency, so rather than try to argue they don't increase latency, they argue they decrease it by making apples-to-oranges comparisons: rendering with the upscaler at its lower internal resolution vs rendering natively at the resolution the upscaler is targeting.

They DO decrease latency, by allowing your system to render more frames.
So, if you use DLSS/FSR 2.0 you'll get lower latency simply because your system will render more frames than it's able to render when you don't use DLSS/FSR 2.0.

Frame generation also renders more frames, by interleaving fake frames in between rendered frames, but this introduces increased latency, which may be inconsequential in certain types of games.
 