Did Nvidia hire FDev to push the 40 series?!!...

Can anyone explain how a solution that is supposed to purely reduce load on the GPU (DLSS) helps with an engine problem that mainly revolves around CPU load?
I am not sure how it works technically, but I think DLSS 3 has some feature which reduces CPU as well as GPU bottlenecks. I was working, so I only had the reveal on in the background, but NV showed MSFS 2020, a CPU-bound game, and with DLSS 3 turned on it doubled the framerate.

That said, I doubt it would happen (in Elite Odyssey)... but if it does, then great.
 
DLSS 3 can "only" create synthetic frames, similar to asynchronous spacewarp and motion smoothing in VR APIs. So yes, it can multiply the displayed frame rate, but the part of the rendering done by the CPU is unaffected. I.e. if your CPU only manages to prepare 30 fps for your GPU, DLSS can help to double that (by inserting artificial frames), but scripts, animations, player input and so on are still processed at 33.33 ms (30 fps).
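To put rough numbers on that, here is a minimal sketch in Python (the 30 fps figure and the one-generated-frame-per-real-frame ratio are illustrative assumptions, not measured values):

```python
# Illustrative numbers for a CPU-bound game: the engine can only prepare
# 30 "real" frames per second, so the simulation ticks every 33.33 ms.
cpu_fps = 30.0
sim_tick_ms = 1000.0 / cpu_fps

# Frame generation inserts one synthetic frame between each pair of real
# frames; the displayed rate doubles, the simulation cadence does not.
generated_per_real = 1
displayed_fps = cpu_fps * (1 + generated_per_real)

print(f"simulation/input tick: {sim_tick_ms:.2f} ms ({cpu_fps:.0f} real fps)")
print(f"displayed frame rate:  {displayed_fps:.0f} fps (synthetic frames included)")
```

Input, scripts, and animations still move at the 33.33 ms cadence; only the number of images shown per second changes.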
 
Not that I know all that much about it, but it also upscales resolution, no?

Watched a video on it, and the quality of the inserted frames is impressive, and done in real time no less.
 
So. Almost 1.5 years of Odyssey.
Still with garbage performance, and without the DLSS which could help with it.

After Nvidia previewed their new 40 series GPUs with the DLSS 3 magic, I'm starting to understand...
What if... FDev won't add DLSS now because Nvidia hired them to add DLSS 3 later, forcing Elite players to buy a new GPU if they want to play smoothly?!?!?!?
Would be ironic if it's true.

And a new CPU and a new 1000W+ PSU

The people here can't spend £10 on Odyssey, never mind drop £1200 on a graphics card.


😂😂😂

 
I didn't know people had overloaded CPUs, to be honest; mine doesn't go above 25%, and when in space I get 10%.
The GPU, on the other hand, is battered; my poor 2080 Ti is certainly working its little heart out when on foot, or in stations in a ship.

If you uncap frame rates and aren't seeing GPU load in the upper-90% range, there is a bottleneck somewhere else. It can be the CPU proper, or it can be the CPU waiting on memory. Even if it is the CPU, it only takes a single logical core maxing out intermittently to cap the frame rate.
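If you want to spot that yourself, here is a rough sketch (Python with the third-party psutil package; the 90% threshold and the sample count are arbitrary choices for illustration) that watches for a single pegged logical core while average load stays low:

```python
import psutil  # third-party: pip install psutil

# Sample per-core load once a second; a single maxed logical core can cap
# frame rate even while the average across all cores looks harmless.
for _ in range(30):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    hottest = max(per_core)
    average = sum(per_core) / len(per_core)
    flag = "  <- possible main-thread bottleneck" if hottest > 90 else ""
    print(f"avg {average:5.1f}%  hottest core {hottest:5.1f}%{flag}")
```

Run it alongside the game; an average of 10-25% with one core repeatedly near 100% is exactly the pattern described above.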


Yes. With frame generation you are still technically getting the frame rate multiplied without there needing to be more draw calls or engine overhead, but latency isn't improving.

Is that the DLSS 2 part? (I don't know.)

DLSS 3 implies the upscaling and asset replacement already part of DLSS 2; there is probably a way to use it without upscaling, but that would be a very niche use, and DLSS 2 upscaling probably has fewer trade-offs, so most would only want DLSS 3 if DLSS 2 didn't cut it by itself. It also mandates NVIDIA Reflex to mask some latency that would otherwise be more problematic.
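A simplified model of why the latency trade-off exists (plain Python; the pipeline model and every number here are assumptions for illustration, not NVIDIA's actual figures): an interpolated frame sits between two real frames, so the newer real frame has to be held back while the synthetic one is displayed, and Reflex claws some of that back elsewhere by shrinking the render queue.

```python
# Assumed model: frame generation interpolates between real frames N and N+1,
# so frame N+1 reaches the screen roughly half a native frame time late.
native_fps = 60.0
native_frame_ms = 1000.0 / native_fps      # 16.67 ms per real frame

added_delay_ms = native_frame_ms / 2       # ~8.3 ms extra display latency

# Reflex reduces queued-frame latency; 10 ms is an illustrative guess.
reflex_savings_ms = 10.0

net_change_ms = added_delay_ms - reflex_savings_ms
print(f"generation adds ~{added_delay_ms:.1f} ms, Reflex saves "
      f"~{reflex_savings_ms:.1f} ms, net change {net_change_ms:+.1f} ms")
```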

The issue with DLSS 3, in my opinion, is that it will require a 4000 series card, and I don't like the prices, nor the marketing strategy as it is today.

Same issue as earlier versions of DLSS. It's a proprietary solution that presumably leverages hardware only available on the GPUs supported by it. More likely this is not a hard limit (though frame interpolation is probably much costlier in performance terms than upscaling and selective asset replacement, probably prohibitively so on earlier hardware) but a marketing value-add.

Wouldn't buy a part specifically for it, but if I were on the fence, it might push me over.
 
DLSS 3 includes DLSS 2, so yes, AI-supported upscaling is still applied.
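For reference, the upscaling side works from a reduced internal render resolution. A quick sketch of the commonly cited per-axis scale factors for the DLSS 2 presets (widely reported approximations, not an official specification):

```python
# Commonly cited per-axis render-scale factors for DLSS 2 presets;
# approximate reference values only.
presets = {
    "Quality": 1 / 1.5,            # ~66.7% per axis
    "Balanced": 1 / 1.72,          # ~58% per axis
    "Performance": 1 / 2.0,        # 50% per axis
    "Ultra Performance": 1 / 3.0,  # ~33.3% per axis
}

target_w, target_h = 3840, 2160    # example output resolution (4K)
for name, scale in presets.items():
    w, h = int(target_w * scale), int(target_h * scale)
    print(f"{name:>17}: renders {w}x{h}, upscaled to {target_w}x{target_h}")
```

So at 4K, for example, the Quality preset renders internally at roughly 2560x1440 and reconstructs the rest.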

I am wondering if it will be supported in VR applications. It could be a great benefit.
Indeed. Not that I have much hope, but I think it could be beneficial to any ED player using the new Nvidia cards. It looks too expensive in the immediate future, but as time goes by it will become cheaper.
 
As for how it helps a CPU-bound game at all: the same way NIS does, although NIS is not as good as DLSS, and it's a sideways solution either way.
 
With the cost of electricity going up in the UK, most CMDRs here won't be able to afford to run a 40 series GPU, so I doubt it.
 
Well.
VR performance is a bit of a moot point, isn't it?

I mean, at the end of 2015, Horizons required a GTX 980 (as the advertised minimum) when the GTX 980 Ti was the best card available.

Now, in 2022, Odyssey requires at least an RTX 3080 Ti (for a proper VR experience) when we have the 3090 and 3090 Ti on top of it.

So, VR in Elite has always required top-tier video cards.
Expecting to properly play VR in Odyssey on older cards (like the 1080 Ti) or non-top-tier cards like the 3070 is a bit unrealistic.

So, yeah, on the 4000 series things will be way better, and on the 5000 series better still, although if the trend keeps going, you may need a new power line contract to power those cards 😂

Greetings Commanders,

Many of you have been asking about the minimum system requirements for VR in Elite Dangerous, including Horizons, so here they are:

• OS: Windows 7/8/10 64 bit
• Processor: Intel Core i7-3770K Quad Core CPU or better / AMD FX 4350 Quad Core CPU or better
• Memory: 16 GB RAM
• Graphics: Nvidia GTX 980 with 4GB or better
• Network: Broadband Internet Connection
• Hard Drive: 8 GB available space

We are passionate about VR and Elite Dangerous is leading the way in cutting edge VR software development. This is what we consider to be a minimum spec to have a good experience on forthcoming consumer VR headsets.

As most of you are aware, we currently support the HTC Vive and the Oculus Rift 0.5 SDK. We continue to work with Oculus on support for their more recent SDKs, and will let you know if and when there is more to announce.
 

Yeah, I tend to agree that expecting proper Odyssey VR from older or non-top-tier cards is unrealistic, but hopefully you can also understand that, as a 3080 owner, I get a bit salty at the performance and experience we get in Odyssey.

So Babeltech Reviews did some VR tests, incl. Elite Dangerous; they didn't say it was Odyssey, so I assume Horizons:

...about a 75% unconstrained framerate increase over a 3090 is the long and short of it with an Index HMD. They are going to test the 4090 with the higher-res G2 and VP2 next week, which will be interesting, because if there is a CPU bottleneck going on, it's going to open up the graphics headroom with the better displays.
 
I'm just saying that in Horizons, people were starting to be happy with the VR experience when the GTX 1080 Ti appeared, less than 2 years after Horizons was launched (with the mention that Horizons at launch was very taxing on the hardware of the time, or so I read).

Based on these pieces of history repeating, I would expect that Odyssey VR will run better on the high-end cards of the 4000 series (not sure if the 4080/16GB will qualify; that remains to be seen), which, coincidentally or not, would mean less than 2 years after Odyssey launched.
 
VR performance in Elite Odyssey with a Reverb G2. Oh boy. Even with a 3090 Ti I can't turn any of the pretty stuff up high and have to reduce quality/supersampling in various ways. We'll see how the 4090 is this weekend. Maybe VR will actually run smoothly in Odyssey and I can try some of those mythical ultra settings?! Probably not, but fingers crossed it's significantly better.
 