Latest Nvidia Graphics Driver - Low Latency Option

The latest Nvidia graphics driver (436.02) may be of particular interest to VR players trying to squeeze the most juice out of their cards. Here's the announcement:

[attached image: NVIDIA announcement]


To use the feature, go to the Nvidia control panel 3D settings and choose the low latency option. That's it. I'm still testing the driver, but so far it seems to work as advertised. If you decide to try the driver and see a difference in frame rate or graphics quality in ED, by all means drop a post here. Some before-and-after FPS or latency comparisons would be really good. :)
 
NVIDIA's own article mentions that they've had this option in their drivers for a while...though they neglect to mention that, up until a few years back, it used to be possible to set the same zero queue depth that this new 'ultra-low latency mode' re-enables.

Anyway, I've been using 436.02 since it showed up and haven't had any issues with it, other than not being able to uncheck GFE and thus having to install the driver via device manager to get around that...something which they've already hot-fixed.

The low latency modes seem to work as advertised, though one would be hard pressed to notice the difference between 1 and 0 pre-rendered frames as far as input latency goes. I do see a mild frame rate hit to some more demanding areas in ED, but nothing serious. I wouldn't recommend the ultra-low latency/zero pre-rendered frames for anyone that is CPU limited.

Haven't tried it with VR yet and I'm not sure if it can override the 'virtual reality pre-rendered frames' option, which still only goes down to "1".
 
NVIDIA's own article mentions that they've had this option in their drivers for a while...though they neglect to mention that, up until a few years back, it used to be possible to set the same zero queue depth that this new 'ultra-low latency mode' re-enables.

Anyway, I've been using 436.02 since it showed up and haven't had any issues with it, other than not being able to uncheck GFE and thus having to install the driver via device manager to get around that...something which they've already hot-fixed.

The low latency modes seem to work as advertised, though one would be hard pressed to notice the difference between 1 and 0 pre-rendered frames as far as input latency goes. I do see a mild frame rate hit to some more demanding areas in ED, but nothing serious. I wouldn't recommend the ultra-low latency/zero pre-rendered frames for anyone that is CPU limited.

Haven't tried it with VR yet and I'm not sure if it can override the 'virtual reality pre-rendered frames' option, which still only goes down to "1".

Not sure that's right. The feature was only introduced on 20 August 2019 and the "pre-rendered frames" option is still present in the control panel. The "low latency" option has been added alongside it, which suggests there is some difference between the two options or that they complement each other. The jury is still out. :)

The "pre-rendered frames" option is still at the default 1. I just enabled the "low latency" option by setting it to "ultra" and their seems to be a small improvement, although I'd be the first to admit that this may be wishful thinking.
 
"Low latency mode" replaces the old "maximum pre-rendered frames" setting. According to NVIDIA Profile Inspector, it's the same variable.

"Virtual reality pre-rendered frames" is a newer (than the original max pre-rendered frames") and separate option.
 
"Low latency mode" replaces the old "maximum pre-rendered frames" setting.

"Virtual reality pre-rendered frames" is a newer (than the original max pre-rendered frames") and separate option.

Interesting. My control panel only has the "virtual reality pre-rendered frames" option, which is fixed. Maybe (as you say) the new feature simply exposes the old "max pre-rendered frames" setting to players who have not been using Nvidia Inspector. As to the relationship between the options, I don't know. I can only hope that Nvidia is not pulling my leg. It would be nice to see some empirical data around this. :)
 
It doesn't expose anything new. Prior to 436.xx the setting was still there, just under a different name.


It had been there for almost fifteen years. The new setting is actually less fully featured than it originally was (it used to go from 0 to 8, then from 1 to 4, and now offers only 3, 1, or 0).
 
So downloading GFE is now obligatory as of 436.02? Hmmm...
In the past I found GFE to cause direct conflicts with my finely tuned Elite game settings, overriding them with what it felt they should be. As a result I’ve not used it for many years. In this regard, has it been improved recently?

Flimley
 
So downloading GFE is now obligatory as of 436.02? Hmmm...
In the past I found GFE to cause direct conflicts with my finely tuned Elite game settings. As a result I’ve not used it for many years. In this regard, has it been improved recently?

Flimley

The initial release of 436.02 didn't allow GFE to be unselected in the custom install menu. However, I believe this was fixed. Regardless, even with the original 436.02 package, you can still simply extract the files and delete GFE, or manually install the driver via device manager.

Anyway, I'm not aware of any particular conflicts between it and ED, but I don't use it.
 
Thanks. I will find out tomorrow while checking out their website. Hopefully the tick box will now be operational. If not, manual deletion will commence.
GFE did indeed override graphics settings in the past, causing conflicts and much frustration. I stopped using it four years ago as a result; I ultimately considered it unnecessary extra software clogging up my PC.

Flimley
 
It doesn't expose anything new. Prior to 436.xx the setting was still there, just under a different name.


It had been there for almost fifteen years. The new setting is actually less fully featured than it originally was (it used to go from 0 to 8, then from 1 to 4, and now offers only 3, 1, or 0).

Sorry, didn't mean empirical data from you. I believe you. Meant data around the low latency feature itself to see what if any difference it makes. :)
 
Sorry, didn't mean empirical data from you. I believe you. Meant data around the low latency feature itself to see what if any difference it makes. :)

You can find some input latency benchmarks on other incarnations of these queue depth limiters (e.g. https://www.techspot.com/article/1879-amd-radeon-anti-lag/), but the easiest way to tell if it's working or not, without tools to measure actual input-to-display latency, is to run some benchmarks. Generally, performance at very low render queue depths will be slightly worse, both in raw frame rate and in frame interval consistency, especially in lower FPS and CPU limited scenarios.
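If anyone wants to run that comparison themselves, here's a minimal sketch of the sort of analysis I mean, assuming you log two otherwise identical runs (setting off, then ultra) with something like PresentMon. The file names are placeholders and "MsBetweenPresents" is how PresentMon normally labels frame intervals, so adjust to whatever your capture tool produces:

```python
# Minimal sketch: compare average frame rate and frame-interval consistency
# between two frame-time captures (e.g. low latency "off" vs "ultra").
# Assumes a PresentMon-style CSV with an "MsBetweenPresents" column.
import csv
import statistics

def summarize(path: str) -> dict:
    with open(path, newline="") as f:
        frame_times = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]
    return {
        "avg_fps": 1000.0 / statistics.mean(frame_times),
        "1pct_low_fps": 1000.0 / sorted(frame_times)[int(len(frame_times) * 0.99)],
        "frametime_stdev_ms": statistics.stdev(frame_times),
    }

# Hypothetical capture files from two otherwise identical benchmark runs.
for label, path in (("off", "ed_lowlatency_off.csv"), ("ultra", "ed_lowlatency_ultra.csv")):
    print(label, summarize(path))
```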

Actually perceiving the difference in latency can be tricky. Unless one's frame rate is relatively low and purely GPU limited (allowing the render queue to stay filled), consciously perceiving the difference is unlikely, at least between on/1 and ultra/0. The difference between off/auto/3 and ultra/0 is quite noticeable in some games though. However, unless it causes problems, I'd leave it enabled in all supported titles (most DX9/11 titles, including ED) because you don't need to perceive reduced input latency for it to matter...your aim and other time-sensitive inputs will be better.

If you've got a game where there is enough latency that it's really obvious, something is probably wrong. Reducing the queue depth may help, but there could also be other underlying issues to address.

Anyway, I'm sure some third party will explicitly review the renewed NVIDIA setting before the week is out.
 
I’m using the Oculus Rift and will add my PC specs below.
436.02 is now installed... I have been experimenting with latency set to “off”, “on” and then “ultra” in various places and situations in ED, but I simply cannot determine any difference in GPU load or frame rate.
Anyone tried this yet?

System specs
Processor: Intel Quad Core i7 4790K
Memory: 16GB Corsair 2400MHz RAM
Graphics: MSI GTX 1080 Gaming X
Hard drive: 120GB Solid State Drive
Power supply: XFX XTR Series 750W 80 PLUS Gold Certified Fully-Modular
12Mbps broadband connection.


Flimley
 
I’m using the Oculus Rift and will add my PC specs below.
436.02 is now installed... I have been experimenting with latency set to “off”, “on” and then “ultra” in various places and situations in ED, but I simply cannot determine any difference in GPU load or frame rate.
Anyone tried this yet?

System specs
Processor: Intel Quad Core i7 4790K
Memory: 16GB Corsair 2400MHz RAM
Graphics: MSI GTX 1080 Gaming X
Hard drive: 120GB Solid State Drive
Power supply: XFX XTR Series 750W 80 PLUS Gold Certified Fully-Modular
12Mbps broadband connection.


Flimley

I didn't see any difference either, and Morbad's comments explain why it is unlikely to make any noticeable difference, but I did some 3D benchmarking tests with the low latency setting on "off" and on "ultra" (all other settings identical). The overall result was a tiny bit better with the setting on, but not the big improvement the Nvidia ad would imply.

Still, any gain is better than no gain, so I have left it on.
 
The main effect is to reduce the input-to-display latency in GPU limited scenarios.

The only way to benchmark this directly is with something capable of measuring this delay. Typically, reviewers will point a high-speed camera at some clearly visible input source and the display, then measure the number of frames (of video) it takes between a control being actuated and the effect appearing on screen.
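The conversion from a camera frame count to a latency figure is trivial; a quick sketch, where the 240 fps capture rate is just an example:

```python
# Convert "number of camera frames between button press and on-screen effect"
# into milliseconds. 240 fps is an arbitrary example capture rate.
def camera_frames_to_latency_ms(frames_counted: int, camera_fps: float = 240.0) -> float:
    return frames_counted * 1000.0 / camera_fps

print(camera_frames_to_latency_ms(12))  # e.g. 12 frames at 240 fps -> 50.0 ms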

The best-case improvement in latency between off and ultra is three frames. This requires a wholly GPU-limited scenario (so the queue is always full) and no other features that would hold frames longer (any form of vsync). At 60 fps this would be a ~50ms latency saving (which could well be two thirds of the total latency)...certainly significant, but even that is borderline relative to the threshold of perception for many people. The total latency improvement declines as frame rate increases, since each frame takes less time to render and display.
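To put numbers on that, here's the back-of-envelope arithmetic behind the figures above (nothing more than frames saved divided by frame rate):

```python
# Best-case saving from dropping the render queue from 3 queued frames to 0,
# assuming a wholly GPU-limited scenario where the queue stays full.
FRAMES_SAVED = 3

for fps in (30, 60, 90, 144):
    saving_ms = FRAMES_SAVED * 1000.0 / fps
    print(f"{fps:>3} fps: up to ~{saving_ms:.0f} ms less input-to-display latency")
```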

Any latency improvement will be beneficial in competitive action gaming. Even if you shave 5% off your reaction times, it will mean statistically superior performance in the long run...more close calls will be decided in your favor, the amount of lead you require will be reduced (and thus accuracy will improve), etc. However, it takes good reflexes and a well trained eye to consciously perceive any of this at higher framerates and it's almost wholly meaningless for non-time sensitive scenarios, or slower games, unless you have such a huge latency problem that your controls/cursors feel drifty.

Any impact on frame rate performance is a tangential side effect that is only worth mentioning because a total absence of it would, lacking other information, imply that the setting hasn't been accepted and isn't functional in the title in question. Even then, this will be uncertain in titles that perform well and are never CPU limited...a render queue doesn't do much of anything there, except maybe smooth out frame intervals, which would be far too minor to detect at high fps anyway without an accurate analysis and comparison of the render and present times for a significant number of frames.

The reason features like this languished in obscurity for fifteen years is because they don't improve frame rates and aren't easy to benchmark. Leaving the render queue alone is how you win benchmarks and have the most success marketing most products. Only with the rising popularity of esports and a slowdown in raw performance wars have other performance concerns become mainstream enough for real attention to be drawn to these features. Everyone is looking for any marketing edge they can get, and when the raw framerate comparisons are spent, feature comparisons become the deciding factors. Queue depth tweaks that have always been available, one way or another, are an easy checkbox to fill.
 
Still, any gain is better than no gain, so I have left it on.
I agree. And well done for doing the benchmark tests. I only changed the settings in known GPU heavy-load spots, such as stations and asteroid rings, to observe GPU load and frame rate by eye.

Watching some babbling YouTube videos yesterday, I saw one guy find the FPS to be even worse in certain titles with latency set to ultra. 4.20 testing begins....
Source: https://youtu.be/sBQPGr87w6Q


Overall, and from what I can determine, it’s mostly hot air.

Flimley
 
Any latency improvement will be beneficial in competitive action gaming. Even if you shave 5% off your reaction times, it will mean statistically superior performance in the long run...more close calls will be decided in your favor, the amount of lead you require will be reduced (and thus accuracy will improve), etc. However, it takes good reflexes and a well trained eye to consciously perceive any of this at higher framerates and it's almost wholly meaningless for non-time sensitive scenarios, or slower games, unless you have such a huge latency problem that your controls/cursors feel drifty.

If I understand this correctly, input-to-display online latency will include scene update too, and possibly even network round trips? If this option has to make a difference by optimizing purely on the GPU side, it surely means the GPU is way too stressed for competitive action anyway.
 
If I understand this correctly, input-to-display online latency will include scene update too, and possibly even network round trips?

The render queue is only part of input-to-display latency, correct. Still, it can be a significant part, and in competitive scenarios, any improvement is an improvement.

If this option has to make a difference by optimizing purely on the GPU side, it surely means the GPU is way too stressed for competitive action anyway.

Level of GPU stress doesn't imply this. It's generally far better to be GPU limited than CPU limited, and I have plenty of games where I am wholly GPU limited well into triple-digit frame rates, even with a fairly modest CPU.
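To make the "only part of the chain" point concrete, here's a toy breakdown of input-to-display latency. The stage names and numbers are purely illustrative guesses for a ~60 fps, GPU-limited case, not measurements:

```python
# Toy breakdown of input-to-display latency. The render queue is just one
# stage; shrinking it helps, but the rest of the chain is unaffected.
# All numbers are made-up examples for a ~60 fps, GPU-limited case.
stages_ms = {
    "input sampling / polling": 4,
    "simulation + scene update (CPU)": 8,
    "network contribution": 0,               # depends on the game/netcode; unaffected by this setting
    "render queue (3 frames @ 60 fps)": 50,  # drops toward 0 with ultra/zero queue depth
    "GPU render + present": 16,
    "display scan-out / pixel response": 10,
}

total = sum(stages_ms.values())
no_queue = total - stages_ms["render queue (3 frames @ 60 fps)"]
print(f"total: ~{total} ms; with a zero-depth render queue: ~{no_queue} ms")
```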
 