The main effect is to reduce the input-to-display latency in GPU-limited scenarios.
The only way to benchmark this directly is to measure that delay itself. Typically, reviewers point a high-speed camera at a clearly visible input source and the display, then count the video frames between a control being actuated and the effect appearing on screen.
The best-case improvement in latency between off and ultra is three frames. This requires a wholly GPU-limited scenario (so the queue is always full) and no other features that would hold frames longer (any form of vsync). At 60 fps that works out to a ~50ms latency savings (which could well be two thirds of the total latency)...certainly significant, but even that borders on the threshold of perception for many people. The improvement shrinks as frame rate increases, since each frame takes less time to render and display.
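For concreteness, here is a small sketch of the arithmetic behind those numbers. The three-frame savings and the frame rates are the ones discussed above; nothing else is assumed:

```cpp
#include <cstdio>

// Back-of-the-envelope latency savings from shrinking the render queue.
// Assumes a fully GPU-limited pipeline in which each queued frame adds
// one whole frame time of latency (the best case described above).
int main() {
    const double framesSaved = 3.0;              // queue shortened by 3 frames
    const double fpsValues[] = {60.0, 120.0, 240.0};

    for (double fps : fpsValues) {
        double frameTimeMs = 1000.0 / fps;       // time per frame in ms
        double savedMs = framesSaved * frameTimeMs;
        std::printf("%6.0f fps: %5.2f ms/frame -> up to %5.1f ms saved\n",
                    fps, frameTimeMs, savedMs);
    }
    return 0;
}
```

At 60 fps the three frames are worth the ~50ms above; at 240 fps the same three frames are only ~12.5ms, which is why the benefit fades as framerates climb.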
Any latency improvement will be beneficial in competitive action gaming. Even shaving 5% off your effective reaction time means statistically superior performance in the long run...more close calls decided in your favor, less lead required (and thus better accuracy), etc. However, it takes good reflexes and a well-trained eye to consciously perceive any of this at higher framerates, and it's almost wholly meaningless in non-time-sensitive scenarios or slower games, unless you have such a huge latency problem that your controls/cursors feel drifty.
Any impact on frame rate is a tangential side effect, worth mentioning only because its complete absence would, lacking other information, suggest that the setting hasn't taken effect and is not functional in the title in question. Even then, this is uncertain in titles that perform well and are never CPU-limited...a render queue doesn't do much of anything there, except perhaps smooth out frame intervals, which would be far too subtle to detect at high fps without an accurate analysis and comparison of the render and present times across a significant number of frames.
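A minimal sketch of that kind of analysis, assuming you already have per-frame present timestamps from some capture tool (PresentMon is one option; the sample values below are made up purely for illustration):

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Given a series of present timestamps (in seconds), compute the mean
// frame interval and its standard deviation. A lower deviation at the
// same mean would indicate smoother frame pacing. Capturing the
// timestamps is left out; these sample values are illustrative only.
int main() {
    std::vector<double> presentTimes = {0.0000, 0.0166, 0.0335, 0.0500,
                                        0.0668, 0.0832, 0.1001, 0.1166};

    std::vector<double> intervals;
    for (size_t i = 1; i < presentTimes.size(); ++i)
        intervals.push_back(presentTimes[i] - presentTimes[i - 1]);

    double mean = 0.0;
    for (double dt : intervals) mean += dt;
    mean /= intervals.size();

    double var = 0.0;
    for (double dt : intervals) var += (dt - mean) * (dt - mean);
    var /= intervals.size();

    std::printf("mean interval: %.2f ms, std dev: %.2f ms\n",
                mean * 1000.0, std::sqrt(var) * 1000.0);
    return 0;
}
```

In practice you'd want thousands of frames, not eight, before drawing any conclusion about pacing differences this small.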
The reason features like this languished in obscurity for fifteen years is that they don't improve frame rates and aren't easy to benchmark. Leaving the render queue alone is how you win benchmarks, and winning benchmarks is how you market most products. Only with the rising popularity of esports and a slowdown in the raw performance wars have other performance concerns become mainstream enough for real attention to be drawn to these features. Everyone is looking for any marketing edge they can get, and when the raw framerate comparisons are spent, feature comparisons become the deciding factors. Queue depth tweaks that have always been available, one way or another, are an easy checkbox to fill.
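On the application side, that knob has been exposed in DXGI for ages via IDXGIDevice1::SetMaximumFrameLatency, which caps how many frames the runtime will queue ahead of the GPU. A minimal sketch (the helper name CapRenderQueue is mine, and error handling is kept to a bare minimum):

```cpp
#include <dxgi.h>
#include <d3d11.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Cap the render queue depth for a D3D11 device via DXGI.
// `device` is assumed to be an already-created ID3D11Device.
HRESULT CapRenderQueue(ID3D11Device* device, UINT maxLatency)
{
    ComPtr<IDXGIDevice1> dxgiDevice;
    HRESULT hr = device->QueryInterface(IID_PPV_ARGS(&dxgiDevice));
    if (FAILED(hr))
        return hr;

    // maxLatency = 1 forces the CPU to wait until the previous frame is
    // presented before queuing another; the DXGI default is 3 frames.
    return dxgiDevice->SetMaximumFrameLatency(maxLatency);
}
```

Driver-level settings like the one discussed here effectively impose the same kind of cap from outside the application, which is why they could be shipped as a simple control-panel checkbox.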