
Thread: Does Monitor Size Have an effect on Ship Manoeuvrability?

  1. #16

  2. #17
    Originally Posted by rlsg
    Actually it is not, and there are a lot of variables with regard to general software design that have an impact in that regard.

    15ms of lag would be around one full refresh of a proper 60Hz TV (~16.6ms refresh cycle time), whether it is V-Synced or not. 4K monitors can vary from 1ms to 8ms; 1ms is pretty much the norm for gaming 1080p monitors IME. However, this is all moot since the OP has obviously resolved their particular issue.
    Yeh, I corrected you because I don't know what I'm talking about. Uh-huh, yeh, people do that a lot. As for that second paragraph...lol. Just, wow, and you were doing so well.

  3. #18
    There is scaling/processing lag on many displays that is entirely separate from, and in addition to, any pixel response, which is in turn in addition to any time the system and the renderer take to prepare the frame in the first place.

    My current primary display (a BenQ BL3200PT monitor) has about a one refresh/frame (~17ms) processing delay from the time the signal hits the display until the time it can even start to draw the image to the panel. Pixel response then depends on the degree of change being made to a pixel (certain transitions are faster than others), the scanline that pixel is on (top of the screen is refreshed first, even in LCDs), and any response time compensation setting used (which can dramatically improve pixel response, but may cause overshoot ghosting).
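
    To put rough numbers on how those stages stack up, here's a quick back-of-the-envelope sketch (the one-refresh processing delay is my display's figure from above; the pixel response value is just an assumption for illustration):

        # Illustrative latency stack-up for a 60Hz panel (all figures are assumptions).
        REFRESH_HZ = 60
        frame_ms = 1000.0 / REFRESH_HZ              # ~16.7ms per refresh

        processing_delay_ms = frame_ms * 1.0        # ~1 frame of scaler/processing delay
        pixel_response_ms = 5.0                     # assumed grey-to-grey transition time

        def latency_at_scanline(line, total_lines=1080):
            """Extra wait until the scanout reaches a given scanline (the top is drawn first)."""
            scanout_ms = frame_ms * (line / total_lines)
            return processing_delay_ms + scanout_ms + pixel_response_ms

        print("top of screen:    %.1f ms" % latency_at_scanline(0))
        print("middle of screen: %.1f ms" % latency_at_scanline(540))
        print("bottom of screen: %.1f ms" % latency_at_scanline(1079))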

    Most monitors and modern TVs with a 'game mode', or options to otherwise limit the amount of processing, will have very low input latency (anywhere from zero to one frame's worth), other than the pixel response proper. However, some TVs can have as many as four or five full frames of input latency (upwards of 80ms in the worst cases), which will destroy their utility for gaming. Fortunately this is rather rare today.

    Any sort of frame rate cap, including vsync, is generally bad for input latency as it increases the time it takes to draw frames. Normally this is not a big deal, unless you allow the frame rate to dip too low, or the queue depth is too large (this can be adjusted in NVIDIA's driver settings, or in the registry values for AMD's drivers, though the latter is usually well optimized by default). The real issue with vsync is that any missed refresh causes the duplication of a frame, which can present as stutter in addition to automatically making the minimum latency hit one full frame (worse, if a queue has built up). Even with triple buffering or the like, you'll be bouncing between frames that are displayed for ~17ms and ones that are displayed for ~33ms if you can't maintain a frame rate at least as high as the refresh rate.
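
    To illustrate that last point, here's a simple model of what strict vsync (no queue) does to displayed frame times on a 60Hz panel; the render times are made-up examples:

        import math

        REFRESH_MS = 1000.0 / 60.0   # ~16.7ms per refresh on a 60Hz panel

        def displayed_time_ms(render_ms):
            """With vsync, a frame is held for a whole number of refresh intervals."""
            return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

        for render_ms in (12.0, 16.0, 17.5, 25.0):
            print("render %.1f ms -> displayed for %.1f ms" % (render_ms, displayed_time_ms(render_ms)))

        # Anything that misses a refresh even slightly jumps from ~16.7ms to ~33.3ms,
        # which is the 17ms/33ms alternation (and stutter) described above.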

    Also, if you are trying to avoid tearing, frame rate caps aren't a substitute for synchronization. You'll almost never be able to set the frame rate cap to exactly the refresh rate of the display and even if you could, you would have no way to control when the refresh starts relative to the frame, or vice versa. You can use frame rate limiters in conjunction with vsync to limit peak queue depth while removing tearing, but most in-game limiters aren't accurate enough to completely eliminate stutter with this.
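
    As a rough sketch of why a cap alone can't line up with the refresh: even a tiny mismatch between the cap and the true refresh rate makes the tear line drift continuously across the screen. The numbers below are assumptions for illustration, not measurements:

        REFRESH_HZ = 60.000     # assumed true refresh rate of the display
        CAP_FPS = 59.990        # a "close but not exact" frame rate cap

        refresh_period = 1.0 / REFRESH_HZ
        frame_period = 1.0 / CAP_FPS

        # Phase slip per second: how far each new frame lands relative to the refresh cycle.
        slip_per_second = abs(frame_period - refresh_period) * CAP_FPS
        seconds_to_wrap = refresh_period / slip_per_second

        print("phase slip: %.6f s per second" % slip_per_second)
        print("tear line sweeps the whole screen roughly every %.0f s" % seconds_to_wrap)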

    Personally, I prefer an uncapped frame rate, no vsync at all, and the lowest queue depth possible, as this is the lowest possible input latency, with no chance of artificially induced stutter. Tearing is generally not very perceptible either as the frame rate tends to be much higher than the refresh rate, minimizing the differences between one frame and the next (and thus what is above or below the tear line). There are some exceptions, but they are generally ones where vsync would be worse.

    If tearing really bothers you, get a VRR display; use RTSS to cap the frame rate to a hundredth of a frame below your true refresh rate (https://www.testufo.com/refreshrate) and turn vsync on; or use Fast Sync (NVIDIA)/Enhanced Sync (AMD) and set your "max frames render ahead" (for NVIDIA cards) to "1".
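
    For what it's worth, the cap value is just simple arithmetic on whatever refresh rate the testufo page reports; the measured figure below is an example value, not a recommendation for any particular panel:

        # Hypothetical example: derive the RTSS cap from a measured refresh rate.
        measured_refresh_hz = 59.943            # e.g. as reported by testufo.com/refreshrate
        cap_fps = measured_refresh_hz - 0.01    # a hundredth of a frame below the true refresh

        print("set the frame rate cap to %.3f fps" % cap_fps)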

  4. #19
    Originally Posted by Aashenfox
    Yeh, I corrected you because I don't know what I'm talking about. Uh-huh, yeh, people do that a lot. As for that second paragraph...lol. Just, wow, and you were doing so well.
    Your lack of knowledge and lack of knowing what you don't know just proves that you don't know what you are talking about. And yes - people do that a lot on the internet.

    I often see certain things professed from certain quadrants of the gaming community that make so many assumptions about the software design that it is ridiculous. In the old days, when most games were single-threaded, V-Sync would have had a greater impact; in the current days of multi-threading there are various software patterns that can mitigate such concerns. I have used such patterns in R/T man-in-the-loop simulations as far back as 2000, and the patterns in question are probably older than that.
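
    One pattern of that sort, sketched very loosely below (my own simplification, not any specific engine's design), is to decouple the simulation/input thread from the render/present thread, so a present that blocks on V-Sync only delays what gets drawn, not when input is sampled and the world is updated:

        import threading, time

        latest_state = {"t": 0.0}   # last completed simulation state
        state_lock = threading.Lock()
        running = True

        def simulation_thread(rate_hz=120):
            """Samples input and advances the world at a fixed rate; never waits on vsync."""
            global latest_state
            dt = 1.0 / rate_hz
            t = 0.0
            while running:
                t += dt                         # stand-in for input sampling + world update
                with state_lock:
                    latest_state = {"t": t}
                time.sleep(dt)

        def render_thread(refresh_hz=60):
            """Draws whatever state is newest; presenting may block on vsync without stalling the sim."""
            period = 1.0 / refresh_hz
            while running:
                with state_lock:
                    snapshot = latest_state     # take the most recent completed state
                # render(snapshot) and present() would go here; the present waits for vblank
                time.sleep(period)              # stand-in for the vsync'd present

        sim = threading.Thread(target=simulation_thread)
        draw = threading.Thread(target=render_thread)
        sim.start(); draw.start()
        time.sleep(1.0)
        running = False
        sim.join(); draw.join()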

    WRT HD TV refresh - a lot are actually interlaced, so a 60Hz TV in such a case would have a 30Hz effective refresh (50Hz/25Hz if adhering to legacy PAL frequencies). There are, however, at least some TVs that are capable of 1080p60, and in those cases 15ms would be near enough a single screen refresh of latency. A display cannot update faster than its refresh rate regardless of the frame rate a game is allegedly running at; if software tries to, then you get tearing and/or other artefacts (one of the primary reasons for V-Sync in the first place).
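
    The arithmetic behind those figures, for reference:

        def refresh_period_ms(hz, interlaced=False):
            """Time for one full image update; interlaced modes need two fields per frame."""
            effective_hz = hz / 2 if interlaced else hz
            return 1000.0 / effective_hz

        print("1080p60 progressive: %.1f ms" % refresh_period_ms(60))        # ~16.7 ms
        print("1080i60 interlaced:  %.1f ms" % refresh_period_ms(60, True))  # ~33.3 ms
        print("PAL 50i:             %.1f ms" % refresh_period_ms(50, True))  # ~40.0 ms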

  5. #20
    Originally Posted by Morbad
    Any sort of frame rate cap, including vsync, is generally bad for input latency as it increases the time it takes to draw frames.
    Not quite accurate - V-Sync holds up the render pipeline until the buffer it has written to is displayed. While that can seem to induce "additional" latency depending on various factors, it need not. Triple buffering paired with V-Sync can mean that a render pipeline can effectively free-run, with only the last complete frame being presented to the display for processing (effectively what NVIDIA refer to as Fast Sync mode).
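
    A minimal sketch of that "present the newest completed frame" behaviour (a simplified model of Fast Sync-style triple buffering, not NVIDIA's actual implementation): the renderer free-runs into the back buffers while, at each vblank, the display picks up whichever frame finished most recently and the rest are simply dropped:

        REFRESH_MS = 1000.0 / 60.0   # 60Hz display
        RENDER_MS = 7.0              # assumed render time: the GPU free-runs well above 60fps

        def simulate(duration_ms=100.0):
            shown = []                    # frame ids actually scanned out
            newest_frame = 0              # id of the last fully rendered frame (0 = nothing yet)
            render_done = RENDER_MS       # time the frame currently being rendered will finish
            next_vblank = REFRESH_MS
            frame_id = 1
            while next_vblank < duration_ms:
                if render_done <= next_vblank:
                    # Renderer completes a frame before the vblank; it overwrites the older back buffer.
                    newest_frame = frame_id
                    frame_id += 1
                    render_done += RENDER_MS
                else:
                    # Vblank: the display presents whichever frame completed most recently.
                    shown.append(newest_frame)
                    next_vblank += REFRESH_MS
            return shown

        print(simulate())   # e.g. [2, 4, 7, ...] -- intermediate frames are dropped, the renderer never waits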

    Tearing can induce simulator sickness as well as generally looking ugly and potentially having an adverse effect on accurate targeting.

  6. #21
    Originally Posted by Morbad
    Any sort of frame rate cap, including vsync, is generally bad for input latency as it increases the time it takes to draw frames.
    In games such as FPS shooters I'm always running a frame rate cap at the max monitor refresh rate with G-Sync on, and I never get any noticeable input lag. I'm sensitive to such lag and notice it at fairly low levels. I haven't run without G-Sync and a framerate cap... Might be interesting to do so...

    For any game like ED the input lag would really have to be up there to notice it, and then only in PvP combat.

  7. #22
    Originally Posted by Pville_Piper
    In games such as FPS shooters I'm always running a frame rate cap at the max monitor refresh rate with G-Sync on, and I never get any noticeable input lag. I'm sensitive to such lag and notice it at fairly low levels. I haven't run without G-Sync and a framerate cap... Might be interesting to do so...

    For any game like ED the input lag would really have to be up there to notice it, and then only in PvP combat.
    Input latency doesn't need to be perceptible to matter; any and every objective reduction could potentially be the deciding factor in a time-sensitive contest. The sum of all possible reductions to latency will almost certainly be significant, even if each individual improvement is quite minor.

    If I had a VRR panel I'd likely be using vsync with a frame rate cap slightly below the VRR limit, as this would eliminate tearing while adding almost no latency. However, I don't have a VRR display, and since I find the occasional stutter introduced by standard vsync to be more noticeable than the degree of tearing I see with it off, I'm not willing to add any latency for no real improvement to my subjective experience.

  8. #23
    Originally Posted by Aashenfox
    Input lag is the technical term for it. Lock max fps to 60 and disable vsync in game; you should be fine provided your PC can kick out a solid 60fps. I use a TV on my sim rig as well; it's not ideal, but with that adjustment at least I can now shoot. Make sure the TV is in gaming mode as well, to give the fastest response.
    ^This works for me on a 50" 4K. GTX 1080 Ti, though...
