Many displays have scaling/processing lag that is entirely separate from, and in addition to, pixel response, which is itself on top of whatever time the system and the renderer take to prepare the frame in the first place.
My current primary display (a BenQ BL3200PT monitor) has about one refresh (~17ms at 60Hz) of processing delay from the time the signal hits the display until the time it can even start to draw the image to the panel. Pixel response then depends on the degree of change being made to a pixel (certain transitions are faster than others), the scanline that pixel is on (the top of the screen is refreshed first, even on LCDs), and any response time compensation setting used (which can dramatically improve pixel response, but may cause overshoot ghosting).
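To put rough numbers on that chain, here is a toy calculation. Only the ~17ms processing delay comes from the measurement above; the render and pixel-response figures are made-up examples, not measurements:

```python
# Rough, illustrative latency budget for a 60 Hz display with one refresh
# of internal processing. All figures except the ~17 ms processing delay
# are made-up examples.

REFRESH_HZ = 60
FRAME_MS = 1000 / REFRESH_HZ       # ~16.7 ms per refresh

render_ms = 10.0                   # game + GPU time to produce the frame (example)
processing_ms = FRAME_MS           # display's internal processing (~1 refresh)
scanout_ms = FRAME_MS * 0.5        # mid-screen pixel: panel is drawn top to bottom
pixel_response_ms = 5.0            # depends on transition and overdrive (example)

total_ms = render_ms + processing_ms + scanout_ms + pixel_response_ms
print(f"~{total_ms:.1f} ms from frame start to a settled mid-screen pixel")
# -> ~40.0 ms, before any vsync queueing is added on top
```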
Most monitors, and modern TVs with a 'game mode' or options to otherwise limit the amount of processing, will have very low input latency (anywhere from zero to one frame's worth) beyond the pixel response proper. However, some TVs have as many as four or five full frames of input latency (upwards of 80ms in the worst cases), which destroys their utility for gaming. Fortunately this is rather rare today.
Any sort of frame rate cap, including vsync, is generally bad for input latency, as it increases the time it takes to deliver each frame. Normally this is not a big deal, unless you let the frame rate dip too low or the queue depth is too large (queue depth can be adjusted in NVIDIA's driver settings, or in the registry values for AMD's drivers, though the latter is usually well optimized by default). The real issue with vsync is that any missed refresh causes a frame to be duplicated, which can present as stutter in addition to automatically making the minimum latency hit one full frame (worse if a queue has built up). Even with triple buffering or the like, you'll be bouncing between frames that are displayed for ~17ms and ones that are displayed for ~33ms if you can't maintain a frame rate at least as high as the refresh rate.
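To see where that 17/33ms bounce comes from, here's a minimal sketch, assuming a 60Hz display and a queue deep enough that rendering is never blocked; the 22ms render time is just an example of "below refresh rate":

```python
import math

# Toy vsync model: each frame appears at the first refresh boundary after
# it finishes rendering. Rendering slower than the refresh rate forces
# some frames to stay on screen for two refreshes instead of one.

REFRESH_MS = 1000 / 60             # 60 Hz -> ~16.7 ms per refresh

def on_screen_times(render_ms, n_frames):
    """How long each frame stays on screen, given a fixed render time."""
    finish = [render_ms * (i + 1) for i in range(n_frames)]
    present = [math.ceil(f / REFRESH_MS) * REFRESH_MS for f in finish]
    return [round(b - a, 1) for a, b in zip(present, present[1:])]

# ~45 fps (22 ms/frame) on a 60 Hz display:
print(on_screen_times(22.0, 10))
# -> [16.7, 16.7, 33.3, 16.7, 16.7, 33.3, ...]  (the stutter you see)
```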
Also, if you are trying to avoid tearing, frame rate caps aren't a substitute for synchronization. You'll almost never be able to set the cap to exactly the refresh rate of the display, and even if you could, you would have no way to control when the refresh starts relative to the frame, or vice versa. You can use frame rate limiters in conjunction with vsync to limit peak queue depth while removing tearing, but most in-game limiters aren't accurate enough to completely eliminate stutter this way.
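As a quick illustration of why even a near-perfect cap doesn't hold the tear line still (the 0.02Hz mismatch below is an arbitrary example):

```python
# A cap that misses the true refresh rate by even 0.02 Hz makes the tear
# line crawl through the whole frame over and over. Numbers are examples.

refresh_hz = 60.00                 # true refresh rate
cap_hz = 60.02                     # "matching" cap that's slightly off

frame_ms = 1000 / refresh_hz
drift_per_frame_ms = abs(1000 / cap_hz - frame_ms)   # phase shift each frame

frames_to_wrap = frame_ms / drift_per_frame_ms
seconds_to_wrap = frames_to_wrap / cap_hz
print(f"tear line sweeps the full screen every ~{seconds_to_wrap:.0f} s")
# -> ~50 s here; a bigger mismatch just makes it sweep faster
```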
Personally, I prefer an uncapped frame rate, no vsync at all, and the lowest queue depth possible, as this gives the lowest possible input latency with no chance of artificially induced stutter. Tearing is generally not very perceptible either, as the frame rate tends to be much higher than the refresh rate, minimizing the differences between one frame and the next (and thus between what is above and below the tear line). There are some exceptions, but they are generally cases where vsync would be worse.
If tearing really bothers you, get a VRR display, use RTSS to cap the frame rate to a hundredth of a frame below your true refresh rate (measure it at https://www.testufo.com/refreshrate), and turn vsync on; alternatively, use Fast Sync (NVIDIA) / Enhanced Sync (AMD) and set "Maximum Pre-Rendered Frames" (on NVIDIA cards) to 1.
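For concreteness, reading "a hundredth of a frame below" as 0.01 fps under the measured refresh rate (the 59.94Hz reading below is just an example; use whatever the testufo page reports for your display):

```python
# Example of the cap arithmetic. The measured value is hypothetical; take
# yours from https://www.testufo.com/refreshrate (it reports decimals).

measured_refresh_hz = 59.94        # example reading
rtss_cap_fps = measured_refresh_hz - 0.01
print(f"RTSS framerate limit: {rtss_cap_fps:.2f} fps")   # -> 59.93 fps
# Capping just under the refresh rate keeps the vsync queue drained, so
# you keep tear-free output without the extra queued-frame latency.
```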