Combating colour banding on 1440p monitors

So last year I decided to get a new monitor, and through reviews and stats I settled on the Dell S2417DG. Unfortunately, it didn't quite offer a good experience.

[Attachment 148934: screenshot showing the colour banding in-game]


Sadly ED seems to be the game most affected by this, since the problem only occurs on dark gradients. In other games the monitor's performance is up to par.

I've tried googling the issue, but it seems a bit too much of an edge case for the results to help me choose a replacement that won't necessarily have the same issue.

I temporarily worked around the issue with ReShade, but I kind of want to solve the underlying issue rather than blur the problem out with a post-processing filter.

Fellow commanders, what are your recommendations?
 
That looks dreadful. Does it change with monitor mode?

I have absolutely no evidence of this on my (IPS) BenQ monitor (GW2765) at 2560 x 1440, driven by a GTX 1070 8GB - my Elite runs at the Ultra preset, nothing "tweaked". Good luck getting it sorted.
 
I see that a significant number of the reviews on Amazon UK mention colour banding. This was not unusual for older, cheaper TN panels, but I'm surprised to see it on a modern £400+ Dell.

Have you tried one of the suggested workarounds: Nvidia Control Panel > Change Resolution > Output dynamic range > Limited?

I'm sure you've tried it, but adjusting Gamma might help a little (though it's just masking the issue), either in-game or from the Nvidia Control Panel (apparently the monitor has no gamma control, which is weird).
 
Have you tried one of the suggested workarounds: Nvidia Control Panel > Change Resolution > Output dynamic range > Limited?

Hi, thanks for taking an interest!

Setting the colour range to Limited does mostly mitigate the issue, but it removes a lot of the clarity of the colours, making the whole screen look overly bright/grey and losing a lot of colour dynamics. Between the harshness of the banding issue and the loss of colour quality, I'd rather take the banding.

The picture was taken after tweaking gamma and colour profiles in Windows; I'd say it's as good as I can get it without doing the ReShade fix with the 'Deband' filter.
 
The S2417DG is a TN panel with 6-bit + FRC color. Frankly, image fidelity was never going to be a strong point with it, and 6-bit panels are fairly notorious for banding issues.

I'd go back to 8-bit RGB color and remove any OS/driver color/gamma profiles, then manually calibrate the display from scratch using any of a number of online guides (with a browser that doesn't screw with profiles itself...e.g. Firefox, not Chrome) as reference points. Once you have the contrast, black levels, and gamma references looking as close as you can get with nothing other than the monitor's OSD settings, then you can fine-tune further with software.

 
Is it possible that "game mode" is active, where the brightness is set to maximum to offer a better view in games? That could explain the problem.
 
I'd go back to 8-bit RGB color and remove any OS/driver color/gamma profiles, then manually calibrate the display from scratch using any of a number of online guides as reference points...

I'm afraid I've already tweaked the panel to the best of my abilities, using the kind of tests you mention. What I'm looking for now is how not to fall into the same trap as last time: buying a monitor that promises an 8-bit panel but is in reality 6-bit.

By 'going back to 8-bit RGB', do you mean the Colour range setting? I can choose between Full and Limited, and Limited looks like I didn't pay $400 for a monitor.

Perhaps I've painted myself into a corner by needing a 24-inch monitor with 1440p resolution and a refresh rate over 60?
 
By 'going back to 8-bit RGB', do you mean the Colour range setting?

Yes, RGB full. That's what most monitors and PC content are meant for. Though, YUV 444 limited can be a valid alternative if you need to use it.

Perhaps I've painted myself into a corner by needing a 24-inch monitor with 1440p resolution and a refresh rate over 60?

The issue is mostly that, when it was released (three years ago), TN panels were the only way to get a high refresh rate, and NVIDIA's G-SYNC was the hardware-only version that required a $100+ premium over comparable displays.

And yes, it's still hard to find 24" 1440p displays....almost all the IPS ones are 27".
 
The difference between 6-bit and 8-bit color resolution is quite large. With 8-bit RGB you get 256 different levels of R, G and B in each pixel, 16.7 million combinations, which way back was called "true color". Not that the white balance was "true", but because the human eye isn't able to distinguish banding between, say, level 223 and level 224. 6-bit only gives you 64 levels, which is clearly visible to most people. Edit: White balance is mostly a concern if you use the display for color-sensitive work like color grading. Otherwise the display can be pretty yellow or blue, but your brain is excellent at compensating for that.
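If anyone wants to see that arithmetic in action, here is a quick Python sketch (my own illustration, not something anyone in the thread posted) that quantises the same dark grey ramp to 8-bit and 6-bit per-channel precision and counts how many distinct steps survive; the handful of coarse steps left at 6 bits is exactly the banding being described on dark gradients.

```python
import numpy as np

# Quantise a dark grey ramp (bottom 10% of the brightness range) to 8-bit and
# 6-bit per-channel precision and count the distinct steps that remain.
ramp = np.linspace(0.0, 0.1, 1000)

steps_8bit = np.unique(np.round(ramp * 255) / 255)   # 256 levels per channel
steps_6bit = np.unique(np.round(ramp * 63) / 63)     # 64 levels per channel

print("distinct grey steps at 8-bit:", len(steps_8bit))   # ~27 narrow steps
print("distinct grey steps at 6-bit:", len(steps_6bit))   # ~7 wide, visible bands
```

FRC dithering tries to hide those wide steps by flickering between adjacent levels, which is why a 6-bit + FRC panel can still band noticeably on slow, dark gradients.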

I've had a 6-bit TN display that I switched for a dirt-cheap 8-bit IPS with a much lower frame rate, simply because I couldn't live with the banding. If you want to keep your display, focus on its refresh rate, or sell it and get an IPS display, with no banding, better color reproduction and a much wider viewing angle. On TN displays, especially the older ones, moving your head a few inches up or down could also cause horrible banding, and even with your eyes in the center of the screen, you could see a clear difference between the top and the bottom of the display.

I'm not a big fan of TN, as you might have guessed by now. Btw, not all IPS displays are equally good, but you can get most of them pretty well calibrated by removing all the "enhancements" the display producers normally put into the software of the display. This process depends on the manufacturer, but remove any fancy stuff enabled in any menu. The "range" setting typically crunches the black levels from the GPU, which can make banding worse, and any sharpening is a no-go.
 
Yes, RGB full. That's what most monitors and PC content are meant for. Though YUV 444 limited can be a valid alternative if you need to use it.

Just a minor detail: A YUV signal is always converted to RGB in the display, because the panel itself is RGB (the pixels). YUV is a leftover from back when we needed to add color to the monochrome TV signal.
 
Just a minor detail: A YUV signal is always converted to RGB in the display, because the panel itself is RGB (the pixels). YUV is a leftover from back when we needed to add color to the monochrome TV signal.

Some panels/scalers originally intended for broadcast/film content expect YCbCr input and some displays map colors differently when receiving a YCbCr 4:4:4 than they do with RGB limited, even if they should contain the same information.

In my anecdotal experience, when there is a visible difference between RGB limited and YCbCr/YUV 444, which is admittedly uncommon, the latter tends to look superior...so I usually recommend trying it first if full range RGB isn't viable, especially if the display is using HDMI.
 
In my anecdotal experience, when there is a visible difference between RGB limited and YCbCr/YUV 444, the latter tends to look superior...

Speaking of anecdotes:

I was working as a color grader for more than 25 years, starting out back in the CRT days and watching Sony's early LCD attempts going: "Is that a joke?". The thing that really taught me about cameras, monitors and color spaces was when I started doing astrophotography.

Nowadays a modern digital camera uses some sort of sensor, CCD or CMOS. Those are basically monochrome sensors counting the amount of photons hitting each pixel over the exposure time. In most cameras the sensor has tiny color filters in front of each pixel (the Bayer mask), and those filters are red, green and blue. Therefore the raw signal read from the chip contains RGB information. That information is either saved as RGB or converted to YUV, but in the latter case you cannot reproduce the color information completely when converting it back to RGB in the computer used for editing etc. YUV can make it easier to compress the information, because Y is luminance while U and V (combined with Y) contain the color information. The average human eye is not able to see the difference between 4:4:4 and 4:1:1, especially in 4K, but if you need to chroma key or color grade, then 4:1:1 is pure destruction of valuable information. You almost always need to color grade, because photographers neither understand color spaces nor do they seem to have time to white balance the camera.
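To make the 4:4:4 vs 4:1:1 point concrete, here is a toy Python sketch (my own simplification, not the actual sampling grid any camera or codec uses) showing how a 4:1:1-style scheme keeps only one chroma sample per group of four pixels and then stretches it back out:

```python
import numpy as np

# One row of chroma (Cb) values at full resolution...
cb = np.array([10, 40, 80, 120, 160, 200, 240, 20], dtype=float)

def subsample_411(chroma_row: np.ndarray) -> np.ndarray:
    """Keep one chroma sample per group of 4 pixels, then stretch it back out."""
    kept = chroma_row[::4]                        # drop 3 of every 4 samples
    return np.repeat(kept, 4)[: len(chroma_row)]  # nearest-neighbour upsample

print(subsample_411(cb))
# -> [ 10.  10.  10.  10. 160. 160. 160. 160.]
# Fine for casual viewing, but hopeless for a clean chroma key or heavy grading.
```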

Most video file formats use YUV exactly because it's more compressible without visual loss, but uncompressed 4:4:4 is really just saying that all three channels are at full resolution at a given sample rate. Therefore, the image information is not "better" than the original RGB image from the camera. Quite the opposite, because conversion from one 3D color space to another is always a mess, and doing it losslessly is impossible in most cases once you start looking at the math behind it (AFAIR Lab to RGB is possible).
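For the curious, here is a rough Python sketch of why even a 4:4:4 round trip isn't free at 8 bits: it converts full-range RGB to limited-range BT.709 YCbCr and back, rounding at each step. The coefficients are the standard BT.709 values, but the sampling test itself is just my own quick check, not something from the thread.

```python
import numpy as np

# BT.709 luma coefficients (standard values; this is a simplified illustration,
# not a production-grade colour pipeline).
KR, KG, KB = 0.2126, 0.7152, 0.0722

def rgb_to_ycbcr8(rgb):
    """Full-range 8-bit RGB -> limited-range 8-bit YCbCr (BT.709), rounded."""
    r, g, b = np.asarray(rgb, dtype=float) / 255.0
    y  = KR * r + KG * g + KB * b
    cb = (b - y) / (2.0 * (1.0 - KB))
    cr = (r - y) / (2.0 * (1.0 - KR))
    return np.round([16 + 219 * y, 128 + 224 * cb, 128 + 224 * cr]).astype(int)

def ycbcr8_to_rgb(ycc):
    """Limited-range 8-bit YCbCr (BT.709) -> full-range 8-bit RGB, rounded."""
    y  = (ycc[0] - 16) / 219.0
    cb = (ycc[1] - 128) / 224.0
    cr = (ycc[2] - 128) / 224.0
    r = y + 2.0 * (1.0 - KR) * cr
    b = y + 2.0 * (1.0 - KB) * cb
    g = (y - KR * r - KB * b) / KG
    return np.clip(np.round(np.array([r, g, b]) * 255.0), 0, 255).astype(int)

# Sample random 8-bit RGB triplets and count how many change after the round trip.
rng = np.random.default_rng(0)
samples = rng.integers(0, 256, size=(50_000, 3))
changed = sum(
    not np.array_equal(ycbcr8_to_rgb(rgb_to_ycbcr8(c)), c) for c in samples
)
print(f"{100.0 * changed / len(samples):.1f}% of sampled colours did not survive the round trip")
```

Because limited-range YCbCr squeezes 256 RGB codes into 219/224 quantisation steps, many distinct RGB triplets collapse onto the same YCbCr value and can't all come back unchanged.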

By far the best way to reproduce on a display (cinema projectors included) the colored light that went through the optics of the camera is to stay in RGB, but if it became too simple and too easy, RED, Sony and all the rest of the camera manufacturers wouldn't be able to sell those magic SSDs etc. at the price that they do. For broadcast HDTV, the Rec. 709 color space is so similar to sRGB that you can use any decent (calibrated) IPS monitor to get a good representation of the image data from the camera. sRGB is limited in chroma range (gamut), but so is the TV at the consumer's end, and honestly you rarely use colors that are saturated beyond sRGB. Also, nobody seems to calibrate monitors anymore, so it all doesn't really matter that much. The brain compensates for most of the issues anyway. :)
 
conversion from one 3D color space to another is always a mess

Which is why one often has to test what settings work best on a given display...there is often no telling what it's actually doing without seeing it for oneself.

I have a 250-dollar 55" 4K TV...you should see how it mangles color space conversions. Even with access to the hidden service menus, I can only get a usable image from my HTPC with YCbCr 444 limited, and if I have it calibrated so I can actually see everything, the gamma slope is about 1.5 when I'd like ~2.2. It still looks better than the out-of-the-box settings of TVs that cost ten times as much, but I have no idea what the OEM was thinking when they programmed the firmware for it...or if they even tried to account for the panel they ended up pairing with their scaler.

On a side note, I do try to avoid 4:2:2 or 4:2:0 on NVIDIA hardware, because the conversion has a performance hit that isn't seen in other modes. RGB limited and YCbCr 444 limited have no performance hit, but 4:2:0 costs about 5% of the frame rate on most NVIDIA parts up to at least Pascal (haven't tried it with Turing yet).
 
Which is why one often has to test what settings work best on a given display...

What I found useful was to have some sort of known signal from the signal source (GPU, TV tuner, whatever), and then calibrate the monitor using that known signal. So if I was color grading in Resolve or Final Cut, I took, say, a 50% grey (RGB), calculated the expected luminance (in candela) from the given color space, and then adjusted the gamma curves for R, G and B accordingly. Today, if I take two different IPS displays and put them side by side, out of the box using "neutral" settings, they perform damn well color-wise. Especially now that they are LED-backlit, meaning that they drift less. Also, I've been staring at 24 and 25 fps most of my life, so even just 60 fps is a luxury, and I find it difficult to tell much of a difference from 60 to 90 fps :D
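For anyone wanting to try that at home, here is a hedged sketch of the arithmetic behind a known-signal check; the simple power-law gamma and the peak-white level are my own example numbers, not the poster's actual workflow.

```python
# Expected luminance of a grey test patch on a display with a simple power-law gamma.
# Assumptions (not from the thread): gamma 2.2, peak white of 120 cd/m2,
# black level treated as zero for simplicity.
GAMMA = 2.2
PEAK_WHITE_CD_M2 = 120.0

def expected_luminance(signal_8bit: int) -> float:
    """Luminance (cd/m2) a calibrated display should show for an 8-bit grey level."""
    v = signal_8bit / 255.0              # normalised signal, 0.0 to 1.0
    return PEAK_WHITE_CD_M2 * (v ** GAMMA)

# A 50% grey patch (RGB 128,128,128) should measure roughly 26 cd/m2 here,
# i.e. about 22% of peak white -- far darker than the "half brightness" people
# often expect, which is why gamma errors show up so clearly in dark scenes.
print(f"{expected_luminance(128):.1f} cd/m2")
```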

I saw Linus try to "scientifically" prove that frame rates above 200 are an advantage for gaming, and I'm not a pro gamer, but I do think that most people wouldn't be able to notice if their frame rate dropped from 144 to 90. Not in the heat of the battle, even though they could use it as an excuse when they died. It's kind of like resolution. I have personally tried to resample 2K down to PAL (720x576) and then resample it back to 2K. Doing an A/B compare of the two in a cinema, nobody could see the difference, though some said that the low-res material looked less noisy. I dare say it did, since it contained 8 times less information. We're talking serious professionals, including technical supervisors, telecine operators, photographers, post producers etc.
 
Today if I take two different IPS displays and put them side by side, out of the box using "neutral" settings, they perform damn well color wise.

This hasn't been my experience, unless they are explicitly advertised as being…

I just returned a pair of fairly high-end LG displays and despite being exactly the same models and revisions, I could not even get them to look the same as each other, even if I discount the defects I returned them for.

I saw Linus try to "scientifically" prove that frame rates above 200 is an advantage for gaming, and I'm not a pro gamer, but I do think that most people wouldn't be able to notice if their frame rate dropped from 144 to 90. Not in the heat of the battle, even though they could use it as an excuse when they died.

Not being consciously aware of the extra information you're receiving and it not being beneficial are very different things.

Some of the more readily demonstrable tests are the frame rate and optical illusion tests here: https://www.testufo.com/

Anyway, we can see extremely short-duration flashes of light and extract meaningful motion data from a handful of frames delivered in a small fraction of a second. Being able to identify the silhouette of an aircraft that's visible for a hundredth of a second, or tell the trajectory of a grenade thrown past one's FoV, are common action-game scenarios where even several hundred frames per second could be beneficial.

The ultimate goal is 1000+ fps, because that's about when sample-and-hold displays stop showing readily identifiable improvements.

I have personally tried to resample 2K down to PAL (720x576) and then resample it back to 2K. Doing an A/B compare of the two in a cinema, nobody could see the difference...

This is heavily content dependent.
 
True. With a few comments.

There's always the "Monday" product, but remember I used to live with CRTs. They were only manufactured on Mondays, and they all left the manufacturer with different personalities.

I remember when video was analogue and had drop-outs, which were often just a single scan line of the image, visible for one 50th of a second. We used to look for those "drops" and use a paintbox to paint them away on things like TV commercials. I simply couldn't stand watching the news, because they used lousy, worn-down cameras with a lot of drops, and they didn't have "time" ($) to paint them away. When I asked friends and family how they could keep watching it, they were like: "What on Earth are you talking about?", and even when I described the problem, they still couldn't see it.

There was also the time where we tried adding a sign saying "Please send cash to ...", for a single frame (one 24th of a second), into a movie. We didn't get any money (of course, subliminal messaging is kind of a myth), but more importantly nobody noticed the annoying sign flashing regularly during the movie. I kid you not! It might be that a gamer has faster perception than the average moviegoer, I'll happily believe that, but it still puzzles me, and it says something about how individual our perception is. I don't know if you've noticed it, but our dear Yamiks likes putting single-frame images in his videos, and I've seen comments where long-time fans of his only noticed it after ages. I know first-person shooters are different from ED, but there must be some PvPers who are focused on frame rate?

The material I downsampled was very sharp, with fine details like thin hairs and leaves on trees. We are used to seeing blurred images, because photographers love depth of field, meaning that only a small part of many images is sharp. Often the optics very much "create the image". Also I used a few "tricks", like using a Lanczos algorithm which adds overshoot (sharpness), and adding a drop of fine noise, after resampling to 2K. It didn't change the actual amount of image information. Today people are getting more critical than they were in the days of VHS, where any focus puller could be sleeping without anyone being able to see it in their "Home Theater". The actual resolution of VHS was close to 320x200 pixels. Everyone could see the improvement when the DVD arrived.

A 4K image on a TV like yours is truly magnificent to look at. I envy you that experience :)
 
The Dell S2417DG is known to have big banding problems.
The simplest solution is to use ReShade and its debanding filter. Or, if you have an NVIDIA card, the latest drivers can apply ReShade filters (via GeForce Experience), whose UI is easier to use than ReShade's.
It won't be perfect, but users report a big improvement.
 
...sell it and get an IPS display, with no banding, better color reproduction and a much wider viewing angle.

Lots of helpful answers here, thank you all for taking your time!

What I'm taking from this is that I need to adjust my specifications a bit and go up to a 27-inch IPS panel?
Part of the problem seems to have been that the advertised 8-bit colour depth was in reality 6-bit + FRC. How can I dodge that kind of marketing and find a true 8-bit panel? Or should I just try to aim higher?

Looking at my local markets in Sweden, these monitors seem to fit the description. Is there anything in here I should stay away from, or do they all seem good? (Apart from that LG monitor, which seems to list an 18-bit colour depth.)

[Attachment 149592: comparison list of candidate monitors]
 
Part of the problem seems to have been that the advertised 8-bit colour depth was in reality 6-bit + FRC. How can I dodge that kind of marketing and find a true 8-bit panel?
Seek reviews of the model you're interested in. Any reviewer worth anything will have the panel covered in detail.
 
What I'm taking from this is that I need to adjust my specifications a bit, and go up to a 27 inch IPS panel?

IPS or VA.

VA tend to have slower pixel response and thus more ghosting, but for something like ED, the extra contrast is worth it, IMO.

No reason to get a TN unless you need 240Hz on a budget...and most of those will not be focused toward image quality.

Part of the problem seems to have been that the advertised 8-bit colour depth was in reality 6-bit + FRC. How can I dodge that kind of marketing and find a true 8-bit panel? Or should I just try to aim higher?

Detailed specifications/datasheets and any competent review should reveal this stuff. In general, 6-bit + FRC is only used in total trash displays, or those that are targeting the highest refresh rates in their class.

Anyway, make sure the display and its reviews are relatively recent. An old model that got glowing reviews back in the day could be quite poor by today's standards.

Looking at my local markets in Sweden these monitors seem to fit the description, is there anything in here I should stay away from or do they all seem good? (Apart from that LG monitor which seems to have 18 bits colour depth)

I wouldn't bother with any display that has a hardware G-SYNC module, as there is almost certainly a near identical model that costs much less, and G-SYNC via VESA Adaptive Sync (FreeSync) has closed the gap in performance considerably.

I wouldn't bother with any shoddy HDR implementations...which is pretty much anything not rated HDR600 or higher.

Don't take any statements about bits per channel for granted if they come from advertising materials or an e-tailer. A good review or a technical datasheet/whitepaper is where you want to get that info.
 