Combating colour banding on 1440p monitors

18 bit "color depth" is an overkill :)

Each pixel on a panel can be seen as three small dots of light: red, green and blue.

[Attachment 149612: close-up photo of a pixel's red, green and blue dots]


A "white" pixel is a combination of all three dots sending out "equal" amounts of light. Any color is a combination of the three dots emitting light at different levels. A "blue" pixel is a pixel where only the blue dot emits light, whereas the red and the green dots are turned off. If you look at the image above, the blue dots are not completely blue. They seem kind of purple. Therefore the bluest blue such a display can show isn't pure blue. Normally when speaking about color and displays we use the expression called Gamut. There is a limit to the amount of colors the human eye is able to see. You can picture it graphically like this:

[Attachment 149618: chromaticity diagram of all visible colors with RGB gamut triangles]


The rainbow-like shape is a representation of all the colors. It's not exact, since we're looking at it on a display that isn't able to show all of them. Notice the triangle(s): each corner of a triangle represents one of the RGB dots of a pixel on a display. Look at the top corner of the triangle. It's almost completely green, but looking at the colors behind the triangle you can see that there are colors that are even more green. The corners of the triangle show the purest red, green and blue the panel is able to create when the other two colors are turned off. The colors inside the triangle are the colors the display can reproduce, and those colors are called the gamut of the display.

It might seem obvious that we should all buy displays with as large a gamut as possible, but that is not the case. First of all, a "wide gamut" display can only show the colors that are sent to it from the computer, TV tuner etc., and those are normally quite limited, well within the gamut (triangle) of a normal sRGB panel. Secondly, the colors outside the sRGB gamut look oversaturated to most people. Finally, wide gamut displays are expensive, so go for sRGB unless you have very special needs.

Each colored dot in a pixel can be turned off or on, but it can also be turned up to, say, 50% of what it can maximally emit. If all three dots are set to 50% the pixel will look grey, at 100% it will look white, and at 0% it will be black. This is where the bit color depth comes into play. Disregarding the colors for a second: if the panel has a bit depth of one, it can only show black or white. If it has a bit depth of two, it can show black, white and two levels of grey: 4 levels in all, because 2^2 = 4. A 6 bit panel can show 2^6 = 64 levels. If you look at a greyscale gradient at different bit depths it becomes clear why low bit depths create banding:

[Attachment 149619: greyscale gradients at different bit depths]

Notice that on an 8 bit panel, you cannot see any banding, so any bit depth over 8 is money wasted.
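If you want to reproduce that test image yourself, here's a minimal sketch (using Python with NumPy and Pillow, my tool choice rather than anything from this thread) that renders the same smooth ramp quantized to a few different bit depths:

```python
import numpy as np
from PIL import Image

WIDTH, ROW_HEIGHT = 1024, 80

# A smooth 0.0..1.0 ramp, one value per column.
ramp = np.linspace(0.0, 1.0, WIDTH)

rows = []
for bits in (2, 4, 6, 8):
    levels = 2 ** bits                      # e.g. 2^6 = 64 grey levels
    # Snap each value to the nearest of the `levels` available steps.
    quantized = np.round(ramp * (levels - 1)) / (levels - 1)
    rows.append(np.tile((quantized * 255).astype(np.uint8), (ROW_HEIGHT, 1)))

# Stack the rows top to bottom: banding is obvious at 2 and 4 bits,
# subtle at 6, and (for most eyes) gone at 8.
Image.fromarray(np.vstack(rows), mode="L").save("banding_test.png")
```

Open banding_test.png full screen and judge for yourself whether the 8 bit row still shows steps.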

So the best bang for the buck is an 8 bit IPS panel with a frame rate you can live with. If you have more money you can increase the size of the panel, but you can also just move closer to the display, because it all comes down to how much of your field of view (FOV) the display covers. With a TV, which you typically watch together with other people, you can't, but a 24" display seen from 70 cm covers more of your FOV than a 30" display seen from 2 meters (exaggerated to illustrate the point). A large display looks impressive, but you need to sit further away from it, or you'll have to turn your head a lot.

[Attachment 149627: illustration of display size vs. viewing distance]
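To put numbers on the FOV point (plain Python; the 16:9 aspect ratio and the two viewing setups are the ones from the paragraph above):

```python
import math

def horizontal_fov(diagonal_inch: float, distance_cm: float) -> float:
    """Horizontal field of view (degrees) covered by a flat 16:9 display."""
    width_cm = diagonal_inch * 2.54 * 16 / math.hypot(16, 9)
    return math.degrees(2 * math.atan(width_cm / (2 * distance_cm)))

print(f'24" at 70 cm:  {horizontal_fov(24, 70):.1f} degrees')   # ~41.6
print(f'30" at 200 cm: {horizontal_fov(30, 200):.1f} degrees')  # ~18.9
```

The 24" panel up close covers more than twice the horizontal FOV of the 30" panel across the room.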


That was a lot of info, but if you follow these guidelines, you'll get what you want and need. Be careful with the manufacturers' specs. They don't exactly lie, but not telling the whole truth is kind of the same. Use your like-minded friends on the internet, and once you find a display that you like, ask them :)
 
Not wishing to derail the topic but - since the OP is looking for 144Hz refresh it got me wondering - is it really that important?

Unless you are into e-sport FPS then surely it isn't at all worth it?

I don't do FPS and I happily run E D, X4 and various flight sims whilst leaving v-sync at 60Hz on my IPS 2560x1440 27" monitor (from a GTX1070 8GB) with no ghosting and no apparent issues from just running at 60Hz. So really (and I am looking to upgrade monitor at some point) am I missing a trick, is there some issue I am unaware of (and if I am unaware of it is it then actually at all important)?
 
If you go back a few posts we "discussed" that issue. Back when movies were analog, the frame rate of a movie was 24 fps. That worked well enough to give the impression of fluid motion, but if the photographer panned the camera over the horizon, the image you watched on the big screen in the cinema seemed to be what it was: a series of still images, each shown for 1/24th of a second. When TV was invented the problem was "solved" by increasing the frame rate to 50 or 60 fps, depending on where you lived (Europe = 50 fps, US = 60 fps, matching the 50 or 60 Hz mains frequency).

For many years, movies continued to be 24 fps. Because perception is very subjective, people got used to the low frame rate in the cinema, and even though video had more than double the fps, we used to halve the frame rate of video to "make it look like film". Why? Because people had a subconscious feeling that 24 fps looked good, since the actual image they were watching in the cinema had proper lighting, was sharper and generally of a better quality than the average TV production. Lowering the frame rate from 50 to 25 gave people a feeling of watching a blockbuster made on a big production budget, whereas 50 fps looked like a lousy x-rated "pron" movie. When Peter Jackson tried upping the frame rate while doing The Hobbit, people went crazy and some (hysterics) even claimed they became dizzy and barfed.

Typically, most people watching a movie wouldn't notice if the frame rate was anything above 60 fps, but if you're a hardcore pro gamer every little detail counts. I personally think there are more important factors influencing your performance than frame rates above 60-90 fps, but I'm not a pro gamer. Whether it is worth the money is always up for debate. If I personally were to buy a new display I would go for 90 fps, and with my personal needs (skills) I would consider that welly OP. :)
 

Pretty much this. I usually aim for 85 to 100, which seems to be my comfort range. 144 Hz is certainly overkill for me. But if I want a screen that will perform over 60, most will do 120 or 144.
 

Hm, just since I didn't see it mentioned (apologies if I overlooked it), please do check the video driver settings for the display!

This setting can easily be the cause of color banding; I remember having the issue myself after virtually every driver update for quite a while, until it was fixed.
(Example article on it: https://www.howtogeek.com/285277/how-to-avoid-washed-out-colors-when-using-hdmi-on-your-pc/ )

If anything, it's definitely worth a look (for any display, if color banding is observed).

[Attachment 149650]
 
18 bit "color depth" is an overkill :)

It's also highly misleading in reference to the LG 27G850, which is a 10-bit panel with HDR.

Notice that on an 8 bit panel, you cannot see any banding, so any bit depth over 8 is money wasted.

If those 8 bits were stretched across the entire spectrum of light intensities that could be delivered without damaging your eyes, they would not be sufficient. That's what the extra bits are supposed to be applied to: accounting for as much as possible of the range of contrast the human visual system is capable of perceiving.

Of course, to make good use of this, you need suitable content, which is largely lacking at this time.
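A rough way to see the problem (a Python sketch; the ~1% just-noticeable luminance difference and the plain gamma 2.2 transfer are textbook simplifications I'm assuming, not figures from this thread) is to compare the relative luminance jump between adjacent code values at 8 and 10 bits:

```python
# Relative luminance jump between adjacent code values under a simple
# gamma 2.2 transfer. Jumps well above ~1% are where banding can show up.
GAMMA = 2.2

for bits in (8, 10):
    max_code = 2 ** bits - 1
    for grey in (8, 32, 128):              # a dark, a dim and a mid 8-bit grey
        code = grey * max_code // 255      # the same grey level on both scales
        lum_lo = (code / max_code) ** GAMMA
        lum_hi = ((code + 1) / max_code) ** GAMMA
        print(f"{bits:2d}-bit code {code:4d}: step = {(lum_hi / lum_lo - 1) * 100:5.2f}%")

# Near black the 8-bit step is ~30% versus ~7% at 10 bits; at mid-grey
# it is ~1.7% versus ~0.4%. The extra bits go where the eye needs them.
```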

Not wishing to derail the topic but - since the OP is looking for 144Hz refresh it got me wondering - is it really that important?

Unless you are into e-sport FPS then surely it isn't at all worth it?

I don't do FPS and I happily run E D, X4 and various flight sims whilst leaving v-sync at 60Hz on my IPS 2560x1440 27" monitor (from a GTX1070 8GB) with no ghosting and no apparent issues from just running at 60Hz. So really (and I am looking to upgrade monitor at some point) am I missing a trick, is there some issue I am unaware of (and if I am unaware of it is it then actually at all important)?

Refresh rate isn't a measure of ghosting, though to convincingly support a given refresh rate, a certain pixel response time is required.

Anyway, all other things being equal, higher refresh and frame rates will still be perceptibly smoother (though with diminishing returns), deliver more information, and result in lower latency than lower ones. I'd always recommend getting the highest refresh rate one can without having to make significant sacrifices elsewhere.

Outside of 1440p ultrawides or 4K 16:9 displays, 144 Hz is pretty standard at this point. Targeting a lower refresh rate for an IPS or VA panel will neither save much money nor allow for many improvements elsewhere, unless you are going for certain professional displays.
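For the latency part, the numbers are just frame-time arithmetic:

```python
# Time between frames at common refresh rates; the gains shrink as the
# refresh rate climbs, which is the "diminishing returns" part.
for hz in (60, 90, 120, 144, 240):
    print(f"{hz:3d} Hz -> {1000 / hz:5.2f} ms per frame")

# 60 -> 16.67 ms and 144 -> 6.94 ms, an ~10 ms best-case difference,
# while going from 144 to 240 only buys another ~2.8 ms.
```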
 
If those 8 bits were stretched across the entire spectrum of light intensities that could be delivered without damaging your eyes, they would not be sufficient. That's what the extra bits are supposed to be applied to: accounting for as much as possible of the range of contrast the human visual system is capable of perceiving.
I haven't tried it on my PC, but I was a Mac user for 10 years, and on a Mac, using a probe to calibrate the 30" Cinema Display I had, you ended up with a color profile that compensated the signal in software in the OS: basically three gamma curves. I know this can be done on a PC as well. Then you send a calibrated signal to the display.

If the material was originally 8 bit, then no matter how you do the gamma correction, you will lose some of the 256 levels: each RGB value of each pixel is multiplied by a factor given by the gamma curve and then rounded, so some levels collapse into the same output value. A perfectly smooth greyscale gradient is the best way to test for banding, and if you can't see any, then you're good to go :)
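You can count exactly how many of the 256 levels survive such a correction. A minimal sketch (Python with NumPy, assumed on my part; the 1.1 gamma factor is just an illustrative value):

```python
import numpy as np

codes = np.arange(256)

# Apply a software gamma correction to 8-bit values and round back to
# 8 bits, the way an OS-level color profile without a higher-precision
# lookup table would.
corrected = np.round(255 * (codes / 255) ** 1.1).astype(int)

print(f"{len(np.unique(corrected))} of 256 levels remain")  # fewer than 256
```

Whatever correction you apply, rounding back to 8 bits merges some neighbouring levels, which is exactly what the gradient test reveals.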

A weird result you get by calibrating a display is that the colors visually separate more. I've heard several people say that after we calibrated their displays, but that is probably because we are mostly used to the white balance of daylight, which is very close to the 6500 K normally used for calibration. A 6500 K white balance simulates the color of a grey sky on a cloudy day.

Many TVs used to default to something close to 9300 K, because in a TV shop all the TVs look more or less the same. They are all showing the same signal, typically something with snow, and snow shown at 6500 K looks slightly yellowish, while turning the white balance up to 9300 K makes it look blueish. People like blueish snow more than yellowish snow, so they tend to prefer a too-high white balance, but whatever sells the TV :)
 
[Attachment 149619: greyscale gradients at different bit depths]

Notice that on an 8 bit panel, you cannot see any banding, so any bit depth over 8 is money wasted.

I took another look at this and I can distinguish every single individual band on that 8-bit spectrum, except for the last three or four, on my current display.

Which is, of course, what I was aiming for when I calibrated it: each step of brightness being distinct, to maximize the detail that can be seen and the information that can be obtained (I sacrificed a tiny bit at peak brightness to compensate for a small amount of black crush that tends to be intrinsic to VA displays). I couldn't quite fit the whole range in on this decidedly budget-oriented Pixio PX329, but even a 2000:1 contrast ratio gets me to the limits of 8 bits of luminance.
 
Do you see a clear difference between the banding in the 8 bit and the 7 bit rows?

Can you see how many different boxes there are in this image?

[Attachment 149728: test image of grey boxes]


You might have excellent eyesight, but the average human is only able to distinguish ~10 million different RGB combinations, and some of those are even outside the sRGB gamut (like pure monochromatic colors), meaning that we can distinguish fewer than 10 million inside the sRGB gamut. Therefore the 16.7 (actually 16.8) million combinations available on an 8 bit panel should cover everything we're able to distinguish.

Some VA panels have been said to show banding, but in those cases I think the problem is in the signal fed to the panel. Color management of the whole system is "more complicated than quantum mechanics", especially because some equipment manufacturers don't reveal all the details, and there are many steps in the chain.

Edit: I tried to measure the cheap HP 24es I use as a secondary display, and out of the box it measures 6391 K when it should be 6500 K. I used to be able to calibrate a CRT in dark surroundings to within +/- 200 K without a probe, just using my eyes, but those days are long gone, and I've never been able to detect an error of ~100 K.

My other (main) monitor is a 27" BenQ GW2765HT, which is as old as Methuselah, and that was at 6103 K. Not perfect, but still close enough that I haven't bothered calibrating it :)
 
Do you see a clear difference between the banding in the 8 bit and the 7 bit rows?

Yes.

Can you see how many different boxes there are in this image?

[Attachment 149728: test image of grey boxes]

I can only clearly distinguish two squares there with the slightly darker one on the left half.

You might have excellent eyesight, but the average human is only able to distinguish ~10 million different RGB combinations, and some of those are even outside the sRGB gamut (like pure monochromatic colors), meaning that we can distinguish fewer than 10 million inside the sRGB gamut. Therefore the 16.7 (actually 16.8) million combinations available on an 8 bit panel should cover everything we're able to distinguish.

Can't most human trichromats distinguish about ten million different colors, not including luminance levels? 8-bit RGB gives us 16.7 million possible shades, including all possible brightness levels.

Even a monochromat could still see more than 256 levels of brightness on a display with decent contrast.
 
There are two boxes. You're spot on! :)

The 16.8 million colors derive from the fact that a 256 level gradient seems to have no banding for most people: 256^3 = 16.8e+06. However, I think you're on to something which might prove that wrong. See, when this was stated back in 1975, a standard TV had a maximum luminance of ~100 cd/m², and the black levels weren't as low as today, where the LEDs behind the panel are switched off to make blacks darker, and some HDR TVs and computer displays deliver a much higher luminance of ~500 cd/m². With the wider dynamic range you get "wider" steps in the gradient you see on the display.
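Following that thought with some back-of-the-envelope math (assuming the classic ~1% just-noticeable difference in luminance, which is my assumption rather than anything established in this thread), the number of distinguishable steps from black to white grows with the contrast ratio:

```python
import math

# With a ~1% just-noticeable difference, the number of distinguishable
# luminance steps between black and white is log(contrast) / log(1.01).
for contrast in (1000, 2000, 100_000):
    steps = math.log(contrast) / math.log(1.01)
    print(f"{contrast:>6}:1 -> ~{steps:.0f} steps (~{math.log2(steps):.1f} bits)")

# Even a 100000:1 HDR range only calls for ~10 bits under this crude
# model: one or two bits more than 8, nowhere near 18.
```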

Any monochromatic color at a high enough luminance is perceived as "white" by the human eye. Likewise, when the luminance drops below a certain level it is perceived as black:

[Attachment 149843: greyscale gradient from black to white]


Looking at the greyscale gradient it is clear that the differences near the white and black ends are more difficult to distinguish than those in the grey midtones. If the dynamic range increases and the steps get larger in the midtones, I might have to revise my old understanding of perception and say that the human eye is able to distinguish more than 256 levels. Maybe one bit more, or maybe even two, but not 18 bits. :unsure:

In practical terms you very rarely see images containing smooth gradients on a display; real content looks more like dithering, and that helps a lot in hiding banding.
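That is also a trick you can apply deliberately: add a little noise before quantizing and the hard bands dissolve into grain the eye averages out. A minimal sketch (Python with NumPy and Pillow assumed), quantizing the same ramp to 5 bits with and without dither:

```python
import numpy as np
from PIL import Image

rng = np.random.default_rng(0)
ramp = np.tile(np.linspace(0.0, 1.0, 1024), (120, 1))  # smooth ramp, 120 px tall

levels = 2 ** 5  # a deliberately low bit depth so the effect is obvious

# Plain quantization: clearly visible bands.
banded = np.round(ramp * (levels - 1)) / (levels - 1)

# The same quantization with up to half a step of noise added first:
# the bands turn into fine grain.
noise = rng.uniform(-0.5, 0.5, ramp.shape) / (levels - 1)
dithered = np.round(np.clip(ramp + noise, 0.0, 1.0) * (levels - 1)) / (levels - 1)

out = np.vstack([banded, dithered])
Image.fromarray((out * 255).astype(np.uint8), mode="L").save("dither_test.png")
```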
 
A bit of an update. I got myself a new monitor, this time an IPS panel, and discovered the drawback of these types of displays.

Backlight bleed!


[Attachment 150812: photo of backlight bleed on the new IPS panel]

[Attachment 150813: second photo of the backlight bleed]


The added contrast is a nice change, but if I could get rid of that big blob of brightness in the lower right corner I'd be a lot happier. The quest continues!
 
Among remotely affordable display types, I prefer VA panels. IPS glow annoys me, especially in darker games.

Anyway, it's hard to tell from a photo, but that level of backlight bleed may well be out of spec and could be grounds for an exchange. I'd check out some reviews and see how common it is; if it seems like you have a good shot at getting a better sample, it may be worth exchanging it.
 

I certainly hope it's out of spec; I paid a bit over £450 for it!
 

You might be surprised how sloppy some specification tolerances are, which is why it's always good to know what to expect before sending something in for an exchange, lest you get an even worse sample back.
 