@da_wae Well, you're correct that it's rarely wise to claim 100% certainty about these things, but I'm currently developing a piece of video/image processing software based on GLSL shaders. I only ever work in 2D, but the shaders I create include color correction filters. I'm around 99% certain that what we're seeing are post-rasterization fragment/pixel shaders applied after the 3D data has been converted to a 2D image, i.e. right at the end of the pipeline. These can be dynamic, and you can feed them variables from the game in real time - for example, reducing the overall magnitude of the filter as the player moves away from the star, which appears to be what's happening. You can do all manner of sophisticated things, such as isolating the shadows, mids and highlights and applying different values to each, but you're only ever acting on pixels. The benefit is that you can alter the image in pretty dramatic ways and it's computationally fairly cheap. The drawback is that I can't see any way of completely and cleanly isolating elements from earlier stages in the pipeline, especially since we're dealing with semi-transparent elements.
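For anyone curious, here's a minimal sketch of the kind of post-process fragment shader I'm describing. To be clear, this is not FD's actual code - the uniform names, tint values and distance falloff are all made up for illustration:

```glsl
// Illustrative post-process fragment shader (GLSL ES style).
// Runs once per pixel on the finished 2D frame, after rasterization.
uniform sampler2D uScene;     // the fully rendered frame as a texture
uniform float uStarDistance;  // hypothetical value fed in from the game

varying vec2 vTexCoord;

void main() {
    vec3 color = texture2D(uScene, vTexCoord).rgb;

    // Perceived brightness of this pixel (Rec. 601 luma weights)
    float luma = dot(color, vec3(0.299, 0.587, 0.114));

    // Soft masks isolating shadows, mids and highlights
    float shadows    = 1.0 - smoothstep(0.0, 0.4, luma);
    float highlights = smoothstep(0.6, 1.0, luma);
    float mids       = clamp(1.0 - shadows - highlights, 0.0, 1.0);

    // Apply a different tint to each tonal band (values invented)
    vec3 graded = color * (shadows    * vec3(0.9, 0.95, 1.1)
                         + mids       * vec3(1.0, 1.0,  1.0)
                         + highlights * vec3(1.1, 1.05, 0.9));

    // Fade the whole effect out as the player moves away from the star
    float strength = clamp(1.0 - uStarDistance / 1000.0, 0.0, 1.0);
    gl_FragColor = vec4(mix(color, graded, strength), 1.0);
}
```

Note that the shader has no idea what any pixel *is* - HUD, skybox, ship, whatever - it only sees final colors, which is exactly why elements from earlier pipeline stages can't be cleanly excluded.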
Assuming that this is the case, I can understand FD's rationale for introducing it, but they must also be aware of its shortcomings. Clearly they feel that the unnatural-looking changes to the HUD and skybox are worth it. I can't see any "fix" other than reducing the effect significantly (which defeats the purpose) or disabling the filters entirely.