Hi Old Duck - that's probably more to do with the object hierarchy further down the chain, in 3D space. The main benefit of post-processing shaders is that they're fairly cheap in computational terms, since you're only working with and on pixel data. There's no way to cleanly discern different objects in the scene as you could with the 3D data. It's comparable to committing a 3D scene to film and then applying filter effects in editing software. You can pass variables from the game to modulate the filters in real time (distance from a star, etc.), so it's quite powerful, but the filter itself is only ever applied to pixel data. If they could exempt parts of the frame with whatever method they're using, I'm sure they would have addressed some of the dimming issues on the HUD and so on. I've not used GLSL in a game development context, so their implementation might be more sophisticated than I'm suggesting, but that would rather defeat the object of using these shaders, which is fairly dynamic and dramatic effects at modest computational cost. It certainly looks like a plain old set of 2D filters applied at the end of the pipeline to me.

I know just enough to be dangerous, but what if they are rendering multiple layers, thus allowing them to separate the skybox from the other 3D assets? Sometimes when I quit the game while in a station, the station disappears, but the skybox (which was totally blocked, since I was in a hangar) becomes visible - and it's rotating (as in, I'm rotating), as if it were rendered completely separately from the station itself. Isn't this a "thing" in advanced 3D graphics?
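It is a thing - multi-pass rendering into separate targets, composited at the end, is standard practice. Here's a toy sketch of the idea (hypothetical, not Frontier's actual pipeline - the layer names and one-pixel-per-element "frame" are just for illustration): the skybox and station are drawn as separate passes, and if the station pass is dropped (say, during teardown on quit), the skybox underneath shows through.

```python
# Hypothetical sketch of layered rendering: each pass is a row of pixels,
# and later passes overdraw earlier ones wherever they are opaque (None =
# transparent). Not the game's real renderer - just the compositing idea.

SKY = "sky"
STATION = "station"

def composite(passes):
    """Composite passes back-to-front; opaque fragments from later passes win."""
    frame = list(passes[0])
    for layer in passes[1:]:
        for i, px in enumerate(layer):
            if px is not None:  # opaque fragment overdraws what's beneath
                frame[i] = px
    return frame

skybox_pass = [SKY, SKY, SKY, SKY]
station_pass = [None, STATION, STATION, None]  # hangar walls block the middle

# Station pass present: the walls hide the skybox in the middle.
print(composite([skybox_pass, station_pass]))  # ['sky', 'station', 'station', 'sky']

# Station pass dropped (as on quitting): the skybox is suddenly visible.
print(composite([skybox_pass]))  # ['sky', 'sky', 'sky', 'sky']
```

If the layers really are kept separate like this, a post-process filter *could* in principle be applied to the skybox pass alone before compositing - which is why your quit-screen observation is interesting.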
However they fix it, I welcome the fix.
P.S. If you examine the images above, you can see that every on-screen element shows signs of being shifted towards a given part of the colour spectrum. I would much prefer a more natural-looking Milky Way, but for me the unnatural effect it has on the HUD is the more disconcerting issue.
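That's exactly what you'd expect from a full-screen colour grade: once the frame is flattened to pixels, the HUD and the galaxy go through the same filter. A toy sketch of the per-pixel maths (hypothetical - the tint and the `strength` parameter standing in for a game-supplied uniform, e.g. derived from distance to the nearest star, are my own illustration):

```python
# Hypothetical full-screen colour grade: every pixel is pushed toward a tint.
# 'strength' plays the role of a uniform the game could update in real time.

def grade(pixel, tint=(1.0, 0.6, 0.4), strength=0.5):
    """Blend each RGB channel toward its tinted value by 'strength'."""
    lerp = lambda c, t: c * (1.0 - strength) + c * t * strength
    return tuple(lerp(c, t) for c, t in zip(pixel, tint))

hud_orange = (1.0, 0.5, 0.1)
star_white = (1.0, 1.0, 1.0)

# Both elements pass through the same grade - the shader has no way of
# knowing one of these pixels belongs to the HUD.
print(grade(hud_orange))
print(grade(star_white))
```

With `strength=0.0` the frame passes through untouched, which is presumably why the effect can be faded in and out dynamically - but there's no per-object exemption once it's applied.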