Update 6: your thoughts?

I'm getting much better than that with a 2080ti. Where is a good place to test?
The biggest performance-killing areas are the concourses in orbital stations and planetary ports. Combat Zones and planetary settlements are also big offenders. Basically, any new foot-based area that was added with Odyssey is where the performance is tanking. So try those kinds of places.
 
The generation of the LODs itself introduces a stage of aliasing, as you sample from your various heightmap sources, bash them together, and try to match the resolution of the produced LOD to how many pixels it will be drawn across on-screen...
I think this is the key point: LOD should always take the intended output resolution into account, especially when the detail is pushed to a limit beyond which everything becomes a pixel mess.
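To make that concrete, here is a minimal Python sketch of the kind of screen-space-error test an engine might use to pick a terrain LOD; the error value, LOD count, and doubling of error per level are my own assumptions for illustration, not anything taken from the actual engine.

```python
import math

def pick_lod(base_error_m, distance_m, screen_height_px, vfov_deg,
             max_error_px=2.0, num_lods=8):
    """Pick the coarsest terrain LOD whose geometric error still projects
    to at most `max_error_px` pixels at this distance and resolution."""
    # How many pixels one metre of world-space error covers at this distance.
    px_per_metre = screen_height_px / (2.0 * distance_m
                                       * math.tan(math.radians(vfov_deg) / 2.0))
    for lod in range(num_lods - 1, -1, -1):          # coarsest first
        error_px = base_error_m * (2 ** lod) * px_per_metre
        if error_px <= max_error_px:
            return lod                               # coarsest LOD that still looks fine
    return 0                                         # nothing coarse enough: use finest

# The same patch at the same distance needs a finer LOD at 4K than at 1080p:
print(pick_lod(0.05, 500.0, 1080, 60.0))   # -> 4
print(pick_lod(0.05, 500.0, 2160, 60.0))   # -> 3
```

The point of the toy example is only that the same error budget in pixels lands on a different LOD depending on the output resolution.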

If LOD biasing is an insurmountable problem...
Update 5 was living proof that we can get away with a very good compromise.
But while it worked well at 1080p, it produced somewhat inadequate results at higher resolutions.
I think this is the reason FDev changed the LOD parameters with Update 6...
...and the game became gorgeous at 4K (if what I read on the forums is to be believed), but quite crappy at 1080p.

I think the best solution would be to cook different LOD profiles, each one optimized for a different output resolution.
Then, when the player changes resolution, the game should switch to the matching profile automatically.

Of course, everything should be handled 'behind the curtain', that is, without exposing any new settings to the end user.
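Something along these lines is all I have in mind; the profile keys and numbers below are purely hypothetical, just to show how trivial the switch could be.

```python
# Hypothetical LOD profiles keyed by vertical output resolution; the values
# are illustrative guesses, not anything the game actually ships.
LOD_PROFILES = {
    1080: {"terrain_lod_bias": 0.0,  "draw_distance_scale": 1.0},
    1440: {"terrain_lod_bias": -0.3, "draw_distance_scale": 1.15},
    2160: {"terrain_lod_bias": -0.6, "draw_distance_scale": 1.3},
}

def profile_for(height_px: int) -> dict:
    """Silently pick the closest cooked profile for the current resolution."""
    nearest = min(LOD_PROFILES, key=lambda h: abs(h - height_px))
    return LOD_PROFILES[nearest]

print(profile_for(1200))   # -> the 1080p profile, no user-facing setting involved
```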
 
I think the best solution would be to cook different LOD profiles, each one optimized for a different output resolution.
One would imagine there'd have to be some arithmetic component to it, which would dynamically adjust to fit optimally based on output resolution, field of view, and distance.
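For what it's worth, that arithmetic could be as simple as the Python sketch below, which turns output resolution, vertical FOV, and distance into a continuous LOD level; the texel size and the formula itself are assumptions of mine, not how the game actually computes it.

```python
import math

def continuous_lod(screen_height_px, vfov_deg, distance_m, texel_size_m):
    """Continuous LOD level: log2 of how many source texels land on one
    screen pixel at this distance.  Higher resolution or a narrower FOV
    pushes it lower (finer); greater distance pushes it higher (coarser)."""
    px_per_metre = screen_height_px / (2.0 * distance_m
                                       * math.tan(math.radians(vfov_deg) / 2.0))
    texels_per_pixel = 1.0 / (texel_size_m * px_per_metre)
    return math.log2(max(texels_per_pixel, 1e-6))

# Same terrain texel, same distance: 4K wants roughly one LOD level finer than 1080p.
print(round(continuous_lod(1080, 60.0, 500.0, 0.25), 2))   # ~1.10
print(round(continuous_lod(2160, 60.0, 500.0, 0.25), 2))   # ~0.10
```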
 
I sure hope that Update 7 does better on performance, because the last six updates haven't done anything for me. On my laptop, I see about 30 fps in open space where I see 50-60 in Horizons. The worst is when I enter a station: then I drop to 10-12 fps, while in Horizons I was getting 20-25. And don't even mention disembarking to go into a station! It's even lower than 10; I'm just ashamed to mention it. Of course, all my settings are on low, I just can't go any lower. I understand they changed the planet tech but, honestly, I haven't seen much difference on planets from what I was seeing in Horizons. I tried to keep both versions installed, but switching to Horizons from Odyssey would just screw up my bindings, so I ditched Horizons.

Anyway, I'm still hoping that Frontier will be able to do much better than what they've done so far...
 
One would imagine there'd have to be some arithmetic component to it, which would dynamically adjust to fit optimally based on output resolution, field of view, and distance.
Out of curiosity, have you tried to force LOD bias through driver-level tools like NVidia Profile Inspector?
Do you think it would do any good?
 
Out of curiosity, have you tried to force LOD bias through driver-level tools like NVidia Profile Inspector?
Nope - didn't even know it could be manipulated at that level. (I guess it is possible that I, despite efforts not to, write things with a tone of certainty that projects a level of know-how I am nowhere near possessing, in which case I'll have to work on improving my disclaimer-to-guff ratio. :p)

So I suppose delaying the point at which a highly detailed terrain LOD begins to draw, by some relative distance, could indeed reduce its initial aliasing, at the obvious cost of delayed graphical-fidelity gratification. That could be an interesting experiment: see whether one can find a comfortable balance, and whether it differs between resolutions/supersampling levels and fields of view. :7
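A rough sketch of what "delaying by some relative distance" could look like, with entirely made-up switch distances and per-resolution delay factors:

```python
# Illustrative only: normal switch-in distances (m) for progressively finer LODs,
# plus a guessed per-resolution "delay" factor that holds detail back.
NORMAL_SWITCH_M = {3: 800.0, 2: 400.0, 1: 200.0, 0: 100.0}   # 0 = finest LOD
DELAY_FACTOR = {1080: 0.75, 2160: 0.9}                       # 1080p aliases more, delay more

def switch_in_distance(lod: int, height_px: int) -> float:
    """Distance at which LOD `lod` is allowed to start drawing: closer than
    normal, so its first frames cover more pixels per texel and alias less."""
    return NORMAL_SWITCH_M[lod] * DELAY_FACTOR.get(height_px, 1.0)

for lod in sorted(NORMAL_SWITCH_M, reverse=True):
    print(lod, switch_in_distance(lod, 1080), "m at 1080p")
```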

( All imperfections become doubly apparent in VR, due to its so-far much lower angular resolution. Some headsets may boast 4K screens, but where X pixels over 1° of rendered field of view really is X pixels over 1° of actual viewed FOV in VR, they become effectively compressed on a monitor, which displays a 100° rendered FOV over something like 40° of the player's real-world FOV, depending on monitor size and how far from it the player chooses to sit. :7 )
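As a back-of-the-envelope check on that compression effect (the monitor size, viewing distance, and headset figures below are just assumed examples):

```python
import math

def monitor_ppd(horizontal_px, screen_width_m, viewing_distance_m):
    """Pixels per degree of *real-world* FOV for a flat monitor: the rendered
    FOV gets squeezed into whatever angle the screen subtends at the eye."""
    subtended_deg = 2.0 * math.degrees(math.atan((screen_width_m / 2.0)
                                                 / viewing_distance_m))
    return horizontal_px / subtended_deg, subtended_deg

def headset_ppd(horizontal_px, per_eye_fov_deg):
    """In a headset the rendered FOV is the viewed FOV, so no squeeze."""
    return horizontal_px / per_eye_fov_deg

ppd_mon, angle = monitor_ppd(1920, 0.6, 0.7)        # ~27" panel at ~70 cm
print(round(angle, 1), "deg of real FOV,", round(ppd_mon, 1), "px/deg on the monitor")
print(round(headset_ppd(2160, 100.0), 1), "px/deg per eye on a '4K' headset")
```

With those assumed numbers the monitor lands at roughly twice the angular resolution of the headset, which is the compression being described.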

The flashy graphics hound in me kind of wishes we could instead have such quality in the procedurally generated LODs and textures (I am guessing maybe no just-in-time cycles are spent on making mipmaps for procgenned stuff...), and in the filtering algorithms that sample them, that less balancing was needed in the first place, but that is of course prohibitively computationally expensive. (I've got a whole unwritten rant pressure-cooking about how we need an order of magnitude more computing power to get last-"generation" games rendering with satisfactory graphics and physics (EDIT: ...at satisfactory frame rates), and at least one more for the next (man, how I'd love to see Elite raytraced, at high resolution and recursion count, and with physics/IK that actually gives an impression of objects interacting) - a far cry from the at best 50% increase between GPU generations. :p )

I have no idea how long various terrain LODs are buffered before they are flushed - I guess it depends on available VRAM, and your distance and travelling direction/velocity trend... (I am assuming (... in the spirit of the need to make it clear I have no idea what I am talking about, this is probably a good place to add some stress on: assuming) consistent mesh subdivisions by two, as one approach, and texture patches retaining the same bitmap size, for each such half-by-half smaller and smaller patch of geometry.)
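Under those same assumptions (subdivision by two, same bitmap size per patch), a crude cache-budget estimate might look like the sketch below; every number in it is a guess:

```python
def patch_budget_mib(levels: int, patch_texture_px: int = 512,
                     bytes_per_texel: int = 4, patches_per_level: int = 12):
    """Rough VRAM kept resident if each LOD level caches `patches_per_level`
    patches, every patch carrying a same-sized texture even though the
    geometry it covers halves in extent per level.  All numbers are guesses."""
    per_patch = patch_texture_px * patch_texture_px * bytes_per_texel
    return levels * patches_per_level * per_patch / (1024 ** 2)

print(round(patch_budget_mib(8), 1), "MiB for 8 levels of cached terrain patches")
```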
 
Nope - didn't even know it could be manipulated on that level. (I guess it is possible that I, despite efforts not to, write things with a tone of certainty that projects a level of know-how I am nowhere near to possessing, in which case I'll have to work on improving my disclaimer-to-guff ratio. :p)
I see. If Update 7 doesn't fix all the shimmering introduced by the previous one, I'll take the time to install Profile Inspector and run some experiments myself.
And I'll report my findings, of course.

Anyway, I find your musings extremely interesting and relevant, mate.
Thank you for sharing!
 
The strangest thing I've found with EDO, Update 6 or not, is how, when quitting the game, it will display a single frame from anywhere between half an hour and two hours prior to logging out, as if it has been holding onto that frame in the frame buffer the entire time.

E.g., I'll launch from a station, and then two hours later I'll log off, and it will briefly display a frame of when I was launching from that station while the game is closing.
 
And I'll report my findings, of course.
Thank you in advance for sharing! :)

I hope somebody will have given some attention to the badly aligned seams between patches of geometry. Even if that doesn't remove all the "boiling" ground, I am rather tired of the pixellated grid they brought to the ground radar... :7
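For what it's worth, one common way engines hide such cracks is to interpolate the finer patch's extra border vertices onto the coarser neighbour's edge so both meshes agree along the seam; a toy sketch of the general idea (not a claim about how Frontier does it):

```python
def snap_edge_heights(fine_edge, coarse_step=2):
    """Close cracks along a patch border: vertices the coarser neighbour lacks
    are interpolated from the coarser grid, so both meshes meet at the seam.
    `fine_edge` is the list of heights along the finer patch's edge."""
    snapped = list(fine_edge)
    for i in range(len(fine_edge)):
        if i % coarse_step:                        # vertex missing on the coarse side
            lo = (i // coarse_step) * coarse_step
            hi = min(lo + coarse_step, len(fine_edge) - 1)
            t = (i - lo) / (hi - lo)
            snapped[i] = fine_edge[lo] * (1 - t) + fine_edge[hi] * t
        # vertices shared with the coarse neighbour are left untouched
    return snapped

print(snap_edge_heights([10.0, 13.0, 11.0, 14.0, 12.0]))
# -> [10.0, 10.5, 11.0, 11.5, 12.0]: shared heights kept, in-between ones interpolated
```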
 