Elite: Dangerous client performance at high refresh rates

I've been tweaking my overclock, and have been running Afterburner while playing my games to monitor CPU and GPU usage, temperatures and similar.

I noticed just now that even though the game doesn't make my rig work hard at all, the FPS sometimes dips. The thing is that neither the CPU nor the GPU is even approaching 100% usage - the GPU tends to peak at around 75% while the CPU occasionally hits 60%.

Admittedly, the drop in FPS doesn't seem to affect much; it's less than 10%, from 120 to 110 usually.

Specs, if they're useful:

i7-6700K @4.51GHz
16GB DDR4-3000
Zotac GTX 1070 Amp! @stock
Asus something-or-other 144Hz 1080p display running at 120Hz (set in Windows display options)
Win10 version 1809, fresh install

The game client is set to 1080p/ultra/120fps with the FPS capped and vsync disabled.

It occurs to me that SpeedStep is enabled, so I might try switching that off to see if it makes a difference. On the other hand, I thought there might be client-side settings I can change. What tweaks to graphics settings and the like can I make to fix this first-world problem?
 
Is that the total aggregate utilization? Are any individual logical cores of your CPU at 100%?

Most games, including Elite: Dangerous, are limited by a few demanding threads and a 4.1GHz Skylake might fall short of what's needed to maintain 120 fps all the time.
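
If you want to check, a quick per-core logger makes a pinned thread obvious. A minimal Python sketch, assuming psutil is installed (any per-core monitor, including Afterburner's per-core graphs, will show the same thing):

    import psutil

    # Poll per-core utilisation once a second. One logical core pinned near
    # 100% while the aggregate sits around 60% points at a render-thread limit.
    while True:
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        aggregate = sum(per_core) / len(per_core)
        busy = [f"core{i}: {p:.0f}%" for i, p in enumerate(per_core) if p > 90]
        print(f"aggregate {aggregate:5.1f}%  " + ("  ".join(busy) or "no core above 90%"))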
 
I looked into this in the summer, Monkey; your specs are more than enough for 1080p. ED does not seem to prioritise frame rates/graphics at normal settings. It seems to max out my RAM when other players are nearby and drops my frame rate to compensate.

If you have a second screen, try displaying Task Manager as you play and watching memory usage when the FPS drops. Then go into Windows display settings and set your monitor to 2K or 4K (a bit technical, and you will not notice it graphically, but it will tax your GPU) to see whether maxing out your GPU alters the FPS.
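
If a second screen isn't handy, a small logger can stand in for Task Manager. A rough Python sketch using psutil; the process name EliteDangerous64.exe is an assumption, so check yours in Task Manager first:

    import time
    import psutil

    GAME = "EliteDangerous64.exe"  # assumption - confirm the real process name

    def find_game():
        # Return the first process matching the game's executable name, if any.
        for p in psutil.process_iter(["name"]):
            if p.info["name"] == GAME:
                return p
        return None

    game = find_game()
    while True:
        vm = psutil.virtual_memory()
        rss = game.memory_info().rss / 2**30 if game else 0.0
        print(f"system: {vm.used / 2**30:.1f} GiB ({vm.percent}%)  game: {rss:.1f} GiB")
        time.sleep(1)

Line the output up against the moments the FPS drops; if memory isn't climbing at those points, RAM is off the hook.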
 
I have never seen Elite use more than 6-7 GB of RAM (not counting Windows and other running software), regardless of how many players are in an instance.

It does seem to be rather wonky and reliant on networking, though. As for cores: while I play I see load spread out over all of my 12 cores, so it's quite SMP-capable.
 
As for cores: while I play I see load spread out over all of my 12 cores, so it's quite SMP-capable.

It can spawn as many worker threads as it's allowed to, within reason, but it still has only a handful of render threads, and these are usually the limiting factor, CPU-wise anyway.
 
Is that the total aggregate utilization? Are any individual logical cores of your CPU at 100%?

Most games, including Elite: Dangerous, are limited by a few demanding threads and a 4.1GHz Skylake might fall short of what's needed to maintain 120 fps all the time.
Good point - I was more interested in temperatures than CPU utilisation, having just come off a game of BFV (which pushes CPUs very hard indeed), and only had aggregate utilisation displaying.

I looked into this in the summer, Monkey; your specs are more than enough for 1080p. ED does not seem to prioritise frame rates/graphics at normal settings. It seems to max out my RAM when other players are nearby and drops my frame rate to compensate.
RAM isn't an issue - I have 16GB, and E:D doesn't come close to filling it.

If you have a second screen, try displaying Task Manager as you play and watching memory usage when the FPS drops. Then go into Windows display settings and set your monitor to 2K or 4K (a bit technical, and you will not notice it graphically, but it will tax your GPU) to see whether maxing out your GPU alters the FPS.
I'm running Afterburner, so RAM utilisation is displayed on-screen while I'm in the game; it's not an issue.

It does seem to be rather wonky and reliant on networking though.
Good point - I'll check in solo to see if the same thing happens. I didn't see any other players last night, however, even though I was in open.
 
Good point - I was more interested in temperatures than CPU utilisation, having just come off a game of BFV (which pushes CPUs very hard indeed), and only had aggregate utilisation displaying.

For monitoring temperatures, I use TThrottle. https://efmer.com/
It is designed for running BOINC programs, so it is a bit of overkill.

It has a good graphical display and stores all its CPU and GPU readings for 24 hours. You can set temperature levels that will throttle down the CPU or GPU when they are exceeded.
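
If you only care about the GPU side, the same sort of readings can be polled from nvidia-smi without installing anything extra. A quick Python sketch (NVIDIA only; CPU temperatures need a separate tool such as TThrottle or HWiNFO):

    import subprocess
    import time

    # Sample GPU temperature, SM clock and utilisation once a second.
    QUERY = "temperature.gpu,clocks.sm,utilization.gpu"
    while True:
        out = subprocess.check_output(
            ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
            text=True,
        ).strip()
        temp, clock, util = (s.strip() for s in out.split(","))
        print(f"{temp} C  {clock} MHz  {util}% util")
        time.sleep(1)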
 
For monitoring temperatures, I use TThrottle. https://efmer.com/
It is designed for running BOINC programs, so it is a bit of overkill.

It has a good graphical display and stores all its CPU and GPU readings for 24 hours. You can set temperature levels that will throttle down the CPU or GPU when they are exceeded.

Thanks for the tip - my temps are fine though!

As an update, I played for about two hours earlier today and didn't see any cores going above 70% at any time, yet I was still getting framerate dips in station instances. I was in solo at the time, so I don't think there are any network optimisations I can do. Planetary bases pushed GPU usage to maximum and caused framerates to dip, but I fully expected that.

I noticed that the framerate dips tend to be front-loaded and are evident for about 10-20 seconds after dropping into an instance before stabilising. Perhaps it's due to assets loading?
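
One way I might be able to confirm that is to capture per-frame timings with Intel's PresentMon and see whether frame times are elevated only for the first 10-20 seconds after a drop. A rough Python sketch for summarising the capture; the column names are from PresentMon's CSV output, and the process name is an assumption:

    # Capture first (PresentMon is a separate download):
    #   PresentMon.exe -process_name EliteDangerous64.exe -output_file frames.csv
    import csv
    from collections import defaultdict

    # Bucket frame times by second of capture; a front-loaded dip shows up as
    # higher averages in the first 10-20 buckets after an instance load.
    buckets = defaultdict(list)
    with open("frames.csv", newline="") as f:
        for row in csv.DictReader(f):
            second = int(float(row["TimeInSeconds"]))
            buckets[second].append(float(row["MsBetweenPresents"]))

    for sec in sorted(buckets):
        frames = buckets[sec]
        avg = sum(frames) / len(frames)
        print(f"t={sec:3d}s  avg {avg:5.2f} ms  ({len(frames)} frames)")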
 
If you're referring to the frame drops/micro-stutters when you drop out at a station or drop out of hyperspace, I think a lot of people get that, don't they? In other words: it's the game. I've had it since 2.2/2.3 or thereabouts, and it got better or worse depending on the update. From what I've read it seems to affect higher-end systems worse, which is odd.

It might be a bit smoother in the beta; I need to pay more attention next time. Then again, I assume the beta servers will be less loaded and there will be less network traffic. With previous updates I've found performance worsens when they go live.
 
I noticed that the framerate dips tend to be front-loaded and are evident for about 10-20 seconds after dropping into an instance before stabilising. Perhaps it's due to assets loading?

Probably. Could be a memory or I/O bottleneck if both CPU and GPU utilization fall at the same time.

What happens if the frame rate cap is disabled?
 
If you're referring to the frame drops/micro-stutters when you drop out at a station or drop out of hyperspace, I think a lot of people get that, don't they? In other words: it's the game. I've had it since 2.2/2.3 or thereabouts, and it got better or worse depending on the update. From what I've read it seems to affect higher-end systems worse, which is odd.

It might be a bit smoother in the beta; I need to pay more attention next time. Then again, I assume the beta servers will be less loaded and there will be less network traffic. With previous updates I've found performance worsens when they go live.
It's not stutter; it's a general drop in fps (of about 10%) for ten or so seconds when I drop into an instance.

Probably. Could be a memory or I/O bottleneck if both CPU and GPU utilization fall at the same time.

What happens if the frame rate cap is disabled?
Actually, if the cap is disabled or set higher than the monitor's refresh rate, the FPS never dips below the monitor's rate.
 
Actually, if the cap is disabled or set higher than the monitor's refresh rate, the FPS never dips below the monitor's rate.

The cap itself may well be to blame then. There will be some lag in the render queue, and possibly even in the GPU clocks as it adjusts power/performance states.

What do GPU clocks look like with the cap enabled?
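
If it helps, clocks can be logged over a session and compared between capped and uncapped runs. A quick Python sketch (NVIDIA only):

    import subprocess
    import time

    # Sample the performance state and graphics clock every second; sagging
    # clocks or a drop out of P0 right after an instance load would line up
    # with the capped-FPS dips.
    samples = []
    try:
        while True:
            out = subprocess.check_output(
                ["nvidia-smi", "--query-gpu=pstate,clocks.gr",
                 "--format=csv,noheader,nounits"],
                text=True,
            ).strip()
            pstate, clock = (s.strip() for s in out.split(","))
            samples.append(int(clock))
            print(f"{pstate}  {clock} MHz")
            time.sleep(1)
    except KeyboardInterrupt:
        print(f"min {min(samples)} MHz, max {max(samples)} MHz over {len(samples)} samples")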
 