Questions For PC Gamers

I'm curious about how ED utilizes different hardware. Assuming "decent" hardware to begin with, does ED seem more CPU dependent or GPU dependent? On the CPU side, how well does ED take advantage of multiple cores? Do you see a respectable improvement by going from 4 cores to 8, for example?

Does ED do better on a certain brand of GPU / video card when all else (specs) is relatively equal? Or does ED maximize whatever you give it?

What do you believe are the bottlenecks, hardware-wise, and software-wise, for that matter? What graphic features seem to cause the greatest hit to FPS - detailed shadows, volumetric fog, something else? This last question assumes no changes in resolution.

Any other insight you have is welcome. Thanks!
 
Greetings!

First off, ED is more GPU intensive, I believe.

A slow CPU will still matter if there's a lot of physics going on, though.

I use an AMD FX 8320 8-core monstrosity. But the game doesn't feel that different from when I used my old Intel Core 2 Quad Q6600, lol.
Faster loading times, but gameplay is so similar that I didn't notice much change.

However, ARMA 3 was running absolutely pants on my Q6600, but now runs swimmingly.

Planets and docks seem to tank my FPS. But combat does not.

I wouldn't know if ED prefers any brand. Full AMD build here.
 
My old Phenom 955 rarely gets above 45% on all 4 cores, so the CPU isn't a limiting factor (although I believe the recommended spec is 4 cores or greater). My GTX 960 2GB runs near 60 FPS until docking at large starports, where it can drop into the 40s for a few seconds (loading textures, perhaps?). The graphics settings are mostly medium, but AFAIK the game can run at 4K and in VR if you've got a powerful enough rig.
 
As with any PC game, it's mostly about the video card. You won't likely bottleneck the CPU unless it's underpowered for your GPU; check the recommended GPU and go a notch or two above it, and that should be good. I went from 4 to 8 cores, but I installed a better GPU at the same time, so I don't know how much the extra cores helped, if any. I haven't bothered to check how many cores are actually in use when playing ED.
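
For anyone who does want to check, per-core usage is only a few lines of Python with the third-party psutil package. This is just a rough sketch, and the 25% threshold is an arbitrary choice, not anything ED-specific:

```python
# Rough sketch: sample per-core CPU usage for ten seconds while ED is running.
# Requires the third-party psutil package (pip install psutil).
import psutil

for _ in range(10):
    # One utilization percentage per logical core, sampled over 1 second
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    busy = sum(1 for pct in per_core if pct > 25)
    print(f"{busy} cores above 25% load: {per_core}")
```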

Don't forget about memory, though; it makes a difference. I recommend at least 16GB of the fastest memory you can afford. Using an SSD helps too, for loading times.

As far as graphics settings go, shadows and supersampling are hogs. I turn down shadows myself for better VR.
 
I found ED playable on a total potato.
Namely, a first-generation 2.4 GHz quad-core with 4 GB of DDR2 memory and a GeForce 560.
Was getting 60 FPS on low settings (except for stations; mid-40s there).
A GTX 960 can run it at ultra at 1080p at 60 FPS.
A GTX 1060 (the 6 GB one) can do 60 FPS at ultra at 1080p with 1.5x supersampling.
(Both of the above with an i5-6600; CPU usage hovers around 40%, so no bottleneck there.)

Mind you, the way the game renders stuff is incompatible with AA, so if you want it smooth you need to supersample it.
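
To put rough numbers on why supersampling is expensive: the factor applies per axis, so the pixel count (and roughly the GPU shading work) grows with the square of it. A quick back-of-the-envelope sketch:

```python
# Back-of-the-envelope: supersampling scales each axis, so the pixel count
# (and roughly the GPU shading work) grows with the square of the factor.
base_w, base_h = 1920, 1080  # native 1080p

for factor in (1.0, 1.5, 2.0):
    pixels = int(base_w * factor) * int(base_h * factor)
    print(f"{factor}x: {pixels:,} pixels rendered "
          f"({pixels / (base_w * base_h):.2f}x native 1080p)")
```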

Hope that helps
 
I found ED playable on a total potato.
Namely, a first-generation 2.4 GHz quad-core with 4 GB of DDR2 memory and a GeForce 560.

Pretty much my setup, but with a GTX 760... it runs pretty well on high settings (for me). The only issue I have is overheating.

*Edit* - And I resent my PC being called a potato!
 
I found ED playable on a total potato.
Namely, a first-generation 2.4 GHz quad-core with 4 GB of DDR2 memory and a GeForce 560.
Was getting 60 FPS on low settings (except for stations; mid-40s there).
A GTX 960 can run it at ultra at 1080p at 60 FPS.
A GTX 1060 (the 6 GB one) can do 60 FPS at ultra at 1080p with 1.5x supersampling.
(Both of the above with an i5-6600; CPU usage hovers around 40%, so no bottleneck there.)

Mind you, the way the game renders stuff is incompatible with AA, so if you want it smooth you need to supersample it.

Hope that helps


I concur with the potato comment.

One of my gaming machines is a Core2 Quad with 16GB RAM and a GTX760. It can run Horizons well. FD have tweaked the graphics performance lately, so it's even better now.

The GPU is the bottleneck.
 
Definitely more GPU-dependent than a lot of games, such as GTA 5, which seems to demand a lot of both GPU and CPU equally. My i7 6700K's average operating temperature on water when playing ED is around 38C, while in GTA 5 it's closer to 50C.

The GPU, on the other hand, is about the same between the two games, with my overclocked 1080 Ti averaging around 45 to 50C in both, running at 1080p / 60 Hz with all in-game graphics settings maxed out.
 
I run an i7 4770K Devil's Canyon at 4.0 GHz (quad core) with 32 GB of RAM, a GeForce 1080, a 256 GB SSD for my OS, and 1 TB and 2 TB SATA III HDDs for storage.
I run with everything set as high as it goes and never miss a beat at 1920x1080.

The game does seem more GPU-intensive though, as my CPU usage never seems to move much. RAM usage moves around a good bit, though.
 
GPU handles the game graphics and presentation.

CPU handles I/O and file operations. Needed graphics textures are retrieved from the HD and delivered to the GPU, as one example. It also handles the P2P networking and other game functions.

Basically, you need both to do the job right. If either one bottlenecks, your gaming experience will suffer.
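
As a purely illustrative sketch (not ED's actual engine), here's the producer/consumer shape of that relationship: a loader thread stands in for the CPU/disk side feeding assets to a render loop standing in for the GPU side. If the loader lags, the renderer stalls, which is the bottleneck in miniature:

```python
# Illustrative only: a loader thread streams "textures" from disk while the
# main loop renders. If the CPU/disk side lags, the GPU side sits waiting.
import queue
import threading
import time

texture_queue = queue.Queue(maxsize=8)

def loader():
    # Stands in for CPU/disk work: fetch assets and hand them to the renderer.
    for i in range(32):
        time.sleep(0.01)             # simulated disk read + decode
        texture_queue.put(f"texture_{i}")
    texture_queue.put(None)          # sentinel: nothing left to load

def render_loop():
    # Stands in for GPU work: consume whatever the loader has prepared.
    while True:
        tex = texture_queue.get()    # blocks (stutters) if the loader lags
        if tex is None:
            break
        time.sleep(0.005)            # simulated draw

threading.Thread(target=loader, daemon=True).start()
render_loop()
print("all textures streamed and drawn")
```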
 
Given that compute shaders are a GPU feature, Horizons is GPU-dependent.

The client, however, really doesn't like dual-core ULV CPUs (the ones with the -U suffix on Intel's laptop processors). At 4K and above, a high-end CPU becomes important; overclocking a 6700K has been shown to improve framerates in VR.
 
I'm running a relatively old i5 and a (somewhat more up-to-date) GTX 770 with 8GB of system RAM and 2GB of graphics memory.
It runs okay, and I'd agree it seems to be more fussed about the graphics card than anything else. Occasional stuttering around stations and maybe a bit near planets, though I tend to see graphical downgrades rather than stuttering.
I imagine a better gfx card would sort out much or all of this. An SSD probably wouldn't hurt for general load times. I'm not convinced more RAM or a faster CPU would do much.
 
I'm curious about how ED utilizes different hardware. Assuming "decent" hardware to begin with, does ED seem more CPU dependent or GPU dependent? On the CPU side, how well does ED take advantage of multiple cores? Do you see a respectable improvement by going from 4 cores to 8, for example?

Does ED do better on a certain brand of GPU / video card when all else (specs) is relatively equal? Or does ED maximize whatever you give it?

What do you believe are the bottlenecks, hardware-wise, and software-wise, for that matter? What graphic features seem to cause the greatest hit to FPS - detailed shadows, volumetric fog, something else? This last question assumes no changes in resolution.

Any other insight you have is welcome. Thanks!

In my case, my i7 is being used at about 25-30% across the cores, and my 1070 GPU at about 30-70% usage... so I'd say ED is more GPU-intensive than CPU-intensive.

PS: Note the GPU fan hasn't even bothered coming on, the card is so good at passive cooling :)

[Screenshot of CPU and GPU usage]
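
For anyone without a monitoring overlay: NVIDIA drivers ship with the nvidia-smi command line tool, which reports the same utilization and temperature figures. A minimal polling sketch (NVIDIA cards only):

```python
# Minimal sketch: poll GPU utilization and temperature once a second using
# NVIDIA's nvidia-smi CLI (only works on NVIDIA cards with drivers installed).
import subprocess
import time

for _ in range(10):
    result = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,temperature.gpu",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout.strip())  # e.g. "43 %, 52"
    time.sleep(1)
```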
 
Well, I didn't know either so I checked my MSI Afterburner output.

[MSI Afterburner overlay screenshot]

As you can see, it is GPU-heavy at 1440p and ultra settings.
It's less CPU-dependent and seems to use mostly one core (CPU8).

GPU: Nvidia GTX 980 TI
CPU: Core i7-5820K
 
Played Elite on a 4 GB GTX 770 and a 4770K @ 4.6 GHz when it came out.
When my 770 died, I played it on the Intel integrated graphics at 720p and 60 FPS; it wasn't bad at all.

Then I bought a GTX 1080 and play it with everything on ultra and 2x supersampling at 1080p / 60 FPS.

It's hilarious how fast the galaxy/system maps come up now; I can open the system map after every jump while the 7A fuel scoop is at work, that's how fast the GTX 1080 is.
Definitely worth the cash.

Got it for 499€ 2 weeks before the prices went up.
Lucky me.
 
IIRC, stuttering FPS around stations seems normal no matter what hardware you have. Has anyone looked at the gauges (CPU, GPU, I/O) when this happens to see if anything is being pegged at that moment?
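
Something like this rough Python sketch would catch it, logging all three gauges once a second so the stutter can be lined up against whichever one spikes. It assumes the third-party psutil package, and nvidia-smi for the GPU figure, so NVIDIA cards only:

```python
# Rough sketch: log CPU load, disk reads, and GPU load once a second so a
# station-approach stutter can be matched against whichever gauge spikes.
# Assumes psutil (pip install psutil) and an NVIDIA card for nvidia-smi.
import subprocess
import time

import psutil

last_io = psutil.disk_io_counters()
while True:
    cpu = psutil.cpu_percent(interval=1)  # blocks for the 1 s sample
    io = psutil.disk_io_counters()
    read_mb = (io.read_bytes - last_io.read_bytes) / 1e6
    last_io = io
    gpu = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader"],
        capture_output=True, text=True,
    ).stdout.strip()
    print(f"{time.strftime('%H:%M:%S')}  CPU {cpu:4.1f}%  "
          f"disk read {read_mb:6.1f} MB/s  GPU {gpu}")
```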
 
IIRC, stuttering FPS around stations seems normal no matter what hardware you have. Has anyone looked at the gauges (CPU, GPU, I/O) when this happens to see if anything is being pegged at that moment?

As you know...I have reported on this several times.

You are right that hardware seems to have little to do with this particular problem. It is vastly improved in terms of duration and degree when using a high performance GPU/CPU combo, but the issue is still there. Just less severe and for less time.

When I drop into a starport from SC, my FPS drops from the usual locked 60 FPS to about 58 FPS for less than 2 seconds, but the stuttering lasts for well over 10 seconds, even though my FPS and frame times are back up at their usual 60 FPS / 16 ms. (GTX 1080 Ti OCed to 2070 MHz boost / i7 6700K OCed to 4.6 GHz @ 1080p / 60 Hz)

GPU load is nowhere near max either. My 1080 Ti is usually at either the base clock of 1599 MHz or around 1700-ish MHz during these events, which translates into a GPU load of less than 35%.

I also run ED off my boot drive, which is a Samsung 950 PRO M.2 NVMe, one of the fastest SSDs on the planet.

This is why I have always suggested that this is some kind of instancing hiccup combined with optimization issues with the Cobra Engine.

No one has been able to prove otherwise at this point.

P.S. Since the 2.3.11 update, I have had a lot less of this issue when dropping out of Glide Mode at planetary bases. These areas were usually just as choppy as Starports, but not so much after this last update. It would appear that they are working on fixing this a bit.

My guess is that the Starport issue is more difficult since that has been in the game forever.
 