How Many Threads?

How many threads does it take to change a lightbulb? Wait, that doesn't make any sense...

The actual question is related to multi-threaded computer code. ED must be multi-threaded, but how well does it actually scale with CPU core count?

I ask as a PS4 player, because we have an 8-core processor that doesn't run terribly fast, so if ED only takes advantage of two or four cores, we're missing out on untapped performance. I have evidence that ED is under-utilizing the hardware: my PS4 consumes less electricity when running ED compared to my other games, yet there's definitely room for improvement on the graphics end. This may be due to code not optimized for all the available cores, and/or it may be due to GPU saturation (though a brilliant programmer can offload some GPU work to spare CPU cores).

On the PC side, have you folks found a sweet spot for core count? I've been out of the PC gaming arena for some years - is the average core count for the current desktop CPUs still four? Asked a different way, if you could get a faster dual-core CPU (assuming they still exist) or a slower quad-core, which would you go for? Would you even bother with an 8-core CPU? These questions are solely for playing Elite: Dangerous.
 
I was under the impression that a higher clock per core was more important than a lower clock with more cores/threads.

Would like to be proven otherwise, though.
 
I can say for a fact that the game client does not like low-voltage dual core processors with a relatively high clock (for a laptop CPU - 3.1GHz) but concomitantly low instructions per clock, even with multi-threading.

I know it likes 4 physical cores with a lower clockspeed but more cache and a higher IPC.

This is all on PC though, with GPUs that have the chops for high settings at 1080p - a GTX 1050 2GB on the latter, and a GTX 1050 Ti on the former.
 
One thread, over 1000 pages, more than 100 participants and various FD posts stating that the lightbulb will be changed but they either
  1. don't have more lightbulbs
  2. have the lightbulbs but the change is planned later down the road
  3. the lightbulb will be added in a mysterious™ manner

I left out that someone will create a poll about it with nonsensical options, and that Brett C will want to test a new thread format during the bulb change.
 
I first played on a dual-core Pentium; it was really laggy working with the system/galaxy map.

Upgraded to a quad core and saw a real improvement.

Have since upgraded to a faster quad core i5 and again saw a nice improvement, so clock speed also helps. Haven't tried an i7 yet...
 
I can say for a fact that the game client does not like low-voltage dual core processors with a relatively high clock (for a laptop CPU - 3.1GHz)
That's why the requirements state a 4-core CPU as minimum, and there have been ugly issues on CPUs with fewer. In open play with a handful of other players in supercruise, it will happily saturate an FX-6300 (6 cores).

(edit) I have the OP on ignore for what I infer was an earlier misunderstanding about how pestering on a given topic would speed things up, so any off-topicness can be attributed to that - I hadn't even realised until now :D
 
I don't think CPU is a big factor.

I'm running an old Core i7-950, which is ancient in "CPU years", and get excellent performance on a 34" monitor with all settings maxed out.

The GPU is a 980Ti and that's what's important.

I'm running the 34" at 3440x1440, and I also have a 24" at 1920x1200 running off the same GPU alongside it for surfing and watching videos or live TV while I'm playing ED.

HTH

One note from watching the forums: I have noticed that other users seem to have more problems obtaining similar performance with newer CPUs. Hard to tell if that's an architecture problem or just bad user setup.
 
That's why the requirements state a 4-core CPU as minimum, and there have been ugly issues on CPUs with fewer. In open play with a handful of other players in supercruise, it will happily saturate an FX-6300 (6 cores).

PS4 originally only had six cores available to developers, though I believe Sony made the 7th core available later on. If ED can saturate six cores, that's good news.

It does leave me wondering where the under-utilization is happening. The PS4 running ED consumes significantly less power, percentage-wise, than Uncharted 4, which is one of the most optimized games made for our console.
 
The game is capable of running across multiple cores, including logical ones created by Hyperthreading (Intel) or SMT (AMD - used in XBone and PS4). The following is my utilisation just now when I loaded up in a Station. The dip at the end is when I quit the game to show the baseline:

[attached screenshot: per-core CPU utilisation graph]


The issues for consoles are multiple. First you have the power/heat considerations that limit them to ~2GHz per core (vs 4GHz in my case, for example). Second, the APU on consoles must also perform all GPU calculations, while on PC you essentially have a whole other computer doing that part. Third, and probably most importantly, the game was not originally designed with console play in mind, which makes it very difficult to use the console's most powerful tool: the hardware homogeneity that allows for direct-to-metal coding.
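As an aside, the kind of scaling described above is easy to picture with a toy job system: split the frame's work into tasks and hand them to a pool sized to the logical core count. A minimal Python sketch (the task names are invented for illustration, and Python threads won't truly run CPU work in parallel because of the GIL - the point is the structure a task-based engine uses, not a benchmark):

```python
import os
from concurrent.futures import ThreadPoolExecutor

# Pool sized to the logical core count: os.cpu_count() reports logical
# processors, i.e. it includes Hyperthreading/SMT siblings.
LOGICAL_CORES = os.cpu_count() or 4

def simulate_task(task_id: int) -> int:
    # Stand-in for one unit of frame work (a physics island, an audio batch...).
    return sum(i * i for i in range(10_000)) + task_id

def run_frame(task_count: int) -> list:
    # Fan the frame's tasks out across the worker pool.
    with ThreadPoolExecutor(max_workers=LOGICAL_CORES) as pool:
        return list(pool.map(simulate_task, range(task_count)))

results = run_frame(32)
print(len(results), "tasks finished using", LOGICAL_CORES, "logical cores")
```

On an SMT machine the pool ends up twice as wide as the physical core count, which matches the "including logical ones" behaviour described above.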
 
Elite seems to scale well across cores, utilises my 12 threads nicely and load is very, very low really.

I get bigger CPU spikes from all the nonsense "me-too!" crapware running in the background.
 
The issues for consoles are multiple. First you have the power/heat considerations that limit them to ~2GHz per core (vs 4GHz in my case, for example). Second, the APU on consoles must also perform all GPU calculations, while on PC you essentially have a whole other computer doing that part. Third, and probably most importantly, the game was not originally designed with console play in mind, which makes it very difficult to use the console's most powerful tool: the hardware homogeneity that allows for direct-to-metal coding.

I understand what you're saying. I'm not expecting my PS4 to compete with a high-end PC gaming machine. I'd just like to see ED use the hardware I've got, and according to my measurements, it's only using a fraction of it. I was curious if this might be due to poor multi-threading / parallelism, but it appears this is not the issue.
 
That's why the requirements state a 4-core CPU as minimum, and there have been ugly issues on CPUs with fewer. In open play with a handful of other players in supercruise, it will happily saturate an FX-6300 (6 cores).
The thing is, it didn't use to - my laptop could quite happily drive a full-fat GTX970 over multiple screens before Horizons (with a few settings turned down and outside of conflict zones). These days, I get clear artefacts of CPU saturation at 1080p even in stations. The game has gotten much more demanding, not always justifiably so, IMO.
 
Bear in mind that CPU utilization is also offloaded to graphics drivers and other support processes. Even if an application is not specifically coded for multithreaded operation, various OS support services (audio/video/networking/etc.) can often run on alternate cores at the behest of the application.

How an application uses system resources is often as much a choice of which development API a programmer chooses as it is how the core logic is written.
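That point about API choice can be illustrated with two ways of coding the same calls - one keeps everything on the calling thread, the other lets the runtime overlap the waits. A toy Python sketch (the blocking call is simulated with a sleep; it stands in for the kind of driver/OS work mentioned above):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_blocking_call(task_id: int) -> int:
    # Stand-in for a call that blocks on the OS (audio mix, network poll):
    # while this thread sleeps, the OS is free to run the others elsewhere.
    time.sleep(0.05)
    return task_id

# API choice 1: a plain loop - every call waits for the previous one.
start = time.perf_counter()
sequential = [fake_blocking_call(i) for i in range(8)]
serial_time = time.perf_counter() - start

# API choice 2: the same calls through a pool - the waits overlap.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=8) as pool:
    overlapped = list(pool.map(fake_blocking_call, range(8)))
pool_time = time.perf_counter() - start

print(f"serial: {serial_time:.2f}s, pooled: {pool_time:.2f}s")
```

The per-call logic is identical in both versions; only the API it was handed to changed, which is exactly why utilization depends so much on that choice.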
 
I understand what you're saying. I'm not expecting my PS4 to compete with a high-end PC gaming machine. I'd just like to see ED use the hardware I've got, and according to my measurements, it's only using a fraction of it. I was curious if this might be due to poor multi-threading / parallelism, but it appears this is not the issue.

I'm afraid not. Your previous example of Uncharted 4 is actually excellent in this case. As I recall, originally Naughty Dog had wanted to target 60FPS but they were finding that it was taking too long to craft their environments with that level of bespoke optimisation. They *could* have done it but it would have been a case of developing things like geometry/terrain optimisations at the cost of actual content. This is extremely similar to the core difficulties Frontier would face and they have nowhere near the level of experience at developing for PS4 that a company like Naughty Dog has.

Could they make Elite run better on consoles? Absolutely they could; that's not at issue. It's much more a question of whether they can/will devote the time needed to handcraft assets with great efficiency for just one system. It's for this exact same reason that we will sadly never see Uncharted 4 (and many other titles) on PC - by optimising specifically for PS4 you create a product that would not be able to run consistently on other hardware.

When you develop for Windows you quite literally have to do it in a very un-optimised way to make sure that you are complying with all the different hardware types and standards. Naturally this is also how you do most of your development on console, because the development tools are very similar nowadays, except that when you run into a difficult problem there are hardware-based optimisations you know you can make because everyone has (almost) identical gear. Winding these optimisations out of a game like Uncharted 4 is simply not economically viable. For example, it took Rockstar almost two years to release GTA V on PC, and it wouldn't have happened if it wasn't already one of the most successful games of all time.
 
From my observations on AMD hardware, ED spreads its load *cough* across all available cores/threads. Load per core goes down the more cores there are - at least that's the behaviour under Windows, but I'd imagine it's much the same under BSD, since the PS4 doesn't have particularly powerful cores (Jaguar being low-power and all). If there's enough GPU power but not enough CPU cores (like on my laptop, for example, which uses a low-TDP dual-core APU), I notice the CPU is loaded to 100% on both cores while the graphics sits "idle" most of the time.

For some tasks there seems to be higher per-core load (route plotting, zoning, network polling (message board etc.)). There are some spikes, but that's quite normal. ED benefits in those cases (also the galaxy map) from faster per-core speeds. The reason your PS4 is using less power may be that the COBRA engine in ED utilises all cores, but not fully - it basically balances itself out. It could also be that other titles use fewer cores but still draw more power. Why? Simple: when all of the PS4's cores are in use they run at 1.8GHz and feed the GPU at that pace, while other titles may only use four cores but then have more headroom to run at up to 2.4GHz per core (depending on the PS4 model). More MHz usually also means more core voltage, which can lead to higher power draw and heat - depending also on how they feed the GCN cores inside the GPU.

At the moment it's pretty balanced. Is this ideal? Could it be optimised more? Maybe - only the devs can tell.
With today's slim form factors and power-vs-heat constraints, it isn't as simple as "not utilising all cores" anymore.
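The clock-vs-voltage trade-off behind that reasoning can be sketched with the standard dynamic-power rule of thumb, P ≈ cores × f × V². A toy calculation (all voltage figures below are invented for illustration - real PS4 voltage/frequency tables are not public):

```python
# Dynamic CPU power scales roughly as P ~ cores * f * V^2 (the classic CMOS
# approximation). Voltages here are made-up illustrative values, not real
# PS4 figures.
def relative_power(cores: int, freq_ghz: float, volts: float) -> float:
    return cores * freq_ghz * volts ** 2

# Scenario A: all eight cores busy at a modest clock and voltage.
all_cores = relative_power(cores=8, freq_ghz=1.8, volts=0.9)

# Scenario B: only four cores busy, boosted higher and needing more voltage.
few_cores = relative_power(cores=4, freq_ghz=2.4, volts=1.15)

# Fewer cores can still draw more power once the voltage bump is factored in.
print(f"8 x 1.8GHz: {all_cores:.2f}   4 x 2.4GHz: {few_cores:.2f}")
```

With these (invented) numbers, the four boosted cores come out ahead of all eight slower ones, which is why "uses fewer cores" and "draws less power" don't have to go together.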

Cheers!
 