Hardware & Technical: Computer Build to run Elite Dangerous

I certainly hope it runs it, as we want everyone involved and the rig is no slouch, but I'd be disturbed if it ran at its highest detail settings, as that would imply the highest detail is not in fact very high.

I reckon the latest PCs ought to be capable of double the gaming throughput (an SLI'd GTX 780 Ti even more so), and if FD aren't offering options to consume that, then the game will have a rather short lifespan.

Graphics aren't the be-all and end-all, but people do expect modern games to test their kit (for all the right reasons).

I think we need to be realistic about the sort of stuff we're seeing at the moment... We're seeing a handful of spacecraft 1-2 km away, with some gun/explosion effects, all on a space background. It shouldn't prove taxing. Compare this to, say, Bioshock, released earlier this year, where there's a huge amount of detail feet away, and all around.

The game generally isn't going to push visuals to any great degree until you get into some more unusual settings, close to big structures/craft, and even then it's nothing that's unusual in titles from years ago.


I honestly can't see this game needing to stretch processing except in unusually complex scenes, or until we get atmospheres and ground detail in the future.


That said, if Dr Wookie's comments are correct, that a simple distant space battle scene can make an i7 with a pretty reasonable graphics card struggle, I'm clearly wrong!


EDIT: A PM from Cosmos (who is on a Q6600 like me) seems to back Wookie up! - "the last scenario slows right down, with all those ships, and it would appear it's bottlenecked by the CPU... but there is a ridiculous amount of stuff going on. GPU was at 40% load and was getting <20fps."

This surprises me, as I can't understand where all that CPU is really required!
 
We seem to be getting mixed messages then :S

CPU requirements seem low, but beware that this is an alpha, and the full game should be more demanding than this current alpha version.

GPU requirements do vary according to the scenario and options. The good thing is that if you dial down the settings, the game seems to be playable, and still pretty, on lower-class hardware.
 
Maybe it's the AI engine that's hammering the CPU; the actual online play might be lighter, as the CPU won't be "thinking" so hard, if you catch my drift :S
 

Squicker

EDIT: A PM from Cosmos (who is on a Q6600 like me) seems to back Wookie up! - "the last scenario slows right down, with all those ships, and it would appear it's bottlenecked by the CPU... but there is a ridiculous amount of stuff going on. GPU was at 40% load and was getting <20fps."

This surprises me, as I can't understand where all that CPU is really required!

No doubt the code will be heavily improved for release, and remember a top-spec Haswell is way ahead of a 6600, maybe up to 3 times as potent for CPU intensive stuff.
 

Squicker

Naah, not even twice (depending on clock speed, of course).

I took an old test machine we have here and ran a Cinebench comparison: the same box first with the Q6600, then upgraded to a 4770K (new mobo, obviously), and the new chip was over 3 times as fast. Even PCMark, a more aggregated measure, came out twice as quick.

In every measure that matters between the two, I found the new chips destroy the 6600.

Granted, that won't translate into 3 times the application performance, which is why I said 'up to 3 times as potent for CPU intensive stuff', but there is no doubt in my mind the 6600 is annihilated by the newer chips.
 

Squicker

Here we are, this guy did the same thing as I did, and you can see how his 4770 compares with his 6600 in CPU-bound work. The 3D stuff that is traditionally bottlenecked by the CPU - low-res tests where the CPU has to generate geometric data for the GPU as fast as it can - is double the speed.

http://www.legitreviews.com/upgrading-from-intel-core-2-quad-q6600-to-core-i7-4770k_2247

It's in no way surprising: whilst the 6600 has aged remarkably well, Intel have put together a world-leading CPU architecture and there's nothing that can realistically compete.
 
Here we are, this guy did the same thing as I did, and you can see how his 4770 compares with his 6600 in CPU-bound work. The 3D stuff that is traditionally bottlenecked by the CPU - low-res tests where the CPU has to generate geometric data for the GPU as fast as it can - is double the speed.

http://www.legitreviews.com/upgrading-from-intel-core-2-quad-q6600-to-core-i7-4770k_2247

It's in no way surprising: whilst the 6600 has aged remarkably well, Intel have put together a world-leading CPU architecture and there's nothing that can realistically compete.

Let's take that example of yours.

Look at - http://www.legitreviews.com/upgrading-from-intel-core-2-quad-q6600-to-core-i7-4770k_2247/7
Ignore the highest frames per second, because who cares if something is 60 or 600 fps... Look at the lowest FPS. Note how similar they are?

If we look at the Battlefield 3 results - http://www.legitreviews.com/upgrading-from-intel-core-2-quad-q6600-to-core-i7-4770k_2247/8
Even on ultra at high resolution, we still get a perfectly good 45fps out of the Q6600.

Surely we're not saying ED is more 3D-intensive than these FPS shooters? Consider how much is being displayed and calculated: most of any ED screen is just background, whereas in these shooters most of the screen has to be calculated.


Anyway... I guess time will tell... My plan is to stick with the Q6600 well into next year, at least... I've not found a single thing I cannot play on highest settings yet :)
 

Squicker

Let's take that example of yours.

Look at - http://www.legitreviews.com/upgrading-from-intel-core-2-quad-q6600-to-core-i7-4770k_2247/7
Ignore the highest frames per second, because who cares if something is 60 or 600 fps... Look at the lowest FPS. Note how similar they are?

If we look at the Battlefield 3 results -

But BF is totally GPU-bound except in MP - that is well understood - and the lowest frame rates are at settings that are GPU-bound (high res), so the more potent CPUs do not get a chance to stretch their legs; they are probably sat there doing nothing. Remember this guy did not swap out his GPU, so of course at some point the measurements will be the same.

I am merely making a comparison of the two CPUs, related to another comment earlier. I have already said that just because the Haswell chips are 3 or more times quicker than the 6600 doesn't mean that the game will run 3 or more times faster.

I am merely saying, in relation to a specific comment earlier, that the Haswell chips are immensely more competent than the 6600 at CPU-intensive activities. People are misinterpreting what I have said here.

That's one of the reasons why I am wondering what Wookie's CPU activity is like.
 

Squicker

How do I find that out?

Right-click the Taskbar and run Task Manager, then make sure you can see all cores:

Click the 'More details' arrow, go to the Performance tab, right-click the main CPU graph and select the option to show all logical cores.
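
If you'd rather log this than eyeball Task Manager, here's a minimal sketch of the same idea - assuming Python with the third-party psutil package installed (nothing to do with the game itself) - that prints the load on each logical core once a second while a scenario runs.

# Print per-logical-core CPU load once a second (requires: pip install psutil)
import psutil

try:
    while True:
        # cpu_percent(percpu=True) returns one utilisation figure per logical core
        loads = psutil.cpu_percent(interval=1.0, percpu=True)
        print("  ".join(f"core{i}: {load:5.1f}%" for i, load in enumerate(loads)))
except KeyboardInterrupt:
    pass  # stop logging with Ctrl+C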
 
Just tried that on my system - Q6600 at stock clocks with an 8800 GTS card (specs below) - and with the last scenario run windowed at 800x600 it peaked at 89% across all cores. It appears that scenario gives CPUs a pretty good workout.
 

Squicker

Just tried that on my system - Q6600 at stock clocks with an 8800 GTS card (specs below) - and with the last scenario run windowed at 800x600 it peaked at 89% across all cores. It appears that scenario gives CPUs a pretty good workout.

Nice that it uses all the cores. So you were nearly at the CPU's limits by the sound of it. I think the 8800 is probably a good match for the Q6600, but what happens when you try to run higher resolutions?
 
Right-click the Taskbar and run Task Manager, then make sure you can see all cores:

Click the 'More details' arrow, go to the Performance tab, right-click the main CPU graph and select the option to show all logical cores.

Thanks :)! That scenario had my processors running at 15-35% (i7 4770, Haswell I think, 3.4 GHz). The load seems to be pretty well distributed amongst the 8 threads.

I love buzzing the anacondas, but my gpu doesn't! The frame rate gets down to ~6 FPS at 1920x1080 on high settings :p!
 

Squicker

Thanks :)! That scenario had my processors running at 15-35% (i7 4770, Haswell I think, 3.4 GHz). The load seems to be pretty well distributed amongst the 8 threads.

I love buzzing the anacondas, but my gpu doesn't! The frame rate gets down to ~6 FPS at 1920x1080 on high settings :p!

Ah thanks for checking that.

So you are very much GPU-bound then; the powerful CPU is sat waiting to deliver data to your card.

The other poster said, "A PM from Cosmos (who is on a Q6600 like me) seems to back Wookie up! - 'the last scenario slows right down, with all those ships, and it would appear it's bottlenecked by the CPU... but there is a ridiculous amount of stuff going on. GPU was at 40% load and was getting <20fps.'"

So, as expected, we go from one poster being CPU-bound in that scenario with a Q6600 to your case, where the CPU is twiddling its thumbs because the GPU cannot cope. The only reason a GPU would run at only 40% load is a CPU bottleneck - either the hardware, or the app itself being written poorly (not multithreaded, for example, but you have proven that is not the case).
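
To make that check less of a guess, here is a rough sketch of the same diagnosis in script form - assuming an NVIDIA card with the nvidia-smi tool on the PATH and Python with psutil installed, neither of which is anything official from FD. A GPU sitting well under 100% while one core is pegged points at a CPU bottleneck; a GPU at ~100% with idle cores points at the GPU being the limit.

# Sample GPU utilisation (via nvidia-smi) alongside per-core CPU load
import subprocess
import psutil

def gpu_load_percent():
    # nvidia-smi prints GPU utilisation as a bare number per GPU, e.g. "43"
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True)
    return float(out.stdout.strip().splitlines()[0])

cores = psutil.cpu_percent(interval=1.0, percpu=True)
print(f"GPU: {gpu_load_percent():.0f}%   busiest core: {max(cores):.0f}%")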

I observed that, "remember a top-spec Haswell is way ahead of a 6600, maybe up to 3 times as potent for CPU intensive stuff."

This is the evidence of that: your CPU is far more capable than your GPU - a GTX 660? - and more capable (as expected) than the 6600 CPUs.

Throwing multiplayer into the equation seems to increase load on CPUs judging by experience with BF3, so come March 2014 the 6600 people are going to be CPU shopping and you are going to be GPU shopping!

I am a hi-fi and performance car man, and the way I view it is that I'd always rather the core of a system be the most competent aspect. I'd rather my speakers were slightly less capable than my amp, because then I know I am getting the best I can from them, but the best speakers in the world cannot make a 200 quid Sony amp sound good. The same with a car: I'd rather my car was over-powered for the suspension, as I can always rein in my driving if I lose traction, but the best suspension/tyres in the world are not going to greatly improve the experience of driving a 1.6 Ford Focus!

Same with a PC: I think a video card sat doing nothing is a far worse scenario than the CPU waiting on the video card. You can always turn off graphical bells and whistles if the GPU is overloaded, but if the CPU cannot generate enough geometric data to send to the GPU in the first place, then even going down to 800x600 isn't going to help you. A lot of people just fall for marketing speak and buy the best GPU they can, forgetting something has to actually do the grunt work of feeding data to it. Plus, a powerful CPU benefits everything, while a powerful GPU only benefits games.

Anyway, the code is bound to go through a ton of optimisation before release, so what we see now is in no way representative of production, and the multiplayer angle will change the behaviour of a system again.
 

Squicker

Just for a laugh, I tried 1280x720 at the LOW preset... it still looks pretty damn good to me :D! Here is a video

I am fairly heartened now that I know the lower-spec systems are sweating (not in a mean way!). But I read someone say they thought a Q6600 with an old GPU would run it well at high res, and I just thought, "that sounds ominous", because that really ought not to be the case, and clearly it is not.

For a game to have a good life-cycle it needs to stretch current kit and be able to be further optimised for the next gen of hardware IMO.

Tonight I played Skyrim, and whilst it is starting to show its age now, I've been playing it for what - three years? - and I would not be doing that if it looked crap now. When I got Skyrim I could not run it with everything on, but now I can run it with everything on, all the HD texture packs and mods, and still get 120 FPS from it. That's how far hardware has come in the lifecycle of that game, and that's what I am hoping for with ED: something I can keep getting more and more out of, until in maybe 2 or 3 years it's time to send it to the great game graveyard in the sky!

BTW, when you dropped the res, how did your FPS fare?
 
I am fairly heartened now that I know the lower-spec systems are sweating (not in a mean way!). But I read someone say they thought a Q6600 with an old GPU would run it well at high res, and I just thought, "that sounds ominous", because that really ought not to be the case, and clearly it is not.

While I am not too bothered about getting the latest and greatest graphics card, it definitely does seem like those who do splash out will get their money's worth! Given that this is only an alpha, I think FD have done a marvellous job of making it scalable so that it will run on a wide range of machines while making good use of high-end cards as well. Someone with a GTX 780 Ti said they were getting >70 fps on max settings, and good luck to them :D
 