Recommended specs

Does anyone know what they will finally end up being after optimization? What are they aiming for? If they are too high it will hurt sales, because anyone willing to upgrade their computer is probably a backer already. Or are sales not that important for this project, and the only issue is getting the vision made? I'd suggest a 1GB card, a dual-core CPU and 4GB of RAM as the recommended specs. That would make sure most people can run the game on their system.
 
Well, with my old rig (an E8400 coupled with a 560 Ti 1GB) the game runs pretty well with everything maxed out. So I think the final release will have specs close to what you suggest.
 
That's good news. I don't have to fork out another 300-400 for a high-end card; I just have to turn some stuff down if it's too slow. It would be interesting to see whether people can tell the difference between 40 fps and 80 fps. I'm guessing no, unless the fps counter is shown.
 
Well, my dual core i5 2.5 GHz laptop with Intel HD4000 graphics is not up to the job with the beta.

That may change, of course, but I'm not expecting it to. Frontier have always recommended a quad core.

Good thing I also have a robust desktop machine that's eminently suitable. :)
 
Well, with my old rig (an E8400 coupled with a 560 Ti 1GB) the game runs pretty well with everything maxed out. So I think the final release will have specs close to what you suggest.

That's good to hear! So my prediction that my 5-6 year old system (with a new gfx card) will run the game well is hopefully correct :)

I don't plan on upgrading this year :)
 
Well, my dual core i5 2.5 GHz laptop with Intel HD4000 graphics is not up to the job with the beta.

That may change, of course, but I'm not expecting it to. Frontier have always recommended a quad core.
Isn't that most likely down solely to the gfx card which would be a huge bottleneck?
 
Isn't that most likely down solely to the gfx card which would be a huge bottleneck?

That may be the case but I wouldn't be too sure. Integrated graphics aren't the pile of poo they used to be.

I was able to play "Tomb Raider 2014" quite acceptably on medium settings with low shadows.
 
Hmmm

Well, I just upgraded my graphics card from a Radeon HD 5770 with 1GB to a GeForce 760 with 4GB GDDR5 (CPU is an AMD Phenom 3GHz, 6 core, 32GB DDR3) and the difference in frame rate was mind-bending. Awesome. The game was playable with the 5770 but I certainly wouldn't want to go back to it.
No Way !!!
Gary.
 
That may be the case but I wouldn't be too sure. Integrated graphics aren't the pile of poo they used to be.

I was able to play "Tomb Raider 2014" quite acceptably on medium settings with low shadows.

Well, we have folks running the game (in their words, fine at max settings) on E8400s. I'd suspect that couldn't be any faster (probably slower) than your i5? So the difference must come down to the gfx card?

Well, I just upgraded my graphics card from a Radeon HD 5770 with 1GB to a GeForce 760 with 4GB GDDR5 (CPU is an AMD Phenom 3GHz, 6 core, 32GB DDR3) and the difference in frame rate was mind-bending. Awesome. The game was playable with the 5770 but I certainly wouldn't want to go back to it.
No Way !!!
Gary.

Yeah, that card must have been a huge bottleneck, surely?
 
Well, my dual core i5 2.5 GHz laptop with Intel HD4000 graphics is not up to the job with the beta.

Don't know much about laptop GFX abilities, so I can only refer you to this: http://www.tomshardware.co.uk/forum/85285-35-graphic-memory-intel-4000-review-performance
Doesn't look like it's ever going to make the grade, unfortunately. :(

As for minimum specs, soon after alpha 1 a bunch of us tested it on various 'obsolete' machines & found that it ran acceptably well provided the CPU & GPU were not wildly out of step with each other, or to put it another way, so long as they were fairly well balanced in their capabilities & neither bottlenecked the other.
My 'obsolete' test rig spec is: E8400 @ 3.3GHz, Nvidia GTX 460, 4GB DDR2 memory, & all scenarios were very playable at max settings @ 1920x1080. As expected, the frame rate was lowest in 'Factions' but at no point did the game become a slide show, & I expect it could be improved by lowering the settings a little.
 
That's good news. I don't have to fork out another 300-400 for a high-end card; I just have to turn some stuff down if it's too slow. It would be interesting to see whether people can tell the difference between 40 fps and 80 fps. I'm guessing no, unless the fps counter is shown.

More frames per second is a huge advantage tbh.
 
More frames per second is a huge advantage tbh.

Love statements like this!


So 60 fps is a huge advantage over 59?

200fps is a huge advantage over 100? Or 60?

And define huge? As much of an advantage as someone wearing an Oculus? Or someone playing on 3 monitor vs 1? Or someone with a £150 controller instead of a £15 simple joystick?

Exactly... It's a blanket statement without real meaning. :)


So let's be realistic. This isn't a twitch shooter, so anything over 30fps is a nicety, not a necessity. Personally if I can run at 60fps most of the time (vsync locked), fine!
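
To put rough numbers on it (just my own back-of-the-envelope arithmetic, nothing official): 59 vs 60 fps is about a 0.3 ms difference per frame, while 30 vs 60 fps is nearly 17 ms per frame, so "more fps" means very different things depending on where you start. A quick sketch if anyone wants to check the figures:

Code:
# Back-of-the-envelope frame-time arithmetic for the fps figures in this thread.
for fps in (30, 40, 59, 60, 80, 100, 120, 200):
    print(f"{fps:>3} fps -> {1000.0 / fps:5.1f} ms per frame")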
 
Don't know much about laptop GFX abilities, so I can only refer you to this: http://www.tomshardware.co.uk/forum/85285-35-graphic-memory-intel-4000-review-performance
Doesn't look like it's ever going to make the grade, unfortunately. :(

As for minimum specs, soon after alpha 1 a bunch of us tested it on various 'obsolete' machines & found that it ran acceptably well provided the CPU & GPU were not wildly out of step with each other, or to put it another way, so long as they were fairly well balanced in their capabilities & neither bottlenecked the other.
My 'obsolete' test rig spec is: E8400 @ 3.3GHz, Nvidia GTX 460, 4GB DDR2 memory, & all scenarios were very playable at max settings @ 1920x1080. As expected, the frame rate was lowest in 'Factions' but at no point did the game become a slide show, & I expect it could be improved by lowering the settings a little.
That's very useful to know! So that bodes well for my similar quad-core processor (E6600 @ 3.4) with a 7870XT.
 
I have head tracking and a 1080p display with vsync on and mostly maxed graphics, except AA, which I don't bother with at native res. My GTX 650 does OK but runs much better at 1600x1200.

Luckily the game scales better than most so it doesn't look too bad.

I ended up capping my FPS to 30 using RivaTuner Statistics Server and it still looks very smooth, and head tracking has a lot less stutter, which is my main priority since most of the time the game experience is so much better with it.

If your system isn't quite powerful enough, capping with vsync on means you're never going to get tearing, and your GPU doesn't get stressed as hard and holds a more consistent frame rate than when you let it run as fast as it can.

So it's well worth it if your GPU is struggling a little.
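
In case it helps anyone picture what a cap is actually doing, here's a minimal sketch of a fixed frame-time budget loop (purely an illustration of the idea, not how RivaTuner Statistics Server is implemented internally; the 30 fps target and the stand-in render_frame() are my own assumptions):

Code:
import time

TARGET_FPS = 30                    # assumed cap, matching the post above
FRAME_BUDGET = 1.0 / TARGET_FPS    # ~33.3 ms per frame

def render_frame():
    """Stand-in for the game's actual rendering work."""
    pass

for _ in range(300):               # a few seconds' worth of frames for the demo
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    # Sleep away whatever is left of the budget so the GPU idles instead of
    # racing ahead; frame pacing stays even and the card runs cooler.
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)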
 
Well, my dual core i5 2.5 GHz laptop with Intel HD4000 graphics is not up to the job with the beta.

That may change, of course, but I'm not expecting it to. Frontier have always recommended a quad core.

Good thing I also have a robust desktop machine that's eminently suitable. :)

Another thumbs down for HD 4000 graphics here. The fact that it's married to a decent Core i7 is immaterial; I get 5-8 fps if I get near anything fancy like a Coriolis!

Haven't even bothered on my ancient Windows desktop yet, and I'm not keen to Boot Camp my Mac, so a wee purchase could be in order...
 
Well, my dual core i5 2.5 GHz laptop with Intel HD4000 graphics is not up to the job with the beta.

That may change, of course, but I'm not expecting it to. Frontier have always recommended a quad core.

Good thing I also have a robust desktop machine that's eminently suitable. :)

Your current dual-core CPU might be the problem since, as you mention, they do recommend a quad core. Michael has said that once the game is optimised they are hoping the suggested specs will come down a bit, so once that happens it might be OK. No need to do any damage to your bank account just yet, I think ;)
 
Love statements like this!


So 60 fps is a huge advantage over 59?

200fps is a huge advantage over 100? Or 60?

And define huge? As much of an advantage as someone wearing an Oculus? Or someone playing on 3 monitor vs 1? Or someone with a £150 controller instead of a £15 simple joystick?

Exactly... It's a blanket statement without real meaning. :)


So let's be realistic. This isn't a twitch shooter, so anything over 30fps is a nicety, not a necessity. Personally if I can run at 60fps most of the time (vsync locked), fine!

You know nothing, Jon Snow.

Just like many ignorant people out there think that 30 fps is silky smooth like TV, I used to think 60 was the best you could want on a TFT, that 85 marked the line on a CRT beyond which I could no longer tell higher refresh rates apart, and that nobody would ever need or want more.

Then I played DiRT 3 on a 120 Hz screen at 120 fps. It's definitely not a twitch game either. It made a huge, huge difference, and in other games as well. Even if my rig can't give me a steady 120 fps, the difference is always noticeable.

I never want to go back.

Google FPS comparison.
 
You know nothing, Jon Snow.

Just like many ignorant people out there think that 30 fps is silky smooth like TV, I used to think 60 was the best you could want on a TFT, that 85 marked the line on a CRT beyond which I could no longer tell higher refresh rates apart, and that nobody would ever need or want more.

Then I played DiRT 3 on a 120 Hz screen at 120 fps. It's definitely not a twitch game either. It made a huge, huge difference, and in other games as well. Even if my rig can't give me a steady 120 fps, the difference is always noticeable.

I never want to go back.

Google FPS comparison.

Is it worth paying big bucks for Titan cards in SLI, though, just to get this game up to 120 fps? I honestly can't tell the difference between 80 and 50. Actually, once in Project CARS I had it up to 100fps but the display gave me a headache with micro stutters. I suppose you have to turn things down to get 80 on a normal games machine, but does it really look any uglier? I don't think I will bother upgrading unless the game turns into a slide show whenever there's lots of stuff on the screen.
 