Performance issues.

I have yet to find a manager who thinks that refactoring and optimization are A Good Thing™.

For the right money, I can be hired ;-)

@CYB34: I'm not arguing that this card should run everything on high. I'm arguing that it shouldn't boil my GPU while my viewport is the inside of a box. Using freelook is seizure-inducing, really. Your testimony also underlines my core point: on beefy hardware, you suddenly have twice the GPU usage with 1/3 of the visible content. Something is not right.
 
If a GTX660 is comparable to an ATI R9 270 like someone said here, then yeah, it can run ED at ultra. I have the ATI R9 and it runs fine on ultra settings, also in stations.
The fans are constantly blowing at 100%, but the game itself runs perfectly smooth (aside from the stutter when jumping to hyperspace, but I really think that's lag).

I highly doubt that a midrange GPU released 3 years ago can run this game on Ultra. Maybe in deep space.
I upgraded my "recommended" GTX770 because it was unable to maintain a constant 60fps inside big stations and RES sites.

People have different standards: some don't care about a 5fps drop, some are OK playing at 30fps.

I'm one of these picky mof*s:

[attached screenshot: frame counter reading 59 fps]
 
I highly doubt that a midrange GPU released 3 years ago can run this game on Ultra. Maybe in deep space.
I upgraded my "recommended" GTX770 because it was unable to maintain a constant 60fps inside big stations and RES sites.

People have different standards: some don't care about a 5fps drop, some are OK playing at 30fps.

I'm one of these picky mof*s:

View attachment 31260

Actually, even an R9 280X Crossfire and a Titan SLI are unable to maintain a constant 60fps thanks to stuttering (the Crossfire doesn't even work, and it's eleven-year-old tech).
 
Is that on a 60Hz screen? Might be a rounding issue due to vsync ;P

Oh I know about that.
The major lie by monitor manufacturers and Microsoft is that 59Hz works the same as 60Hz.

Those who have this issue should use the video driver's vsync option and disable the in-game one.
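To make the rounding point concrete, here's a back-of-the-envelope sketch (plain Python, simplified numbers; real panels use timings like 59.94Hz) of why a 59Hz display fed by a 60fps-capped game hitches roughly once a second:

```python
# Sketch: a 59 Hz display cannot keep up with a 60 fps render cap.
# Illustrative arithmetic only, not a measurement of any real driver.

refresh_hz = 59   # what the monitor actually scans out
render_fps = 60   # what the game produces under its own frame cap

# With vsync on, the display shows only 59 of the 60 rendered frames
# each second, so about one frame per second is dropped (or a refresh
# repeats a stale frame), which reads as a periodic hitch.
dropped_per_sec = render_fps - refresh_hz
print(dropped_per_sec)        # 1 hitch per second

# The mismatch is also visible in per-frame timing:
print(1000 / refresh_hz)      # ~16.95 ms per display refresh
print(1000 / render_fps)      # ~16.67 ms per rendered frame
```

Letting the driver sync to the display's true refresh rate, as suggested above, removes the mismatch entirely.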
 
Try dropping the resolution a bit. The 6xx series of nVidia cards, in my opinion, was brought out at the time of the PS3 and Xbox 360, when gaming at a resolution of 720p was perfectly acceptable, and to me the cards seem optimised to run at that resolution. My GTX680 runs 100+ fps in space and 50-60fps in stations / RES. That's at 720p, max quality settings, with supersampling set to 2. The 660 was a more budget card but was still good. I would imagine you'll be able to run it similarly, maybe with supersampling set to 1.

EDIT: to expand on my resolution point: if the game is run at 1080p, the frame rate drop-off is significant, reducing to 60-70 in space and 30-40 in stations / RES, and that's with supersampling set to 1. I play through a 50" TV and can honestly say 720p with AA and quality on full looks better than 1080p with AA and quality settings turned down.
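For what it's worth, the pixel-count arithmetic backs this up: 1080p pushes 2.25x the pixels of 720p, so a roughly halved frame rate is in the expected ballpark when the GPU is fill-rate bound (ignoring CPU-side work and supersampling, which multiplies the internal resolution further):

```python
# Pixel counts behind the 720p vs 1080p frame-rate gap.
p720 = 1280 * 720     # 921,600 pixels per frame
p1080 = 1920 * 1080   # 2,073,600 pixels per frame

# The GPU shades 2.25x as many pixels at 1080p, which lines up with
# the reported drop from 100+/50-60 fps to 60-70/30-40 fps.
print(p1080 / p720)   # 2.25
```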
 
http://imgur.com/a/GfC2r

46 fps in station
51 fps leaving station
57 fps heavy asteroid RES (stutters in metal rings)
79 fps looking at station from outside
90 fps looking at planet

Max settings 1080p with 7850 OC (1GHz) 1GB and i5 3570.

The game should run at 60fps+ nearly everywhere except in stations with cards like an R9 280X or GTX 770/780.
 
I have a GTX 750 and I have also noticed the fans going nuts, especially in stations, as the OP said. I get 60fps most of the time, but it does drop in stations; that is with in-game vsync switched on. Display is at 1080p. Play-wise it seems fine, but yeah, pretty noisy.
 
http://imgur.com/a/GfC2r

46 fps in station
51 fps leaving station
57 fps heavy asteroid RES (stutters in metal rings)
79 fps looking at station from outside
90 fps looking at planet

Max settings 1080p with 7850 OC (1GHz) 1GB and i5 3570.

The game should run at 60fps+ nearly everywhere except in stations with cards like an R9 280X or GTX 770/780.

Thanks for this.

My point is that there is absolutely no reason for the game to consume twice the GPU in stations. There's nothing to draw there, especially in the under-deck bay. Conversely, there's a lot to draw at a RES, at least in terms of polycount.

The takeaway from this is that the FPS is low in the station when docked, but when you undock, it magically goes up to decent numbers. Maybe the station is rendered multiple times per frame?

I know that this is a minor use case, but when I dock, I want to chill out and browse stuff in peace. Not have the PC hum like an ore extractor ;-)
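Converting the fps figures quoted above into per-frame GPU time makes the comparison concrete (assuming the GPU is the bottleneck in both scenes, which these numbers suggest):

```python
# Convert the posted fps figures into per-frame GPU time.
# Assumes the GPU is the bottleneck in both scenes (a simplification).

def frame_time_ms(fps):
    return 1000.0 / fps

station = frame_time_ms(46)   # docked inside the station
planet = frame_time_ms(90)    # looking at a planet

print(round(station, 1))           # ~21.7 ms per frame
print(round(planet, 1))            # ~11.1 ms per frame
print(round(station / planet, 2))  # ~1.96x: the mostly-enclosed
                                   # hangar view costs roughly twice
                                   # the GPU time of an open scene
```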
 
From what people have posted so far, the GPU cooler spins faster when in a station. This appears consistent between two desktops I have with differing specs, the main difference being that the graphics card in one is an AMD R9 270 and in the other an NVidia GTX 980.
 
I don't mind my constant 250+ fps... smooth in any dock or dogfight.

System: Intel i7 4790k @ 4.4GHz, Kingston Genesis 16GB 2400MHz, Asus GTX970 4GB OC SFF Edition, OCZ 256GB SSD, twin Viewsonic monitors @ 1680x1050, Windows 7 Pro 64-bit, etc.
Settings: Borderless, Vert Off, Rate Limit Off, everything else on High
-
I think it's important to have a well-balanced system... no point in having a chunky new CPU while still using a GPU that's a few years old. The most critical component has to be the graphics card these days. I don't think the game's overall performance changes much on a slower CPU or with less RAM. Even with all the stuff going on at the moment on my machine, with ED, Explorer, uTorrent, Outlook and several other background tasks running, I'm only using 4.5GB at about 30% CPU utilization.
-
I work in/own/run a computer shop here in Australia, and whenever I build a gaming machine for a customer I always stress the importance of a decent graphics card. The 970s are under $500 here at the moment and I think represent pretty good bang for buck. Sure, the 980s are a bit more powerful, but at nearly double the price I don't think they are that great in value... you don't get double the performance.
 
Just for a comparison, I have a similar card (I think)

I have an Nvidia GTX 650 Ti and run the game at 1680x1050 (I know, not the highest, but that's the best on my current monitor) with everything on ultra.

I generally have an FPS of 80-85 in normal space flight, SC and around most planets.
This drops to about 60-ish in RES sites and 45-55 in and around stations.

CMDR Flaxton
 
From what people have posted so far, the GPU cooler spins faster when in a station. This appears consistent between two desktops I have with differing specs, the main difference being that the graphics card in one is an AMD R9 270 and in the other an NVidia GTX 980.

I also noticed some differences in stations. Temps don't move much thanks to full watercooling (and one GPU deactivated, since there's no Crossfire support), but it turns out the VRM temp increases slightly in stations on the working card, as does the GPU load %. So there's definitely more work being done in stations. It might have to do with the fact that there are many shadows + many light sources, whereas there is only one light source within a RES (or two, but not as many).
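A toy cost model illustrates the many-lights hypothesis: in a classic forward renderer, shading work scales roughly with fragments times lights, so a station full of lamps can out-cost a RES lit by a single star even with fewer pixels covered. All numbers below are invented for illustration, not measured from the game:

```python
# Toy forward-rendering cost model: shading work ~ fragments x lights.
# Invented numbers; this is a sketch of the hypothesis, not a profile.

def shading_cost(fragments, lights):
    return fragments * lights

res_scene = shading_cost(2_000_000, 1)      # open scene, one star
station_scene = shading_cost(1_000_000, 8)  # fewer pixels, many lamps

print(station_scene / res_scene)  # 4.0: half the fragments, but
                                  # four times the total shading work
```

Shadow maps would compound this further, since each shadow-casting light typically means re-rendering the scene's depth from that light's point of view.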
 
You see, that's the whole problem I have with today's computing, enthusiasts and the industry in general.
Back in the day, devs would go down to the lowest possible level and optimize to get stuff working. Nowadays, we just slap tons and tons of hardware on top of what seems to be poorly written code, or just a performance edge case.

Sure, the 660 is not a high-end card; it's even a mid-to-low-range card. But it runs games that look much better at higher frame rates.
I'm sitting in a box, looking at my cockpit and the wall ahead of me. 100% gpu usage. There are no effects visible, not a lot of geometry.
I'm in space, flying between a field of asteroids, buzzing around ships, doing combat. 80% GPU usage, 60fps.

In my mind, 'get a better card lawl' arguments make no sense when we're looking at a particular scenario that simply misbehaves.

I did a degree in realtime graphics at one point, so it's kind of my pet peeve ;-)

Except I didn't say get a better card. Turn down the settings or put up with a reasonable 30 fps in stations (the single most complex object in normal space).
 
I highly doubt that a midrange GPU released 3 years ago can run this game on Ultra. Maybe in deep space.
I upgraded my "recommended" GTX770 because it was unable to maintain a constant 60fps inside big stations and RES sites.

People have different standards: some don't care about a 5fps drop, some are OK playing at 30fps.

I'm one of these picky mof*s:

View attachment 31260


I just downloaded FRAPS because you really got me curious, actually. With an R9 290 I get 40fps inside a station and a steady 75 in normal space outside a station.
I can hardly tell it's running that much slower inside a station, so OK, I guess I'm not that picky then ;)
 
Except I didn't say get a better card. Turn down the settings or put up with a reasonable 30 fps in stations (the single most complex object in normal space).

I have 45fps in stations, and it still doesn't look good. Plus my PC is boiling.
Having 'good fps' was not my point at all. My point was that when the viewport is enclosed in a box (the hangar), the game seems to render much more than is visible. The station is nowhere near as complex as modern games' scenes.

Following this: http://en.wikipedia.org/wiki/Hidden_surface_determination reveals that something is not being done. Stuff that you do not see should not be rendered. Ergo, while in a hangar, you see only a few walls with minor lighting and effects. This should not induce 100% GPU usage (or 200% compared to RES usage), because there should be virtually nothing to render. *This* is my point.
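For illustration, one visibility technique the linked article covers can be sketched as a cell-and-portal check: geometry in rooms not reachable from the camera's cell never reaches the GPU at all. All names and structures below are made up for the sketch, not Elite's actual renderer:

```python
# Minimal cell-and-portal visibility sketch (illustrative only).
# Each "cell" is a sealed region (e.g. a hangar); portals connect cells.
cells = {
    "hangar": {"portals": ["bay_door"], "objects": ["cockpit", "walls"]},
    "concourse": {"portals": ["bay_door"], "objects": ["station_interior"]},
    "space": {"portals": [], "objects": ["asteroids", "ships", "planet"]},
}

def visible_cells(camera_cell, open_portals):
    """Flood-fill from the camera's cell through open portals only."""
    seen, stack = set(), [camera_cell]
    while stack:
        cell = stack.pop()
        if cell in seen:
            continue
        seen.add(cell)
        for portal in cells[cell]["portals"]:
            if portal in open_portals:
                for other, data in cells.items():
                    if other != cell and portal in data["portals"]:
                        stack.append(other)
    return seen

# Docked in the hangar with the bay door closed: only hangar geometry
# should ever be submitted to the GPU.
print(visible_cells("hangar", open_portals=set()))  # {'hangar'}
```

With a scheme like this, sitting in a closed hangar would cost the GPU only the walls and the cockpit, regardless of how complex the rest of the station is.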

Instead, this thread partly derailed into an e-peen measurement competition. Not sure what the point is. There are countries where buying a 970 takes 100% of an average monthly salary. Something to think about.
 