What fps is acceptable?

I see people on here complaining that they only get 60 fps....... if you turned your counter off would you even know? What can the human eye discern?

Now, I am only getting 20 fps and I can sure as hell discern that - it runs like a dog.
The human eye can discern 60 vs. 144.
144 in 4k should be attainable with the right machine. Currently it seems like nobody with current tech can achieve max graphics with buttery smooth framerates.
 
The human eye can discern 60 vs. 144.
144 in 4k should be attainable with the right machine. Currently it seems like nobody with current tech can achieve max graphics with buttery smooth framerates.
Yep, Frontier is one step ahead, thinking about the future and designing for tomorrow's technology
(and this quarter's financial report).
 
I see people on here complaining that they only get 60 fps....... if you turned your counter off would you even know? What can the human eye discern?

Now, I am only getting 20 fps and I can sure as hell discern that - it runs like a dog.

It's not about being able to perceive each individual frame, it's about how smooth it feels, and you can definitely feel the difference above 60 FPS versus below, especially if it drops into the 40s.
 
The source I found said the average is in the region of 24-30 fps. Fighter pilots have apparently been found to discern frames at up to 250 fps.

30 for games is fine. I grew up with 10fps, if I was lucky.

I am extremely sceptical of anyone who claims that anything less than 60fps ruins it for them.
There are a couple of issues with this.

First of all, in a passive experience, 24fps works - it's just good enough. In a game, however, you're exercising much more control and you're looking for extra information like direction and speed (as noted by Morbad in his post soon after yours), particularly when you're trying to hit a running target with the plasma marksman rifle, which has a very slow projectile speed. I can assure you that once you've experienced the difference between 60 and 120fps in a game, it's no longer an academic matter that one can be sceptical of; it's just better, and it is easily perceptible. For me, 60Hz is the minimum I enjoy playing at. I can immediately tell if, for example, Elite defaults to 48fps as its target framerate, as happened to me yesterday when I launched Odyssey on my laptop for the first time. In fast-paced twitch shooters, I will happily sacrifice eye candy for framerate.

Secondly, it's not just that "less than 60" is bad. If a game is performing below the target framerate, it often means that the time between frames is inconsistent as the GPU struggles to keep up. Vsync contributes to this too, as the GPU will discard a frame if it can't hit the target time. Frametime variance contributes heavily to feelings of "choppiness" in games, something Odyssey is really bad for right now. G-sync/Freesync will help, but not when the framerate is down in the 30s. VR systems use reprojection to infer the content of a skipped frame, to prevent variance in the framerate from inducing nausea. They usually target 90fps.
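
To put rough numbers on the frametime point, here's a minimal sketch (Python, purely illustrative; nothing here is measured from Odyssey):

```python
# Rough frame-pacing arithmetic (illustrative numbers, not Odyssey measurements).

def frame_time_ms(fps: float) -> float:
    """Time budget per frame, in milliseconds."""
    return 1000.0 / fps

for fps in (144, 90, 60, 48, 30):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")

# With double-buffered vsync on a 60 Hz display, a frame that misses the
# 16.7 ms budget isn't shown slightly late; the previous frame is held for
# another full refresh, so it sits on screen for 33.3 ms. Alternating
# between 16.7 ms and 33.3 ms presentation times is the "choppiness"
# described above, even though a counter may still read ~45 fps.
```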

I grew up with low framerates too. We also didn't have "first-person" views until, ironically enough, Elite was released in 1984.
 
There are always going to be trade-offs between frame rate and other things, and serious ones for the foreseeable future, especially in Odyssey.

Personally, I'm not a framerate junkie, not in these kinds of games. For the last several years, I've tended toward relatively slow VA panels, because bad contrast bothers me a lot more in space games than a bit of blur. The fastest display I currently own barely does justice to 120 fps/Hz (it's advertised as 165Hz, and I run it at 144Hz to take advantage of VRR, but the true pixel response means a lot of transitions can't keep up). Still, I target 60 fps minimum for smoothness of motion and acceptable latency, and have for quite some time. I used to tolerate 20-25fps in the Quake and MechWarrior 2 days, but that was mostly because I didn't have a choice. Now I do, and the improvements are obvious out to quite a high frame/refresh rate, and more subtle, but still apparent, well beyond that. It's just a matter of finding the right displays and deciding if any trade-offs are worth it or not...which are very subjective things.

Not gonna knock anyone for being content with 30 fps, as long as they aren't trying to tell me that's the limit of perception or something.

After I realized what was breaking performance and how to avoid it, I managed to get a stable 60 fps EVEN inside station ports.
Details in video description.

Good methodology, and there are certainly settings that can help, but it's up to Frontier to address the major underlying issues. These still seem to be present on your setup (the GPU utilization is a clue), even at your optimized settings.

Wow, I never would have imagined there was such a clear difference. I guess that's what motion blur was invented for.

Motion blur was originally an artifact of film exposure/shutter times; unavoidable and largely undesirable.

People have gotten used to blur...it's 'cinematic' and can be used to hide questionable details. However, the ideal would be zero blur, especially if something is to eventually look realistic enough to fool the senses in VR, or just as detailed as one would expect when actually able to track a moving object (with one's viewport or eyeballs). Try following your mouse cursor around the screen with your eyes...even if you have a 300 or 360Hz, 3ms GtG display (the fastest I've personally used), it's going to get fuzzy and have a blur trail as long as the cursor itself.

The problem with this is that it requires silly-high frame and refresh rates on displays that don't strobe (sample-and-hold). Displays that do strobe can achieve it at relatively low frame rates (about 120), but strobing has serious downsides of its own (major brightness reductions, and the strobing itself can be noticed at even higher frequencies than sample-and-hold blur).

Some potentially interesting reading and demonstrations:


For the UFO in the eye tracking test to have no blur, or for the image in the persistence test to have perceptibly full resolution, you'd need a ~1000Hz display. There are certainly diminishing returns before this, but that's the target to realistically get close to maxing out perception, to the point where it becomes indistinguishable from real motion.
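
As a back-of-the-envelope illustration of where that ~1000Hz figure comes from (my own sketch, using the usual persistence-blur approximation, not something taken from the linked demos):

```python
# Eye-tracked motion blur on a display is roughly:
#   blur width (px) ≈ motion speed (px/s) * time each frame stays lit (s)
# Illustrative numbers only.

def blur_width_px(speed_px_per_s: float, persistence_ms: float) -> float:
    return speed_px_per_s * persistence_ms / 1000.0

speed = 1000.0  # a moderately fast pan, in pixels per second

for hz in (60, 120, 240, 1000):
    persistence_ms = 1000.0 / hz  # sample-and-hold: frame is lit for the whole refresh
    print(f"{hz:>4} Hz sample-and-hold -> ~{blur_width_px(speed, persistence_ms):.1f} px of smear")

# Strobing shortens the lit time instead of raising the refresh rate:
# a 120 Hz display flashing each frame for ~1 ms gives roughly the same
# ~1 px smear as 1000 Hz sample-and-hold, at the cost of brightness and
# flicker that some people notice.
print(f" 120 Hz strobed (1 ms flash) -> ~{blur_width_px(speed, 1.0):.1f} px of smear")
```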

My theory is that if it occasionally dips to lower framerates, whatever is responsible could also result in your controller actions being out of sync with what you're seeing. It's possible that this leaves the impression of "sluggishness", but not necessarily the framerate itself. Or, to put it another way, in many such cases we probably wouldn't even notice it if we were watching the scenery passively, as in a movie, for example.

Average input-to-display delay increases by half of the current frame time interval, and this can become readily apparent in and of itself. Even more noticeable is the lack of precision and feedback when the effects of one's input are updated less frequently, or less consistently.
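
To put numbers on that half-a-frame figure (a quick sketch; it ignores engine, driver, and display latency, which come on top):

```python
# Input lands at a random point within a frame interval, so on average it
# waits about half an interval before the next frame can reflect it.

for fps in (144, 60, 30, 20):
    frame_ms = 1000.0 / fps
    print(f"{fps:>3} fps: frame interval {frame_ms:5.1f} ms, "
          f"average added delay ~{frame_ms / 2:4.1f} ms")
```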
 
I see people on here complaining that they only get 60 fps....... if you turned your counter off would you even know? What can the human eye discern?

Now, I am only getting 20 fps and I can sure as hell discern that - it runs like a dog.

For me, with my TV: anything below 50, yes, I can visually tell, and it feels sluggish. 60 is what I lock it to, since that's as good as the TV can do.
 
I game at 1080p and my monitor is 60Hz, so a solid 60 at 1080p with all settings maxed would be my acceptable bottom line. Other, more demanding games can manage it, so it should be easily attainable in Elite.
 
60 FPS for average hardware at 1080p
144+ FPS for enthusiast hardware at 1080p
60 FPS for enthusiast hardware at 4k

Anything less is unacceptable. 30 FPS is DOS-era performance; 30 FPS is lazy coding. Computers and GPUs are capable of way more nowadays.
 
That does sound odd! I have an i7 860 (yes, it's older than old), 16GB, GTX 970, and I get 30-60 in most situations on High settings, except when on foot at sites pretty much, which is about the same as in Horizons really (non-VR, of course).

Wanna trade your setup for my antique one? ;) I don't mean to make fun of you, but do you have to restart the game to see a change in FPS when changing graphics settings, as I do?
Thanks for the reply - I'll keep my rig, thanks :). Yes, I do find I have to restart, but until now this has been a question of restarting with no visible effect. Your reply has told me to keep trying, as there is clearly something 'broken' locally for me. This setup has had all the cruft since the first Beta on it, and I've been using EDProfiler to launch with and without VR, so no doubt the answer lies in a corrupted file somewhere.
 
Motion blur was originally an artifact of film exposure/shutter times; unavoidable and largely undesirable.

People have gotten used to blur...it's 'cinematic' and can be used to hide questionable details. However, the ideal would be zero blur, especially if something is to eventually look realistic enough to fool the senses in VR, or just as detailed as one would expect when actually able to track a moving object (with one's viewport or eyeballs).
It's particularly egregious in last-gen console versions of certain games which didn't offer a toggle for it, keeping it always on - something that is not as intrusive when you are running the same game at 60 on PC, where the added motion blur does not compound the motion blur of the lower framerate. But on those console ports it was a trip to eye strain city. Thankfully, on the new Xbox consoles at least, that problem can now be mitigated by FPS Boost removing the 30fps cap.
 
Did you clean up your custom Gfx settings as fdev suggested?

  • Resetting the game's graphics configuration files helps boost performance for some CMDRs. Backup then delete all files in this folder, which will reset all settings files to defaults: C:\Users\your_user_name\AppData\Local\Frontier Developments\Elite Dangerous\Options\Graphics (thanks /u/Pyran)
This seems to have worked for me, but via a very roundabout route. I deleted the folder, but even though it was re-created with only 3 files in it, the game then crashed after 5 seconds at the main screen every time. Surprisingly, even when I reverted to Horizons it did exactly the same. I got it to 'survive' by quitting the launcher and restarting from EDProfiler (with the no-VR flag). The reported FPS is now 30+ in the menus where it had been 4 - I'll be trying this out in the actual game tonight. Thanks for the tip.
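
For anyone who would rather script the backup-and-delete step from the quoted tip than do it by hand, here's a minimal sketch (assuming the default path given in the quote; adjust if your install differs):

```python
# Back up, then clear, the Elite Dangerous graphics options folder so the
# game regenerates default settings files on next launch.
# Path taken from the quoted tip; adjust if your install differs.
import os
import shutil
from datetime import datetime

graphics_dir = os.path.join(
    os.environ["LOCALAPPDATA"],
    "Frontier Developments", "Elite Dangerous", "Options", "Graphics",
)
backup_dir = graphics_dir + "_backup_" + datetime.now().strftime("%Y%m%d_%H%M%S")

shutil.copytree(graphics_dir, backup_dir)  # keep a timestamped copy first

# Remove only the files, so the game can rebuild them with defaults.
for name in os.listdir(graphics_dir):
    path = os.path.join(graphics_dir, name)
    if os.path.isfile(path):
        os.remove(path)

print(f"Backed up to {backup_dir}; launch the game to regenerate defaults.")
```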
 
It's a shooter. 60fps is the absolute minimum. Anything below that on reasonable hardware like, I dunno, the recommended specs, is completely unacceptable.
 
Funny thing is that even if you think 30 fps is acceptable, you just find yourself missing shots much more often. When I started playing Odyssey I was wondering why I was sucking so much at aiming, when I was quite good at it in Call of Duty years ago. Then I turned on the FPS counter and noticed I was playing at around 30 fps. I lowered the resolution and suddenly I could shoot much more precisely. So yeah, 30 vs. 60 fps is a huge difference.
 
Horizons: 60 FPS at maximum quality. Odyssey: 20 FPS at minimal quality. I expected something similar but less brutal; it's time to change my PC. I have the same concern with all the recent games; that's just how the world of computing is.
 
I wonder what really is wrong, to be honest. I haven't upgraded my machine in around 4-5 years, apart from some new memory that was hardly an upgrade. I get a solid 70 frames in the starports on foot, and everywhere else is well over 100, usually above the refresh rate, at 1440p with everything on full. I even tried the ULTRAFORCAPTURE terrain yesterday, and it did hit the fps a bit more, but even that was still playable for me, over 60fps and very rarely below.

I did have to create a new nVidia profile for Odyssey to set the power setting on the GPU, because it was throttling with the old Elite Dangerous profile.
 