What fps is acceptable?

Funny thing is that even if you think 30 fps is acceptable, you'll find yourself missing shots much more often. When I started playing Odyssey I was wondering why I was so bad at aiming, when I was quite good at it in Call of Duty years ago. Then I turned on the FPS counter and noticed I was playing at around 30 fps. I lowered the resolution and suddenly I could shoot much more precisely. So yeah, 30 vs 60 fps is a huuuge difference.
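For context, the raw frame-time gap between those rates is easy to work out (an illustrative calculation only; real input-to-photon latency also involves input polling, the render queue, and display lag):

```python
# Frame time at common frame rates: at 30 fps each frame represents
# twice as large a slice of time as at 60 fps, so aim corrections
# land noticeably later.
for fps in (30, 60, 144):
    frame_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> {frame_ms:.1f} ms per frame")
```

At 30 fps you wait up to 33.3 ms for the next frame versus 16.7 ms at 60 fps, which is consistent with aiming feeling much less precise.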

Lowering the resolution gave you better performance? I've been limited with the amount of time that I've been able to try things out for myself (bloody kids) but from what I've read on here I got the impression that lowering the settings usually does nothing for most people?
 
The rendering resolution has a significant impact on performance, and the impact is proportional: the lower the resolution, the fewer pixels there are to rasterize and shade, and the more images you can put out in the same amount of time with the same number of shader units.

You need 4 times the work to put out a 4K image vs a 1080p image. Lowering from 1080p to 720p reduces the rasterization and shading workload by a factor of 2.25 (from 2,073,600 pixels down to 921,600).
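The figures above can be verified with a quick pixel-count calculation:

```python
# Pixel counts at common resolutions and the relative rasterization/shading
# workload, matching the ratios quoted above.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["1080p"])                   # 2073600
print(pixels["720p"])                    # 921600
print(pixels["4K"] / pixels["1080p"])    # 4.0  -> 4K is 4x the work of 1080p
print(pixels["1080p"] / pixels["720p"])  # 2.25 -> 1080p is 2.25x the work of 720p
```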
 
60 is what I always aim for! This game is very, very badly optimised. In space it's nice and smooth and I can hit and stay at 60 fps happily; on the ground in Odyssey I can get dips as low as 30. It's also pushing the graphics card and the CPU to 100 percent usage, which quite frankly is unacceptable, as I am above the recommended requirements on PC. 60 fps has been practically the norm for years, especially at 1080p.
 
Very valid points and good info for folks who are less tech savvy. I would also check for a CPU bottleneck, as that takes away lots of FPS even with the biggest GPUs.
 
Yeah, if Odyssey isn't following this basic formula which is consistent across literally every other videogame, then there's something else seriously wrong somewhere that is hogging resources and causing most of the FPS drop and rendering the resolution-based differences barely noticeable in comparison.
 
Yep. That is why many 2010-2014 Direct3D 9 based games suffered performance issues: almost all CPUs except Intel's were multithread-focused in their architecture, but Direct3D 9 didn't support multithreaded rendering, which meant only one core could calculate and feed the vertices to the GPU.
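A toy model of that constraint (hypothetical names, not actual D3D9 calls): worker threads can build geometry in parallel, but every batch must be submitted by the single thread that owns the device, so submission serializes on one core no matter how many cores you have.

```python
# Sketch of single-threaded draw submission: four workers prepare draw
# batches concurrently, but one thread alone drains the queue and "feeds"
# the GPU, becoming the bottleneck.
import queue
import threading

draw_queue = queue.Queue()

def build_batches(worker_id, n):
    # CPU-side work (animation, culling, vertex setup) parallelizes fine...
    for i in range(n):
        draw_queue.put((worker_id, i))

workers = [threading.Thread(target=build_batches, args=(w, 100)) for w in range(4)]
for t in workers:
    t.start()
for t in workers:
    t.join()

submitted = 0
while not draw_queue.empty():
    draw_queue.get()  # ...but only this one thread may touch the device
    submitted += 1
print(submitted)  # 400
```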
 
One example of such a resource hog: calculating and drawing geometry that is obscured by geometry in front of it.
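A minimal sketch of the idea of occlusion culling, with made-up objects and a simplified screen-rectangle test: skip drawing anything whose screen-space rectangle is fully covered by a closer occluder. An engine that skips this step pays rasterization and shading cost for pixels that never reach the final image.

```python
# Hypothetical occlusion culling sketch: rects are (x0, y0, x1, y1) in
# screen space, smaller depth = closer to the camera.

def covers(a, b):
    """True if rect a fully contains rect b."""
    return a[0] <= b[0] and a[1] <= b[1] and a[2] >= b[2] and a[3] >= b[3]

def visible(obj, occluders):
    # An object is culled only if some *closer* occluder covers it entirely.
    return not any(o["depth"] < obj["depth"] and covers(o["rect"], obj["rect"])
                   for o in occluders)

wall  = {"rect": (0, 0, 100, 100), "depth": 5}
crate = {"rect": (20, 20, 40, 40), "depth": 9}   # behind the wall -> culled
lamp  = {"rect": (10, 10, 30, 30), "depth": 2}   # in front of the wall -> drawn

draw_list = [o for o in (crate, lamp) if visible(o, [wall])]
print(len(draw_list))  # 1 -> only the lamp gets drawn
```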
 

There's something fishy going on, by the way, and not only in Odyssey but in Horizons as well.
I checked two days ago, and in planetary rings my FPS went all the way down to 85-90 (from the usual 130+), while neither the GPU nor the CPU was under heavy load (GPU: 50-60%, CPU: 38%).
 
The original Crysis is a good example of a game that didn't support multithreading prior to the updated engine used in the sequel; that engine was the key to how they eventually ported it to consoles.
 
60FPS at 1080p. Because the monitors I have won't go higher. I get that with 10 Gen i7 and a RTX 3060Ti. I have not been around surface bases though.
 
With my Series X, the difference that supersampling from a 4K render makes on a 1080p monitor is very much noticeable.

I'd still like the option to toggle to 1080/60 with raytracing instead of only having a choice between 4k/30 with it and 4k/60 without it in several of those games.
 

For a while I had my current GTX 1080 Ti but hadn't yet got a 4K monitor and was still using my old 1080p one. The GPU had plenty of spare cycles, so I ran the game at 4x DSR (effectively 4K resolution downsampled to 1080p) as a kind of 'expensive' anti-aliasing. It gave a noticeable improvement in image quality.
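A sketch of what 4x DSR-style supersampling effectively does: render at double the width and height, then average each 2x2 block down to one output pixel (a box filter; NVIDIA's actual filter is smarter, this is just the core idea, shown on a toy 4x4 image).

```python
# Downsample a "rendered" high-res image 2x in each axis by averaging
# 2x2 blocks -- the essence of supersampled anti-aliasing.
import numpy as np

hi = np.arange(16, dtype=float).reshape(4, 4)   # stand-in for the 4K render
h, w = hi.shape
# Group pixels into 2x2 tiles, then average within each tile.
lo = hi.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
print(lo)
# [[ 2.5  4.5]
#  [10.5 12.5]]
```

Each output pixel blends four rendered samples, which is why edges look smoother than with a native 1080p render.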
 
I see people on here complaining that they only get 60 fps... if you turned your counter off, would you even know? What can the human eye discern?

Now, I am only getting 20 fps and I can sure as hell discern that - it runs like a dog.

In Horizons in open space I see 144 fps, but only because I set the frame limiter to that. In stations I drop to 90, and you can be sure as hell the difference can be noticed.

Having said that, the day we have, with my config, a situation in Odyssey where the counter never goes under 60, I'll be a happy man.
 
[Screenshot attached]

(ps. 60 fps limited)
 
IMO 30 is good for 2D. There’s a reason why movies have a frame rate of 24FPS... most people can’t tell the difference visually between that and higher frame rates.

Wrong.

Film has natural inherent motion blur which smooths things out, and when played back each image is actually flashed two or three times, which also helps; videogames have neither. Movies have no user input; videogames do.
 

The human eye doesn't work in fps, and even if it did, a study once put the figure somewhere at 300-1000. A trained eye sees the difference immediately.
 
Yes, I'd notice. I can tell the difference, and I pay through the nose for the hardware to do it. That said, it isn't so much that the performance is awful, just that it's awful in comparison to the previous build.
 
Thermals?

I can have issues with thermal throttling on my editing laptop, dropping the FPS drastically in the process. Being a laptop, I'm limited when it comes to cooling, and it gets very toasty when playing Odyssey, more so than when rendering in 4K.
Never above 85°C. It otherwise boosts to 1900 MHz but sometimes crashes.
 