Is anyone who runs Odyssey in a position to try this and see if there is a performance change?

I die 0 times in a match while running ideal paths and shooting down bots as efficiently as possible, regardless of whether the game runs at 10 fps or 10,000 fps.
It simply doesn't make a difference for success in EDO; it is that slow-paced. The only difference is when it feels laggy / choppy / non-fluid, which for me starts below 20 fps. Therefore, everything above 20 fps is fine for me.
And back when I was a broke student I finished Far Cry 2 at 20-25 fps on very low settings. Either that's proof that Far Cry games are best enjoyed at minimal settings and frame rates, or proof of nothing at all besides the fact that I was broke.
While you may think 20 fps is good, it isn't for the majority of the consumer base, which is why console games are capped at 30 fps minimum and PC games usually target at least 60. It's also why Odyssey had such a poor reception and DB himself had to apologise for the poor performance.

Now that it's settled, perhaps we can move on?
 
I found no performance increase after a clean Win10 install a couple of months ago.
I had nothing but EDO installed and the latest graphics drivers. Was I surprised? No.
 
There is no contradiction there, since the brain's perception of two different things can of course differ; you can very easily google that. Many people have asked why 24 fps is the standard for movies and TV.
The point is that a lot of people here are confusing different things. They think the fps meter at the bottom left and the frame rate on the monitor are the same thing.
I even think that if someone played with the counter at 250 (for example), then recorded it on video (at maximum bitrate) and played it back at 60 fps, they would argue that it isn't ;)
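To put a toy number on that distinction (this is just a sketch I put together; the 250 and 60 are the example figures above, and it assumes an idealised fixed 60 Hz display that always shows the most recently finished frame):

```python
# Minimal sketch: frames rendered per second vs frames actually shown on a fixed-refresh display.
# Assumes an idealised 60 Hz monitor that always scans out the most recently completed frame.
RENDER_FPS = 250      # what the in-game counter reports
REFRESH_HZ = 60       # what the monitor can physically display
DURATION_S = 1.0

rendered = int(RENDER_FPS * DURATION_S)
shown = set()
for i in range(int(REFRESH_HZ * DURATION_S)):
    t = i / REFRESH_HZ                            # time of this scanout
    last_frame = min(int(t * RENDER_FPS), rendered - 1)
    shown.add(last_frame)                         # frame index on screen at this scanout

print(f"frames rendered : {rendered}")            # 250
print(f"frames displayed: {len(shown)}")          # at most 60
```

The in-game counter honestly reports 250 rendered frames, but the monitor (or a 60 fps capture of it) can only ever show about 60 of them per second.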
 
They think the fps meter at the bottom left and the frame rate on the monitor are the same thing.

And who thinks that, excepting cases where they actually are the same thing (e.g. adaptive sync on a VRR display with transition times fast enough to keep up)?

Not that anyone should believe there is zero benefit to be had from frame rates in excess of the refresh rate (a very common fallacy), or vice versa.
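For what it's worth, here is the rough argument for that benefit in numbers. This is a very simplified model I'm assuming, not anything measured in EDO: input sampled at the start of each rendered frame, an unsynced 60 Hz display showing the most recently completed frame, and no display processing delay.

```python
# Very simplified latency model: input sampled at the start of each rendered frame,
# unsynced display showing the most recently completed frame, no display processing delay.
def avg_input_to_scanout_ms(render_fps: float) -> float:
    frame_ms = 1000.0 / render_fps
    # ~1 render frame to finish after the input sample, plus the fact that at the
    # moment of scanout the newest completed frame is on average half a render frame old.
    return 1.5 * frame_ms

for fps in (60, 144, 250):
    print(f"{fps:>3} fps -> ~{avg_input_to_scanout_ms(fps):.1f} ms from input to scanout")
```

On a 60 Hz panel you still only see ~60 frames per second, but the frames you do see are fresher.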
 
May I ask your CPU and GPU?
You see, the point is that GPU load is not a reflection of the entire graphics card being used.
There is a bus, there is video memory, etc.
i9-7940X (14 cores), 2080 Ti, 64 GB RAM quad channel.
I guess something happens at settlements that is choking some sort of bandwidth.
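If it helps, here's the kind of quick check I'd run for that. It assumes an NVIDIA card with nvidia-smi on the PATH; verify the field names against `nvidia-smi --help-query-gpu` on your driver version. The point is that the headline GPU utilisation figure is reported separately from memory-controller load and VRAM in use, so "GPU at 97%" doesn't by itself rule out a memory or bus limit.

```python
# Sketch: poll a few NVIDIA counters, since the core "GPU load" figure alone isn't the whole card.
# Assumes an NVIDIA GPU with nvidia-smi on the PATH; check available fields with
# `nvidia-smi --help-query-gpu`.
import subprocess
import time

FIELDS = "utilization.gpu,utilization.memory,memory.used,memory.total"

while True:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(out)            # e.g. "97 %, 45 %, 7012 MiB, 11264 MiB"
    time.sleep(1.0)
```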
 
Thank you.
I also want to clarify: is it always a single-player game?
I'm playing in a PG, nobody else there.
If you see others in the group it makes no difference.

Weird thing is my GPU is the bottleneck; this CPU would probably make even a 3090 the bottleneck as well.

Does Elite use multiple CPU cores, or does it target just 1 or 2? I will have a look in HWiNFO.
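HWiNFO will show per-core load directly, but if you'd rather log it, here's a quick sketch. It assumes Python with psutil installed, nothing Elite-specific:

```python
# Quick per-core CPU usage logger, as a rough alternative to watching HWiNFO.
# Assumes Python with psutil installed (`pip install psutil`); run it while the
# game is sitting in a heavy scene (e.g. at a settlement).
import time
import psutil

psutil.cpu_percent(percpu=True)   # prime the counters; the first reading is meaningless
while True:
    time.sleep(1.0)
    per_core = psutil.cpu_percent(percpu=True)
    print(" ".join(f"{p:5.1f}" for p in per_core))
```

If one or two cores sit pinned while the rest idle, that points at a main-thread limit rather than a core-count limit.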
 
This is one of the only places I know of that still argues that 24-30fps is "standard" and "good" for gaming. And one of the only places that continues to argue that "the human eye can't perceive more than 30-60fps."

Stay classy.
 
This is one of the only places I know of that still argues that 24-30fps is "standard" and "good" for gaming. And one of the only places that continues to argue that "the human eye can't perceive more than 30-60fps."

Stay classy.
You have the wrong interpretation. I think a lot of people are just writing that the main thing is that the minimum frame rate, under all conditions, should be 25-30. Not the average, specifically the minimum.
Whether the average ends up somewhere around 250 is not so important.
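To make the average-versus-minimum point concrete (the numbers below are made up for illustration, not measurements from anyone here):

```python
# Illustrative only: average fps vs the slowest 1% of frames from a frame-time log.
# The values below are made up; real frame times would come from a capture tool.
frame_times_ms = [4.0] * 990 + [40.0] * 10    # mostly 250 fps, with a few 25 fps stutters

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
worst_1pct = sorted(frame_times_ms)[-(len(frame_times_ms) // 100):]
low_1pct_fps = 1000.0 / (sum(worst_1pct) / len(worst_1pct))

print(f"average fps: {avg_fps:.0f}")      # ~229
print(f"1% low fps : {low_1pct_fps:.0f}")  # 25
```

An average of ~229 fps can coexist with 1% lows of 25 fps, which is exactly the situation people complain about.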
 