Are FDev still optimizing Odyssey? Or is this it?

Just to clarify from my side:

I wouldn't mind if I ordered a pork chop and was given a chicken breast - as long as the chicken breast is tasty and good.
I do mind when I am given spoiled meat that gives me stomach problems :)
(I am referring to bugs and technical issues.)

At this point I wouldn’t mind any kind of content as long as it’s not just grind(R) in disguise.
That's just irrelevant in a different way.
 
Hmmm, this sounds intriguing.
I do have VSYNC ON in ED. I will give it a go tomorrow and disable it to see how the game behaves then.

On a side note: 60 fps is not really an option for me anymore. I got used to 120 fps and more in other games and sadly 60 is too “rough” for me now.
I really wish Odyssey was optimized better so I could enjoy it without drops.
You should find your minimum FPS will increase... it's not all gravy, however, as you will likely get some tearing where the screen update is out of sync with the screen drawing

which is why adaptive/fast vsync (or G-Sync, or the AMD equivalent screen tech) is a nice compromise
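
To make the trade-off concrete, here is a rough sketch (my own illustration, not something from this thread) of why double-buffered vsync on a 60 Hz display quantizes the frame rate - a frame that misses the ~16.7 ms deadline waits for the next refresh, so turning vsync off raises your minimum FPS at the price of tearing:

```python
# Hypothetical illustration: displayed frame rate with double-buffered
# vsync on a 60 Hz display vs. vsync off (frame shown immediately).
import math

REFRESH_HZ = 60.0
INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh

def displayed_fps(render_ms: float, vsync: bool) -> float:
    if vsync:
        # A late frame waits for the next whole refresh interval.
        intervals = math.ceil(render_ms / INTERVAL_MS)
        return 1000.0 / (intervals * INTERVAL_MS)
    return 1000.0 / render_ms  # shown mid-refresh -> possible tear

for render_ms in (10.0, 16.0, 17.0, 25.0, 34.0):
    print(f"{render_ms:4.1f} ms render: "
          f"vsync ON {displayed_fps(render_ms, True):5.1f} fps, "
          f"vsync OFF {displayed_fps(render_ms, False):5.1f} fps")
```

Note the 17 ms case: barely missing the deadline halves the displayed rate to 30 fps with vsync on, while vsync off still shows ~59 fps.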
 
Hmmm, this sounds intriguing.
I do have VSYNC ON in ED. I will give it a go tomorrow and disable it to see how the game behaves then.

On a side note: 60 fps is not really an option for me anymore. I got used to 120 fps and more in other games and sadly 60 is too “rough” for me now.
I really wish Odyssey was optimized better so I could enjoy it without drops.
I use "fast sync" set in Nvidia Control Panel instead of using the in-game V sync setting in taxing games like DCS World.
 
This opinion piece I read the other day springs to mind ;)
I agree with the following from that article, though my agreement is subjective:

"That in essence is the frame rate debate, or rather it should be. 30fps is acceptable but 60fps is desirable, especially in fast acting games like racers, shooters, and fighting games."

It should also be pointed out that not all 30 fps is created equal. For example, 30 fps in RDR2 looks amazing thanks to very good motion smoothing, and MSFS feels surprisingly smooth at 30 fps as well, again thanks to motion smoothing technology. Elite, on the other hand, feels so much smoother to me at 60 fps on PC than it does at 30 fps (no smoothing at all) on PS4. The difference is palpable. Though I confess, some of my best memories of Elite are from the early days playing on PS4, so 30 fps was "playable" then, at least before I was spoiled by 60 fps.

This argument becomes moot when talking about VR, where framerate is absolutely critical to the experience.
 

What rate of speed can the human eye see?

Our eyes aren’t moving at a specific speed, but the way visual stimuli are measured is in frames per second (fps). The visual cues in the world around us move at a particular rate, and our eyes can take in this information at a specific pace of perception. Most experts have a tough time agreeing on an exact number, but the conclusion is that most humans can see at a rate of 30 to 60 frames per second.

There are two schools of thought on visual perception. One is absolute that the human eye cannot process visual data any faster than 60 frames per second. The other school of thought is that it may be possible for some individuals to have some additional perception beyond the rate of 60 frames per second.
 
I'm not going to debate the 30 vs 60 (or any other number) fps here.
All I will say is that there is a noticeable difference between 60 and 144 fps, especially once you've played at 144 fps for a while. And if I had the option, I would always choose 144 fps.
And now even the mainstream consoles often offer 120 fps modes for players who have a compatible TV.

I don't want to sound like an arrogant PCMR dummy; I just would like to play Odyssey at more than 60 fps, because that is not good enough for me personally.
I know I never should have attempted to play at 144 fps, because every game that runs at 30/60 is now ruined for me. ;) (jk)
 
You should find your minimum FPS will increase... it's not all gravy, however, as you will likely get some tearing where the screen update is out of sync with the screen drawing

which is why adaptive vsync (or G-Sync, or the AMD equivalent) is a nice compromise

The best compromise is to have a high enough refresh rate that tearing is imperceptible.

I used fast/adaptive sync when I was on a 60 Hz display, before VRR was a thing. I used VRR on my capable displays up to around 144-165 Hz. I don't use any kind of syncing at all on my 240 Hz display.
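
As back-of-envelope arithmetic (my own, not from the thread): a tear line persists for at most one refresh interval, so the higher the refresh rate, the more briefly any tear stays on screen:

```python
# A tear artifact lasts at most one refresh interval.
for hz in (60, 144, 165, 240):
    print(f"{hz:>3} Hz -> refresh interval {1000.0 / hz:5.2f} ms")
```

At 240 Hz a tear can survive only ~4 ms, versus ~17 ms at 60 Hz, which is why it becomes hard to notice.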

Most experts have a tough time agreeing on an exact number, but the conclusion is that most humans can see at a rate of 30 to 60 frames per second.

There are two schools of thought on visual perception. One is absolute that the human eye cannot process visual data any faster than 60 frames per second. The other school of thought is that it may be possible for some individuals to have some additional perception beyond the rate of 60 frames per second.

This is mostly nonsense and has been discussed ad nauseam elsewhere.

Some of my prior posts on the topic, with some sources:

Anyone who ever stated the 'human eye' cannot process visual data faster than 60 fps was either a complete moron, or was talking about some uselessly narrow aspect of vision that has no meaningful application to most real scenarios, including playing video games. As I've pointed out before, depending on the specific parameter tested, you'll need four figures of updates per second to fool most people...not exceptional people, not some people, not trained people...most humans with eyes and undamaged visual cortices. Even if the constraint is something like 'meaningful information', 60 fps isn't remotely the limit.

Some people will be happy with 30 fps, but they are actually seeing less than someone playing at 60 fps. And someone playing at 60 fps is objectively seeing less than I am if I am playing at the frame rates I prefer. There are some simple demonstrations of this in the aforementioned links.
 
Tbh for me it's not about what I can see (though I do subjectively see 30 fps as jerky and 60 as smooth) but about what I can feel.

60 on a monitor / screen for me is ok - other opinions exist ;) - but any less and I can start to get motion sickness and find controls "feel" sluggish.

VR is another level again. I am ok with my Quest, Rift and Reverb G2 at 90 fps, but without ASW or an equivalent, if FPS drops below that it makes me feel proper rough.

I would like a screen that did huge refresh rates... but my 65 inch 4K gaming TV is fine and I can't justify replacing it.
 
The game is working, planets look fine, etc., in those trailers, so I'm sure they can always "revert" it back to that state.
Yep, and they didn't because...?
Imo, even if the Cobra engine can actually handle the quality seen in the trailer, it's not in real time. The scenery was captured at a "bullet time" rate and then accelerated to a real-time rate for the trailers.

Can you imagine EDO, or ED at all, in a "working properly" state? Never been done before! Mind blowing!
Like you said, mind blowing.

your cardboard box turns into your detailed ASP, you know?
[...]

where even if you're a few hundred meters from the slot of a space station, it "pops" in tonnes of detail and lights.
It sounds like it's not just a LOD issue; I'm guessing you're also saturating your VRAM and your GPU is having trouble loading the assets in time. Try decreasing texture quality and terrain work.
 
It sounds like it's not just a LOD issue; I'm guessing you're also saturating your VRAM and your GPU is having trouble loading the assets in time. Try decreasing texture quality and terrain work.
Whilst this is possibly an issue, I see the pop-in as well... it does not bother me that much (I have other things to whine about, so it's not a hill I'm gonna die on)... but I do see it, and I have 24 GB of GPU memory, so one would think GPU RAM is not my issue.
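
If anyone wants to sanity-check whether VRAM is actually the bottleneck, here's a sketch of my own (assuming an NVIDIA card, since nvidia-smi ships with the driver) that logs memory usage while you play, so you can see whether pop-in coincides with saturated video memory:

```python
# Log VRAM usage every few seconds while the game runs.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"]

while True:
    # First line covers GPU 0; one line per GPU on multi-GPU systems.
    out = subprocess.check_output(QUERY, text=True).splitlines()[0]
    used, total = out.split(", ")
    print(f"VRAM: {used} / {total} MiB")
    time.sleep(5)  # sample every 5 seconds while playing
```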

As for the original trailer... IF that is the case, it was hugely misleading; didn't FD state it was in-game screenshots? WIP, subject to change, etc. etc., and I am not suggesting any legal issues or anything like that... but FD strongly intimated those trailers were a true reflection of the game at that time.

If they were piecing together some sort of photo mode to make a video, that makes FD disingenuous imo... perhaps not as bad as Aliens: Colonial Marines or the PS3 bullshots, but still dishonest imo.
 
Whilst this is possibly an issue, I see the pop-in as well... it does not bother me that much (I have other things to whine about, so it's not a hill I'm gonna die on)... but I do see it, and I have 24 GB of GPU memory, so one would think GPU RAM is not my issue.
Even though the LOD is not really good in EDO, I don't think you see your ship as a cardboard box, or the mail slot pop into full detail only under 1 km :)

As for the original trailer... IF that is the case, it was hugely misleading; didn't FD state it was in-game screenshots? WIP, subject to change, etc. etc., and I am not suggesting any legal issues or anything like that... but FD strongly intimated those trailers were a true reflection of the game at that time.
Well... technically, they're in-game screenshots, and they give an idea of what the game will look like. It's not as if they were about the gameplay experience. That said, the gameplay video FDev produced already did much less honour to the trailers, and players at the time made this clear (you will notice that there is a conveniently placed fade to black between the approach to the planet and the landing of the ship).

If they were piecing together some sort of photo mode to make a video, that makes FD disingenuous imo... perhaps not as bad as Aliens: Colonial Marines or the PS3 bullshots, but still dishonest imo.
Well... it's common to crank up graphics engine parameters to produce an extra-shiny capture, expecting to optimise the game enough to release something close to it. It's just that FDev dramatically overestimated its ability to optimise the game.
 
As for the original trailer... IF that is the case, it was hugely misleading; didn't FD state it was in-game screenshots?
It was 'In Engine" rather than 'Actual Gameplay'... Pretty much the same that had always been screened by other games studios, generated by the engine to look good, even if it was only a frame or 2 each second. Don't you remember the amazing TV adverts of upcoming games, that always looked fantastic, with the legend "in-engine footage" in tiny letters at the bottom of the screen (nowadays they have a slightly more honest "Not in-game footage" tacked on.)
 
(Nowadays they have a slightly more honest "Not in-game footage" tacked on.)
I absolutely remember those ads... they were dodgy bullshot ads and I had no respect for the companies that made them... however, the bit of the post I quoted is the important part for me... now most companies who still insist on this (and I still don't like it) put "not in-game footage" on them.

Even FD did this when they launched their outrageously fake Elite launch promo video... and yet despite it saying "not in-game footage" a lot of people still complained - and it generated meme videos of "Elite trailer vs Elite gameplay".
At the time, I am fairly confident (but I don't have a link, I'm afraid, so discount this if you want) that FD said from then on their trailers would use genuine, unmodified in-engine video... and AFAIK they have generally done just that. It is why I believed their launch EDO trailer... and tbh it is still possible it WAS genuine

(but maybe running on SLI 3090s or summat)

I would be interested to know, actually.
 
There are two schools of thought on visual perception. One is absolute that the human eye cannot process visual data any faster than 60 frames per second. The other school of thought is that it may be possible for some individuals to have some additional perception beyond the rate of 60 frames per second.
I don't know if this is true or not (there seems to be some debate), but even if it is, it's not an argument for limiting framerate / refresh rate of a digital computer monitor. Our eyes are analog, so even if they are limited to 60 "frames" per second, it's not at all equivalent to the shutter speed of an LCD display. For example, look at a bright scene and then close your eyes, and you'll see a fading imprint of that scene in the blackness as your rods and cones "desaturate" over time. This gives our eyes a built-in motion blur, smoothing out how we perceive high speed motion. There's also an overlap of the "pixels" in our eye, unlike the sharp grid of a computer monitor.

If a real-life object crosses our vision faster than our eye can process, that object is still smoothly transitioning across our view, both literally and perceptually. Contrast this with a digital LCD screen. An object crossing the screen faster than the screen can refresh will literally "jump" (the Picard Maneuver) from one location to another, skipping multiple pixels. You can see this clearly and painfully when a TV show pans across a scene at 30 fps - the stuttering is palpable. Even at 60 fps it can be noticeable. This is why many modern TVs run at 120 or 240 Hz and use interpolation to draw the scenes between those frames, to eliminate that pixel-jumping. It's also why older CRT displays didn't suffer from this stuttering as much at lower framerates: the phosphor itself was analog in nature, slowly fading like our rods and cones. The lower resolution also helps (the more pixels you have, the more pixels are skipped from one frame to the next at a lower fps). It's also why analog film could get away with 24 fps, because of the built-in motion blur of the film from one frame to the next.
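
To put rough numbers on that pixel-jumping (my own arithmetic, assuming an object panning across the full screen width in one second):

```python
# Pixels skipped per frame for a one-second full-width pan:
# higher resolution and lower frame rate both enlarge the jump.
for width in (1280, 1920, 3840):        # horizontal resolution in pixels
    for fps in (24, 30, 60, 120):       # displayed frame rate
        print(f"{width}px @ {fps:>3} fps -> {width / fps:6.1f} px per frame")
```

At 3840 px wide and 30 fps, the object leaps 128 pixels per frame - exactly the stutter described above - while at 120 fps the jump shrinks to 32 pixels.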
 
If my memory serves me (but I may be in error), the last couple of EDO pre-launch trailers were 'in-game footage' rather than in-engine.
Screenies were produced using the now-removed 'Ultra for capture' setting, which I do recall making great scenery, at a cost to FPS, of course.
 
To be honest, even if Odyssey ran at 120 fps, I wouldn't enjoy it anymore either.
Most of my friends have moved on, and if nothing new comes along, nothing will motivate me in the long term.
 
I don't know if this is true or not (there seems to be some debate), but even if it is, it's not an argument for limiting framerate / refresh rate of a digital computer monitor. Our eyes are analog, so even if they are limited to 60 "frames" per second, it's not at all equivalent to the shutter speed of an LCD display. For example, look at a bright scene and then close your eyes, and you'll see a fading imprint of that scene in the blackness as your rods and cones "desaturate" over time. This gives our eyes a built-in motion blur, smoothing out how we perceive high speed motion. There's also an overlap of the "pixels" in our eye, unlike the sharp grid of a computer monitor.

If a real-life object crosses our vision faster than our eye can process, that object is still smoothly transitioning across our view, both literally and perceptually. Contrast this with a digital LCD screen. An object crossing the screen faster than the screen can refresh will literally "jump" (the Picard Maneuver) from one location to another, skipping multiple pixels. You can see this clearly and painfully when a TV show pans across a scene at 30 fps - the stuttering is palpable. Even at 60 fps it can be noticeable. This is why many modern TVs run at 120 or 240 Hz and use interpolation to draw the scenes between those frames, to eliminate that pixel-jumping. It's also why older CRT displays didn't suffer from this stuttering as much at lower framerates: the phosphor itself was analog in nature, slowly fading like our rods and cones. The lower resolution also helps (the more pixels you have, the more pixels are skipped from one frame to the next at a lower fps). It's also why analog film could get away with 24 fps, because of the built-in motion blur of the film from one frame to the next.
Remember those old millisecond "flash" pictures?
 