"no obvious reason" worksReason being not quite optimized ofc which is obvious, idk anything other than that.
However, there may be something less obvious, like ships coming in, NPC's rerouting, re-deciding, etc.
"no obvious reason" worksReason being not quite optimized ofc which is obvious, idk anything other than that.
Possibly, but I've been to a few different settlement layouts with the same result and almost identical fps, so yeah, there's no reason this should happen at almost all the settlements and stations I visited when it was supposed to be a performance fix.
I'd realistically temper those expectations and target something like Update 13 or 14 for that kind of graphics setting. Given that U8 was initially supposed to be the "big optimization update" and barely changed anything for a good half of the vocal playerbase, I really doubt U9 is going to change much either.
Well, I understand where you're coming from there. I'll still give it a try once U9 comes in and see what it does. Being out in the black I can only see what happens when approaching planets and when on the ground. I expect I'll finally be at Colonia by the time U9 drops so I'll see what happens there. I can always put it back to medium if needed. Call me a "cautious optimist."
I'm on my way from Beagle Point to Colonia so I can't comment on "in station" situations but I can confirm that there has been an improvement in FPS on approach while in "exploration" mode. It used to be full of stuttering and FPS would drop on the surface to between 25-50. I'm currently experiencing no more stuttering on approach or in glide mode and FPS is around 50-90 on the surface now. This is with the default setting on "medium." I suspect that in a few weeks I'll see what it's like at Jaques Station. I'd really like to put it back to Ultra but will wait for Update 9 before attempting that.
Just visited a ground CZ (can't remember the name, but was in the CG system, planet A1 - site was in daylight still) in my ship and started to take some pot shots at some NPC ships after choosing a side. After a couple of minutes or so another CMDR joined in (I think they were on-foot but not 100% sure) and as soon as I'd face the settlement, incredible lag would set in with framerates in the single digits. As soon as I'd face in any other direction it'd be ok again.
Is this normal? (I know it's not but have others experienced the same in this "theatre of war" scenario or is it 'just' one of those poor performing settlements?)
I went back there again last night and, being on my own, didn't get the same drops again (still very uneven and janky, but playable enough). It seems to me there's already more going on at these sites than the engine can handle, and adding another player gives it the kiss of death, at least from a ship perspective. Did an on-foot CZ and it was the usual (like above, janky but playable), though that was also without other players joining. I was at a Medium site because the game's world-beating UI doesn't tell you in the system map which is which (I'm aware FS tells you, but I visited in my own ship). Will try High tonight as I imagine that's where most players congregate.
I haven't had any big drop-outs like that since the last fix. Played in the last 2 combat CGs with, barely, acceptable frame rates. I get a stutter if another CMDR joins in, but I get the same thing in a ship. It's like a CMDR alarm.
Just wanted to say that for an i7-10875 and a 3080, those are some pretty... low framerates at 1080p (especially given that the 3080 was designed as a 4K card). Both CPU and GPU here are so far above the listed recommended spec that it may as well be in a higher dimension, and yet 45-60 down to 30... that's not great.
After the last update, no more significant issues at settlements; there were a few where I had like 10 fps from time to time.
Now, I am averaging about 45-60 fps (locked at max 60fps). Here and there I dip to 30-35 fps in some weird situations, can't say why, nothing in particular that I see causes it. Like being at certain spots in settlements dips to 30-35 fps, but can't see any smoke / increase in LOD, no shooting, etc... Very weird.
However, I do have a few issues:
a) Exiting supercruise at stations, I have to wait 3-5 seconds for the station to actually appear; more often than not this causes one noticeable stutter when it pops in.
b) Passing the mailslot causes one noticeable stutter; I drop to around 15 fps for half a second to a second each time my ship passes through the mailslot into the station.
Processor:
- Intel Core i7-10875H
- locked to 35W sustained, averaging power consumption 25-28W.
- peaks at 43W in PL state (could do 45W).
- averaging clock speed 3.7 GHz on 8 cores.
GPU:
- NVIDIA GeForce RTX 3080 Laptop GPU - 8192 MB, Core: 1245 MHz, Memory: 1500 MHz, 95 W TDP
- averaging around 50ish W,
- max recorded power consumption was 85W
Memory:
- 16 GB, dual channel (2x8 GB), 3200 MHz
In-game settings are set to mostly Ultra, 1080p.
Considering I get that on an i7-6850K and 1080 (at 1920x1080), I suspect that Odyssey may be auto-tuning its graphics (and we don't have any knobs exposed to adjust that tuning ourselves). Another reason I suspect such is some of the video and screenshots I've seen look far better than anything I've seen on my PC, even when I run it in Ultra (Ultra for out in the black, High for in the bubble (Ultra tanks to 10 fps too often in the bubble)).
Sorry, 3070, that was a typo... But still correct: the FPS in certain situations is far lower than expected on this hardware.
In Windows I notice anything less than 100 immediately. Over 120ish it looks the same (144 vs 360, etc.).
Of course, assuming my suspicion is on-track, that doesn't mean the auto-tuning is working well.
Bit of a ramble (from the perspective of someone who's been poking at a game engine for over 20 years, and this sort of thing in general for almost 40 years)...
Before anybody jumps in with "but I need my 120fps"... you don't, really. I'm not going to claim you can't tell the difference between 60fps and 120fps (I do notice something in the difference between 30fps and 60fps, visual clarity? can't put my finger on it), but I did find something in Quake that would be a huge contributing factor to gamers "requiring" 120fps: button logic (e.g., movement key handling) has a built-in lag where the first frame the button is pressed applies only half the movement, and the second frame on applies full movement, meaning that player acceleration is governed by frame rate. If this logic made its way into other game engines, it would explain a lot. Also, several years ago, I fixed QuakeForge's handling of gravity: it was highly frame rate dependent (most noticeable in jump height) because it was calculating the new position incorrectly, not taking the acceleration term, (a*t^2)/2, into account*. If this bug spread far and wide (which I have no reason to doubt), then frame rate would matter very, very much in a competition.
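The gravity bug is easy to demonstrate. Below is a minimal sketch (not QuakeForge's actual code; the function name and the v0/g values are invented for illustration) comparing a naive position update, pos += vel*dt, against one that includes the (a*t^2)/2 term. The naive version's jump height changes with frame rate; the corrected one doesn't:

```python
def simulate_jump(dt, v0=8.0, g=-20.0, corrected=True):
    """Integrate a jump under constant gravity with frame time dt;
    return the peak height reached. Units are arbitrary."""
    pos, vel, peak = 0.0, v0, 0.0
    while pos >= 0.0:
        if corrected:
            # include the acceleration term: x += v*t + (a*t^2)/2
            pos += vel * dt + 0.5 * g * dt * dt
        else:
            # buggy update: ignores acceleration within the frame,
            # so jump height grows as frame time gets longer
            pos += vel * dt
        vel += g * dt
        peak = max(peak, pos)
    return peak
```

With these numbers, the buggy integrator jumps noticeably higher at 30fps than at 120fps, while the corrected one agrees at both frame rates to within floating-point noise — exactly the kind of frame-rate dependence that would matter in a competition.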
Now, my point (as to not needing 120fps): Odyssey at 20-30fps does not feel like it's running at 20-30 fps. I won't put a number on it, but there have been many times where I've felt like Odyssey was running very smoothly, only to find that it was getting around 25 fps when I turned on the display. Below 20 is not at all nice. Above 30... "aaah, too fast, unplayable!" (actually, half true: jumping to high frame rates after playing a lot at slower frame rates can make things difficult).
* Of course, this applies only in flat gravity (most game worlds). Properly curved gravity (1/r^2: KSP, maybe Odyssey (would hope so, really)) would need more terms (sadly, infinite, but for most purposes, just one more (((da/dt)t^3)/6)** is close enough).
** da/dt = jerk (or jolt), followed by snap, crackle and pop.
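Spelled out, the footnotes are just the Taylor expansion of position over one step of length Δt, with jerk j = da/dt:

```latex
x(t+\Delta t) = x(t) + v\,\Delta t + \tfrac{1}{2}\,a\,\Delta t^{2} + \tfrac{1}{6}\,j\,\Delta t^{3} + \cdots
```

Under constant (flat) gravity the series terminates after the Δt² term, so the fix above is exact; under 1/r² gravity the acceleration varies along the step, so the higher-order terms are what the footnote is truncating.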
Dropping FPS "significantly" (more than a few %) is never reallt acceptable (unless it's always over some threshold). It is far better to have a steady frame rate (even 30fps) than one that bounces between 40 and 100.However, you can get somewhat "smooth" experience at 30fps in same games, but that happens if you are mostly running at that FPS all the time. When you have 50ish FPS and then it dips to 30 for few seconds, then back up to 50 for few seconds, then you turn around and it dips to 30 again, you definitely notice and it looks like stuttering. I would argue that for better on foot experience you are better off to lock FPS to 30 all together, but in any case, on new hardware getting around 30 fps is just so star citizeny
This. I can understand 30fps being "smooth" if it actually is able to stay there (after all, most console games up until the PS5 generally ran at 30fps) and can get generally consistent frametimes (since a "locked" framerate with huge variance in frametime can still be a bad experience).
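The "steady 30 beats bouncing 40-100" point shows up clearly in frame-time statistics. A toy illustration (all numbers invented, function name made up): a capture alternating between 40fps and 100fps frames has a much better average fps than a locked 30, yet far worse frame-to-frame jitter, which is what you actually perceive as stutter:

```python
def pacing_stats(frame_times_ms):
    """Summarize a capture of per-frame times (milliseconds)."""
    avg = sum(frame_times_ms) / len(frame_times_ms)
    # jitter: mean absolute change between consecutive frame times;
    # a steady capture scores near zero even if every frame is slow
    diffs = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    jitter = sum(diffs) / len(diffs)
    return {"avg_fps": 1000.0 / avg,
            "worst_ms": max(frame_times_ms),
            "jitter_ms": jitter}

locked_30 = [1000.0 / 30.0] * 120   # steady 30 fps
bouncy = [25.0, 10.0] * 60          # alternating 40 fps / 100 fps frames
```

Here `bouncy` averages about 57 fps but jitters 15 ms between consecutive frames, while `locked_30` averages 30 fps with zero jitter — the "worse" average is the smoother experience.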
A high framerate is visually useful for games where the field of view extends far into the distance without motion blur. In corridors, or with motion blur, a lower framerate is less annoying. It's more a question of the responsiveness of the controls.