Testing frame rates on 4 different machines. Will update.

Please, people, put your in-game graphics settings too. Ideally, everybody should test in 1920x1080@Ultra, just to have a common starting point. From my personal experience, AMD cards (except the latest generation) seem more affected than Pascal or newer NVIDIA cards.

Originally, FD used mostly NVIDIA cards, so the original game / Horizons were better optimized for NVIDIA. And since it's a DX11 engine, there is not much multithreading, so CPUs with strong single-threaded performance are favored. Cards with brute force (raw bandwidth/clock speed/render units) or better hidden-surface culling are also favored (the 2080 Ti, for example).
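As a rough illustration of that last point (a toy Python model, nothing to do with Frontier's actual code): in DX11, actual draw submission funnels through a single immediate context, so however many threads prepare work, one chokepoint sets the frame pace.

```python
# Toy model, not engine code: a lock stands in for the DX11 immediate context.
# However many threads "record" work, submission serializes, so single-thread
# speed sets the pace.
import threading
import time

submit_lock = threading.Lock()  # stand-in for the single immediate context

def record_and_submit(n_draws):
    for _ in range(n_draws):
        # building the command could happen in parallel...
        with submit_lock:       # ...but handing it to the "driver" cannot
            time.sleep(0.0001)  # pretend per-draw submit cost

threads = [threading.Thread(target=record_and_submit, args=(250,)) for _ in range(4)]
start = time.perf_counter()
for t in threads:
    t.start()
for t in threads:
    t.join()
# All four threads together take at least 4 x 250 x 0.1ms, since no submit overlaps.
print(f"elapsed: {time.perf_counter() - start:.2f}s")
```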
 
Please, people, put your in-game graphics settings too. Ideally, everybody should test in 1920x1080@Ultra, just to have a common starting point. From my personal experience, AMD cards (except the latest generation) seem more affected than Pascal or newer NVIDIA cards.

Well, that basically was my point: changing graphics settings doesn't really have an effect for me. At least not when the settings are reasonable for my machine (720p / 1080p, quality LOW, MID or HIGH).
 
The suit operative tutorial is a good benchmark for ground-based fps
The temptation is clear, but that tutorial simply isn't running the things that kill performance; it mostly sticks to merely bad, somewhere between 45-60fps in my case (60fps from... sitting in the detention cell in security, staring directly at a wall). It doesn't lose much performance over an hour of testing. There's no relevant travel (though there's one major hitch on approach to the base), and no random NPCs milling about. Even so, standing perfectly still until the suit energy went to 50% showed variation between 42 and 62fps while absolutely nothing happened.
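If anyone wants to put numbers on that kind of variation rather than eyeballing an overlay, here's a rough sketch, assuming you've dumped per-frame times in milliseconds to a one-column CSV with whatever capture tool you use (the filename is invented):

```python
# Summarize a capture of per-frame times (ms), one value per line.
import statistics

with open("frametimes.csv") as f:  # hypothetical capture file
    ms = [float(line) for line in f if line.strip()]

fps = sorted(1000.0 / t for t in ms)
print(f"avg {statistics.mean(fps):.1f} fps, "
      f"1% low {fps[len(fps) // 100]:.1f}, "
      f"min {fps[0]:.1f}, max {fps[-1]:.1f}")
```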

Next test: run at least an hour in the main game, staring at some wall somewhere. For starters, faffing about on some plain ground (that just SCREAMS "gravel and sand texture on featureless mesh") ran at 120fps; it's not being on foot, it's having anything on-foot-related around.

Just looking at Piazza's Laboratory from closer than 3km away drops the frame rate straight to 40. Disembarking to shove my nose into a nondescript hull plate on my Diamondback Scout brings it up to around 55. GPU utilization is very uneven, usually around 40% (15% to 59% observed), CPU at slightly over 50%. VRAM about 9GB, RAM about 8GB.

80% extended Maverick battery left; framerates still touching as high as 56 while staring into the plate. Drops to 40 if I look at the settlement. Drastic performance loss not yet in evidence. Might it require multiple jumps? Carrying on staring at the plate.

75% battery left. The plate isn't any more fun at 51fps. Touching the controls seems to have bumped it up as high as 57. Power-saving issues?

66% battery left. Watching my breath fog up the helmet. Again the frame rate bumped up as I alt-tabbed. Watching the settlement, there's one skimmer about, and Radeon software reports 42fps while the game says 38. Different filter lengths, I suppose. A Cobra landed with no obvious hit to performance.
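(For the curious: that sort of disagreement is exactly what you'd get from two counters smoothing the same frametime stream over different windows. A toy illustration, numbers invented:)

```python
# Same frametime stream, read through different averaging windows, gives
# different "current fps" readings; one plausible reason two overlays disagree.
# Invented data: smooth 60fps with a recent rough patch.
frametimes_ms = [16.7] * 90 + [40, 45, 38, 42, 16.7, 16.7, 41, 39, 17, 43]

def displayed_fps(samples, window):
    recent = samples[-window:]
    return 1000.0 * len(recent) / sum(recent)

for window in (10, 30, 100):
    print(f"window of {window:3d} frames: {displayed_fps(frametimes_ms, window):.1f} fps")
```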

62%. Bored, I fired three plasma shots into the air. RAM 7.9GB, VRAM 9.5GB. Switching to desktop frees up over 100MB VRAM. Perhaps time to walk over to the settlement soon?

55%. I would report the time, but clearly I forgot my clock aboard the ship. Found some timestamps in the comms panel, but they're not that recent. Still not seeing anything like the memory usage and slowdowns normal gameplay brings.
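Note to self: I could be logging this instead of squinting at overlays and battery percentages. A rough sketch (assumes the psutil package is installed; VRAM would still have to come from the Radeon overlay by eye):

```python
# Log CPU and RAM once per second to a CSV for later comparison.
import csv
import time

import psutil  # assumption: pip install psutil

with open("edo_perf_log.csv", "w", newline="") as f:
    log = csv.writer(f)
    log.writerow(["unix_time", "cpu_percent", "ram_used_gb"])
    for _ in range(3600):  # one sample per second, for an hour of wall-staring
        log.writerow([f"{time.time():.0f}",
                      psutil.cpu_percent(interval=1),
                      f"{psutil.virtual_memory().used / 2**30:.2f}"])
```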

I have decided to walk to the settlement.

Walked over and tried to toss a grenade over the building. I have no idea how it came back into my field of view as I turned 90 degrees. The base is alarmed. Some sharpshooter shot me. Camping out on top of the power station. Can reach 50fps by staring at the sky; 40 is the rule looking at the building. No movement in sight. Incidentally, they shot me, but I don't have so much as a fine.

Beginning to ponder if the real memory hog might be station interiors. Sitting just barely below 10GB VRAM, still 8GB RAM. Spotted GPU utilization at 8%, the lowest I've noticed yet. Some ship left; VRAM crept over 10GB, still just over 50% CPU. Likely CPU or system RAM constrained in this particular workload, but it's not the same as when it "plays" at 20fps.

Bored again. Shot the skimmer. Alarms on. Now I do have a 100cr fine.

Went closer to destroy the skimmer. Also took out a roof-strolling scientist.

A reinforcement dropship came by while I was playing "don't touch the floor" on the roof. Found a handful of guards stuck on the same corner. The reinforcements were kind of tough. Seems the framerate went up over 60 after there were no more NPCs. Dipped under 60 once outside, but only a little. I'm even seeing moments where CPU usage dips below 50%.

Perhaps the NPC control routines are poorly threaded?

Running out and facing away from building, I'm even seeing the frame counter hit 89.

I have shut the facility down, but still have to run off some 500m to recall my ship.

This gravel texture could really use some normal and roughness maps. I do notice some colour variations from the underlying biome blender. It doesn't feel like plausible ground when there's nothing else to look at; it feels like camo.

Back in my ship, still at the surface. Frame rate in the low 70s. Time to test that station hypothesis. Wondering why my flight assist likes to turn off nowadays.

Had frame rates around 90-100 while flying over to Andrey Popov Works, notably lower than the normal 120. Dropped below 40 once inside, stabilized at about 54. Disembarking.

On foot in the hangar, frame rate mostly a bit over 50, sometimes a bit over 60. Admired the bad LOD popping on the lifts once again. The green [] box between the triangles disappears just before the top light goes jaggy. Oh well, on to the concourse. VRAM now at 12.3GB.

6 power-up missions available. Accepted 1. Now 6/6 are invisible, because there's only one location. Frame rate mostly low 40s.

Picked up a few missions. Time for an Apex ride to fetch some item. 60fps out the mail slot, 120 in supercruise. VRAM back to 9.6GB.

As the taxi drops out of orbital cruise, the frame rate drops under 40fps.

Apex ride to the next fetch quest. I'd like a moment to acknowledge two good things about the Odyssey launch: They haven't demanded a manual reinstallation of the launcher for every little patch (yes, I remember, and no, there was no excuse), and I've seen shadows from multiple light sources. Lighting itself is still messed up.

Supercruise at 120fps with 9.6GB VRAM again.

Looking for locker 2 in the habitat. Walk up to a row of 4 lockers: 5, 3, 4, 1. Who numbers these? Found 2, 6, 7 in the opposite corner.

Took out some scavengers. 14GB VRAM allocated. Didn't expect the hostile reinforcements after I'd finished the mission. Taking taxi back to station. Regular supercruise deallocation brought it down to 10.5GB.

In station on foot, framerates now in the 30s to low 40s. Sometimes hit as high as 50. Took an identical mission (same task, same reward, same target). VRAM 14.6GB as I walked up to the taxi, 10.2GB during the jump. Noticed the taxi veered way off course just before jumping, no clue why.

After I shut down last night, the Odyssey client executable sat chewing on a little more than 1 CPU core, showing a black screen, all night. Just killed it. That's not great either.
 
45 FPS is not acceptable for hardware that is "recommended". If the tutorial alone already shows weaknesses, it's sufficient as a benchmark to compare systems.

You can increase the stress and expand the test basis once the tutorial runs smoothly.
 
Regarding the network, you can also use the Game Mode settings on Windows 10:

[screenshot: Windows 10 gaming network settings, showing the "Fix it" option]

Use "Fix it" to check your NAT and even fix some possible local issues.
 
Dell G5 laptop
i7-9750H @ 2.60GHz
16GB RAM @ 2666MHz
RTX 2060 with 6GB VRAM
516GB NVMe SSD

Connected to a 55-inch TV at 60Hz.

At 1080p, High: 60fps in space, 30-45fps walking around outposts.
At 1440p, High: 60fps in space, 25-35fps walking around outposts.

In CZs, lows of 24fps at 1080p.

C.
 
I'm not even sure if FDev have ever officially supported dual cards anyway

They haven't (though it has worked), but it's the only example I could think of where anything resembling VRAM sharing between two cards would be relevant for gaming.

Please, people, put your in-game graphics settings too. Ideally, everybody should test in 1920x1080@Ultra, just to have a common starting point.

I'm CPU limited (which is not to say that performance scales well with increased CPU performance) to about 80-120 fps at surface settlements (and in the suit tutorial) at 1080p Ultra with my primary system, which has a 3900X (custom PBO) and a 2.5GHz 6800 XT.

I just swapped my 5800X back into this system to see exactly how much of that CPU limitation I can overcome. Not expecting a lot, due to how bursty the loads are.

The temptation is clear, but that tutorial simply isn't running the things that kill performance; it mostly sticks to merely bad, somewhere between 45-60fps in my case (60fps from... sitting in the detention cell in security, staring directly at a wall). It doesn't lose much performance over an hour of testing. There's no relevant travel (though there's one major hitch on approach to the base), and no random NPCs milling about. Even so, standing perfectly still until the suit energy went to 50% showed variation between 42 and 62fps while absolutely nothing happened.

Any experimental test needs to be repeatable.

I haven't noticed any scenarios where my systems' performance degrades over time, or fails to revert to normal (EDO normal) after entering a particularly slow area or using particularly demanding settings.

If you can find any area that seems to reliably cause issues I can test that, but waiting to stumble into something isn't going to be very informative, or fast.
 
To me (and my inner circle) it seems AMD cards have more problems. Better to let FDev decide on it.

I resolved my CPU bottleneck - going from below recommended specs to a 5800X - but fps on the ground improved only from 15 to 20 fps.
Just go through this forum and you will see it's at least 10:1 Nvidia cards with problems, while the small number of AMD cards aren't having any problems.
 
45 FPS is not acceptable for hardware that is "recommended". If the tutorial alone already shows weaknesses, it's sufficient as a benchmark to compare systems.

You can increase the stress and expand the test basis once the tutorial runs smoothly.
Agreed. In the year 2021, "recommended spec" generally means the game will run at 60fps or better, with whatever graphics settings the developers consider most true to their artistic vision of the game (sometimes that's High settings, sometimes Medium). I've never encountered a game that only expected 30fps from its recommended spec.

Minimum spec obviously is a whole different ballgame, and expecting 30fps on medium or low settings there is more reasonable. But not for recommended spec, and it's frankly quite insulting that Braben implied 60fps is generally an expectation reserved only for higher-end, recently built PCs.

As a frame of reference: Death Stranding on PC lists an i5 3770 and a 1060 6GB as its recommended spec. I have a 1070 Ti and an 8600K, so I'm at least a bit above that. In that game I can set everything to Ultra at 1080p, and I don't think it has ever dropped below 80fps. My friend who has a 1060 6GB also plays at the same settings I do, and I don't think they've ever said it dropped below 60fps.

Comparing to EDO: the "recommended" spec is also a 1060 6GB and an i5 8600. I meet the CPU requirement and exceed the GPU requirement. You know what performance I get on planet surfaces (the place Odyssey is mostly intended to expand) at 1080p Ultra? 25-35 fps, with periodic jumps to 55 that only ever seem to last a minute or two. I don't consider performance out in space relevant to calculating average performance, as that is not where Odyssey was intended to be played. And performance doesn't seem to change all that much between Low settings and Ultra.

So to wrap up:

Death Stranding being slightly above recommended spec: 80-110fps average.

EDO being slightly above recommended spec: 30-50fps average.

They aren't even in the same ballpark.
 
Watch the performance issue be caused by something insanely weird, like which brand of memory your graphics card uses. I know my card has GDDR6 from Micron. I honestly can't think of anything that would cause such consistently inconsistent performance across such a wide range of hardware. What about antivirus: is anyone who is NOT having problems running Bitdefender?
 
Conclusion: seems odd to me that PCs 1 and 2 are vastly more powerful than PC 3, and yet the frame rates don't differ all that much.

Given I have access to four machines, and plenty of time, if anyone from Frontier wants me to try different settings or drivers or whatever, do let me know ...
Yes, there are some very odd things happening. I have a 2070, which gives me around 60fps in a base on Ultra, with some drops and rises. Then I turned my settings down, and my fps didn't get better; if anything it got worse.

Hopefully they'll get to the bottom of it soon, but it's certainly a strange one.

With my system, though, I should be up into the 100s when it comes to fps.
 
Just go through this forum and you will see it's at least 10:1 Nvidia cards with problems, while the small number of AMD cards aren't having any problems.
Are you going to show your calculations to back up that assertion?

For the record, my AMD 5800 with 6800 machine runs E:D poorly. There isn't anywhere near enough feedback on these forums to make the kind of assertion you are making.
 
Are you going to show your calculations to back up that assertion?

For the record, my AMD 5800 with 6800 machine runs E:D poorly. There isn't anywhere near enough feedback on these forums to make the kind of assertion you are making.
The problem is also that the ratio of Nvidia cards to AMD cards with issues does not say much when you don't take into account how many Nvidia cards there are compared to AMD cards amongst the players in the first place! You know, Bayesian statistics and stuff.
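To put toy numbers on it (the install-base share and problem rate below are invented purely for illustration): if three quarters of players run Nvidia, a forum skew towards Nvidia reports is exactly what you'd expect even if both vendors' cards failed at identical rates.

```python
# Invented numbers, purely to illustrate the base-rate point.
nvidia_share, amd_share = 0.75, 0.25  # assumed install base among players
p_problem = 0.10                      # assume identical problem rate per card

nvidia_reports = nvidia_share * p_problem
amd_reports = amd_share * p_problem
print(f"expected forum report ratio: {nvidia_reports / amd_reports:.0f} : 1")
# -> 3 : 1 in favor of Nvidia complaints, with zero actual vendor difference
```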
 
I hope you are aware that you need to do a little more than just set port forwarding from the client side, don't you? Just saying, since the in-game network settings seem to suggest that's all you need to do to make it work. It also gives you zero feedback on whether port forwarding is actually active or not. Apologies if you know what I'm talking about, but this comment is meant in general for everyone reading this, and I'm pretty sure not everyone does.

Frontier itself offers a tutorial on how to do it right, so for everyone not sure, that's the way to go:
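(And one quick way to verify a forward actually reaches your machine, independent of the game: run a minimal UDP listener and send a test datagram from outside your LAN. The port number below is just a stand-in for whatever you configured.)

```python
# Minimal UDP listener for checking that a forwarded port actually arrives here.
# The port is a stand-in; use whatever you configured in-game and on the router.
import socket

PORT = 5100  # hypothetical

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", PORT))
print(f"Listening on UDP {PORT}; now send a datagram from outside your LAN...")
data, addr = sock.recvfrom(1024)
print(f"Received {len(data)} bytes from {addr}: the forward works.")
```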

If you mean did I set up the router, yep I did. I should have mentioned that part.
 
Running Odyssey on an i7-4790K, 32GB RAM and an RTX 3070, running on High.

Single screen in space: 120-144fps @ 2560x1440
Triple screen in space: 110-130fps @ 7680x1440

Single screen sat in Carrier: 90-120fps @ 2560x1440
Triple screen sat in Carrier: 60-70fps @ 7680x1440

Single screen at settlement: 28-40fps @ 2560x1440
Triple screen at settlement: 19-35fps @ 7680x1440
 
All four tests are running the same save game, same Cmdr, same location, but on 4 different PCs.
All were run in Solo mode.

First PC:

i7-6700K / GTX 1660 Ti / 32GB RAM / SSD. On foot at one particular settlement.
Nvidia drivers: 457.30 (downgraded before this test, after suggestions that the latest drivers might not be as good).
Screen size 1920x1200 (16:10), with 2 other monitors connected displaying the Windows desktop.
Windows 10, latest updates.



Whether I set my graphics to low, med, high or ultra, restarting the game completely in between, I get 35 fps looking across the settlement. Rises to 45 FPS when looking at the empty sky.

I tried 0.5x supersampling and 1.25x supersampling and it was still 35.
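For what it's worth (assuming the supersampling factor scales each axis, which is how the slider appears to behave), that's more than a 6x difference in pixels rendered with zero fps change, which points away from the GPU:

```python
# If the supersampling factor scales each axis (assumption), 0.5x vs 1.25x is
# a ~6x difference in pixels rendered; identical fps at both means the GPU is
# not the limiting factor here.
base_w, base_h = 1920, 1200
for ss in (0.5, 1.0, 1.25):
    w, h = round(base_w * ss), round(base_h * ss)
    print(f"{ss}x -> {w}x{h} = {w * h / 1e6:.2f} Mpx")
```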

I tried switching off 2 of my 3 screens in Windows, and setting the middle screen to borderless/fullscreen/vsync on/off and frame rate limit 60 and none, and it made zero difference.

Next test will be on my i7-7700 with a GTX 1660.

I have almost the same specs and that's roughly my experience.

30s-50s from planetary ports to stations; firefights and on-foot CZs are the lowest. Planetary surfaces on foot, with no settlements or stations: 40s-60. Space: 60fps all the time, except sometimes 40s when leaving stations until the jump.

Ultra, 1920x1080, terrain slider full (right).
Single monitor.

1660 Ti
i7-6700k
32 Gigs DDR4
870 EVO SSD


I was playing with shadows on Low and AA on FXAA for a while and getting 60fps in space stations, but the rest was the same. I barely lost anything by maxing them back up.
 
i5 3570k
980 ti
16gb RAM

It seems to me that looking through glass tanks the framerate. Can anyone else confirm that?

If I look away I get on average 10-12 FPS more, even when the scene is clearly more complex.

Looking through the windows of shops I get about 30, but if I look elsewhere, including the massive window that looks out over the docking area (it doesn't seem to have a reflective surface, which might be why), I get 40-45. I'm pretty sure in some docks I get lower FPS.

This is at 1080p with mostly High settings, and ground details like terrain and materials on Ultra. The only thing set to Medium is shadows.
 
Watch the performance issue be caused by something insanely weird, like which brand of memory your graphics card uses. I know my card has GDDR6 from Micron. I honestly can't think of anything that would cause such consistently inconsistent performance across such a wide range of hardware. What about antivirus: is anyone who is NOT having problems running Bitdefender?

I've been brainstorming all kinds of stuff to explain top-end hardware getting low frames vs people with laptops and mobile chips outperforming them.

At first I wondered if it was a game setting which was slowing things down, like 'report crimes against me' being repeatedly checked and spammed by every on-foot NPC when you go to a settlement. I think that's ship-only, though.

It could just be that the NPC AI is being updated too frequently, and the threads are multiplying until everything lags. (Someone earlier posted a screenshot showing a lot more active threads in busier settlements.) Or it could be something like the NPC torches and light sources in settlements; I know in iRacing that adding more light sources can slow the frame rate a lot. That wouldn't explain the slowdown in stations, though.
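Purely as a toy sketch of that thread-multiplication guess (this is not a claim about how Frontier's engine actually schedules AI): spawn one thread per "NPC" with an over-frequent tick and watch CPU time go to scheduling overhead instead of useful work.

```python
# Toy repro of the speculated failure mode: thread-per-NPC, ticking far more
# often than gameplay needs. Watch CPU use and thread count in a system
# monitor while this runs.
import threading
import time

def npc_tick(stop: threading.Event) -> None:
    while not stop.is_set():
        sum(range(200))    # trivial stand-in for "AI" work
        time.sleep(0.001)  # 1kHz tick

stop = threading.Event()
npcs = [threading.Thread(target=npc_tick, args=(stop,), daemon=True)
        for _ in range(200)]
for t in npcs:
    t.start()
time.sleep(3)
stop.set()
```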

By the way, if anyone is running supersampling above 1.0x, they're asking for lag right now. I saw someone with a 3080 trying to run Odyssey at 4K with 2.0x supersampling, just for kicks, and they got 5 frames per second. At 1.0x it was back to 60-ish.

Unfortunately what this needs is a profiling tool on a debug build, and that's something only FDev can do.

(I've been an indie application developer for 30+ years now, and my focus has always been getting the most out of the hardware. Starting out, I designed and coded a multi-user application, solo, to manage a large manufacturing and retail business. That involved writing code to hot-switch between different programs I'd written - from booking to invoicing to order entry, customer lookups and installations, etc. - all running on 8MHz XTs with 640KB of memory. So yeah, I think about optimisation daily.)
 