Console player looking at biting the bullet

DDR5 on AM5 has a sweet spot at about 6000 MT/s. The fabric clock (fabric = the interconnect between the CCDs and the memory) on that platform can run in a 1:1 ratio with the RAM.

1:1 ratio on AM5 doesn't refer to Fabric, but only to UCLK:MCLK (memory controller clock to memory clock). Fabric cannot scale high enough to match memory clock, except at very low speeds (a great Raphael or very good Granite Ridge CPU tops out at about 2200 MHz FCLK, which would only be 1:1 with the memory up to 4400 MT/s).

1:1 UCLK:MCLK can generally scale to 3200 MHz or so (less on some samples, very rarely more), but fabric will unavoidably be much lower. 1:2 UCLK:MCLK is needed for higher memory speeds, which generally means the 6400-8000 MT/s range is often slower than 6000-6400.

There is also a tiny edge (half a nanosecond of memory latency) to using a 2:3 FCLK:UCLK ratio at 1:1 UCLK:MCLK, and a somewhat larger edge (1-2ns) to holding a 1:1 FCLK:UCLK ratio at 1:2 UCLK:MCLK. This is where that 2000 MHz FCLK DDR5-6000 'sweet spot' comes from...it's not the fastest or best, but it's something that performs well and that 99% of AM5 parts can do at voltages safe enough to include in XMP/EXPO presets. For tweakers/overclockers, getting UCLK as high as possible and then using the fastest unconditionally stable FCLK is what generally provides the best performance (even small increases in FCLK typically result in larger gains than trying to stick to a particular ratio).

AM4 CPUs generally run 1:1:1 (FCLK:UCLK:MCLK), which is optimal, because DDR4 didn't really scale high enough to justify breaking that ratio. DDR5 increased memory clocks dramatically, but AM5 only modestly increased Fabric clocks.
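The clock relationships above reduce to simple arithmetic. Here's a rough sketch (the figures are the ones quoted in this thread - 2000 MHz EXPO fabric, a 2200 MHz best-case FCLK - not guarantees for any given sample):

```python
# Sketch of the AM5 clock relationships discussed above.
# DDR5 transfers twice per memory clock, so MCLK = MT/s / 2.

def am5_clocks(mt_s, uclk_ratio=1, fclk=2000):
    """Return (mclk, uclk, fclk) in MHz for a DDR5 speed in MT/s.

    uclk_ratio is the UCLK:MCLK divider (1 for 1:1, 2 for 1:2).
    fclk is independent on AM5 and defaults to the common 2000 MHz preset.
    """
    mclk = mt_s / 2
    uclk = mclk / uclk_ratio
    return mclk, uclk, fclk

# The common DDR5-6000 EXPO preset: 1:1 UCLK:MCLK, 2000 MHz fabric.
mclk, uclk, fclk = am5_clocks(6000)
print(f"DDR5-6000: MCLK={mclk:.0f}  UCLK={uclk:.0f}  FCLK={fclk}")
print(f"FCLK:UCLK = {fclk / uclk:.3f}  (2:3 = {2 / 3:.3f})")

# Even a great 2200 MHz fabric is only 1:1 with UCLK up to DDR5-4400:
print(f"1:1 FCLK ceiling at 2200 MHz: DDR5-{2200 * 2}")

# DDR5-8000 needs 1:2 UCLK:MCLK, dropping UCLK back to 2000 MHz:
mclk, uclk, fclk = am5_clocks(8000, uclk_ratio=2)
print(f"DDR5-8000 (1:2): MCLK={mclk:.0f}  UCLK={uclk:.0f}")
```

This shows why 6400-8000 MT/s can lose to 6000-6400: the doubled memory clock buys bandwidth, but UCLK falls back to 2000 MHz.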

So I ended up with a Ryzen 7 5700X with an RX 7600 running on AM4.
I tried to get onto AM5 but I couldn't really stretch to it.
600 W 80+ PSU
B450M motherboard

The 5700X is an entirely competent CPU, but Odyssey is very sensitive to memory performance on this platform, especially if one doesn't have an X3D part.

I generally think 8GiB cards are rather anemic for Odyssey at this point, but if no 10GiB+ parts fit your budget, then the RX 7600 should do fine.

What PSU exactly? Wattage and 80+ rating don't say much about quality, in and of themselves.

What bit am I missing??

The game tracks device IDs/instances, and anything that causes these values to differ from when everything was set up will break bindings. Using different ports, different input/controller software, or any kind of translation layer (e.g. Steam Input) can do this, unless one ensures everything is configured exactly as it was when ED was set up before the next launch.

Personally, I'd start from the beginning. Figure out exactly how you want your controls connected to the system and connect them, remove any non-present devices from Device Manager, delete all custom bindings from the game's settings folders, disable Steam Input and any macro software, start the game, reconfigure bindings, make sure they work, then manually save that bindings file somewhere as a reference/backup.
 
Apologies in advance that I can't be less technical than this! :)

I understand your frustration, as I found out when trying to do bindings.
Unlike consoles, every PC is different and there are many, many types of input devices. Most games have to compromise, and that compromise is to support everything - but it nearly always ends up being complex.

What bit am I missing??

To get the game to create a custom bindings file, you just have to change one of the defaults.
Go into Ship controls and select one of the drop-down presets - either "Empty" or one that is similar to your HOTAS. Then go into the actual settings and update something - it can be anything - and the preset at the top changes to "Custom," which it now is. Finalise your settings, click "Apply," then go back to the controls section.

For the next set of controls, go into the SRV section. Do not select any default preset, but instead set it to "custom" before you do anything else. Go into settings and manually set the buttons and axes you need to drive around. Click apply when done.
Repeat this process for "On Foot," then for "General" - set to custom, then set stuff up.

Elite will now have saved all relevant settings into that "custom" file, with each device (stick/throttle/joypad) having a unique identifier against all your desired settings.
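For reference, that custom file is plain XML, with each binding carrying the device identifier the game matched at setup time. A fragment looks roughly like this (the device and key names here are illustrative - a hypothetical T16000M stick - so your file will differ, and exact element names depend on game version):

```xml
<!-- Illustrative fragment of a Custom.binds file (not a complete file). -->
<Root PresetName="Custom" MajorVersion="4" MinorVersion="0">
    <YawAxisRaw>
        <Binding Device="T16000M" Key="Joy_XAxis" />
        <Inverted Value="0" />
        <Deadzone Value="0.00000000" />
    </YawAxisRaw>
    <LandingGearToggle>
        <Primary Device="T16000M" Key="Joy_5" />
        <Secondary Device="Keyboard" Key="Key_L" />
    </LandingGearToggle>
</Root>
```

If a Device value no longer matches what the game enumerates at launch, that binding silently stops working - which is why port, driver, and software changes can break things.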
After this, whenever you launch Elite, control presets should all read "custom" and it will load the settings from that file.

Last note:
Back up your custom.binds somewhere safe - updates to joystick drivers, Windows, or Elite can change things, but it is easy to tweak the new custom file to get it back to what you expect.
 
GPU VRAM can NOT be upgraded, unlike motherboard RAM.
So when you buy a GPU with 8 GB of VRAM, you are stuck with it.
That being said, 8 GB of VRAM is more than enough for ED.

But it might start falling a bit short for some of the newest PC games out there. (I don't think any game so far will outright refuse to run with only 8 GB, but too little VRAM can cause increased loading times and even some stuttering in some situations. That's probably not as bad as it sounds - but in the near future, who knows.)

The RTX 4060 is probably more than enough to run ED, and most other games, comfortably, although some of the most demanding games might start to require lower resolutions.

I recently upgraded to a PNY 4060 Ti 16GB, which I got on sale, and it does quite well in ED (the only game I play). I upgraded from a 3060 12GB.

My other specs: AM4-socket Ryzen 5 5600X (3.6 GHz/4.8 GHz, 6 cores), 32 GB DDR4-3200 (4 x 8 GB).

In Odyssey, I have my frame rate locked at 120 fps and I rarely dip below 50 fps - and only on the ground - and it's still very playable then.
I have an RTX 3080, and the game runs at 4k resolution 120 FPS in space without breaking a sweat (since I'm using g-sync, it's capping to my display's maximum refresh rate, and I have never bothered to test how high it goes if not capped), on foot inside stations at about 90 FPS.
 
My 10GiB RTX 3080 and 11GiB GTX 1080 Ti (as well as all my higher VRAM cards, from any brand) run Odyssey fine, but my 8GiB RX 5700XT and Arc A750 have serious issues related to VRAM contention.
 
I have an RTX 3080, and the game runs at 4k resolution 120 FPS in space without breaking a sweat (since I'm using g-sync, it's capping to my display's maximum refresh rate, and I have never bothered to test how high it goes if not capped), on foot inside stations at about 90 FPS.
The 3080 is almost a third faster than the 4060 Ti.

 
GPU VRAM is not a direct one-to-one mapping of the screen resolution.
GPU VRAM is used to store the textures, geometry, and other data needed for rendering a scene.
The amount of VRAM needed depends on the complexity of the scene and the desired resolution.
 
I don't know if 2 GB really makes all the difference in the world, but at 4k resolution I haven't experienced such things with ED.

It can be. Local VRAM is more than an order of magnitude faster than reaching out of the card over the PCI-E bus - closer to two orders of magnitude on high-end cards. Overflow VRAM even slightly and, depending on how gracefully the application can adjust, performance often nosedives. ED is not particularly graceful about VRAM utilization. It doesn't usually crash, but it doesn't dynamically scale back assets either, or even fail to load textures, unless contention is extreme. Going over capacity on my 8GiB cards by less than 1GiB cuts frame rate by about 60% and turns the game into a stuttery mess.
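A back-of-envelope model shows why even a small spill is so punishing. The bandwidth numbers below are illustrative assumptions (roughly a ~450 GB/s GDDR6 card on a PCIe 4.0 x16 link), not measurements of any specific GPU:

```python
# Model: accesses that miss local VRAM go over PCIe instead, so the
# effective bandwidth is a weighted harmonic mean of the two paths.

def effective_bandwidth(vram_gbps, pcie_gbps, spill_fraction):
    """Effective memory bandwidth when spill_fraction of traffic
    goes over the (much slower) PCIe bus instead of local VRAM."""
    local = 1 - spill_fraction
    return 1 / (local / vram_gbps + spill_fraction / pcie_gbps)

# Illustrative: 450 GB/s local VRAM, ~25 GB/s usable over PCIe 4.0 x16.
for spill in (0.0, 0.05, 0.10):
    bw = effective_bandwidth(450, 25, spill)
    print(f"{spill:>4.0%} spilled -> ~{bw:.0f} GB/s effective")
```

Under these assumptions, spilling just 10% of the working set cuts effective bandwidth by roughly 60% - consistent with the kind of frame-rate collapse described above.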

With any recent build of Odyssey, 10GiB seems fine at ultra settings. 8GiB needs texture quality concessions starting at relatively low resolutions.

GPU VRAM is not a direct one-to-one mapping of the screen resolution.
GPU VRAM is used to store the textures, geometry, and other data needed for rendering a scene.
The amount of VRAM needed depends on the complexity of the scene and the desired resolution.
Generally, textures are overwhelmingly the largest fraction of VRAM utilization. Render resolution matters, especially in borderline cases, but the various texture quality settings matter a lot more.
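The arithmetic behind "textures dominate" is easy to check. The sizes below are generic illustrations (uncompressed RGBA, a hypothetical pool of 500 textures), not Odyssey's actual assets:

```python
# Why texture settings matter more than render resolution for VRAM:
# a 4K framebuffer is tens of MiB, while even a modest texture pool
# runs to gigabytes.

MIB = 1024 * 1024

def framebuffer_bytes(width, height, bytes_per_pixel=4):
    """One uncompressed RGBA color buffer at the given resolution."""
    return width * height * bytes_per_pixel

def texture_bytes(side, bytes_per_texel=4, mipmapped=True):
    """One square uncompressed texture; a full mip chain adds ~1/3."""
    base = side * side * bytes_per_texel
    return int(base * 4 / 3) if mipmapped else base

fb = framebuffer_bytes(3840, 2160)
print(f"4K framebuffer: ~{fb / MIB:.0f} MiB per buffer")

one_tex = texture_bytes(2048)
pool = 500 * one_tex  # hypothetical pool of 500 2K textures
print(f"one 2K texture: ~{one_tex / MIB:.1f} MiB; "
      f"500 of them: ~{pool / MIB / 1024:.1f} GiB")
```

Even with a handful of render targets, resolution accounts for a few hundred MiB at most; the texture pool is where the gigabytes go, which is why dropping texture quality frees far more VRAM than dropping resolution.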

ED also doesn't seem to aggressively evict assets until it's pretty much out of VRAM (same settings that use nearly all the VRAM on my 10GiB cards can use nearly all the VRAM on my 16GiB cards, but won't overfill either), though I'm sure GPU driver contributes to this as well. Probably good for loading times if you have enough VRAM, but bad for performance if you don't.
 
If you're still struggling with your bindings - there's a comprehensive thread stickied at the top of this subforum.
Yes, keybinds in ED are a bit tricky, but that's the price you pay for the ability to connect everything, including the kitchen sink, as an input device.
Well, they do have a plug - if adapted to USB it would probably work also :giggle:
Still haven't gotten around to adapting my gran's umbrella/walking stick handle

the missus when angry could probably serve as a pretty decent kitchen sink launcher........
she is pretty much a pro with everything else:ROFLMAO:
 