4K Ultra - What GPU?

I have a latest-gen i7 and an RTX 3070. Been a while since I’ve fired up Odyssey but I could do 4K at 60FPS with most stuff on Ultra or High … I think I have Bloom and Ambient Occlusion turned down or off?

Anyway, in space I could probably have it all turned up but on foot - even with those settings - I get the occasional FPS drop. Usually to like 57FPS for a split second but sometimes into the 40s if there was a lot going on.

Running at 1440p instead did make things a lot smoother but I still found Bloom and Ambient Occlusion “problematic” at high settings.

This is all pre-Update 10, IIRC. Dunno if anything’s hugely improved since? I think FSR (maybe) and checkerboard (definitely) were there but the settings didn’t seem to make a big difference to me.
 
My goal is to play Odyssey at pure 4K with Ultra+ graphics.
What GPU will handle this game at not less than 30 FPS?

I have an i7 7700 (not that different from your 3770) and can now get 40-60 (capped) FPS on planet surfaces using native 4K ultra settings, albeit salad hunting rather than base assaults, with a GTX 1080 Ti. If the CPU (number of cores) is a limiting factor in ground bases, there's probably not going to be much benefit to a beefier GPU unless you plan to upgrade the mobo/CPU/RAM later and move the GPU to the new system.

My son has a 3770 with a GTX1080, it's still an excellent combo for almost any game. Almost... ;)
 
I'm playing in 4K in Super Ultra+ Max Settings at exactly 60 FPS (since I'm using VSYNC).
GPU: nvidia RTX 3080
 
So I have capped mine at 60FPS, and with a 3080 Ti 12GB and an i5 9600 with 64GB I run at 4K resolution on ultra settings. I have pretty well seen it hold that FPS in all but certain parts of some settlements now. If my floor were 30FPS I think that would be solid across the board.

For reference, with the previous RTX 2080 I had to drop to 2K resolution and let NVIDIA up-scale to hold the same 60FPS on foot or planet-side. In any settlement it generally fell to 45-50FPS, which was maddening because it caused the screen to stutter every few frames. My TV really hates 50-55FPS. I wish there were a way to set 45FPS as a fall-back limit when 60 can't be held. That would look much better on my TV.
 
My goal is to play Odyssey at pure 4K with Ultra+ graphics.
What GPU will handle this game at not less than 30 FPS?
I would wait a few more patches for the optimizations and fixes to really be sorted, then we can properly benchmark the best cards for Odyssey. Currently it's a mismatch: you might get the recommended hardware and still be disappointed. My GTX 1080 runs the game really well, but I've got friends on newer cards suffering performance problems.
 
Howdy folks from way over here.

I've been following the thread and have concluded that every single computer is going to act/play differently (even if ever so slightly). So I would like to offer up my 2 cents for what it's worth and maybe give someone an idea what kind of system to shoot for (or not shoot for). The following laptop was a holiday gift last Christmas. It replaced an 8-year-old ASUS ROG 17" gamer laptop that was top-of-the-line in 2013 ...BUT...couldn't handle ODYSSEY the way I wanted, so here are my present specs and some performance screenshots...

[Attachments: IMG_20220604_213736248.jpg, IMG_20220604_213840896.jpg, IMG_20220604_214029837.jpg (laptop spec screenshots)]

The above is the "diagnostic" snap of my laptop system. The app usually accompanies Windows. This is what I was gifted last Christmas...I keep mentioning "gift" cuzz there ain't no way I could've afforded the price on this machine.

With the settings I've put it to I get between 70 and 80+ FPS in a station (disembarked).

I will post some screenshots as soon as I figure out how to turn a .bmp into a .jpg...I actually know how but neglected to do it before posting.


Here are a few screenshots as promised, plus the settings screen showing what everything is set to. Note the FPS in the screenshots.

[Attachments: Screenshot_0054.jpeg through Screenshot_0059.jpeg (in-game screenshots with the FPS counter visible)]


This machine has a 165Hz screen...15.5".
The main (though tolerable) drawback is heat...91-94°C average in ODYSSEY. My fingers will never be cold playing this game. ;)

o7
BIZZ
 
The 3770 might be borderline in some settlement scenarios
My 4790k struggles quite badly in settlements. I still get random drops to 5-10fps that last a good 10s, and frequently require rebooting the game. Ironically my recent test in Linux seemed to go better.
 
My 4790k struggles quite badly in settlements. I still get random drops to 5-10fps that last a good 10s, and frequently require rebooting the game. Ironically my recent test in Linux seemed to go better.

Playing on my i7-3770 and RX 570 4GB at 3K resolution with FSR set to QUALITY, it never drops below 15 FPS. That 15 FPS only happens in the training mission where you enable the power station: after that point it stays at a constant 15 FPS no matter where you are or what you do (I have reported and posted this issue everywhere many times and I don't know whether the devs have seen it or not). In all other settlement worst-case scenarios I get 20-25 FPS, and normally I play settlements and social hubs at 30 FPS. Custom settings with everything on ULTRA except shadows on HIGH and LOW. I don't bother checking whether it could go higher; I capped it at 30.

So mostly I play at 30 FPS, but the quality is real garbage at that resolution.
I really do hope that the RX 6700 XT will change everything!
 
Ok, I didn’t even realize that Vulkan had any say in the threading...

Passing the D3D11 commands through to Vulkan with DXVK allows them to be distributed better because DXVK evidently leverages threaded command buffers: https://developer.nvidia.com/sites/...log/munich/mschott_vulkan_multi_threading.pdf

In your case, the performance benefits you see could be from this (NVIDIA's D3D11 drivers can also force multi-threaded command buffers, though this may be of limited use on a 4790K), or it could just be from the Linux scheduler handling things better than Windows, or perhaps a combination of both.

Anyway, with a bit more tuning, I'm able to get slightly (about 2%) more than native D3D11 performance via Vulkan, and since this is all in Windows, I think it's likely due to the advantages of the API itself:
Source: https://www.youtube.com/watch?v=j-wKsCfCJEk

Took a 5800X3D (a great CPU for badly optimized games) to do it, but I'm now solidly GPU limited (at least at 1440p ultra or higher...4k here, plus FSR ultra) in high-intensity CZs at the most demanding of large settlements.
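
For anyone who wants to try the same thing on Windows: the setup is basically copying d3d11.dll and dxgi.dll from a DXVK release into the game's install folder next to the executable, optionally with a dxvk.conf alongside them. The lines below are only a sketch of the kind of tuning I mean, not my exact settings; the option names should match DXVK's sample config, but double-check them against the DXVK version you actually download.

# dxvk.conf, placed next to the game executable (or pointed to via the DXVK_CONFIG_FILE environment variable)

# Cap presentation at 60 FPS at the DXGI level rather than with the in-game limiter
dxgi.maxFrameRate = 60

# 0 lets DXVK choose how many pipeline-compiler worker threads to spawn
dxvk.numCompilerThreads = 0

# On-screen FPS/frametime overlay, useful for comparing against native D3D11
dxvk.hud = fps,frametimes

Delete the two DLLs (or rename them) to go back to the native D3D11 path.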
 
4K ULTRA+ and Without FSR - pure 4K resolution.

60+ FPS in space (capped to 60, so I don't know the max).
50+ FPS planet surfaces
45+ FPS settlements
33 FPS was the absolute minimum, in super active combat at big settlements.

So I am happy I got my RX 6700 XT 12GB, and my old i7-3770 with 16 gigs of RAM handles it really well.
I didn't even expect results like that!
 
The RX 6700 XT is a very good card. Lord Kyron / Elite Secret Service got one too yesterday. But his CPU needs an upgrade at some point and his mainboard is a bit dated. :)
 