Elite dangerous works better for me at 4k than 1080p

As the title says, the game runs better for me at 4K than at 1080p with the same settings, and I don't understand it at all: more stable, more fps on the carrier, in combat, etc. Before, in high-intensity AX it would drop below 50; now it stays at 50 and above. I haven't tried planetary yet, but when I did back in the day it was the same at 1080p as at 4K, stutters included.

Specs:
CPU: i7-4790 with all cores at 4 GHz
RAM: 16 GB DDR3 1600 MHz (max for the i7-4790)
GPU: Gigabyte 2080, factory OC
HDD: Toshiba 7200 rpm, 64 MB cache
Shader cache: SSD i545 256 GB
75 fps locked: dropped to 60/25 in AX battles before at 1080p; now at 4K it drops to 50 at most
FSR: Ultra
[Four screenshots attached]
 
How does 1080p run without FSR? At that resolution it could be that the amount of work done by FSR’s up scaling is greater than the time saved by its resolution decrease.
 
I've played at 1080p since Odyssey came out; before that, 2K. I've never used FSR at 1080p: it sucks, everything goes blurry.
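One way to sanity-check the FSR question is to compare how many pixels the GPU actually shades in each case. This is a rough sketch assuming FSR 1.0's published per-axis scale factors (Ultra Quality 1.3, Quality 1.5, Balanced 1.7, Performance 2.0); the exact factors Elite uses are an assumption here:

```python
# Rough pixel-count comparison: native 1080p vs 4K upscaled with FSR.
# Assumed FSR 1.0 per-axis scale factors: Ultra Quality = 1.3.

def internal_resolution(out_w, out_h, scale):
    """Resolution the GPU renders internally before FSR upscales it."""
    return round(out_w / scale), round(out_h / scale)

native_1080p = 1920 * 1080                   # 2,073,600 pixels
w, h = internal_resolution(3840, 2160, 1.3)  # 4K output, Ultra Quality
fsr_4k_uq = w * h

print(f"native 1080p : {native_1080p:,} px")
print(f"4K FSR UQ    : {fsr_4k_uq:,} px ({w}x{h})")
# At Ultra Quality the GPU still shades roughly 2954x1662, noticeably
# MORE pixels than native 1080p, so 4K+FSR pushes work onto the GPU.
```

If those factors hold, 4K with FSR Ultra Quality is not a "cheap" 1080p: the GPU renders well over twice the pixels of native 1080p, which fits the observation that the load shifts GPU-side.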
 
I'm guessing FSR is allowing more of the work to be done GPU-side rather than on the CPU. While your CPU is still a decent one, I can imagine it presents a bit of a bottleneck in certain situations, considering your GPU. A friend had a similar issue with the same CPU and a 1080 Ti. A modern CPU saw the 1080 Ti pushed a lot harder, to great results. Changing that 1080 Ti to a 3090 was transformative, needless to say.
 
My CPU does not bottleneck the GPU. Now, the game itself can cause one when it makes huge, badly constructed calls, like X4 or similar; Stellaris at launch dropped to 10 fps once you reached year 100 because of a memory leak, and that was the bottleneck. Those kinds of things cause bottlenecks. But in games like War Thunder, which are more demanding on both CPU and GPU (explosions, penetration, projectile ricochets, etc.), it holds a near-perfect 75, and 110 to 120 when unlocked from 75, all at 1080p. That's why it makes no sense that this game runs better at 4K, even using FSR, than at native 1080p with my CPU, which is the recommended one. To this day it's still one of the best; it's not just "decent". I understand why you say decent, but it's the most powerful 4-core; obviously it lacks the newer technology that would make it even better. That's why many people with the K version run a 3090, and NVIDIA even recommended the 3060, which is better than my 2080. So I say there's no bottleneck; it only exists in games that are poorly optimized or that have Denuvo generating thousands of random, meaningless calls, and what helps there is more cores, not more Hz.
 
Maybe it goes like this:
The new card can comfortably draw the bigger viewport. But it's maybe 10 times faster at determining what NOT to draw.
It can draw with ease as long as it doesn't exceed 120 fps, but it doesn't even get there in Odyssey, and each patch made it worse. Before the last one I was back to a stable 75 in the war at 1080p with everything on ultra, but then I had to drop to medium, or low in some cases, so I decided to try 4K with FSR, because at that resolution the image doesn't look like smeared butter. As I said, games like Stellaris with mods, which load the CPU a lot, run better for me; it's the game, not the system. But what I've discovered may help some people. Even people with an i9 and a 3090 experience what happens to me, and I suppose if they did the same as me it would run better for them, even though normally you'd expect the opposite, even with DLSS or FSR.
 
my CPU does not have a bottleneck with the GPU

Your CPU will bottleneck a 2080 at 1080p (your CPU will sit at 98-100% usage in modern games!). You'll still get good frame rates in most games, but your CPU limits how much work it can feed the 2080 (its IPC just isn't high enough to keep up with modern cards), which caps the FPS the card could otherwise deliver.
1080p leans on the CPU relatively more than 4K does, which is why you see worse performance at the lower resolution. It sounds illogical, but you really do need a faster CPU for 1080p than for 4K, because at 4K your card does more of the heavy lifting!
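The "you need a faster CPU at 1080p" point can be illustrated with a toy frame-time model: each frame costs roughly max(CPU time, GPU time), because the slower side stalls the other. The numbers below are made up purely to show the shape of the effect, not measured from Elite:

```python
# Toy frame-time model: frame cost = max(cpu_ms, gpu_ms).
# cpu_ms barely changes with resolution (draw calls, game logic);
# gpu_ms scales with pixel count. All numbers are hypothetical.

def fps(cpu_ms, gpu_ms_per_mpix, megapixels):
    frame_ms = max(cpu_ms, gpu_ms_per_mpix * megapixels)
    return 1000.0 / frame_ms

CPU_MS = 16.0          # hypothetical CPU cost per frame (~62 fps ceiling)
GPU_MS_PER_MPIX = 3.0  # hypothetical GPU cost per megapixel

for label, mpix in [("1080p", 2.07), ("1440p", 3.69), ("4K", 8.29)]:
    print(f"{label:>6}: {fps(CPU_MS, GPU_MS_PER_MPIX, mpix):5.1f} fps")
# 1080p and 1440p give the SAME fps here: the 16 ms CPU cost dominates,
# so lowering resolution buys nothing until the GPU becomes the limit.
```

In this sketch, dropping from 1440p to 1080p changes nothing because the CPU is the ceiling; only at 4K does the GPU become the limiting factor, which matches the CPU-bound vs GPU-bound behaviour described above.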

It's the reason I retired my 4790K last year for a Ryzen 3600: it was bottlenecking my 3070 (which performs about the same as your 2080). Mine was not a silicon-lottery winner and would only run comfortably at 4.6 GHz on all cores. But it still lasted me so, so many years!

At 1080p you'll see roughly a 20-30% bottleneck with an RTX 2080, versus about 0-5% at 4K. Check out this link (Link to Bottleneck Calculator).

The Haswell/Devil's Canyon chips were amazing, but unfortunately they're really starting to show their age when paired with new mid- to high-end cards. They simply cannot keep up! It was when the i3 10100F started beating my 4790K in FPS that I decided on something a little newer. (Link to i3 10100f vs i7 4790k).

That's not to say your CPU is incapable or now worthless. It just means it limits what your card could do if the CPU had a higher IPC. One thing I've always suggested about upgrades: if you're getting the frame rates and smoothness you're happy with, don't spend £1000 to gain an extra 30-50 FPS beyond what your monitor can display anyway. As you can see, my old i7 4790K and a 1070 could run Horizons comfortably at 7680x1440. But the 3070 was held back until I got the Ryzen 3600.

Elite Horizons CPU I7 4790k GPU GTX 1070 7680x1440
Source: https://www.youtube.com/watch?v=zbinImBkjNA


Elite Odyssey CPU Ryzen 3600 GPU RTX 3070 7680x1440
Source: https://www.youtube.com/watch?v=8ivT7kj4J18
 
This isn't surprising.

At 1080p you will be hitting a CPU bottleneck, whereas at 4K the bottleneck will be the GPU.

If anything you want the bottleneck to lie on the GPU rather than the CPU.
 