Push graphics quality to the limits

There's a little extra sharpness... that's about it. I wonder what the FPS hit is for that, or at least the VRAM increase.

There are plenty of settings in that file, but I doubt just doubling all of the High/Ultra values would be advantageous to your GPU's health! ;)
 
There's a pretty clear difference between those two pics, and I don't exactly have 20/20 vision. Though granted, I'm short-sighted, so it wouldn't be an issue anyway. :p

Steam is indicative, yes, of people who game on PCs. Like I said, almost half of Steam gamers have 4GB of RAM or less. That's just too much of the market to simply abandon for the sake of lazy memory management.
 
Another thing that can increase image quality is DSR (if you have an NVIDIA card).
I play at 2K resolution and have no FPS drops most of the time. It's only when I arrive at a planet that isn't already loaded... stutter, stutter, stutter, even though I have an SSD.
I will run a diagnostic tool next time to see if there is a bottleneck causing this issue.

Edit:
I don't think it's bad to release 32-bit applications today, because there are many people with older PCs.
The best approach would be to release both 32-bit AND 64-bit executables for a game.
 
Actually, you are all wrong or misinformed to some degree. A 64-bit client has many advantages over a 32-bit client, the least of which is the increase in memory.

This guy gets it.

However, the next guy, saying that E:D is a poorly written PoS, is also right.

A GPU running at near 100% with the graphical fidelity of a 2009 game doesn't win them any awards, really.

Let's leave 64-bit for now. This game needs to have a profiler run on it from every possible angle. There's virtually nothing to render, yet it can chug on even moderately strong rigs.
 
I want a draw distance that prevents pop-in on everything. The slider in the options makes no difference whatsoever.
FD has told us this game will scale into the future, but right now the fans on my 970s don't even rev up at 4K....
 
A GPU running at near 100% with the graphical fidelity of a 2009 game doesn't win them any awards, really.

Let's leave 64-bit for now. This game needs to have a profiler run on it from every possible angle. There's virtually nothing to render, yet it can chug on even moderately strong rigs.

If you're not CPU (or software) bound, then your GPU will run at 100% regardless of the scene complexity.
 
32-bit apps can only allocate up to 2GB (or, if you push the limits, 3GB) of dynamic memory... but you're right, even 2GB should be enough for this game (at this point).

Err... 32-bit applications use 32-bit pointers internally, so they can only see a 4GB address space. They can, however, have nearly that entire 4GB to themselves if you run them on a 64-bit operating system with more than 4GB of RAM, as the OS can give the app its full address space while keeping itself in the machine's remaining memory. You are probably confusing this with 32-bit Windows, which never assigns a single app more than 2GB of address space unless you specifically tell it to (via the /3GB boot switch and a large-address-aware exe).
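
To make that concrete, here's a minimal C sketch (purely illustrative, nothing from ED's code) that keeps allocating until the address space runs out:

Code:
/* Minimal sketch (illustrative only): grab 64MB blocks until malloc
 * fails, to see roughly how much address space the process can claim.
 * Compiled as 32-bit, expect ~2GB by default; closer to 4GB on a
 * 64-bit OS if the exe is flagged large-address-aware. */
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    const size_t block = 64u * 1024u * 1024u;   /* 64MB per allocation */
    size_t total = 0;
    while (malloc(block) != NULL)   /* leaked on purpose; OS reclaims at exit */
        total += block;
    printf("Allocated ~%lu MB before running out\n",
           (unsigned long)(total / (1024u * 1024u)));
    return 0;
}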

It is also a lot faster to cache art assets (meshes, textures, all that lovely gubbins that makes the pretty) in RAM than to leave them on disk to load on demand. Taking my gaming rig as a base spec, ED can scoff data directly off the SSD at 550MB/s at best, with the physical bottleneck being the 6Gbit/s SATA3 bus. If the data were pre-cached in otherwise unused RAM, it would come over a 64-bit parallel bus at an effective 1.6GHz (I am only running DDR3-1600, not any of the speedy overclocked stuff), i.e. 8 bytes x 1.6GHz = 12.8GB/s to get that data to the GPU, assuming the memory bus to be the next bottleneck.
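
As a sketch of what pre-caching means in practice (textures.pak is a hypothetical file name; ED's real loader is obviously more involved), you just slurp the file into RAM once so later reads never touch the SATA bus:

Code:
/* Sketch, assuming a hypothetical asset file "textures.pak": load it
 * fully into RAM up front so later accesses hit memory (~12.8GB/s on
 * DDR3-1600) instead of the SATA3 bus (~550MB/s). */
#include <stdio.h>
#include <stdlib.h>

static unsigned char *cache_asset(const char *path, long *out_size) {
    FILE *f = fopen(path, "rb");
    if (!f) return NULL;
    fseek(f, 0, SEEK_END);
    long size = ftell(f);
    fseek(f, 0, SEEK_SET);
    unsigned char *buf = malloc((size_t)size);
    if (buf && fread(buf, 1, (size_t)size, f) != (size_t)size) {
        free(buf);
        buf = NULL;
    }
    fclose(f);
    if (buf) *out_size = size;
    return buf;   /* caller frees; reads now come from RAM, not disk */
}

int main(void) {
    long size = 0;
    unsigned char *data = cache_asset("textures.pak", &size);
    if (data) {
        printf("Cached %ld bytes in RAM\n", size);
        free(data);
    }
    return 0;
}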

Using those numbers as an indicative base, caching art assets in RAM would give ED a significant speed boost in getting data to the GPU: 12.8GB/s against 550MB/s is roughly a 23x improvement, an order of magnitude at least. A 64-bit ED executable would also have access to most of the 16GB I currently have installed for caching frequently needed art assets (Windows is greedy enough to never give an app all of it, although it does squash itself down to a little over 1GB when pushed).

Of course, as mentioned by Hexcaliber earlier in the thread, converting the exe is a non-trivial undertaking and would potentially leave two codebases (the 32-bit build and the 64-bit build) to maintain for whatever period of parallel running is decided upon. Only FDEV could provide a meaningful appraisal of whether the cost of building and maintaining a 64-bit client is outweighed by the additional utility (and potential performance gains) of providing one to users who have a 64-bit OS and more than 4GB of RAM.

Personally, I really hope they do, because knowing that 12GB of my RAM is sitting unused and unloved, when ED could have it simply for the asking, is slightly saddening ^^;
 
If you're not CPU (or software) bound, then your GPU will run at 100% regardless of the scene complexity.

We're assuming that v-sync is on. So, 60Hz in my case.

Even FC3 with full details in the middle of the jungle doesn't run at 100% GPU.
Considering the scene complexity inside the stations, a constant 60% would be generous.
 
Try downsampling; I'm running DSR on an Nvidia card and the game looks a bit better. AA is still borked, with lots of flickering on station toasters, jaggies, etc. And I don't think AF is working either.
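
For anyone unfamiliar with it, DSR simply renders each frame at a higher internal resolution and filters it back down to your display. Conceptually it's something like this toy box filter in C (NVIDIA's actual filter is a smarter Gaussian):

Code:
/* Toy sketch of downsampling: average each 2x2 block of a high-res
 * greyscale "frame" into one output pixel. DSR does the same kind of
 * thing every frame (with a smarter filter) before scan-out. */
#include <stdio.h>

#define W 4
#define H 4

int main(void) {
    const float hi[H][W] = {   /* a made-up 4x4 rendered frame */
        {0.0f, 1.0f, 0.2f, 0.4f},
        {1.0f, 0.0f, 0.6f, 0.8f},
        {0.5f, 0.5f, 1.0f, 1.0f},
        {0.5f, 0.5f, 0.0f, 0.0f},
    };
    for (int y = 0; y < H; y += 2) {
        for (int x = 0; x < W; x += 2) {
            float px = (hi[y][x] + hi[y][x + 1] +
                        hi[y + 1][x] + hi[y + 1][x + 1]) / 4.0f;
            printf("%.2f ", px);   /* one display pixel per 2x2 block */
        }
        printf("\n");
    }
    return 0;
}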
 
GFX card performance and its associated memory have a greater effect on image quality than moving to 64-bit would, although I do feel that ED should be caching much more into memory than it is. There are sooooooooo many other factors that affect performance even before you consider moving to 64-bit.

Remember, no matter how fast your kit is, it will only go as fast as the slowest component.
 
Remember, no matter how fast your kit is, it will only go as fast as the slowest component.
Not necessarily true. While that remains correct for benchmarking purposes, it's no longer the case with games (CPUs have come a LONG way). In (modern) gaming, the biggest impact will come from your GFX card.
 
We're assuming that v-sync is on. So, 60Hz in my case.

Even FC3 with full details in the middle of the jungle doesn't run at 100% GPU.
Considering the scene complexity inside the stations, a constant 60% would be generous.

Agree with this. I am just not seeing how ED is that graphically intensive, even inside a station. It is nothing compared to something like FC3 or its kin.

Blasphemy maybe, but are we sure the Cobra game engine is up to the task? Not that a switch is possible, but it is the hidden component behind everything.
 
I get jaggies and have AA problems as well (in the DK2... on screen it's no problem) with settings at Ultra. I am running a 4th-gen i7 @ 4.0GHz in a gaming motherboard, with 16GB RAM, a GTX 970, and an SSD. On my screen the game is crisp and butter smooth. In the DK2 I get stutter and judder around stations and really big/active systems... Around resource extraction sites it has gotten better in the last 2 patches.
 
We're assuming that v-sync is on. So, 60Hz in my case.

Even FC3 with full details in the middle of the jungle doesn't run at 100% GPU.
Considering the scene complexity inside the stations, a constant 60% would be generous.

What graphics card? My 7850 OC basically maxes out at around 65fps in stations. I'd be lucky to get half of that in Far Cry 3 at max settings.
 
I get jaggies and have AA problems as well (in the DK2... on screen it's no problem) with settings at Ultra. I am running a 4th-gen i7 @ 4.0GHz in a gaming motherboard, with 16GB RAM, a GTX 970, and an SSD. On my screen the game is crisp and butter smooth. In the DK2 I get stutter and judder around stations and really big/active systems... Around resource extraction sites it has gotten better in the last 2 patches.

You are always going to get jaggies with the DK2, as the resolution just isn't up to it (more accurately, the pixel density isn't up to it). Remember that there are some issues with Oculus support in ED. FD blame Oculus, Oculus blame ED; I suspect a mixture of both, but I lean a little towards the Oculus drivers being suboptimal. A lot also depends on how your system is set up with your Oculus. Direct mode isn't fully supported, so you'll see better performance with extended mode and with your Oculus set as your main monitor. Use something like VRChanger to handle that more easily.

With all the tweaking I've been doing over time, it is now very playable, and I only really notice the judder in stations; even that isn't too bad.

Edit: If you haven't done so already, it is recommended that you change the UI colour scheme to something heavy on the green side. It's not ideal, as the other colours end up different too, but with the green base you'll get an improvement in clarity, especially in the UI panel text. Ideally (and I think someone has already suggested this), FD should implement an official UI colour switcher with Oculus in mind, where the theme can be made green but all the other colours (scanner, HUD info and warnings, FSD countdown, etc.) remain the same as stock.
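
For anyone wondering why the other colours change too: as I understand it, the colour override remaps every HUD colour through a single 3x3 matrix, so shifting the base hue inevitably drags the scanner, warnings, and the rest along with it. A toy C sketch of the idea (matrix values made up for illustration, not a recommended scheme):

Code:
/* Toy sketch: remap an RGB colour through a 3x3 matrix, the same idea
 * behind the UI colour override. One matrix is applied to EVERY HUD
 * colour, which is why a green-heavy base also shifts the scanner,
 * warnings, etc. Matrix values below are made up for illustration. */
#include <stdio.h>

static void remap(const float m[3][3], const float in[3], float out[3]) {
    for (int i = 0; i < 3; i++) {
        float v = m[i][0] * in[0] + m[i][1] * in[1] + m[i][2] * in[2];
        out[i] = v > 1.0f ? 1.0f : (v < 0.0f ? 0.0f : v);   /* clamp */
    }
}

int main(void) {
    const float green_heavy[3][3] = {   /* push red energy into green */
        {0.1f, 0.0f, 0.0f},
        {0.9f, 1.0f, 0.0f},
        {0.0f, 0.0f, 1.0f},
    };
    const float hud_orange[3] = {1.0f, 0.6f, 0.0f};   /* stock-ish HUD tint */
    float result[3];
    remap(green_heavy, hud_orange, result);
    printf("Remapped RGB: %.2f %.2f %.2f\n", result[0], result[1], result[2]);
    return 0;
}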
 
This guy gets it.

However, the next guy, saying that E:D is a poorly written PoS, is also right.

A GPU running at near 100% with the graphical fidelity of a 2009 game doesn't win them any awards, really.

Let's leave 64-bit for now. This game needs to have a profiler run on it from every possible angle. There's virtually nothing to render, yet it can chug on even moderately strong rigs.

Well, using Ctrl-F shows me that I'm getting up to 200fps from my 100% GPU use, although that's a maximum and not an average (the average is around 120fps, dropping to maybe 90fps in busy stations or belts). That's on a 4GB GTX 760, so a 2-year-old card can triple an "acceptable" value of 60fps and still beat it even in worst-case scenarios.

Of course, I'm not running maximum graphics fidelity, because I'm also getting that 2-year-old card to run a Rift, and it seems that the Rift overhead is not just a few percent; those 90fps moments on a monitor translate to less than 75fps on a Rift, so I had to crank the details down to keep it at 75fps, or as close as possible, in all situations.

I'd also be interested to know what measurements you are using to determine the graphical fidelity of "a 2009 game", and how you are objectively measuring the output of ED as a comparison. It's going to sound trite, but to me the Xbox 360 is still managing to render contemporary titles (Destiny, CoD:AW) with very similar results to the Xbox One, so the majority of shader and post-processing capabilities have either been around for a decade already, or they literally aren't that noticeable an improvement.

As someone whose first gaming experience was on an old Pong clone, then the Atari 2600, I can truthfully say I'm not noticing any real "generational" increase in graphics quality over the last few years, only increases in computational grunt (framerates and/or resolutions supported). In other words, every new game looks like a 2009 title to me, so I don't see the problem, given its all-pervasive nature.
 
Originally Posted by Nestor
We're assuming that v-sync is on. So, 60Hz in my case.

Even FC3 with full details in the middle of the jungle doesn't run at 100% GPU.
Considering the scene complexity inside the stations, a constant 60% would be generous.

A side question regarding graphics: I have a 144Hz screen set to the recommended resolution of 1920x1080. In game, should v-sync be on?
 