Don't read this unless you have some serious spare time, you have been warned!
Apart from the video card, I have similar components to yours. I have no issues playing on the monitor at all; the problem is just with the Rift, and if other games work fine but not ED, I can't really blame the SDK, which is still held together with glue and duct tape, or my machine. That said, I see that many people have issues with "low" tier video cards, like the 7xx series, and it is sad...I won't spend 500 dollars on a video card for just one game, to play just ED...either Frontier finds a way to make it better, as other games do, or I guess I will have to play other games with the Rift, besides ED, which is sad.
- When you say you have no issues playing on the monitor at all, what FPS do you get in stations, at asteroid fields and in deep space with vsync/gsync off? If you're not getting 120+ FPS in stations with a 2D monitor, then you have little chance of using the DK2 without stutter; how tolerable that stutter is, is a personal experience for each user.
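To make that concrete, here is a minimal sketch of the kind of frame-time check anyone can run against their own render loop. This is not ED or Oculus SDK code; RenderOneFrame() is just a placeholder. The point is that the DK2 refreshes at 75 Hz, so any frame that regularly takes longer than ~13.3 ms will judder, regardless of what an average FPS counter says.

```cpp
// Minimal frame-time check: does each frame fit inside the DK2's 75 Hz
// budget (~13.3 ms)? RenderOneFrame() is a placeholder for your real
// draw + present call; nothing here is ED or Oculus SDK code.
#include <chrono>
#include <cstdio>

void RenderOneFrame() { /* placeholder: draw the scene and Present() */ }

int main()
{
    using clock = std::chrono::steady_clock;
    const double budgetMs = 1000.0 / 75.0;   // DK2 panel refresh
    int missed = 0;

    for (int frame = 0; frame < 1000; ++frame)
    {
        auto t0 = clock::now();
        RenderOneFrame();
        auto t1 = clock::now();

        double ms = std::chrono::duration<double, std::milli>(t1 - t0).count();
        if (ms > budgetMs)
            ++missed;
    }
    std::printf("Missed %d of 1000 frames (budget %.1f ms)\n", missed, budgetMs);
    return 0;
}
```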
- "Glue and duct tape"; I assume you are talking about the Occulus Rift SDK, (correct me if I am wrong)? Have you actually built anything against the Rift SDK? I am developing with it daily and it has it's good and bad points, but for a beta release API it is more than acceptable.
- Let's say by release FD manage to optimise their rendering engine by another 20%, (improbable at this late stage); will that make a difference to your performance issues? I am guessing not, therefore it's up to you to determine why your machine isn't performing like others with similar setups.
You have a valid point about the doubled number of render calls, since Oculus' approach is to use a single monitor divided in two and render the same scene twice, instead of using two different monitors (it is still two draw calls, but you gain performance since you can use the wait cycles while the card is switching GL context. If you ever wrote anything in OGL, you may remember that the GPU has to switch context between when the scene is rendered and when it is "composed" in memory). And still, you tell me that Star Citizen is able to do it, but not ED? A10 and any of the DCS World games are able to give decent performance; are these less intensive applications compared to ED? I thought that a flight sim was the most complex application you could have. It is not like you keep 400B systems in memory and do culling.
- ED uses a DirectX-based renderer; this has nothing to do with OpenGL.
- You as the programmer don't normally switch contexts in DirectX or OpenGL; it is one thread per context. The system switches between contexts for different windows on a desktop or when switching processes. If you use an API like Equalizer for OpenGL, then yes, you can use multiple contexts, but that is irrelevant for this discussion. DX11 also supports multi-threading and deferred contexts, but once again it is one thread per context; there is no unbinding/rebinding of contexts at the single-thread level.
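For anyone curious what "one thread per context" looks like in practice, here is a bare-bones sketch of DX11 deferred contexts (error handling stripped, and nothing to do with ED's actual engine): each worker thread records into its own deferred context, and only the immediate context ever executes the resulting command lists, so no context is unbound and rebound across threads.

```cpp
// D3D11 deferred contexts: one context per thread, single submission point.
// Error handling omitted for brevity; this is an illustrative sketch only.
#include <d3d11.h>
#include <thread>
#include <vector>
#pragma comment(lib, "d3d11.lib")

int main()
{
    ID3D11Device*        device    = nullptr;
    ID3D11DeviceContext* immediate = nullptr;

    D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                      nullptr, 0, D3D11_SDK_VERSION,
                      &device, nullptr, &immediate);

    const int workerCount = 2;
    std::vector<ID3D11CommandList*> commandLists(workerCount, nullptr);
    std::vector<std::thread> workers;

    for (int i = 0; i < workerCount; ++i)
    {
        workers.emplace_back([&, i]
        {
            // Each thread owns its deferred context; contexts are never
            // shared or rebound between threads.
            ID3D11DeviceContext* deferred = nullptr;
            device->CreateDeferredContext(0, &deferred);

            // ... record state changes and draw calls on 'deferred' here ...

            deferred->FinishCommandList(FALSE, &commandLists[i]);
            deferred->Release();
        });
    }
    for (auto& t : workers) t.join();

    // Single submission point: the immediate context plays back the lists.
    for (auto* cl : commandLists)
    {
        if (cl) { immediate->ExecuteCommandList(cl, FALSE); cl->Release(); }
    }

    immediate->Release();
    device->Release();
    return 0;
}
```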
- Remember, switching contexts used to be an extremely costly operation for the GPU, and to some extent it still is: all its onboard pipelines and caches get flushed, the driver flushes its shader caches and re-initialises its memory manager, etc.
- There are no wait states between GPU context switches; the GPU is effectively stalled whilst the old context is being flushed and the new one is being created. The process of sending draw calls to the GPU is already a deeply pipelined operation at the API level, at the driver level and even internally on the GPU. You NEVER want to stall the GPU; in an ideal game the GPU runs at 100%, (no VSync), all the time, and the CPU cores run at less than 100%, happily feeding it batched draw calls, handling input, streaming resources, performing AI, procedurally generating content, (in ED's case), etc.
- What exactly does Star Citizen do that ED doesn't with respect to the DK2?
- Of course 400 billion systems aren't stored client side; the galaxy has been pre-generated server side and it feeds local slices, (infinitesimally small), to each island host, (multiplayer), or individual client, (solo); the client then generates a new scene based on that data.
- With the DK2, you have two separate cameras; your scene graph/render lists need to be traversed, culled and converted into draw primitives twice. There is no shortcut, no wait states, nothing; it's just twice the work. You can completely cheat by rendering a normal single image into a render buffer and then reconstructing a left and right view from the depth buffer, generating position information and producing new left and right images in post-processing shaders, but it is a hack and usually gives poor results.
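To illustrate the "twice the work" point, here is a structural sketch of per-eye rendering. Scene, CullVisible, SetViewportHalf and DrawList are hypothetical stand-ins, not ED's or the Oculus SDK's actual API, and the matrix convention is just one common choice; the only thing it is meant to show is that traversal, culling and draw submission all happen once per eye.

```cpp
// Structural sketch of per-eye stereo rendering: the scene is culled and
// drawn once for each eye into its half of the panel. Scene, CullVisible,
// SetViewportHalf and DrawList are hypothetical stand-ins, not ED's or the
// Oculus SDK's actual API.
#include <DirectXMath.h>
#include <vector>
using namespace DirectX;

struct Drawable { /* mesh, material, world transform, ... */ };
struct Scene    { std::vector<Drawable> objects; };

// Placeholder helpers a real engine would supply.
std::vector<const Drawable*> CullVisible(const Scene& s, const XMMATRIX& /*viewProj*/)
{
    std::vector<const Drawable*> out;            // real frustum culling goes here
    for (const auto& d : s.objects) out.push_back(&d);
    return out;
}
void SetViewportHalf(bool /*leftHalf*/) { /* e.g. RSSetViewports on half the panel */ }
void DrawList(const std::vector<const Drawable*>&, const XMMATRIX&) { /* issue draw calls */ }

void RenderStereoFrame(const Scene& scene, const XMMATRIX& centerView,
                       const XMMATRIX& proj, float ipdMeters)
{
    const float half = ipdMeters * 0.5f;

    for (int eye = 0; eye < 2; ++eye)            // 0 = left, 1 = right
    {
        // Shift the view sideways by half the IPD; the sign and multiply
        // order depend on your matrix convention (row-vector assumed here).
        const float offset = (eye == 0) ? +half : -half;
        XMMATRIX eyeView   = XMMatrixMultiply(centerView, XMMatrixTranslation(offset, 0.0f, 0.0f));
        XMMATRIX viewProj  = XMMatrixMultiply(eyeView, proj);

        // Twice the work: traversal, culling and draw submission per eye.
        auto visible = CullVisible(scene, viewProj);
        SetViewportHalf(eye == 0);
        DrawList(visible, viewProj);
    }
}
```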
And you don't get headaches or sore eyes trying to read the text? I suppose you have very good eyes then; you don't wear corrective lenses and you are not in your 40s.

My nephew is 19, plays night and day and has no problems with his vision, and he can't play either, for more than an hour or so, and not daily. Not every person is the same, and that's why I am looking to see if the problem is mine and mine only, or if there is something else.
- I am 47 years old; when I first received my DK2 kit, the A lenses that I tried were horrible, (I am slightly near-sighted and have an IPD of 68mm). After a couple of weeks of experimenting, I modified the B lenses, (to be 68mm apart), and rebuilt the DK2 examples with a modified HMDInfo.LensSeparationInMeters = 0.068 and, guess what, voila, everything was suddenly in focus and I could finally see the actual PenTile pattern as the screen door effect that a lot of people had been whining about on the Oculus forums. The CV1 without a shadow of a doubt needs physically adjustable IPD, (just like a set of binoculars), that the API can monitor or be adjusted to; the current optic system with its sweet spot at 63.5mm +/- 2mm just doesn't cut it in my opinion.
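For anyone wondering why a few millimetres of lens separation matters so much, here is a back-of-the-envelope sketch using the projection-centre formula described in the early Oculus SDK documentation. The DK2 runtime does this internally; the 0.126 m panel width below is an assumption on my part, so substitute whatever value your SDK reports. The point is just how sensitive the per-eye projection offset is to that one number.

```cpp
// Sketch: per-eye projection-centre offset from lens separation, following
// the formula in the early Oculus SDK documentation. The 0.126 m panel
// width is an assumed value, not an official DK2 figure.
#include <cstdio>

// Normalised horizontal projection-centre offset for one eye.
double ProjectionCenterOffset(double lensSeparationMeters, double screenWidthMeters)
{
    double viewCenter = screenWidthMeters * 0.25;                  // centre of one half of the panel
    double shift      = viewCenter - lensSeparationMeters * 0.5;   // lens centre vs. view centre
    return 4.0 * shift / screenWidthMeters;                        // normalised to clip space
}

int main()
{
    const double panelWidth = 0.126;   // assumed DK2 panel width in metres
    std::printf("default 63.5 mm: offset = %+.4f\n", ProjectionCenterOffset(0.0635, panelWidth));
    std::printf("modified 68 mm:  offset = %+.4f\n", ProjectionCenterOffset(0.0680, panelWidth));
    return 0;
}
```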
- Now I use VRGear Interceptor, a small Windows app that intercepts the DK2 API's messages and reports a user-defined screen separation that matches my modified B lenses; once again, voila, I now have perfectly in-focus DK2 imagery for all applications, including ED.
http://www.vr-gear.com/
- Note I am not affiliated with them in any way, and even though I did order the attachments, I ended up using my own previously modified B lenses with their software.
This sentence sounds similar to "Poo is good, millions of flies can't be wrong"; there is a reason why the majority does not always hold the absolute truth.
- Weird analogy; here's how I see it: there is no absolute truth, there is a majority truth, there are multiple minority truths and then there is your truth. Your truth is an isolated experience; what you have observed and experienced may have been observed and experienced by others in the minority groups, but not by the majority group.
- How do I know this? Well, it's extremely simple logic: if the majority experienced what you are experiencing, then the DK2 just would not work, and it plainly does, for a lot of people, including me.
I understand that for many of us, playing this game is something that would minimize any kind of issue.
I am not saying that the Rift does not work in game; I am giving my experience, which is shared by others who, like me, cannot play this game decently while they can play other games without problems.
I could waste time searching for a solution, but my time is much more valuable when invested in more pressing matters, so to me there is no real issue, besides the disappointment (one of many, after all) of not being able to enjoy the game fully with the Rift, like I do with other games.
- I understand your experience and I am not trying to minimise it, as I expressed in my earlier post; but you are projecting your experiences onto others. Go through the huge VR thread and collate how many users share experiences similar to yours and how many share experiences similar to mine.
No apocalyptic forecasts were made here: I said "play at your own risk".
But it seems that for many, anything that could shake the solid ground of the comfort zone they have built around this game is something to avoid at all costs.
- I won't argue this point with you; it just has too much flammable potential, so I will leave it at that.
No big deal, it works for so many people, so the problem does not exist...it is my computer or my eyes or who knows what else...Why optimize when people buy bigger and faster systems? The cost of a card is five hours of an engineer's work, so looking at the money saved, it is a better choice to save on optimization.
- I think you are really making light of the work and optimisation that has already gone into ED. I have been creating 3D engines for a very long time, right from the pre-hardware days when you scan-line filled the individual triangles yourself on the CPU. I don't share your views; I don't think it is perfect, but from what I have seen it is mostly there in terms of optimisation, especially given the server-side size of the galaxy and the real-time generation of procedural content client side.
Thanks for your feedback; I will keep it in mind for the future.
This is an enjoyable discussion, I hope it continues!