Image convergence for distance, Oculus Rift CV1

Whenever I play ED with the CV1, it always looks as though the skybox, the horizon (when landed), etc, are projected onto a screen about 20 or so feet (7m) away, rather than stretching out to infinite distance. In the graphics settings, there's an option for 3D separation that appears to be disabled when using HMDs. I was wondering if there is some other way to adjust the stereoscopic image separation for distant objects? Everything in the near range, such as the cockpit interior, looks fine. But beyond the cockpit, it looks like the world compresses into a narrow depth. Has anyone else noticed this?
 
Absolutely agree! The skybox sucks compared to the interior of the cockpit. I notice this effect in other games too: close objects are great in VR, but distant objects look very flat and 2D. Not sure if they can do anything about that with present VR tech, though...
 
This might be more prevalent on planets. When I am flying out in the black things feel pretty damn far away from me.
 
Is this because you have your gamma too high and everything is too bright? I've heard this complaint before, but I never really noticed it until I turned my gamma up during testing. I kept my gamma at 0 while I played on a monitor, and now have about 2 clicks with the Rift, and the stars look like they are "infinity" away to me. But at about 1/2 gamma, it does look like there is a dark gray wall with stars on it.

Meh, I'm going to have to look at it again to see if things are still the same after so many updates.
 
In real life, if you wrap your hands around your eyes like you're wearing blinders and then look at a clear night sky, how far away do the stars and other objects look?

I haven't tried this, but I suspect the answer is not too far because things at a distance lose their dimensionality naturally.
 
If you have the hardware for it: Supersample a bit (favour the Oculus supersampling option over the ones offered by the game), and increase the size of the skybox bitmaps (GraphicsConfiguration.xml: <GalaxyBackground>/{your quality level}/<TextureSize>).
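To make the skybox edit above concrete, here is roughly what the relevant fragment of GraphicsConfiguration.xml looks like. The element path (<GalaxyBackground>/{quality level}/<TextureSize>) comes from the post above; the surrounding structure, the quality-level name, and the default value shown are assumptions from memory, so check your own file before editing, and keep a backup, since game updates can overwrite it.

```xml
<GraphicsConfig>
  <GalaxyBackground>
    <High>
      <!-- Raise from the default (often 4096) to sharpen distant stars.
           Higher values cost VRAM and loading time. -->
      <TextureSize>8192</TextureSize>
    </High>
  </GalaxyBackground>
</GraphicsConfig>
```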
(EDIT: Bright stars will still be rendered as large smudges/blotches, rather than points of light, but the dimmer ones should look less painted on, at least)
 
Whenever I play ED with the CV1, it always looks as though the skybox, the horizon (when landed), etc, are projected onto a screen about 20 or so feet (7m) away, rather than stretching out to infinite distance.
I assume you must be talking about outside the cockpit, because some cockpits are so large their glass looks 20 feet away. I paid a lot of attention tonight, and I think gamma does have an effect, but I've always kept things pretty dark. If I'm out in space with just stars, I can sometimes trick myself into thinking they are just 20-30 feet away. But if you add a planet into the mix, that all falls apart for me. Even if I imagine a small moon as a ball or a dirt clod, as I get closer it eventually grows huge. Even if I thought it was right outside the cockpit, it would grow to the size of a house or larger, so how can it be only a short distance away? And if you let your imagination take you, you feel miles up in the "air", and as you go lower you see huge craters or mountain ranges that must be hundreds of miles away, so how can the stars be just 20 feet? Yeah, if I want to, I can force myself to think it is all small and close, but it's easier just to go with the flow.

Btw, how far away do things look on a 2D monitor - 2 feet? :rolleyes:

Maybe try some transcendental meditation before your next flight to clear your mind of bad thoughts and to prepare for space.

Btw, I ventured out to look at the stars, and they actually didn't look too far away. Certainly a lot farther than 20 feet, but still within walking distance - it was hard to estimate.
 
It's not a gamma issue, or brightness or anything like that. I'm talking about the depth perception. The skybox literally looks like it's only about 20 feet away, due to the spacing between the eyes and centering of the images. This tells me that anything "infinitely" far away is slightly closer to the center (toward the nose in the display) for both eyes than it should be. Anything at a significant distance should completely lose any sense of depth, since it's so far away, but instead it looks only 20 feet out, like a wall on the other side of a large room. It may be that I'm just more sensitive to depth perception than many people, but I can tell that it's not quite right, at least with my setup.

Does anyone know if Elite and/or the Oculus configuration software dynamically take into account the lens separation, via the slider on the underside of the headset, or is it configured only once during the initial setup of the Oculus configuration? I'm wondering if I need to re-run the setup, with the lenses set further apart, to force the 3D eye separation to be a little larger? I had configured it as per the instructions, based on what gave me the best clarity in the center of the image.
 
If you consider that you are still looking at a 2D image projection, and that VR (like 3D shutter glasses) only creates the depth effect by presenting a slightly offset image to each eye, you have to expect it will be less than perfect. You can adjust gamma, shadows, etc. to help trick the brain to your particular perceptions, but convincing the brain that small dots in the sky are at great distance likely isn't easy, given the tech we have. We also have lighting issues in ED that likely influence this as well. When looking at the night sky in RL it can be hard to appreciate the great distances involved too, especially under poor lighting conditions. The effect can also diminish as one gets used to it, depending on how much time is spent in ED looking for it. I try to mix up my VR play to keep it fresh, and take breaks to do other things, like hit the golf course. That seems to help as far as my perception in VR goes. YMMV.
 
Actually depth is quite easy, since that's the whole point of having two separate images for your eyes. :) That's why near things look near and far things look far. Most of it is pretty accurate, but it breaks down at a distance when the separation is off by even a little bit, and I think it's probably down to just a couple of pixels. What's hard is that everyone's eye separation is different. The slider on the headset detects where you've set it, so that the software can try to get the depth perception correct, but my impression is that ED ignores what the Oculus is trying to do and sets the image separation in the headset to what it wants: a conservative spacing that works for most people, but results in compressed depth for most. I can't prove it, though, and it's quite possible that it comes down to the initial setup in the Oculus software, but I'm just not sure. I was hoping someone had experimented with this already and could give some tips. If I could just tweak the image centering/separation a tiny bit, it would be great.
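The "couple of pixels" claim is easy to sanity-check with a back-of-the-envelope vergence calculation. All the numbers below are rough assumptions (approximate CV1-class resolution and field of view, a 64mm IPD), not measured specs; the point is just the order of magnitude:

```python
import math

# Rough CV1-ish numbers -- assumptions for illustration, not real specs.
IPD_M = 0.064       # eye separation in metres
FOV_DEG = 90.0      # approximate horizontal field of view per eye
EYE_RES_X = 1080    # approximate horizontal pixels per eye

def perceived_distance(pixel_error_per_eye: float) -> float:
    """Distance at which an object at infinity appears to sit, if each
    eye's image is shifted inward (toward the nose) by the given number
    of pixels. Small-angle vergence approximation: d = IPD / angle."""
    rad_per_pixel = math.radians(FOV_DEG) / EYE_RES_X
    convergence_angle = 2 * pixel_error_per_eye * rad_per_pixel
    return IPD_M / convergence_angle

for px in (1, 2, 3):
    print(f"{px} px inward per eye -> infinity appears ~{perceived_distance(px):.0f} m away")
```

Under these assumptions, a 3-pixel inward shift per eye already pulls "infinity" in to roughly 7m, which is right around the 20-feet figure reported above, so a couple of pixels really is all it would take.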
 
I don't know if it's due to my glasses, but it sounds like the IPD (Inter-Pupil Distance) that gives me the best clarity, is not correct for proper depth perception of distant objects. Apparently it can be adjusted on the fly. I'll have to try this.

EDIT: Apparently the Oculus software has a default IPD number that it sends to games, regardless of where you set the lens separation slider on the headset. This leads to some people having incorrect depth perception more than others, depending on their actual eye separation (IPD).

https://www.reddit.com/r/oculus/comments/4m8n7q/cv1_lens_space_adjust_is_not_the_ipd_setting/

https://www.reddit.com/r/oculus/comments/4h0hfl/can_i_run_just_the_ipd_calibration_by_itself/

https://www.reddit.com/r/oculus/comments/29f1t8/at_what_distance_does_stereo_vision_end_30ft/

And it looks like there's an old thread about this, which I may need to read now:

https://forums.frontier.co.uk/showthread.php/285680-Lack-of-3D-depth-just-me-or

EDIT: And a hack for the DK2 that discusses the issue:

https://www.reddit.com/r/oculus/comments/4epjud/dk2_ipd_adjustment_on_13_real_ugly_fix/
 
The sending-fixed-IPD-to-the-game-instead-of-what-the-dial-says bug was fixed ages ago, wasn't it?

(EDIT: There have been one or two people reporting (and sending back for repair) their HMDs having faulty potentiometers, though, so that their IPD settings did not get reported to the driver.)
 
I may have to consider opening a ticket with them then. The setup acknowledges the IPD, displaying a changing number on the screen as I move the slider; however, the depth is unchanged.
 
I am pretty sure that means it's working. The in-game cameras are moving with the real-world screens, and you see the result, just not necessarily through the centre of the lenses. If the change was only on one of the sides of the real/virtual divide, then you'd see the world rescaling.
 
Possibly. But distant objects are still being drawn with too central of a convergence. I'm just having a hard time figuring out if it's the Rift's fault, or ED's, and I'd love to be able to tweak it.
 
For what it's worth, I don't believe I see anything of the sort here; however far the skybox appears to be to me, it's certainly not in front of the mountains it crowns, here where I am landed. I know from occasional experience with injection drivers that it would most certainly cause me severe eyestrain if it did.

Things outside the cockpit always look small and close, to me, mind, with stars and planets seeming like beachballs, and the station mailslot looking like I couldn't possibly fit through it, even with a sidewinder, unless there is another ship passing through it at the moment (EDIT: ...offering a size reference) -- I could swear my feet dig through the landing pad, every time I dive down to land. :p

Maybe this is due to ED rendering stuff wrong, but there is inevitably the matter of stereopsis only working at a rather close range. Seated as little as 6m up and hoisting something down (IRL), I can't reliably tell the distance between the ground and the object on the hook, and keep having to reel out rope long after I think I should be done. Beyond that range, other depth cues become important, such as accommodation (eye focus), motion parallax, perspective, known-size objects for reference, the density and sharpness of detail on things, motion speed, atmospheric occlusion and dispersion, etc., and many of these do not apply in a helpful manner in airless space with superluminal travel. :7
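The "stereopsis only works at close range" point can be put in rough numbers with d = IPD / disparity-threshold. The threshold values below are assumptions (lab stereoacuity figures of tens of arcseconds are often quoted, while everyday performance is far worse), so treat the output as order-of-magnitude only:

```python
import math

IPD_M = 0.065  # assumed eye separation in metres

def stereo_cutoff(threshold_arcsec: float) -> float:
    """Distance beyond which an object's binocular disparity drops below
    a given discrimination threshold: d = IPD / theta."""
    theta = math.radians(threshold_arcsec / 3600.0)
    return IPD_M / theta

# Assumed thresholds: ~20 arcsec under ideal lab conditions,
# a few arcminutes as a rough everyday figure.
print(f"20 arcsec threshold -> stereo cutoff ~{stereo_cutoff(20):.0f} m")
print(f"5 arcmin threshold  -> stereo cutoff ~{stereo_cutoff(300):.0f} m")
```

Even the optimistic lab figure puts the cutoff in the hundreds of metres, and the everyday figure in the tens, which matches the 6m crane anecdote above: past that, the other depth cues listed have to carry the load.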

(EDIT: You may want to try an experience called "Titans of Space", if you haven't already; it gives you a sightseeing tour of our local planets and moons, deliberately rendered at a 1:1,000,000 scale, because sizing them down to within the range where stereopsis works gives a sense of scale that you just can't get at their enormous real sizes.)

I suggested doing some supersampling, if possible, because at the very least it helps some with the detail-density matter, even despite the low resolution current HMDs have. With enough SS, the screen door effect comes to feel more like actually looking through a screen door, instead of being something affixed directly to the image. The stars still look like big painted dots instead of points of light, because that's what they are, but at least they become more stable and don't "shimmer" when you look around. (...and if you, like myself, think the large dots for stars are an ugly art direction decision, know that at one time, before 1.0, the star field was made up of the old four-point flares, which looked equally crayon-drawn. :p)

Hopefully generation two or three HMDs will let us focus freely, instead of having the entire image on our retinas in focus simultaneously at all times (maybe using lightfield, holographic, or multifocal displays, or maybe something entirely different). For now: make sure to turn off any depth-of-field effect; it only serves to produce an interesting, but almost always counterproductive, "tilt-shift photography" scale-model effect. Other "cinematic" post effects, such as bloom, film grain, and lens flare, also tend to plaster themselves to your face, ruining immersion. :7
 

The problem you have with depth is normal, and everyone experiences it with current-generation HMDs. The rendering resolution is not enough to properly convey the depth of distant objects.

Objects just a few dozen meters away will already present the flatness problem.

There is a trick that I'm using in my own space game, which is a switch between two interpupillary distances: "normal", at about 6.5cm, where you see the cockpit and everything else normally but distant objects look flat; and "giant", where I simply multiply the IPD by 100, giving 6.5m of distance between the virtual eyes (in the universe rendering, not on your HMD, of course). The resulting effect is that the bigger and farther objects show great depth, but because of the exaggerated virtual interpupillary distance (it is similar to how an enormous giant would see the world), you can't see your own cockpit anymore (it is like a small ant between the giant's eyes, let's say).
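The scale effect of that hyperstereo trick follows from similar triangles: multiplying the virtual eye separation by a factor k makes the whole scene read as if it were at 1/k scale. A tiny sketch of the relationship (the function name and numbers are illustrative, not taken from the game described above):

```python
REAL_IPD_M = 0.065  # assumed real-world eye separation

def apparent_distance(true_distance_m: float, virtual_ipd_m: float) -> float:
    """With an exaggerated virtual IPD, the world is effectively scaled
    down by real_IPD / virtual_IPD, so distances read proportionally
    smaller to the viewer."""
    return true_distance_m * (REAL_IPD_M / virtual_ipd_m)

# 100x IPD (6.5 m between virtual eyes), as described above:
# a moon 1000 km away reads like a model ~10 km away.
print(f"{apparent_distance(1_000_000, 6.5):.1f} m")
```

This also shows why the cockpit vanishes in "giant" mode: at 100x IPD a cockpit 1m away reads as if it were 1cm away, well inside the near clip and between the virtual eyes.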

I am not sure if it is allowed to post a reference to a sort of "competing" game over here, but considering that my game is a much smaller, single-programmer production with no official budget, maybe no one gets offended. So here goes the first YouTube video, which I released just today:

https://www.youtube.com/watch?v=ysvn0g2pQ50

Of course it's a much smaller game, with a much smaller budget (I work alone on everything, from the engine built from scratch to the procedural models). But it is tiny in download size -- less than 16 MB -- and runs directly from the web on all major operating systems, including mobile. It just opens and becomes ready to play.

Many features are not shown in the video yet, for example Google Cardboard and Oculus Rift stereoscopic rendering (which it has, and which works pretty well already), a realtime in-game editor accessible to all players, combat, meeting on space stations and planets, and Leap Motion and Kinect support.

Although the game is not too similar to Elite in gameplay, it also sports a fully procedural galaxy, but with its own gameplay feel and characteristics.

Regards.
 
I did open a ticket with them, and they're asking me to take pictures with a camera, through each lens, as well as take a screenshot from the mirror on the monitor. Plus check my driver versions, etc, etc. I'll follow up on that.

But yes, my fear is that it's a limitation of what the HMDs can do, which I wouldn't expect, since technically infinite distance just means that the center of each screen's image needs to be centered with respect to the eye/IPD, so that the eyes can have parallel view angles.
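The parallel-view point above can be sketched with a toy pinhole projection (all numbers are assumptions): each eye's camera is just the head position shifted by ±IPD/2, and the left/right disparity of a point falls to zero as its distance goes to infinity. Any fixed inward bias added on top of that is therefore what pulls "infinity" in to a finite distance.

```python
IPD = 0.064  # assumed eye separation in metres

def screen_x(point_x: float, point_z: float, eye_x: float,
             focal: float = 1.0) -> float:
    """Pinhole projection of a world point as seen from an eye at
    (eye_x, 0, 0). point_z is negative in front of the viewer."""
    return focal * (point_x - eye_x) / -point_z

def disparity(point_x: float, point_z: float) -> float:
    """Horizontal disparity between the left- and right-eye projections."""
    left = screen_x(point_x, point_z, -IPD / 2)
    right = screen_x(point_x, point_z, +IPD / 2)
    return left - right

print(disparity(1.0, -2.0))   # nearby point: clearly nonzero (~0.032)
print(disparity(1e9, -2e9))   # same direction, ~infinitely far: ~0
```

So with correctly centered images, distant objects should project to the same screen position in both eyes; a residual offset of even a pixel or two is enough to create the "wall at 20 feet" impression.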

The thing is (perhaps unfortunately in this case), I have pretty decent stereopsis sensitivity. The skybox doesn't look closer to me than the mountains, no, but rather everything outside of the cockpit gradually compresses into a narrower distance, the further it gets. The farthest anything looks to me, is about 20-30 feet, whether it's a space station interior, the skybox, other ships, etc. I lose a sense of scale and distance a lot sooner (distance wise) than I should. The landing pads in the bases look like they're on a projection screen, rather than actually being there. If it weren't for the fact that I see this effect gradually come in over a range of distances, such as when driving around in the SRV, I would have thought the entire outside world was rendered onto a sphere around the cockpit at close range.

Some other programs I have can make this less noticeable due to other effects, such as haze/fog, parallax, and other such cues that have been brought up here. But it's entirely a stereopsis issue that's bugging me. I didn't notice it until I played with "Discovering Space", and then Elite, where you can have unobstructed view to things that are very far away.

Truth is, it's a pretty minor issue, and if I've been playing for over an hour, I can forget that it's there. My brain will start accepting it as true distance again, just not with "infinite" distance that you should perceive for anything 100m-200m and further. When I first put the HMD on, it smacks me in the face.

I can get used to it. I'll just moan about it. ;)

EDIT: It occurs to me that they (Oculus) probably did this intentionally. They probably erred on the side of caution, and opted to lean toward a smaller match of your IPD, rather than risk going too large. If your eyes have to spread wider than parallel at any time, nothing will feel right, and lots of people will get headaches, etc. So they probably went with a safer configuration that most people won't notice. I would really like a slider in the settings to add a manual IPD offset, or a checkbox to turn off safe IPD estimation, or something along those lines.
 