Lack of 3D depth, just me or...?

Hello, just been reading up on this thread (been away for a few days). I have gotten a lot of good explanations that make sense, but I can't shake the feeling that something is off. Let me try another set of questions, then (thank you to anyone still willing to be bothered). To all of you using a CV1: in the ledge/metropolis, do you experience vertigo and/or a sense of height? And finally, have any of you experienced a loss of immersion in situations where you experienced it earlier?

DK2 user here. It's been a while since I've played ED, btw.

But yes, I agree something feels off! And I think (not sure) it started after Oculus updated their runtime to 1.3 and/or later versions.

Prior to 1.3, space stations felt HUGE and intimidating. Maybe I just got used to it? I don't really know for sure. But now they are, well, still huge, but not as intimidating; the feeling of "being there" isn't as strong.

Now, when approaching a landable planet (ED Horizons), the planet doesn't seem big at all. Before you get to the "glide" phase, while approaching it, it should be hundreds of km away, but it feels like I could bounce a ball off of it. It looks small and no more than 20 meters away (it's been a while since I played, but something like that). Then as you approach, I guess they just inflate the planet. I can't quite remember noticing that earlier (pre-Oculus 1.3?).

Same thing with other games. In Project Cars and Dirt Rally it's hard to judge distances, or indeed see depth, outside of the cockpit/car. (I even refunded DR due to this, though truth be told I'm not much of a VR racer anyway, but that was the reason, as it annoyed me.)

It is, however, easy to see depth in the area near you, like the cockpit and its instruments, the avatar, etc. (though as someone stated, the avatar size is a little bit off to me as well due to IPD, but no biggie; the rest does look very good).

And yes, according to the all-knowing Wikipedia, and as others have pointed out, human depth perception due to convergence doesn't apply beyond about 10 meters. I'm no expert, and I think 10 meters sounds too close, but OK, maybe/probably that's how it is. But I still think something is off! My guess is that the skybox is rendered too close, or the far point / "infinity" is at the wrong convergence (the 50 feet mentioned by several others). I guess the IPD is also a factor here, as is how the skybox is displayed (or horizontally translated) in the left and right views of the VR HMD (the same with other "distant" objects).

It is a very subtle thing though, as the head tracking still works (of course), as do other depth cues (scale). So it's still pretty immersive. But I can't shake the feeling that something is off.
 
The wiki information is correct; beyond 10m it's the brain that works out scale and distance. The fact is that in some environments the brain finds this hard. An example would be judging the altitudes of airliners as they fly over. These aircraft are basically the same shape but vary in size, thus giving the brain difficulty. Same with craters and celestial objects. Go wandering in the desert and you'll have trouble judging distance.
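To put rough numbers on that 10m figure: the vergence angle your eyes make when fixating a point shrinks quickly with distance, which is why convergence stops being a useful cue surprisingly close in. A quick sketch (Python, assuming the 63mm IPD mentioned later in this thread):

```python
import math

def vergence_angle_deg(distance_m, ipd_m=0.063):
    """Angle between the two eyes' lines of sight when fixating
    a point straight ahead at the given distance."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

for d in (0.5, 1.0, 10.0, 100.0):
    # roughly 7.2, 3.6, 0.36 and 0.036 degrees respectively
    print(f"{d:>6} m -> {vergence_angle_deg(d):.3f} deg")
```

Past 10m the angle is already under half a degree, so whether "no stereo beyond 10m" is exactly right or not, the cue clearly fades fast.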
 

Have you applied the IPD hack to your DK2 and set it to your IPD? If not, that would explain the scale issues you are seeing. More details can be found here:

https://www.reddit.com/r/oculus/comments/4epjud/dk2_ipd_adjustment_on_13_real_ugly_fix/
 
No, I haven't. Kind of an intimidating procedure, but that does look promising! (Though it seems the hex pattern to search for is no longer valid for runtime 1.6, as per the comments below.)
But thanks for the tip!

The hex patterns are still in the latest runtime. There are 6 patterns in total: 3 for the 64-bit DLL and 3 for the 32-bit DLL.
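For anyone who wants to verify that before patching, here's a minimal sketch of how you could scan a DLL for a byte pattern (Python; the actual hex patterns and file locations are in the linked Reddit post, not reproduced here - the pattern and path in the comment are purely placeholders):

```python
def find_pattern(path, pattern):
    """Return every byte offset at which `pattern` occurs in the file."""
    with open(path, "rb") as f:
        data = f.read()
    hits = []
    i = data.find(pattern)
    while i != -1:
        hits.append(i)
        i = data.find(pattern, i + 1)
    return hits

# Hypothetical usage - substitute the real DLL path and hex bytes from the guide:
# find_pattern("path/to/runtime.dll", bytes.fromhex("deadbeef"))
```

If this returns offsets for all 3 patterns in each DLL, the patch locations still exist in your runtime version.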
 
I found this interesting thread on the oculus forum.

https://forums.oculus.com/vip/discussion/23053/game-developers-make-stereographic-3d-an-option

The first post of that thread also links to another thread that's interesting.

What I gather from this is that it is possible for a game or app not to render in true stereoscopic 3D, and that this might have a positive effect on VR sickness, for example, or it's done for other reasons. This could explain my perceived lack of 3D or depth, right?
 
Sorry for not reading all the responses and probably just regurgitating what's already been said 20 times, but outside of the cockpit is supposed to be flat, as everything you see is beyond 4-5 meters, so no parallax would be noticeable, just like in real life. It's not a design thing, don't get me wrong. Those things are rendered properly in stereoscopic 3D, and if you got close enough to them they would exhibit parallax perfectly fine, and there are plenty of situations where you can make that happen. This has been confirmed by a designer, although I can't remember which thread I saw it in. It was a big deal at the time and he confirmed it over and over.
 
Yeah, I understand this, and I understand that when simulating piloting spaceships we lose a lot of depth cues. Oh well, weirdly enough I tried The Impossible Travel Agency and that app gave me near-presence and a good feeling of depth and 3D. Going to try some other experiences and see what works best.
 
Have you got Horizons?
 
Planet surfaces and stations have depth because they are 3D rendered. Space lacks depth because it is a pre-rendered skydome; all the stars are on the same plane. It would be great to have parallax changes, with distant stars rendered on different planes.
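A rough way to see why a single-plane skydome matters: head-motion parallax falls off linearly with distance, so a dome hung at tens of metres behaves very differently from genuinely distant objects. A sketch (Python; the 20m dome distance below is just the "50 feet" figure mentioned elsewhere in this thread, and the head shift is an arbitrary lean):

```python
import math

def parallax_arcmin(head_shift_m, distance_m):
    """Apparent angular shift (arcminutes) of a point when the viewpoint
    moves sideways by head_shift_m; small-angle approximation."""
    return math.degrees(head_shift_m / distance_m) * 60.0

head = 0.2  # lean your head 20 cm sideways
print(parallax_arcmin(head, 20))       # skydome at ~20 m: a large, easily visible shift
print(parallax_arcmin(head, 10_000))   # a station 10 km out: well under an arcminute
print(parallax_arcmin(head, 9.46e15))  # a real star 1 light-year away: effectively zero
```

So a dome at ~20m would actually show noticeable parallax if you lean, while anything at realistic astronomical distances would not - which is why stars painted on one nearby plane can read as "wrong" either way.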

Anyway, in real life you can't see any 3D beyond about 80-100 meters away, because the vergence angle between your eyes becomes almost zero, and this angle decreases with distance.

https://www.quora.com/How-are-we-able-to-see-3D-objects-when-our-retina-is-just-a-2D-screen

http://i65.tinypic.com/nei5xk.jpg
 
This, exactly. When it comes to wanting a feeling of depth beyond the cockpit in a space game, depth is going to be near impossible when projected correctly, as it is in Elite, because the distances in space are so great.
 
... [ED is] rendered properly in stereoscopic 3D and if you got close enough to them they would exhibit parallax perfectly fine and there are plenty of situations where you can make that happen. This has been confirmed by a designer, although I can't remember which thread I saw it. It was a big deal at the time and he confirmed it over and over.

There's no doubt in my mind that ED renders objects in correct 3D, both within the cockpit and without. The only trickery I'm aware of is the skybox/dome not being set at a vast distance, but at 15-20 metres or so (50 feet-ish).

Everyone has different sensitivities to different aspects of their visual system. It is the quality of the immersion - the fact that it is MUCH closer to rendered 'reality' than we have ever seen before - that means we notice these tiny variations from reality: the bits current VR has yet to perfect. There are aspects that are simply not being addressed in current-generation VR, and this leads to some slight but noticeable visual confusion for some people.

It's these tiny 'confusions' that some players find intensely annoying. :)

Current VR is a kind of 'brute force VR' - everything is being rendered, there's some wastage of rendered pixels (the corners we can't see), there's a fair bit of wasted 3D card and CPU time, and everything is rendered at the detail settings you set.
The whole scene is also rendered in perfect focus.

This last point is important because focus also plays a part in judging distance - it's small and secondary to convergence, but measurable, and it provides valuable feedback for judging distances.
Feedback from the ciliary muscles around your eyes' lenses informs your vision about how hard you're having to focus on an object - which is related to its distance.
In VR, there is no focus - everything is rendered sharp, from the Elite insignia on your shoulder pad to the cockpit and beyond. This robs your visual system of some important visual cues to size.
Thus, with the VR hardware's focal length set to about 1.3m, lots of objects that should feel further away - even when convergence is fairly low or zero - still feel as if they are 'too close'. This seems to be a recurring theme in current VR.
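The fixed-focus point can be put in numbers. Focus demand is measured in diopters (1/distance in metres); if the optics sit at roughly 1.3m as suggested above, accommodation stays constant no matter what distance is rendered, so there's a mismatch (the "vergence-accommodation conflict") for everything not near 1.3m. A sketch assuming that 1.3m figure:

```python
HMD_FOCAL_M = 1.3  # approximate fixed focal distance of the headset optics (assumed)

def accommodation_conflict_d(rendered_dist_m, focal_m=HMD_FOCAL_M):
    """Difference in diopters between where the eyes must actually focus
    (the optics) and where the rendered object says they should."""
    return abs(1.0 / rendered_dist_m - 1.0 / focal_m)

for d in (0.5, 1.3, 10.0, 1000.0):
    # conflict is zero only at the optics' own focal distance
    print(f"object rendered at {d:>6} m -> conflict {accommodation_conflict_d(d):.2f} D")
```

Notice the conflict saturates for far objects (it can never exceed 1/1.3 ≈ 0.77 D), which fits the observation that it's the near-field stuff that feels most "off" when you think about it, while distant scenery merely feels subtly too close.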

The other is perception of size - even a Sidewinder is pretty big; an Anaconda is massive! We have never stood near a ship of this scale, walked around it, crawled over it, so we have little real idea of its actual size. We have to make that up in our heads as we play. We can see the differences, but defining an absolute size is difficult.
This lack of easy reference to familiar objects makes correct perception of depth even harder.
This is very subjective, and people familiar with large objects like aircraft might have an easier time of it.

The decreased resolution of current VR also detracts from your ability to define size - detail is reduced at distance, and we use drop-off in perceived detail as distance to an object increases to judge that distance, along with colour variation (general loss of warm red/yellow colours with increasing distance in air). The relatively low VR resolution hampers the ability to define much detail even at medium ranges, and of course in space there is no colour variation. You're left with relative scale and this is largely experiential.
 
Ah, then no, this one isn't ED's fault but Oculus'. I suppose your IPD is fairly large, like 65mm and above? The separation on the screens seems to be fixed at around 62mm, so for me the scale of everything is 100% perfect, while on the DK2 it felt really small. There isn't any way to change it; all you change with that slider on the CV1 is the center focus of the lenses. Sorry man, I know how you feel. It all felt so sucky on the DK2, and then the CV1 was a revelation.
 
My IPD, according to the green cross, is 63mm. Average. So it'd make sense to place the screens that far apart in the Rift and just handle the IPD by moving the lenses (and the displayed image).

I don't have any real issue with any of the VR oddities - I see the (probable) technical reason for most of them and simply enjoy the experience for 99% of the time. Wouldn't put my Rift down for ED if you paid me.
But it is interesting seeing the variations in the perceptions of others using VR.
 

According to the green cross, my IPD is slightly less than 59, maybe 58 or 57 (but it won't go that low), so you are at least 4mm wider than me. I know my IPD is really 61, measured by an optician, so yours is probably over 65.

Either way, you are very likely to perceive the virtual world as smaller and closer than I do.
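A simple similar-triangles model illustrates why a wider IPD could compress the world. If the two virtual images sit a fixed ~62mm apart (the figure suggested above) at roughly the 1.3m focal distance mentioned earlier in the thread, then a viewer whose eyes are wider than that separation must converge slightly even for points at infinity, which pulls "infinity" in to a finite distance. A sketch under those assumptions only - it ignores the lens optics entirely:

```python
def perceived_infinity_m(viewer_ipd_m, image_sep_m=0.062, image_dist_m=1.3):
    """Distance at which a point rendered 'at infinity' appears to converge,
    for a viewer whose eyes are wider than the fixed image separation.
    Similar-triangles toy model; real HMD optics are more complicated."""
    excess = viewer_ipd_m - image_sep_m
    if excess <= 0:
        return float("inf")  # eyes parallel or diverging: still reads as infinity
    return viewer_ipd_m * image_dist_m / excess

for ipd_mm in (59, 62, 63, 68):
    d = perceived_infinity_m(ipd_mm / 1000)
    print(f"IPD {ipd_mm} mm -> infinity appears at {d:.1f} m")
```

In this toy model a 68mm IPD squeezes "infinity" down to roughly 15m while narrower IPDs leave it far away or truly at infinity, which at least qualitatively matches the reports here of wide-IPD users seeing a smaller, closer world.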
 
It all felt so sucky on the DK2 and then the CV1 was a revelation.

CV1 user here. I have a completely different experience... My IPD is 68. In the DK2 it was hard with focus, but the scale (object size, avatar size) was absolutely correct for me; now in the CV1 everything looks much smaller (child-sized body, wrong HOTAS position). I've tried different workarounds, but without any visible result...
 
I know my IPD is really 61, measured by an optician, so yours is probably over 65.

Interesting - I should check my actual IPD from the optician.

99% of the time, though, I'm lost in ED the same way as everyone else. It's only when I really try to be aware of it that the illusion falls.
 
Firstly, I don't have an Oculus Rift, nor have I played ED with it, so I have no experience with that hardware and how much 'depth' is visible using it.

I have a Vive, however, and apart from everything else that is rendered in 3D (interiors and exteriors of stations and ships, planets in space, planet surfaces, etc.), which looks spectacular and really gives you the impression that you are there, I feel as if I am 'stuck in a sphere' which is roughly less than 200 meters across.

I'm not sure ED can do something about this with the current skybox and it results in minor nausea when playing the game for a couple of hours.

What accentuates this feeling is twofold:

1) The lenses used by the Vive (Fresnel, I believe) cause everything to look faceted - like looking through an insect's eye. For distant images like the Milky Way nebula image projected on ED's skybox, the facet effect makes it feel as if it was painted on some sort of canvas with oil paint, which of course does not help.

2) The skydome and stars themselves are not projections, but a flat image. Since the resolution is pretty low in VR, the sharpness is gone and it feels like the stars were smeared across the skybox instead of really standing out.


There are three things, as far as I can tell, that could help to minimize this effect.

1) Give users the option to switch the nebula picture off completely, and allow us to have a pitch black skybox. (Can be made possible by the ED Dev team)

2) Make the skybox image as hi-res as possible to minimize the facet effect produced by the Fresnel lenses.

3) Extend the skybox to only show stars that are close to the player (20 ly or so, or closer) - which you can never reach in SC anyway - but which are rendered on the skybox in high detail, so they don't look faded or fuzzy.

The second solution may not be technically feasible, but the first one should definitely be.
 
I agree with point 3 above, although to me, in the Rift, it feels like I'm in big space. Having seen the nebula in BigScreen, space could be a lot better in ED. As it is now, they just show too much.
 