Hi all,
I've been thinking about a project since getting my Rift that I thought I'd share, in case anyone talented wants to pick it up. I probably have way too many projects of my own on the go at the moment, but I might look into this more seriously in the new year.
One of the first things I thought of once I was actually "in" my cockpit in Elite Dangerous was: "wouldn't it be cool, and save on a bunch of HOTAS switching, if I could just reach out my hand/finger in VR and tap the holographic panels to select functions?" A bit like you see in Minority Report or District 9 (a more relevant example).
So I started thinking about how I would approach this, starting with some assumptions:
I assume that the holographic panels in your ship are each a polygon with alpha-blended (or similar technique) menus displayed on them - iirc this functionality was introduced in DX9.
If that's the case, the holographic polygons are "just" 3D geometry, and it should be possible to apply collision detection to them.
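To make that concrete, here's a minimal sketch of the kind of hit test I'm imagining, assuming a panel can be modelled as a flat quad with an origin, two edge vectors, and a grid of menu cells. Every name and number below is my own illustration, not anything taken from the game:

```python
# Hypothetical hit test: is a tracked fingertip "touching" a flat
# holographic panel, and if so, which menu cell is under it?
# All geometry here is illustrative; the real panel layout is unknown.

import numpy as np

class Panel:
    def __init__(self, origin, u_edge, v_edge, rows, cols):
        self.origin = np.asarray(origin, dtype=float)  # bottom-left corner
        self.u = np.asarray(u_edge, dtype=float)       # along panel width
        self.v = np.asarray(v_edge, dtype=float)       # along panel height
        self.normal = np.cross(self.u, self.v)
        self.normal /= np.linalg.norm(self.normal)
        self.rows, self.cols = rows, cols

    def hit_test(self, fingertip, touch_depth=0.01):
        """Return the (row, col) of the touched menu cell, or None."""
        rel = fingertip - self.origin
        # How far the fingertip is from the panel's plane.
        if abs(np.dot(rel, self.normal)) > touch_depth:
            return None  # not close enough to count as a "tap"
        # Project onto the panel's local axes to get 0..1 coordinates.
        s = np.dot(rel, self.u) / np.dot(self.u, self.u)
        t = np.dot(rel, self.v) / np.dot(self.v, self.v)
        if not (0.0 <= s <= 1.0 and 0.0 <= t <= 1.0):
            return None  # off the edge of the panel
        return (min(int(t * self.rows), self.rows - 1),
                min(int(s * self.cols), self.cols - 1))

# Example: a 40cm x 25cm panel to the pilot's left, 30cm away.
panel = Panel(origin=[-0.4, 0.9, -0.3], u_edge=[0.4, 0.0, 0.0],
              v_edge=[0.0, 0.25, 0.0], rows=4, cols=3)
print(panel.hit_test(np.array([-0.25, 1.0, -0.295])))  # -> (1, 1)
```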
I haven't looked at the Rift or Vive SDKs, but I assume there is some functionality for tracking real-world objects if there is an appropriate sensor.
Combine this with the KISS principle - say, one sensor per index finger, wired back over USB - and you have a way of tracking your fingers' positions in the virtual 3D environment.
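On the tracking side, here's a rough sketch of what reading such a sensor might look like, assuming a DIY fingertip sensor that streams "x,y,z" lines over a USB serial port. The port name, baud rate, and wire format are all made up for illustration; a real build would use whatever the sensor or SDK actually provides:

```python
# Sketch of reading a hypothetical fingertip sensor that streams
# "x,y,z" lines over a USB serial port. Port name, baud rate, and
# wire format are invented for illustration.

import numpy as np
import serial  # pyserial

def read_fingertip(port):
    """Read one position sample from the (hypothetical) sensor."""
    line = port.readline().decode("ascii", errors="ignore").strip()
    x, y, z = (float(v) for v in line.split(","))
    return np.array([x, y, z, 1.0])  # homogeneous coordinates

# One-off calibration transform mapping the sensor's frame into the
# cockpit's frame (identity as a placeholder; in practice you'd fit
# this by touching a few known reference points in the cockpit).
SENSOR_TO_COCKPIT = np.eye(4)

with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as port:
    sample = read_fingertip(port)
    fingertip_in_cockpit = (SENSOR_TO_COCKPIT @ sample)[:3]
    print(fingertip_in_cockpit)
```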
Then it's a case of nagging Frontier into adding articulation/animation to your virtual arms and hands, so that when you move your finger towards the respective panel the virtual hand follows in 3D. When the fingertip (assuming the sensor is mounted on a kind of "thimble" over your fingertip) intersects with the position of something on the holographic panel, a collision is triggered and the menu item in question is selected.
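One detail worth noting: the selection logic probably wants to fire once, when the fingertip first enters a menu cell, not on every frame it stays in contact, otherwise a single tap would re-trigger the item dozens of times. A rough sketch (the per-frame cell value would come from whatever collision query actually gets exposed; this is all assumption on my part):

```python
# Rough sketch of the selection logic: fire once when the fingertip
# first "enters" a menu cell, not on every frame it stays in contact.

def make_tap_handler(on_select):
    last_cell = None
    def update(cell):  # cell = (row, col) from the hit test, or None
        nonlocal last_cell
        if cell is not None and cell != last_cell:
            on_select(cell)  # rising edge: a fresh tap
        last_cell = cell
    return update

handler = make_tap_handler(lambda cell: print("selected menu item", cell))
for frame_hit in [None, None, (1, 2), (1, 2), None, (0, 0)]:
    handler(frame_hit)  # fires twice: once for (1, 2), once for (0, 0)
```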
It would make tab and item selection really easy, although the panels might have to be moved closer to the pilot, which I believe some have asked for anyway. (I suspect the reason this hasn't been done is that you would need to make the change in every ship/cockpit, and it would need to be a selectable option so as not to spoil the ergonomics for "normal" pilots using 2D screens.)
Sounds easy.....lol....but I thought I'd share for discussion.