Touching the UI panels - a project I've been thinking about

Hi all,

I've been thinking about a project since getting my Rift, and I thought I'd share it in case anyone talented wanted to pick it up. I probably have way too many projects of my own on the go at the moment, but I might look into this more seriously in the new year.

One of the first things I thought of once I was actually "in" my cockpit in Elite Dangerous was "wouldn't it be cool, and save a bunch of HOTAS switching, if I could just reach out my hand/finger in VR and tap the holographic panels to select functions?" A bit like you see in Minority Report or District 9 (a more relevant example).

So I started thinking about how I would approach this, starting with some assumptions:

I assume that the holographic panels in your ship are polygons with alpha-blended (or a similar technique) menus displayed on them - iirc that kind of functionality was available from DX9.

If that's the case, the holographic polygons are "just" 3D geometry, and it should be possible to apply collision detection to them.
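To make the collision idea concrete, here's a minimal sketch of that test - purely my own illustration, not anything from Frontier's code. It assumes the panel really is a flat quad defined by a corner and two edge vectors in cockpit space; the function name and parameters (finger_touches_panel, touch_depth, etc.) are made up for the example.

```python
import numpy as np

def finger_touches_panel(finger_tip, panel_origin, panel_u, panel_v, touch_depth=0.01):
    """Return the panel-local (u, v) coordinates in [0, 1] if the finger tip
    is within `touch_depth` metres of the panel plane and inside its bounds,
    otherwise None. All inputs are 3D vectors in the same cockpit space."""
    panel_origin = np.asarray(panel_origin, dtype=float)
    panel_u = np.asarray(panel_u, dtype=float)
    panel_v = np.asarray(panel_v, dtype=float)
    normal = np.cross(panel_u, panel_v)
    normal /= np.linalg.norm(normal)
    rel = np.asarray(finger_tip, dtype=float) - panel_origin
    # Distance of the finger tip from the panel plane along its normal.
    if abs(np.dot(rel, normal)) > touch_depth:
        return None
    # Project onto the panel's edge vectors to get normalised coordinates.
    u = np.dot(rel, panel_u) / np.dot(panel_u, panel_u)
    v = np.dot(rel, panel_v) / np.dot(panel_v, panel_v)
    if 0.0 <= u <= 1.0 and 0.0 <= v <= 1.0:
        return u, v
    return None

# Example: a panel 30 cm wide and 20 cm tall, 50 cm in front of the pilot.
hit = finger_touches_panel(
    finger_tip=[0.10, 0.05, 0.50],
    panel_origin=[0.0, 0.0, 0.5],
    panel_u=[0.3, 0.0, 0.0],
    panel_v=[0.0, 0.2, 0.0],
)
print(hit)  # -> roughly (0.33, 0.25), i.e. the finger is "touching" the panel
```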

I haven't looked at the Rift or Vive SDKs, but I assume there is some functionality for real-world object tracking if there is an appropriate sensor.

Combine this with a KISS approach of one sensor per index finger, wired back over USB, and you have a way of tracking your fingers' positions in the virtual 3D environment.

Then it's a case of nagging Frontier into adding articulation/animation to your virtual arms and hands, so that when you move your finger towards the respective panel the virtual hand follows in 3D. When the finger tip (assuming the sensor is mounted on a kind of "thimble" on your finger tip) intersects with the position of something on the holographic panel, a collision is triggered and the menu item in question is selected.
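Following on from the quad test above, here's an equally hypothetical sketch of the selection step: once the finger reports a panel-local (u, v) hit, pick the menu row it falls in and let an input layer drive the game's existing keyboard navigation, so nothing new would be needed on the UI side. The row labels and names (MENU_ROWS, select_menu_item) are invented for illustration, not taken from the game.

```python
# Hypothetical mapping from a panel-local hit to a menu row.
# Row labels are placeholders, not the game's real tab names.
MENU_ROWS = ["Navigation", "Transactions", "Contacts", "Cargo", "Functions"]

def select_menu_item(u, v, rows=MENU_ROWS):
    """Map the vertical panel coordinate v (0 = top, 1 = bottom) to a row.
    The horizontal coordinate u is ignored in this simple one-column layout.
    A real binding layer would then emit the key presses needed to move the
    highlight to that row and confirm the selection."""
    index = min(int(v * len(rows)), len(rows) - 1)
    return index, rows[index]

# Example: a hit two-thirds of the way down the panel lands on row 3.
print(select_menu_item(0.5, 0.66))  # -> (3, 'Cargo')
```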

It would make tab and item selection really easy, although the panels might have to be moved closer to the pilot, which I believe some have asked for anyway. (I suspect the reason this hasn't been done is that you would need to do the movement in every ship/cockpit, and it would need to be a selectable option so as not to spoil the ergonomics for "normal" pilots using 2D screens.)

Sounds easy.....lol....but I thought I'd share for discussion.
 
Oculus Touch maybe? I haven't ordered them yet because ED is really the only thing I use the Rift for. Not sure it would be particularly effective while flying, but in-station it could be great.
 
I thought about Oculus Touch, but a) they are £189 on Amazon iirc and b) they're too complex for what is really needed, i.e. the ability to just reach out from the HOTAS and "touch" the panels.
 
Then it's a case of nagging Frontier into adding articulation/animation to your virtual arms and hands, so that when you move your finger towards the respective panel the virtual hand follows in 3D. When the finger tip (assuming the sensor is mounted on a kind of "thimble" on your finger tip) intersects with the position of something on the holographic panel, a collision is triggered and the menu item in question is selected.
While I applaud your creativity (repped), I'd actually like to see less motion in the cockpit. Having the hand come off the stick and flex on occasion can be distracting if you are in VR and not actually physically moving. When in VR, I'd like a setting to keep the hands on the HOTAS. When using a monitor, it may be a nice addition, but it would take a lot of dev time for a limited visual effect.
 
This is how Fly Inside does it with several Flight Sims.

That's the functionality I was envisaging, but with the current 3D arms and hands rather than "ghosts" (as per Junky Juke's post). Leap Motion may be the way to go then, although I'd like to see an index finger pointing once you lift your hands off of the HOTAS rather than the whole hand pointing (unless it resolves all fingers individually?).
 
Why would you assume that the panels have colliders on them? Elite does UI navigation with keystrokes, so I would guess they haven't bothered creating colliders for their UI controls.

Also, if you understand their rendering pipeline, it should be possible to inject the model of your hand into their near-most render queue.

It sounds like a not-so-easy endeavor :)
 
Not sure what the solution would be for the Rift other than support for something like Leap Motion. I don't think you would want to use Touch controllers since you're probably flying with a HOTAS; it would be a PITA to take your hands off the stick to pick up a controller just to interact with the UI.

As for the Vive, since they have opened up their sensor kit to 3rd party devs, I could easily see a pair of haptic gloves covered by the same sensors that the Vive controllers use. That way they wouldn't interfere with normal HOTAS control. The tracking would be very precise too. Not saying it's a trivial thing to R&D, but totally doable. Obviously that would require a huge amount of effort on FD's part to customize the current UI for that use. That translates into a lot of $$ (or pounds if I could find the symbol for it :p) for a pretty niche feature.

It's certainly fun to daydream about, but I think it would be years away at best.
 