Various tracking based VR control schemes

ED supports a variety of control schemes (keyboard+mouse, gamepad, and HOTAS), and it supports VR. I'd like to see VR-specific control schemes supported that make full or partial use of the various types of tracked controllers VR systems offer.

There are a variety of VR control schemes that different groups of players want to use.

  • Menu controls: The ability to point at game menus with tracked controllers to change options and start playing the game.
  • Galaxy map controls: The ability to grab (trigger or grip button) the map and pan it by pulling/pushing it. And a 2-handed pinch-to-zoom like gesture to scale the map. (other suggestions welcome)
  • HOTAS + hand tracking hybrid: Using either VR gloves or Leap Motion to make it so that when you let go of your HOTAS, your in-game hands move as your real hands do.
  • Virtual throttle and stick: See VTOL VR for a demonstration of how this would work, and of how reliable and precise it can be when you anchor your controller on your leg.
  • Gesture controls like in X Rebirth VR.
  • In cockpit controls: Making it possible to use the various buttons you have in the cockpit to directly control the ship.

There are various reasons to support VR controls.

  • It's nice to see your hands actually move in VR when you take a break and let go of the controls.
  • As VTOL VR and X Rebirth VR have demonstrated, the level of control a virtual throttle and stick offer is beyond just acceptable. PvP and some advanced combat players may want to stick to their HOTAS (which will of course still work). But Frontier have stated they want a variety of player roles to be viable forms of gameplay, and the combat-based player roles that might warrant a physical HOTAS make up less than half of these player roles.
  • Some roles like explorer and long distance trader can have less use for combat precision and more use for the ability to relax while they are waiting for their ship to travel point to point. There are already VR players using overlays to watch things on Netflix in their cockpit while they are flying around in ED. I believe these players may find it significantly more useful to be able to see their hands move when they aren't controlling the ship, see where their hands/controllers are when interacting with their overlays, and see where their controls are when they need to quickly start flying again.
  • Elite Dangerous explicitly supports multiple control schemes; you do not require a HOTAS to play. The ability to play with the keyboard+mouse that come with your PC is a supported way to play. Given that, I don't think it's unreasonable to support playing the VR mode of ED using the hardware that comes with the VR system.
  • Even if you can use voice commands for some ship functions, there are still some things you may want in-cockpit buttons to activate. Voice commands and cockpit buttons are not mutually exclusive; rather, they complement each other.

I can understand players wanting Frontier to focus development time on fixing other bugs. I've got programming experience myself and would love to try tackling control additions myself if Frontier had a way to support community-based development. Sadly, most of these control schemes cannot easily be made as external programs; at least not in the optimal way.
 
Here's a long post with notes on implementation details.

Basic hand presence:
The first thing to do when implementing these would probably be to decouple the in-game hands from the throttle/stick animations. i.e. Make it so that in the code the hands can either be attached to the current input based animation system or attached to a tracking point.

This will of course require IK for the arms (though given the existing in-game hand movements, ED likely already has an IK system).
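As a very rough sketch of what that decoupling could look like (all names here are hypothetical, nothing from ED's actual codebase):

```cpp
#include <cstdio>

// Hypothetical 3D vector; a real engine would use its own math types.
struct Vec3 { float x, y, z; };

// What an in-game hand is currently following.
enum class HandAttachMode {
    AnimationRig,   // driven by the existing input-based throttle/stick animations
    TrackedDevice,  // driven by a VR controller / glove / Leap Motion hand
};

struct HandState {
    HandAttachMode mode = HandAttachMode::AnimationRig;
    Vec3 ikTarget{0, 0, 0};  // where the arm IK solver should place the hand
};

// Called once per frame per hand. animPos comes from the existing animation
// system; trackedPos from whichever tracking SDK is active.
void updateHandTarget(HandState& hand, const Vec3& animPos, const Vec3& trackedPos) {
    hand.ikTarget = (hand.mode == HandAttachMode::AnimationRig) ? animPos : trackedPos;
    // The arm IK solver then runs against hand.ikTarget as usual.
}

int main() {
    HandState left;
    left.mode = HandAttachMode::TrackedDevice;
    updateHandTarget(left, {0.20f, 0.90f, 0.40f}, {0.25f, 0.95f, 0.38f});
    std::printf("left hand IK target: %.2f %.2f %.2f\n",
                left.ikTarget.x, left.ikTarget.y, left.ikTarget.z);
}
```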

HOTAS + VR glove/Leap Motion hybrid:
To handle HOTAS users using VR gloves or Leap Motion, we will probably need a "Calibrate Throttle/Joystick location" button somewhere in the menu. When used, you would grab onto the throttle and joystick of your HOTAS and move them around. While the throttle or joystick input is changing we know that your hands are on the controls, so on those frames we can consult the location of the VR gloves or Leap Motion's Bones API to get the general location of the HOTAS stick and throttle and save that info. A rough sphere is good enough for the joystick; we might want a rough capsule for the throttle to take into account the large variance in position between the min and max positions.
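A minimal sketch of that calibration pass, assuming a hypothetical GripRegionCalibrator that is fed the HOTAS axis value and the tracked hand position once per frame:

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

static float dist(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Collects tracked-hand positions, but only on frames where the HOTAS axis
// actually moved (movement implies the hand really is on the control), then
// fits a rough bounding sphere to the samples.
class GripRegionCalibrator {
public:
    void onFrame(float axisValue, const Vec3& handPos) {
        if (std::fabs(axisValue - lastAxis_) > kAxisNoise)
            samples_.push_back(handPos);
        lastAxis_ = axisValue;
    }

    // Centroid + max distance gives a rough sphere around the joystick grip.
    bool fitSphere(Vec3& centerOut, float& radiusOut) const {
        if (samples_.size() < kMinSamples) return false;
        Vec3 c{0, 0, 0};
        for (const Vec3& p : samples_) { c.x += p.x; c.y += p.y; c.z += p.z; }
        c.x /= samples_.size(); c.y /= samples_.size(); c.z /= samples_.size();
        float r = 0;
        for (const Vec3& p : samples_) r = std::max(r, dist(c, p));
        centerOut = c;
        radiusOut = r + kPadding;  // slack so a loose grip still registers
        return true;
    }

private:
    static constexpr float kAxisNoise = 0.01f;      // ignore sensor jitter
    static constexpr float kPadding = 0.03f;        // metres of slack
    static constexpr std::size_t kMinSamples = 30;
    float lastAxis_ = 0.0f;
    std::vector<Vec3> samples_;
};
```

The throttle's capsule could be fit the same way, e.g. by using the sample centroids near the axis minimum and maximum as the capsule's two end points.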

After calibration, if a player's hand is within the rough area we calibrated as the location of the throttle or joystick, and the player's fingers on that hand are curled to a degree that looks like they may be grabbing something (we may be able to calibrate this curl threshold at the same time as the location), then we will presume that the player is holding onto their HOTAS throttle/stick.

When we think one of the player's hands is on their HOTAS, we will set the in-game hands to use the current input-based animation system – although perhaps we should tweak it to respond directly to HOTAS input pitch/yaw/roll instead of ship pitch/yaw/roll, so the reactions still match the real-world HOTAS if the player changes bindings.
When we don't think their hand is on their HOTAS, we'll set the hands to track their real-world hand locations from the VR glove SDK/skeletal input/Bones API.
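The per-frame decision could then be as simple as the following sketch, where avgCurl (0 = fingers straight, 1 = fully curled) comes from whichever skeletal/bones API is active, and curlThreshold is the hypothetical value captured during calibration:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

struct GripSphere { Vec3 center; float radius; };  // from calibration

static float dist(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// True when the hand is inside the calibrated grip region and the fingers
// are curled enough to look like they're wrapped around the control.
bool handIsOnHotas(const Vec3& handPos, float avgCurl,
                   const GripSphere& grip, float curlThreshold = 0.5f) {
    bool inRegion = dist(handPos, grip.center) <= grip.radius;
    bool gripping = avgCurl >= curlThreshold;
    return inRegion && gripping;
}
```

When the result flips, the hand switches between the animation rig and the tracked device; a short debounce would stop brief tracking glitches from making the hand pop back and forth.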

Hand presence and in-cockpit controls:
We should have some sort of "Use Skeletal Input" option. It will affect how the hands behave when you are not grabbing your HOTAS or a virtual throttle/joystick.
  • When it's off (best for Vive wands and WMR wands) we will have pre-set hand poses like in VTOL VR. e.g. When you are close to a button, the hand looks like it is touching the button, and when you pull the trigger it presses it; when you are close to a switch/slider, it looks like the hand is pinching it, and when you pull the trigger it grabs it and lets you slide/flip it by moving your hand.
  • When it is on (best for Touch controllers, Knuckles, VR gloves, and Leap Motion) we will make use of OpenVR's Skeletal Input API, the hands of the Oculus Avatar API, and Leap Motion's Bones API (these APIs give you finger pose information). The bones in the in-game character's hands will be linked to the finger pose information we get from the SDKs. In other words, when "Use Skeletal Input" is on, the in-game character's hands will be in the same physical position as your real-world hands, and the in-game character's fingers will roughly make the same movements as your real-world fingers. To control buttons, when the index finger is stretched out (ish), buttons are pressed when you push the tip of your index finger against them. For sliders, you grab onto them when your index and thumb are in a pinching pose. For Knuckles/Touch it may be desirable to use the trigger for these interactions for reliability. A sketch of the fingertip interaction follows this list.
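Here's that sketch of the fingertip-press part, assuming the SDK gives us a fingertip position and a 0-to-1 curl value per finger (the thresholds are guesses that would need tuning):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static float dist(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// A cockpit button fires when the index fingertip pushes into its trigger
// volume while the finger is mostly extended ("stretched out (ish)").
struct CockpitButton {
    Vec3 center;
    float pressRadius;   // how close the fingertip must get
    bool pressed = false;
};

void updateButton(CockpitButton& btn, const Vec3& indexTip, float indexCurl) {
    bool fingerExtended = indexCurl < 0.3f;
    bool touching = dist(indexTip, btn.center) <= btn.pressRadius;
    bool nowPressed = fingerExtended && touching;
    if (nowPressed && !btn.pressed) {
        // Rising edge: fire the ship function bound to this button here.
    }
    btn.pressed = nowPressed;
}

// Slider grabs would be the analogous test on a pinch: the index and thumb
// fingertips within a couple of centimetres of each other near the slider.
```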

Virtual throttle/stick:
For the virtual throttle/stick implementation: like in VTOL VR, the grip input will grab onto the virtual stick or throttle when your hand is close to it.

In VTOL VR this has two modes: if you press and hold, then you'll release the stick when you let go; if you quickly press and release, then you'll hold onto the throttle/stick and let go when you press and release again. We'll probably want an option to disable the press-and-release functionality and exclusively use press and hold, since Touch and Knuckles users will likely just use the press-and-hold mode, and an accidental lock onto the throttle/stick would be confusing.
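A sketch of that grab logic, with the proposed option exposed as a flag (the names and the 0.25 s tap window are my own assumptions, not VTOL VR's actual values):

```cpp
// Tracks whether a hand is holding the virtual throttle/stick, supporting
// both modes: hold-to-grab, and quick-tap to latch on until the next tap.
class GrabLatch {
public:
    // allowToggle mirrors the proposed option; Touch/Knuckles users would
    // likely run with it off so a quick squeeze can't latch by accident.
    explicit GrabLatch(bool allowToggle, float tapSeconds = 0.25f)
        : allowToggle_(allowToggle), tapSeconds_(tapSeconds) {}

    // Call every frame with the grip button state and the frame time.
    void update(bool gripDown, float dt) {
        if (gripDown && !wasDown_) {               // grip just pressed
            holding_ = true;
            heldFor_ = latched_ ? tapSeconds_ : 0; // re-press of a latch won't re-latch
            latched_ = false;
        } else if (!gripDown && wasDown_) {        // grip just released
            if (holding_) {
                // A quick tap latches on (if enabled); a long hold releases.
                latched_ = allowToggle_ && heldFor_ < tapSeconds_;
                holding_ = false;
            }
        }
        if (holding_) heldFor_ += dt;
        wasDown_ = gripDown;
    }

    bool grabbing() const { return holding_ || latched_; }

private:
    bool allowToggle_, wasDown_ = false, holding_ = false, latched_ = false;
    float tapSeconds_, heldFor_ = 0;
};
```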

While grabbing onto the virtual throttle, your physical movement will control the throttle position. While grabbing the virtual joystick, the rotation of your controller will be treated as virtual pitch/roll/yaw input axes, which can be bound in the options. Translational movement of the controller is ignored (this lets you anchor it on a physical location like your leg/knee, which gives you a large degree of stability/control over it).
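A sketch of the rotation-to-axis mapping: store the controller's orientation at grab time, then read the relative twist each frame and ignore position entirely. The axis conventions and the 0.35 rad full-deflection angle are assumptions:

```cpp
#include <algorithm>
#include <cmath>

// Minimal quaternion; engine math types would be used in practice.
struct Quat { float w, x, y, z; };

static Quat conjugate(const Quat& q) { return {q.w, -q.x, -q.y, -q.z}; }

static Quat mul(const Quat& a, const Quat& b) {
    return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
             a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
             a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
             a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
}

struct StickAxes { float pitch, roll, yaw; };  // each in [-1, 1]

// Rotation of the controller since the grab started, converted into three
// bindable axis values. maxDeflection is the twist (radians) for full input.
StickAxes virtualStickAxes(const Quat& grabPose, const Quat& currentPose,
                           float maxDeflection = 0.35f) {
    Quat d = mul(conjugate(grabPose), currentPose);
    // Quaternion -> Tait-Bryan angles; here x = pitch, y = yaw, z = roll.
    float pitch = std::atan2(2*(d.w*d.x + d.y*d.z), 1 - 2*(d.x*d.x + d.y*d.y));
    float yaw   = std::asin(std::clamp(2*(d.w*d.y - d.z*d.x), -1.0f, 1.0f));
    float roll  = std::atan2(2*(d.w*d.z + d.x*d.y), 1 - 2*(d.y*d.y + d.z*d.z));
    auto norm = [&](float a) { return std::clamp(a / maxDeflection, -1.0f, 1.0f); };
    // Controller translation is deliberately not read here, so the player
    // can anchor the controller against their leg for stability.
    return { norm(pitch), norm(roll), norm(yaw) };
}
```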

Control bindings while grabbing a virtual throttle/stick:
While a hand is grabbing the virtual throttle or joystick, the buttons on the controller have throttle/joystick-specific bindings that can be bound in the control options. i.e. You can go into controls and bind the controller's trigger (when grabbing the joystick) to primary fire, so you can grab the joystick and pull the trigger for primary fire. Same goes for the Vive/WMR wand's application menu button and Touch/Knuckles' A/B buttons. The Touch/WMR wand/Knuckles joysticks are similarly bindable, except it should probably be possible to define them as either an axis or 4 directional buttons (plus a click action).
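One way to model this is a binding table keyed on both the grip context and the physical input, so the same trigger can mean different things depending on what the hand is holding (the action names are placeholders):

```cpp
#include <map>
#include <string>
#include <utility>

// Which virtual control the hand is holding decides which binding applies.
enum class GripContext { None, VirtualStick, VirtualThrottle };
enum class Input { Trigger, AppMenu, ButtonA, ButtonB, StickClick };

using BindingTable = std::map<std::pair<GripContext, Input>, std::string>;

std::string resolveAction(const BindingTable& bindings,
                          GripContext ctx, Input in) {
    auto it = bindings.find({ctx, in});
    return it != bindings.end() ? it->second : "";
}

// e.g. bindings[{GripContext::VirtualStick, Input::Trigger}] = "PrimaryFire";
```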

For the trackpads on the Vive wand/WMR wand/Knuckles, we will need to choose a mode for when the throttle is grabbed and another for when the joystick is grabbed. We will also need to override these based on other conditions, e.g. headlook at a holo panel – just as looking at a holo panel can make RB on a gamepad switch categories instead of throttling up.

To account for different possible control setups, we'll need a variety of button modes and the ability to have a swipe/drag mode that works simultaneously with the button mode (see the sketch after this list).
  • In a 5-button press mode, each of the 4 directions on the outside region of the pad would be a directional button, and the center of the pad a separate button [like in VTOL VR].
  • 2/3-button modes would make the left/right or top/bottom regions buttons, and optionally allow the center to be a third.
  • 1-button mode would just make the whole trackpad a single button (probably useful on WMR).
  • In a horizontal swipe mode, a quick swipe left or right would each be a button input (I'm not sure there is use for a vertical version).
  • In a drag mode, dragging your finger in a direction will emit directional button presses – 4 directional buttons, of course. There will probably be a scale setting that controls how far your finger has to travel before another press in that direction is emitted. e.g. You can bind the down direction to panel navigation down, and then the holo panel will navigate downward through items as you drag your finger downward, at a rate depending on how fast you move your finger.
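A sketch of the 5-button and drag modes, assuming the runtime reports the touch position as (x, y) in [-1, 1] (the centre radius and step size would presumably be user-tunable):

```cpp
#include <cmath>

enum class PadButton { None, Up, Down, Left, Right, Centre };

// 5-button mode: a centre disc is one button, and the outer ring splits
// into four quadrants [like in VTOL VR].
PadButton padRegion(float x, float y, float centreRadius = 0.4f) {
    if (std::sqrt(x * x + y * y) < centreRadius) return PadButton::Centre;
    if (std::fabs(x) > std::fabs(y))
        return x > 0 ? PadButton::Right : PadButton::Left;
    return y > 0 ? PadButton::Up : PadButton::Down;
}

// Drag mode: accumulate finger travel along one axis and emit one
// directional "press" per stepSize of travel, so dragging scrolls a panel.
class DragStepper {
public:
    explicit DragStepper(float stepSize = 0.25f) : step_(stepSize) {}

    // Returns how many whole steps were crossed this frame (sign = direction).
    int update(float pos, bool touching) {
        if (!touching) { tracking_ = false; return 0; }
        if (!tracking_) { tracking_ = true; accum_ = 0; last_ = pos; return 0; }
        accum_ += pos - last_;
        last_ = pos;
        int steps = static_cast<int>(accum_ / step_);
        accum_ -= steps * step_;  // keep the remainder for the next frame
        return steps;
    }

private:
    float step_, accum_ = 0, last_ = 0;
    bool tracking_ = false;
};
```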

For more context: on the Vive, I can imagine at least two ways someone might want to set up bindings to control the holo panels.
  • 5-button press + horizontal swipe: Pressing the trackpad in one of the directions navigates up/down/left/right in the panel. Pressing in the center is selection. And a quick swipe to the left or right switches between the panel's tabs.
  • 3-button press + drag: Sliding your finger along the trackpad navigates up/down/left/right. Pressing in the center is selection. Pressing on the left/right switches between panel tabs.

And I can imagine people having trackpad bindings like left/right drag or left/right button press for next/prev target.

Galaxy/system maps and menus:
For the galaxy map, I imagine we'd want a single pan/grab binding, which would likely be bound to the trigger, grip button, or Knuckles grab.
Using that button on a controller would "grab" the galaxy map and allow you to move it in any direction and rotate the view.
Using it on both controllers would allow a 2-handed pinch to zoom style gesture to scale the galaxy map.
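The underlying math for both gestures is simple. A sketch, assuming hand positions expressed in map space:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 sub(const Vec3& a, const Vec3& b) { return {a.x-b.x, a.y-b.y, a.z-b.z}; }
static float len(const Vec3& v) { return std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z); }

// State captured at the moment the grab started.
struct MapGesture {
    Vec3 grabHandPos;    // hand position when the one-hand grab began
    float grabHandDist;  // distance between hands when both grabbed
    float grabScale;     // map scale when both grabbed

    // One-hand pan: the map translates with the hand, as if pulled.
    Vec3 panDelta(const Vec3& handPos) const {
        return sub(handPos, grabHandPos);
    }

    // Two-hand zoom: scale by the ratio of current to initial hand distance.
    float zoomScale(const Vec3& leftPos, const Vec3& rightPos) const {
        float d = len(sub(rightPos, leftPos));
        return grabHandDist > 0.0f ? grabScale * (d / grabHandDist) : grabScale;
    }
};
```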

I imagine the system map would behave the same, but of course in 2D.

A trackpad/joystick may make sense for navigating the galaxy map's menus, but I think the more intuitive controls might be pointing at systems and menus with the controllers and selecting with the trigger.
Some people might also want joystick movement of the galaxy map instead of just grab.

Game menus in general should also be usable with tracked controllers (i.e. starting a game, options menu, pause menu, etc).
Given the distance of most game menus IIRC, I'm guessing the best thing would be to allow pointing at menus with the controller and pulling the trigger to select options.
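That's a standard ray-plane test against whatever plane the menu panel sits on; a sketch:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 sub(const Vec3& a, const Vec3& b) { return {a.x-b.x, a.y-b.y, a.z-b.z}; }
static Vec3 add(const Vec3& a, const Vec3& b) { return {a.x+b.x, a.y+b.y, a.z+b.z}; }
static Vec3 muls(const Vec3& v, float s) { return {v.x*s, v.y*s, v.z*s}; }
static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Casts the controller's forward ray against the plane a menu panel sits on.
// Returns false when the controller points away from or parallel to the
// panel; otherwise hitOut is then tested against the panel's widgets.
bool menuRayHit(const Vec3& rayOrigin, const Vec3& rayDir,
                const Vec3& planePoint, const Vec3& planeNormal, Vec3& hitOut) {
    float denom = dot(rayDir, planeNormal);
    if (std::fabs(denom) < 1e-6f) return false;  // parallel to the panel
    float t = dot(sub(planePoint, rayOrigin), planeNormal) / denom;
    if (t < 0) return false;                     // panel is behind the ray
    hitOut = add(rayOrigin, muls(rayDir, t));
    return true;
}
```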

Other notes on the holo panels:
Some players have suggested they may also want to be able to use that laser-pointer style of control for the holo panels in the cockpit.

Others have suggested they want the option to have all holo panels close by (like the close-up ones in an Eagle), and would like to be able to interact with them directly.

Gesture controls:
I haven't played X Rebirth VR yet, so I don't have implementation ideas on how to handle that style of ship control in ED.
 
Touchscreen controls. To be able to touch the screen to highlight ships or text.

Others have requested the ability to directly interact with the left and right holo panels. Cockpits will need a secondary holo panel placement spot for it. Small ships like the Eagle have them within reach, but on larger ones they are further away, so large ships would need a close-by alternate placement.

What kind of touch highlighting of ships/text are you thinking of? The closest thing I can think of is laser-pointer targeting with a tracked controller.
 

Lestat


I have a Lenovo touch screen laptop. I don't think distance would be a problem: you hit a button for each screen and then touch the screen, and it would not be an issue. It acted as a primary mouse button. The only problem would be a dirty screen due to oils on your fingers.
 

Oh, you're referring to non-VR controls for users with touch-capable monitors. The controls I'm requesting are VR controls, to support the various tracked VR controllers and VR gloves / the VR setup of Leap Motion.

Most of what I'm suggesting doesn't change gameplay much; it just gives better ways to control things already in the game from within VR.

Though if FDev were to consider it acceptable to change the ED gameplay to allow laser-pointer targeting, it would be reasonable for them to also give touchscreen targeting to non-VR users.
Actually, you could probably justify it now, because VR users already have headlook-based targeting.

You might want to open another feature discussion for adding that and use headlook/gaze targeting as a justification. Otherwise your request for a non-VR feature will probably be lost in this request for VR features. Especially if FDev considers the VR subgroup too niche to bother having anyone work on it or allow the community to develop things.
 
Good ideas Daniel, I’d love to see a VTOL VR style control implementation in ED (ideally with some Lone Echo space legs to go with it :) )
 
I would be happy with a fully featured way to control the galaxy map and system map within VR; it's the only thing I still haven't managed to control with ASTRA or the HOTAS controls.
 