I see that there have already been a few discussions about VR controllers (Vive wands and Oculus Touch); people asking whether they'll be supported, or wishing they could control menus with them. But there's been an underlying sentiment that virtual controls would be imprecise and that just using a separate real-world HOTAS/gamepad is better.
I seriously believe that virtual controls can work. And I have an idea of exactly how.
Firstly, think of the controllers as hands, or tools that can partially replicate input from your hands. So you wouldn't just map inputs/axes from the controllers to gamepad/HOTAS axes; your controllers would map to the physical hands you see inside the cockpit, and you'd use them to interact with virtual controls in the cockpit.
Secondary controls, ones you don't need instant access to at all times while piloting (like the frame shift drive, silent running, lights, and landing gear), would be mapped to physical buttons inside the cockpit. You'd press these to activate those controls. With the Vive wands, your hands would switch to an index-finger pointing pose when near buttons; with Oculus Touch and Knuckles, you'd physically move your fingers into a pointing pose and use your index finger to press the buttons.
Primary controls would be on the throttle and joystick physically inside the cockpit. To use these you would physically "grab" them with your controllers, and the hands would lock to them. With the Vive wands, pressing the grip button while your hand is on the joystick/throttle would toggle the grab/lock on and off; with Oculus Touch, you'd squeeze the hand trigger (the one your middle finger rests on) to grab and lock onto the controls. With Knuckles, you'd curl your fingers around the controller to grab on; the lock would probably be based on the average of your three grip fingers, and there'd be an option to control the sensitivity.
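The Knuckles grab/lock idea could be sketched roughly like this: average the three grip-finger curl values, compare against a user-set sensitivity, and add a little hysteresis so the lock doesn't flicker right at the threshold. Everything here (class name, thresholds, the 0..1 curl convention) is an illustrative assumption, not any real SDK's API.

```python
# Hypothetical grip-lock logic for Knuckles-style finger tracking.
# Curl values are assumed to be 0.0 (open) .. 1.0 (fully curled).

class GripLock:
    def __init__(self, sensitivity=0.6, release_margin=0.15):
        self.sensitivity = sensitivity        # avg curl needed to grab
        self.release_margin = release_margin  # must drop this far below to release
        self.locked = False

    def update(self, middle, ring, pinky):
        """Feed the three grip-finger curls; returns whether we're locked on."""
        avg = (middle + ring + pinky) / 3.0
        if not self.locked and avg >= self.sensitivity:
            self.locked = True
        elif self.locked and avg <= self.sensitivity - self.release_margin:
            self.locked = False
        return self.locked
```

The hysteresis band is what makes a soft grip feel stable: once locked, you have to relax noticeably below the grab threshold before the hand lets go.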
While locked on to the throttle, moving your hand backwards or forwards would move it, changing your thrust. While locked on to the joystick, your in-game hand would be fixed in place on the stick, and rotation of your controller would translate to joystick movement (pitch, roll, yaw). So after you grab the joystick you can move your hand somewhere comfortable: rest your arm on an armrest and hang the controller off the edge, or jam the bottom end of a Vive wand right above your knee so you can move it around like a joystick. This method of controlling the throttle and joystick isn't imprecise or new; there's already a VR game in development that uses this method of cockpit control: VTOL VR.
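The core of both mappings is relative to the grab pose: stick deflection comes from how far the controller has rotated since you grabbed, and thrust comes from how far the hand has slid along the throttle's travel axis. A minimal sketch, using plain Euler angles for brevity (a real implementation would use quaternions) and made-up ranges:

```python
# Illustrative grab-relative mapping; angles in degrees, positions in metres.

def stick_axes(grab_euler, current_euler, max_deflect_deg=30.0):
    """Map controller rotation since grab to (pitch, roll, yaw) in -1..1."""
    axes = []
    for grab, cur in zip(grab_euler, current_euler):
        delta = cur - grab
        axes.append(max(-1.0, min(1.0, delta / max_deflect_deg)))
    return tuple(axes)

def throttle_position(grab_pos, current_pos, track_axis, track_length=0.15):
    """Project hand displacement onto the throttle's travel axis; 0..1 output."""
    disp = [c - g for c, g in zip(current_pos, grab_pos)]
    along = sum(d * a for d, a in zip(disp, track_axis))
    return max(0.0, min(1.0, along / track_length))
```

Because only the rotation delta matters for the stick, the hand's actual position is free, which is exactly what lets you rest the controller on an armrest or knee after grabbing.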
Controls you need immediate access to while piloting can be accessed through inputs on the controllers when you are locked on to the throttle and joystick.
- The right trigger can control the joystick’s trigger and be primary fire.
- Secondary fire can be part of the trackpad or the menu button on the Vive, the A button on the Touch, and either part of the trackpad or the inner grip button on the Knuckles.
- The right hand trackpad or joystick can primarily control the thumb axis on the joystick. This could be mapped to the power distribution, but I think that would be better as a secondary control. The thumb axis would probably be better as targeting control like I see on some HOTAS setups (horiz. next/prev target, vert. next/prev subsystem). The B button on Touch could also be mapped to either target ahead or target hostile. Touchpads could be mapped in multiple ways: press the touchpad while your thumb is in one of the regions, do a touchpad swipe gesture, or use it like a 2-axis scroll wheel (potentially finer control over switching targets).
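The "2-axis scroll wheel" option could work by accumulating thumb travel on the touchpad and emitting a next/prev event each time a full notch of travel is crossed. This is a sketch under assumptions: the notch size, event names, and -1..1 touch coordinates are all invented for illustration.

```python
# Hypothetical touchpad-as-scroll-wheel for target/subsystem cycling.

class TouchpadScroller:
    def __init__(self, notch=0.25):
        self.notch = notch   # touchpad travel per scroll step
        self.acc_x = 0.0
        self.acc_y = 0.0
        self.last = None     # last touch point, or None when thumb lifted

    def update(self, touch):
        """touch is (x, y) in -1..1 while touched, or None when lifted."""
        events = []
        if touch is None:
            self.last = None
            return events
        if self.last is not None:
            self.acc_x += touch[0] - self.last[0]
            self.acc_y += touch[1] - self.last[1]
            while self.acc_x >= self.notch:
                events.append("next_target"); self.acc_x -= self.notch
            while self.acc_x <= -self.notch:
                events.append("prev_target"); self.acc_x += self.notch
            while self.acc_y >= self.notch:
                events.append("next_subsystem"); self.acc_y -= self.notch
            while self.acc_y <= -self.notch:
                events.append("prev_subsystem"); self.acc_y += self.notch
        self.last = touch
        return events
```

Smaller notches give the finer target-switching control mentioned above; resetting on lift-off means repeated short swipes behave like spinning a scroll wheel.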
- From what I can see in pictures, the throttle in the Elite cockpit has a primary thumb button and 2 red ones. On the Touch, the joystick click and the X and Y buttons can be mapped to these. On the Vive wands, since these virtual buttons correspond to different positions of your thumb, different regions of the trackpad can map to them: a trackpad press presses the button, and capacitance can be used along with haptics so you can see your in-game thumb on the relevant button. The trackpad on the Knuckles is different, so it would need some testing to pick the best mapping. These buttons could presumably be mapped to some portion of this set of buttons (boost, thrust up, thrust down, chaff, fire group).
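The Vive trackpad-region idea might look like this: the capacitive thumb position selects which virtual throttle button is highlighted, and a physical trackpad press activates it. The region layout, angles, dead zone, and button names below are assumptions made up for the sketch.

```python
import math

# Hypothetical trackpad regions, as (virtual button, centre angle in degrees).
REGIONS = [
    ("primary_thumb_button", 90.0),   # top of the pad
    ("red_button_left", 210.0),       # lower left
    ("red_button_right", 330.0),      # lower right
]

def region_at(x, y, dead_zone=0.2):
    """Return the virtual button under the thumb, or None near the centre."""
    if math.hypot(x, y) < dead_zone:
        return None
    angle = math.degrees(math.atan2(y, x)) % 360.0
    # Pick the region whose centre angle is closest (wrapping around 360).
    best = min(REGIONS,
               key=lambda r: min(abs(angle - r[1]), 360.0 - abs(angle - r[1])))
    return best[0]

def on_trackpad(x, y, pressed):
    """Capacitive touch highlights a button; a trackpad press activates it."""
    name = region_at(x, y)
    return (name, pressed and name is not None)
```

Returning the highlighted region even without a press is what lets the in-game thumb visibly rest on the relevant button before you commit.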
The in-game panels could be controlled by some selection of direct physical input, laser pointing, the mapped joystick axes and trigger, and the thumb axis on the joystick.
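The laser-pointing option boils down to intersecting the controller's forward ray with a panel modelled as a plane, then reading off 2D panel coordinates. A minimal self-contained sketch (plain vector math; all geometry here is illustrative):

```python
# Hypothetical ray-vs-panel test for laser-pointer panel input.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def ray_panel_hit(origin, direction, panel_origin, panel_normal,
                  panel_u, panel_v):
    """Return (u, v) coordinates on the panel, or None if there's no hit."""
    denom = dot(direction, panel_normal)
    if abs(denom) < 1e-6:
        return None  # ray is parallel to the panel
    t = dot([p - o for p, o in zip(panel_origin, origin)], panel_normal) / denom
    if t < 0:
        return None  # panel is behind the controller
    hit = [o + t * d for o, d in zip(origin, direction)]
    rel = [h - p for h, p in zip(hit, panel_origin)]
    return (dot(rel, panel_u), dot(rel, panel_v))
```

The returned (u, v) pair is what a UI layer would hit-test against panel widgets, with the trigger acting as the click.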
It might be reasonable for some of these controls not to be fixed but instead to be remappable, as if the throttle and joystick in the in-game cockpit were a virtual in-world HOTAS that pilots can remap.
This is a start, and there are more controls than these. But I think someone more versed in the Elite lore and what all the buttons in the in-game cockpit do (i.e., how in-universe pilots actually pilot ships) is better suited to deciding what the final controls are.