Reviving the discussion about manipulating in-cockpit controls with VR controllers in Elite

Another thought that just came to mind: wouldn't the galaxy map be so much easier to use if you could use controller gestures, like grabbing to pan and a two-controller pinch gesture to zoom in and out?
 
Ohh, it works with Touch controllers? Have you had a chance to compare it to the Vive controllers, to see whether the level of control is the same or better with the wands? One of the things I worried about was that controllers like the Touch and Knuckles, which don't have that joystick-like shape, might not function as well.


Button navigation would probably be the most in-universe way of controlling the holoscreens anyway. I'd imagine you'd just use the Vive wand's trackpad or the Touch's joystick to control the screens while your hand is on the joystick, pretending that you are interacting with the buttons on top of the in-cockpit joystick (which is how I assume an in-universe ED commander would actually control those screens). Have you used the joystick buttons in VTOL VR to control the targeting camera yet?

Strictly speaking, though, if you don't care about that and just want comfortable controls, it would still be possible to give the screens laser-pointer controls even though they sit far away at varying distances. Laser pointers are still very common in VR and everyone is used to them.
I’ve not had the chance to try out a Vive, so I can’t comment on how they compare. What I will say is that using the Grab button on the Touch to take hold of the in-game joystick puts my hand in almost exactly the same position as it would be on my real joystick, with my finger on the trigger and the thumbstick acting as a HAT (which works really well with the VTOL VR camera, btw). Feels totally natural to me and pretty much like holding a normal joystick, just one that’s got no resistance or stops :)

As for the laser pointer, I’d like the possibility of that and of being able to jab things with my finger. Oculus Home lets you do both, so I suppose in ED the ideal solution for me would be the ability to reposition the holoscreens, like with virtual desktops. I’d rather be prodding at the A4-sized screens of an SLF next to my head than pointing at the dinner-table-sized things a couple of metres away on a ‘conda.

It’ll be interesting to see what games do with the Knuckles once they arrive; I’m hoping they might spur more Touch-like controllers or something equivalent.

Edit:
Another thought that just came to mind: wouldn't the galaxy map be so much easier to use if you could use controller gestures, like grabbing to pan and a two-controller pinch gesture to zoom in and out?
Oooh, I’d get all Tony Stark / Minority Report / most Sci-Fi with that!
 
Strictly speaking, I want all the options. If you've got a HOTAS and absolutely need that 100% precision, it's still there. But I don't: a virtual stick gives me enough precision to do all the stuff I do in ED, and I love being able to just grab onto the controls I see right in front of me in the in-game cockpit. More importantly, I like being able to let go of them, see my hands move around, and feel like I'm actually in the cockpit when my ship doesn't need input from me and I'm just waiting for it to travel where I've pointed it.

I was imprecise. I'm all about the options; there is even a video where a developer from Egosoft explains that you can mix control schemes (I mentioned that), so that, for example, you fly with a HOTAS in one hand and use a laser pointer in the other. I just think that "emulating" a joystick using VR controllers is sub-par compared to a dedicated control scheme like the one in XR VR.

IIRC you can put away your controllers at any time by switching to mouse mode. It has been a while since I played it, so I might have that wrong, but I am pretty sure your hands aren't tied to the ship all the time. XR VR also has this very cool feature where you can remove your HMD and the game switches to 2D. Would love that in E: D, btw.

Do you want to control the in-cockpit displays and station display with the wands like they are controlled in the video?
Actually, I think the left and right holo screens should be touchable, or "laser-pointable", provided they introduced proper motion controls. This would pave the way for specialised panels which could allow you to control all the ship's systems using your VR controllers. I mentioned DCS World - take a look at how it works in practice (skip to around 4:24):
[video=youtube_share;SaMBmQxzzPQ]https://youtu.be/SaMBmQxzzPQ[/video]
Of course, starting a fighter jet is a complicated process and DCS is a hardcore sim. Here in E: D we would just point to launch and to flip the landing gear off, period. But having an interactive cockpit does tons for immersion. It doesn't get old for me; it becomes second nature. I even use the gear shifter on my driving wheel (serving as a button board for the Hornet) as a landing gear lever.

The ED ships all have HOTAS. The reasons listed for not owning a HOTAS seem to be rather silly. A shame we now live in an age where people need everything placed for them within arm's length, so to speak, so they don't have to move. The red lines people draw around themselves!
Every flight sim I’ve tried with motion controllers has felt disconnected and imprecise.
It's also so odd that voice control gets completely ignored.

Try to think outside the box sometimes. Earlier, all engines used steam and wooden or steel wheels! And as a strawman you belittle other people's valid issues - because a cramped living space is an issue. Is it a fault of the game? No. Could it be totally eliminated using the aforementioned motion control? Yes.

There is a difference with motion controllers pretending to be another control method: if you try to pretend the Vive wands or Oculus Touch are a yoke, and you pretend that you're holding the yoke (or in the Hornet's case, a HOTAS), it won't work. If you're trying to pretend that holding two controllers in the air is like holding a wheel, it won't work. But why apply a 20th-century method to control a starship, which can be precisely controlled in VR by other methods? Why dismiss them citing some "ultimate truth"?

Did you at least watch the XR video I posted? The guy there makes comparisons to HOTAS quite a few times. Try it in XR VR if you have the opportunity. It will be weird at first, but like I mentioned, it suddenly "clicks" after a few minutes and becomes second nature - which is not surprising, as you're basically gesturing with your hands rather than trying to push a spring-loaded joystick. Just don't fly for 30 seconds and triumphantly declare "oh, I tried it and it sucks, so I won't try again". Rewiring your brain takes time, though I was really surprised how fast I accustomed myself to the new control scheme. <opinion>For me, it's simply better.</opinion> Though be warned: after having a comparison, you will probably want Elite to have those controls too...

As for the laser pointer, I’d like the possibility of that and of being able to jab things with my finger. Oculus Home lets you do both, so I suppose in ED the ideal solution for me would be the ability to reposition the holoscreens, like with virtual desktops. I’d rather be prodding at the A4-sized screens of an SLF next to my head than pointing at the dinner-table-sized things a couple of metres away on a ‘conda.

It’ll be interesting to see what games do with the Knuckles once they arrive; I’m hoping they might spur more Touch-like controllers or something equivalent.

Edit:
Oooh, I’d get all Tony Stark / Minority Report / most Sci-Fi with that!

I'd love real "holobuttons" too... This Tony Stark vibe is also my intention. Alas, we Vivers need the Knuckles to arrive first, as the viwands are sub-par in this regard, so until that happens the userbase is even smaller... Knuckles v2 are confirmed to have a thumbstick and a touchpad, so it's the best of both worlds. I also think that when both major systems have finger sensing like Touch, game interfaces will adapt. It's still a long journey, but an exciting one nonetheless.

Excellent idea on repositioning the holo-screens. After all, they are holo screens and can surround a pilot with ease. Imagine something like the Engineer's ship bridge in Prometheus, where a galaxy map surrounds you and you can interact with planets the way David did... One can dream.
 
I've tried all these other methods and they are all inferior to a HOTAS and voice. It's the same with sim racing, where a wheel and pedals are still best.

One precise question before we can agree to disagree: have you tried X-Rebirth VR edition with motion controls for more than a minute? Because if not, we're not on equal footing here. Until the controls suddenly feel natural, they feel clunky and imprecise. But that's because, as Jobs said, "you're holding it wrong" ;-).

I already said that imitating real-life controllers is much worse than a tailored motion control system, and it usually doesn't work well. A wheel is a wheel; we're used to it and to how it works. Because it is connected to the car's wheels, it also acts as a feedback device reacting to the road. That's why we have force feedback wheels for our racing sims.

And voice, while it's ultra immersive and a new level of cool, has one big limitation: lag. You need to say a command, then say another one, and then another one. By the time you get to the second command, I would have pushed the three buttons twice. I'm not in any way belittling voice control - it's really great, even if it's a pain to set up and usually only works well with English. It's just that there are situations where I would prefer to use buttons instead of talking: "full power to shields, target last system, engage FSD". But for "landing gear" or even "gear", or "engage FSD" under normal conditions, it's great. As mentioned, though, you won't use your voice for steering the ship, because that would be impractical.
 
One precise question before we can agree to disagree: have you tried X-Rebirth VR edition with motion controls for more than a minute? Because if not, we're not on equal footing here. Until the controls suddenly feel natural, they feel clunky and imprecise. But that's because, as Jobs said, "you're holding it wrong" ;-).

I already said that imitating real-life controllers is much worse than a tailored motion control system, and it usually doesn't work well. A wheel is a wheel; we're used to it and to how it works. Because it is connected to the car's wheels, it also acts as a feedback device reacting to the road. That's why we have force feedback wheels for our racing sims.

And voice, while it's ultra immersive and a new level of cool, has one big limitation: lag. You need to say a command, then say another one, and then another one. By the time you get to the second command, I would have pushed the three buttons twice. I'm not in any way belittling voice control - it's really great, even if it's a pain to set up and usually only works well with English. It's just that there are situations where I would prefer to use buttons instead of talking: "full power to shields, target last system, engage FSD". But for "landing gear" or even "gear", or "engage FSD" under normal conditions, it's great. As mentioned, though, you won't use your voice for steering the ship, because that would be impractical.

Yes, I tried for long enough to judge it.

Re voice: my approach isn't to depend on voice as a primary control but to use it as a useful additional way of doing things when there's a lot to do and not much time to do it.
 
I like it when an answer is thought out, like yours, CylonSurfer. What follows is my opinion, not fact, and I respectfully disagree with yours on both the precision and immersion fronts :). My view on it is slightly different from the OP's, though: I want to get rid of the HOTAS completely, not emulate it!

I used to be against it too, and also thought HOTAS was superior. Until this one day, when I tried X-Rebirth VR. At first I tried it with a HOTAS, of course, because motion controls for a space sim?! Give me a break! But... the number of bindings was large and I wanted to play, so... I skipped configuring the HOTAS and tried the default settings with the VR controllers, because why the hell not. I'd do the tutorial, learn what to bind, etc...

So I started playing. At first it was ridiculous. No, that is not precise, come on, where are you flying... Stop, not into the waaaaaall... Then the moment came: oh wait. Wait a frikkin' second. It all starts to click... Oh my god, how could I be so blind! After about five minutes your brain adapts to the new control scheme... And it's pure bliss. With viwands + trackpads, steering your ship feels like the future. HOTAS doesn't even come CLOSE.

Tried it and really couldn't abide it. Worse still, I could not figure out how to re-enable my HOTAS without quitting the game and removing the batteries from my Touch controller. I've not been back since, I'm sorry to say. I found it a rather frustrating experience.

As for the control system itself, I found it rather lacklustre; twisting your wrists and laser pointing just wasn't for me.

BTW, I'm sure you could emulate the X-Rebirth control scheme in Elite with vJoy + the free GlovePIE and a couple of WiiMote Motion Plus controllers. SteamVR may even offer similar opportunities for the more coding-savvy player.
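
For anyone curious what such a vJoy/GlovePIE script boils down to, here is a minimal Python sketch of the core mapping: controller rotation angles converted into virtual joystick axis values. The 0..32767 range mirrors vJoy's default axis range; the 45° deflection limit and the pitch/roll/yaw axis assignments are my assumptions, not anything from a real script.

```python
# Sketch: map a motion controller's orientation to virtual joystick axes,
# the kind of mapping a vJoy/GlovePIE feeder script would perform.

def angle_to_axis(angle_deg, max_deflection_deg=45.0):
    """Convert a rotation angle to a vJoy-style axis value (0..32767, centre 16384)."""
    # Clamp to the configured deflection range, then normalise to -1..1
    a = max(-max_deflection_deg, min(max_deflection_deg, angle_deg))
    normalized = a / max_deflection_deg
    return int(round((normalized + 1.0) / 2.0 * 32767))

def controller_to_axes(pitch_deg, roll_deg, yaw_deg):
    """Map wrist pitch/roll/yaw angles to three axes of a virtual stick."""
    return {
        "x": angle_to_axis(roll_deg),   # roll
        "y": angle_to_axis(pitch_deg),  # pitch
        "rz": angle_to_axis(yaw_deg),   # yaw (stick twist)
    }
```

A real feeder would then push these values to the virtual device each frame; Elite would see it as an ordinary joystick.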

Last night I flew an hour-long flight in a Cessna in Aerofly 2. I had the choice to fly using motion controls in an almost fully interactive cockpit, but instead used a yoke, throttle and pedal combo. If the game had better start-up and shut-down sequences, I would likely have used motion controls to start the plane from cold and tune my instruments. However, for the take-off, flight and landing I'd only ever use physical controls.

PS, I also like your gear shifter as landing gear lever idea, I may have to steal that little tip :)
 
Another thought that just came to mind: wouldn't the galaxy map be so much easier to use if you could use controller gestures, like grabbing to pan and a two-controller pinch gesture to zoom in and out?

Oooh, I’d get all Tony Stark / Minority Report / most Sci-Fi with that!

🤣

IIRC you can put away your controllers at any time by switching to mouse mode. It has been a while since I played it, so I might have that wrong, but I am pretty sure your hands aren't tied to the ship all the time. XR VR also has this very cool feature where you can remove your HMD and the game switches to 2D. Would love that in E: D, btw.
That would be nice; in general it would be nice if ED had a reasonable options system. But from my understanding, ED has nothing like a profiles system, and many of the options affect both the 2D and VR modes of ED, some of them in ways that feel mutually exclusive. There's a 3rd-party application to help with control profiles for this reason. So I doubt ED could simply swap between 2D and VR.

I'd love real "holobuttons" too... This Tony Stark vibe is also my intention. Alas, we Vivers need the Knuckles to arrive first, as the viwands are sub-par in this regard, so until that happens the userbase is even smaller... Knuckles v2 are confirmed to have a thumbstick and a touchpad, so it's the best of both worlds. I also think that when both major systems have finger sensing like Touch, game interfaces will adapt. It's still a long journey, but an exciting one nonetheless.
VTOL VR has a lot of buttons, knobs, and switches. When you get close, the hand sort of locks on; then you pull the trigger on the wand to press a button, or pull the trigger to grab a switch and twist or move your hand to turn/toggle it. It actually works quite nicely.
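
That "lock-on" behaviour can be sketched as a simple nearest-control search with a snap radius. This is just an illustration of the idea, not how VTOL VR actually does it; the radius value and control layout are made up.

```python
import math

# Sketch of a VTOL-VR-style "lock-on": when the hand comes within a snap
# radius of a control, it locks on, and the trigger then operates that
# control instead of needing pixel-precise touching.

SNAP_RADIUS = 0.08  # metres, hypothetical

def nearest_control(hand_pos, controls):
    """Return (name, distance) of the closest control, or (None, inf)."""
    best, best_d = None, float("inf")
    for name, pos in controls.items():
        d = math.dist(hand_pos, pos)
        if d < best_d:
            best, best_d = name, d
    return best, best_d

def locked_control(hand_pos, controls, snap_radius=SNAP_RADIUS):
    """The control the hand has locked on to, if any."""
    name, d = nearest_control(hand_pos, controls)
    return name if d <= snap_radius else None
```

With the lock resolved, a trigger press simply activates whatever `locked_control` returned, which is why it feels forgiving with wand-style controllers.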

I'm thinking that in the options we'll have a "Use skeletal input" toggle, and the interface in ED would work in two ways:

  • If you have skeletal input off (best for wands), going near buttons will have the same sort of automatic poses as VTOL VR and the trigger will be used to control things.
  • If you have skeletal input on (best for Knuckles and possibly Touch), it will use the SteamVR Skeletal Input or the Oculus SDK's hand poses to control the pilot's hands. In this mode you would physically push buttons with your index finger or pinch sliders.

It's slightly tricky, but it also occurs to me that we could have a mode where we use either the SteamVR Skeletal Input with gloves or Leap Motion's bones API, and disable that input when we think the player has grabbed onto their physical HOTAS. So you can seamlessly play with a HOTAS, let go and toggle a switch, and then grab onto your HOTAS and continue flying.
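
A minimal sketch of that handoff logic, assuming we know (or have calibrated) where the physical HOTAS sits in tracking space. The radii are made-up values; the hysteresis band stops the mode from flickering when the hand hovers right at the boundary.

```python
import math

# Sketch: suppress skeletal/hand-tracking input while the hand is resting
# on the physical HOTAS, and re-enable it once the hand clearly moves away.
# Grab/release radii differ (hysteresis) so the state doesn't oscillate.

class HotasHandoff:
    def __init__(self, hotas_pos, grab_radius=0.10, release_radius=0.15):
        self.hotas_pos = hotas_pos          # calibrated stick position (metres)
        self.grab_radius = grab_radius
        self.release_radius = release_radius
        self.on_hotas = False

    def update(self, hand_pos):
        """Return True while skeletal/hand input should be active."""
        d = math.dist(hand_pos, self.hotas_pos)
        if self.on_hotas:
            if d > self.release_radius:     # hand has clearly left the stick
                self.on_hotas = False
        elif d < self.grab_radius:          # hand has settled onto the stick
            self.on_hotas = True
        return not self.on_hotas
```

Called once per frame, this would let the game hand control back and forth between the physical stick and the tracked hands without an explicit toggle.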

Excellent idea on repositioning the holo-screens. After all, they are holo screens and can surround a pilot with ease. Imagine something like the Engineer's ship bridge in Prometheus, where a galaxy map surrounds you and you can interact with planets the way David did... One can dream.
Repositioning might be a bit of work, but yeah, we could go that route too. Fully free-form placement might be a bit too messy given imprecise human positioning, but I think we could get enough control by offering "distance" and "scale" sliders to configure the panels.
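
Those two sliders reduce panel placement to a tiny bit of math. A sketch, with hypothetical parameter names and a made-up base panel size:

```python
# Sketch of "distance" and "scale" sliders for a holo panel: the panel keeps
# its default angular direction from the pilot's head, but the user slides it
# nearer or farther and rescales it.

def place_panel(direction, distance, scale, base_size=(1.2, 0.8)):
    """Return (position, size) for a holo panel.

    direction -- unit vector from the head towards the panel's default spot
    distance  -- slider value in metres
    scale     -- slider multiplier applied to the panel's base width/height
    """
    position = tuple(c * distance for c in direction)
    size = tuple(s * scale for s in base_size)
    return position, size
```

Because only two scalars are exposed, the panel can never end up behind the pilot or skewed at a weird angle, which sidesteps the messy free-form case.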
 
Yes, I tried for long enough to judge it.

Tried it and really couldn't abide it. Worse still, I could not figure out how to re-enable my HOTAS without quitting the game and removing the batteries from my Touch controller. I've not been back since, I'm sorry to say. I found it a rather frustrating experience. As for the control system itself, I found it rather lacklustre; twisting your wrists and laser pointing just wasn't for me.

Well, I guess if you have flown it long enough and it didn't "click", then it's just not to everybody's liking. For me it was an epiphany of sorts :D Regarding re-enabling the HOTAS, I think it was a bug that later got fixed, but my memory is hazy. I think you could simply change the controller profile in the options to do it, but I might be mistaken. Plus, I cannot take the batteries out of my viwands :D

BTW, I'm sure you could emulate the X-Rebirth control scheme in Elite with vJoy + the free GlovePIE and a couple of WiiMote Motion Plus controllers. SteamVR may even offer similar opportunities for the more coding-savvy player.

Last night I flew an hour-long flight in a Cessna in Aerofly 2. I had the choice to fly using motion controls in an almost fully interactive cockpit, but instead used a yoke, throttle and pedal combo. If the game had better start-up and shut-down sequences, I would likely have used motion controls to start the plane from cold and tune my instruments. However, for the take-off, flight and landing I'd only ever use physical controls.

PS, I also like your gear shifter as landing gear lever idea, I may have to steal that little tip :)

Glad you like it. Another nice tip is to use the rotary dial on the wheel (if you have one; I have one on my Driving Force GT) to turn knobs. It actually registers as two buttons, so I could bind it in DCS. I think I bound it to emulate the mouse wheel somehow. I also have +/- buttons on the steering wheel, located one above the other, which I mapped to left/right click. So when I need to throw a switch up or down, I look at the switch in game and press one of those. It's fun, but it's not currently 100% reliable (gaze control).

Thx for the GlovePIE tip too ;-) I have a pair of WiiMotes lying around, but I am afraid they're not "Plus". Still, I would probably be too lazy to program this, but it's nice knowing that the option is there :D

As for yoke-type controls / emulating IRL devices, a physical representation of the device will always be better... until we get real tracked haptic gloves, that is. But these tend to be quite big:
[video=youtube;OK2y4Z5IkZ0]https://www.youtube.com/watch?v=OK2y4Z5IkZ0[/video]
[video=youtube;s-HAsxt9pV4]https://www.youtube.com/watch?v=s-HAsxt9pV4[/video]

So when someone says to use a VR controller to grab something which is solid IRL - like a yoke or a steering wheel - and also tells me to move my hands in unison like I was holding a real physical object... well, that's not gonna work, or not work so well. There's a reason people (myself included) build a "rifle stock" out of PVC plumbing pieces to be able to aim better in Onward. Like I said a few times here, I'm all for physical stuff if the control method isn't custom-tailored like in XRVR.

That would be nice; in general it would be nice if ED had a reasonable options system. But from my understanding, ED has nothing like a profiles system, and many of the options affect both the 2D and VR modes of ED, some of them in ways that feel mutually exclusive. There's a 3rd-party application to help with control profiles for this reason. So I doubt ED could simply swap between 2D and VR.

You're probably talking about DrKaii's profiler, but isn't that only for convenience, gathering all these options in one place? In fact, ED has quite a nice profile system, but there is one small caveat to it: all devices need to be plugged in when starting the game. If any of them is missing, the control scheme will not work properly at all. Regarding display options, you can switch from VR to 2D using the menus, but I'm not sure about the other way around. It probably works too, as it is a dropdown in the options (3D: HMD).

VTOL VR has a lot of buttons, knobs, and switches. When you get close, the hand sort of locks on; then you pull the trigger on the wand to press a button, or pull the trigger to grab a switch and twist or move your hand to turn/toggle it. It actually works quite nicely.

I'm thinking that in the options we'll have a "Use skeletal input" toggle, and the interface in ED would work in two ways:

  • If you have skeletal input off (best for wands), going near buttons will have the same sort of automatic poses as VTOL VR and the trigger will be used to control things.
  • If you have skeletal input on (best for Knuckles and possibly Touch), it will use the SteamVR Skeletal Input or the Oculus SDK's hand poses to control the pilot's hands. In this mode you would physically push buttons with your index finger or pinch sliders.

It's slightly tricky, but it also occurs to me that we could have a mode where we use either the SteamVR Skeletal Input with gloves or Leap Motion's bones API, and disable that input when we think the player has grabbed onto their physical HOTAS. So you can seamlessly play with a HOTAS, let go and toggle a switch, and then grab onto your HOTAS and continue flying.

Repositioning might be a bit of work, but yeah, we could go that route too. Fully free-form placement might be a bit too messy given imprecise human positioning, but I think we could get enough control by offering "distance" and "scale" sliders to configure the panels.

Regarding VTOL: I don't have it, so I cannot relate. In DCS I use the mouse along with gaze targeting. It works, but it's a bit unreliable, as mentioned. So I mostly use the mouse to throw switches... The problem is that this causes me to take my hand off the stick, which can be sub-optimal in a combat situation.

Aside from the fact that this is an awful lot of UI coding just for a niche within a niche of niches ;-) playerbase, I find the idea nice. But on second thought, there shouldn't be human positioning. The display should be like JARVIS and follow the user, regardless of ship. A universal spaceship control protocol, or USCP if you will. We already have it standardised (all ships' cockpit instruments look exactly the same), so why not make it so that it's projected in front of your suit and be done with it :) You'd just press the "HMD reset" button and all would be positioned correctly.
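
That "press HMD reset and the panels snap in front of you" idea amounts to re-anchoring a fixed head-local panel layout to the current headset pose. A sketch, where the panel offsets and the yaw-only rotation are simplifying assumptions:

```python
import math

# Sketch: each panel has a fixed offset in head-local space (right, up,
# forward). Pressing "HMD reset" re-anchors that layout to the headset's
# current position and yaw heading, so the panels land in front of the pilot.

PANEL_OFFSETS = {                 # hypothetical head-local offsets, metres
    "target": (-0.5, 0.0, 0.8),
    "system": ( 0.5, 0.0, 0.8),
}

def reset_panels(hmd_pos, hmd_yaw_rad, offsets=PANEL_OFFSETS):
    """Return world positions for each panel, anchored to the HMD pose."""
    cy, sy = math.cos(hmd_yaw_rad), math.sin(hmd_yaw_rad)
    placed = {}
    for name, (right, up, fwd) in offsets.items():
        # rotate the local offset by the head's yaw, then translate
        wx = hmd_pos[0] + right * cy + fwd * sy
        wz = hmd_pos[2] - right * sy + fwd * cy
        placed[name] = (wx, hmd_pos[1] + up, wz)
    return placed
```

Ignoring head pitch/roll keeps the panels upright and level, which is usually what you want for a reset gesture.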

The problem is, switching back and forth between VR controllers and a HOTAS is clunky, all while wearing a headset, especially if you have a cramped space like you do. I also have a small, self-made little desk I use for gaming in my living room, so I feel the pain :p That's why I'm a big fan of XRVR motion control.
 
ED can be played with keyboard and mouse, gamepads, HOTAS/HOSAS, etc.

Given that the game isn't restricted to using only HOTAS/HOSAS for flight controls, I don't see any valid reason whatsoever why the option for VR controller support should not be implemented.

It might actually help Galaxy Map navigation, IMO
 
Aside from the fact that this is an awful lot of UI coding just for a niche within a niche of niches ;-) playerbase, I find the idea nice. But on second thought, there shouldn't be human positioning. The display should be like JARVIS and follow the user, regardless of ship. A universal spaceship control protocol, or USCP if you will. We already have it standardised (all ships' cockpit instruments look exactly the same), so why not make it so that it's projected in front of your suit and be done with it :) You'd just press the "HMD reset" button and all would be positioned correctly.
I'm not quite sure what you're suggesting. The holo panels I'm thinking about are the target panel on your left and the system panel on your right, possibly the two upper panels depending on your opinion on how one should switch between the tabs on those panels. I'm not sure how a JARVIS style panel would work when there are 2-4 of them.

The problem is, switching back and forth between VR controllers and a HOTAS is clunky, all while wearing a headset, especially if you have a cramped space like you do. I also have a small, self-made little desk I use for gaming in my living room, so I feel the pain :p That's why I'm a big fan of XRVR motion control.
That's why I expect most users will be using a virtual stick/throttle, XRVR-style controls, VR gloves, Leap Motion, or a hybrid setup (joystick hand on the HOTAS, throttle hand with a VR controller and a virtual throttle).
 
That's why I expect most users will be using a virtual stick/throttle, XRVR-style controls, VR gloves, Leap Motion, or a hybrid setup (joystick hand on the HOTAS, throttle hand with a VR controller and a virtual throttle).

I doubt that; most simmers buy games to work with the equipment they already own. Most simmers who have turned to VR want it as an extension to their current rig, not as a replacement for it. Granted, Elite isn't exactly a sim, but it has attracted many people from that player base who are looking to scratch their space flight itch.

Ask the same question in a Race sim forum, you'll get pretty much the same reply just with a lot more attitude.

I'm not sure what percentage of the Elite player base actually uses VR, but I feel it is safe to assume less than 10% of all players. Of that 10%, most (not all) will have come from playing the game in 2D and will most likely already have a HOTAS or dual-stick setup, many with pedals too. Not many players will dump that for an inferior control setup. Your target audience, 9 times out of 10, would be those players who have just purchased a VR device that came with motion controls, have newly purchased Elite (or have never played it before), have no HOTAS or sticks, and are still in the "room scale or get lost" honeymoon phase of VR adoption. That would account for a very small number of Elite VR players and an even smaller number of overall Elite players. When you factor in that the Oculus Rift has always come with an Elite-compatible input device (Xbox pad & Touch), there would be even fewer VR players in the position of having to buy additional controllers to enjoy the game out of the box.

It could be said that proof of this is in the number of threads created on this topic, which is actually very small overall.

I hate to put a downer on your UI conversation, but the dev team have already stated that they feel the UI panel placement is optimal and have no intention of moving the panels, due to eye strain concerns and the geometry clipping issues that moving them would cause. Personally, I think that is a little sad, but hey, we still don't have a non-game-breaking way to even change the UI colour!

Anyway, I'm not here to argue with you; you want motion control support and that is fine. I seriously doubt you will see that support any time soon - if ever. It's likely best to temper any expectations you have for it and work around them by using other supported input devices, be that HOTAS, sticks or a simple game pad. This game has many issues and under-developed features, but the control scheme is not one of them; the number of supported devices and combinations of said devices is simply huge. Could it be improved? Sure, I'd kill to have separate gear-up and gear-down binds, just like the cargo scoop has, but these wants are way down the list of features and fixes that need the devs' time and focus, so I work around those problems.

You have at least hinted at having some programming expertise; have you considered delving into the SteamVR API and looking for a way to output your controllers' rotational values to a virtual game pad, which you could then bind to various axes in Elite? This would give you a solution similar to what StarlightPL has mentioned.
 
Just found out that X: Rebirth Home of Light complete edition is on sale on GOG. Note that this is not the VR version, but X: Rebirth with both expansions. Worth trying.
I like ED fine as is - as long as any updates don't break the game as it stands, I'm pretty happy - and I'm both old and certainly don't play casually - at least 4 hours most evenings and longer at weekends.
But if folk are looking for something else - or even if not - GOG have X Rebirth: Home of Light complete edition on sale at the minute. It seems to have been patched up quite a bit as well.
https://www.gog.com/game/x_rebirth_home_of_light_complete_edition

PSSST: Amigacooke - it's on Mac and Linux too! ;-)
 
I popped back in and made some observations of the UI and cockpit a few days ago while playing with my gamepad.

Notably I made a few observations of how the pilot's hands are animated to respond to control inputs.

I noticed the pilot's hands moving in response to the following control inputs:
  • The throttle moves in response to up/down changes to the throttle
  • The joystick moves in response to ship roll/pitch/yaw input
  • The index finger on the joystick moves in response to the primary weapon fire input

I did not try triggering actions like chaff, so I don't know if the left hand responds to button presses that could correspond to buttons on the throttle. Interestingly, I did notice that the pilot's hands do not respond to any of the following inputs:
  • There are no hand movements when you make vertical and horizontal translation movements (which raises the question of how ED pilots are even controlling the ship in this way). (Though I forgot to observe the controls when forward/backward translational movements are used instead of throttle control, which I switch to when my landing gear is down.)
  • Amusingly, the hand does not press any of the various buttons on the joystick when you use your secondary fire button.
  • Also I do not believe the pilot takes any action to control the holo panels. Not even using the hat switch to change the selection.

Given that the Touch's joystick has a capacitive sensor and trackpads are inherently capacitive, we could probably improve the joystick reactions significantly by doing things like moving the thumb to the hat switch when you touch the stick/pad. We could also pick a button to vaguely stand in for secondary fire (one of the two Touch face buttons, or the Vive menu button) and tie the animation to that.
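
As a sketch of how that capacitive state might drive the hand animation, here is the selection logic in miniature. The pose names are invented for illustration; the point is just that capacitive "touched" flags, not presses, are enough to pick a pose.

```python
# Sketch: choose an animation pose for the virtual hand from the
# controller's capacitive touch state (Touch thumbstick / Vive trackpad
# and trigger both report "touched" before any actual press).

def thumb_pose(stick_touched, trigger_touched):
    """Pick a hand pose from capacitive sensing alone."""
    if stick_touched:
        return "thumb_on_hat"        # thumb hovers over the joystick hat
    if trigger_touched:
        return "finger_on_trigger"   # index finger rests on the trigger
    return "relaxed_grip"            # default grip on the stick
```

A real implementation would blend between these poses rather than switch instantly, but the input side really is this simple.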

I also noticed that UI panel placement is ship-dependent. In my large 2-seater Cobra MkIII the panels are large and just out of reach. However, when I switched to my old Eagle, I noticed that the panels are actually smaller and directly next to me in the small cockpit. That said, the part of the cockpit to your left in the Cobra that looks like a keyboard (also out of reach), below the holo panel, is entirely absent in the Eagle's cockpit.

Perhaps it would be a valid request to allow toggling an option to always have the close-up, Eagle-like small-cockpit holo panels in all ships, if we wanted to support direct interaction with them.
 
Given that there are VR flight simulators out there using hand controllers, with their various buttons, pointing and gestures, as an interface, I can't see why a clever person somewhere couldn't adapt ED to enable motion-based flight control and more. Maybe it'll end up being a third-party add-on, and I'd buy it.

I'd love to be able to 'bin' my non-configurable HOTAS, use hand gestures, pull up screens and pilot like in some of the latest flight simulators. In the true VR experience, who needs a joystick?

Then just a fan on the desktop, a tube for a drip feed, a sub-bass transducer attached to one's back and a hydraulic inertia chair would suffice... (-:
 
Given that there are VR flight simulators out there using hand controllers, with their various buttons, pointing and gestures, as an interface, I can't see why a clever person somewhere couldn't adapt ED to enable motion-based flight control and more. Maybe it'll end up being a third-party add-on, and I'd buy it.
There is a SteamVR overlay that allows you to place fairly basic virtual buttons near you. And someone has actually used this with ED. Unfortunately I could not get it to work when I tried it.
https://youtu.be/FJbsES-mfEc

Though I really want this directly in ED, because I want hand presence with the actual in-game character I'm playing. It might also be difficult for a third party to line up the joystick/throttle locations, handle grabbing and letting go of the virtual stick, and implement the other bits of functionality that require aligning things with the cockpit and detecting the current in-game mode (galaxy map, looking at a holo screen, etc.).
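The grab/release handling mentioned above could be sketched as a small per-frame state update: the hand must be close enough to the virtual stick and squeezing grip to take hold, and releasing grip lets go. The struct, function and threshold here are all made-up assumptions for illustration:

```cpp
// Hypothetical sketch of "grab the virtual stick" logic, updated once per frame.
#include <cmath>

struct Vec3 { float x, y, z; };

static float dist(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Returns the new "holding the stick" state for this frame.
// grabRadius is an arbitrary guess (12 cm) at a comfortable reach threshold.
bool updateStickGrab(bool wasHolding, const Vec3& hand, const Vec3& stick,
                     bool gripPressed, float grabRadius = 0.12f) {
    if (wasHolding) return gripPressed;  // keep hold until grip is released
    return gripPressed && dist(hand, stick) < grabRadius;  // must reach the stick
}
```

Note the hysteresis: once held, the hand can drift from the stick's rest position (it is deflecting the stick) without dropping it, which is part of why this is awkward to bolt on from outside the game.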
 
I popped back in and made some observations of the UI and cockpit a few days ago while playing with my gamepad.

Notably I made a few observations of how the pilot's hands are animated to respond to control inputs.

I noticed the pilot's hands moving in response to the following control inputs:
  • The throttle moves in response to up/down changes to the throttle
  • The joystick moves in response to ship roll/pitch/yaw input
  • The index finger on the joystick moves in response to the primary weapon fire input

I did not try triggering actions like chaff, so I don't know whether the left hand responds to button presses that could correspond to buttons on the throttle. Interestingly, I did notice that the pilot's hands do not respond to any of the following inputs:

Correct, the in-game animations are only for yaw, pitch, roll and thrust.

Given that the Touch's joystick has a capacitive sensor and trackpads are inherently capacitive, we could probably improve the joystick reactions significantly by doing things like moving the thumb to the hat switch when you touch the stick/pad. We could also decide on one in-cockpit joystick button to treat as secondary fire, and tie one of the two Touch face buttons or the Vive menu button to it.

We can't do anything; that ball is firmly in FDEV's court.

I also noticed that UI panel placement is ship dependent. In my large two-seater Cobra MkIII the panels are large and just out of reach. However, when I switched to my old Eagle, I noticed that the panels are actually smaller and directly next to me in the small cockpit. That said, the part of the Cobra's cockpit to your left that looks like a keyboard (also out of reach), below the holo panel, is entirely absent from the Eagle's cockpit.

If we wanted to support direct interaction with the holo panels, perhaps it would be a valid request to add an option that always uses the close-up, Eagle-style small cockpit holo panels in all ships.

As stated in another post of mine in this thread, FDEV have already spoken about this: they believe the UI placement is optimal in every ship and will not be moving the panels, due to concerns over eye strain and geometry clipping.

Many of the cockpit elements, such as the keyboards in the ships that have them, are out of reach. The game was created long before motion controls were even released, so that should not come as a shock to you. Not one element of the cockpits or the game's UI layout was designed with motion control support in mind.

Because I want hand presence with the actual in-game character I'm playing.

Hand presence is the least important part of sim gaming; for most people, the equipment you play with matters more. When you have your HOTAS lined up with the in-game throttle and stick, you more or less have hand presence. The only thing the game doesn't know is when you take your hands off the controls. You are denying yourself the additional presence/immersion by not buying the cheap $30 equipment that would get you it, and coming up with flimsy excuses as to why you won't. You state you can't afford $30 to buy one, yet you overspent on your HMD by at least $100; you say you cannot store a HOTAS in your apartment, but what you actually mean is that you don't want to.

I understand that you are still in the honeymoon phase of VR adoption, but motion controls doth oft maketh VR.

Some forms of motion control interactivity are indeed possible. I posted this a few days ago; it demonstrates a kind of integration of motion control for UI manipulation.

https://streamable.com/5brbf

Elite Dangerous offers a myriad of supported control options; motion control is not one of them at this time, so your only option is to build something yourself. Did you take a look at the SteamVR API to see what you could do?
 
Hand presence is the least important part of sim gaming; for most people, the equipment you play with matters more. When you have your HOTAS lined up with the in-game throttle and stick, you more or less have hand presence. The only thing the game doesn't know is when you take your hands off the controls. You are denying yourself the additional presence/immersion by not buying the cheap $30 equipment that would get you it, and coming up with flimsy excuses as to why you won't. You state you can't afford $30 to buy one, yet you overspent on your HMD by at least $100; you say you cannot store a HOTAS in your apartment, but what you actually mean is that you don't want to.

Yep, silly.
 
We can't do anything; that ball is firmly in FDEV's court.
Sorry, I'm used to writing from the perspective of someone implementing things. Yes, these are notes on how things could be implemented by FDev, or by anyone FDev gives the ability to implement this stuff.

As stated in another post of mine in this thread, FDEV have already spoken about this: they believe the UI placement is optimal in every ship and will not be moving the panels, due to concerns over eye strain and geometry clipping.
Given that the close-up UI panels already exist in Elite Dangerous in some ships, I don't think the eye strain argument is valid in this case. If that placement is bad for the player, why is it acceptable to do that to the player when they're in an Eagle?

Geometry clipping could be a thing. Sadly, I only compared two ship cockpits, and at the moment all I otherwise have to compare is a Sidewinder.
However, the cockpit chair is standardized between all ships; in a large ship I don't see any reason for there to be ship geometry directly next to the chair that could be clipped, and when I skimmed cockpit interior videos I didn't see any objects between where the left/right holo panels are placed and where they would be placed close up. Given that, I doubt there is much of a problem in making it possible for holo panels to be close in large ships. Worst case, each large ship just needs a second valid holo panel placement in its metadata, which should completely eliminate any possible issue with geometry clipping, obstruction, or alignment with the cockpit interior.

That said, I only paid attention to the left/right navigation and system/ship panels; I haven't looked closely at the comms and info panels in the Eagle yet. I need to check how usable those would be via direct interaction. Then again, given that you don't actually interact directly with either of those very much, it may not be an issue. Info is basically never interacted with, and for comms all you need to do is switch tabs and interact with mission updates and voice comm controls. It's not like the navigation and system panels, which you regularly use to navigate, check contacts, dock, check ship status and management, and possibly control ship functions like landing gear (if you're like me and enjoy using the landing gear control in the system panel).

Did you take a look at the SteamVR API to see what you could do?
Generally, the SteamVR API has ways to get controller info and create overlays. Overlays are mostly limited to 2D surfaces floating in 3D space, rather than full models. I think some people have managed to trick SteamVR into rendering 2D overlays that look 3D, but it's not supposed to be easy or a normal thing to do. There is also an API that exposes tracked objects, and some people have used it to create fake tracked objects that render 3D things in-game by providing render models. But that won't work in ED, of course, because ED doesn't even acknowledge the tracked controllers' presence, much less render any of their render models. Creating fake input axes for ED to pick up is another matter, though it's probably possible (if likely to involve strange APIs and limitations that force you to implement the overlay in a lower-level language instead of, say, tossing something together in Unity).

Someone has already created a SteamVR overlay that lets you place 2D buttons as floating overlays, and another that creates a virtual joystick. I couldn't get the input buttons overlay to work personally, but the fact that it exists makes me not really feel like bothering with what would essentially be making another of the same thing.
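The core of such a floating-button overlay is just a ray/quad intersection test: cast a pointer ray from the controller and check whether it lands inside the button's rectangle. A minimal sketch, simplified by placing the button in an axis-aligned plane (a real overlay would first transform the ray into the overlay's local space); all names here are illustrative, not from any actual overlay tool:

```cpp
// Hypothetical sketch: does the controller's pointer ray hit a flat
// floating button? Button modelled as a rectangle in the plane z = planeZ.
#include <cmath>

struct Ray { float ox, oy, oz; float dx, dy, dz; };  // origin + direction

bool rayHitsButton(const Ray& r, float planeZ,
                   float left, float right, float bottom, float top) {
    if (std::fabs(r.dz) < 1e-6f) return false;   // ray parallel to the plane
    float t = (planeZ - r.oz) / r.dz;
    if (t < 0.0f) return false;                  // button is behind the hand
    float x = r.ox + t * r.dx, y = r.oy + t * r.dy;
    return x >= left && x <= right && y >= bottom && y <= top;
}
```

An overlay app would run this test every frame against each button quad and fire the bound keyboard/joystick emulation input when the trigger is pulled while a button is hit.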

Of course, too many of the things that would make for a GOOD 3rd-party interface depend on Elite Dangerous and are not available to things outside ED itself:
  • You can't just add hand presence in, unless you are happy with a floating 2D glyph of a hand while the 3D hand attached to your shoulder sits statically on the flight controls, like someone with alien hand syndrome.
  • To my knowledge you can't tell what context/state ED is in, i.e. normal ship operation, focus on a holo panel, or the galaxy/system map being open, which is necessary in order to implement proper controls. E.g. unrelated overlay buttons should disappear in maps/menus where their interface elements don't exist; the type of faux input the trackpad outputs needs to change with context so it can be used for target control in normal operation but navigation when focused on a holo panel; and you need to know when the galaxy map is open in order to switch to the grab-to-pan / scale gestures (which themselves may need some input scale tuning just to match user input up). You'd also need to know the current throttle level to put the virtual one in the same spot, otherwise it won't work properly when the user also uses things like "full stop" and "75% throttle" voice commands. This is all the easy stuff; anything that requires knowing the current selection of a holo panel is a no-go (e.g. comparing the current selection with the item you're pointing at so you know what inputs to send to move the selection to the correct item).
  • There is no information on the 3D position of menu options, so you can't make game menus controllable with a laser. (Which would hurt pretty badly: you can have a virtual joystick with an overlay, but you can't actually start the game with the controller you use to fly the ship.)
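The context-switching problem in the list above can be sketched as a simple mapping from game state to trackpad behaviour. The enums and mapping are entirely hypothetical, since (as the list says) ED exposes no such state to external tools, which is exactly the blocker:

```cpp
// Hypothetical sketch: what the trackpad should emit in each ED context.
enum class GameContext { NormalFlight, HoloPanelFocused, GalaxyMap };
enum class TrackpadRole { TargetControl, PanelNavigation, PanZoomGesture };

TrackpadRole trackpadRoleFor(GameContext ctx) {
    switch (ctx) {
        case GameContext::HoloPanelFocused: return TrackpadRole::PanelNavigation;
        case GameContext::GalaxyMap:        return TrackpadRole::PanZoomGesture;
        default:                            return TrackpadRole::TargetControl;
    }
}
```

The switch itself is trivial; the hard part is populating `ctx`, which would require ED either exposing its UI state or integrating this logic natively.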
 
Of course, too many of the things that would make for a GOOD 3rd-party interface depend on Elite Dangerous and are not available to things outside ED itself:
  • You can't just add hand presence in, unless you are happy with a floating 2D glyph of a hand while the 3D hand attached to your shoulder sits statically on the flight controls, like someone with alien hand syndrome.
  • To my knowledge you can't tell what context/state ED is in, i.e. normal ship operation, focus on a holo panel, or the galaxy/system map being open, which is necessary in order to implement proper controls. E.g. unrelated overlay buttons should disappear in maps/menus where their interface elements don't exist; the type of faux input the trackpad outputs needs to change with context so it can be used for target control in normal operation but navigation when focused on a holo panel; and you need to know when the galaxy map is open in order to switch to the grab-to-pan / scale gestures (which themselves may need some input scale tuning just to match user input up). You'd also need to know the current throttle level to put the virtual one in the same spot, otherwise it won't work properly when the user also uses things like "full stop" and "75% throttle" voice commands. This is all the easy stuff; anything that requires knowing the current selection of a holo panel is a no-go (e.g. comparing the current selection with the item you're pointing at so you know what inputs to send to move the selection to the correct item).
  • There is no information on the 3D position of menu options, so you can't make game menus controllable with a laser. (Which would hurt pretty badly: you can have a virtual joystick with an overlay, but you can't actually start the game with the controller you use to fly the ship.)

I was more referring to an X-Rebirth style approach, as spoken about in this thread. You are NOT going to get fully supported motion controls with virtual hands in Elite any time soon, if ever. You need to accept that and look either at using one of the supported control setups that offers the immersion you crave, or at what you can accomplish yourself if you are so intent on using motion control.
 