Another thought that just came to mind: wouldn't the galaxy map be so much easier to use if you could use controller gestures, like grab-to-pan, and two-controller pinch scaling to zoom in/out?
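That kind of two-controller gesture boils down to a bit of vector math. Here's a rough Python sketch of the idea; every name in it is made up for illustration, since ED exposes no such API:

```python
# Hypothetical sketch of "grab to pan" and two-controller pinch zoom
# for a galaxy map. Positions are (x, y, z) tuples in metres.

def pinch_zoom_factor(left_start, right_start, left_now, right_now):
    """Zoom factor = current controller separation / separation at grab start."""
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
    start = dist(left_start, right_start)
    now = dist(left_now, right_now)
    # Avoid dividing by zero if the controllers started on top of each other.
    return now / start if start > 1e-6 else 1.0

def grab_pan_delta(hand_prev, hand_now):
    """Offset to apply to the map so it follows the grabbing hand."""
    return tuple(n - p for p, n in zip(hand_prev, hand_now))
```

Applying `pinch_zoom_factor` each frame relative to the separation at grab start keeps the zoom stable instead of compounding per-frame noise.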
I've not had the chance to try out a Vive, so I can't comment on how they compare. What I will say is that using the Grab button on the Touch to take hold of the in-game joystick puts my hand in almost exactly the same position as it would be on my real joystick, with my finger on the trigger and thumbstick acting as a HAT (which works really well with the VTOL VR camera, btw). Feels totally natural to me and pretty much like holding a normal joystick, just one that's got no resistance or stops.

Ohh, it works with Touch controllers? Have you had a chance to compare it to Vive controllers and see whether the level of control is the same or better with the wands? One of the things I worried about was that controllers like the Touch and Knuckles that don't have that joystick-like shape may not function as well.
Button navigation would probably be the most in-universe way of controlling the holoscreens anyway. I'd imagine you'd just use the Vive wand's trackpad or the Touch's joystick to control the screens while your hand is on the joystick, pretending that you are interacting with the buttons on top of the in-cockpit joystick (which is how I assume an in-universe ED commander would actually control those screens). Have you used the joystick buttons in VTOL VR to control the targeting camera yet?
But strictly speaking, if you don't care about that and just want comfortable controls, then even with the screens at varying distances and out of reach, it would still be possible to give them laser pointer controls. Those are still very common in VR and everyone is used to them.
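Under the hood, a laser pointer hitting a flat holo screen is just a ray/plane intersection. A minimal sketch of that math (assumed names; a plane given by a point on it and its normal):

```python
# Find where a laser pointer ray lands on a flat panel, modelled as an
# infinite plane. Vectors are (x, y, z) tuples; direction need not be unit.

def ray_plane_hit(origin, direction, plane_point, plane_normal):
    """Return the hit point, or None if the ray is parallel or points away."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return None  # ray parallel to the panel
    t = dot(tuple(p - o for p, o in zip(plane_point, origin)), plane_normal) / denom
    if t < 0:
        return None  # panel is behind the controller
    return tuple(o + t * d for o, d in zip(origin, direction))
```

A real implementation would then check the hit point against the panel's bounds to decide which button was pointed at.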
Strictly speaking, I want all the options. If you've got a HOTAS and absolutely need that 100% precision, it's still there. But I don't; a virtual stick gives me enough precision to do all the stuff I do in ED, and I love being able to just grab onto the controls I see right in front of me in the in-game cockpit. More importantly, I like being able to let go of them, see my hands move around, and feel like I'm actually in the cockpit when my ship doesn't need input from me and I'm just waiting for it to travel where I've pointed it.
Actually I think the left and right holo screens should be touch-able, or "laserpointer-able", provided they introduced proper motion controls. This would pave the way for specialised panels which could allow you to control all the ship's systems using your VR controllers. I mentioned DCS World - take a look at how it works in practice (skip to around 4:24):

Do you want to control the in-cockpit displays and station display with the wands like they are controlled in the video?
The ED ships all have HOTAS. The reasons listed for not owning a HOTAS seem rather silly. A shame we now live in an age where people need everything placed for them within arm's length, so to speak, so they don't have to move. The redlines people draw around themselves!
Every flight sim I’ve tried with motion controllers has felt disconnected and imprecise.
Also, so odd that voice control gets completely ignored.
As for the laser pointer, I'd like the possibility of that and of being able to jab things with my finger. Oculus Home lets you do both, so I suppose in ED the ideal solution for me would be the ability to reposition the holoscreens, like with virtual desktops. I'd rather be prodding at A4-sized screens of an SLF next to my head than pointing at the dinner-table-sized things a couple of metres away on a 'conda.
It'll be interesting to see what games do with the Knuckles once they arrive; I'm hoping they might spur a new Touch or something equivalent.
Edit:
Oooh, I’d get all Tony Stark / Minority Report / most Sci-Fi with that!
Try to think out of the box sometimes.
I've tried all these other methods and they are all inferior to a HOTAS and voice. It's the same with sim racing, where a wheel and pedals is still best.
One precise question before we can agree to disagree: have you tried X Rebirth VR Edition with motion controls for more than a minute? Because if not, we're not on equal footing here. Until the controls suddenly feel natural, they feel clunky and imprecise. But that's because, as Jobs said, "you're holding it wrong" ;-).
I already said imitating real-life controllers is much worse than a tailored motion control system, and it usually doesn't work well. A wheel is a wheel; we're used to it and to how it works. Because it's connected to the car's wheels, it also acts as a feedback device reacting to the road. That's why we have force feedback wheels for our racing sims.
And voice, while it's ultra immersive and a new level of cool, has one big limitation: lag. You need to say a command, then say another one, and then another one. By the time you got to the second command, I would have pushed the three buttons twice. I'm not in any way belittling voice control - it's really great, even if it's a pain to set up and usually only works well in English. It's just that there are situations where I would prefer to use buttons instead of talking: "full power to shields, target last system, engage FSD". But for "landing gear" or even "gear", or "engage FSD" under normal conditions - it's great. As mentioned, though, you won't use your voice for steering the ship, because that would be impractical.
I like when an answer is thought out, like yours, CylonSurfer. What follows is my opinion, not fact, and I respectfully disagree with yours on both the precision and immersion fronts. My view on it is slightly different from the OP's, though: I want to get rid of the HOTAS completely, not emulate it!
I used to be against it too, and also thought HOTAS was superior. Until this one day, when I tried X Rebirth VR. At first I tried it with a HOTAS, of course, because motion controls for a space sim?! Give me a break! But... the number of bindings was large and I wanted to play, so... I skipped configuring the HOTAS and tried the default settings with VR controllers, because why the hell not. I'd do the tutorial, I'd learn what to bind, etc...
So I started playing. At first it was ridiculous. No, that is not precise, come on, where are you flying... Stop, not into the waaaaaall... Then the moment came: oh wait. Wait a frikkin' second. It all starts to click... Oh my god, how could I be so blind! After about five minutes, your brain adapts to the new control scheme... And it's pure bliss. With Vive wands + trackpads, steering your ship is like the future. A HOTAS doesn't even come CLOSE.
That would be nice; in general it would be nice if ED had a reasonable options system. But from my understanding ED has nothing like a profiles system, and many of the options affect both the 2D and VR modes of ED, some of them in ways that feel kind of mutually exclusive. There's a 3rd party application to help with control profiles for this reason. So I doubt ED could simply swap between 2D and VR.

IIRC you can put away your controllers anytime by switching to mouse mode. It has been a while since I played it, so I might have that wrong, but I am pretty sure your hands aren't tied to the ship all the time. XR VR also has this very cool feature where you can remove your HMD and the game switches to 2D. Would love that in E: D, btw.
VTOL VR has a lot of buttons, knobs, and switches. When you get close, the hand sort of locks on; then you pull the trigger on the wand to press a button, or pull the trigger to grab a knob/switch and twist or move your hand to turn/toggle it. It actually works quite nicely.

I'd love real "holobuttons" too... This Tony Stark vibe is also my intention. Alas, we Vivers need the Knuckles to arrive first, as the wands are sub-par in this regard, and unless that happens, the userbase is even smaller... Knuckles v2 are confirmed to have a thumbstick and a touchpad, so it's the best of both worlds. I also think that when both major systems have finger sensing like the Touch, game interfaces will adapt. It's still a long journey, but an exciting one nonetheless.
Repositioning might be a bit of work, but yeah, we could go that route too. With absolute control it might get a bit messy with human positioning, but I think we could get enough control by allowing "distance" and "scale" sliders to configure the panels.

Excellent idea on repositioning the holo screens. After all, they are holo screens and can surround a pilot with ease. Imagine something like the Engineer's ship bridge in Prometheus, where a galaxy map surrounds you and you can interact with planets the way David did... One can dream.
Yes, I tried for long enough to judge it.
Tried it and really couldn't abide it. Worse still, I could not figure out how to re-enable my HOTAS without quitting the game and removing the batteries from my Touch controllers. I've not been back since, I'm sorry to say. I found it a rather frustrating experience. As for the control system itself, I found it rather lackluster; twisting your wrists and laser pointing etc. just wasn't for me.
BTW, I'm sure you could emulate the X Rebirth control scheme in Elite with vJoy + the free GlovePIE and a couple of WiiMote Motion+ controllers. SteamVR may even offer similar opportunities for the more coding-savvy player.
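The core of such a setup is just mapping a controller's wrist angles onto virtual joystick axes before feeding them to vJoy. Here's a hedged sketch of that mapping step alone; the 0..32768 range matches vJoy's axis convention, but the function name and the 45-degree deflection limit are my own invention:

```python
# Map a motion controller's pitch/roll angle (degrees) onto a vJoy-style
# virtual joystick axis. vJoy axes span roughly 0..32768 with the centre
# at the midpoint; the max_deflection_deg value here is arbitrary.

AXIS_MAX = 32768

def angle_to_axis(angle_deg, max_deflection_deg=45.0):
    """Clamp the wrist angle to full deflection, then centre it on the axis."""
    t = max(-1.0, min(1.0, angle_deg / max_deflection_deg))  # -1..1
    return int(round((t + 1.0) / 2.0 * AXIS_MAX))
```

The actual hand-off to the virtual device would then go through GlovePIE or a vJoy feeder each frame; this only shows the angle-to-axis math.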
Last night I flew an hour-long flight in a Cessna in Aerofly 2. I had the choice to fly using motion controls in an almost fully interactive cockpit, but instead I used a yoke, throttle and pedal combo. If the game had better start-up and shut-down sequences, I would likely have used motion controls to start the plane from cold and tune my instruments. However, for the take-off, flight and landing I'd only ever use physical controls.
PS, I also like your gear shifter as landing gear lever idea, I may have to steal that little tip!
That would be nice; in general it would be nice if ED had a reasonable options system. But from my understanding ED has nothing like a profiles system, and many of the options affect both the 2D and VR modes of ED, some of them in ways that feel kind of mutually exclusive. There's a 3rd party application to help with control profiles for this reason. So I doubt ED could simply swap between 2D and VR.
VTOL VR has a lot of buttons, knobs, and switches. When you get close, the hand sort of locks on; then you pull the trigger on the wand to press a button, or pull the trigger to grab a knob/switch and twist or move your hand to turn/toggle it. It actually works quite nicely.
I'm thinking that in the options we'll have a "Use skeletal input" toggle, and the interface in ED would work in two ways:
- If you have skeletal input off (best for wands), going near buttons will have the same sort of automatic poses as VTOL VR and the trigger will be used to control things.
- If you have skeletal input on (best for Knuckles and possibly Touch), it will use the SteamVR Skeletal Input or the Oculus SDK's hand poses to control the pilot's hands. In this mode you would physically push buttons with your index finger or pinch sliders.
It's slightly tricky, but it also occurs to me that we could have a mode where we use either the SteamVR Skeletal Input with gloves or Leap Motion's bones API, and disable that input when we think the player has grabbed onto their physical HOTAS. That way you can seamlessly play with a HOTAS, let go and toggle a switch, then grab onto your HOTAS again and continue flying.
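To make the toggle plus HOTAS-suppression idea concrete, here's a tiny sketch of the per-frame decision it implies. The enum names and the function are illustrative only, not anything from SteamVR or the Oculus SDK:

```python
from enum import Enum, auto

class HandInputMode(Enum):
    POSED = auto()       # skeletal input off: VTOL VR-style automatic poses, trigger activates
    SKELETAL = auto()    # skeletal input on: tracked fingers push buttons directly
    SUPPRESSED = auto()  # hands appear to be on a physical HOTAS: ignore virtual hands

def pick_hand_mode(skeletal_toggle, hand_near_hotas):
    """Decide how the pilot's hands should be driven this frame."""
    if hand_near_hotas:
        # Physical HOTAS wins, so letting go and grabbing back is seamless.
        return HandInputMode.SUPPRESSED
    return HandInputMode.SKELETAL if skeletal_toggle else HandInputMode.POSED
```

Detecting `hand_near_hotas` is the genuinely tricky part; a proximity threshold around a user-calibrated HOTAS position is one plausible heuristic.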
Repositioning might be a bit of work, but yeah, we could go that route too. With absolute control it might get a bit messy with human positioning, but I think we could get enough control by allowing "distance" and "scale" sliders to configure the panels.
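The "distance" and "scale" sliders could stay simple precisely because the panel keeps its angular direction from the pilot's head and only its range and size change. A toy model of that, with invented names:

```python
# Toy model of distance/scale sliders for a holo panel: the panel sits
# along a fixed unit direction from the pilot's head, and the two sliders
# control how far out it is and how large it renders.

def place_panel(direction, distance, base_size, scale):
    """Return (position, size) for a panel along a unit direction vector."""
    pos = tuple(d * distance for d in direction)
    return pos, base_size * scale
```

Because only two scalars vary, there's no free-form positioning for the user to get wrong, which is the whole point of avoiding "absolute control".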
I'm not quite sure what you're suggesting. The holo panels I'm thinking about are the target panel on your left and the system panel on your right, possibly also the two upper panels, depending on your opinion on how one should switch between the tabs on those panels. I'm not sure how a JARVIS-style panel would work when there are 2-4 of them.

Aside from the fact that this is an awful lot of UI coding just for a niche within a niche of niches ;-) of the playerbase, I find the idea nice. But on second thought, there shouldn't be human positioning. The display should be like JARVIS and follow the user, regardless of ship. A universal spaceship control protocol, or USCP if you will. We already have it standardized (all ships' cockpit instruments look exactly the same), so why not make it so that it's projected in front of your suit and be done with it? You'd just press the "HMD reset" button and all would be positioned correctly.
That's why I expect most of the users will be using a virtual stick/throttle, XRVR-style control, VR gloves, Leap Motion, or a hybrid control (joystick hand on the HOTAS, throttle hand with a VR controller and a virtual throttle).

The problem is, switching from VR controllers to HOTAS to VR controllers to HOTAS is clunky, all while wearing a headset, especially if you have a cramped space like you do. I also have a small, self-made little desk I use for gaming in my living room, so I feel the pain. That's why I'm a big fan of XRVR motion control.
I like ED fine as is - as long as any updates don't break the game as it stands, then I'm pretty happy - and I'm both old and certainly don't play casually - at least 4 hours most evenings and longer at weekends.
But if folk are looking for something else - or even if not - then GOG have X Rebirth: Home of Light Complete Edition on sale at the minute. It seems to have been patched up quite a bit as well.
https://www.gog.com/game/x_rebirth_home_of_light_complete_edition
There is a SteamVR overlay that allows you to place fairly basic virtual buttons near you, and someone has actually used this with ED. Unfortunately, I could not get it to work when I tried it.

Given that there are VR flight simulators out there using hand controllers and their various buttons/pointing/gestures as an interface, I can't see why a clever person somewhere couldn't adapt ED to enable flight control and more. Maybe it'll end up being a third-party add-on - I'd buy it.
I popped back in a few days ago and made some observations of the UI and cockpit while playing with my gamepad. Notably, I looked at how the pilot's hands are animated to respond to control inputs.
I noticed the pilot's hands moving in response to the following control inputs:
- The throttle moves in response to up/down changes to the throttle
- The joystick moves in response to ship roll/pitch/yaw input
- The index finger on the joystick moves in response to the primary weapon fire input
I did not try triggering actions like chaff, so I don't know if the left hand responds to button presses that could correspond to buttons on the throttle. Interestingly, I did notice that the pilot's hands do not respond to any of the following inputs:
Given that the Touch's joystick has a capacitive sensor and trackpads are inherently capacitive, we could probably improve the joystick reactions significantly by doing things like moving the thumb to the hat switch when you touch the stick/pad. We could also decide on a button that could vaguely be considered secondary fire, or choose one of the two Touch buttons / the Vive menu button, and tie them to one button.
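The capacitive-touch idea reduces to a small priority rule for the thumb pose each frame. A hedged sketch, with pose names and function entirely made up for illustration:

```python
# Pick an in-game thumb pose from capacitive touch state: touching the
# Touch thumbstick / Vive trackpad puts the thumb on the HAT switch,
# touching a face button hovers it over a nominal secondary-fire button.

def thumb_pose(stick_touched, face_button_touched):
    if stick_touched:
        return "on_hat_switch"
    if face_button_touched:
        return "over_secondary_fire"
    return "resting"
```

Giving the stick/pad priority over face buttons matters because on the Touch your thumb can brush both sensors at once.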
I also noticed that UI panel placement is ship-dependent. In my large 2-seater Cobra MkIII the panels are large and just out of reach. However, when I switched to my old Eagle, I noticed that the panels are actually smaller and directly next to me in the small cockpit. That said, the part of the cockpit to your left in the Cobra that looks like a keyboard (also out of reach), below the holo panel, is entirely absent in the Eagle's cockpit.
Perhaps it would be a valid request to allow toggling on an option to always have the close-up, Eagle-like small-cockpit holo panels in all ships, if we wanted to support direct interaction with the holo panels.
Because I want hand presence with the actual in-game character I'm playing.
Hand presence is the least important part of sim gaming. For most, the equipment you use to play with is what matters. When you have your HOTAS lined up with the in-game throttle and stick, you more or less have hand presence... The only thing the game doesn't know is when you take your hands off the controls. You are denying yourself the additional presence/immersion by not buying the cheap $30 equipment that would get you it, and coming up with flimsy excuses as to why you won't. You state you can't afford $30 to buy one, yet you overspent on your HMD by at least $100; you say you cannot store a HOTAS in your apartment, when what you actually mean is that you don't want to.
Sorry, I'm used to writing from the perspective of someone implementing things. Yes, these are notes on how things could be implemented by FDev, or anyone FDev gives the ability to implement this stuff.

We can't do anything; that ball is firmly in FDev's court.
Given that the close-up UI panels do already exist in Elite Dangerous in some ships, I don't think the eye strain argument is valid in this case. If that placement is bad for the player, why is it acceptable to do it to the player when they're in an Eagle?

As stated in another post of mine in this thread, FDev have already spoken out about this, stating that they believe the UI placement is optimal in every ship and that they will not be moving the panels, due to concerns over eye strain and geometry clipping.
Generally the SteamVR API has ways to get controller info and make overlays. Overlays generally seem to be limited to 2D floating in 3D space, rather than full models. I think some people have managed to trick SteamVR into rendering 2D overlays that look 3D, but it's not supposed to be easy or a normal thing to do. There is also an API to expose tracked objects, and some people have used it to create fake tracked objects that render 3D things in-game by providing render models. But that won't work in ED of course, because ED doesn't even acknowledge the tracked controllers' presence, much less render any of the render models. And trying to make fake input axes for ED to pick up is another matter, though it's probably possible (if likely to involve strange APIs and limitations that force you to use lower-level languages to implement the overlay instead of, say, tossing something together in Unity).

Did you take a look at the SteamVR API to see what you could do?
Of course, too many of the things needed to build a GOOD 3rd-party interface are dependent on Elite Dangerous and are not available to anything outside ED itself:
- You can't just add hand presence in, unless you are happy with a floating 2D glyph of a hand while the 3D hand attached to your shoulder sits statically on the flight controls, like someone with alien hand syndrome.
- To my knowledge you can't know the context/status ED is in, i.e. normal ship operation, focus on a holo panel, or the galaxy/system map being open, which is necessary in order to implement proper controls. E.g. unrelated overlay buttons should disappear in maps/menus where their interface elements don't exist. You need to be able to change the type of faux input the trackpad outputs depending on the context, so it can be used for things like target control in normal operation but navigation when focused on a holo panel. You need to know if the galaxy map is open in order to switch to the grab-to-pan / scale gestures (which themselves may need some input scale tuning just to get user input to match up). You need to know the current throttle level to put the virtual one in the same spot, otherwise it won't work properly when the user is also using things like "full stop" and "75% throttle" voice commands. This of course is all the easy stuff; anything that requires knowing the current selection of a holo panel is a no-go (e.g. knowing the current selection, versus the selection you're pointing at, so you know what inputs to send to move the selection to the correct item).
- There's no information on the 3D position of menu options, so you can't make game menus controllable with a laser. (Which would hurt pretty bad: you can have a virtual joystick with an overlay, but you can't actually start the game with the controller you use to fly the ship.)
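The context problem in the list above is really a dispatch table: the same trackpad should emit different faux inputs depending on what ED is showing, and fall silent when the context is unknown. A sketch of that shape, with hypothetical context names, since (as noted) an overlay has no real way to query them:

```python
# Hypothetical mapping from ED's UI context to what the trackpad should
# do. The context strings are invented; ED exposes no such state to an
# external overlay, which is exactly the limitation being described.

TRACKPAD_ROLE = {
    "normal_flight": "target_control",
    "holo_panel_focus": "panel_navigation",
    "galaxy_map": "grab_pan_and_zoom",
}

def trackpad_role(context):
    # When we can't tell what ED is showing, the safe default is to do nothing,
    # so faux inputs never fire into the wrong menu.
    return TRACKPAD_ROLE.get(context, "inactive")
```

Even this trivial dispatch shows why an external tool stalls: without the context key, the only safe row in the table is the fallback.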