Wouldn't this open up so many things you could do?
Example - you could click on a star you are facing and bring up information: what class it is, its name, plot a route to it, etc. Scientific information. The same goes for planets, objects, and targets.
Maybe a scanner you can use manually.
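Under the hood, the basic interaction wouldn't be complicated: cast a ray from the cursor into the scene, find the nearest star it hits, and show its info. Here's a rough sketch of that idea in TypeScript - all the names (Star, pickStar, the example data) are made up for illustration, not from any actual game's API:

```typescript
// Hypothetical sketch of mouse-driven star picking: cast a ray from the
// cursor into the scene and report the nearest star it intersects.

interface Vec3 { x: number; y: number; z: number; }

interface Star {
  name: string;
  spectralClass: string; // e.g. "G2V"
  position: Vec3;
  radius: number;        // picking radius in world units
}

function sub(a: Vec3, b: Vec3): Vec3 {
  return { x: a.x - b.x, y: a.y - b.y, z: a.z - b.z };
}
function dot(a: Vec3, b: Vec3): number {
  return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Ray-sphere intersection: returns distance along the ray, or null on a miss.
// Assumes dir is normalized.
function raySphere(origin: Vec3, dir: Vec3, center: Vec3, radius: number): number | null {
  const oc = sub(origin, center);
  const b = dot(oc, dir);
  const c = dot(oc, oc) - radius * radius;
  const disc = b * b - c;
  if (disc < 0) return null;
  const t = -b - Math.sqrt(disc);
  return t >= 0 ? t : null;
}

// Return the closest star under the cursor ray, if any.
function pickStar(origin: Vec3, dir: Vec3, stars: Star[]): Star | null {
  let best: Star | null = null;
  let bestT = Infinity;
  for (const s of stars) {
    const t = raySphere(origin, dir, s.position, s.radius);
    if (t !== null && t < bestT) { bestT = t; best = s; }
  }
  return best;
}

// Usage: a real click handler would first convert screen coordinates
// into a world-space ray from the camera.
const stars: Star[] = [
  { name: "Sol", spectralClass: "G2V", position: { x: 0, y: 0, z: 100 }, radius: 5 },
];
const hit = pickStar({ x: 0, y: 0, z: 0 }, { x: 0, y: 0, z: 1 }, stars);
if (hit) console.log(`${hit.name} (${hit.spectralClass}): plot route here`);
```

From there the same picked object could feed a "plot route" action or an info panel, all without leaving the cockpit view.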
It would open up all sorts of UI and informational tidbits that keep us "heads up" instead of dropping into an exterior screen (which is totally unrealistic IMHO). Realistically, if you are going to look at a galaxy map, VR goggles should drop down, or your chair should swivel to a new station, or it should float in front of you on a transparent screen like in a station.
We lack so much information.
The design decision to make everything controllable by a gamepad is a CONSOLE decision, not a PC decision. It's time they separated out the complexity the game could possibly have by adding mouse support and other abilities to provide more information.