Use external screens for direct controls

Thanks for reading this suggestion.

I thought this might well not be possible, but it would be interesting to use external touch screens, such as tablets or smartphones, in the game to display the interface menus (both the right-hand and left-hand panels). You could then interact with the ship directly with your fingers instead of a keyboard and mouse, joystick, or any other controller: tapping which modules you want active, reading the various statistics menus, or even pointing at the system you want to jump to.

I suppose that if Frontier ever reconsidered this, it would need either a downloadable, installable app for Android and iOS so that any tablet or smartphone could connect to a PC (the PC being the platform most open to changes), or multi-screen touch support added to the game's code.

That would give the player more immersion in his ship.

It's just a suggestion.
 
I have been thinking about how this could be implemented, and how the integration should work.

Here is one method that I think should work on all platforms.

Firstly, I do not want to require a specific app; instead I propose running this as a "web service" inside the game.
You could connect with any web browser, for example by scanning a simple QR code.
Connections would only be allowed from the local network. Why should your computer/console and tablet be on different networks? In most cases they will be on the same one, and I want to avoid opening the game up to external manipulation.
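To sketch what I mean by "local network only" (this is my own illustration, not anything Frontier has; the function name is made up), the web service could simply reject any client address that is not a private LAN or loopback address:

```python
# Sketch: accept only clients on a private (RFC 1918) or loopback
# address, i.e. the local network. Name is hypothetical.
import ipaddress

def is_local_client(client_ip: str) -> bool:
    """True if the connecting device is on the LAN or the same machine."""
    addr = ipaddress.ip_address(client_ip)
    return addr.is_private or addr.is_loopback

# is_local_client("192.168.1.42") -> True  (typical home LAN)
# is_local_client("8.8.8.8")      -> False (public internet)
```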


So in game, you choose to attach a device and a QR code is shown on screen. It is a simple HTTP(S) link to the web service with a session ID, so each device gets its own session ID. For every new device you add, you get to choose on its first connection whether to trust it permanently or just for this session; something like a cookie marks persistent devices, so you only have to hit the link once while the game is running and it simply connects.
If a device is persistent, you can bookmark any of the "URLs", so that recalling a URL takes you directly to that panel/tab. If a device is not persistent, the main game asks you to accept the connection, provided the web service is active and the session ID has been issued before.
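Roughly, the handshake above could look like this (again just my own sketch; class and method names are invented, and a real implementation would use proper HTTP cookies and expiry):

```python
# Hypothetical session handling for the attach-device flow described above.
import secrets

class SessionStore:
    def __init__(self):
        self.issued = set()       # session IDs that went into QR links
        self.persistent = set()   # IDs the player chose to trust

    def new_session(self) -> str:
        """Create the ID that gets embedded in the QR-code URL."""
        sid = secrets.token_urlsafe(16)
        self.issued.add(sid)
        return sid

    def trust(self, sid: str) -> None:
        """Player marked this device as persistent (cookie set client-side)."""
        self.persistent.add(sid)

    def accept(self, sid: str) -> str:
        """Decide what happens when a device presents its session ID."""
        if sid in self.persistent:
            return "connect"      # trusted device reconnects silently
        if sid in self.issued:
            return "ask-player"   # known but untrusted: prompt in-game
        return "reject"           # never issued: refuse outright
```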





If your computer has multiple IP addresses, you get one QR code for each IP; think of a laptop with both the network cable and wireless active at the same time.
If a QR code cannot be used (a device with a broken camera, or no camera at all), then we also display the URL to type in manually.
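Enumerating the addresses for those QR codes is straightforward; here is a rough sketch (the `/panel` path and parameter names are my own invention, and a real game would use its own port):

```python
# Sketch: one URL (and hence one QR code) per local IPv4 address.
import socket

def local_ipv4_addresses() -> set[str]:
    """Best-effort set of this host's IPv4 addresses."""
    ips = {"127.0.0.1"}  # always usable for a browser on the same machine
    try:
        host = socket.gethostname()
        ips |= {info[4][0] for info in socket.getaddrinfo(host, None, socket.AF_INET)}
    except socket.gaierror:
        pass  # hostname not resolvable; loopback still works
    return ips

def panel_urls(port: int, session_id: str) -> list[str]:
    """Build the link each QR code would encode (also shown as plain text)."""
    return [f"http://{ip}:{port}/panel?session={session_id}"
            for ip in sorted(local_ipv4_addresses())]
```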



Now, any modern web browser should be capable of showing the different panels. So if you go all out, you can have one device for chat, one for navigation, one for the systems panel and one for the panel beneath the radar. I am unsure how the Galaxy Map would work in this situation, so for the moment you would have to use that on the main device. But a partial Galaxy Map, like plotting a route to a bookmark, should be doable without the full map; or, with the Galaxy Map open, you could use the tablet/laptop to navigate it. This would also give console players access to a better keyboard, and to a document with system names to copy/paste from.


Touch-enabled displays are probably best here, as you can just tap to navigate between tabs etc.




So you can basically use any leftover computers, phones, tablets etc. as auxiliary displays to show panels in Elite. For most purposes touch-enabled displays make the most sense, and a touch device with a keyboard would be optimal for the chat window.





So I wanted to stay away from requiring an app, as that limits the supported devices; this way you should be able to use any "spare" device with a supported web browser, regardless of what OS it is running.
I wanted to keep it simple, with no passwords etc.




And one bonus for console players: this web service would allow a small program to be developed that receives your CMDR logs directly from the game and writes them to a log file, just like the PC version does. That would let you run the same tools PC players currently enjoy, with the option to upload your data to sites like EDDB/Inara etc.
I prefer the "small program" option over the alternative of manually downloading the log.
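The "small program" could be very little code. A sketch, assuming the web service exposed some journal endpoint (the endpoint, its parameters, and the event shape are all my assumptions; only the one-JSON-object-per-line file format matches what the PC version writes):

```python
# Sketch of the "small program": mirror journal events from the
# (hypothetical) web service into a local log file.
import json
import time
import urllib.request

def append_events(events: list, out_path: str) -> int:
    """Append events to the log, one JSON object per line,
    the same shape as the PC version's Journal files."""
    with open(out_path, "a", encoding="utf-8") as f:
        for ev in events:
            f.write(json.dumps(ev) + "\n")
    return len(events)

def mirror_journal(base_url: str, out_path: str, poll_seconds: float = 5.0):
    """Poll an assumed /journal endpoint and append anything new."""
    seen = 0
    while True:
        with urllib.request.urlopen(f"{base_url}/journal?from={seen}") as resp:
            seen += append_events(json.load(resp), out_path)
        time.sleep(poll_seconds)
```

Third-party tools could then watch that file exactly as they watch the PC journal today.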
 
Well, I'm not a programmer; I do not know what type of program could be used for this, or whether a web application would work.
But I leave that to the experts.
Nowadays, when most homes have a smartphone, a tablet or both, it would be interesting to be able to pair them with the computer as manual controls.
I don't think it should be mandatory either, for those who do not have two tablets, or a tablet and a smartphone. But even in combat it would be nice to have access to your ship's controls (activating or deactivating modules, lights, night vision, etc.). Now that menus with bigger buttons have appeared in the 3.3 beta, you could touch the screen directly without having to assign more buttons to the keyboard, joysticks or pads.
 
Hi,

you are right, controlling the menus from external devices adds a lot to the immersion. What you describe already works, but it needs a workaround: you'll need VoiceAttack and LEA Extended Input (or Roccat Power Grid; I prefer LEA). There is a very good VA profile written by Ishmair for reaching the different panel items with a single command.

If you are interested to make your own version, I'd be glad to help you with the details.

Here is an example of how it looks:

https://forums.frontier.co.uk/showt...and-Unicorns?p=7087060&viewfull=1#post7087060

Best regards!
 