Just a mad idea I've had, but how feasible would it be to use a tablet, connected by USB, as a control surface for the cockpit UI elements (contacts, navigation, modules, cargo, status, etc.)? The tablet would mirror the UI elements as they appear on the main monitor and use its touchscreen for selecting and activating options.
This would require:
- An app that lets a PC use an Android tablet as a control surface that produces standard keyboard inputs.
- A way of either:
-- transmitting the UI graphics from the PC to the tablet for display on its screen.
-- setting up the app so that the display matches the Cockpit UI but is stored on the device.
Swiping could switch tabs and tapping would select, with the performed actions bound to the relevant keys for accessing the UI in game.
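To give a feel for the PC-side plumbing, here is a minimal sketch. It assumes a hypothetical tablet app that sends simple event strings (like "swipe_left" or "tap_select") over TCP, and a made-up binding table mapping those events to whatever keys the game already uses for its cockpit UI. Real keypress injection would need something like pynput; here the injection call is just a parameter so the translation logic stands on its own.

```python
import socket

# Hypothetical binding table: event strings from the tablet app mapped to
# the keyboard keys the game uses for its cockpit UI (these binds are
# placeholders, not the game's actual defaults).
KEY_BINDS = {
    "swipe_left": "q",      # previous UI tab
    "swipe_right": "e",     # next UI tab
    "tap_select": "space",  # activate highlighted option
    "tap_up": "w",          # move selection up
    "tap_down": "s",        # move selection down
}

def translate(event: str):
    """Turn a tablet event string into the key to press, or None if unbound."""
    return KEY_BINDS.get(event.strip())

def serve(host="0.0.0.0", port=5555, send_key=print):
    """Listen for newline-delimited events from the tablet and forward
    each bound event as a keypress. send_key would be a real key-injection
    call (e.g. via pynput) in practice; it defaults to print for testing."""
    with socket.create_server((host, port)) as srv:
        conn, _addr = srv.accept()
        with conn, conn.makefile() as events:
            for line in events:
                key = translate(line)
                if key is not None:
                    send_key(key)
```

The same translate table could drive either display approach: if the UI is mirrored from the PC, the tablet app only needs to report raw taps and swipes; if the layout is stored on the device, the app can send these semantic event names directly.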
How easy/difficult would this be to do?