Heh, I wasn't actually referring to letting 3rd parties fix the whole game, just the controls. I just meant them doing one of:
A) Allowing part of the community with programming experience to submit code changes to add controls / fix non-gameplay bugs. Remember when I replied to one of your posts with, "Likewise I'd happily agree to this being a feature that FDEV does 0 work on and making it work is up to the community that does want it. But I've never heard of FDEV allowing the community to work on functionality and integrating it back into the game."
B) Not implementing any of this themselves, and instead telling people to implement alt controls as external applications, but cooperating by adding a new API that exposes enough information (plus game overrides) to make this possible. That is, allow an external application to get:
- 3D positions and dimensions of the holo panels
- current category/selection info, and what is displayed where on each panel
- current throttle state and 3D position, and the joystick's current 3D position
- the current VR-coordinates-to-cockpit-coordinates transform, plus an event when that position is reset
- notification when the galaxy/system map and menus are open, 3D positions of menu items, etc.
- input scale information for the maps
- the ability to enable/disable the arm animations and override the hand position and finger bone poses
- the ability to render very simple 3D (or just 2D) models in-game (it could just use SteamVR Overlays, but then the controls wouldn't work for Rift users)
This probably involves only a small fraction of the work that actually implementing the controls would take.
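The game already writes player journal events as JSON lines, so an API like this could plausibly follow the same style. Here's a rough sketch in Python of what consuming such an event might look like. To be clear: no such event exists today; the event name and every field below are invented purely for illustration.

```python
import json

# Hypothetical journal-style event for option B.
# "HoloPanelState" and all of its fields are made up for this example.
sample = ('{"event":"HoloPanelState","Panel":"Left","Tab":"Navigation",'
          '"Position":[-0.62,1.05,0.48],"Normal":[0.71,0.0,0.70],'
          '"Size":[0.55,0.40],"SelectedRow":3}')

def parse_panel_event(line: str) -> dict:
    """Parse one journal-style JSON line; keep only panel events."""
    evt = json.loads(line)
    if evt.get("event") != "HoloPanelState":
        return {}
    return evt

evt = parse_panel_event(sample)
```

With 3D position, size, and a surface normal per panel, an external app has enough geometry to ray-cast a laser pointer or a tracked fingertip against the panel rectangle and translate hits into selections.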
I'd actually be perfectly happy if they just chose to do B. I tried setting up VoiceAttack with the HCS voice packs yesterday, and several of its bugs stem from exactly the lack of game-state info that B would fix. In other words, B wouldn't just make great VR controls possible; it would also make non-VR voice controls and tablet panel apps much better.
Actually, when I talked about various holo panel metadata, I was only thinking of enough information on positions, rectangles, and selections to make a laser pointer or touch selection work. But if they went all the way and exposed the panels' full display contents, I could imagine some non-VR simmers with special rigs writing mobile apps that let them place a tablet on each side of them and replicate the entire left and right panel displays.
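To make the tablet idea concrete, here's a hedged sketch of what a "full panel contents" payload might look like and how a companion app could flatten it for rendering on a tablet. Again, the event name and every field are invented; this is just the shape of data such an API would need to provide.

```python
import json

# Hypothetical "full panel contents" payload -- all field names invented.
panel_json = ('{"event":"PanelContents","Panel":"Right","Tab":"Modules",'
              '"Rows":[{"Text":"6A Power Plant","Selected":false},'
              '{"Text":"5A Thrusters","Selected":true},'
              '{"Text":"4D Life Support","Selected":false}]}')

def rows_for_display(payload: str) -> list[str]:
    """Return the panel rows as plain strings, marking the selected row."""
    data = json.loads(payload)
    return [("> " if row["Selected"] else "  ") + row["Text"]
            for row in data["Rows"]]

lines = rows_for_display(panel_json)
```

A real tablet app would obviously do proper layout rather than prefix strings, but the point is that once the game streams tab, row text, and selection state, mirroring a whole panel is straightforward client-side work.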