True - the Leap Motion sensor attached to the HMD has a rather narrow field of view.
But I think for the cockpit in ED that would be sufficient as a starting point. There are also some concerns about the fingers being untrackable while holding a controller, but that's where the whole thing comes together: you can combine the controller inputs with what the Leap Motion detects to get a good representation of where the hands are. If the hands are out of view while you look at the side panels, you render them as resting on the controllers. When they come into view, you animate a quick path that catches up within about 1/4 second. That would do the trick.
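The controller-to-tracked handover described above could be sketched roughly like this. This is just a minimal illustration, not any real Leap Motion API: all names here (HandModel, BLEND_TIME, the controller and Leap position feeds) are hypothetical, assuming some outer loop that supplies a Leap-tracked hand position when the hand is in the sensor's view and None otherwise:

```python
BLEND_TIME = 0.25  # seconds to animate from the controller pose to the tracked pose

def lerp(a, b, t):
    """Linear interpolation between two 3-D points."""
    return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

class HandModel:
    """One rendered hand, switching between controller and Leap Motion data."""

    def __init__(self, controller_pos):
        self.render_pos = controller_pos
        self.blend_elapsed = None  # None means we are not currently blending

    def update(self, dt, controller_pos, leap_pos):
        """Advance one frame; leap_pos is None when the hand is out of the sensor's view."""
        if leap_pos is None:
            # Hand not visible to the sensor: keep rendering it on the controller.
            self.render_pos = controller_pos
            self.blend_elapsed = None
        else:
            if self.blend_elapsed is None:
                # Hand just entered the sensor's view: start a quick catch-up blend.
                self.blend_elapsed = 0.0
                self.blend_start = self.render_pos
            self.blend_elapsed = min(self.blend_elapsed + dt, BLEND_TIME)
            t = self.blend_elapsed / BLEND_TIME
            self.render_pos = lerp(self.blend_start, leap_pos, t)
        return self.render_pos
```

After BLEND_TIME the rendered hand sits exactly on the tracked position, so the snap from "hand on controller" to "hand reaching for a button" is hidden inside a quarter-second animation rather than a visible pop.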
The thing is, with a VR helmet on (or even with closed eyes), unless you are blind you usually have no precise idea where your limbs are anyway. So as soon as you look at one of the side panels and reach with your fingers for a button, a good-enough animation for such a UI should be possible.
Now, just as with the DK2, it does the trick for early adopters and would be good enough to give an idea of what's possible. But as soon as this system picks up interest, it would not cost a whole lot more to make a sensor with a much wider field of view, optimized for HMDs. Maybe it just needs two of them stuck together, and it would still be a lot cheaper and better than gloves or any VR controllers. We use HOTAS setups in the end anyway, or steering wheels in racing games. All that is needed is a better representation of hands moving away from our main controllers when playing sim games.