I am *hoping* that it is the drivers that sort out most of the technical stuff, and the game just has to translate the 3D information for the headset. That way it won't matter HOW the AR/VR device works (I am no coder, so please excuse my ignorance / not getting the lingo correct).
But if the hardware drivers essentially do most of the leg work, then would it not be trivial for the game to support any number of VR/AR devices?
Whilst I agree that in the future we should move towards some unified VR API, I'm unsure the same can be true for AR, and certainly not a single API spanning the two; it just doesn't make sense. They are two very different technologies with very different and distinct uses. VR is completely immersive, placing you within a 3D world, whilst AR is much more about inserting 3D objects into the real world. There really isn't any meaningful cross-over, and nothing that driver-level stuff can help deal with.
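To the original question about drivers doing the leg work: that is roughly the model a unified VR runtime (OpenXR being the real-world example) aims for. A minimal sketch of the kind of interface such a runtime could expose to a game; every name here is made up for illustration, not any actual API:

```python
from abc import ABC, abstractmethod

class Headset(ABC):
    """Hypothetical driver-level abstraction: the game only ever talks
    to this interface, never to specific hardware."""

    @abstractmethod
    def get_eye_poses(self):
        """Return per-eye positions for this frame."""

    @abstractmethod
    def submit_frame(self, left_image, right_image):
        """Hand the two rendered views to the compositor."""

class FakeHeadset(Headset):
    """Stand-in driver; a real one would talk to actual hardware."""

    def get_eye_poses(self):
        # Static poses with eyes 64 mm apart (a typical IPD).
        return {"left": (-0.032, 0.0, 0.0), "right": (0.032, 0.0, 0.0)}

    def submit_frame(self, left_image, right_image):
        return True  # pretend the compositor accepted the frame

def render_frame(hmd: Headset):
    """The game's side: query poses, render per eye, submit."""
    poses = hmd.get_eye_poses()
    left = f"scene rendered from {poses['left']}"
    right = f"scene rendered from {poses['right']}"
    return hmd.submit_frame(left, right)
```

Under a scheme like this the game really could support any headset whose vendor ships a driver implementing the interface; the sticking point is everyone agreeing on the interface in the first place.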
This isn't to say that you can't have an E:D experience in AR, but it is most definitely not going to be the same experience that you currently see in game. If anything it will be like the difference you see between the game and the current E:D mobile app. So whilst you could in theory overlay the E:D game over the real world, it would not be very effective, and it misses the point of AR completely. It wouldn't be effective because it would literally be an overlay, with you able to see the real world beyond; it misses the point because it's not using the real world to define the experience.
I wouldn't be surprised to see some E:D content on HoloLens, but it would be much more along the lines of a viewer for examining the ships, placed on top of a real-world table in front of you. Alternatively, something even cooler would be to watch a replay of a battle happen above your table, though again that's not really using the benefit of AR, which is having the 3D content interact with the real world.
I do agree that at some point in the future VR and AR displays will converge; there simply isn't any reason not to, as the hardware is similar in many ways and both systems offer very different but equally exciting and useful experiences. The only question is how it will be achieved. As Carmack discussed in his GDC talk, you can add cameras to provide a 'see-through' experience in VR, but it has the problem that the cameras are not originating from your eyes but from a distance in front of them. The best method is likely some form of polarized display that can switch between translucent and opaque. That way AR can use embedded cameras without worrying about where they originate from, since in AR you are projecting content onto the world, and maths can solve the depth issue.
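On the point about the cameras sitting in front of your eyes: if you know a pixel's depth, you can back-project it from the camera's viewpoint and re-project it from the eye's viewpoint, which is one way the maths can correct for that offset. A rough sketch using a simple pinhole camera model; the function name, intrinsics and offset value are all illustrative assumptions, not anything from a real headset:

```python
def reproject_pixel(u, v, depth, f, cx, cy, cam_offset):
    """Shift a camera pixel with known depth to where the eye,
    sitting cam_offset metres behind the camera along the view
    axis, would see that same 3D point."""
    # Back-project the pixel into the camera's 3D frame (pinhole model).
    x = (u - cx) * depth / f
    y = (v - cy) * depth / f
    # The eye is behind the camera, so the point is farther away
    # in the eye's frame.
    z_eye = depth + cam_offset
    # Re-project with the same intrinsics from the eye's viewpoint.
    u_eye = f * x / z_eye + cx
    v_eye = f * y / z_eye + cy
    return u_eye, v_eye
```

The effect this demonstrates: a nearby point shifts noticeably between the camera's view and the eye's view, while a distant point barely moves, which is why a naive uncorrected passthrough feed feels wrong mostly for things close to you.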