I'll cross-post this from another thread. It's a longish explanation of how, in the future, you would likely pull off the kind of telepresence Elite has shown. I think it's a good way to see why nothing about the gunner cam is immersion-breaking, given that the whole of multicrew is virtual in nature:
I don't know how much you've dealt with VR in the real world, so I'll try to explain as best I can why you don't need a drone camera in any way.
Let's start with something simple: your character is not in your friend's ship; you are there virtually. So how do you look around your friend's ship as if you were really there? In VR you'll be able to lean all over and walk around the cabin, just as you can now in your own ship. How would you accomplish this? It's not as simple as "there is a camera there, and you're looking through the camera." There are a few ways this could work, both in real life and in the game.
One way would be a drone with stereoscopic cameras that moved exactly 1:1 with your head, with zero latency. That might be possible in the future, but it would be the most inefficient approach. Another would be some futuristic 360° stereoscopic lightfield camera, so that you could move in physical space and have the image stay correct. For an idea of what I mean, look at a 360° video on YouTube: you can swivel around and look wherever you want, but you can't lean to the side and see a different angle on the object in front of you. Lightfield cameras for VR are actually being worked on in real life; Intel had a recent demo at CES with a video of a scene, not a 3D model, that you could physically move around in VR. Really cool stuff. The problem with option #2 is that unless the camera moves, there's no way to look at things the camera can't see, like behind or under your chair, which you can do in Elite in VR.
So direct physical-camera approaches are pretty much out as ways this would be done in Elite. The way you would really accomplish this, and another thing being played with in real life for VR, is to place sensors (cameras, laser scanners, whatever they use in the future) around the area you want to virtually visit, and then do a real-time 3D reconstruction from all that data. It's the same way the Microsoft HoloLens scans your environment in real time to build a 3D map, or the way photogrammetry works. If you haven't used VR before, here's an example of someone's 3D reconstruction, created using only cameras and visualized in VR:
https://www.youtube.com/watch?v=ybdKP45IVDQ
That is something we can do today, in 2017. In the next decade or so, we will be able to capture scenes in greater fidelity than that, much faster, possibly even in real time. At the very least, it's not hard to imagine that in 1,300 years they can do in real time what we do in 2017.
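To make the reconstruction idea concrete, here's a minimal sketch. The heart of multi-sensor reconstruction is just transforming each sensor's local depth samples into one shared world frame, where they merge into a single point cloud. The sensor poses, single-point samples, and yaw-only rotation here are simplifications I've made up for illustration; real systems use full 6-DoF poses and vastly denser data.

```python
import math

def sensor_to_world(local_points, sensor_pos, sensor_yaw):
    """Rotate and translate points from a sensor's local frame into world space."""
    c, s = math.cos(sensor_yaw), math.sin(sensor_yaw)
    world = []
    for x, y, z in local_points:
        # rotate about the vertical (y) axis, then translate to the sensor's position
        wx = c * x + s * z + sensor_pos[0]
        wy = y + sensor_pos[1]
        wz = -s * x + c * z + sensor_pos[2]
        world.append((wx, wy, wz))
    return world

# Two sensors on opposite sides of the cabin each see the same chair corner
# from their own point of view; after transformation, their samples land on
# the same spot in one merged world-space point cloud.
front = sensor_to_world([(0.0, 1.0, 2.0)], (0.0, 0.0, 0.0), 0.0)
rear = sensor_to_world([(0.0, 1.0, 2.0)], (0.0, 0.0, 4.0), math.pi)
cloud = front + rear
```

Once every sensor's samples live in one frame like this, "the ship's reconstruction" is just that merged cloud, continuously refreshed.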
OK, so that explains how you're seeing your friend's cockpit, how you're visiting and visualizing it; at least the most likely explanation based on our current knowledge. How does this apply to the third-person camera? Simply put, you have all that sensor data from the ship, which knows exactly where every object around it is, its position and orientation, and the look of your own ship. It would be trivial to churn all that data together into the same kind of 3D reconstruction we can do in 2017, and once you have that reconstruction, you can place the virtual camera you view the scene from in literally any position, at any angle you want.
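To show why camera placement is free once you have a reconstruction, here's a hedged sketch of a virtual pinhole camera viewing a world-space point cloud. The pose and focal-length parameters are illustrative assumptions, not anything from the game; the point is that the "camera" is just numbers you can set to any value, with no physical drone involved.

```python
import math

def project(point, cam_pos, cam_yaw, focal=1.0):
    """Project a world-space point through a virtual pinhole camera.

    Returns (u, v) image-plane coordinates, or None if the point is
    behind the camera. cam_yaw rotates the camera about the vertical axis.
    """
    c, s = math.cos(cam_yaw), math.sin(cam_yaw)
    # translate into the camera's frame, then rotate by the camera's yaw
    x, y, z = (p - q for p, q in zip(point, cam_pos))
    cx = c * x - s * z
    cz = s * x + c * z
    if cz <= 0:
        return None  # behind the virtual camera, nothing to draw
    return (focal * cx / cz, focal * y / cz)

# The same reconstructed point viewed from two arbitrary camera poses:
ship_point = (0.0, 0.0, 10.0)
head_on = project(ship_point, (0.0, 0.0, 0.0), 0.0)   # camera dead ahead
raised = project(ship_point, (0.0, 2.0, 0.0), 0.0)    # "drone-like" view from above
```

Moving the gunner cam is just changing `cam_pos` and `cam_yaw` and re-projecting the cloud; nothing has to physically fly to that vantage point.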
People aren't blowing smoke when they argue we can practically do all of this today, in 2017.
Now, whether you, or anyone else, likes the gunner cam is a completely different argument, but I hope I've given you an idea of why no drone is needed for the kind of view you get from it.