Interesting. Let's explore this. Firstly, though, I note that the bolded part above is the part I concentrated on when explaining this to sollisb. I showed him an image similar to the following, which is a good one to use for the analysis.
http://i.imgur.com/nThgQYD.png
So if we can explore your solution...
Let's stop there. I'm not sure how reliably a bot could "turn towards the position 10 km in front of the slot". For a human that's easy, but how does the bot know the concept of "front of the slot"? To answer that, we can look at the station hologram's orientation.

The issue I see here is that pattern recognition against an unknown orientation anywhere in 360 degrees is flaky at best with off-the-shelf script-bot software. I certainly don't know them all, but the basic (free, ahem) software we use for use-case testing doesn't handle rotation or scaling of the image to be pattern-matched. Perhaps lower-level use of OpenCV could manage it? (I've sketched roughly what I mean at the end of this post.) An alternative might be to rotate your ship through the entire spherical solid angle and hope the pattern recognition can match, say, to within about 5 degrees once the ship hits the right angle - the so-called "reference angle" from which calculations can be made to "10 km in front of the slot".

Is that how you see it, or is there readily available software out there that pattern-matches at any angle? More importantly, how reliable do you think this would be?
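To make the OpenCV idea concrete, here's a minimal sketch of rotation-tolerant matching using ORB feature descriptors instead of plain template matching. The filenames ("hologram_template.png", "screenshot.png") and the 0.75 ratio-test threshold are placeholders for illustration, not anything our current tooling does. Note the caveat: feature matching copes with in-plane rotation and scale, but recognising the hologram from an arbitrary out-of-plane viewing angle is a much harder problem, which is why I suspect the "sweep the ship until it matches" fallback might still be needed.

```python
# Rotation/scale-tolerant matching of the station hologram in a screenshot.
# Filenames are placeholders; in practice the screenshot would come from the
# bot's screen-grab step.
import cv2
import numpy as np

template = cv2.imread("hologram_template.png", cv2.IMREAD_GRAYSCALE)
scene = cv2.imread("screenshot.png", cv2.IMREAD_GRAYSCALE)

# ORB keypoints are rotation-invariant and reasonably scale-tolerant,
# unlike plain cv2.matchTemplate, which only handles translation.
orb = cv2.ORB_create(nfeatures=2000)
kp_t, des_t = orb.detectAndCompute(template, None)
kp_s, des_s = orb.detectAndCompute(scene, None)

# Brute-force Hamming matching with a ratio test to drop weak matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
pairs = matcher.knnMatch(des_t, des_s, k=2)
good = [m for m, n in (p for p in pairs if len(p) == 2)
        if m.distance < 0.75 * n.distance]

if len(good) >= 10:
    src = np.float32([kp_t[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_s[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    # The homography encodes where the hologram sits on screen and its
    # in-plane rotation/scale relative to the template.
    H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is not None:
        print("Hologram matched; RANSAC inliers:", int(inlier_mask.sum()))
    else:
        print("RANSAC could not fit a homography")
else:
    print("No reliable match; only", len(good), "good feature matches")
```

In principle the homography's rotation component would give you the "reference angle" to calculate from; whether it stays stable to anything like 5-degree granularity is exactly the reliability question I'm asking.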