Flickering lines are due to aliasing, which is the inability of the renderer to sample the underlying scene at a high enough rate to correctly resolve the true value of each pixel.
All renderers attempt to sample the underlying scene to determine the correct amount of light being reflected/emitted in the direction of the screen, which essentially comes down to figuring out, for each pixel, what the geometry behind that pixel looks like and how much of the pixel it occupies. In realtime renderers, you typically only have a single scene sample per pixel, which means that as a given polygon moves through screen space, the pixels it occupies are either on or off; there's no fading in or out.
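To make that concrete, here's a minimal sketch (the function names are made up for illustration, this isn't real renderer code): with one sample at each pixel centre, a pixel snaps between fully lit and fully dark as a polygon edge slides across it.

```python
def covered(sample_x, edge_x):
    """True if the sample point lies on the polygon side of a vertical edge."""
    return sample_x < edge_x

def shade_row(width, edge_x):
    # One sample per pixel, taken at the pixel centre (px + 0.5).
    return [1.0 if covered(px + 0.5, edge_x) else 0.0 for px in range(width)]

# As the polygon edge slides from x=3.2 to x=3.8, pixel 3 snaps straight
# from off to on; there is no in-between coverage value.
print(shade_row(6, 3.2))  # [1.0, 1.0, 1.0, 0.0, 0.0, 0.0]
print(shade_row(6, 3.8))  # [1.0, 1.0, 1.0, 1.0, 0.0, 0.0]
```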
When you have straight lines that approach horizontal or vertical, this tends to present as "shimmering", "crawling ants" or "jaggies", because the sampling is essentially being interpreted as a signal travelling along the line rather than a line moving across the screen. This is the definition of "aliasing": an undersampled signal gets reconstructed as a lower-frequency signal that appears to do something else. With polygons that are further from the vertical or horizontal limits the same problem exists, but it's far less noticeable. This is why some games don't exhibit the problem as badly: if you don't have a lot of horizontal/vertical lines (which you won't in a jungle, for example) then you're avoiding the underlying problem. Elite has a LOT of mechanical structures with horizontal & vertical lines that line up with the ship, particularly when docking, so the problem becomes VERY apparent.
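Here's a toy illustration of that staircase (again a hypothetical helper, not real renderer code): point-sampling a nearly horizontal line picks exactly one row per column, and nudging the line slightly makes the step jump a whole column at once, which is the "crawling" you see in motion.

```python
def rasterize_line(width, slope, intercept):
    # For each pixel column, light the single row the line passes through
    # at that column's centre (one sample per pixel).
    return [int(slope * (px + 0.5) + intercept) for px in range(width)]

print(rasterize_line(20, 0.1, 2.0))
# [2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3]
print(rasterize_line(20, 0.1, 2.1))  # nudge the line up very slightly
# [2, 2, 2, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3]
```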
With textures, the problem is easily solved by simply blurring the texture as it gets further from you (this is what mipmapping does), so that the frequencies in the sampled texture never exceed what the sampling pitch (pixel pitch/resolution) can resolve. But with geometry (like thin lines) you can't do this: you can't blur geometry.
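For the curious, mip generation is conceptually just repeated 2x2 box filtering. Here's a minimal sketch, assuming a square, power-of-two, greyscale texture:

```python
def next_mip_level(tex):
    # Box-filter a square power-of-two texture down to half size by
    # averaging each 2x2 block into one texel.
    n = len(tex) // 2
    return [[(tex[2*y][2*x] + tex[2*y][2*x+1] +
              tex[2*y+1][2*x] + tex[2*y+1][2*x+1]) / 4.0
             for x in range(n)]
            for y in range(n)]

def build_mip_chain(tex):
    chain = [tex]
    while len(chain[-1]) > 1:
        chain.append(next_mip_level(chain[-1]))
    return chain

# A 4x4 checkerboard averages out to flat grey in the smaller levels,
# which is exactly the "blur with distance" described above.
checker = [[(x + y) % 2 for x in range(4)] for y in range(4)]
for level in build_mip_chain(checker):
    print(level)
```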
FXAA attempts to resolve the problem by running an edge detection routine on the rendered image and then applying a blur to those edges. It's a hack and it's not really antialiasing, but it's fast & cheap. The downside is it tends to make the scene look fuzzy, and it doesn't actually solve the problem, it just masks it.
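To show the shape of the FXAA idea (and only the shape, real FXAA is far more careful about edge direction and search distance), here's a deliberately crude sketch on a greyscale image: find high-contrast pixels, blend them toward their neighbours.

```python
def post_blur_edges(img, threshold=0.25):
    # img is a 2D list of greyscale values in [0, 1].
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            nbrs = [img[y-1][x], img[y+1][x], img[y][x-1], img[y][x+1]]
            if max(nbrs) - min(nbrs) > threshold:  # crude "edge detected"
                # Blend the pixel halfway toward its neighbourhood average.
                out[y][x] = 0.5 * img[y][x] + 0.5 * sum(nbrs) / 4.0
    return out
```

Note that this only ever pulls information from pixels that were already rendered, which is why it masks the problem rather than solving it.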
The "correct" approach (short of a full analytical solution to the underlying scene) involves sampling the underlying scene at a high enough frequency (number of pixel samples) that it can accurately reconstruct the image for display, this gets rid of the jaggies, but requires throwing a hell of a lot more samples into the scene, each pixel has to be sampled multiple times so instead of having on/off states as a polygon travels through it, it fades on and off, which reduces the jaggies substantially. This is called "supersampling" or in NVidia parlance "Dynamic Super Resolution". You're rendering the scene at a higher resolution and then "downscaling" it for display, which softens the edges and gives you a truer depiction of the underlying scene. The downside here is that you're essentially throwing 4x (at least in the case of 2x DSR) the number of samples you need to at the scene, which makes it very inefficient and therefore slow.
SMAA is a hybrid approach that sits closer to full supersampling: it runs much smarter edge/pattern detection than FXAA, and it tries to figure out at render time where the high-frequency areas are and concentrates the extra work there, so in theory you're only spending samples on the areas that need it rather than supersampling everything. It's slower than FXAA but faster than DSR, and it still doesn't quite resolve the scene correctly. I find that on my Rift DK2 it's the best solution until I can get my twin 980s set up so I can run the scene at 4k.
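To round it off, here's a toy version of the adaptive idea described above: one cheap pass, then extra samples only on high-contrast pixels. To be clear, this is the general adaptive-sampling concept rather than SMAA's actual pipeline (which works on morphological edge patterns), and shade(x, y) is again a hypothetical stand-in for the renderer.

```python
def shade(x, y):
    # Stand-in for the whole renderer: a hard diagonal polygon edge.
    return 1.0 if y > 0.33 * x else 0.0

def adaptive_render(width, height, threshold=0.5):
    # First pass: one sample per pixel, at the pixel centre.
    img = [[shade(x + 0.5, y + 0.5) for x in range(width)]
           for y in range(height)]
    out = [row[:] for row in img]
    for y in range(1, height - 1):
        for x in range(1, width - 1):
            nbrs = [img[y-1][x], img[y+1][x], img[y][x-1], img[y][x+1]]
            if max(nbrs) - min(nbrs) > threshold:
                # High-contrast pixel: spend 4 extra samples here only,
                # on a 2x2 grid inside the pixel.
                out[y][x] = sum(shade(x + (i + 0.5) / 2, y + (j + 0.5) / 2)
                                for j in range(2) for i in range(2)) / 4.0
    return out

for row in adaptive_render(8, 5):
    print(row)
```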
TL;DR: Jaggy lines (aliasing) are an inherent problem in renderers. Antialiasing routines attempt to fix it, but the best solution is supersampling the underlying scene, and you need a beast of a machine to do that.