...IDK what Cobra does, but it does look better out of the box with the rift. Perhaps they set something hardcoded?
(Sorry for running down this sidetrack, everybody, but I find the explanation worth it. :7)
The reason the image looks better in the Rift than in the Vive is that its angular resolution is in fact higher.
Now: both HMDs have display panels with the same dimensions in pixel-count terms (...which has de facto come to be called "resolution"), but they spread those pixels, as seen through the lenses, over different view angles, giving them different resolution in the truer sense of the word: detail density. In print you would use measurements in "Dots Per Inch", but for HMDs "Pixels Per Degree" is more appropriate.
I am using approximate, vaguely esti-measured and recollected numbers here: The horizontal field of view of the display panel through a single lens of an HTC Vive, if you press your eyeball all the way up to it, is somewhere in the range of 90 to 100 degrees, whereas the corresponding figure for the Oculus Rift is around 75-90 degrees. This means that per degree of your field of view, you have more pixels in the Rift than in the Vive - about a 20% difference. The illustration I like to use is that the same tiny bitmap font you get to draw in a 5x5 grid in the Rift, you somehow have to make readable with only 4x4 in the Vive.
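To make the arithmetic behind that ~20% figure explicit, here's a little sketch using the rough per-eye FOV estimates above, plus the 1080-pixel horizontal panel width per eye that both headsets share. All numbers are ballpark, as stated - this is just dividing pixels by degrees:

```python
# Rough pixels-per-degree comparison. Panel width and FOV figures are
# the approximate estimates from the post, not official specs.
PANEL_WIDTH_PX = 1080   # horizontal pixels per eye, both HMDs
VIVE_HFOV_DEG = 95      # midpoint of the 90-100 degree estimate
RIFT_HFOV_DEG = 80      # within the 75-90 degree estimate

def pixels_per_degree(panel_px: float, hfov_deg: float) -> float:
    """Average angular pixel density across the lens FOV (ignores lens
    distortion, which concentrates detail toward the center)."""
    return panel_px / hfov_deg

vive_ppd = pixels_per_degree(PANEL_WIDTH_PX, VIVE_HFOV_DEG)  # ~11.4
rift_ppd = pixels_per_degree(PANEL_WIDTH_PX, RIFT_HFOV_DEG)  # 13.5
print(f"Vive: {vive_ppd:.1f} ppd, Rift: {rift_ppd:.1f} ppd, "
      f"Rift advantage: {(rift_ppd / vive_ppd - 1) * 100:.0f}%")  # ~19%
```

With these particular guesses the Rift comes out about 19% denser, which is where the "about a 20% difference" comes from; nudge the FOV estimates within their stated ranges and the figure moves around accordingly.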
"Hang on a minute", you'll be saying by now, "I get a much better field of view in my Rift than that." This is where you have some crafty decisions on Oculus' part coming into the picture, when it comes to tradeoffs: In the Vive, both eyes cover almost the entire total FOV of the headset, providing full stereoscopy over all but the last few degrees out to the sides. If the Rift, however, the game camera frusta are asymmetric, and a bit of that stereo overlap is sacrificed for more resolution, and a bit of fov in the periphery; Your left eye does not see as far to the right, as your right one does, and vice versa. This means that the
total FOV, for both eyes combined, is only 5-10 degrees lower than that of the Vive.
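A toy sketch of that overlap-vs-total tradeoff, with invented angles (not measured from either HMD) just to show how narrower, outward-leaning per-eye frusta can still give nearly the same combined FOV:

```python
# Each eye's horizontal view is (left_edge, right_edge) in degrees,
# with 0 = straight ahead. Angles are illustrative only.

def combined_fov(left_eye, right_eye):
    """Total horizontal FOV covered by either eye."""
    return max(left_eye[1], right_eye[1]) - min(left_eye[0], right_eye[0])

def stereo_overlap(left_eye, right_eye):
    """Horizontal FOV covered by BOTH eyes - where you get stereopsis."""
    return min(left_eye[1], right_eye[1]) - max(left_eye[0], right_eye[0])

# "Vive-style": each eye sees ~100 degrees, nearly all of it shared
vive_l, vive_r = (-51, 49), (-49, 51)
# "Rift-style": each eye sees only ~82 degrees, frusta leaning outward,
# so each eye's pixels cover fewer degrees (higher pixels-per-degree)
rift_l, rift_r = (-48, 34), (-34, 48)

print(combined_fov(vive_l, vive_r), stereo_overlap(vive_l, vive_r))  # 102 98
print(combined_fov(rift_l, rift_r), stereo_overlap(rift_l, rift_r))  # 96 68
```

In this made-up example the combined FOV only drops by 6 degrees, while each eye's frustum shrinks by ~18 degrees - which is the resolution win - at the cost of a big chunk of stereo overlap.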
Since the Rift's lenses magnify less, they also have some spare FOV, which they utilise on the corners of the screen, through a square-ish cut, making fewer of the pixels there go to waste. In the Vive, the round lenses reach as far as they can all around, except in the middle, where the left and right views, drawn side-by-side onto the single image sent to the screen, truncate one another, taking notches out of each other. (EDIT: ...so in the Vive the image is round, like the lenses - no reason to render anything you can't see anyway. EDIT2: Incidentally, I am curious whether FDev takes this into account with Elite; that large part of the image, out in the corners, can be masked off from the pixel shaders, so that you don't waste GPU cycles on them.)
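For a sense of how much GPU work such corner masking could save, here's a back-of-the-envelope sketch with idealized geometry (a circular lens image inscribed in a square viewport - not either HMD's actual hidden-area mask, which is an irregular shape):

```python
import math

def wasted_fraction_circle_in_square() -> float:
    """Fraction of a square render target lying outside its inscribed
    circle - pixels a renderer could stencil out before shading."""
    # square of side 2r has area 4r^2; inscribed circle has area pi*r^2
    return 1 - math.pi / 4

print(f"{wasted_fraction_circle_in_square():.1%} of pixels could be masked")
# -> roughly 21.5% of the square never reaches the eye
```

So even in this idealized case, over a fifth of the pixel-shader invocations per eye go to parts of the image the lens can never show you.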
I am quite convinced this lower field of view per lens of the Rift is also a considerable part of the reason you get less falloff of sharpness out toward the edges, compared to the Vive - it's not just better optical geometry, but something even simpler: you don't use as much of the lens in the first place, except on the diagonal.
Clever and effective as this tradeoff scheme is, it is also the greatest of my reasons for preferring my Vive over my Rift, including for Elite and other cockpit games; I just can't get over the "per-eye black bars", where stereopsis ends (EDIT3: feels kind of like having the 2001 monolith for a nose prosthetic). I seem to be something of an outlier in this regard; nobody else seems to even notice them in the first place. :7