So in light of all the performance issues being brought up, I decided to do a big, comprehensive test to see just how big a difference there is between the Windows and Mac OS X builds of Elite: Dangerous. And what better way to do that than to jack the settings all the way up.
For those who are unfamiliar with it, supersampling is basically a driver trick that renders the game at a resolution much higher than what your monitor is actually capable of, puts a filter over it, and then downscales the game to fit the monitor.
Elite: Dangerous's supersampling feature doesn't tell you the exact resolution the game is running at; it just gives you a multiplication factor, with 1.0x being your monitor's native resolution and what you should be using by default. For the sake of this example, I'm going to use 1920x1080, or "1080p", as the default resolution for the test. That means a supersampling factor of 1.5x has the game rendering at 2880x1620 ("3k"), and 2.0x puts it at 3840x2160 ("4k").
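If it helps to see the math spelled out, here's a quick sketch of how the factor maps to a render resolution (plain Python, nothing pulled from the game itself; the function name is mine, and I'm assuming the factor simply scales each axis):

```python
def supersampled_resolution(native_w, native_h, factor):
    # Render resolution implied by a supersampling factor, assuming the
    # factor scales the width and the height independently.
    return int(native_w * factor), int(native_h * factor)

# The numbers used in this test, with 1080p as the native resolution:
print(supersampled_resolution(1920, 1080, 1.0))  # (1920, 1080) - native
print(supersampled_resolution(1920, 1080, 1.5))  # (2880, 1620) - "3k"
print(supersampled_resolution(1920, 1080, 2.0))  # (3840, 2160) - "4k"
```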
Naturally, this is extremely hard on any computer's GPU. So why punish your video card and force the game to run at 4k+ resolutions when you're stuck on a 1080p monitor? In short, it's a way to get around the problems of Anti-Aliasing. Anti-Aliasing, or "AA", is a filter placed over the screen to reduce the jagged edges left over when the GPU approximates the scene's geometry with a grid of pixels. The lower the resolution, the less work the GPU has to do to render each frame, but the coarser that approximation becomes. The higher the resolution, the more accurately the scene is drawn. AA gets around the need for extra resolution by filtering the image instead, at the cost of a bit more GPU power.
There are a great many different types and implementations of AA, but every one of them comes with its own inherent problems. Some blur the image, while others carry insane performance penalties for very little visual gain.
Assuming you have the horsepower to spare, rendering the game at a flat-out higher resolution negates the need for AA altogether. At a higher render resolution, lines that would otherwise zig-zag across the pixels as they cross the screen at an angle come out much straighter once the image is scaled back down, which is why supersampling is much preferred over any Anti-Aliasing method.
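For anyone curious what that downscaling step actually looks like, here's a rough sketch of the general idea (my own illustration in Python/NumPy, not how Frontier actually implements it): each pixel on your 1080p monitor ends up as the average of a block of rendered pixels, which is what smooths out the jagged edges.

```python
import numpy as np

def downscale_box_filter(image, factor):
    # Downscale a supersampled frame back to native resolution by averaging
    # each factor x factor block of pixels (a simple box filter). 'image' is
    # an (H, W, 3) array; 'factor' is an integer, e.g. 2 for 2.0x.
    # Fractional factors like 1.5x would need a proper resampling filter.
    h, w, c = image.shape
    return image.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

# A stand-in "4k" frame downscaled to 1080p: every output pixel is the
# average of a 2x2 block of rendered pixels.
frame_4k = np.random.rand(2160, 3840, 3)
frame_1080p = downscale_box_filter(frame_4k, 2)
print(frame_1080p.shape)  # (1080, 1920, 3)
```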
I will be conducting the test on the following system:
i7-4770K @ 4.3GHz
Corsair H100i aftermarket cooler
Gigabyte Z87X-OC
32GB Corsair Vengeance LP 1866MHz RAM
2x EVGA GTX 770 Superclocked w/ ACX Cooling (SLI)
Corsair HX750 80+ Gold PSU
Samsung 840 Pro SSD - 128GB
Note that I have disabled SLI on Windows in order to get a fair single-GPU test on both OSes.
First up, Mac OS X at the native 1920x1080 resolution:
View attachment 32222
Next up, Windows 7 at 1080p:
View attachment 32223
Already, the Windows build looks quite a bit nicer. The lighting and particle effects are much better, and the Mac build seems to be missing a lot of the shaders used on Windows. The game also runs a lot faster on Windows 7, at 147 fps. And yes, I had the gamma settings exactly the same, both in-game and at the OS level. The difference in color is due to the different color schemes used by the stations: I'm at Meuron Station in the Windows build and Dalton Orbital in the Mac build. I didn't have time to fly all the way back to Dalton Orbital, so I found the closest match to it that I could.
Next is OS X at 4k via supersampling:
View attachment 32224
Compare that to Windows 7 at 4k:
View attachment 32225
The improvement is immediately apparent. Rendering the game at 4k, even on a 1080p monitor, looks astonishingly better. Right off the bat, you can see that the lines on the outer frame of the Viper in the foreground are much more defined at 4k, while the far end of the station has smoother lines along its structure.
Though again, Windows holds the clear lead here. A single GTX 770 can render the game at 3840x2160 and still maintain 60+ fps. The Mac OS X build gets barely half that, and that's despite leaving out many of the lighting and shading effects noted above; with those effects absent, the Mac frame rate should be quite a bit higher than it is, not lower.
Let's try it again:
Mac OS X at 1080p:
View attachment 32229
And Windows 7 at 1080p:
View attachment 32230
Mac OS X at 4k:
View attachment 32227
Windows 7 at 4k:
View attachment 32228
Again, the differences are pretty obvious, mostly around the holo-ads and the Viper's weapon mounts. And again, Mac OS X shows gross deficiencies in lighting and particle effects, as well as a pitiful frame rate.
Overall, it was a fun experiment, but I'm really hoping the Mac version starts showing major improvements in its lighting, shading, and performance in future builds. I took a lot more screenshots and video on both OSes, but sadly those will have to wait a while to be uploaded.
- - - Updated - - -
Not sure why the attachments aren't loading in the post, but the links work, it seems.