Wow.. Odyssey is amazing!!! ...if you have a 1080 Ti.

Hello forums. Just had to post to let everyone know how shocked I am that Odyssey can actually be good. Yes, really!!!!

Noticed a few weeks ago that, since the 4090 launched, the price of used 1080 Tis has come down to what my middle-aged brain thinks you should be paying for a graphics card. Given that my monitors require dual-link DVI, this happens to be the fastest card still offering DVI, and my local pawn shop had the Strix version and offers a 24-month warranty, I decided to bite.

I'm amazed at how good Odyssey is if you meet the requirements. My 2600K @ 4.1 GHz and 1080 Ti, running 2560x1600 capped at 60 fps, do a graceful dance between being CPU bottlenecked, GPU bottlenecked and hitting 60 fps during the tutorial mission, with all settings set to full except the draw distance and terrain sliders, AA, and particle FX. Playing the space game is next level thanks to the lighting.

Anyways:

  • The FPS gameplay proper, a year out from the excitement, is beyond pointless. I'm looking forward to the space game improvements, exobiology, carriers. And the graphics.. my god, the graphics.
  • I used to read posts from the most sane forum goers here saying they wouldn't go back to Horizons. Looking at how it ran on my prior 1060, I was truly dismayed: how could anyone say that...
  • Having experienced both sides of the fence now, I can reasonably guess that the bulk of the "mostly negative" came from godfather Braben having claimed in public it ran on his 970. I know from my own machine that anything less than maybe a 2070 won't cut it. You might think it does, but it doesn't. That's the actual minimum system requirement. I imagine the beginning of Snow Crash, and Dr Kay.
  • Odd seems to love VRAM. In concourses it often uses 10 GB plus. I only get 40+ fps there too, but it's stable.
  • You'll want to override your StarInstanceCount to 500,000+ (see the snippet after this list).
  • No form of resolution scaling produces civilized graphics (native is also the only mode where the bad AA is acceptable).
  • As a general rule, don't force the high-performance mode in your Nvidia driver. That causes crazy temps...
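
For reference, a minimal GraphicsConfigOverride.xml doing just the StarInstanceCount bump would look something like this. I'm guessing at the exact layout; the section names just mirror the full override posted further down the thread, and the <High> block should match whichever galaxy map quality preset you actually run.
Code:
<?xml version="1.0" encoding="UTF-8" ?>
<GraphicsConfig>
    <!-- Hypothetical minimal override: only raise the number of star sprites used for the galaxy map / skybox -->
    <GalaxyMap>
        <High>
            <StarInstanceCount>500000</StarInstanceCount>
        </High>
    </GalaxyMap>
</GraphicsConfig>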

Well, that was a pleasant surprise. Quite excited to finally check out Odd. If anything, please know that when people run Odyssey on below-spec machines, it's beyond just a poor experience, it's dysfunctional. They really should have upped the minimum requirements and apologised for the mess. ...Now that the rest of the game works, I don't even mind the missing-objects-in-supercruise bug.. though it is a bug. Odd planet surfaces look amazing on ultra.... :p



(Two screenshots attached.)
 
Someone reported that Oddy did just fine on a GTX1050 🤷‍♂️
So it pretty much depends on your preferences and tolerance for playing at 30-40 fps.
Also, not everyone insists on playing on Ultra.

The Lave Radio guy? That's his job :) I personally tried to tweak and rationalise Odd for months on slower hardware. You can get pretty far, even.. but where it falls over it's not playable. Specifically ground bases, Coriolis stations, and when the engine gets upset swapping VRAM in and out and permanently degrades performance.. I was uninspired and gave up.. you'd play Horizons out of being civil to yourself.

So many months of.. I think I've got it.. oh no I don't. And I was only expecting 45 fps, and used all sorts of resolution scaling.
 
Yeah, the 1080 Ti is still good enough to let me happily skip upgrading to the 2xxx, 3xxx and now 4xxx series. :ROFLMAO:

It has 3x the number of cores compared to what I was previously running, and 11 GB of VRAM instead of 3.

...

So I've been tweaking some more, of course. I need to stop, because the tutorial level performs particularly badly compared to the rest of the game, and what reason is there to go back to it apart from the FPS mission board? I.e., if you skip that, it doesn't matter.

The model draw distance slider seems to affect performance the most once the power comes back on.
Not sure about the full impact of checkerboard terrain. It adds back a bit of fps but also adds a shimmer to distant angles(?) and might not be needed around settlements.
Definitely noticed the CPU spike when NPC enemies arrive. I don't think I'll be playing this combat much :)
 
A 1070 does OK as well, at least in VR, using an Oculus Rift Touch and avoiding the footsie combat gameplay. Maybe I should upgrade soon, but the GPUs aren't even 10 years old yet...

:D S

Hey, at some point you don't need to upgrade your GPU any more if you don't keep increasing the resolution of your monitor.. I get it's a very different universe for kids on PC number 1, but if you already have kit, spending all that money for pretty much the same thing at a higher res sounds so painful.
 
I have a GTX 1660S (Asus DUAL-GTX1660S-O6G-EVO) and have detected no graphics problems.
 
I've still got two 1080 Tis. Thought about getting rid of the weaker one a while back when they were still going for a decent bit second hand, but when my 6900 XT died it was nice to have a competent backup.

The 1080 Ti was the high-end value that defined the Pascal generation of cards, and the generation after it. The 2080, 2070 Super, Radeon VII and 5700 XT were all roughly 1080 Ti performance despite not being much cheaper and arriving two years later. Even the RX 6700 and RTX 3060/3070 weren't dramatically faster. The card still holds up well in a lot of stuff.

This is the GraphicsConfigOverride.xml I used with my 1080 Ti to maximize texture and skybox quality without hurting performance or hyperspace load times:
Code:
<?xml version="1.0" encoding="UTF-8" ?>
<GraphicsConfig>
    <!-- Each section below overrides the matching section of the game's base graphics config -->
    <!-- HDR glare compensation -->
    <HDRNode_Reference>
        <GlareCompensation>1.2</GlareCompensation>
    </HDRNode_Reference>
    <!-- Planet surface textures: higher resolution, more generation work per frame -->
    <Planets>
        <Ultra>
            <TextureSize>8192</TextureSize>
            <WorkPerFrame>512</WorkPerFrame>
        </Ultra>
    </Planets>
    <!-- Skybox (galaxy background) texture resolution -->
    <GalaxyBackground>
        <High>
            <TextureSize>3328</TextureSize>
        </High>
    </GalaxyBackground>
    <!-- Bloom/glare tuning -->
    <Bloom>
        <Ultra>
            <GlareScale>0.03</GlareScale>
            <FilterRadius>0.8</FilterRadius>
            <FilterRadiusWide>3.5</FilterRadiusWide>
        </Ultra>
    </Bloom>
    <!-- Environment (reflection) map resolution and mip count -->
    <Envmap>
        <High>
            <TextureSize>1024</TextureSize>
            <NumMips>10</NumMips>
        </High>
    </Envmap>
    <!-- Galaxy map detail: nebula resolution/sample counts and star/Milky Way sprite counts -->
    <GalaxyMap>
        <High>
            <HighResNebulasCount>4</HighResNebulasCount>
            <LowResNebulaDimensions>128</LowResNebulaDimensions>
            <HighResNebulaDimensions>512</HighResNebulaDimensions>
            <LowResSamplesCount>64</LowResSamplesCount>
            <HighResSamplesCount>96</HighResSamplesCount>
            <MilkyWayInstancesCount>24000</MilkyWayInstancesCount>
            <MilkywayInstancesBrightness>0.75</MilkywayInstancesBrightness>
            <MilkywayInstancesSize>0.75</MilkywayInstancesSize>
            <StarInstanceCount>12000</StarInstanceCount>
        </High>
    </GalaxyMap>
    <!-- Volumetric effects rendered at reduced resolution with light blurring -->
    <Volumetrics>
        <Ultra>
            <DownscalingFactor>2</DownscalingFactor>
            <BlurSamples>3</BlurSamples>
        </Ultra>
    </Volumetrics>
</GraphicsConfig>
 

Thank you for that. I took Planets, GalaxyBackground and Volumetrics (and also nipped the model draw distance down by a tick and pretended I didn't see anything). It's playable now and I don't feel the need to tweak it any more.. pending issues in the real game. I guess the fog or distance setting must have picked something up just enough..
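
For anyone who wants to copy the same trimmed-down override, it would look roughly like this; the values are lifted straight from the post above (the draw distance change is the in-game slider, not part of the file):
Code:
<?xml version="1.0" encoding="UTF-8" ?>
<GraphicsConfig>
    <!-- Trimmed override: just the Planets, GalaxyBackground and Volumetrics sections from the post above -->
    <Planets>
        <Ultra>
            <TextureSize>8192</TextureSize>
            <WorkPerFrame>512</WorkPerFrame>
        </Ultra>
    </Planets>
    <GalaxyBackground>
        <High>
            <TextureSize>3328</TextureSize>
        </High>
    </GalaxyBackground>
    <Volumetrics>
        <Ultra>
            <DownscalingFactor>2</DownscalingFactor>
            <BlurSamples>3</BlurSamples>
        </Ultra>
    </Volumetrics>
</GraphicsConfig>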

Randomly.. while I couldn't play it, it was easy to think all sorts of uncomplimentary things about Frontier's technical skill, but now that I see it running, it may just be really expensive rendering techniques that don't scale down. The overall appearance is unique-ish in its own way.

Nice one.
 
I have a 1080 Ti in a second, hardly used rig; fps is great in 2D with everything maxed out. It didn't cut it on my VR rig. There's also a fan missing on the 1080 Ti, and it doesn't break a sweat.

Possibly selling the 1080 Ti in the UK if anyone is on a budget. I have a 1070 sitting around too; never tried it with Ody, but it was fine with Horizons. Again, I don't mind selling cheap to a CMDR.
 
I'm a noob when it comes to this numbering scheme. Is a 1660 Ti > 1080 Ti? As I understand it, the 16 means it's newer, but 60 < 80 (so a 1070 < 1080). Combined, I don't know how it all adds up.
 
I played Odyssey on an RX 580 for a year and the framerate was very acceptable (40+fps at busy settlements, 30+ at conflict zones) at 1080p with med-high settings. Now I have a 3060 Ti and the framerate is a bit better, but it's been like 6 months since I bought it and I don't think I've yet managed to justify the purchase...
 
The GTX 1080 Ti is about 70% faster than the GTX 1660 Ti in average benchmarks.
Edit: the first digits are the family/generation, the last two are the rank within the family. The 1660 sits between the 10xx and 20xx families in the lineup.
 
I'm still running a water-cooled 980 Ti and an Oculus CV1, and I'm still loving it. Graphics are lovely at mid-high res.
The hardware is a bit outdated and needs an upgrade before Facebook forces me to use a Facebook account for my CV1.
New rig hopefully in the first quarter of next year.
 
Welcome to the (not quite logical) PC world. Let's clarify: the GTX 1660/1660S/1660Ti were designed as a stopgap during the graphics card shortage and sit a bit outside the usual lineup. Thinking too deeply about this phenomenon is a waste of time. ;)
As long as I can keep playing my other favorite games at 1080p x 60 fps (30 fps for MSFS) at high / ultra settings, I'm happy. In fact, many of my games can run 60 fps at 1440p, which my 4K TV upscales surprisingly well. Native 4K kills framerate, however.
 