A message probably lost to the sea.

Hello devs, I run an Aspire A515-54, I think, and it has an Nvidia MX250. When I'm in space away from a station my frame rates are clean and crisp, but as soon as I dock my lag is really bad. So I was wondering if for Update 10 you could optimize a little more for lower-end graphics cards (I'm OK with less sharp graphics or ambient occlusion). Thank you and have a great day.
 
Hello devs, I run an Aspire A515-54, I think, and it has an Nvidia MX250. When I'm in space away from a station my frame rates are clean and crisp, but as soon as I dock my lag is really bad. So I was wondering if for Update 10 you could optimize a little more for lower-end graphics cards (I'm OK with less sharp graphics or ambient occlusion). Thank you and have a great day.
Let's hope they optimize for us all, brother.
👊
 
Sounds like his FPS experienced a pretty big descent though.
[attached image]
 
Hello devs, I run an Aspire A515-54, I think, and it has an Nvidia MX250. When I'm in space away from a station my frame rates are clean and crisp, but as soon as I dock my lag is really bad. So I was wondering if for Update 10 you could optimize a little more for lower-end graphics cards (I'm OK with less sharp graphics or ambient occlusion). Thank you and have a great day.
I don't mean to be nasty, but the MX250 has roughly the graphical processing power of a GTX 470 or maybe even a GTX 460, which were mid-range mainstream cards back in 2010... You're asking them to try to optimise a game down to ten-year-old hardware; good luck with that.

Now that I've utterly managed your expectations down through the floor, and thus gotten the bad news out of the road, let's look at the good stuff: there is a guide by CMDR Exegious going over all the graphics settings that might help you efficiently trim your settings down to something more in line with your hardware's capabilities. I say efficiently because if you tone down some of the things that chew up GPU resources disproportionately to the amount of visual impact they have, you might not even notice much degradation in visuals yet still gain a decent hike in performance. The guide is in a video linked here:
Source: https://www.youtube.com/watch?v=G7efYzpquIs


And I'm sure there was a forum post with all the information laid out in it as well, but I don't have the link handy.

Final thought: since you are asking for this in Update 10 I'm presuming you are talking about Odyssey, in which case, even though you are using NVidia hardware, you can still use AMD's FSR to gain some performance. This effectively renders the game at a slightly lower resolution and upscales it on the fly to something approximating your output resolution, in much the same way as an upscaling DVD / Blu-ray player would enhance the lower-fidelity video on the older discs and make it look almost as if it were native to the screen's resolution. A DVD would be upsampled / upscaled from its native 480p or 576p to 1080p, and while not full-on 1080p fidelity, it certainly looked better than leaving it at the original resolution.
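To put rough numbers on that, here's a small back-of-the-envelope sketch (not FSR's actual algorithm, just the resolution maths) showing how many fewer pixels the GPU has to shade per frame at a few illustrative render scales before the image is upscaled back to 1080p:

```python
# Back-of-the-envelope: pixels shaded per frame at a reduced internal
# resolution vs. native 1080p. Scale factors are illustrative, not FSR's
# exact presets; the upscaling pass itself is comparatively cheap.
output_w, output_h = 1920, 1080
output_pixels = output_w * output_h

for label, scale in [("mild", 0.77), ("moderate", 0.67), ("aggressive", 0.50)]:
    render_w, render_h = int(output_w * scale), int(output_h * scale)
    saving = 1 - (render_w * render_h) / output_pixels
    print(f"{label:10s} render at {render_w}x{render_h} -> "
          f"~{saving:.0%} fewer pixels shaded per frame")
```

Shader cost scales roughly with pixel count, so even a modest render-scale reduction buys back a fair chunk of frame time on a GPU like the MX250.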

So yeah, TLDR - You're unlikely to get the devs to pull a rabbit out of the hat and make the game run well on such a low end graphics card, but there are tricks and tools you can use to get better performance yourself.

Also worth noting: if you are going to be making extensive changes to your graphics settings, you might want to look up Dr. Kaii's ED Profiler. This is a control-panel application with pull-down menus etc. that runs as a Windows app and allows you to change the settings file without running the game. It is a lot better to use if you are working step by step through a guide than changing one setting, alt-tabbing out, watching the next bit of video / reading the next bit of forum post, alt-tabbing back into the game, changing the next setting, rinse, repeat.
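For the curious, this is roughly what such a tool does under the hood: back up the settings file, change a value, write it back. The sketch below is hypothetical - the path and the element name are placeholders, not the verified Odyssey settings schema - so treat it as an illustration of the idea rather than a recipe:

```python
# Hypothetical sketch of editing a graphics settings XML file outside the game.
# The path and the "AmbientOcclusion" element name are placeholders/assumptions,
# not verified against the actual Odyssey settings files.
import shutil
import xml.etree.ElementTree as ET
from pathlib import Path

settings = (Path.home() / "AppData/Local/Frontier Developments"
            / "Elite Dangerous/Options/Graphics/Settings.xml")  # assumed location

shutil.copy2(settings, str(settings) + ".bak")  # always keep a backup

tree = ET.parse(settings)
node = tree.getroot().find(".//AmbientOcclusion")  # hypothetical element name
if node is not None:
    node.text = "0"  # e.g. turn the effect down/off
    tree.write(settings, encoding="utf-8", xml_declaration=True)
```

ED Profiler essentially wraps this sort of edit in a GUI, plus profiles you can switch between.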
 
I'm not a GPU specialist but it seems to me that the MX250 is from 2019, while the GTX780 is from 2013. So OP should be above minimum specs :unsure:

GTX 780 is about three times as fast as an MX250, even without accounting for the more severe VRAM constraints on the latter.

The GTX 780 is a nearly top-end part of its architectural generation. The MX250 is one of the slowest/smallest low-power parts from two generations later.
 
I'm not a GPU specialist but it seems to me that the MX250 is from 2019, while the GTX780 is from 2013. So OP should be above minimum specs

The year does not matter much.
What matters is the capabilities of the GPU in question. The GTX780 was a high-end GPU and was better than the next generations' entry-level or mid-range GPUs.

In OP's case, the MX250 is an entry-level, low-power mobile GPU. If they have the 2 GB VRAM variant, it's even worse.
It's definitely under the GTX780 - actually, according to notebookcheck.net it's sort of comparable with the GTX675M - which is well under the GTX780 too (but there the naming convention makes this more obvious).

Edit:
For Nvidia the naming convention is (a small decoding sketch follows at the end of this post):
  • the first digit (for 3-digit names) or the first 2 digits (for 4-digit names) is the generation/series
  • the next digit is the range/level: the higher the better, with 8 and 9 being the very top end

So in our case the GTX780 is from series 7 and it's a high-end GPU.
It is better than the next generations' mid-range cards such as the GTX960, and probably comparable with a GTX970.
Last but not least, the mobile parts use less power by nature and are weaker than the corresponding desktop components.

In the specific case of the OP, the MX250 was launched alongside the 20 series, but it's a low-power mobile part and it's really entry level (it does not have GTX in the name, so it's not really aimed at gaming, but mostly at multimedia).
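To make the digit decoding above concrete, here's a minimal sketch (assuming the simple convention described in this post; it deliberately ignores Ti/Super suffixes, the MX line and mobile variants, which don't follow it cleanly):

```python
# Minimal sketch of the naming convention above: split a GeForce model number
# into series (generation) and performance tier. Handles only plain 3- or
# 4-digit GTX/RTX numbers; MX parts and Ti/Super suffixes are out of scope.
def decode(model: str) -> tuple[int, int]:
    digits = "".join(ch for ch in model if ch.isdigit())
    if len(digits) == 3:   # e.g. GTX 780 -> series 7, tier 8
        return int(digits[0]), int(digits[1])
    if len(digits) == 4:   # e.g. RTX 3060 -> series 30, tier 6
        return int(digits[:2]), int(digits[2])
    raise ValueError(f"unexpected model number: {model!r}")

for name in ["GTX 780", "GTX 960", "GTX 1080", "RTX 3060"]:
    series, tier = decode(name)
    print(f"{name}: series {series}, tier {tier}")
```

Which is why a tier-8 card from an old series can still beat a tier-5 or tier-6 card from a newer one.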
 
GTX 780 is about three times as fast as an MX250

and that's years ahead of the MX250
Very good to know. NVidia ecosystem is such a mess. You guys definitely know better than me :p

I checked the performance benchmarks, you're totally correct: the GTX780 (desktop, 2013) performs about three times better than the MX250 (laptop, 2019).
I was mostly referring to @Northpin's statement about "years", which is very confusing in NVidia terms. But yes, more importantly, OP is probably under minimum specs :(

[attached benchmark comparison screenshot]

 
NVidia ecosystem is such a mess

I find it way better than AMD's 😂

Anyway, a GTX1080 (even more so a 1080Ti) is still a very good card, even though it's 3 generations old already, or 2.5 if we count gen 16 as only a half step 😂

I was mostly referring to @Northpin's statement about "years", which is very confusing in NVidia terms

I should have said Miles ahead
 
I find it way better than AMD's 😂

Anyway, a GTX1080 (even more so a 1080Ti) is still a very good card, even though it's 3 generations old already, or 2.5 if we count gen 16 as only a half step 😂



I should have said Miles ahead
Totally, I got the 1080Ti, so I can confirm, god card. The only drawback is that it lacks ray tracing... But with the current World Chips Apocalypse, I feel sooo lucky to have bought it 3 years ago. 🤗
 
OMG i freakin LOVED that game!!!!!!!!!!!!
According to the number of 'likes' this post has, we're not the only ones :)

It was one of the first 3D games I played on my Pentium 100 MHz Win95 machine. I don't remember it very well, just that it was more of an FPS than a ship sim, haha.
 
According to the number of 'likes' this post has, we're not the only ones :)

It was one of the first 3D games I played on my Pentium 100 MHz Win95 machine. I don't remember it very well, just that it was more of an FPS than a ship sim, haha.
It was the first game I remember coming with a motion sickness warning. I ended up playing more of this than actually learning Pascal back in college... best memories were of our tutor joining us anonymously from another room to battle with the boys. Good times. They made a remake, I believe, but I haven't checked it out yet. Amazing game!
 
... if so, they're really under the Odyssey minimum hardware requirements (GTX780 is the minimum listed GFX card - and that's years ahead of the MX250)
Aye, you're right, I dread to think what the framerate would be at a settlement on fire on an MX250.
Very good to know. NVidia ecosystem is such a mess. You guys definitely know better than me :p
Basically anything with an M in it is a neutered version of its product name; a GTX 970M, for example, is a really heavily neutered version of the GTX 970. The MX series were super-low-price-point entry chips specifically for mobile gadgets, hence they got the M as a prefix rather than a suffix.

For the 10 series there were some laptops made with the full-fat version of the GPU, then manufacturers complained about packaging the hefty thermal solutions, so NVidia went back to cut-down versions of their desktop chips, calling them Max-Q. In the current 30 series the laptop chips are neutered down so badly they are comparable to two models down the range. For example, the RTX3080 laptop GPU is comparable to an RTX3060 desktop GPU.

This puts me in a funny position, as my five-year-old full-fat GTX 1080 laptop still punches about as hard as a brand-spanking-new RTX3080 laptop. And to really confuse the matter, there are three variants of the RTX3080 laptop with different clock speeds and thermal loads; my OLD full-fat GTX1080 laptop is only slightly out-benched by the top clock speed RTX3080 variant and actually outperforms the low and medium clock speed variants.

I find it way better than AMD's 😂

Anyway, a GTX1080 (even more so a 1080Ti) is still a very good card, even though it's 3 generations old already, or 2.5 if we count gen 16 as only a half step 😂



I should have said Miles ahead
The 16 series were and are a bad design, and actually a step backwards from the 10 series. Even the top-tier 16 series model, the 1660 Super, has less grunt than the previous generation's 1070, or the 980Ti from two generations before that. The 1660 Super is about 20% down on the 1080, and about 50% down on the 1080Ti. The 1080Ti still outclasses an RTX3060 in raw processing power, although the 1080Ti doesn't have the RTX ray-tracing cores.
 
Basically anything with an M in it is a neutered version of its product name; a GTX 970M, for example, is a really heavily neutered version of the GTX 970. The MX series were super-low-price-point entry chips specifically for mobile gadgets, hence they got the M as a prefix rather than a suffix.

For the 10 series there were some laptops made with the full-fat version of the GPU, then manufacturers complained about packaging the hefty thermal solutions, so NVidia went back to cut-down versions of their desktop chips, calling them Max-Q. In the current 30 series the laptop chips are neutered down so badly they are comparable to two models down the range. For example, the RTX3080 laptop GPU is comparable to an RTX3060 desktop GPU.

This puts me in a funny position, as my five-year-old full-fat GTX 1080 laptop still punches about as hard as a brand-spanking-new RTX3080 laptop. And to really confuse the matter, there are three variants of the RTX3080 laptop with different clock speeds and thermal loads; my OLD full-fat GTX1080 laptop is only slightly out-benched by the top clock speed RTX3080 variant and actually outperforms the low and medium clock speed variants.

Actually, the M after the numeric part meant that the part was a mobile part.
They should really be sued for dropping that M in the 20 and 30 series cards for laptops.
At least my GTX1660Ti laptop had a card that was very close to the desktop part (including the TDP and the running frequencies).

Anyway, I do have an Asus with a 130 W TDP RTX3080/16 GB and I'd say it's closer to the desktop 3070 than to the desktop 3060.
It actually has almost the same specs as a 3070 (both run a GA104 chip, while the desktop 3080 runs a GA102 chip), but on a more limited TDP - 130 W for my laptop RTX3080 versus 220 W for the desktop 3070.

However, TDP does not affect performance linearly.
The desktop RTX 3070 is about the same or only marginally faster than a 3080 laptop, not the ~70% faster that the raw TDP ratio would suggest - but that really depends on the cooling of the laptop.
Not to mention the 16 GB of VRAM, which can make a rather important difference in favor of the 3080 laptop in certain scenarios.
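Just to put those numbers side by side (figures as quoted in this post; real-world results depend on cooling and clocks):

```python
# Worked numbers from the post above: the raw TDP ratio suggests a far bigger
# gap than benchmarks typically show, because performance doesn't scale
# linearly with power. Figures are the ones quoted above, nothing measured.
laptop_3080_tdp = 130    # W, this particular laptop's power limit
desktop_3070_tdp = 220   # W, typical desktop RTX 3070 board power

ratio = desktop_3070_tdp / laptop_3080_tdp
print(f"The desktop part has {ratio:.2f}x the power budget "
      f"(~{ratio - 1:.0%} more), yet the measured gap is marginal.")
```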

So I knew very well what I was getting, and I have to say that I'm quite happy with it.

But I would still sue them for false marketing/advertising in the case of the laptop and desktop 30 series.
 