Is anyone who runs Odyssey in a position to try this and see if there is a performance change?

I have a Ryzen 5 2500U, 16 GB RAM and a GTX 1050 mobile 4 GB; it works acceptably for planetary exploration, but in settlements it's impossible. This CPU has roughly 3x the performance of yours.

Not really. Look it up against the Athlon II 630: it's maybe 50% faster at best, and closer to 10% better in most situations. You can't really compare mobile processors directly to their desktop brethren, even across generations.

Better, yes; 3x faster? Definitely not.
 
What the hell, it did that for me as well, two days ago. It was due to a Windows update, and it felt like Microsoft did this intentionally (though Odyssey was running and showed graphical issues, so more likely the combination of the two was the problem). However, I fixed it. Not by repairing the drive (which cannot be made bootable again due to constant "access" errors, even with third-party tools), but by using another drive to boot the original one, i.e. basically boot assistance on every start.

Now, on topic: the performance for me at least got rather worse than better since update 5 or so. But unlike many people, I didn't have many performance issues in the beginning.
Win7, MSI GTX 1660 Ti, 16 GB RAM, i5-2500.
With those specs I would highly recommend upgrading from W7 to W10. It's a pretty straightforward and painless upgrade and the stability and performance increases were notable here.
 
Not really. Look it up against the Athlon II 630: it's maybe 50% faster at best, and closer to 10% better in most situations. You can't really compare mobile processors directly to their desktop brethren, even across generations.

Better, yes; 3x faster? Definitely not.
Information obtained from this website 🤷‍♂️
https://www.cpubenchmark.net/midlow_range_cpus.html

[Attached: CPU benchmark score screenshots]
 
With those specs I would highly recommend upgrading from W7 to W10. It's a pretty straightforward and painless upgrade and the stability and performance increases were notable here.
Most people who know little about Windows recommend that. I recommend never using Win10; if anything, stop using Win7 too and only use Linux distributions. But then for gaming, still use Win7 virtual machines.

CPU is a huge deal for Odyssey performance once you approach a 60+ fps target in settlements/CZs.
Yeah, it's just that 60+ fps is totally unnecessary. Your eyes can only make out 30-60 fps. For me it looks fluid at 30 fps.
 
Most people who know little about Windows recommend that. I recommend never using Win10; if anything, stop using Win7 too and only use Linux distributions. But then for gaming, still use Win7 virtual machines.
Care to elaborate why?
 
No, that concerns how fast our eyes and brain can process an image. The thing is that our brain creates an illusion of motion when the images come just fast enough, around 24 fps.
Those two statements seemingly contradict each other, so it'd be interesting to have a source as well.
 
Yeah, it's just that 60+ fps is totally unnecessary. Your eyes can only make out 30-60 fps. For me it looks fluid at 30 fps.
60 fps for "untrained" average people, 90 fps for trained people (like gamers, ironically), up to 230 fps for highly trained individuals (fighter jet pilots).
That's what I read. I personally can see the difference up to roughly 90 fps.
But 60 fps is good enough for me.

It was yesterday, so I don't have the source anymore ^^
 
Care to elaborate why?
1. Privacy nightmare
2. For me, as someone who also develops apps (especially some for automation), it matters that the way the graphical display works changed fundamentally from Win7 to Win8+. For instance, they introduced an extra layer that reports a fake resolution with fake coordinates, distorting screen coordinates and breaking code that works fine up to Win7. They also partially deprecated their API; things made much more sense back then. (Example)
Microsoft made such design decisions because they wanted to conquer mobile devices with Win8+, which is a horrible mess for a desktop OS. (A rough sketch of the coordinate issue follows below.)
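Roughly, and assuming this refers to the DPI scaling/virtualization layer that Windows applies to "DPI-unaware" processes since Win 8.1, here is a minimal sketch (not my actual automation code) of the opt-out an automation app now needs so that screen coordinates map to physical pixels again:

```python
# Minimal sketch: opt a process out of Windows' DPI virtualization so that
# automation code sees physical pixel coordinates instead of scaled ones.
# Illustrative only; assumes Windows and Python with ctypes.
import ctypes

def opt_out_of_dpi_virtualization() -> None:
    try:
        # Windows 8.1+: PROCESS_PER_MONITOR_DPI_AWARE = 2 (shcore.dll)
        ctypes.windll.shcore.SetProcessDpiAwareness(2)
    except (AttributeError, OSError):
        # Vista through Windows 8 fallback (user32.dll)
        ctypes.windll.user32.SetProcessDPIAware()

if __name__ == "__main__":
    opt_out_of_dpi_virtualization()
    user32 = ctypes.windll.user32
    # With the process DPI-aware, SM_CXSCREEN/SM_CYSCREEN are physical pixels.
    width, height = user32.GetSystemMetrics(0), user32.GetSystemMetrics(1)
    print(f"Physical desktop resolution: {width}x{height}")
```

Without that call, a process running under e.g. 150% display scaling is shown a scaled-down desktop (1280x720 instead of 1920x1080), so recorded click coordinates no longer land where they should.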
 
60 fps for "untrained" average people, 90 fps for trained people (like gamers, ironically), up to 230 fps for highly trained individuals (fighter jet pilots).
That's what I read. I personally can see the difference up to roughly 90 fps.
But 60 fps is good enough for me.

It was yesterday, so I don't have the source anymore ^^
I've been a gamer for ~22 years, partly in e-sports games, but 30 fps still does not disturb me. I guess it is similar to people who despise dubbed movies/series: one can just choose not to focus on the lip movements and enjoy the translated video.
 
2. For me, as someone who also develops apps (especially some for automation), it matters that the way the graphical display works changed fundamentally from Win7 to Win8+. For instance, they introduced an extra layer that reports a fake resolution with fake coordinates, distorting screen coordinates and breaking code that works fine up to Win7. They also partially deprecated their API; things made much more sense back then. (Example)
They made such design decisions because they wanted to conquer mobile devices with Win8+, which is a horrible mess for a desktop OS.
Sorry to hear that, but didn't Windows 8 release 9 years ago? Gotta get with the times.
 
I've been a gamer for ~22 years, partly in e-sports games, but 30 fps still does not disturb me. I guess it is similar to people who despise dubbed movies/series: one can just choose not to focus on the lip movements and enjoy the translated video.
I guess everybody experiences it differently, but I could clearly see a CRT flickering at refresh rates up to at least 75 Hz, and as for higher framerates, I can easily perceive the difference between 30/60/120 FPS.
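For what it's worth, the frame-time gaps involved are easy to put in numbers; this is just arithmetic, no source needed:

```python
# Plain arithmetic: milliseconds per frame for the rates being debated.
for fps in (24, 30, 60, 90, 120):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
# 24 -> 41.7, 30 -> 33.3, 60 -> 16.7, 90 -> 11.1, 120 -> 8.3
```

Going from 30 to 60 fps cuts the frame time by ~16.7 ms, while 60 to 120 fps only saves another ~8.3 ms, which fits the experience that the jump gets harder to notice the higher you go.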
 
Those two statements seemingly contradict each other, so it'd be interesting to have a source as well.
There is no contradiction there, since the brain's perception of two different things can of course differ, but you can very easily google that.
Many people have asked why 24 fps is the standard for movies and TV.
 
There is no contradiction there, since the brain's perception of two different things can of course differ, but you can very easily google that.
Many people have asked why 24 fps is the standard for movies and TV.
Thank you for providing sources, but I believe the 24 FPS standard set back in 1927 (mostly for economic reasons, too) has fairly little relevance to modern-day scientific research into image processing by the human body.
 
It's regarded as a cinematic feel... I also have no doubt that the brain is very good at reconstructing something from it, but that doesn't mean you can't give the brain a better input.
 
Thank you for providing sources, but I believe the 24 FPS standard set back in 1927 (mostly for economic reasons, too) has fairly little relevance to modern-day scientific research into image processing by the human body.
You're quite mistaken if you believe that common sense and knowledge validated over decades could be deprecated by scientific results that are about a different thing anyway. Maybe you're still missing that the maximum image-processing speed is an entirely different thing from the minimum image frequency needed for the brain to interpret a sequence of images as fluid movement.

It's regarded as a cinematic feel... I also have no doubt that the brain is very good at reconstructing something from it, but that doesn't mean you can't give the brain a better input.
True, those results from 2014 merely found an upper bound for this kind of optimization.
 