
The fact that turning off Windows Defender is even perceptible should be a huge hint that four logical cores are no longer enough.

Anyway, even if this bottleneck calculator were accurate (and it's not, at least when it comes to modern titles paired with low core-count CPUs), the i5-2500K is a Sandy Bridge part that lacks AVX2. I'm doubtful it will run Starfield at all...not in the sense the game will be a slideshow on it, but that it will refuse to launch or just crash upon launching.
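If anyone wants to verify their own CPU before buying, here's a quick sketch; it assumes the third-party py-cpuinfo package (pip install py-cpuinfo), which reports instruction-set flags in a portable way:

```python
# Quick AVX2 capability check using the third-party py-cpuinfo package
# (pip install py-cpuinfo). A Sandy Bridge i5-2500K should report no 'avx2' flag.
import cpuinfo

info = cpuinfo.get_cpu_info()
flags = info.get("flags", [])

print(f"CPU: {info.get('brand_raw', 'unknown')}")
if "avx2" in flags:
    print("AVX2 supported -- no launch problem expected on that front.")
else:
    print("No AVX2 -- games that require it may crash or refuse to launch.")
```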

Oh yeah, I know it's bottlenecked, it's just nice to be flattered by a website.

Well, the other thing is, any bottlenecking over 60 fps doesn't exist as a problem, so I'm sure the majority of games I run don't show the bottleneck, or it's more conceptual than practical. I'm using a couple of the classic 30-inch monitors, and I still love them. They're massive and run native 2560 very well; it's not like sitting up close to a TV and seeing pixels.

Of course, just a personal check... but what have I noticed? No Man's Sky... probably odd (though it's all over the place), Witcher 3 at full flight, Jedi Outcast, actually probably any game in the PS5 era :) I don't know, it just doesn't seem worth it to upgrade given how capable it is. And our house has about 7 Macs in it. Well... I guess since Apple Silicon I'm personally done with the Mac ecosystem (happy to use it unpatched until the hardware dies), so in fact that's probably what's going on... yeah, the next upgrade will be a PC, will pick up where I left off in 2007 :)
 
Defender does reasonably well in third-party tests, well enough that I probably wouldn't bother with a different persistent AV on Windows systems. The bulk of its privacy issues (and some of its effectiveness) can be mitigated by disabling sample submission and SmartScreen. If one desires more privacy, then that pretty much rules out effective real-time AV, which most consumers probably shouldn't be without.
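For anyone wanting to flip those switches, a minimal sketch using the built-in Set-MpPreference cmdlet, run from Python for convenience (needs an elevated prompt; SmartScreen itself is toggled separately under Windows Security > App & browser control, so it's not covered here):

```python
# Sketch: disable Defender sample submission and cloud (MAPS) reporting.
# Requires an elevated prompt; verify the result with Get-MpPreference.
import subprocess

commands = [
    "Set-MpPreference -SubmitSamplesConsent 2",  # 2 = never send samples
    "Set-MpPreference -MAPSReporting 0",         # 0 = disable cloud reporting
]
for cmd in commands:
    subprocess.run(["powershell", "-NoProfile", "-Command", cmd], check=True)
```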

When I started playing MMOs decades ago, I soon switched to a prevention-is-better-than-cure approach to internet security. If you're still doing illegitimate things, use a completely separate machine or OS install (and don't mount anything good). Only visit large services or calm places on unpatched machines. Rebuild your OS quickly and easily. Do research in a VM, on a patched machine, or on a machine that doesn't matter. Put your modding tools through VirusTotal and assume they're suspect anyway.

I am my own virus scanner.

It's all borderline scammy anyway. By that logic your machine is never secure, otherwise next month's guaranteed clockwork security patch wouldn't be needed, would it? But I don't doubt they have a plan for my machine to be insecure in any way they want for all of eternity...
 
When I started playing MMOs decades ago, I soon switched to a prevention-is-better-than-cure approach to internet security.

That is the goal of real-time protection software: to catch stuff before a user can save/execute it.

If you're the only user and are confident in your ability to avoid issues (or at least have good backups and encryption for anything important) then you may as well disable Defender.
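If you do, a minimal sketch of toggling real-time monitoring from a script, assuming an elevated prompt and Tamper Protection turned off (it silently overrides this otherwise):

```python
# Sketch: toggle Defender real-time monitoring via Set-MpPreference.
# Tamper Protection must be off in Windows Security for this to stick.
import subprocess

def set_realtime_monitoring(enabled: bool) -> None:
    # The cmdlet parameter is Disable*, so the flag is inverted.
    flag = "$false" if enabled else "$true"
    subprocess.run(
        ["powershell", "-NoProfile", "-Command",
         f"Set-MpPreference -DisableRealtimeMonitoring {flag}"],
        check=True,
    )

set_realtime_monitoring(False)  # e.g., before a gaming session
set_realtime_monitoring(True)   # ...and back on afterwards
```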
 
Had a weird issue today: my mobo couldn't POST. We had a lightning strike the other day but didn't see any issues then; now it just didn't want to POST. So I changed the PSU and it kind of started OK, however I checked the old PSU and it was not dead, so I don't know. Also, my system would not be changed from 800x600 res at all. Man, this is frustrating... :mad: Mobo issues?? The GPU is ruled out, as I swapped a brand-new test GT 710 into the board just to check what could be the problem, and got the same problem.
Oh, and I got a Code 43 in Device Manager.
 
Whoa... wasn't expecting this... I'm bumping into more CPU-bottlenecked games... and turning off Windows Defender real-time scanning is still working. Tonight it pushed SF6 World Tour from the low 50s to the mid 50s and above, with all settings maxed except crowd density set to low.

Fingers crossed for Starfield... heeeeee


Wait. According to this VideoGamer article, SF is a CPU-intensive game.


So you'd need to run the test using the CPU-intensive check instead of the general-use one. I'd incorrectly assumed SF was graphics-driven up to this point. Recently built a custom gaming computer just for SF with a generic ASRock 27" Phantom Gaming monitor.

Specs: i9-13900KS, RTX 4090, 32 GB DDR5-6000 RAM, 4 TB M.2 SSD, generic 1920 x 1080 monitor.

Was surprised to find this result for my rig: o_O

[attached screenshots of the bottleneck calculator results]

Now that's interesting. That potential 16% processor performance bottleneck brings back nostalgic memories of journeying in Skyrim, when my PC finally entered Markarth for the first time, only to have the fps drop to less than 30. My computer hardware was just as high-end back then as now, AND I was using the predecessor of the current game engine (aka Creation Engine 1.0). Skyrim was always completely unoptimized inside that specific city interior cell. Completely perplexing, since I never experienced any fps drops in more CPU-intensive cities, like the first time my PC entered Solitude (just as that NPC's head decided to disagree with the headsman's axe).

Solitude had multiple running scripts and a ridiculous clusterf@ck of NPCs, which forced a huge load on the CPU, since the game's engine had to instantly load these world assets as your PC was being spawned inside the city's interior cell. Markarth had a minimum of that at worst, and the fps was dog poop. To date, I've never understood why Creation Engine had such a performance hiccup between those two cities. But as always, the modding community (Arthmoor's unofficial patches etc.) ended up fixing everything. I'm hoping this doesn't imply possible hiccups for certain/random solar systems you need to grav jump into as part of the main quest.

Anyways, reran the calculation with the graphics-intensive option checked, and things seemed to balance out OK. So IDK:
[attached screenshot of the recalculated result]

All that f#ckery with Howard's decision to let AMD gank Nvidia players over DLSS 3.0 isn't helping much either. These gang-ups by AAA corporate oligarchies are going to end up hurting the gaming industry. It's a fact there will always be SF players who don't care for Radeon cards, for brand, performance, or a combination of both reasons. I don't see a sizeable shift in AMD's market share from players rushing to convert to Radeon cards because of this exclusivity deal. So how hard would it have been for Bethesda to just ENABLE this option in the display settings? When it's well known that this will give an fps performance boost of 300% over what a Radeon card ever could. Also, seeing the Xbox Series X can't utilize this feature, then WTH not include it as a display option?? Wonder how much of a windfall in development cost savings M$ will be making over Bethesda's back-door deal with AMD. Thanks for Nvidia-pilling me, AMD :rolleyes:

Well at least I'll be getting the best value from the photo mode feature capturing all those epic landscape moments in SF Direct. 😅

Anyways, all this theorycrafting is pure vanilla without the future 365+ mods I'll be adding to my mod deck. It also seems the graphics card will require tweaking as the game progresses, especially as my PC transforms into the dread Spacebeard Anomaly of the void, looting & pillaging NPC ships to acquire bigger and better-performing pirate ships :devilish:
 
According to this VideoGamer article, SF is a CPU-intensive game.

Expected, given the recommended PC specs as well as the relative CPU/GPU balance in consoles.

Now that's interesting. That potential 16% processor performance bottleneck

Not that I put much stock in this bottleneck calculator beyond vague guidelines, but many games are CPU-limited when high-end CPUs are paired with an RTX 4090 at 1080p...at absurd frame rates:
[attached benchmark chart]


That's an extreme example and surely Starfield will have a much lower fps floor, but unless they really screwed something up unusually badly (even for Bethesda), you're not going to have any trouble reaching reasonable minimum frame rates.

These gang-ups by AAA corporate oligarchies are going to end up hurting the gaming industry.

Exclusivity deals and integration of proprietary features at the expense of competition have been hurting the PC gaming industry for decades.

The only thing really different this time around is that for the first time in a long time it's AMD that's the beneficiary of these practices when it comes to a major release. Historically it's been Intel, and later NVIDIA, pushing for their proprietary instructions, features, and middleware to be leveraged at the expense of everyone.

It's a fact there will always be SF players who don't care for Radeon cards.

And their NVIDIA or Intel cards will run the game just fine, even if they have to settle for FSR rather than DLSS or XeSS, should they need an upscaler.

hard would it have been for Bethesda to just ENABLE this option in the display settings?

It would have been almost as easy to integrate DLSS into Starfield as it would be for NVIDIA to allow DLSS to run on non-NVIDIA hardware.

There were never any real technical hurdles, and Starfield doesn't have DLSS because that would lead to unflattering comparisons between AMD and NVIDIA in an AMD-sponsored title.

When it's well known that this will give an fps performance boost of 300% over what a Radeon card ever could.

That's an exaggeration. Even adjusting for image quality, the upscalers aren't that different in performance, so you'd need frame generation to manage much more than a tenth of that gap at the high-end. Frame generation, while not terribly difficult to integrate, is more involved than another temporal upscaler, and has another layer of side effects.
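To put rough numbers on it, here's a back-of-envelope sketch; the uplift figures are assumptions for illustration, not benchmarks:

```python
# Hypothetical Quality-mode upscaler gains over native rendering; the point
# is that the vendor-to-vendor gap is a few percent, not anywhere near 300%.
native_fps = 60.0
fsr_uplift = 1.65    # assumed FSR 2 scaling factor
dlss_uplift = 1.72   # assumed DLSS scaling factor

fsr_fps = native_fps * fsr_uplift
dlss_fps = native_fps * dlss_uplift
gap = (dlss_fps / fsr_fps - 1) * 100

print(f"FSR: {fsr_fps:.0f} fps, DLSS: {dlss_fps:.0f} fps ({gap:.1f}% apart)")
# Frame generation roughly doubles presented frames on top of this,
# which is where the bigger (but still nowhere near 300%) numbers come from.
```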

Wonder how much of a windfall in development cost savings M$ will be making over Bethesda's back-door deal with AMD.

I would guess it involves hardware considerations to make sure the mid-cycle Xbox console refresh goes smoother than the PlayStation 5-and-a-half. 'Give us official sponsorship of your new top franchise and get your custom Xbox Series X-squared SoC first, for less.' MS wants a bigger piece of the console pie and I'm sure they'd sacrifice Starfield, if they had to, to undermine Sony.

Not that I think there is anything AMD is capable of demanding that would actually harm Starfield. The game will succeed or fail on its own merits, and if it's a lemon, DLSS sure couldn't save it. And AMD wants it to succeed...their name is all over it.

I don't see a sizeable shift in AMD's market share from players rushing to convert to Radeon cards because of this exclusivity deal.

It won't hurt...

Thanks for Nvidia-pilling me, AMD

...because anyone who mistakenly believes NVIDIA is a more consumer- or gamer-friendly company would buy NVIDIA over AMD no matter what.
 
So I'm getting closer to my new build; for now, with the suggestion from Morbad, I'm going for:

Ryzen 5 7600X (it's now at a good price and the difference between the 7700 and 7600 is very small)
AM5 board, still evaluating the model; prices are dropping, so probably the Riptide or similar
32 GB DDR5-6000 CL32 RAM
M.2 NVMe SSD, 7000 MB/s read/write, 2 TB
RX 6750 XT GPU, at a good price now; the 6800 is still expensive from my POV and doesn't deliver that much more FPS at 1440p compared to the 6750

The AM5 mobo will be a board I can still upgrade in 2 years, so it will be a good foundation.

Now I just need to bite the bullet and order the stuff :geek:
 
Had a weird issue today: my mobo couldn't POST. We had a lightning strike the other day but didn't see any issues then; now it just didn't want to POST. So I changed the PSU and it kind of started OK, however I checked the old PSU and it was not dead, so I don't know. Also, my system would not be changed from 800x600 res at all. Man, this is frustrating... :mad: Mobo issues?? The GPU is ruled out, as I swapped a brand-new test GT 710 into the board just to check what could be the problem, and got the same problem.
Oh, and I got a Code 43 in Device Manager.
Sounds like the lightning hit a power line and surged through the power cable...or ethernet cable...and possibly fried something on the board. You could try checking the BIOS settings as a way of sussing out what the issue is, but I suspect the surge has damaged something: either the main board via the PCIe bus (in the case of having a PCIe network adapter fitted), which would explain the Code 43 error with a graphics card...or some other piece of plugged-in hardware.

Besides that issue, Windows 'Code 43' is a hardware device error...there are a few basic tips around t'interweb on how to remedy it on both Win 10 and Win 11. It might not fix the issue, but it'll give you a clearer pointer to what's been damaged. Your display is running on the built-in graphics adapter on the motherboard...hence why it's limited to 800x600; Win 10/11 has detected a hardware error with the GFX card (or PCIe bus) and shut the primary discrete GFX card down.
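One more quick check that might narrow it down: listing the display adapters and the problem code Windows has assigned them, via the built-in Get-PnpDevice cmdlet (wrapped in Python here purely for convenience):

```python
# Sketch: list display adapters with their status and problem code.
# A card stuck in the Code 43 state shows Status 'Error' rather than 'OK'.
import subprocess

result = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-PnpDevice -Class Display | "
     "Select-Object FriendlyName, Status, Problem | Format-Table -AutoSize"],
    capture_output=True, text=True,
)
print(result.stdout)
```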


Apologies that I can't be more help...but I'm keeping my fingers crossed for you that it's nothing major.
 
Sounds like the lightning hit a power line and surged through the power cable...or ethernet cable...and possibly fried something on the board. You could try checking the BIOS settings as a way of sussing out what the issue is, but I suspect the surge has damaged something: either the main board via the PCIe bus (in the case of having a PCIe network adapter fitted), which would explain the Code 43 error with a graphics card...or some other piece of plugged-in hardware.

Besides that issue, Windows 'Code 43' is a hardware device error...there are a few basic tips around t'interweb on how to remedy it on both Win 10 and Win 11. It might not fix the issue, but it'll give you a clearer pointer to what's been damaged. Your display is running on the built-in graphics adapter on the motherboard...hence why it's limited to 800x600; Win 10/11 has detected a hardware error with the GFX card (or PCIe bus) and shut the primary discrete GFX card down.


Apologies that I can't be more help...but I'm keeping my fingers crossed for you that it's nothing major.
Yeah, I assume it's the mobo that's dead, so now I'm focused on building my new PC. I'll use what I can from the old one, and if the CPU/RAM is OK I'll build a NAS with it, maybe.
 
Wait. According to this VideoGamer article, SF is a CPU-intensive game.
Well, that works in my favor then. I can run well-optimized games, even modern ones like MSFS, just fine on my 1660 Ti at 1080p, and my processor is above average for a gaming laptop (well, at least one of its age), so I'm not too worried. I think the one concern might be the recommended video RAM, but I can live with lower-definition (by today's standards) textures if need be. Even at lower settings I bet SF will look better than X4, and I'm a huge X4 fan, in case you missed the memo.
 
Trying to preach to an Xbox Series X owner and player about what an Xbox does, is, or isn't...have you ever seen an Xbox, never mind actually used one?

Read my sig 🤷‍♂️

The comparison of hardware specs is totally irrelevant...no matter what you read on some PC-master-race fanboy site...they're both engineered and designed to use the hardware completely differently.

A console (XB Series specifically) might look like magic to you, but it isn't really.
It's still a computer, running an optimized/customized version of Windows 10 on a proprietary file system and - as nicely put by Morbad - running games in an environment with less OS overhead and clutter.
And yes, I'm saying this as a console and PC owner too (no, I don't see PC as the master race - far from it; I happily played more than 3000 hours of ED on XB, locked at 30 fps).
And no, the hardware specs are definitely NOT irrelevant

So, Starfield will heavily rely on FSR 2.0 to run on consoles.
 
Read my sig 🤷‍♂️



A console (XB Series specifically) might look like magic to you, but it isn't really.
It's still a computer, running an optimized/customized version of Windows 10 on a proprietary file system and - as nicely put by Morbad - running games in an environment with less OS overhead and clutter.
And yes, I'm saying this as a console and PC owner too (no, I don't see PC as the master race - far from it; I happily played more than 3000 hours of ED on XB, locked at 30 fps).
And no, the hardware specs are definitely NOT irrelevant

So, Starfield will heavily rely on FSR 2.0 to run on consoles.
Thanks for the patronising explanation of how an Xbox works :D ...but the Xbox isn't some newly discovered magical toy for me. I've played exactly 2,540 hours and 3 minutes of ED on Xbox since GPP in 2015 (plus another 3,200 hrs on PC), same as you did, and I used the Xbox as a primary gaming platform from 2002 until I shifted most of my gaming back to PC in 2015.

[attached screenshot of Xbox play time]

As for FSR 2...it's been available for the Xbox since last year, so it's possible, but Bethesda haven't suggested Starfield or their game engine utilises it on Xbox. It'll more than likely be among the PC version's graphics options though, same as DLSS 🤷‍♂️
 
the difference between the 7700 and 7600 is very small

The 7700(X) has two more cores, which will give it significantly more longevity.

Not an issue if you're planning on future upgrades, but something to keep in mind. The 7600(X) is still plenty for an RX 6750XT.

I totally lost the plot, but there is some evil corporation involved.

The plot is that all of them are evil (it's part of the definition of corporate personhood) and this is a cosmic battle to determine which one gets to eliminate their competition and complete the enslavement of their cumulative consumer base first.

[Removed due to moderation action]

Upstart? AMD was founded a year after Intel and twenty-four years before NVIDIA, and has been a major player this whole time, even if they only recently surpassed Intel in market cap.

[Removed due to moderation action]

Not sure what world you're living in, but Intel still has a commanding lead in CPU market share and NVIDIA has an even larger dominance when it comes to GPUs...and they've both been abusing their positions for decades.

Intel has sabotaged compilers to produce suboptimal code for competing products; tried to patent troll competitors they couldn't buy out; offered discounts, secret rebates, and outright bribes to keep OEMs from adopting competing products...and lost the lawsuits to prove it. Now they're taking assloads of public money in the name of national security because they screwed up a while back and let TSMC gain a multi-year lead in the foundry space.

NVIDIA has consistently pursued a pattern of developing proprietary solutions, courting developers to adopt them, and then doing their best to discourage more open standards. And when they've had to allow their technology to work on other hardware, they've been inclined to sabotage competing products through intentional deoptimization of their middleware, even at the expense of their own consumers' experiences.

GameWorks features on early DX11 hardware were a prime example of this. NVIDIA knew they had superior geometry performance, but nothing was leveraging this advantage, so NVIDIA couldn't really sell it. Did they come up with something that actually improved IQ? Nope, they had the crap middleware they pressured everyone to use squander geometry performance by applying stupid levels of tessellation to scenes...even parts that couldn't be seen. Crysis 2 had tessellated flat surfaces (thousands of triangles where two would have worked) and tessellated water underneath maps that had no bodies of water in them. The Witcher 3 had 64x tessellation on everyone's hair with HairWorks enabled, when 8x was visually indistinguishable. This is why AMD drivers, to this day, still have a tessellation slider. When that feature was introduced, you could just drag the slider down to a reasonable value, and games that used to run like crap on AMD magically ran better on AMD than on NVIDIA, without looking any worse...leading these games to be patched to do what AMD's slider had done, allowing NVIDIA to retake a narrow lead.

The modern-day incarnation of this is ray tracing. NVIDIA has an enormous lead here, which is why so many games run like complete crap with ray tracing enabled, even on NVIDIA cards...NVIDIA will happily blow their own feet off if it means catching AMD's knees too. If AMD starts taking RT seriously, a pile of games will not-so-mysteriously start to run much better on my NVIDIA cards. They've also abused their relationships with AIBs (something they probably learned from 3dfx), as the recent collapse of EVGA's GPU division bears out.

They're all overtly and unabashedly capitalist, which is also pretty much part of the definition of a corporation. Both AMD and NVIDIA are subsidizing their AI war with profits that came from their consumer divisions, which is why they're tacitly colluding on moving to a 30-month consumer product cycle, so they can milk us with old product longer as they focus on competing in other areas.

Among the corporations mentioned, AMD is far and away the least monopolistic and anti-consumer of them all. Still evil of course, but thus far much more subdued and with less comic-book villainy. As they become more competitive, they'll get worse, but in the current race to a PC hellscape, Intel and NVIDIA still hold the lead.
 
NVIDIA has consistently pursued a pattern of developing proprietary solutions, courting developers to adopt them, and then doing their best to discourage more open standards. And when they've had to allow their technology to work on other hardware, they've been inclined to sabotage competing products through intentional deoptimization of their middleware, even at the expense of their own consumers' experiences.
The reason I'm now moving to AMD is A) competition is good, and B) value per FPS, as I mostly use my PC for gaming and the rest is just office stuff I could do on even a 286 PC :geek: The RTX 4060 is probably the GPU we can compare to the 6750 (maybe in some games the 4070), however the 6750 has 12 GB of VRAM and it's just cheaper. The CPU choice is because I would like a pure AMD PC.

Adding a comment:

The 7700(X) has two more cores, which will give it significantly more longevity.

Not an issue if you're planning on future upgrades, but something to keep in mind. The 7600(X) is still plenty for an RX 6750XT.

True, and as you explained, my last build was overpriced and I could have upgraded many times for the same cost, so I'm not going to do that again. An 8000-series AMD CPU, and likewise a newer GPU, will work with the AM5 mobo if I understood it right, so that's my aim; the CPU/GPU can always be sold or reused in other projects if needed.
 
Had a weird issue today: my mobo couldn't POST. We had a lightning strike the other day but didn't see any issues then; now it just didn't want to POST. So I changed the PSU and it kind of started OK, however I checked the old PSU and it was not dead, so I don't know. Also, my system would not be changed from 800x600 res at all. Man, this is frustrating... :mad: Mobo issues?? The GPU is ruled out, as I swapped a brand-new test GT 710 into the board just to check what could be the problem, and got the same problem.
Oh, and I got a Code 43 in Device Manager.
When you are spiked, it's time for a completely new box.
 